Kevin Systrom: Instagram | Lex Fridman Podcast #243
Chapters
0:00 Introduction
0:22 Origin of Instagram
42:29 New social networks
61:34 Selling Instagram to Facebook
84:36 Features
88:42 Facebook
125:24 Whistleblower
136:43 Machine learning
146:41 Advice for startups
152:33 Money
158:33 Love
160:50 Meaning of life
00:00:00.000 |
The following is a conversation with Kevin Systrom, 00:00:16.820 |
And now, here's my conversation with Kevin Systrom. 00:00:25.880 |
let me ask you about the origin story of Instagram. 00:00:42.840 |
turned down Mr. Mark Zuckerberg and Facebook, 00:00:56.120 |
can you take me through the origin story of Instagram, 00:01:07.480 |
Instagram started out of a company actually called Burbn, 00:01:16.200 |
And a couple of things were happening at the time. 00:01:21.400 |
not a lot of people remember what was happening 00:01:35.880 |
I'm gonna tell the world that I'm at this place. 00:01:38.680 |
- What's the idea behind this kind of app, by the way? 00:01:48.080 |
So the whole idea was to share with the world 00:01:49.840 |
what you were doing, specifically with your friends, right? 00:02:00.520 |
"What if I built a better version of Foursquare?" 00:02:05.440 |
why don't I like Foursquare or how could it be improved? 00:02:12.960 |
"I think that if you have a few extra features, 00:02:24.240 |
we were going to attack Foursquare and the likes 00:02:35.720 |
So one day we were sitting down and we asked ourselves, 00:02:51.760 |
that we thought people uniquely loved about our product 00:03:03.160 |
was not something products let you do, really. 00:03:07.800 |
"Post an album of your vacation from two weeks ago." 00:03:17.400 |
or at least I don't think they did at the time. 00:03:27.280 |
posting a photo of what you were doing at the moment 00:03:34.160 |
because we had noticed that people who used our service, 00:03:41.760 |
And yes, like we went through and we added filters 00:03:49.560 |
realized that no one wanted another check-in app. 00:03:54.000 |
but one that was much more about what you're doing 00:04:08.760 |
- When you were thinking about what people like, 00:04:14.480 |
You said, "We sat down, we wrote some stuff down on paper." 00:04:27.880 |
where does that list of three things come from exactly? 00:04:31.520 |
- Only after having studied machine learning now 00:04:57.400 |
and being able to like look at the world, try something, 00:05:01.000 |
figure out how you're wrong, how wrong you are, 00:05:04.240 |
and then nudge your company in the right direction 00:05:12.280 |
And I don't, we didn't know we were doing it at the time, 00:05:14.720 |
but that's basically what we were doing, right? 00:05:24.840 |
What, like what resonates, what doesn't resonate? 00:05:33.680 |
And it turns out if you do that enough, quickly enough, 00:05:36.320 |
you can get to a solution that has product market fit. 00:05:42.640 |
and they don't, either their learning rate's too slow, 00:05:45.520 |
they sit there and they're just, they're adamant 00:05:47.440 |
that they're right, even though the data's telling them 00:05:49.280 |
they're not right, or they, their learning rate's too high 00:06:01.960 |
what we were saying is, what are the three possible, 00:06:05.720 |
whether they're local or global maxima in our world, right? 00:06:18.680 |
And we just said, okay, like, what if we just cut out 00:06:21.400 |
most of the other stuff and focus on that thing? 00:06:24.080 |
And then it happened to be a multi-billion dollar business 00:06:32.320 |
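The learning-rate analogy Systrom draws can be sketched in a few lines of gradient descent. The function, rates, and step counts below are invented purely to illustrate his point: nudge too gently and you barely move; nudge too hard and you overshoot and diverge.

```python
# Hypothetical sketch of the gradient-descent analogy: a startup "nudging"
# itself toward a fit. Function and learning rates are illustrative only.

def descend(start, lr, steps=50):
    """Minimize f(x) = (x - 3)^2 by repeatedly stepping against the gradient."""
    x = start
    for _ in range(steps):
        grad = 2 * (x - 3)   # gradient of (x - 3)^2
        x -= lr * grad       # nudge in the direction the data suggests
    return x

print(descend(0.0, lr=0.1))    # reasonable rate: converges near the optimum at 3
print(descend(0.0, lr=0.001))  # "learning rate too slow": barely moves
print(descend(0.0, lr=1.1))    # "learning rate too high": overshoots and diverges
```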
Well, nobody ever writes about neural networks 00:06:44.680 |
- When you said the way people are using the app, 00:06:47.360 |
is that the loss function for this neural network 00:06:54.920 |
or do you have to track exactly what they're doing, 00:07:03.160 |
And it was for relatives and I like to cook a lot. 00:07:08.160 |
And I worked really hard on picking the specific dishes 00:07:13.720 |
and I was really proud because I had planned it out 00:07:22.120 |
- I don't know if you're a big Thanksgiving guy, 00:07:25.680 |
is when the turkey is cold and some things are hot 00:07:36.080 |
that I think maybe 10 people have purchased in the world, 00:07:39.400 |
but I'm one of them and I use it for recipe planning, 00:07:49.000 |
- It's not over-engineering, it's just engineering. 00:07:54.760 |
with some uncertainty with a lot of things going on. 00:08:06.480 |
- I love cooking, I love food, I love coffee, 00:08:12.200 |
And they always just take out a piece of paper 00:08:21.920 |
Why not just work backwards from the end goal, right? 00:08:26.840 |
So I probably over-specified it a bit using a Gantt chart, 00:08:38.720 |
Anyway, I was telling a story about Thanksgiving. 00:08:55.160 |
feedback is really hard to get, honestly, from people. 00:08:59.920 |
And I sat down after dinner, I said, guys, I want feedback. 00:09:07.840 |
literally everyone just said everything was great. 00:09:19.560 |
not something as high stakes as Thanksgiving dinner, okay? 00:09:22.080 |
Thanksgiving dinner, it's not that high stakes, 00:09:29.600 |
and you're trying to make something wonderful, 00:09:45.560 |
People have trouble saying, I don't like this, 00:09:48.840 |
or this isn't great, or this is how it's failed me. 00:09:51.960 |
In fact, you usually have two classes of people. 00:10:00.440 |
please tell me what you hate most about this, 00:10:09.720 |
and it's hard to parse out what is true and what isn't. 00:10:23.480 |
And that's why with whatever project I work on, even now, 00:10:27.320 |
collecting data from the beginning on usage patterns, 00:10:30.280 |
so engagement, how many days of the week do they use it, 00:10:34.360 |
how many, I don't know, if we were to go back to Instagram, 00:10:41.160 |
And don't be overly scientific about it, right? 00:10:44.760 |
'Cause maybe you have 50 beta users or something. 00:10:48.360 |
But what's fascinating is that data doesn't lie. 00:11:11.520 |
before time spent became kind of this loaded term there, 00:11:15.800 |
the idea that people's currency in their lives is time, 00:11:21.040 |
and they only have a certain amount of time to give things, 00:11:32.720 |
If they don't use it, it's because it's not great. 00:11:35.560 |
So the moral of the story is you can ask all you want, 00:11:46.840 |
data can obscure the key insight if you're not careful. 00:11:59.840 |
and they will give you totally different insights, 00:12:02.520 |
especially when you're trying to create something 00:12:20.000 |
or moments of happiness that are long lasting 00:12:22.360 |
versus like dopamine short-term, all of those things. 00:12:29.680 |
you can just get away with just asking the question, 00:12:44.440 |
but how hard was it to make pictures the first class citizen? 00:12:50.480 |
Like at whatever point Instagram became this feed of photos, 00:13:20.760 |
by staring at, I don't know, a single metric like airspeed. 00:13:33.960 |
Like it turns out you have to synthesize a bunch of metrics 00:13:43.120 |
If it's zero, you're probably gonna fall out of the sky. 00:13:46.400 |
So generally you look around and you have the scan going. 00:13:56.400 |
But people have trouble explaining how they actually feel. 00:14:01.400 |
So just, it's about synthesizing both of them. 00:14:10.520 |
where the feed became square photos basically. 00:14:21.600 |
so I believe the biggest companies are founded 00:14:30.840 |
And the biggest technical shift that happened 00:14:34.920 |
was the advent of a phone that didn't suck, the iPhone. 00:14:40.800 |
the first iPhone almost had, like it wasn't that good, 00:14:44.600 |
but compared to everything else at the time, it was amazing. 00:14:49.000 |
And by the way, the first phone that had an incredible camera 00:14:54.000 |
that could like do as well as the point and shoot 00:15:04.680 |
what will change because everyone has a camera 00:15:08.600 |
And it was so clear to me that the world of social networks 00:15:13.300 |
before it was based in the desktop and sitting there 00:15:25.440 |
If not only did you have a camera that fit in your pocket, 00:15:29.640 |
but by the way, that camera had a network attached to it 00:15:36.120 |
And a bunch of people saw it at the same time. 00:15:57.080 |
Now my phone takes pretty great photos, right? 00:16:01.360 |
Back then they were blurry, not so great, compressed, right? 00:16:05.300 |
Two, it was really slow, like really slow to upload a photo. 00:16:13.760 |
and explain to you why they're all the same size 00:16:17.020 |
And three, man, if you wanted to share a photo 00:16:25.240 |
and select all of them and upload individually. 00:16:27.840 |
And so we were like, all right, those are the pain points. 00:16:32.000 |
So one, instead of, because they weren't beautiful, 00:16:36.080 |
we were like, why don't we lean into the fact 00:16:40.680 |
My photography teacher gave me this Holga camera 00:16:43.040 |
and I'm not sure everyone knows what a Holga camera is, 00:16:44.960 |
but they're these old school plastic cameras. 00:16:47.200 |
I think they're produced in China at the time. 00:16:50.800 |
And I wanna say the original ones were like from the 70s 00:16:56.040 |
They're supposed to be like $3 cameras for the everyday person. 00:16:59.040 |
They took nice medium format films, large negatives, 00:17:07.920 |
and they kind of like light leaked into the side. 00:17:16.920 |
And I remember using that in Florence and just saying, 00:17:19.040 |
well, why don't we just like lean into the fact 00:17:20.880 |
that these photos suck and make them suck more, 00:17:26.260 |
And it turns out that had product market fit. 00:17:29.240 |
They were willing to share their not so great photos 00:17:31.760 |
if they looked not so great on purpose, okay? 00:17:37.440 |
- That's where the filters come into the picture. 00:17:42.600 |
and make them look extra crappy to where it becomes art. 00:17:47.040 |
And I mean, add light leaks, add like an overlay filter, 00:17:51.720 |
make them more contrasty than they should be. 00:17:54.120 |
The first filter we ever produced was called X-Pro II 00:17:58.080 |
and I designed it while I was in this small little 00:18:01.000 |
bed and breakfast room in Todos Santos, Mexico. 00:18:03.640 |
I was trying to take a break from the Burbn days. 00:18:11.080 |
And that was on that trip, worked on the first filter 00:18:15.080 |
because I said, you know, I think I can do this. 00:18:17.240 |
And I literally iterated one by one over the RGB values 00:18:22.060 |
in the array that was the photo and just slightly shifted. 00:18:26.280 |
Basically there was a function of R, function of G, 00:18:29.400 |
function of B that just shifted them slightly. 00:18:38.520 |
It just mapped from one color space to another color space. 00:18:47.320 |
I think it used to take two or three seconds to render. 00:18:50.500 |
Only eventually would I figure out how to do it on the GPU. 00:19:02.800 |
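The per-channel iteration he describes, a function of R, a function of G, a function of B applied to every pixel, can be sketched as follows. The curves are made up for illustration; the actual X-Pro II mapping is not public.

```python
# Minimal sketch of the filter approach described: walk every pixel and shift
# each channel through its own curve. Curve values here are invented.

def clamp(v):
    return max(0, min(255, v))

def f_r(r): return clamp(int(r * 1.1) + 10)   # boost reds slightly
def f_g(g): return clamp(int(g * 0.95))       # mute greens a touch
def f_b(b): return clamp(int(b * 1.05) - 5)   # darken blue shadows

def apply_filter(pixels):
    """pixels: list of rows, each row a list of (r, g, b) tuples."""
    return [[(f_r(r), f_g(g), f_b(b)) for (r, g, b) in row] for row in pixels]

image = [[(100, 100, 100), (255, 0, 0)]]
print(apply_filter(image))
```

Doing this pixel by pixel on the CPU is exactly why the early render took seconds; moving the same per-channel mapping to the GPU, or precomputing it as a lookup table, makes it effectively free.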
By the way, anyone who's watching or listening, 00:19:06.000 |
it's amazing what you can get away with in a startup, 00:19:10.080 |
as long as the product outcome is right for the user. 00:19:14.080 |
Like you can be slow, you can be terrible, you can be, 00:19:22.480 |
And then the question is just about compressing, 00:19:27.680 |
so that they get that product market fit instantly. 00:19:32.840 |
where those three seconds would make or break the app, 00:19:47.500 |
And delays in listening to music is a huge negative, 00:19:57.200 |
how do you know when those three seconds are okay? 00:20:16.180 |
I wasn't, I just knew how to do what I knew how to do. 00:20:23.560 |
why don't I just iterate over the values and change them? 00:20:29.600 |
compared to the alternatives, no one else used OpenGL. 00:20:35.840 |
- So everyone else was doing it the dumb way. 00:20:37.120 |
And in fact, they were doing it at a high resolution. 00:20:51.000 |
we iterated over a lot fewer pixels than our competitors 00:21:00.960 |
I mean, three seconds feels pretty good, right? 00:21:03.320 |
So on a relative basis, we were winning like a lot. 00:21:09.760 |
we actually focused on latency in the right places. 00:21:13.120 |
So we did this really wonderful thing when you uploaded. 00:21:22.720 |
and then you'd go to the, you'd go to the edit screen, 00:21:27.760 |
And on that caption screen, you'd start typing, 00:21:31.200 |
and you'd think, okay, like, what's a clever caption? 00:21:34.200 |
And I said to Mike, "Hey, when I worked on the Gmail team, 00:21:37.440 |
When you typed in your username or your email address, 00:21:43.260 |
like the probability, once you enter in your username, 00:21:47.160 |
that you're going to actually sign in is extremely high. 00:22:03.640 |
I always thought that was so fascinating and unintuitive. 00:22:07.000 |
And I was like, "Mike, why don't we just do that? 00:22:08.920 |
But like, we'll just upload the photo and like assume 00:22:14.160 |
And if you don't, forget about it, we'll delete it, right?" 00:22:24.680 |
and you'd see this little progress bar just go, 00:22:30.200 |
We were no faster than anyone else at the time, 00:22:33.000 |
but by choosing 512 by 512 and doing it in the background, 00:22:51.520 |
it's a shell game, you're just hiding the latency. 00:22:58.760 |
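The shell game he describes, kick off the upload while the user is still typing a caption, then commit or discard, can be sketched with a background thread. Every name and delay below is hypothetical; this is not Instagram's actual code.

```python
# Sketch of the perceived-latency trick: start uploading in the background as
# soon as the user reaches the caption screen, then commit or discard.
import threading
import time

def upload(photo, done):
    time.sleep(0.2)              # stand-in for the slow network transfer
    done["uploaded"] = True

def share_flow(photo, user_shares):
    done = {"uploaded": False}
    t = threading.Thread(target=upload, args=(photo, done))
    t.start()                    # upload begins while the user types a caption
    time.sleep(0.3)              # stand-in for the user composing a caption
    t.join()
    if user_shares:
        return "published instantly" if done["uploaded"] else "finishing upload"
    return "discarded server-side copy"   # user backed out: delete the upload

print(share_flow("photo.jpg", user_shares=True))
```

Because the transfer overlaps the caption-writing time, the share button appears to work instantly even though the raw upload speed is unchanged.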
And I think that, so you were willing to put up 00:23:01.080 |
with a slow filter if it meant you could share it immediately. 00:23:14.560 |
And then there's some like tricks you get around 00:23:20.600 |
Like, I don't know if Spotify starts downloading 00:23:27.840 |
that are not rocket science that really help. 00:23:31.760 |
- And all of that was stuff you were explicitly 00:23:42.600 |
I mean, I'm not sure if you've met my co-founder, Mike, 00:23:45.520 |
but he's a pretty nice guy and he's very reasonable. 00:23:48.520 |
And we both just saw eye to eye and we're like, yeah, 00:23:51.720 |
it's like, if you make this fast or at least seem fast, 00:23:57.400 |
I mean, honestly, I think the most contentious thing, 00:24:01.280 |
was I was on an iPhone 3G, so like the not so fast one. 00:24:07.760 |
And he had a brand new iPhone 4, that was cheap. 00:24:15.160 |
Like when he would scroll from photo to photo, 00:24:20.120 |
But on my phone, every time you got to a new photo, 00:24:22.520 |
it was like, ka-chunk, ka-chunk, allocate memory, 00:24:32.120 |
You didn't actually say that, he's nicer than that. 00:24:34.560 |
But I could tell he wished like I would just stop being cheap 00:24:39.440 |
But what's funny is we actually sat there working 00:24:41.280 |
on that little detail for a few days before launch. 00:24:50.800 |
for all these people who didn't have nice phones, 00:25:01.880 |
what's the cool computer science problem they can solve, 00:25:22.920 |
And listen, if it happens to involve something complex 00:25:30.520 |
those experiences are just sitting there waiting to be built 00:25:37.520 |
But everyone is just like so stuck in their own head 00:25:44.800 |
- And you also, maybe to flip the loss function there is, 00:25:48.200 |
you're trying to minimize the number of times 00:25:55.440 |
when you go to the next photo, it freezes for a little bit. 00:25:58.200 |
So it's almost, as opposed to maximizing pleasure, 00:26:00.720 |
it's probably easier to minimize the number of like, 00:26:06.440 |
And as we all know, you just make the pleasure negative 00:26:13.080 |
- We're mapping this all back to neural networks. 00:26:16.560 |
Which is, I don't know a lot about machine learning, 00:26:25.920 |
and planning out more than the greedy single experience, 00:26:43.720 |
But like, what is the right way to onboard someone? 00:26:52.640 |
So not just saying, oh, did the photo load slowly 00:26:55.640 |
a couple times, or did they get a great photo 00:27:00.080 |
But like, what are the things that are gonna make 00:27:10.120 |
not just minimize bad experiences in the short run, 00:27:17.880 |
And I'm not gonna claim that I thought that way at all 00:27:23.120 |
But if I were going back and giving myself any advice, 00:27:25.560 |
it would be thinking, what are those second order effects 00:27:30.360 |
And it turns out having your friends on the service 00:27:34.840 |
So starting with a very small group of people 00:27:39.920 |
which we did, we seeded the community very well, I think, 00:27:55.200 |
building a certain kind of community within the app. 00:27:57.880 |
See, I wasn't sure what exactly you meant by that 00:28:01.920 |
Maybe you can elaborate, but as I understand now, 00:28:04.400 |
it can literally mean get your friends onto the app. 00:28:11.960 |
You can build an amazing restaurant or bar or whatever, 00:28:16.680 |
right, but if you show up and you're the only one there, 00:28:20.280 |
is it like, does it matter how good the food is? 00:28:41.360 |
I'm gonna say genius, even though it wasn't really genius, 00:29:00.880 |
I didn't even realize that this thing was a social network 00:29:27.000 |
- Yeah, but the thing is, when you say friends, 00:29:29.560 |
I mean, we're not necessarily referring to friends 00:29:32.720 |
So you're not bringing your physical friends with you. 00:29:40.120 |
that it's almost like building any kind of community. 00:30:00.320 |
But the fact that you could connect with people 00:30:02.160 |
who took really beautiful photos in a certain style 00:30:05.320 |
all around the world, whether they were travelers, 00:30:07.320 |
it was the beginning of the influencer economy. 00:30:11.400 |
It was these people who became professional Instagrammers 00:30:24.120 |
And all of a sudden you had this moment in the day 00:30:27.120 |
and sure, you could see what your friends were doing, 00:30:43.920 |
You didn't have to speak English to use it, right? 00:30:54.400 |
And even if our translations were pretty poor, 00:30:57.840 |
the idea that you could just connect with other people 00:31:07.120 |
Like what programming language you were talking about? 00:31:16.280 |
The only thing that was complex about Instagram 00:31:18.680 |
at the beginning, technically, was making it scale. 00:31:23.480 |
And we were just plain old Objective-C for the client. 00:32:01.240 |
'cause it's all these machine learning libraries 00:32:22.520 |
still exists and people use for basically the backend. 00:32:27.280 |
And then you threw a couple interesting things in there. 00:32:32.200 |
It was a little bit like hipster database at the time. 00:32:41.040 |
But we used it because it had a bunch of geo features built 00:32:45.600 |
in, because we thought we were gonna be a check-in app. 00:32:51.440 |
and you were into Postgres before it was cool. 00:32:55.040 |
not only hipster photo company, hipster tech company. 00:33:00.120 |
We also adopted Redis early and like loved it. 00:33:13.960 |
There was nothing, no machine learning at all, zero. 00:33:20.760 |
Are we talking about a hundred users, a thousand users? 00:33:30.960 |
I mean, seriously, like you can get away with a lot 00:33:38.400 |
Like I think a lot of startups try to over-engineer 00:33:41.080 |
their solutions from the beginning to like really scale 00:33:45.680 |
That being said, most of the first two years of Instagram 00:33:48.600 |
was literally just trying to make that stack scale. 00:33:54.160 |
It was like, literally just like, where do we put the data? 00:34:11.200 |
- Can you speak to the choices you make at that stage 00:34:18.480 |
computer infrastructure or do you build in-house? 00:34:22.640 |
- I'm only laughing because we, when we launched, 00:34:35.920 |
When I worked at a company called Odeo that became Twitter, 00:34:38.520 |
I remember visiting our space in San Francisco. 00:34:41.200 |
You walked in, you had to wear the ear things 00:34:49.920 |
And I was the intern, so I just like held things. 00:34:52.720 |
But I thought to myself, oh, this is how it goes. 00:35:02.280 |
and they were like, oh, how are things going? 00:35:03.560 |
We're like, ah, you know, try to scale this thing. 00:35:17.760 |
how deep in it we were because we had no idea 00:35:24.400 |
Anyway, that night we went back to the office 00:35:27.000 |
and we got on AWS, but we did this really dumb thing 00:35:32.960 |
but we brought up an instance, which was our database. 00:35:37.960 |
It was gonna be a replacement for our database. 00:35:40.920 |
But we had it talking over the public internet 00:35:43.720 |
to our little box in LA that was our app server. 00:36:07.920 |
We made a bunch of really dumb mistakes initially. 00:36:10.640 |
I think the question is how quickly do you learn 00:36:13.600 |
And do you do the right thing immediately right after? 00:36:16.360 |
- So you didn't pay for those mistakes by failure. 00:36:23.600 |
I guess there's a lot of ways to sneak up to this question 00:36:37.680 |
that's easy to spread across a large number of instances. 00:36:51.600 |
Just find product market fit, duct tape it together. 00:36:56.600 |
I think there's a big caveat here, which I wanna get to. 00:36:59.520 |
But generally all that matters is product market fit. 00:37:06.880 |
Do not worry about when 50,000 people use your product 00:37:10.280 |
because you will be happy that you have that problem 00:37:18.640 |
where they go from nothing to something overnight 00:37:35.040 |
worry way too much about scaling way too early 00:37:38.000 |
and forget that they actually have to make something 00:37:48.000 |
I mean, hiring quickly people who have seen the game before 00:38:11.400 |
when we wanted to rewrite parts of the product 00:38:14.600 |
and know that we weren't breaking something else. 00:38:17.240 |
Tests are one of those things where it's like, 00:38:27.000 |
when you don't want them to break and they're annoying 00:38:30.440 |
But looking back, I think that like long-term optimal, 00:38:38.760 |
because anyone could touch any part of the product 00:38:41.920 |
and know that they weren't gonna bring down the site 00:38:45.880 |
- At which point do you know product market fit? 00:38:50.520 |
Is it all it takes is like 10 people or is it a thousand? 00:39:03.840 |
So, it depends how big your business is trying to be. 00:39:08.840 |
But if I were signing up a thousand people a week 00:39:18.360 |
And even like as you started getting more people 00:39:27.360 |
or totally depends what type of business you're in. 00:39:39.640 |
So, I spent a lot of time when I work with startups 00:39:44.360 |
have you looked at that cohort versus this cohort, 00:39:48.440 |
or whether it's people signing up for the service. 00:39:52.280 |
But a lot of people think you just have to hit some mark 00:39:57.800 |
but really seven-ish billion people in the world, 00:40:01.960 |
most people forever will not know about your product. 00:40:05.320 |
There are always more people out there to sign up. 00:40:07.880 |
It's just a question of how you turn on the spigot. 00:40:20.320 |
Or do you just try to find product market fit 00:40:24.280 |
and get a lot of users to enjoy using your thing? 00:40:35.440 |
he was one of our earlier investors and he was saying, 00:40:39.120 |
hey, like, have you been doing any angel investing lately? 00:40:42.560 |
I'm just like focused on what I want to do next. 00:40:44.760 |
And he said, the number of financings have just gone bonkers 00:40:51.760 |
like people are throwing money everywhere right now. 00:41:00.640 |
do you have an inkling of how you're going to make money? 00:41:04.200 |
Or are you really just like waving your hands? 00:41:07.440 |
I would not like to be an entrepreneur in the position of, 00:41:11.480 |
well, I have no idea how this will eventually make money. 00:41:19.160 |
let's say you wanted to start a social network, right? 00:41:22.920 |
Not saying this is a good idea, but if you did, 00:41:25.920 |
there are only a handful of ways they've made money 00:41:27.800 |
and really only one way they've made money in the past, 00:41:30.920 |
So, you know, if you have a service that's amenable to that, 00:41:36.800 |
and then I wouldn't worry too much about that 00:41:40.920 |
you can hire some smart people and figure that out. 00:41:49.080 |
especially the ones doing like enterprise software, 00:41:54.880 |
to be worried about money from the beginning, 00:42:01.640 |
Of course you need to be worried about money, 00:42:12.120 |
If you have a roadmap to that, then that's great. 00:42:17.480 |
maybe never, like we're working on this Metaverse thing, 00:42:30.440 |
you said you're not saying it's necessarily a good idea 00:42:46.840 |
or a Twitter or an Instagram and maybe even greater scale? 00:42:56.400 |
- Yeah, if I knew, I'd probably be doing it right now 00:43:00.160 |
- I mean, there's a lot of ways to ask this question. 00:43:03.640 |
One is create a totally new product market fit, 00:43:13.160 |
or literally out-compete Facebook at its own thing 00:43:23.840 |
is to look for the cracks, look for the openings. 00:43:31.080 |
I mean, no one competed with the core business of Google. 00:43:33.480 |
No one competed with the core business of Microsoft. 00:43:56.040 |
Or that Facebook wasn't doing or not paying attention to 00:44:13.840 |
I can't remember the name, but talk about moats 00:44:19.720 |
is where your competitor like literally can't pivot 00:44:22.240 |
because structurally they're set up not to be there. 00:44:30.480 |
do you know how many people are like, images? 00:44:52.840 |
but wait, now another image-based social network 00:44:58.800 |
Like the prior, so you asked me, is it possible? 00:45:05.680 |
is because my prior is that it's happened once every, 00:45:09.600 |
I don't know, three, four or five years consistently. 00:45:13.000 |
And I can't imagine there's anything structurally 00:45:22.920 |
and there's no reason to believe that's going to stop. 00:45:25.560 |
- And it's subtle too, because like you said, 00:45:32.640 |
but there's something fundamentally different 00:45:37.600 |
and a five second video and a 15 second video 00:45:45.000 |
I mean, I think one of the reasons Snapchat exists 00:45:50.160 |
on posting great, beautiful manicured versions 00:46:03.400 |
I just like wish I could share something going on in my day. 00:46:06.760 |
Like, do I really have to put it on my profile? 00:46:12.760 |
And that opened up a door, it created a market, right? 00:46:16.640 |
And then what's fascinating is Instagram had an explore page 00:46:20.440 |
for the longest time and it was image driven, right? 00:46:28.360 |
That is effectively TikTok, but obviously focused on videos. 00:46:32.520 |
And it's not like you could just put the explore page 00:46:39.720 |
These are the hard parts about product development 00:46:44.200 |
but they're all versions of the same thing with varying, 00:46:49.000 |
if you line them up in a bunch of dimensions, 00:46:54.320 |
they're different values of the same dimensions, 00:46:57.000 |
which is like, I guess, easy to say in retrospect. 00:46:59.680 |
But like, if I were an entrepreneur going after that area, 00:47:05.160 |
What needs to exist because TikTok exists now? 00:47:08.480 |
- So I wonder how much things that don't yet exist 00:47:18.040 |
So in the space of how the feed is generated. 00:47:21.680 |
So we kind of talk about the actual elements of the content. 00:47:37.240 |
Because a lot of the criticism towards social media 00:47:47.360 |
there's certainly a hunger for social media algorithms 00:47:56.840 |
I don't think anyone, everyone's like complaining, 00:48:04.160 |
but I keep doing it 'cause I'm addicted to it. 00:48:19.320 |
I wonder if it's possible to disrupt in that space. 00:48:36.720 |
Because that's why you call it a social network. 00:48:39.720 |
But what does social networks actually do for you? 00:48:47.520 |
hey, there's this site, it's a social network. 00:48:54.400 |
One is that people you know and have social ties with 00:48:59.080 |
distribute updates through whether it's photos or videos 00:49:18.120 |
which is there's all this content out in the world 00:49:20.840 |
that's entertaining, whether you wanna watch it 00:49:25.160 |
And matchmaking between content that exists in the world 00:49:38.480 |
But my point is it could be video, it could be text, 00:49:51.800 |
Like they basically distributed interesting content to you. 00:50:03.560 |
because I think people are part of the root cause 00:50:07.200 |
So for instance, often in recommender systems, 00:50:13.760 |
which is just like of our vast trove of stuff 00:50:18.400 |
what small subset should we pick for you, okay? 00:50:28.360 |
Then there's a ranking step, which says, okay, 00:50:42.520 |
Step one is we've limited everything you could possibly see 00:50:46.840 |
to things that your friends have chosen to share 00:50:52.320 |
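The two-step pipeline described here, candidate generation narrowing the corpus, then ranking scoring the survivors, can be sketched in miniature. The posts, scores, and field names are invented for illustration.

```python
# Hypothetical two-step recommender matching the description: step one limits
# everything you could see to what friends shared; step two ranks the rest.

posts = [
    {"id": 1, "author": "friend_a", "predicted_like": 0.9},
    {"id": 2, "author": "friend_b", "predicted_like": 0.4},
    {"id": 3, "author": "stranger", "predicted_like": 0.99},
]

def candidate_generation(posts, following):
    # Step one: restrict the candidate set to accounts you follow.
    return [p for p in posts if p["author"] in following]

def rank(candidates):
    # Step two: order the survivors by a predicted engagement score.
    return sorted(candidates, key=lambda p: p["predicted_like"], reverse=True)

feed = rank(candidate_generation(posts, following={"friend_a", "friend_b"}))
print([p["id"] for p in feed])   # the stranger's post is never even considered
```

Note how the stranger's post, despite the highest predicted score, is filtered out in step one; that is exactly the structural limit Systrom contrasts with TikTok's approach.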
What things do people generally want to share? 00:50:54.600 |
They wanna share things that are gonna get likes, 00:51:27.440 |
And that's something I've been thinking a lot about 00:51:38.080 |
from accounts that you've followed for whatever reason? 00:51:40.960 |
And TikTok, I think has done a wonderful job here, 00:51:47.200 |
And if you produce something fascinating, it'll go viral. 00:51:50.920 |
But like, you don't have to be someone that anyone knows. 00:51:54.800 |
You don't have to have built up a giant following. 00:52:00.080 |
You don't have to try to maintain those followers. 00:52:01.880 |
You literally just have to produce something interesting. 00:52:04.680 |
That is, I think, the future of social networking. 00:52:11.720 |
it's far less about people manipulating distribution 00:52:15.880 |
and far more about what is like, is this content good? 00:52:32.120 |
- So that's almost like creating an internet. 00:52:33.640 |
I mean, that's what Google did for web pages. 00:52:54.520 |
and have people discover stuff on that new internet 00:53:05.120 |
Like I always talked about, as I was studying this stuff, 00:53:08.480 |
I would always use the word query and document. 00:53:10.520 |
So I was like, why are they saying query and document? 00:53:22.120 |
I'm not gonna claim to know how Instagram or Facebook 00:53:26.200 |
but you know, if you want to find a match for a query, 00:53:30.320 |
the query is actually the attributes of the person, 00:53:36.760 |
maybe some kind of summarization of their interests. 00:53:43.520 |
And by the way, documents don't have to be texts. 00:53:48.240 |
I don't know what the limit is on TikTok these days. 00:54:03.240 |
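The query/document framing he mentions can be sketched as similarity in a shared embedding space: the "query" is a vector of user attributes, each "document" (a post or video) lives in the same space, and matchmaking is picking the highest-scoring document. All vectors below are toy values, not any real system's embeddings.

```python
# Illustrative sketch of query/document matching: score every document against
# the user's attribute vector and surface the best match.

def dot(a, b):
    """Similarity as a plain dot product between two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

user_query = [0.9, 0.1, 0.5]     # e.g. interests: cooking, sports, travel

documents = {
    "recipe_video": [0.8, 0.0, 0.2],
    "match_highlights": [0.0, 0.9, 0.1],
    "city_guide": [0.3, 0.1, 0.9],
}

best = max(documents, key=lambda d: dot(user_query, documents[d]))
print(best)
```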
I mean, I have spent a lot of time thinking about this 00:54:06.280 |
and I don't claim to have mastered it at all, 00:54:08.840 |
but I think it's so fascinating about where that will go 00:54:18.160 |
So the other thing from an alien perspective, 00:54:40.680 |
the best of that person or the worst of that person. 00:54:50.840 |
When we look at the big picture of our lives, 00:54:59.320 |
Am I proud that I experienced or read those things 00:55:04.840 |
Just in that kind of self-reflective kind of way. 00:55:10.240 |
I wonder if it's possible to have different metrics 00:55:26.920 |
And that feels like it's actually strongly correlated 00:55:34.760 |
but in the long-term, it's like the same kind of thing 00:55:37.880 |
where you really fall in love with a product. 00:55:51.640 |
understanding that you're a better human being 00:55:55.080 |
And that's what great relationships are made from. 00:56:01.680 |
and we like being together or something like that. 00:56:09.280 |
that could be optimized for by the algorithms. 00:56:12.160 |
But anytime I kind of talk about this with anybody, 00:56:21.040 |
by the engagement, if it's ad-driven especially." 00:56:31.680 |
One is to pull back the curtain on daily meetings 00:56:36.680 |
inside of these large social media companies. 00:56:42.880 |
or at least the people that are tweaking these algorithms 00:56:47.920 |
And there's these things called value functions, 00:56:50.080 |
which are like, "Okay, we can predict the probability 00:56:58.240 |
or the probability that you will leave a comment on it, 00:57:09.480 |
that basically has a bunch of heads at the end 00:57:18.240 |
And then in these meetings, what they will do is say, 00:57:29.160 |
And there may be even some downstream thing, right? 00:57:40.080 |
"Well, what are our goals for this quarter at the company? 00:57:50.640 |
Into a single scalar which they're trying to optimize. 00:57:59.840 |
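The value function being described — per-action probability heads collapsed into one scalar via goal-driven weights — looks roughly like this in miniature. The action names and weights are hypothetical examples chosen for illustration, not any company's actual model:

```python
def rank_score(predictions, weights):
    """Combine per-action engagement predictions (the 'heads' of a
    multi-task model) into one scalar ranking score via a weighted sum.
    Quarterly goals show up as the weights."""
    return sum(weights[action] * p for action, p in predictions.items())

# Predicted probabilities that a user takes each action on a post.
preds = {"like": 0.30, "comment": 0.05, "share": 0.02}
# Hypothetical weights: comments valued 5x a like, shares 10x.
weights = {"like": 1.0, "comment": 5.0, "share": 10.0}
score = rank_score(preds, weights)   # 0.30 + 0.25 + 0.20 = 0.75
```

Posts are then sorted by this scalar, which is why retuning the weights in those meetings changes what a billion people see.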
something called meaningful interactions, right? 00:58:04.160 |
And I don't actually have any internal knowledge. 00:58:24.560 |
but what I will say is that trade-offs abound. 00:58:28.760 |
And as much as you'd like to solve for one thing, 00:58:31.680 |
if you have a network of over a billion people, 00:58:34.680 |
you're going to have unintended consequences either way. 00:58:38.840 |
So what you're describing is effectively a value model 00:58:43.400 |
this is the thing that I spent a lot of time thinking about. 00:58:48.600 |
in a way that like actually measures someone's happiness? 00:58:58.160 |
well, I kind of think like the more you use the product, 00:59:02.960 |
That was always the argument at Facebook, by the way. 00:59:09.240 |
Turns out there are like a lot of things you use more 00:59:14.040 |
just let's think about whether it's gambling or whatever, 00:59:22.760 |
obviously you can't map utility and time together easily. 00:59:28.800 |
So when you look around the world and you say, 00:59:30.440 |
well, what are all the ways we can model utility? 00:59:34.360 |
please, if you know someone smart doing this, 00:59:42.680 |
like everyone interesting I know in machine learning, 00:59:46.440 |
like I was really interested in recommender systems 00:59:57.640 |
- You just made people at OpenAI and DeepMind very happy. 01:00:01.120 |
- But I mean, but what's interesting is like, 01:00:06.640 |
I mean that paper that where they just took Atari 01:00:09.200 |
and they used a ConvNet to basically just like 01:00:15.080 |
Absolutely mind blowing, but it's a game, great. 01:00:24.400 |
Like how can you construct that feed in such a way 01:00:29.400 |
that optimizes for a diversity of experience, 01:00:38.400 |
it turns out in reinforcement learning, again, 01:00:41.160 |
as I've learned, like reward design is really hard. 01:00:45.160 |
And I don't know, like how do you design a scalar reward 01:00:51.280 |
I mean, do you have to measure dopamine levels? 01:00:59.440 |
Currently it feels like there's not enough signals 01:01:01.640 |
coming from the human being users of this algorithm. 01:01:19.960 |
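One way to make the reward-design difficulty concrete: even adding a single "diversity of experience" term to a feed's session reward involves arbitrary choices. The sketch below penalizes near-duplicate items via average pairwise similarity; the weight and the redundancy measure are assumptions for illustration, not a validated objective:

```python
import numpy as np

def session_reward(engagements, item_vecs, diversity_weight=0.5):
    """A toy scalar reward for one feed session: total engagement minus
    a penalty for showing near-duplicate items."""
    engagement = float(np.sum(engagements))
    # Redundancy: mean off-diagonal cosine similarity of items shown.
    v = item_vecs / np.linalg.norm(item_vecs, axis=1, keepdims=True)
    sims = v @ v.T
    n = len(v)
    redundancy = (sims.sum() - n) / (n * (n - 1))
    return engagement - diversity_weight * redundancy

# Two sessions with identical engagement; the varied one scores higher.
varied = np.array([[1.0, 0.0], [0.0, 1.0]])
samey  = np.array([[1.0, 0.0], [1.0, 0.01]])
r_varied = session_reward([1, 1], varied)
r_samey  = session_reward([1, 1], samey)
```

Every term like this trades off against raw engagement, which is exactly the "trade-offs abound" problem: the scalar hides the judgment calls inside it.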
rethinking the algorithm, rethinking reward functions, 01:01:37.400 |
when Instagram, along with its massive employee base 01:01:42.400 |
of 13 people was sold to Facebook for $1 billion. 01:01:48.000 |
What was the process like on a business level, 01:01:52.120 |
What was that process of selling to Facebook like? 01:01:58.080 |
which is I worked in corporate development at Google, 01:02:08.080 |
And I had sat through so many of these meetings 01:02:12.080 |
We actually, fun fact, we never acquired a single company 01:02:15.720 |
So I can't claim that I had like a lot of experience, 01:02:53.000 |
and we're kind of like the hot startup at the time. 01:02:57.400 |
And I remember going into a specific VC and saying, 01:03:04.520 |
And I've never seen so many jaws drop, all in unison, right? 01:03:04.520 |
And I was like thanked and walked out the door 01:03:16.600 |
from someone who was connected to them and said, 01:03:25.000 |
And I can't tell if that was like just negotiating or what, 01:03:30.000 |
but it's true, like no one offered us more, right? 01:03:44.480 |
well, literally no one's biting at 500 million. 01:03:47.000 |
And eventually we would get Sequoia and Greylock 01:03:50.480 |
and others together at 500 million basically post. 01:03:54.520 |
It was 450 pre, I think we raised $50 million. 01:04:02.800 |
Like, I don't know if it was because we were just coming out 01:04:05.280 |
of the 2008 hangover and things were still in recovery mode. 01:04:11.480 |
But then along comes Facebook and after some negotiation, 01:04:16.480 |
we 2X'd the number from half a billion to a billion. 01:04:30.200 |
We thought we could, their resources would help us scale it. 01:04:58.760 |
And thus began the giant long bull run of 2012 01:05:07.840 |
about like how many unicorns exist and it's absurd. 01:05:11.880 |
But then again, never underestimate technology 01:05:16.560 |
And man, costs have dropped and, man, scale has increased. 01:05:19.880 |
And you can make businesses make a lot of money now, 01:05:29.760 |
to sell a company with 13 people for a billion dollars? 01:05:38.160 |
and say 500 million or 1 billion with Mark Zuckerberg? 01:05:49.440 |
- Especially like you said before the unicorn parade. 01:05:58.800 |
- You were at the head of the unicorn parade. 01:06:01.680 |
- It's the, yeah, it's a massive unicorn parade. 01:06:14.560 |
I mean, it's not, you couldn't mark to market 01:06:18.040 |
You didn't quite understand what it would be worth 01:06:23.040 |
So in a market, an illiquid market where you have one buyer 01:06:25.640 |
and one seller and you're going back and forth. 01:06:31.320 |
who were willing to invest at a certain valuation. 01:06:54.880 |
restaurants, whatever, like for a good six months after, 01:07:01.040 |
a lot of attention on the product, a lot of attention. 01:07:06.360 |
And you're like, wait, like I made a lot of money, 01:07:12.520 |
I don't know, like I don't really keep in touch with Mark, 01:07:17.920 |
is not exactly the most happy job in the world. 01:07:21.720 |
and it's really tough when you're in the limelight. 01:07:24.400 |
So the decision itself was like, oh, cool, this is great. 01:07:40.800 |
It always, I've heard like Olympic athletes say 01:07:48.320 |
Is it something like that where it feels like 01:07:52.960 |
it was kind of like a thing you were working towards? 01:08:08.480 |
I'm still human, I still have all the same problems. 01:08:11.760 |
Or is it just like negative attention of some kind? 01:08:16.080 |
But to be clear, there was a lot of happiness 01:08:20.840 |
Like we won the Superbowl of startups, right? 01:08:32.920 |
Of course we want to create great things that people love, 01:08:43.960 |
So they call it the we have arrived syndrome, 01:08:56.120 |
And I remember we had a product manager leave very early on 01:09:00.440 |
And he said to me, I just don't believe I can learn anything 01:09:12.400 |
So from 2012, all the way to the day I left in 2018, 01:09:18.960 |
with which I realized, oh, we thought we won. 01:09:23.560 |
but like there are a hundred billion dollar companies. 01:09:26.200 |
And by the way, on top of that, we had no revenue. 01:09:34.960 |
And then competitors and how fun was it to fight Snapchat? 01:09:38.360 |
Oh my God, like it was, it's like Yankees, Red Sox. 01:09:47.920 |
but the amount you can learn through that process, 01:09:52.720 |
what I've realized in life is that there is no, 01:09:58.680 |
there's always more challenge, just at different scales. 01:10:13.160 |
I say like, choose the game you like to play, right? 01:10:19.680 |
you enjoy to some extent practicing basketball. 01:10:30.480 |
which is we might've sold, but it was like, great. 01:10:40.240 |
Now I imagine you didn't ask this, but okay, so I left. 01:10:43.560 |
There's a little bit of like, what do you do next? 01:10:46.680 |
Right, like, what do you, how do you top that thing? 01:10:50.720 |
The question is like, when you wake up every day, 01:10:58.200 |
we all turn into dirt, doesn't matter, right? 01:11:09.800 |
learning like short-term versus long-term objectives. 01:11:16.560 |
what you're doing, knowing that it's gonna be painful? 01:11:21.280 |
Knowing that like, no matter what you choose, 01:11:25.560 |
Whether you sit on a beach or whether you manage 01:11:28.480 |
a thousand people or 10,000, it's gonna be painful. 01:11:38.560 |
and it's a maturation process you just have to go through. 01:11:47.120 |
how much money you make, you have to wake up the next day 01:11:50.120 |
and choose the hard life, whatever that means next. 01:11:56.720 |
- I guess what I'm trying to say is slightly different, 01:12:07.800 |
And then you just start worrying about stupid stuff. 01:12:09.960 |
Like, I don't know, like did so-and-so forget 01:12:16.240 |
Like, or, oh, I'm so angry and my shipment of wine 01:12:19.400 |
didn't show up and I'm sitting here on the beach 01:12:27.760 |
- Yeah, and at least meditation is like productive chilling, 01:12:32.600 |
to calm down and be, but backing up for a moment. 01:12:37.320 |
You might as well be like playing in the game 01:12:54.720 |
And the other way of looking at this, by the way, 01:13:10.360 |
the second one or the third one, not that great. 01:13:15.160 |
or everyone forgets the second and the third one. 01:13:17.520 |
So there's just this constant process of like, 01:13:26.800 |
has been this awesome new chapter of being like, 01:13:30.640 |
man, that's something I didn't understand at all. 01:13:33.640 |
And now I feel like I'm 1/10 of the way there. 01:13:40.240 |
So I distracted us from the original question. 01:13:42.600 |
- No, and we'll return to the machine learning 01:13:44.560 |
'cause I'd love to explore your interest there. 01:13:46.520 |
But I mean, speaking of sort of challenges and hard things, 01:13:50.040 |
is there a possible world where sitting in a room 01:14:05.640 |
What was the calculation that you were making? 01:14:26.360 |
I'm not sure you'll believe me when I say this, 01:14:36.160 |
and get out to a lot of people really, really quickly 01:14:39.560 |
with the support, with the talent of a place like Facebook? 01:14:44.560 |
I mean, people often ask me what I would do differently 01:14:49.040 |
And I say, well, I'd probably hire more carefully 01:14:51.240 |
'cause we showed up just like before I knew it, 01:14:53.760 |
we had like 100 people on the team and 200, then 300. 01:14:57.800 |
I don't know where all these people were coming from. 01:15:02.400 |
They were just like internal transfers, right? 01:15:05.120 |
So it's like relying on the Facebook hiring machine, 01:15:07.480 |
which is quite sort of, I mean, it's an elaborate machine. 01:15:27.280 |
when Ev sold Blogger to Google and then Google went public. 01:15:30.920 |
Remember, we sold before Facebook went public. 01:15:34.620 |
There was a moment at which the stock price was $17, 01:15:40.200 |
I remember thinking, what the did I just do, right? 01:15:43.360 |
Now at 320-ish, I don't know where we are today, 01:15:48.200 |
but like, okay, like the best thing by the way is like, 01:15:53.200 |
when the stock is down, everyone calls you a dope. 01:15:56.800 |
And then when it's up, they also call you a dope, 01:16:04.680 |
- So, but you know, the choice is to strap yourself 01:16:12.800 |
That's a difficult choice because there's a world. 01:16:19.800 |
which is Elon and others decided to sell PayPal 01:16:25.140 |
I mean, how much was it, about a billion dollars? 01:16:26.920 |
I can't remember. - Something like that, yeah. 01:16:28.400 |
- I mean, it was early, but it's worth a lot more now. 01:16:34.800 |
If you are an entrepreneur and you own a controlling stake 01:16:47.480 |
as a personality attached to this company, right? 01:16:50.920 |
But if you sell it and you're getting yourself 01:16:53.000 |
enough capital and you like have enough energy, 01:16:57.980 |
Or in Elon's case, like a bunch of other things. 01:16:59.960 |
I don't know, like I lost count at this point. 01:17:09.600 |
But I don't know, like, do you think Elon like cares about, 01:17:15.240 |
Like I just, he is, he created a massive capital 01:17:25.480 |
because when you are an entrepreneur attached to a company, 01:17:28.400 |
you gotta stay at that company for a really long time. 01:17:51.160 |
- There are not a lot of companies that you can be part of 01:18:10.540 |
and he wants to press the sign up button on your product. 01:18:17.420 |
The number of doors and experiences that that opened up was, 01:18:23.920 |
I mean, the people I got to meet, the places I got to go, 01:18:40.840 |
It wasn't about the money, it was about the game. 01:18:46.840 |
Is that, so here's, I often think about this, 01:18:49.360 |
like how hard is it for an entrepreneur to say no? 01:18:54.880 |
so every, like basically the sea of entrepreneurs 01:19:01.680 |
The thing you were sitting before is a dream. 01:19:24.440 |
- And so that's an interesting decision to say, 01:19:32.120 |
The reason I also say it's an interesting decision 01:19:34.520 |
because it feels like per our previous discussion, 01:19:37.800 |
if you're launching a social network company, 01:19:43.800 |
whatever that number is, if you're successful. 01:19:50.780 |
with one of the social media, social network companies, 01:19:53.800 |
that want to buy you, whether it's Facebook or Twitter, 01:20:13.680 |
How many have successfully made that decision, I guess, 01:20:28.600 |
There are also companies that don't sell, right? 01:20:31.400 |
And take the path and say, we're going to be independent. 01:20:46.840 |
like $110 million offer from Google or something. 01:20:52.660 |
And I remember there was like this big TechCrunch article 01:20:58.440 |
after talking deeply about their values and everything. 01:21:01.760 |
And I don't know the inner workings of Foursquare, 01:21:05.720 |
but like, I'm certain there were many conversations 01:21:11.760 |
Recently, like, I mean, what other companies? 01:21:16.960 |
Like, I don't know, maybe people were really interested 01:21:20.560 |
Like, there are plenty of moments where people say no 01:21:36.280 |
So I don't know, I used to think a lot about this 01:21:38.920 |
and now I just don't because I'm like, whatever. 01:21:58.440 |
why not just be like, I don't know, it was a game, 01:22:10.720 |
and it's funny you brought up Clubhouse, it is very true. 01:22:14.200 |
It seems like Clubhouse is, you know, on the downward path 01:22:19.480 |
and it's very possible to see a billion plus dollar deal 01:22:23.240 |
at some stage, maybe like a year ago, half a year ago, 01:22:28.520 |
I think Facebook was flirting with that idea too. 01:22:30.960 |
And I think a lot of companies probably were. 01:22:37.120 |
You know what, there's not like a badass public story 01:22:47.000 |
the success or the failure, more often failure. 01:22:58.560 |
I'd give them like a very fighting chance to figure it out. 01:23:02.440 |
There are a lot of times when people call Twitter 01:23:09.800 |
and not knowing anything about their internals, 01:23:11.800 |
like there's a strong chance they will figure it out 01:23:28.160 |
and then you've just never heard from them again, 01:23:36.200 |
I think it's easy for someone with enough money 01:23:51.560 |
like the number of zeros that you're talking about. 01:23:58.560 |
is not something many people are equipped to do, 01:24:01.360 |
especially not people where I think we had founded 01:24:03.880 |
the company a year earlier, maybe two years earlier, 01:24:31.640 |
Like we're, you know, now it's next shot, right? 01:24:49.200 |
like there's photos, there's one minute videos on Instagram, 01:24:54.200 |
there's IGTV, there's stories, there's reels, there's live. 01:25:04.320 |
because it feels like they're close together, 01:25:09.600 |
fundamentally distinct, like each of the things I mentioned. 01:25:15.920 |
the design philosophies behind some of these, 01:25:17.840 |
how you were thinking about it during the historic war 01:25:22.480 |
between Snapchat and Instagram, or just in general, 01:25:25.800 |
like this space of features that was discovered. 01:25:30.080 |
- There's this great book by Clay Christensen 01:25:39.280 |
but within it, there's effectively an expression 01:25:59.160 |
and you look at your product, you ask yourself, 01:26:11.680 |
and you hire products to be employees effectively. 01:26:39.920 |
With likes and, and it turns out that that will, 01:26:59.080 |
In fact, it serves it better than the original product 01:27:01.960 |
because when you're large and have an enormous audience, 01:27:06.120 |
you're worried about people seeing your stuff 01:27:11.600 |
is going to see your photo of you doing something. 01:27:13.480 |
And so it turns out that that is a more efficient way 01:27:16.680 |
of performing that job than the original product was. 01:27:26.400 |
Now, I will claim that other parts of the product over time 01:27:49.560 |
of like this job to be done that is in common. 01:27:52.080 |
And then there's just like different ways of doing it, right? 01:27:55.400 |
Apple, I think does a great job with this, right? 01:28:06.200 |
Even if they require like silly specific cords to work, 01:28:22.480 |
is that it has never fully, I think, appreciated the job 01:28:34.480 |
And then they're confused why it doesn't work. 01:28:44.320 |
So I remember I was a very early Facebook user. 01:28:47.520 |
The reason I was personally excited about Facebook 01:28:50.720 |
is because you can, first of all, use your real name. 01:29:00.840 |
I like anonymity for certain things, Reddit and so on, 01:29:09.280 |
so that I can connect with other friends of mine 01:29:13.640 |
And there's a reliable way to know that I'm real 01:29:20.000 |
And it's kind of like, I liked it for the reasons 01:29:27.140 |
But like without the form, like not everybody is dressed up 01:29:30.940 |
and being super polite, like more like with friends. 01:29:34.620 |
But then it became something much bigger than that, 01:29:49.420 |
that's not just about connecting directly with friends. 01:30:03.660 |
- Well, let's go back to the founding of Facebook 01:30:06.180 |
and why it worked really well initially at Harvard 01:30:19.840 |
that exist in the world that want to transact. 01:30:23.340 |
And when I say transact, I don't mean commercially. 01:30:25.380 |
I just mean they want to share, they want to coordinate, 01:30:28.420 |
they want to communicate, they want a space for themselves. 01:30:36.380 |
And if actually you look at the most popular products 01:30:42.100 |
if you look at things like Groups, Marketplace, 01:30:47.540 |
And Groups is effectively like everyone can found 01:30:50.380 |
their own little Stanford or Dartmouth or MIT, right? 01:30:52.900 |
And find each other and share and communicate 01:31:00.380 |
That is the core of what Facebook was built around. 01:31:03.880 |
And I think today is where it stands most strongly. 01:31:13.680 |
It feels like it's not a first-class citizen. 01:31:16.100 |
I know I may be saying something without much knowledge, 01:31:26.220 |
It feels like there needs to be a little bit more structure 01:31:33.860 |
Like Reddit is basically groups that are public and open 01:31:50.220 |
Facebook shines, I think, when it leans into that. 01:32:10.700 |
like even if you can compete with that other company, 01:32:13.060 |
even if you can figure out how to bolt it on, 01:32:16.240 |
eventually you come back and you look at the app 01:32:18.020 |
and you're like, "I just don't know why I opened this app. 01:32:24.620 |
I mean, you listed all the things at Instagram 01:32:43.100 |
but sorry, actually not use case, original job. 01:32:49.340 |
Being true to that and like being really good at it 01:32:56.140 |
I think that's how to make a company last forever. 01:33:03.900 |
about why Facebook is in the position it is today 01:33:07.580 |
is if they have had a series of product launches 01:33:16.340 |
I think they'd be in a totally different world. 01:33:36.180 |
And like both of these companies, by the way, 01:33:54.020 |
for what they're doing on these things," right? 01:34:03.780 |
it's really hard to keep consumers on your side. 01:34:11.260 |
I'm not even gonna place a judgment right here and right now. 01:34:13.340 |
I'm just gonna say that it is way better to be in a world 01:34:26.580 |
for Facebook has been fairly hard on this dimension. 01:34:40.100 |
- Yeah, I mean, TikTok isn't just literally an algorithm. 01:34:57.220 |
By the way, it's not defined by who you follow. 01:34:59.480 |
It's defined by some magical thing that, by the way, 01:35:02.700 |
to show you a certain type of content for some reason, 01:35:23.780 |
and sometimes even hated by many people in public? 01:35:38.100 |
even a more mysterious question for me, Bill Gates. 01:35:41.840 |
- What is the Bill Gates version of the question? 01:35:55.680 |
I think Mark Zuckerberg's distrust is the primary one, 01:36:05.580 |
but it's just, if you look at like the articles 01:36:14.500 |
it's confusing to me because it's like the public 01:36:27.340 |
"There's a strong case that founder-led companies 01:36:30.460 |
"have this problem and that a lot of Mark's issues 01:36:34.400 |
"today come from the fact that he is a visible founder 01:36:38.320 |
"with this story that people have watched in both a movie 01:36:42.900 |
"and they followed along and he's this boy wonder kid 01:36:45.440 |
"who became one of the world's richest people." 01:36:59.020 |
with enormous wealth and power for a variety of reasons. 01:37:07.400 |
one of which is a lot of people were really unhappy 01:37:10.400 |
about not the last election, but the last, last election. 01:37:37.200 |
Like Elon arguably, like he kept his factory open here 01:37:42.640 |
which arguably a lot of people would be against. 01:37:47.220 |
- While saying a bunch of crazy offensive things 01:37:52.360 |
- And like basically, gives the middle finger to the SEC, 01:38:02.920 |
- So I do think that the founder and slash CEO of a company 01:38:17.720 |
So, I mean, that's why it's interesting to ask you 01:38:20.120 |
because you were the founder and CEO of a social network. 01:38:30.780 |
even Jack Dorsey's this light, not to the degree, 01:38:45.940 |
I think you're right, but like he's not loved. 01:38:59.220 |
but like I really do think how a product makes someone feel, 01:39:12.740 |
like let's just go with this thesis for a second. 01:39:25.580 |
But in general, it delivers stuff quickly to you 01:39:36.700 |
But like people kind of feel that way about them. 01:39:42.860 |
But he's doing the same space stuff Elon's doing, 01:39:52.900 |
I don't have much proof other than my own intuition. 01:39:55.500 |
He is literally about living the future, Mars, space. 01:40:01.980 |
It's about going back to that feeling as a kid 01:40:08.180 |
People get behind that because it's a sense of hope 01:40:25.940 |
or they're living a FOMO type life where you're like, 01:40:34.100 |
by and large when I was there was not about FOMO, 01:40:39.900 |
although it certainly became that way closer to the end. 01:40:43.660 |
It was about the sense of wonder and happiness 01:40:53.580 |
- For the longest time, the way people explained to me, 01:40:59.380 |
If you want to go to make, feel good about life, 01:41:01.820 |
you go to Instagram to enjoy, celebrate life. 01:41:05.860 |
is they gave me the benefit of the doubt because of that. 01:41:10.020 |
is kind of makes you angry, it's where you argue. 01:41:15.180 |
that he wasn't actually the CEO for a very long time 01:41:19.260 |
So I'm not sure how much of the connection got made. 01:41:32.580 |
or it makes you not feel happy, I don't know, 01:41:36.740 |
like people are really angry about Comcast or whatever. 01:41:42.100 |
They had to rebrand, they became meta, right? 01:41:54.940 |
I think your thesis is very strong and correct, 01:42:00.100 |
but I still personally put some blame on individuals. 01:42:10.360 |
there's something about childlike wonder to him, 01:42:16.400 |
Something about, I think more so than others, 01:42:48.620 |
So your thesis really holds up for Steve Jobs, 01:42:53.060 |
because I think people didn't like Steve Jobs, 01:43:05.800 |
And if they keep delivering a product that people like 01:43:08.940 |
by being public and saying things that people like, 01:43:19.840 |
to explain to people why certain things happen, 01:43:32.480 |
or the certain kinds of like bullying happens, 01:43:36.440 |
which is like, it's human nature combined with algorithm. 01:43:42.560 |
how the spread of quote unquote misinformation happens. 01:43:53.720 |
And anything that looks at all like censorship 01:43:55.840 |
can create huge amounts of problems as a slippery slope. 01:44:03.200 |
And anytime you inject humans into the system, 01:44:12.760 |
First of all, design products that don't have those problems 01:44:15.200 |
and second of all, when they have those problems, 01:44:21.080 |
when you run a social network company, your job is hard. 01:44:26.000 |
I will say the one element that you haven't named 01:44:28.760 |
that I think you're getting at is just bedside manner. 01:44:38.600 |
had an uncanny ability in public to have bedside manner. 01:44:59.440 |
And I don't even know the name of the company. 01:45:07.440 |
- And you sit there and you're just like, what? 01:45:11.800 |
And if you lack that, or if that's just not intuitive to you, 01:45:21.520 |
And then add on top of that, missteps of companies, right? 01:45:25.780 |
I don't know if you have any friends from the past 01:45:50.400 |
where they feel like they don't trust the company 01:45:57.080 |
especially if you don't have that bedside manner. 01:46:00.000 |
And again, like I'm not attributing this specifically 01:46:02.520 |
to Mark because I think a lot of companies have this issue 01:46:06.320 |
where, one, you have to be trustworthy of the company 01:46:14.480 |
to be really relatable in a way that's very difficult 01:46:23.200 |
Jack does a pretty good job of this by being a monk. 01:46:34.380 |
Like I literally shared a desk like this with him at Odeo. 01:46:40.920 |
I remember he would leave early on like Wednesdays 01:46:49.360 |
I mean, money makes people like more creative 01:46:52.240 |
and more thoughtful, like extreme versions of themselves. 01:47:02.720 |
- We were working on some JavaScript together. 01:47:15.620 |
Yeah, I mean, not terrible, just like boring, 01:47:21.660 |
I'm surprised anyone paid me to be in the room as an intern 01:47:31.200 |
It was very helpful, so thank you if you're listening. 01:47:33.780 |
- I mean, is there Odeo that's a precursor to Twitter? 01:47:40.700 |
that this Jack Dorsey guy could be also a head 01:47:46.180 |
And second, did you learn anything from the guy that, 01:47:58.900 |
why does the world play its game in a certain way 01:48:05.800 |
Like, I mean, it's also weird that Mark showed up 01:48:18.160 |
It's a small world, but let me tell a fun story about Jack. 01:48:27.240 |
I think Ev was feeling like people weren't working hard 01:48:50.740 |
And I remember on a Friday, Ev decided he was going 01:49:01.360 |
We all put in a bucket and he tallied the votes. 01:49:21.400 |
We were a small company, but I, as the intern, 01:49:24.040 |
- So everybody knew how many votes they got individually? 01:49:27.100 |
And I think it was one of these self-accountability things. 01:49:37.100 |
like I couldn't imagine he would become what he'd become 01:49:41.120 |
But I had a profound respect that the new guy 01:49:53.860 |
But that's the one story that I remember of him, 01:49:55.680 |
like working with him specifically from that summer. 01:50:00.360 |
- There's kind of a pushback in Silicon Valley 01:50:06.120 |
the thing you admired to see the new guy working so hard, 01:50:10.280 |
that thing, what is the value of that thing in a company? 01:50:13.800 |
- See, this is, I like, just to be very frank, 01:50:17.760 |
Like I saw this really funny video on TikTok. 01:50:22.720 |
It was like, I'm taking a break from my mental health 01:50:27.560 |
So I was like, oh, it is kind of phrased that way, 01:50:35.000 |
I have worked so hard to do the things that I did. 01:50:51.720 |
I have a lot more gray hair now than I did back then. 01:51:07.200 |
because I don't actually think there's anything 01:51:09.040 |
in this world that says you have to work hard. 01:51:11.380 |
But I do think that great things require a lot of hard work. 01:51:16.400 |
So there's no way you can expect to change the world 01:51:22.880 |
the folks that I respect the most have nudged the world 01:51:25.160 |
in like a slight direction, slight, very, very slight. 01:51:29.520 |
Like even if Elon accomplishes all the things 01:51:34.000 |
he wants to accomplish, we will have nudged the world 01:51:37.040 |
in a slight direction, but it requires enormous amount. 01:51:40.160 |
There was an interview with him where he was just like, 01:51:43.200 |
he was interviewed, I think at the Tesla factory 01:51:51.240 |
but he was like visibly shaken about how hard 01:51:53.560 |
he had been working and he was like, this is bad. 01:51:55.880 |
And unfortunately, I think to have great outcomes, 01:51:57.640 |
you actually do need to work at like three standard 01:51:59.920 |
deviations above the mean, but there's nothing saying 01:52:09.960 |
And second, they certainly don't have to work hard. 01:52:12.800 |
- But I think hard work in a company should be admired. 01:52:22.960 |
you shouldn't feel good about yourself for not working hard. 01:52:34.360 |
I hate running, but I certainly don't feel good 01:52:42.960 |
There's certain values that you have in life. 01:52:47.640 |
is one of the things I think that should be admired. 01:53:03.120 |
It was helpful for me to get a sense of people from that. 01:53:13.880 |
but sometimes I'll say both if that's an option. 01:53:24.760 |
how to work smart and they're literally just lazy. 01:53:36.400 |
When you're younger and you say it's better to work smart, 01:53:54.320 |
because I'm too dumb to know how to work smart. 01:53:56.840 |
And people who are self-critical in this way, 01:53:59.120 |
in some small amount, you have to have some confidence. 01:54:03.880 |
that means you're going to actually eventually figure out 01:54:07.520 |
And then to actually be successful, you should do both. 01:54:14.040 |
which is that no one's forcing you to do anything. 01:54:22.280 |
So if you major in, I don't know, theoretical literature, 01:54:33.560 |
- Yeah, think about like theoretical Spanish lit 01:54:41.720 |
And then the number of people I went to Stanford with 01:54:44.280 |
who get out in the world and they're like, wait, what? 01:54:51.640 |
of people who have majored in esoteric things 01:54:54.800 |
So I just want to be clear, it's not about the major, 01:54:56.680 |
but every choice you make, whether it's to have kids, 01:55:09.360 |
And there's a reason why certain very successful people 01:55:14.520 |
but people who run very, very large companies or startups 01:55:33.080 |
I think it's important that the contract is clear, 01:55:36.120 |
which is to say, let's imagine you were joining 01:55:44.680 |
we're all working really, really hard right now. 01:55:57.920 |
Like, let's go back to the Yankees for a second. 01:56:05.760 |
or pitching practice or whatever for your position, right? 01:56:11.400 |
And that's actually the world that it feels like we live in 01:56:13.920 |
in tech sometimes, where people both want to work 01:56:18.480 |
but like don't actually want to work that hard. 01:56:23.280 |
Because if you sign up for some of these things, 01:56:29.000 |
There's so many wonderful careers in this world 01:56:33.760 |
But when I read about companies going to like four day 01:56:38.280 |
because I can't get enough done with a seven day week. 01:56:44.920 |
And it's like, no, I work pretty smart, I think in general. 01:56:49.320 |
if I hadn't like some amount of working smart. 01:56:55.880 |
I don't do it much anymore just because of kids 01:57:00.340 |
But one of the most important things to learn 01:57:06.140 |
But to me and to cyclists, like resting is a function 01:57:12.680 |
It's not like a thing that you do for its own merits. 01:57:17.880 |
your muscles don't recover, and then you're just not as, 01:57:21.520 |
- You should probably, the successful people I've known 01:57:27.160 |
but they know they have to do it for the longterm. 01:57:29.800 |
- They think their opposition is getting stronger 01:57:39.040 |
So, I mean, I use this thing called TrainingPeaks, 01:57:41.240 |
and it's interesting 'cause it actually mathematically shows 01:57:44.060 |
like where you are on the curve and all this stuff. 01:57:47.440 |
- But you have to rest, like you have to have that rest, 01:57:49.400 |
but it's a function of going harder for longer. 01:57:56.480 |
but a lot of people will hide behind laziness 01:57:58.580 |
by saying that they're trying to optimize for the long run, 01:58:00.860 |
and they're not, they're just not working very hard. 01:58:11.840 |
- And some of that requires for the leadership 01:58:13.880 |
to have the guts, the boldness to communicate effectively 01:58:17.920 |
I mean, sometimes I think most of the problems arise 01:58:20.760 |
from the fact that the leadership is kind of hesitant 01:58:40.620 |
And, you know, Ray at Bridgewater is always fascinating 01:58:44.220 |
because, you know, people, it's been called like a cult 01:58:52.820 |
They're like, "Listen, this is what it's like to work here. 01:58:59.260 |
And that's not going to feel right to everyone. 01:59:00.920 |
And if it doesn't feel right to you, totally cool. 01:59:04.880 |
But if you work here, you are signing up for this. 01:59:16.720 |
like no one's forcing you to work there, right? 01:59:23.560 |
kind of got stuck in a funny moment, which is, 01:59:28.340 |
to give me honest feedback of how I did on the interview. 01:59:39.220 |
did you accomplish what you were hoping to accomplish?" 02:00:00.320 |
which is, I'd love to spread some of the ideas 02:00:06.920 |
I was like, "All right." - Back to my original point, 02:00:13.920 |
And one of the other things I learned from him 02:00:18.820 |
man, humans really like to pretend like they've come to, 02:00:24.760 |
that they've come to some kind of meeting of the minds. 02:00:27.120 |
Like if there's conflict, if you and I have conflict, 02:00:30.160 |
it's always better to meet face-to-face, right? 02:00:40.700 |
even if we're not actually solving the conflict, 02:00:45.920 |
Oh yeah, it didn't really bother me that much. 02:00:50.880 |
But like, no, in our minds, we're still there. 02:00:54.360 |
- So this is one of the things that as a leader, 02:01:06.760 |
And you're like, oh, I'm going to be a leader. 02:01:09.200 |
And the idea is, well, I'm just going to ask people. 02:01:25.360 |
between me and a lot of other entrepreneurs is like, 02:01:28.280 |
I wasn't completely confident we could do it. 02:01:30.440 |
Like we could be alone and actually be great. 02:01:40.600 |
I have to be cocky and just say, I can do this on my own. 02:01:46.800 |
Some people do say that, and then it's not right. 02:01:57.440 |
- And also you talked about the personality of leaders 02:02:12.480 |
at least internally, was like, say when I screwed up. 02:02:15.880 |
And like, point out how I was wrong about things. 02:02:21.640 |
Everyone thinks they have to bat 1.000, right? 02:02:27.040 |
The best quant hedge funds in the world bat 50.001%. 02:02:45.000 |
The question is, how many at-bats do you get? 02:02:48.160 |
And on average, are you better on average, right? 02:02:57.640 |
I mean, Steve Jobs was wrong at a lot of stuff. 02:03:08.120 |
no one will ever want to watch video on the iPod. 02:03:26.200 |
- I actually think it's really valuable that, 02:03:31.520 |
where Instagram became worth like $300 billion 02:03:36.080 |
I kind of like that my life is relatively normal now. 02:03:41.800 |
I'm not making a claim that I live a normal life. 02:03:46.200 |
where there are like 15 Sherpas following me, 02:03:52.640 |
I actually like that I have a sense of humility of like, 02:03:56.560 |
I may not found another thing that's nearly as big, 02:04:04.440 |
I have to, we haven't talked about machine learning yet, 02:04:07.360 |
but my favorite thing is all these like famous, 02:04:11.280 |
you know, tech guys who have worked in the industry, 02:04:15.000 |
pontificating about the future of machine learning 02:04:31.760 |
or at least understand some of those fundamentals, 02:04:33.760 |
and the people that are just saying philosophical stuff 02:04:40.880 |
because the people who program are often not philosophers. 02:04:45.840 |
They can't write an op-ed for the Wall Street Journal. 02:04:49.080 |
- So like, it's nice to be both a little bit, 02:04:52.880 |
- My point is the fact that I have to learn stuff 02:04:59.640 |
- Yeah, I mean, again, I have a lot of advantages. 02:05:03.040 |
Like, but my point is it's awesome to be back in a game 02:05:19.240 |
but like I can't fast forward 10 years to know. 02:05:24.920 |
I have to ask you one last thing on Instagram. 02:05:34.960 |
of Instagram's harmful effect on teenage girls 02:05:38.720 |
as per their own internal research studies on the matter. 02:05:52.080 |
- You know, I often question where does the blame lie? 02:05:59.240 |
Is the blame at the people that originated the network, me, 02:06:06.760 |
Is the blame at like the decision to combine the network 02:06:10.520 |
with another network with a certain set of values? 02:06:14.420 |
Is the blame at how it gets run after I left? 02:06:20.500 |
Like, is it the driver or is it the car, right? 02:06:23.660 |
Is it that someone enabled these devices in the first place, 02:06:31.800 |
Or is it the users themselves, just human nature? 02:06:37.760 |
- Sure, and like the idea that we're going to find 02:06:45.680 |
And then the question is like, is it true at all, right? 02:06:50.800 |
or that it's true, but there's always more nuance here. 02:07:02.400 |
And I bet you there are a lot of positive effects 02:07:07.600 |
And where I've come to in my thinking on this 02:07:10.160 |
is that I think any technology has negative side effects. 02:07:13.400 |
The question is, as a leader, what do you do about them? 02:07:18.160 |
or do you just like not really believe in them? 02:07:26.240 |
we're going to acknowledge when there are true criticisms, 02:07:29.520 |
we're going to be vulnerable and that we're not perfect 02:07:33.640 |
and we're going to be held accountable along the way. 02:07:36.240 |
I think that people generally really respect that. 02:07:43.640 |
has had issues in the past is where they say things like, 02:07:46.840 |
can't remember what Mark said about misinformation 02:07:50.280 |
There was that like famous quote where he was like, 02:08:14.520 |
and what you say and don't believe you truly feel that way. 02:08:25.360 |
- So to me, it's much more about your actions 02:08:42.360 |
Is it the people you might know unit connecting you to, 02:08:50.480 |
- So we'd have to have a much deeper, I think, 02:09:02.560 |
there might be some complicated games being played here. 02:09:07.000 |
For example, you know, as somebody, I really love psychology 02:09:15.640 |
It's very difficult to study human beings well at scale 02:09:19.960 |
because the questions you ask affect the results. 02:09:27.240 |
that asks some question of which we don't know 02:09:29.640 |
the full details and there's some kind of analysis, 02:09:37.000 |
And so you can have thousands of employees at Facebook. 02:09:40.920 |
One of them comes out and picks whatever narrative 02:09:46.400 |
coupled with the other really uncomfortable thing 02:09:51.720 |
seem to understand they get a lot of clickbait attention 02:09:55.800 |
from saying something negative about social networks. 02:09:58.400 |
Certain companies, like they even get some clickbait stuff 02:10:03.400 |
about Tesla or about, especially when it's like, 02:10:07.680 |
when there's a public famous CEO type of person, 02:10:22.120 |
they get positive on that, but absent a product, 02:10:33.560 |
and with this whole dynamic of journalism and so on, 02:10:38.260 |
with "The Social Dilemma" being a really popular documentary, 02:10:42.100 |
it's like, all right, my concern is there's deep flaws 02:10:49.640 |
we need to deal with, like the nature of hate, 02:10:56.280 |
And then there's people who are trying to use that 02:11:02.400 |
off of blaming others for causing more of the problem 02:11:10.240 |
I'm not saying this is like, I'm just uncomfortable 02:11:12.880 |
with, I guess, not knowing what to think about any of this 02:11:16.720 |
because a bunch of folks I know that work at Facebook 02:11:21.480 |
I mean, they're quite upset by what's happening 02:11:26.880 |
good people inside Facebook that are trying to do good. 02:11:30.680 |
And so like all of this press, Jan is one of them. 02:11:43.640 |
the full nature of efforts that's going on at Facebook. 02:11:48.200 |
- Well, you just, I think, very helpfully explained 02:11:57.120 |
One is, I think I have been surprised at the scale 02:12:02.120 |
with which some product manager can do an enormous amount 02:12:19.260 |
Like I think I read a couple of them when they got published 02:12:21.900 |
and I haven't even spent any time going deep. 02:12:30.420 |
Like talk about challenging the idea of open culture 02:12:34.380 |
and like what that does to Facebook internally. 02:12:37.500 |
If Facebook was built, like I remember like my office, 02:12:43.300 |
we had this like no visitors rule around my office 02:12:45.680 |
'cause we always had like confidential stuff up on the walls 02:12:48.160 |
and everyone was super angry 'cause they were like, 02:12:50.840 |
that goes against our culture of transparency. 02:12:52.880 |
And like Mark's in the fish cube or whatever they call it, 02:13:00.800 |
And I don't know, I mean, other companies like Apple 02:13:10.240 |
And I don't know what this does to transparency 02:13:19.500 |
And you can say, well, you should have nothing to hide, 02:13:22.020 |
right, but to your point, you can pick out documents 02:13:28.500 |
So what happens to transparency inside of startups 02:13:31.380 |
and the culture that startups or companies in the future 02:13:37.980 |
that becomes the next Facebook, will it be locked down? 02:13:44.220 |
Part two, like I don't think that you could design 02:13:49.220 |
a more like a well-orchestrated handful of events 02:13:54.520 |
from the like 60 minutes to releasing the documents 02:14:00.540 |
in the way that they were released at the right time. 02:14:03.420 |
That takes a lot of planning and partnership. 02:14:06.500 |
And it seems like she has a partner at some firm, right, 02:14:16.300 |
you'd have to really believe in what you are doing, 02:14:20.900 |
really believe in it, because you are personally 02:14:36.120 |
And I don't love these conspiracy theories about like, 02:14:39.700 |
oh, she's being financed from some person or people. 02:14:48.660 |
someone thought they were doing something wrong 02:14:54.700 |
I don't know if courageous is the word, but like, 02:14:58.260 |
so without getting into like, is she a martyr? 02:15:02.620 |
Is she, right, like, let's put that aside for a second. 02:15:08.640 |
To your point, a lot of the things that like, 02:15:12.120 |
people have been worried about are already in the documents 02:15:21.040 |
I'm thankful that I am focused on new things with my life. 02:15:27.320 |
it's a really hard problem that probably Facebook 02:15:32.280 |
I'm actually just fascinated by how hard this problem is. 02:15:35.340 |
There are fundamental issues at Facebook in tone 02:15:43.240 |
And since people, organizations are not people. 02:15:51.760 |
who like literally just want to push reinforcement 02:15:59.400 |
Like they're not thinking about political misinformation. 02:16:08.380 |
and an enormously profitable machine that has trade-offs. 02:16:18.200 |
You are not completely separate from the system. 02:16:21.400 |
So I agree, it can feel really frustrating to feel 02:16:27.220 |
that you're working on something completely unrelated 02:16:34.420 |
You have to acknowledge, it's like the Ray Dalio thing. 02:16:36.420 |
You have to look in the mirror and see if there's problems 02:16:48.620 |
of the exciting places, the recommender systems 02:16:52.820 |
Where else in the world, in the space of possibilities 02:16:58.220 |
do you think we're going to see impact of machine learning 02:17:03.220 |
On the philosophical level, on a technical level, 02:17:09.500 |
- Well, I think the obvious answers are climate change. 02:17:16.720 |
Think about how much fuel or just waste there is 02:17:47.260 |
to maximize whatever economic impact we want to have. 02:17:51.120 |
Because right now, those two are very much tied together. 02:17:53.940 |
And I don't believe that that has to be the case. 02:17:56.580 |
There's this really interesting, you've read it. 02:18:11.320 |
I mean, they've done like resource planning for servers 02:18:17.840 |
I don't know if that was at Google or somewhere else, 02:18:19.600 |
but like, okay, great, you do it for servers, 02:18:21.920 |
but what if you could do it for just capacity 02:18:28.000 |
Of course, there's all the self-driving cars. 02:18:31.120 |
I don't know, I'm not gonna pontificate crazy ideas 02:18:36.360 |
using reinforcement learning or machine learning. 02:18:43.640 |
- So it's interesting to think about machine learning 02:18:53.440 |
So if you optimize, say Google Maps, something like that, 02:18:57.280 |
trajectory planning, or what MapQuest did first. 02:19:10.820 |
- Like, and that's going to be very inefficient 02:19:17.820 |
with Waze where like I was sent on like a weird route 02:19:26.720 |
- You were the ant they sent off for exploration. 02:19:28.400 |
- Kevin's definitely gonna be the guinea pig. 02:19:34.160 |
- Oh, going through it, I was like, oh, this is fun. 02:19:36.000 |
Like now they get data about this weird shortcut. 02:19:38.280 |
And actually I hit all the green lights and it worked. 02:19:44.720 |
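The Waze anecdote above is a textbook exploration-exploitation trade-off: mostly route drivers down the best known path, but occasionally send one down a "weird route" to collect fresh data. A minimal epsilon-greedy sketch (toy route names and numbers, purely illustrative, not Waze's actual algorithm):

```python
import random

def epsilon_greedy_route(avg_times, epsilon=0.1, rng=random):
    """Pick a route: usually the fastest known one (exploit),
    occasionally a random one to gather fresh data (explore)."""
    if rng.random() < epsilon:
        return rng.choice(list(avg_times))    # explore: the "weird route"
    return min(avg_times, key=avg_times.get)  # exploit: best known route

def update(avg_times, counts, route, observed_time):
    """Running-average update after observing one trip's travel time."""
    counts[route] += 1
    avg_times[route] += (observed_time - avg_times[route]) / counts[route]

# Toy state: average minutes per route, and how often each was driven.
avg_times = {"highway": 22.0, "shortcut": 25.0}
counts = {"highway": 10, "shortcut": 2}

# Kevin's green-light run comes back as a fast observation
# for the under-explored shortcut, so its estimate improves.
update(avg_times, counts, "shortcut", 19.0)
```

With more such exploratory trips, the shortcut's estimate would keep dropping and eventually win the greedy comparison.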
- I could see you slowing down and stopping at a green light 02:19:51.000 |
like I feel like that was fairly unsatisfying 02:20:02.120 |
got better for users and it got better for the company. 02:20:06.200 |
I didn't fully understand it as an executive. 02:20:08.200 |
And I think that's actually one of the issues that, 02:20:11.040 |
and when I say understand, I mean the mathematics of it. 02:20:14.040 |
Like I understand what it does, I understand that it helps. 02:20:17.040 |
But there are a lot of executives now that talk about it 02:20:23.680 |
or they talked about the internet like 10 years ago. 02:20:31.240 |
So my sense is the next generation of leaders 02:20:37.200 |
in reinforcement learning, supervised learning, whatever. 02:20:40.200 |
And they will be able to thoughtfully apply it 02:20:42.920 |
to their companies in the places it is needed most. 02:20:56.440 |
Just to get a fundamental first principles understanding 02:21:01.040 |
So supervised learning from an executive perspective, 02:21:04.400 |
supervised learning, you have to have a lot of humans 02:21:18.560 |
sending an imperfect machine learning system out there 02:21:25.040 |
We label it by human and we send it back and forth. 02:21:55.760 |
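That label, train, deploy, relabel "back and forth" can be sketched in a few lines. Everything here is a toy stand-in: a 1-D threshold "model" and a `human_label` function simulating the raters, assumed purely for illustration.

```python
def train(labeled):
    """Toy 1-D 'model': a threshold halfway between the class means."""
    pos = [x for x, y in labeled if y == 1]
    neg = [x for x, y in labeled if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

def predict(threshold, x):
    return 1 if x >= threshold else 0

def human_label(x):
    # Stand-in for the human raters in the loop; ground truth is x >= 5.
    return 1 if x >= 5 else 0

# Round 1: a small hand-labeled seed set (deliberately imperfect).
labeled = [(1, 0), (2, 0), (9, 1)]
threshold = train(labeled)

# Deploy the imperfect model, find where it disagrees with humans,
# send those examples back for labels, retrain -- the "back and forth".
for x in [4, 5, 6]:
    if predict(threshold, x) != human_label(x):
        labeled.append((x, human_label(x)))
        threshold = train(labeled)
```

The point of the sketch is just the loop structure: each round of human labels near the decision boundary nudges the model closer to what the raters actually mean.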
solve the game of go and achieve incredible levels 02:22:01.120 |
Can you formulate the problem you're trying to solve 02:22:05.160 |
in a way that's amenable to reinforcement learning? 02:22:07.760 |
And can you get the right kind of signal at scale? 02:22:11.840 |
And that's kind of fascinating to see which part 02:22:19.440 |
- The fascinating thing about reinforcement learning, 02:22:37.120 |
And that is fascinating 'cause we used to just like, 02:22:42.160 |
like hope it worked and that was the fanciest version of it. 02:22:45.320 |
But now you look at it, I'm like trying to learn this stuff. 02:22:47.400 |
And I look at it, I'm like, there are like 17 02:22:56.080 |
Generally, if you're trying to like build a neural network, 02:22:58.880 |
they're pretty well-trodden ways of doing that. 02:23:07.360 |
And in reinforcement learning, I feel like the consensus 02:23:12.640 |
And by the way, it's really hard to get it to converge 02:23:17.560 |
And it like, so there are all these really interesting ideas 02:23:22.280 |
you know, like for instance, in self-driving, right? 02:23:25.720 |
Like you don't want to like actually have someone 02:23:29.360 |
get in an accident to learn that an accident is bad. 02:23:35.400 |
just simulating crazy dogs that run into the street. 02:23:45.160 |
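Learning from a simulator instead of real crashes is the standard motivation for training reinforcement learning agents entirely in simulation. A minimal tabular Q-learning sketch on a toy corridor world (one end is a "crash", the other the goal; all constants are illustrative):

```python
import random

random.seed(0)

N = 5                      # corridor states 0..4; 0 is a "crash", 4 is the goal
Q = {(s, a): 0.0 for s in range(N) for a in (-1, 1)}
alpha, gamma, eps = 0.5, 0.9, 0.2

def step(s, a):
    """Simulated environment: crashing costs -10, the goal pays +10,
    and every intermediate move costs -1."""
    s2 = max(0, min(N - 1, s + a))
    if s2 == 0:
        return s2, -10.0, True
    if s2 == N - 1:
        return s2, 10.0, True
    return s2, -1.0, False

for _ in range(500):       # learn entirely inside the simulator
    s, done = 2, False
    while not done:
        if random.random() < eps:                      # explore
            a = random.choice((-1, 1))
        else:                                          # exploit
            a = max((-1, 1), key=lambda a: Q[(s, a)])
        s2, r, done = step(s, a)
        best_next = 0.0 if done else max(Q[(s2, -1)], Q[(s2, 1)])
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# Greedy policy for the non-terminal states: +1 means "go right".
policy = {s: max((-1, 1), key=lambda a: Q[(s, a)]) for s in range(1, N - 1)}
```

No agent ever "crashed" in the real world; the negative reward only ever happened inside `step`, which is exactly the appeal for self-driving.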
self-driving cars, let's talk about social networks. 02:23:48.520 |
How can you produce a better, more thoughtful experience 02:23:55.840 |
And honestly, in talking to some of the people 02:24:01.640 |
most people are like, yeah, we tried a lot of things, 02:24:12.120 |
if you look at this paper that they published 02:24:17.120 |
So even at these like extremely large scales, 02:24:21.760 |
if we are not yet touching what reinforcement learning 02:24:24.320 |
can truly do, imagine what the next 10 years looks like. 02:24:29.600 |
So I really like the use of reinforcement learning 02:24:34.760 |
like with self-driving cars, it's modeling pedestrians. 02:24:37.760 |
So the nice thing about reinforcement learning, 02:24:40.440 |
it can be used to learn agents within the world. 02:24:49.360 |
like you don't hard code the way they behave. 02:24:55.200 |
was it Jack Dorsey talks about healthy conversations. 02:24:58.280 |
You talked about meaningful interactions, I believe. 02:25:15.000 |
And then you can really learn how to design the interface, 02:25:18.120 |
how you design much of aspects of the experience 02:25:21.400 |
in terms of how you select what's shown in the feed, 02:25:26.120 |
It feels like if you can connect reinforcement learning 02:25:30.920 |
And I think if you have a company and leadership 02:25:36.560 |
and can apply this technology in the right way, 02:25:41.440 |
It is most likely going to be a group of people 02:25:44.440 |
we've never heard about, startup from scratch, right? 02:25:47.960 |
And you asked if like new social networks could be built. 02:25:54.300 |
And whoever starts it, it might be some kids in a garage 02:25:58.400 |
that took these classes from these people, you, right? 02:26:01.280 |
Like, and they're building all of these things 02:26:07.400 |
who just like throws around reinforcement learning 02:26:11.880 |
I truly believe that it is the most cutting edge 02:26:24.040 |
It's super hard to find people that truly understand it. 02:26:38.280 |
I think self-driving is way harder than people realize. 02:26:41.840 |
- Let me ask you in terms of that kid in the garage 02:26:47.040 |
if they want to start a new social network or a business? 02:26:50.760 |
What advice would you give to somebody with a big dream 02:26:59.500 |
that even if it fails, like it was so fun, right? 02:27:07.760 |
We started Instagram because we loved photography. 02:27:12.460 |
I had seen what other social networks had done 02:27:14.520 |
and I thought, hmm, maybe if we did a spin on this, 02:27:20.400 |
Like it wasn't like, it wasn't written out anywhere 02:27:28.840 |
I would have been like, I don't know, that was fun. 02:27:32.280 |
and does it position you well for the next experience? 02:27:37.280 |
That's the advice that I would give to anyone 02:27:41.160 |
which is like, does this meet with your ultimate goals? 02:27:47.160 |
because all of that, by the way, is bullshit. 02:27:48.800 |
Like you can get super famous and super wealthy. 02:27:52.320 |
And I think generally those are not things that, 02:27:55.920 |
again, it's easy to say with like a lot of money 02:27:59.400 |
that somehow like it's not good to have a lot of money. 02:28:01.760 |
It's just, I think that complicates life enormously 02:28:06.840 |
So I think it is way more interesting to shoot for, 02:28:11.720 |
that provides value in the world, that I love building, 02:28:17.480 |
That's what I would do if I were starting from scratch. 02:28:26.400 |
that you get up every morning and you're like, 02:28:36.720 |
If you were to imagine, put yourself in the mind of some- 02:28:44.360 |
No, but it's like high level, you focus on community. 02:28:53.320 |
In all honesty, I think these things are so hard to build 02:29:04.960 |
- My model is three circles and they overlap. 02:29:24.640 |
That I wanna work on because even when this is hard, 02:29:31.720 |
And the last circle is like, what does the world need? 02:29:37.720 |
'Cause there are a lot of startups that exist 02:29:39.280 |
that just no one needs or very small markets need. 02:29:44.840 |
I think if you're like, if you're good at it, 02:29:49.680 |
you're passionate about it and the world needs it. 02:29:54.160 |
and just think about those circles and think, 02:29:59.480 |
It's small, but can I get that middle section? 02:30:10.120 |
and really honest about the circle that the world needs. 02:30:16.320 |
And as opposed really honest about the passion, 02:30:22.520 |
As opposed to like some kind of dream of making money, 02:30:24.600 |
all those kinds of stuff, like literally love doing. 02:30:26.600 |
- I had a former engineer who decided to start a startup. 02:30:29.920 |
And I was like, are you sure you wanna start a company 02:30:38.160 |
and playing basketball are two very, very different things. 02:30:41.400 |
And like not everyone fully understands the difference. 02:30:51.960 |
because like they're in the middle of it now. 02:30:54.600 |
But it's really important to figure out what you're good at, 02:31:15.480 |
or think there's a bigger market than there actually is. 02:31:18.640 |
When you confuse your passion for things with a big market, 02:31:23.600 |
Like just because you think it's cool doesn't mean 02:31:28.760 |
Again, I'm a fairly like, I'm a strict rationalist on this. 02:31:32.580 |
And like sometimes people don't like working with me 02:31:40.440 |
and make bold proclamations about visiting Mars. 02:31:46.160 |
I'm like, okay, I wanna build this really cool thing 02:31:48.480 |
that's fairly practical and I think we could do it. 02:31:52.320 |
And what's cool though is like, that's just my sweet spot. 02:31:55.060 |
I'm not like, I just, I can't with a straight face 02:32:01.800 |
- What do you think about the Facebook renaming itself 02:32:15.080 |
And so it's just, again, those circles I think 02:32:21.040 |
but it's important to realize that like market matters, 02:32:36.500 |
was funding in your own journey helpful, unhelpful? 02:32:46.560 |
- You mean venture funding? - Venture funding 02:32:48.360 |
or anything, borrow some money from your parents. 02:32:57.440 |
Is there some kind of wisdom you can give there 02:33:00.280 |
because you were exceptionally successful very quickly? 02:33:03.040 |
- Funding helps as long as it's from the right people. 02:33:10.640 |
And I'll talk about myself funding myself in a second, 02:33:18.020 |
I don't really have another person putting pressure on me 02:33:24.480 |
But let's like talk about people getting funding 02:33:29.020 |
We raised money from Matt Cohler at Benchmark. 02:33:32.520 |
He's brilliant, amazing guy, very thoughtful. 02:33:41.140 |
where they raised money from the wrong person 02:33:42.920 |
or the wrong firm where incentives weren't aligned. 02:34:12.040 |
as hiring someone for your team rather than taking money. 02:34:20.920 |
to do the right thing that I think is healthy 02:34:37.680 |
And I asked him, like I was trying to figure out 02:34:42.600 |
And I asked him something about like angel investing. 02:34:48.960 |
I was like, "I don't know, like you're connected. 02:35:05.960 |
and like enable visiting Mars or something, right? 02:35:12.800 |
But I had never really thought of it that way. 02:35:14.920 |
But also with that comes an interesting dynamic 02:35:26.960 |
So you need to create other versions of that truth teller. 02:35:34.540 |
that's gonna be one of the interesting challenges 02:35:36.700 |
is how do you create that truth telling situation? 02:35:46.760 |
because I think it creates a truth telling type environment. 02:35:59.060 |
some kind of venture where you're investing in yourself? 02:36:07.860 |
I'm definitely not gonna sit on the beach, right? 02:36:13.520 |
and I gotta imagine I will help fund it, right? 02:36:21.740 |
and this is bad 'cause the S&P's done wonderfully 02:36:34.200 |
- And you can invest in yourself in the way Elon does, 02:36:37.780 |
which is basically go all in on this investment. 02:36:41.340 |
Maybe that's one way to achieve accountability 02:36:58.840 |
One of the things I think Mark said to me early on 02:37:06.300 |
we were talking about people who had left operating roles 02:37:11.700 |
and he was like, "A lot of people convince themselves 02:37:13.380 |
"they work really hard, like they think they work really hard 02:37:21.060 |
There is something about lighting a fire underneath you 02:37:25.020 |
and burning bridges such that you can't turn back 02:37:28.340 |
that I think, we didn't talk about this specifically, 02:37:33.980 |
You need to have that because there's this self-delusion 02:37:37.060 |
at a certain scale, "Oh, I have so many board calls. 02:37:40.940 |
"Oh, we have all these things to figure out." 02:37:44.100 |
This is one of the hard parts about being an operator. 02:37:52.900 |
but operating is just one of the hardest things on earth. 02:38:01.540 |
not just throwing capital in and hoping it grows. 02:38:11.580 |
But yeah, when your ass is on the line and it's your money, 02:38:15.860 |
talk to me in 10 years, we'll see how it goes. 02:38:24.980 |
and you look forward to the day full of challenges, 02:38:33.620 |
- What's the role in this heck of a difficult journey 02:38:41.700 |
What's the role of love in the human condition? 02:38:49.500 |
no way I could do what I do if we weren't together. 02:39:03.820 |
it's not about someone pulling their weight in places. 02:39:12.580 |
Mike and I and our partnership as co-founders is fascinating 02:39:18.940 |
'cause I don't think Instagram would have happened 02:39:28.620 |
that allowed us to build a better thing because of it. 02:39:32.380 |
Nicole, sure, she pushed me to work on the filters early on. 02:39:38.580 |
But the truth of it is being able to level with someone 02:39:44.740 |
and have someone see you for who you are before Instagram 02:39:49.340 |
and know that there's a constant you throughout all of this 02:39:53.260 |
and be able to call you when you're drifting from that, 02:39:55.980 |
but also support you when you're trying to stick with that. 02:39:58.380 |
That's true friendship/love, whatever you want to call it. 02:40:08.540 |
"Hey, I know you're going to do this Instagram thing." 02:40:12.860 |
"You should do it because even if it doesn't work, 02:40:16.820 |
we can move to a smaller apartment and it'll be fine. 02:40:33.520 |
She's like, "Yeah, it's great, but I love you for you. 02:40:39.660 |
That's beautiful with the Gantt chart and Thanksgiving, 02:40:42.700 |
which I still think is a brilliant effing idea. 02:40:53.060 |
so have you discovered meaning to this whole thing? 02:40:55.460 |
Why the hell are we descendants of apes here on earth? 02:41:18.940 |
You're still self-conscious about the same things. 02:41:31.700 |
and that we're all chasing these materialistic things, 02:41:38.220 |
it's almost like the Truman Show when he gets to the edge 02:41:47.580 |
"Oh, sure, it's great that we all chase money and fame 02:42:03.860 |
and that there are all these other things that truly matter. 02:42:11.500 |
That's not a case for the four-day work week. 02:42:13.900 |
What that is a case for is designing your life 02:42:20.460 |
I think we go around the sun a certain number of times 02:42:43.380 |
"Ooh, I got to choose mindfully and purposefully 02:42:54.060 |
Because you're going to wake up one day and ask yourself, 02:42:55.740 |
"Why the hell did you spend the last 10 years doing X, Y, or Z?" 02:43:03.700 |
doing things on purpose because you choose to do them, 02:43:10.420 |
And not just floating down the river of life, 02:43:28.300 |
like I haven't figured out the meaning of life 02:43:35.060 |
And it's way more of like opting into the game 02:43:43.220 |
And like, don't let it happen to you, opt in. 02:43:46.940 |
- Kevin, it's great to end on love and the meaning of life. 02:43:54.740 |
- You gave me like a light into some fascinating aspects 02:44:00.180 |
And I can't honestly wait to see what you do next. 02:44:11.980 |
please check out our sponsors in the description. 02:44:19.460 |
"Focusing on one thing and doing it really, really well 02:44:26.500 |
Thank you for listening and hope to see you next time.