Building the Silicon Brain - Drew Houston of Dropbox
Chapters
0:00 Introductions
0:43 Drew's AI journey
4:14 Revalidating expectations of AI
8:23 Simulation in self-driving vs. knowledge work
12:14 Drew's AI Engineering setup
15:24 RAG vs. long context in AI models
18:06 From "FileGPT" to Dropbox AI
23:20 Is storage solved?
26:30 Products vs Features
30:48 Building trust for data access
33:42 Dropbox Dash and universal search
38:05 The evolution of Dropbox
42:39 Building a "silicon brain" for knowledge work
48:45 Open source AI and its impact
51:30 "Rent, Don't Buy" for AI
54:50 Staying relevant
58:57 Founder Mode
63:10 Advice for founders navigating AI
67:36 Building and managing teams in a growing company
00:00:06.840 |
This is Alessio, partner and CTO at Decibel Partners. 00:00:14.940 |
- So we're not gonna talk about the Dropbox story. 00:00:25.320 |
So as you know, most of our audience is engineering folks, 00:00:29.520 |
You obviously run Dropbox, which is a huge company, 00:00:38.480 |
- What was like the first interaction you had 00:00:48.760 |
those people started out as engineers before that. 00:00:56.560 |
Just really loved, I wanted to make computer games, 00:01:09.320 |
but I didn't do like grad level computer science. 00:01:12.520 |
I sort of got distracted by all the startup things, 00:01:15.920 |
But about several years ago, I noticed a couple of things. 00:01:19.560 |
I knew I wanted to go from being an engineer to a founder. 00:01:22.280 |
And then, but sort of the becoming a CEO part 00:01:28.540 |
One is that, I mean, there's a lot of repetitive 00:01:32.080 |
and manual work you have to do as an executive 00:01:34.800 |
that actually lends itself pretty well to automation, 00:01:42.340 |
I guess what we call classical machine learning these days. 00:01:56.660 |
more elaborate scripts to understand basic classifiers 00:02:01.380 |
and regression and again, basic information retrieval 00:02:06.860 |
And there's sort of two things that came out of that. 00:02:11.680 |
And even just studying old school machine learning 00:02:22.620 |
you're giving an algorithm and spelling out to the computer 00:02:29.880 |
actually flip that, give it sort of the answer you want 00:02:39.440 |
when I would write tools to figure out time audits 00:02:44.640 |
Is this meeting a one-on-one or is it a recruiting thing 00:02:49.600 |
I started out doing that manually with my assistant 00:02:51.600 |
but then found that this was a very automatable task, 00:03:04.560 |
but any time it hit the usual malformed English 00:03:15.120 |
Maybe it would sort of identify the part of speech 00:03:20.840 |
I mean, actually, I started trying something like this, 00:03:27.880 |
And it was super hard to get those things working. 00:03:35.440 |
So actually, I'd kind of written it off a little bit. 00:03:38.040 |
But then the ChatGPT launch and GPT-3, for sure. 00:03:45.400 |
this was sort of November-ish, 2022, like everybody else. 00:04:05.800 |
and I'm coding these AI tools to automate writing 00:04:10.400 |
or to assist with writing and all these different use cases. 00:04:12.840 |
- You're like, "I'm never going back to work. 00:04:14.300 |
"I'm gonna automate all of it before I get to 5G." 00:04:17.240 |
- Ever since then, I've always been coding prototypes 00:04:20.880 |
and just stuff to make my life more convenient, 00:04:28.160 |
I think it was probably over 400 hours this year, 00:04:30.880 |
so far, coding, 'cause I had my paternity leave 00:04:33.160 |
where I was able to work on some special projects. 00:04:47.360 |
- Yeah, Swyx and I were with Sam Altman in October '22. 00:04:52.640 |
and that's why we started this podcast eventually. 00:04:54.960 |
But you did an interview with Sam seven years ago, 00:04:58.200 |
and he asked you, "What's the biggest opportunity 00:05:00.880 |
And you were like, "Machine learning and AI." 00:05:07.880 |
How should people think about revalidating expectations 00:05:19.840 |
- Heuristics for thinking about that, or how is... 00:05:23.960 |
is pretty, has evolved a lot since when I started. 00:05:27.280 |
I mean, I think everybody intuitively starts with like, 00:05:33.720 |
And the tricky thing is often those prognostications 00:05:36.680 |
are right, but they're right in terms of direction, 00:05:40.520 |
For example, even in the early days of the internet, 00:05:43.320 |
the '90s, when things were even like tech space, 00:05:45.040 |
and even before the browser, things like that, 00:05:48.640 |
people were like, "Oh man, you're gonna have, 00:05:54.500 |
"you're gonna be able to watch any movie ever created." 00:05:58.800 |
"It took 20 years for that to actually happen." 00:06:01.600 |
And before you got to DoorDash, you had to get, 00:06:05.560 |
and before you get to Spotify, you had to do like Napster, 00:06:07.880 |
and Kazaa, and LimeWire, and a bunch of like, 00:06:21.480 |
And then I think with AI, it's the same thing. 00:06:23.960 |
People are like, "Oh, it's gonna completely upend society 00:06:30.760 |
The question is like, when is that gonna happen? 00:06:32.680 |
And then with AI specifically, I think there's also, 00:06:36.680 |
in addition to sort of the general tech category, 00:06:41.320 |
I think that AI is particularly susceptible to that. 00:06:49.340 |
captured everybody's imaginations 10, 12 years ago. 00:06:55.320 |
"there's not gonna be a human driver on the road to be seen." 00:07:00.880 |
where we're in a world where you can sort of sometimes 00:07:05.880 |
Exciting, but just took a lot longer than people think. 00:07:09.640 |
And the reason is there's a lot of like time, 00:07:11.720 |
there's like a lot of engineering challenges, 00:07:13.600 |
but then there's a lot of other like societal time constants 00:07:22.560 |
that's a useful kind of framework in driving, 00:07:26.300 |
People sort of skip to like level five, full autonomy, 00:07:28.600 |
or we're gonna have like an autonomous knowledge worker 00:07:34.800 |
kind of projection that that's gonna take a long time. 00:07:37.300 |
But then when you think about level one or level two, 00:07:42.320 |
you know, we're seeing a lot of traction with those. 00:07:44.160 |
So what you see really working is the level one autonomy 00:07:49.160 |
in the AI world would be like the tab autocomplete 00:07:57.640 |
Obviously you wanna get to the highest level you can 00:08:03.600 |
and the capability just isn't there in the early innings. 00:08:05.840 |
And so, and then you think of other level one, 00:08:09.840 |
like Google Maps probably did more for self-driving 00:08:16.840 |
just like taken care of for you autonomously. 00:08:24.360 |
maybe one of the big breakthroughs was like simulation. 00:08:31.440 |
You know, how do you simulate like a product review? 00:08:53.120 |
It's pretty tricky even within your company to be like, 00:08:55.240 |
all right, let's open all this up as quote training data. 00:08:57.680 |
But, you know, I can start with my own emails 00:09:01.080 |
without running into the same kind of like privacy 00:09:08.200 |
And so that is like one level of bootstrapping. 00:09:10.520 |
But actually four or five years ago during COVID, 00:09:14.400 |
we decided, you know, a lot of companies were thinking about 00:09:17.720 |
We decided to really lean into remote and distributed work 00:09:25.640 |
And COVID kind of ripped up a bunch of things, 00:09:27.920 |
but I think everybody was sort of pleasantly surprised 00:09:33.680 |
And actually you sort of found work was decoupled 00:09:39.320 |
which meant that things people had dreamed about 00:09:48.880 |
should we sort of hit the fast forward button 00:09:50.520 |
or should we hit the rewind button, go back to 2019? 00:09:58.520 |
We still, the in-person part's really important. 00:10:00.400 |
We can kind of come back to our working model, 00:10:04.360 |
everybody is gonna be in some kind of like distributed 00:10:07.640 |
So like, instead of like running away from this, 00:10:10.800 |
like let's do a full send, let's really go into it. 00:10:13.160 |
Let's live in the future a few years before our customers. 00:10:16.440 |
Let's like turn Dropbox into a lab for distributed work. 00:10:23.840 |
And then absolutely, like we have products like Dropbox Dash, 00:10:30.600 |
That was like very elevated in priority for me after COVID 00:10:35.800 |
we're putting a lot more stress on the system 00:10:40.680 |
And so even just like getting the right information 00:10:43.800 |
is a big fundamental challenge in knowledge work 00:10:52.600 |
yeah, there's, we can both get a lot of natural 00:10:55.640 |
like training data from just our own like strategy docs 00:10:59.800 |
There's obviously a lot you can do with synthetic data. 00:11:01.960 |
And, you know, actually like LLMs are pretty good 00:11:04.960 |
at like imitating generic knowledge workers. 00:11:10.320 |
But yeah, the way I look at it is like really turn Dropbox 00:11:17.200 |
like what are the big problems we're going to have 00:11:18.640 |
is just the complexity on our screens just keeps growing. 00:11:20.960 |
And the whole environment gets kind of more out of sync 00:11:23.560 |
with what makes us like cognitively productive and engaged. 00:11:28.000 |
And then even something like Dash was initially seeded. 00:11:35.520 |
And along that whole learning journey with AI, 00:11:40.160 |
like that had just been the tooling for that. 00:11:42.960 |
The open source stuff had finally gotten to a place 00:11:44.600 |
where it was a pretty good developer experience. 00:11:45.920 |
And so, you know, in a few days I had sort of a 00:11:50.280 |
I'm like, oh my God, like this completely works. 00:11:53.080 |
You don't even have to get the keywords right. 00:12:01.600 |
If you choose like the right algorithm and the right approach 00:12:08.440 |
you can apply all these other techniques to give them, 00:12:11.080 |
kind of bootstrap kind of like task maturity pretty quickly. 00:12:16.200 |
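What such a prototype boils down to can be sketched in a few lines of plain Python. This is illustrative only: the bag-of-words `embed` below is a stand-in for a real open-source embedding model, which is exactly what removes the need to get the keywords right.

```python
import math
from collections import Counter

def embed(text):
    # Stand-in for an embedding model: a bag-of-words vector.
    # A real prototype would call an open-source sentence-embedding model here.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query, docs):
    # Rank every document by similarity to the query, best first.
    qv = embed(query)
    return sorted(docs, key=lambda name: cosine(qv, embed(docs[name])), reverse=True)

docs = {
    "roadmap.md": "product roadmap and planning notes for the q3 launch",
    "recipe.txt": "pancake recipe with flour eggs butter and milk",
}
print(search("q3 planning", docs))  # roadmap.md ranks first
```

Swapping the toy `embed` for a real model is what turns this from keyword search into semantic search.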
let's talk about the Joe Haas and AI engineering stuff. 00:12:30.920 |
although I'm like super excited about things like Cursor 00:12:36.720 |
I mean, some off the shelf parts, some pretty custom. 00:12:41.280 |
just like AI chat UI basically as just the UI layer, 00:12:52.840 |
I mean, Sonnet 3.5 is probably the best all around. 00:12:55.920 |
But then these things are like pretty limited 00:13:01.000 |
there's a separate thing where I can say like, 00:13:02.640 |
include all these files by default with the request. 00:13:09.040 |
And I'm building mostly like prototype toy apps, 00:13:15.080 |
And so it can do these like end to end diffs basically. 00:13:22.560 |
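The "include all these files by default with the request" idea is simple to sketch; the file names below are hypothetical:

```python
DEFAULT_FILES = ["notes.md", "style_guide.md"]  # hypothetical always-included context

def build_request(message, files):
    # Prepend the default project files to every chat request,
    # so the model sees the same working context on each turn.
    sections = [f"### {name}\n{files[name]}" for name in DEFAULT_FILES if name in files]
    sections.append(f"### Request\n{message}")
    return "\n\n".join(sections)

prompt = build_request("refactor the parser", {"notes.md": "parser lives in parse.py"})
```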
So I have my own, when I'm on a plane or something, 00:13:28.240 |
I actually bring a gaming laptop on the plane with me. 00:13:31.480 |
It's like a little like blue briefcase looking thing. 00:13:36.680 |
And then I have, I can do like transcription. 00:13:41.040 |
Like I have an 8 billion parameter, like Llama will run fine. 00:13:44.680 |
- And you're using like Ollama to run the model? 00:13:47.160 |
- No, I use, I have my own like LLM inference stack. 00:13:50.640 |
I mean, I use this, the backend is somewhat interchangeable. 00:13:53.440 |
So everything from like ExLlama to vLLM or SGLang, 00:13:58.080 |
there's a bunch of these different backends you can use. 00:14:01.760 |
before all this tooling was like really available. 00:14:05.840 |
I've built like my own like whole crazy environment 00:14:17.160 |
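The value of an interchangeable backend is that the app code never touches vLLM, SGLang, or ExLlama directly. A minimal sketch of that shape, with a fake echo backend standing in for a real inference engine:

```python
from typing import Protocol

class InferenceBackend(Protocol):
    # The one method every backend must provide.
    def generate(self, prompt: str, max_tokens: int) -> str: ...

class EchoBackend:
    # Fake backend for illustration; a real one would wrap vLLM, SGLang,
    # or ExLlama behind this same generate() signature.
    def generate(self, prompt: str, max_tokens: int) -> str:
        return " ".join(prompt.split()[:max_tokens])

class Assistant:
    def __init__(self, backend: InferenceBackend):
        self.backend = backend  # swap backends without touching app code

    def ask(self, question: str) -> str:
        return self.backend.generate(f"Q: {question}\nA:", max_tokens=8)

out = Assistant(EchoBackend()).ask("what is in my inbox")
```

Because `Assistant` only depends on the `generate()` interface, changing inference engines is a one-line constructor change.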
and things like that, do you like using them? 00:14:29.960 |
which means I have to sort of like scope my innovation 00:14:32.120 |
to like very specific places or like my time. 00:14:35.520 |
So for the front end, it'll be like a pretty vanilla stack, 00:14:46.840 |
then there's a whole other thing on like the backend, 00:14:48.640 |
like how do you get sort of run all these models locally 00:14:55.640 |
The scaffolding on the backend is pretty straightforward, 00:14:57.560 |
but then a lot of it is just like the LLM inference 00:15:02.000 |
of how you do generation, caching, things like that. 00:15:05.840 |
And then there's a lot, like a lot of the work is 00:15:09.680 |
like take an email, get a new, or a document, 00:15:14.440 |
any of these kinds of primitives that you work with 00:15:18.440 |
render them in a format that an LLM can understand. 00:15:20.880 |
So there's like a lot of work that goes into that too. 00:15:24.160 |
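A sketch of that rendering step for one such primitive; the email fields are invented for illustration:

```python
def render_email(msg):
    # Flatten a structured email into plain text an LLM can read.
    headers = "\n".join(f"{k}: {msg[k]}" for k in ("From", "Subject", "Date"))
    return f"{headers}\n\n{msg['Body'].strip()}"

msg = {"From": "pat@example.com", "Subject": "Q3 plan",
       "Date": "2024-07-01", "Body": " Draft attached. "}
text = render_email(msg)
```

The same pattern applies to calendar events, docs, or chat threads: pick the fields that carry meaning, drop the rest, and emit predictable plain text.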
- Yeah, I built a kind of like email triage assistant. 00:15:27.600 |
And like, I would say 80% of the code is like Google, 00:15:36.320 |
And then I tried to do all these like NLP things, 00:15:39.440 |
and then to my dismay, like a bunch of regexes 00:15:42.640 |
were like, got you like 95% of the way there. 00:15:59.360 |
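A triage pass like that can be surprisingly small. The rules below are invented for illustration, with anything unmatched falling through to a more expensive pass:

```python
import re

# Illustrative rules; real ones get tuned against your own inbox.
RULES = [
    (re.compile(r"\bunsubscribe\b", re.I), "newsletter"),
    (re.compile(r"\b(invoice|receipt)\b", re.I), "billing"),
    (re.compile(r"\b(interview|candidate)\b", re.I), "recruiting"),
]

def triage(subject, body):
    # First matching rule wins; order the rules by priority.
    text = f"{subject}\n{body}"
    for pattern, label in RULES:
        if pattern.search(text):
            return label
    return "needs-review"  # the last ~5% falls through to an LLM or a human
```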
- Well, they kind of have different strengths and weaknesses. 00:16:03.240 |
I mean, it's been awesome in the last 12 months. 00:16:05.200 |
Like now you have these like long context models 00:16:09.000 |
You know, you can put a book in Sonnet's context. 00:16:12.240 |
And then now with the later versions of Llama, 00:16:16.320 |
So that's sort of the new normal, which is awesome. 00:16:27.880 |
And actually, if you provide too much irrelevant context, 00:16:42.360 |
it's basically building this whole like brain 00:16:44.120 |
that's like read everything your company's ever written. 00:16:46.840 |
And so that's not going to fit into your context window. 00:16:56.680 |
And I think these things will keep like horse trading. 00:16:58.640 |
Like maybe if, you know, a million or 10 million 00:17:00.920 |
tokens is the new context length, 00:17:16.160 |
especially because you have to scale up products 00:17:20.240 |
You do, and your toy app is not going to scale to that 00:17:22.960 |
from a cost or latency or performance standpoint. 00:17:25.440 |
So I think you really need these like hybrid architectures 00:17:28.880 |
where you have very like purpose-fit tools, 00:17:36.760 |
You're going to use like a fine-tuned 8 billion model 00:17:38.960 |
or sort of the minimum model that gets you the right output. 00:17:42.640 |
And then a smaller model also is like a lot more cost 00:17:48.400 |
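That "minimum model that gets the right output" idea is essentially a cost-aware router. A sketch with invented model names and per-1K-token prices:

```python
# Invented tiers, cheapest first; capability set of None means "handles anything".
TIERS = [
    ("llama-8b-finetuned", 0.0002, {"classify", "extract", "summarize-short"}),
    ("frontier-large", 0.0100, None),
]

def route(task):
    # Pick the cheapest model whose capability set covers the task.
    for name, price, capable in TIERS:
        if capable is None or task in capable:
            return name, price
    raise ValueError(f"no model can handle {task!r}")

model, price = route("classify")  # the cheap fine-tuned model wins
```

Simple, high-volume tasks stay on the small fine-tuned model; anything outside its capability set escalates to the expensive general model.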
- Yeah. Let's jump into the Dropbox AI story. 00:17:56.720 |
And then how did you communicate that internally? 00:17:59.440 |
You know, I know you have a pretty strong like memo culture. 00:18:14.360 |
kind of around that honeymoon time, unfortunately. 00:18:17.680 |
In January, I wrote this like memo to the company, 00:18:19.880 |
like around basically like how we need to play offense 00:18:26.280 |
the kind of concrete is set and like the winners 00:18:29.080 |
are the winners and things are kind of frozen. 00:18:37.040 |
or the concrete unfreezes and you can sort of build, 00:18:39.360 |
do things differently and have a new set of winners. 00:18:43.520 |
As a result of a lot of that sort of personal hacking 00:18:45.880 |
and just like thinking about this, I'm like, yeah, 00:18:58.760 |
And then that got, and then calling on everybody 00:19:02.000 |
in the company to really think about in your world, 00:19:08.640 |
And FileGPT, which is sort of this Dropbox AI 00:19:14.320 |
from our engineering team as we like called on everybody 00:19:17.320 |
to like really think about what we should be doing 00:19:23.040 |
like a bunch of engineers just kind of hacked that together. 00:19:29.600 |
you can have kind of the most straightforward 00:19:31.520 |
possible integration of AI, which is a good thing. 00:19:38.680 |
And so like a pretty basic implementation of RAG 00:19:47.880 |
when we released just like the starting engines 00:19:53.080 |
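"Basic RAG" here means: pick the most relevant chunks of the file, then hand them to the model alongside the question. A naive, dependency-free sketch of the prompt-building half:

```python
def build_rag_prompt(question, chunks, k=2):
    # Naive retrieval: rank chunks by word overlap with the question,
    # then stuff the top-k into the prompt. Real systems use embeddings.
    qwords = set(question.lower().split())
    ranked = sorted(chunks, key=lambda c: len(qwords & set(c.lower().split())),
                    reverse=True)
    context = "\n---\n".join(ranked[:k])
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

chunks = [
    "the contract renews every march",
    "lunch menu for friday",
    "the contract renewal price increases five percent",
]
prompt = build_rag_prompt("when does the contract renew", chunks)
```

The model then only ever sees the retrieved chunks, which is what keeps this workable for files far larger than any context window.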
- It's funny where you're basically like these files 00:19:55.600 |
that people have, they really don't want them in a way, 00:19:57.760 |
you know, like you're storing all these files 00:19:59.240 |
and like you actually don't want to interact with them. 00:20:02.800 |
And that's kind of what also takes you to Dash eventually, 00:20:06.320 |
you actually don't really care where the file is. 00:20:08.440 |
You just want to be the place that aggregates it. 00:20:10.880 |
How do you think about what people will know about files? 00:20:18.200 |
and they're just kind of like a pointer that goes somewhere 00:20:23.720 |
I mean, there's a lot of potential complexity 00:20:28.160 |
what's the difference between a file and a URL? 00:20:35.600 |
All right, well, now if it's real-time collaborative, 00:20:38.680 |
It's like a structured data you're sort of collaborating, 00:20:41.360 |
you know, that's keeping in sync, blah, blah, blah. 00:20:47.600 |
let's work back from like how humans think about this stuff 00:20:53.160 |
oh, here are my files and here are my links or cloud docs. 00:21:03.160 |
So it starts from primitives more like those, 00:21:07.680 |
And then start from like a more ideal experience. 00:21:12.760 |
we kind of have this situation that we'll look like, 00:21:16.360 |
where, all right, how do you manage your work stuff? 00:21:21.920 |
that literally hasn't changed since the early '80s, right? 00:21:33.120 |
you have Chrome or a browser that has so many tabs open, 00:21:45.200 |
was purpose-built to be the home for your work stuff 00:21:52.960 |
we get stuck in these local maxima pretty often in tech, 00:21:56.680 |
where we're obviously aware that files are not going away, 00:22:03.520 |
and where files are still gonna be the tool you use 00:22:20.600 |
Media generally, if you're making music or photos or video, 00:22:31.800 |
that a lot of stuff that used to be in Word docs 00:22:35.200 |
or Excel files, all that has tilted towards the browser 00:22:50.760 |
Now, on the other hand, it would be ironic and bad 00:22:55.960 |
that you're like, well, if it touches a file, 00:23:02.920 |
So there's a convergence that I think makes sense over time. 00:23:13.200 |
And then what's the idealized product experience? 00:23:15.280 |
And then what are the technical underpinnings of that 00:23:20.200 |
- I think it's counterintuitive that in Dash, 00:23:26.800 |
you really don't want people to store a file somewhere, 00:23:30.480 |
How do you think about the importance of storage? 00:23:32.440 |
And do you kind of feel storage is almost solved, 00:23:40.640 |
if you're dealing with large quantities of data, 00:23:44.960 |
or if you're dealing with 10 gig video files like that. 00:23:48.280 |
Then you sort of inherit all the problems of sync 00:23:59.680 |
I would have said, well, Dropbox syncs your files. 00:24:03.040 |
And we didn't even really have a mission of the company 00:24:06.960 |
I just don't want to carry my thumb drive around 00:24:22.840 |
even we were like, all right, Dropbox provides storage. 00:24:26.520 |
they're like, that's not how we see this at all. 00:24:29.880 |
Actually, Dropbox is not just like a hard drive in the cloud. 00:24:35.680 |
or it's a place like I started a small business. 00:24:39.000 |
or it's like, yeah, it's not keeping files in sync. 00:24:53.720 |
they're like, no, we're not buying the storage. 00:25:01.560 |
people are buying the ability to work from anywhere 00:25:13.280 |
which is like, what is the sort of higher order thing, 00:25:19.480 |
Storage in the new world is kind of incidental to that. 00:25:21.800 |
I mean, it still matters for things like video 00:25:26.080 |
we provide you like the cheapest bits in the cloud. 00:25:28.720 |
But it is a big pivot from Dropbox is the company 00:25:33.400 |
that syncs your files to now where we're going 00:25:35.560 |
is Dropbox is the company that kind of helps you organize 00:25:39.520 |
I started the company 'cause I kept forgetting 00:25:41.120 |
my thumb drive, but the question I was really asking, 00:25:44.120 |
I was like, why is it so hard to like find my stuff, 00:25:46.680 |
organize my stuff, share my stuff, keep my stuff safe? 00:25:50.720 |
You know, I'm always like one washing machine, 00:25:50.720 |
with all my prior company stuff in the pocket of my shorts 00:26:01.720 |
this is like medieval that we have to think about this. 00:26:03.640 |
So that same mindset is how I approach where we're going. 00:26:21.520 |
The shape of the problem and the shape of the solution 00:26:24.560 |
You know, instead of 100 files on your desktop, 00:26:26.280 |
it's now 100 tabs in your browser, et cetera. 00:26:30.160 |
- How has the idea of a product evolved for you? 00:26:32.720 |
So, you know, famously Steve Jobs started by Dropbox 00:26:35.600 |
and he's like, you know, this is just a feature. 00:26:40.760 |
How, in the age of AI, how do you think about, you know, 00:26:43.640 |
maybe things that used to be a product are now features 00:26:46.120 |
because the AI on top of it, it's like the product, 00:26:48.280 |
like what's your mental model to think about it? 00:26:51.080 |
So I don't think there's really like a bright line. 00:26:53.680 |
I don't know if like I use the word features and products 00:26:56.720 |
in my mental model that much of how I break it down. 00:27:06.080 |
but it does start from that place of like, all right, 00:27:08.520 |
we have all these new colors we can paint with. 00:27:10.640 |
And all right, what are these higher order needs 00:27:17.800 |
They're always need to be able to like find it, 00:27:19.320 |
or, you know, all the verbs I just mentioned. 00:27:21.840 |
It's like, okay, how can we make like a better painting 00:27:24.920 |
and then how can we use some of these new colors? 00:27:31.160 |
the way you find stuff, organize stuff, share stuff, 00:27:34.840 |
After COVID, it's gonna be completely different. 00:27:38.600 |
But I think it is also important to, you know, 00:27:41.800 |
you have to do more than just work back from the customer 00:27:46.480 |
and we've learned a lot of this the hard way sometimes. 00:27:58.400 |
So can we build the best way to find your stuff 00:28:04.240 |
for the vast majority of the billion knowledge workers 00:28:12.520 |
Obviously, you should just have like one search box, 00:28:20.480 |
that is like meaningfully better from the status quo? 00:28:23.760 |
Okay, then can we like get distribution and growth? 00:28:27.200 |
Like, that's sort of the next thing you learn. 00:28:30.120 |
what's the product, what's the product, what's the product? 00:28:31.920 |
Then you're like, wait, wait, we need distribution 00:28:37.440 |
or sort of needles you have to thread at the same time. 00:28:44.000 |
there's really this like self-serve viral model 00:28:48.380 |
borrowed from a lot of the consumer internet playbook 00:28:50.840 |
and like what Facebook and social media were doing 00:28:52.680 |
and then translated that to sort of the business world. 00:28:59.600 |
storage happened to be some, in the beginning, 00:29:01.080 |
happened to be something people were willing to pay for. 00:29:05.800 |
I'm going to have to buy an external hard drive. 00:29:09.360 |
I'm going to have to pay for something one way or another. 00:29:11.200 |
People were already paying for things like backup. 00:29:14.280 |
But then the last domino is like defensibility. 00:29:19.080 |
but then, you know, what do you do when the incumbents, 00:29:21.520 |
the next chess move for them is just like copy, bundle, kill. 00:29:28.160 |
and they'll like give it away for free or no added cost. 00:29:32.920 |
scar tissue from being on the wrong side of that. 00:29:37.280 |
for all four or five variables or whatever at once, 00:29:40.320 |
or you can sort of have, you know, some flexibility, 00:29:42.600 |
but the more of those gates that you get through, 00:29:52.280 |
there's been a lot of focus on the large language model, 00:30:02.240 |
Like there's sort of this weirdly self-commoditizing thing 00:30:07.240 |
if they're kind of on this like Pareto frontier 00:30:11.240 |
Being number two, you know, if you're not on that frontier, 00:30:17.460 |
like your model literally has zero economic value 00:30:21.880 |
LLMs generate output that can be used to train or improve. 00:30:27.760 |
that are specific to the large language model. 00:30:34.600 |
And, you know, certainly at the bottom with NVIDIA 00:30:40.760 |
like the people who have the customer relationship, 00:30:48.720 |
- Do you think AI is making people more careful 00:30:55.760 |
but it's like, whatever, I'm just throwing it out there. 00:31:02.600 |
But like, how have you seen, especially customers, 00:31:05.200 |
like think about what to put in, what to not? 00:31:10.040 |
nobody should be concerned about this, right? 00:31:14.880 |
and for companies information to be kind of ground up 00:31:25.880 |
and me personally, and with my Dropbox hat on. 00:31:30.400 |
And you know, we had experience with this too 00:31:34.320 |
Like, wait, I'm gonna take my stuff on my hard drive 00:31:44.760 |
I'm gonna put my credit card number into this website. 00:31:49.680 |
and put it in a bank instead of under my mattress. 00:31:51.880 |
You know, so there's a long history of like tech 00:31:54.280 |
and comfort, so in some sense AI is kind of another round 00:31:59.160 |
And then when I think about like defensibility for Dropbox, 00:32:01.480 |
like that's actually a big advantage that we have 00:32:06.600 |
We only get, we only make money if you pay us, 00:32:13.280 |
We're not training the next foundation model. 00:32:19.520 |
into an ecosystem, like the whole point of Dropbox 00:32:26.280 |
all right, in the world of AI, where should our lane be? 00:32:33.440 |
But to me, it was like a lot of the like trust advantages 00:32:45.280 |
these AI principles, very table stake stuff of like, 00:32:48.280 |
here's transparency, we wanna give people control, 00:32:59.840 |
'cause everybody wants like a trusted partner 00:33:02.040 |
as they sort of go into the wild world of AI. 00:33:04.080 |
And then, you know, you also see people cutting corners 00:33:06.880 |
and, you know, or just there's a lot of uncertainty 00:33:08.920 |
or, you know, moving the pieces around after the fact, 00:33:16.080 |
the race was kind of being the system of record, 00:33:21.600 |
if I can use Dash to like access my Google Drive file, 00:33:24.400 |
why would I pay Google for like their AI feature? 00:33:31.960 |
About, you know, not being able to capture all the value 00:33:39.000 |
but I'm curious if you think things will get more closed 00:33:45.120 |
And I think you have to be like a trustworthy partner 00:33:48.840 |
if they think you're gonna eat their lunch, right? 00:33:52.840 |
And like all the companies are quite sophisticated 00:33:55.640 |
So we try to, like we know that's gonna be the reality. 00:34:00.520 |
anyone's like Google Drive's lunch or anything. 00:34:03.520 |
Actually we'll like integrate with Google Drive, 00:34:12.360 |
We're not really reliant on being like the store of record. 00:34:15.320 |
And there are pros and cons to this decision. 00:34:23.760 |
which is to get, you know, that Google Doc or whatever. 00:34:28.680 |
This is all part of our like brand reputation. 00:34:32.720 |
to use whatever tools or operating system they want. 00:34:35.320 |
We're not taking anything away from our partners. 00:34:37.000 |
We're actually like making their thing more useful 00:34:41.720 |
I mean, on the margin there might be something like, 00:34:43.000 |
well, okay, to the extent you do rag and summarize things, 00:34:47.640 |
You know, we also know there's like infinity investment 00:35:01.720 |
you learn after some time in this business that like, 00:35:04.520 |
yeah, there's some places that are just gonna be 00:35:10.920 |
Everybody's kind of trying to solve the same problem 00:35:12.600 |
and they just start duplicating each other's effort. 00:35:21.960 |
yeah, and everybody's like fixated on the agent 00:35:27.400 |
like we have the opportunity to like really fix search 00:35:30.080 |
or build a self-organizing Dropbox or environment 00:35:32.680 |
or there's all these other things that can be a compliment. 00:35:34.840 |
'Cause we don't really want our customers to be thinking like, 00:35:41.840 |
In a lot of ways, actually some of the things that we do 00:35:49.240 |
we actually give admins IT like universal visibility 00:35:57.560 |
And as a precondition to installing something like Copilot 00:36:02.320 |
or Dash or Glean or any of these other things, right? 00:36:06.560 |
hey, before we like turn all the lights in here, 00:36:12.400 |
And there just haven't been good tools to do that. 00:36:14.440 |
And post AI, you would do it completely differently. 00:36:27.640 |
How do you think about building for AI versus people? 00:36:36.760 |
versus models are just kind of like ingesting. 00:36:39.200 |
Do you think about building products differently, 00:36:44.320 |
and like agents and whatnot versus like just people? 00:36:54.400 |
of completely autonomously organizing your environment, 00:37:00.880 |
when the sort of user has to fit itself to the computer 00:37:07.440 |
or something where you have some kind of good partnership. 00:37:12.280 |
you don't have to do all this like manual effort. 00:37:14.200 |
And so like the command line was sort of subsumed 00:37:20.520 |
Maybe chat will be, chat will be an increasing, 00:37:23.920 |
like will be an increasing part of the puzzle. 00:37:29.440 |
And then as far as like the sort of plumbing of like, 00:37:32.080 |
well, is this gonna be consumed by an LLM or a human? 00:37:34.320 |
Like fortunately, like you don't really have to 00:37:45.360 |
the easier you make something to read for a human, 00:37:47.040 |
the easier it is for an LLM to read to some extent as well. 00:37:49.800 |
But we really think about what's that kind of right, 00:37:51.800 |
how do we build that right, like human machine interface 00:37:56.760 |
but then it's super easy to translate your intent 00:37:59.520 |
into like the, you know, however you want your folder, 00:38:02.480 |
setting your environment set up or like your preferences. 00:38:05.440 |
- What's the most underrated thing about Dropbox 00:38:09.480 |
- Well, I think this is just such a natural evolution 00:38:14.080 |
Like when people think about the world of AI, 00:38:22.320 |
And I think we also did like our first thing so well 00:38:30.000 |
that it was like pretty tough to come up with a sequel. 00:38:32.840 |
And we had a bit of a sophomore slump and, you know, 00:38:35.680 |
I think actually a lot of kids do use Dropbox 00:38:52.480 |
with these fundamental problems and these like, 00:38:54.320 |
that affect, you know, a billion knowledge workers 00:38:56.240 |
around just finding, organizing, sharing your stuff 00:39:00.960 |
And there's a ton of unsolved problems in those four verbs. 00:39:06.080 |
but just even think about like a whole new generation 00:39:10.160 |
without the ability to like organize their things. 00:39:13.760 |
And if you just have like a giant infinite pile of stuff, 00:39:20.920 |
that were pretty helpful in prior decades, right? 00:39:35.080 |
like if my operating system updates the wrong way 00:39:38.080 |
or, more commonly, if I just declared tab bankruptcy, 00:39:41.000 |
it's like your whole workspace just clears itself out 00:39:45.000 |
And you're like, on what planet is this a good idea? 00:39:47.800 |
Like, you know, there's no like concept of like, 00:39:53.080 |
And so that was like a big motivation for things like Dash. 00:40:02.360 |
you know, what do I do if I have a Google doc 00:40:06.320 |
There's no collection that holds mixed format things. 00:40:12.560 |
hidden in plain sight, like it's missing primitives. 00:40:14.480 |
Like yeah, files have folders, songs have playlists, 00:40:17.160 |
links have, you know, there's nothing, somehow we missed that. 00:40:20.520 |
And so we're building that with stacks in Dash, 00:40:22.720 |
where it's like a mixed format, smart collection 00:40:26.200 |
just share whatever you need internally, externally, 00:40:28.520 |
and have it be like a really well-designed experience 00:40:36.280 |
You know, we talked a little bit about security 00:40:38.480 |
Like IT signs all these compliance documents, 00:40:40.680 |
but in reality has no way of knowing where anything is 00:40:44.520 |
It's actually better for them to not know about it 00:40:46.360 |
than to know about it and not be able to do anything about it. 00:40:49.400 |
we found that there were like literally people in IT 00:40:52.520 |
whose jobs it is to like manually go through, 00:40:55.080 |
log into each, like log into Office, log into Workspace, 00:40:58.480 |
log into each tool and like go comb through one by one 00:41:02.000 |
the links that people have shared and like un-share. 00:41:03.960 |
There's like an un-share guy in all these companies. 00:41:06.360 |
And that job is probably about as fun as it sounds. 00:41:12.600 |
I guess what makes technology a good business 00:41:14.280 |
is for every problem it solves, it like creates a new one. 00:41:17.920 |
So there's always like a sequel that you need. 00:41:20.160 |
And so, you know, I think the happy version of our Act Two 00:41:31.280 |
but broadband and everything wasn't ready for it. 00:41:36.400 |
but the value prop the whole time was just like, 00:41:37.760 |
let me press play on something I want to see. 00:41:40.280 |
And they did a really good job about bringing people along 00:41:43.720 |
You would think like, oh, the DVD mailing piece 00:41:51.360 |
And they did have some false starts in that transition. 00:41:55.440 |
they were able to take that DVD mailing audience, 00:42:10.400 |
And like both of those worlds were like super, 00:42:21.080 |
And Netflix did a great job of like activating 00:42:23.280 |
their Act One advantages and winning in Act Two 00:42:27.080 |
So I don't think people see Dropbox that way. 00:42:36.040 |
So fortunately we have like better and better answers 00:42:43.480 |
being like the Silicon Brain interface basically for people. 00:42:47.480 |
And writ large, I mean, I think what's so exciting 00:42:50.160 |
about AI and everybody's got their own kind of take on it, 00:42:53.360 |
but if you like really zoom out civilizationally 00:43:03.320 |
Certainly one of the, I mean, there are a lot of points, 00:43:06.600 |
you think about things like the Industrial Revolution, 00:43:19.040 |
or machines made of like wood or something, right? 00:43:24.600 |
And then suddenly, you know, the Industrial Revolution, 00:43:27.360 |
things like electricity, it suddenly it's like, 00:43:28.840 |
all right, mechanical energy is now available on demand. 00:43:33.880 |
And then suddenly we consume a lot more of it. 00:43:36.360 |
And then the standard of living goes way, way, way, way up. 00:43:38.800 |
That's been pretty limited to the physical realm. 00:43:43.760 |
that's really the first time we can kind of bottle up 00:43:50.320 |
a lot of our mechanical or physical busy work to machines 00:43:53.720 |
that freed us up to make a lot of progress in other areas. 00:43:57.800 |
now we can offload a lot more of our cognitive 00:44:08.040 |
It's sort of like, no, but we're more leveraged. 00:44:09.600 |
We can move a lot more earth with a bulldozer than a shovel. 00:44:13.040 |
And so that's like what is at the most fundamental level, 00:44:21.240 |
like half of our brain that's sort of coming online, 00:44:27.600 |
They have very complementary strengths and weaknesses. 00:44:31.880 |
There's also this weird tangent we've gone on as a species 00:44:35.080 |
to like where knowledge workers have this like epidemic 00:44:38.600 |
of burnout, great resignation, quiet quitting. 00:44:44.000 |
but I think that's one of the biggest problems we have 00:44:54.760 |
unforced errors that we're doing where it's like, 00:45:02.280 |
And fortunately it's also what makes us engaged. 00:45:07.600 |
but then we go to work and then increasingly going to work 00:45:12.120 |
And you're like, if you wanted to design an environment 00:45:14.920 |
that made it impossible to ever get into a flow state 00:45:18.320 |
or ever be able to focus, like what we have is that. 00:45:21.240 |
And that was the thing that just like seven, eight years ago 00:45:25.040 |
why like knowledge work is so jacked up on this adventure. 00:45:29.640 |
in like the most cognitively polluted environment possible. 00:45:34.480 |
when we're working remotely and things like that. 00:45:41.240 |
why this was like a problem that wasn't fixing itself. 00:45:43.800 |
And I'm like, maybe there's something Dropbox 00:45:45.960 |
And, you know, things like Dash are the first step, 00:45:49.600 |
Well, I mean, now, like, well, why are humans 00:45:54.040 |
It's like, well, we're just, all of the tools we have today, 00:45:58.120 |
all of the weight, the burden to the human, right? 00:46:13.880 |
And then you look at that, you're like, wait, 00:46:15.200 |
I'm looking at my phone, it says like 80,000 unread things. 00:46:21.000 |
Fortunately, that's why things like our silicon brain 00:46:27.760 |
where it's like, actually, computers have no problem 00:46:33.960 |
And to some extent, this was already happening 00:46:36.240 |
with computing, you know, Excel is a version 00:46:43.280 |
so many of these little subtasks and tasks we do at work 00:46:50.560 |
an important metaphor to me, 'cause it mirrors 00:47:00.520 |
much better at these like parallel computations. 00:47:03.440 |
We talk a lot about like human versus machine 00:47:18.880 |
a lot since, you know, playing Quake 2 or something. 00:47:27.160 |
And so you really have to like redesign the work 00:47:35.720 |
of these really like heavy lifting GPU tasks. 00:47:43.320 |
And so I think we need to think about knowledge work 00:47:49.440 |
Let's offload all the stuff that can be automated. 00:47:58.280 |
of actually being able to do something about it. 00:48:00.280 |
- It's funny, I gave a talk to a few government people 00:48:04.800 |
where we used to make machines to replace human labor. 00:48:11.400 |
And now you're doing the same thing with the brain. 00:48:13.720 |
And the data centers are kind of computational power plants, 00:48:29.520 |
to actually maybe run a model that is not even that good 00:48:33.600 |
How do you think about some of the differences 00:48:35.760 |
in the open source ethos with like traditional software, 00:48:38.640 |
where it's like really easy to run and act on it 00:48:42.480 |
but like I'm kind of limited to what I can do with it. 00:48:45.280 |
- Yeah, well, I think with every new era of computing, 00:48:50.520 |
is this going to be like an open one or a closed one? 00:48:52.800 |
And, you know, there's pros and cons to both. 00:48:58.400 |
But, you know, I think you look at how the mobile, 00:49:06.160 |
Everybody sort of party that everybody could, you know, 00:49:14.160 |
like given the capital intensity of what it takes 00:49:27.720 |
Or is it going to be, you know, more open and accessible? 00:49:31.360 |
And I'm like super happy with how that's just, 00:49:38.120 |
You know, fortunately you're seeing, you've seen 00:49:39.880 |
in real life, yeah, even if people aren't, you know, 00:49:45.480 |
you've seen like the price performance of these models 00:49:51.200 |
which is, it's sort of like many Moore's laws 00:49:55.960 |
Like that wouldn't have happened without open source, right? 00:50:03.040 |
without having to buy an Internet Information Server license. 00:50:10.840 |
where like people would contribute their code 00:50:22.640 |
and of just sort of the democratization of access to compute 00:50:27.040 |
And so I think it's been like phenomenally successful 00:50:31.120 |
And pretty much anything you care about, I believe, 00:50:33.160 |
even like safety, you can have a lot more eyes on it 00:50:35.640 |
and transparency instead of just something is happening. 00:50:37.800 |
And there were three places with nuclear power plants 00:50:41.760 |
So I think it's, you know, it's been awesome to see. 00:50:44.000 |
And then, and again, for like wearing my Dropbox hat, 00:50:53.760 |
It's, you know, there are a lot of different configurations, 00:50:58.240 |
And even before you even talk about getting on the device, 00:51:00.520 |
like, you know, you need this whole kind of constellation 00:51:21.600 |
Like, especially as you have more small models, 00:51:23.320 |
like you want to control more of the economics, 00:51:32.200 |
where like thinking about the future is a lot easier, 00:51:36.200 |
So, I mean, there's definitely this like big surge 00:51:39.960 |
in demand as like, there's sort of this FOMO-driven bubble 00:51:43.280 |
of like all of big tech taking their earnings 00:51:45.800 |
and shipping them to Jensen for a couple years. 00:51:48.640 |
And then you're like, all right, well, first of all, 00:51:52.920 |
And you know, in the late '90s with like fiber, 00:51:52.920 |
you know, this huge race to like own the internet, 00:52:12.640 |
But, you know, the simplest way I think about it is like, 00:52:22.120 |
You know, I wouldn't want to be buying like pallets 00:52:28.560 |
when like the 386 and 486 and Pentium and everything 00:52:31.400 |
are like clearly coming, they're around the corner. 00:52:44.880 |
to like provide different capabilities in the product, 00:52:47.280 |
you know, cut, you know, slashing by 10, 100, 1,000X. 00:52:55.080 |
you have to believe that the sort of supply and demand 00:52:58.960 |
And then there's also like non-NVIDIA stacks, 00:53:04.960 |
that are super interesting and outperformed NVIDIA stack 00:53:13.480 |
where we were with like hard drives or storage 00:53:15.680 |
when we sort of went back from the public cloud. 00:53:21.840 |
We know what the cost of a hard drive and a server 00:53:24.400 |
and, you know, terabyte of bandwidth and all the inputs 00:53:30.120 |
But to like rely on the public cloud to pass that along 00:53:34.640 |
than like relying on the kindness of strangers. 00:53:37.000 |
So we decided to bring that in house and still do, 00:53:41.880 |
That said, like the public cloud is like scaled 00:53:45.400 |
and just good all around than we would have predicted. 00:53:48.000 |
'Cause actually back then we were worried like, 00:53:49.320 |
is the public cloud gonna even scale fast enough 00:53:53.720 |
But yeah, I think we're in the early innings. 00:53:56.960 |
So I think renting and not sort of preserving agility 00:54:06.880 |
Yeah, it's kind of like, okay, if this really works, 00:54:15.040 |
like this is one where you could just have these things 00:54:17.560 |
that just like, okay, there's just like a new kind of piece 00:54:22.320 |
So I think there's still, I mean, it's like not that likely, 00:54:25.000 |
but I think this is an area where it actually could, 00:54:28.520 |
and out of nowhere, suddenly, all of a sudden, 00:54:34.280 |
he references Andy Grove's "Only the Paranoid Survive." 00:54:45.520 |
How do you think about staying relevant for so long? 00:54:47.840 |
Now, it's been 17 years, you've been doing Dropbox. 00:54:51.160 |
And maybe we can touch on founder mode and all of that. 00:54:55.000 |
- Yeah, well, first, what makes tech exciting 00:55:00.600 |
And your customers never are like, oh no, we're good now. 00:55:07.920 |
files are not even that relevant to the modern, 00:55:11.560 |
but like, you know, so much is tilted elsewhere. 00:55:16.120 |
and think about, on the one level, like what is, 00:55:19.000 |
and think of these different layers of abstraction. 00:55:20.600 |
Like, well, yeah, the technical service we provide 00:55:28.480 |
well, technically we mail people physical DVDs 00:55:31.760 |
and then we have to switch like streaming and codecs 00:55:52.160 |
but of business or government or sports or military, 00:55:58.720 |
it would have been like totally new as a 25 year old, 00:56:04.320 |
there's not a lot of great things about getting older, 00:56:06.720 |
this actually has like a million like precedents, 00:56:12.320 |
you know, about like the future of GPUs from like, 00:56:15.400 |
I don't know how, you know, how Formula One teams work, 00:56:18.280 |
or you can draw all these like weird analogies 00:56:20.440 |
that are super helpful in guiding you from first principles 00:56:27.240 |
But like, you know, build shit we're really proud of. 00:56:32.680 |
you sort of become blind to like how technology works 00:56:36.400 |
And it's something like carrying a thumb drive. 00:56:41.800 |
or like have to carry a big external hard drive around. 00:56:43.640 |
So you're always thinking like, oh, this is awesome. 00:56:54.960 |
And Apple comes along and is like, this is dumb. 00:56:56.320 |
You should have like a catalog, artists, playlists. 00:56:58.720 |
You know, then Spotify is like, hey, this is dumb. 00:57:00.720 |
Like you should, why are you buying these things 00:57:05.760 |
why is this like such a single player experience? 00:57:08.280 |
and there should be AI curated, et cetera, et cetera. 00:57:11.920 |
And then a lot of it is also just like drawing, 00:57:14.840 |
connecting dots between different disciplines, right? 00:57:16.680 |
So a lot of what we did to make Dropbox successful 00:57:18.520 |
is like we took a lot of the consumer internet playbook, 00:57:20.840 |
applied it to business software from a virality 00:57:25.440 |
And then, you know, I think there's a lot of, 00:57:26.800 |
you can draw from the consumer realm and what's worked there 00:57:29.120 |
and that hasn't been ported over to business, right? 00:57:34.760 |
yeah, when you sign into Netflix or Spotify or YouTube 00:57:38.440 |
or any consumer experience, like what do you see? 00:57:44.720 |
You see like this whole, and it went on evolution, right? 00:57:53.080 |
to 30 channels, 100, but then something like 1,000 channels, 00:58:02.160 |
wait, wait, we just need to like rethink the system here. 00:58:08.600 |
And so, you know, I think the consumer experiences 00:58:10.920 |
that are like smart, you know, when you sign into Netflix, 00:58:15.000 |
It's like, here are a bunch of smart defaults. 00:58:21.000 |
here are some, you know, reasonable suggestions. 00:58:23.480 |
And then it's like, okay, I watched "Drive to Survive." 00:58:23.480 |
it's like a complete, it's a learning system, right? 00:58:29.840 |
So a combination of design, machine learning, 00:58:33.160 |
and just like the courage to like rethink the whole thing. 00:58:42.120 |
In the consumer experience, there's no filing things away. 00:58:44.240 |
Everything's just all sort of auto-curated for you 00:58:55.600 |
And so like, okay, can we do something about that? 00:58:57.960 |
- You know, you're one of the last founder CEOs. 00:59:05.000 |
- I'm like 300 years old, and why can't I be a founder? 00:59:07.000 |
- No, but I'm saying like, when you run a company, 00:59:08.920 |
like, you've had multiple executives over the years. 00:59:11.600 |
Like, how important is that for the founder to be CEO 00:59:15.960 |
we're changing the way the company and the strategy works. 00:59:18.120 |
It's like, we're really taking this seriously. 00:59:19.520 |
Versus like, you could be a public CEO and be like, 00:59:21.600 |
hey, I got my earnings call, and like, whatever, 00:59:24.200 |
I just need to focus on getting the right numbers. 00:59:26.320 |
Like, how does that change the culture in the company? 00:59:32.920 |
You know, I think founder mode's kind of this Rorschach test. 00:59:41.000 |
I think it's also like a destination you get to 00:59:46.680 |
So if you think about, you know, imagine someone, 00:59:50.560 |
You know, giving a med student a scalpel on day one, 01:00:03.160 |
like 17 years into it than they were one year into it. 01:00:06.440 |
I think part of why founder mode is so resonant is, 01:00:09.920 |
or it's like striking such a chord with so many people is, 01:00:12.440 |
yeah, there's a real power when you have like a directive, 01:00:15.160 |
intuitive leader who can like decisively take the company 01:00:22.840 |
And I think every founder who makes it this long, 01:00:36.460 |
or like nothing was working and they weren't the cool kids. 01:00:44.000 |
There are many times where I thought about that, 01:00:47.520 |
it all comes together and you sort of start being able 01:00:50.360 |
So you've sort of seen enough and learned enough. 01:00:52.320 |
And as long as you keep your learning rate up, 01:00:54.160 |
you can kind of surprise yourself in terms of like 01:00:56.000 |
how capable you can become over a long period. 01:00:58.460 |
And so I think there's a lot of like founder CEO journey, 01:01:22.160 |
you sort of default that because there's like, 01:01:25.160 |
you realize pretty quickly like nothing gets done 01:01:30.200 |
Then you scale and then you're like, you get, 01:01:32.440 |
you know, a lot of actually pretty good advice 01:01:38.760 |
and like give them real responsibilities and empower people. 01:01:41.720 |
And that's like a whole discipline called like management 01:01:53.800 |
and you hire a 45-year-old exec from, you know, 01:01:56.640 |
high-flying company and a guy who was running 01:01:58.720 |
like a $10 billion P&L and came to work for Dropbox 01:02:02.840 |
where we were like a fraction of a billion dollar P&L. 01:02:05.560 |
And, you know, what am I going to tell him about sales? 01:02:26.800 |
It builds all these like kind of weird structures 01:02:31.280 |
And then at some point you learn enough of this 01:02:38.440 |
and you've learned enough to like know what to do 01:02:42.400 |
And then on the other side, you lean way back in. 01:02:46.720 |
this company is like not running the way I want it. 01:02:56.040 |
And if you can do it right and like make it to that point, 01:02:58.040 |
like then the job becomes like a lot of fun and exciting 01:03:05.840 |
But it's not, it's like a really rough learning journey. 01:03:10.400 |
I've had many therapy sessions with founders, CEOs. 01:03:23.560 |
"Fortunately, the Dropbox founders are too stupid 01:03:30.120 |
You have a lot of companies that sound the same, 01:03:36.600 |
Do you have any advice for founders trying to navigate 01:03:39.160 |
like the idea maze today and like what they should do, 01:03:39.160 |
what are like counterintuitive things maybe to try? 01:03:46.880 |
bringing together some of what we've covered, 01:03:48.280 |
I think there's a lot of very common kind of category errors 01:03:52.680 |
One is, you know, thinking starting from the technology 01:03:58.800 |
And I think every founder has to start with what you know. 01:04:01.240 |
Like you're, yeah, you know, maybe if you're an engineer, 01:04:05.000 |
but don't know any of the other next, you know, hurdle, 01:04:17.920 |
And for me, that meant you have to be like super systematic 01:04:24.120 |
Like literally no one else will do that for you. 01:04:26.200 |
And so then you have to have like, all right, well, 01:04:30.360 |
one of the most helpful questions to ask there is like, 01:04:35.160 |
In three years from now, what do I wish in one year? 01:04:40.520 |
And so, for example, you know, when I was just starting 01:04:51.800 |
are really all you should be doing in that early phase. 01:05:00.600 |
but to get people, we're gonna need fundraise, 01:05:03.520 |
Okay, to raise money, we're gonna have to like, 01:05:04.720 |
have to answer all these questions we have to, 01:05:06.440 |
so you like work back from that and you're like, 01:05:10.240 |
And then, you know, the circle keeps expanding. 01:05:12.760 |
we're gonna need like accountants and lawyers and employees, 01:05:24.240 |
with like Microsoft, Google, Apple, Facebook, 01:05:27.840 |
And like, somehow we're gonna have to like deal with that. 01:05:29.360 |
And then that's like what the company's got to deal with. 01:05:31.520 |
And as CEO, I'm gonna be responsible for all that. 01:05:35.480 |
I'm gonna like need to know like what marketing is 01:05:37.920 |
and like what finance is and how to manage people, 01:05:44.480 |
And so, and then I think one thing people often do 01:05:51.560 |
You're like, oh, it seems so remote or far away 01:06:01.800 |
You're like, now I thought I didn't know a lot, 01:06:07.400 |
Like how do I learn all these different disciplines 01:06:13.640 |
that are sort of going through the same things. 01:06:16.600 |
Maybe reading was the single most helpful thing 01:06:18.600 |
more than any one person or talking to people, 01:06:23.320 |
But then there's a whole mindset piece of it, 01:06:24.880 |
which is sort of like you have to cut yourself 01:06:27.320 |
Like, you know, I wish someone had sort of sat me down 01:06:29.960 |
and told me like, dude, you may be an engineer, 01:06:40.760 |
So like this is actually something that's normal 01:06:47.080 |
didn't come out of the womb with like shiny hair 01:06:54.240 |
but I think there's a big piece of it around like discomfort 01:07:00.320 |
or I don't know if I'm ready for this, this, this. 01:07:09.200 |
So five weeks, you're not going to be a great leader, 01:07:12.080 |
manager, or a great public speaker or whatever. 01:07:15.240 |
a great guitar player or play a sport that well, 01:07:20.800 |
you can be pretty good at any of those things. 01:07:25.360 |
but you have like a lot more latent potential. 01:07:28.240 |
I mean, people have a lot more latent potential 01:07:36.080 |
- How do you think about that for building your team? 01:07:39.320 |
Obviously, that's a great example of building a dynasty 01:07:50.560 |
versus like how do you measure like the learning rate 01:07:53.720 |
And like, how do you think about picking and choosing? 01:07:59.400 |
And we've had a lot of success with great leaders 01:08:09.440 |
Our exec team is populated with a lot of those folks, 01:08:11.520 |
but there's also a lot of benefit to experience 01:08:18.080 |
And there's a lot of drawbacks to kind of learning 01:08:25.360 |
if they have like some experience to learn from. 01:08:28.840 |
Either you can have various organ rejection or misfit 01:08:32.320 |
or like overfitting from their past experience 01:08:37.880 |
I've kind of gotten all the mistake merit badges on that. 01:08:45.440 |
who are sort of in the biggest jobs of their lives, 01:08:47.680 |
do they either have someone that is managing them 01:08:50.000 |
that they can learn from, you know, as a CEO, 01:08:54.680 |
like you have to like surround or help support them. 01:08:57.920 |
So getting the mentors or getting first time execs 01:09:08.920 |
Okay, like, you know, there's usually these informal 01:09:13.480 |
And then, yeah, you just don't want to be too rotated 01:09:17.800 |
We've like overdone it on the high potential piece, 01:09:19.800 |
but then like everybody's kind of making dumb mistakes. 01:09:23.200 |
- The bad mistakes are the ones where you're like, 01:09:26.480 |
or like these are known knowns to the industry, 01:09:30.200 |
if they're like unknown unknowns to your team, 01:09:41.600 |
of whatever culture or practices they bring in 01:09:44.200 |
can create resentment or like lack of career opportunities. 01:09:47.920 |
So it's really about how do you get, you know, 01:09:49.480 |
it doesn't really matter if it's like exactly 50/50. 01:09:51.320 |
I don't think about a sort of perfect balance, 01:10:11.160 |
And, you know, we've talked a little bit about the product. 01:10:13.160 |
It's like universal search, universal access control, 01:10:16.080 |
a lot of rethinking, sharing for the modern environment. 01:10:20.960 |
and, you know, we could talk about the product, 01:10:22.040 |
but like the, it's just really exciting for me to like, 01:10:30.320 |
And there's probably a lot of people out there 01:10:35.800 |
or like, yeah, I used Dropbox like 10 years ago 01:10:44.760 |
And it's a lot of, it's been a lot of fun to, 01:10:49.880 |
has created all these new like paths forward for Dropbox 01:10:55.440 |
And then, yeah, to the founders, like, you know, 01:11:02.800 |
So we're pretty lucky to get to do what we do. 01:11:04.760 |
- Yeah, watch the Pats documentary on Apple TV. 01:11:04.760 |
but I'm like a lot more of a fan now, as you'd imagine. 01:11:24.760 |
- Awesome, well, thank you so much for the time, Drew.