
Building the Silicon Brain - Drew Houston of Dropbox


Chapters

0:00 Introductions
0:43 Drew's AI journey
4:14 Revalidating expectations of AI
8:23 Simulation in self-driving vs. knowledge work
12:14 Drew's AI Engineering setup
15:24 RAG vs. long context in AI models
18:06 From "FileGPT" to Dropbox AI
23:20 Is storage solved?
26:30 Products vs Features
30:48 Building trust for data access
33:42 Dropbox Dash and universal search
38:05 The evolution of Dropbox
42:39 Building a "silicon brain" for knowledge work
48:45 Open source AI and its impact
51:30 "Rent, Don't Buy" for AI
54:50 Staying relevant
58:57 Founder Mode
63:10 Advice for founders navigating AI
67:36 Building and managing teams in a growing company

Whisper Transcript

00:00:00.000 | (upbeat music)
00:00:02.580 | - Hey everyone.
00:00:05.400 | Welcome to the Latent Space Podcast.
00:00:06.840 | This is Alessio, partner and CTO at Decibel Partners.
00:00:09.600 | And there's no Swyx today,
00:00:11.120 | but I'm joined by Drew Houston of Dropbox.
00:00:13.280 | Welcome Drew.
00:00:14.120 | - Thanks for having me.
00:00:14.940 | - So we're not gonna talk about the Dropbox story.
00:00:17.000 | We're not gonna talk about the Chinatown bus
00:00:18.960 | and the flash drive and all that.
00:00:20.720 | I think you've talked enough about it.
00:00:22.640 | Where I wanna start is you as an AI engineer.
00:00:25.320 | So as you know, most of our audience is engineering folks,
00:00:28.000 | kind of like technology leaders.
00:00:29.520 | You obviously run Dropbox, which is a huge company,
00:00:31.920 | but you also do a lot of coding.
00:00:33.020 | I think you've spent almost 400 hours this year,
00:00:35.240 | just like coding.
00:00:36.320 | So let's start there.
00:00:37.640 | - Sure.
00:00:38.480 | - What was like the first interaction you had
00:00:39.960 | with like an LLM API
00:00:41.140 | and when did the journey start for you?
00:00:43.400 | - Yeah.
00:00:44.440 | Well, I think probably like all AI engineers
00:00:47.060 | or whatever you call an AI engineer,
00:00:48.760 | those people started out as engineers before that.
00:00:50.640 | So like engineering is my first love.
00:00:52.040 | I mean, I grew up as a little kid.
00:00:53.040 | I was that kid.
00:00:54.320 | My first line of code was five years old.
00:00:56.560 | Just really loved, I wanted to make computer games,
00:00:59.000 | like this whole path.
00:01:00.620 | That also led me into startups
00:01:02.320 | and eventually starting Dropbox.
00:01:04.080 | And then with AI specifically,
00:01:06.480 | I studied computer science.
00:01:07.840 | I did my undergrad,
00:01:09.320 | but I didn't do like grad level computer science.
00:01:12.520 | I sort of got distracted by all the startup things,
00:01:14.280 | so I didn't do grad level work.
00:01:15.920 | But several years ago, a couple of things happened.
00:01:18.280 | So one is,
00:01:19.560 | I knew I wanted to go from being an engineer to a founder.
00:01:22.280 | But the becoming a CEO part,
00:01:25.040 | I sort of backed into that job.
00:01:26.560 | And so a couple of realizations.
00:01:28.540 | One is that, I mean, there's a lot of repetitive
00:01:32.080 | and manual work you have to do as an executive
00:01:34.800 | that actually lends itself pretty well to automation,
00:01:38.080 | both for my own convenience
00:01:40.520 | and then out of interest in learning,
00:01:42.340 | I guess what we call classical machine learning these days.
00:01:45.520 | I started really trying to wrap my head
00:01:48.160 | around understanding machine learning
00:01:50.160 | and information retrieval more formally.
00:01:52.400 | So I'd say maybe 2016, 2017.
00:01:54.740 | I started writing these successively
00:01:56.660 | more elaborate scripts to understand basic classifiers
00:02:01.380 | and regression and again, basic information retrieval
00:02:04.460 | and NLP back in those days.
00:02:06.860 | And there's sort of two things that came out of that.
00:02:09.220 | One is techniques are super powerful.
00:02:11.680 | And even just studying old school machine learning
00:02:14.380 | was a pretty big inversion
00:02:16.340 | of the way I had learned engineering, right?
00:02:18.980 | I started programming.
00:02:20.540 | When everyone starts programming,
00:02:21.540 | you're the human
00:02:22.620 | giving the algorithm, spelling out to the computer
00:02:26.400 | how it should run it.
00:02:27.880 | And then machine learning,
00:02:28.720 | here's machine learning where it's like,
00:02:29.880 | actually flip that, give it sort of the answer you want
00:02:33.320 | and it'll figure out the algorithm,
00:02:35.140 | which was pretty mind-bending.
00:02:36.540 | And it was both pretty powerful
00:02:39.440 | when I would write tools to figure out time audits
00:02:43.240 | or where's my time going?
00:02:44.640 | Is this meeting a one-on-one or is it a recruiting thing
00:02:47.600 | or is it a product strategy thing?
00:02:49.600 | I started out doing that manually with my assistant
00:02:51.600 | but then found that this was a very automatable task,
00:02:55.480 | which also had the side effect
00:02:56.600 | of teaching me a lot about machine learning.
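
For readers who want to try this: a minimal sketch of that kind of meeting "time audit" classifier, assuming scikit-learn. The calendar titles and labels below are illustrative placeholders, not actual data.

```python
# Minimal sketch of a calendar time-audit classifier, assuming scikit-learn.
# The labeled examples are illustrative placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

titles = [
    "1:1 with Maria", "Weekly sync with direct report",
    "Phone screen - backend candidate", "Onsite interview debrief",
    "Q3 product strategy review", "Roadmap planning session",
]
labels = [
    "one_on_one", "one_on_one",
    "recruiting", "recruiting",
    "product_strategy", "product_strategy",
]

# TF-IDF over event titles feeds a simple linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(titles, labels)

print(model.predict(["Interview loop for iOS engineer"])[0])  # likely "recruiting"
```
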
00:02:59.880 | But then there was this big problem.
00:03:02.240 | It was very good at tabular structured data,
00:03:04.560 | but any time it hit the usual malformed English
00:03:07.440 | that humans speak, it would just fall over.
00:03:10.040 | I had to kind of abandon a lot of the things
00:03:12.140 | that I wanted to build
00:03:12.980 | 'cause there's no way to parse text.
00:03:15.120 | Maybe it would sort of identify the part of speech
00:03:17.480 | in a sentence or something.
00:03:19.040 | But then fast forward to the LLM,
00:03:20.840 | I mean, actually, I started trying something like this,
00:03:22.800 | what we would call very small LLMs
00:03:25.320 | before kind of the GPT class models.
00:03:27.880 | And it was super hard to get those things working.
00:03:31.000 | So these 500 million parameter models
00:03:32.600 | would just be hallucinating and repeating.
00:03:35.440 | So actually, I'd kind of written it off a little bit.
00:03:38.040 | But then the ChatGPT launch and GPT-3, for sure.
00:03:41.920 | And then once people figured out prompting
00:03:43.560 | and instruction tuning,
00:03:45.400 | this was sort of November-ish, 2022, like everybody else.
00:03:49.240 | The ChatGPT launch being the starting gun
00:03:52.200 | for the whole AI era of computing,
00:03:55.160 | and then having API access to 3,
00:03:56.880 | and then early access to GPT-4.
00:03:59.680 | I was like, "Oh, man, it's happening."
00:04:02.360 | And so I was literally on my honeymoon,
00:04:03.860 | and we're on a beach in Thailand,
00:04:05.800 | and I'm coding these AI tools to automate writing
00:04:10.400 | or to assist with writing and all these different use cases.
00:04:12.840 | - You're like, "I'm never going back to work.
00:04:14.300 | "I'm gonna automate all of it before I get to 5G."
00:04:17.240 | - Ever since then, I've always been coding prototypes
00:04:20.880 | and just stuff to make my life more convenient,
00:04:23.240 | but that escalated a lot after '22.
00:04:26.080 | And yeah, I checked.
00:04:28.160 | I think it was probably over 400 hours this year,
00:04:30.880 | so far, coding, 'cause I had my paternity leave
00:04:33.160 | where I was able to work on some special projects.
00:04:35.120 | But yeah, it's a super important part
00:04:37.860 | of my whole learning journey,
00:04:39.640 | is being really hands-on with these things.
00:04:41.680 | And it's probably not a typical recipe,
00:04:43.760 | but I really love to get down to the metal
00:04:46.280 | as far as how this stuff works.
00:04:47.360 | - Yeah, Swyx and I were with Sam Altman in October '22.
00:04:50.360 | We were at a hack day at OpenAI,
00:04:52.640 | and that's why we started this podcast eventually.
00:04:54.960 | But you did an interview with Sam seven years ago,
00:04:58.200 | and he asked you, "What's the biggest opportunity
00:05:00.040 | "in startups?"
00:05:00.880 | And you were like, "Machine learning and AI."
00:05:02.880 | And you were almost too early, right?
00:05:04.760 | It's like, maybe seven years ago,
00:05:06.120 | the models weren't quite there.
00:05:07.880 | How should people think about revalidating expectations
00:05:11.780 | of this technology?
00:05:12.800 | I think even today, people will tell you,
00:05:14.320 | "Oh, models are not really good at X
00:05:16.360 | "because they were not good 12 months ago,
00:05:17.720 | "but they're good today."
00:05:19.000 | What's your process for that?
00:05:19.840 | - Heuristics for thinking about that, or how is...
00:05:22.160 | Yeah, I think the way I look at it now
00:05:23.960 | is pretty, has evolved a lot since when I started.
00:05:27.280 | I mean, I think everybody intuitively starts with like,
00:05:29.440 | "All right, let's try to predict the future,
00:05:30.920 | "or imagine what's this great end state
00:05:32.440 | "we're gonna get to."
00:05:33.720 | And the tricky thing is often those prognostications
00:05:36.680 | are right, but they're right in terms of direction,
00:05:39.200 | but not when.
00:05:40.520 | For example, even in the early days of the internet,
00:05:43.320 | the '90s, when things were even like text-based,
00:05:45.040 | and even before the browser, things like that,
00:05:48.640 | people were like, "Oh man, you're gonna have,
00:05:50.580 | "you're gonna be able to order food,
00:05:52.480 | "get Snickers delivered to your house,
00:05:54.500 | "you're gonna be able to watch any movie ever created."
00:05:57.320 | And they were right, but they were like,
00:05:58.800 | "It took 20 years for that to actually happen."
00:06:01.600 | And before you got to DoorDash, you had to get,
00:06:03.920 | you started with like Webvan and Kozmo,
00:06:05.560 | and before you get to Spotify, you had to do like Napster,
00:06:07.880 | and Kazaa, and LimeWire, and a bunch of like,
00:06:10.360 | broken Britney Spears MP3s, and malware.
00:06:14.000 | So I think the big lesson is being early
00:06:16.440 | is the same as being wrong.
00:06:17.600 | Being late is the same as being wrong.
00:06:19.460 | So really, how do you calibrate timing?
00:06:21.480 | And then I think with AI, it's the same thing.
00:06:23.960 | People are like, "Oh, it's gonna completely upend society
00:06:25.820 | "in all these positive and negative ways."
00:06:27.340 | I think that's, like most of those things
00:06:28.880 | are gonna come true.
00:06:30.760 | The question is like, when is that gonna happen?
00:06:32.680 | And then with AI specifically, I think there's also,
00:06:36.680 | in addition to sort of the general tech category,
00:06:38.720 | or like jumping too fast to the future,
00:06:41.320 | I think that AI is particularly susceptible to that.
00:06:44.520 | And you look at self-driving, right?
00:06:46.320 | This idea of like, "Oh my God,
00:06:47.560 | "you can have a self-driving car,"
00:06:49.340 | captured everybody's imaginations 10, 12 years ago.
00:06:52.500 | And people are like, "Oh man, in two years,
00:06:54.400 | "there's not gonna be another,
00:06:55.320 | "there's not gonna be a human driver on the road to be seen."
00:06:58.160 | It didn't work out that way, right?
00:06:59.480 | We're still 10, 12 years later,
00:07:00.880 | where we're in a world where you can sort of sometimes
00:07:03.200 | get a Waymo in like one city on Earth.
00:07:05.880 | Exciting, but just took a lot longer than people think.
00:07:09.640 | And the reason is there's a lot of like time,
00:07:11.720 | there's like a lot of engineering challenges,
00:07:13.600 | but then there's a lot of other like societal time constants
00:07:15.840 | that are hard to compress.
00:07:17.740 | So one thing I think what you can learn
00:07:19.360 | from things like self-driving is
00:07:20.840 | they have these levels of autonomy
00:07:22.560 | that's a useful kind of framework in driving,
00:07:24.600 | or these like maturity levels.
00:07:26.300 | People sort of skip to like level five, full autonomy,
00:07:28.600 | or we're gonna have like an autonomous knowledge worker,
00:07:31.320 | and then we won't need humans anymore,
00:07:33.000 | that kind of projection,
00:07:34.800 | and that's gonna take a long time.
00:07:37.300 | But then when you think about level one or level two,
00:07:39.840 | like these little assistive experiences,
00:07:42.320 | you know, we're seeing a lot of traction with those.
00:07:44.160 | So what you see really working is the level one autonomy
00:07:49.160 | in the AI world would be like the tab autocomplete
00:07:51.440 | and Copilot, right?
00:07:52.960 | GitHub Copilot.
00:07:53.800 | And then, you know, maybe a little higher
00:07:55.360 | is like the chatbot type interface.
00:07:57.640 | Obviously you wanna get to the highest level you can
00:07:59.960 | to build a good product,
00:08:01.240 | but the reliability just isn't,
00:08:03.600 | and the capability just isn't there in the early innings.
00:08:05.840 | And so, and then you think of other level one,
00:08:09.000 | level two type things,
00:08:09.840 | like Google Maps probably did more for self-driving
00:08:11.960 | than literal self-driving.
00:08:13.480 | Like a billion people have like the ability
00:08:15.720 | to have like maps and navigation
00:08:16.840 | just like taken care of for you autonomously.
00:08:19.220 | So I think that timing and maturity
00:08:20.540 | are really important factors to include.
00:08:23.280 | - The thing with self-driving,
00:08:24.360 | maybe one of the big breakthroughs was like simulation.
00:08:26.720 | So it's like, okay, instead of driving,
00:08:27.960 | we can simulate these environments.
00:08:29.720 | It's really hard to do with knowledge work.
00:08:31.440 | You know, how do you simulate like a product review?
00:08:33.500 | How do you simulate these things?
00:08:34.880 | I'm curious if you've done any experiments.
00:08:36.860 | I know some companies have started to build
00:08:38.320 | kind of like virtual personas
00:08:39.920 | that you can like bounce ideas off of.
00:08:42.120 | - I mean, fortunately in a company,
00:08:44.620 | you generate lots of, you know,
00:08:46.760 | actual human training data all the time.
00:08:49.520 | And then I also just like start with myself.
00:08:51.320 | Like, all right, I can triage, you know.
00:08:53.120 | It's pretty tricky even within your company to be like,
00:08:55.240 | all right, let's open all this up as quote training data.
00:08:57.680 | But, you know, I can start with my own emails
00:08:59.720 | or my own calendar or own stuff
00:09:01.080 | without running into the same kind of like privacy
00:09:04.720 | or other concerns.
00:09:06.440 | So I often like start with my own stuff.
00:09:08.200 | And so that is like one level of bootstrapping.
00:09:10.520 | But actually four or five years ago during COVID,
00:09:14.400 | we decided, you know, a lot of companies were thinking about
00:09:16.480 | how do we go back to work?
00:09:17.720 | We decided to really lean into remote and distributed work
00:09:20.720 | because I thought, you know,
00:09:22.280 | this is gonna be the biggest change
00:09:23.480 | to the way we work in our lifetimes.
00:09:25.640 | And COVID kind of ripped up a bunch of things,
00:09:27.920 | but I think everybody was sort of pleasantly surprised
00:09:30.400 | how it was a lot of knowledge work.
00:09:32.360 | You could just keep going.
00:09:33.680 | And actually you sort of found work was decoupled
00:09:36.040 | from your physical environment,
00:09:37.520 | from being in a physical place,
00:09:39.320 | which meant that things people had dreamed about
00:09:41.680 | since the '50s or '60s, like telework,
00:09:43.440 | like you actually could work from anywhere.
00:09:45.360 | And that was now possible.
00:09:46.400 | So we decided to really lean into that
00:09:48.040 | 'cause we debated,
00:09:48.880 | should we sort of hit the fast forward button
00:09:50.520 | or should we hit the rewind button, go back to 2019?
00:09:53.040 | You know, obviously that's been playing out
00:09:54.200 | over the last few years.
00:09:55.560 | And we decided to basically turn,
00:09:56.960 | we went like 90% remote.
00:09:58.520 | We still, the in-person part's really important.
00:10:00.400 | We can kind of come back to our working model,
00:10:01.960 | but we're like, yeah, this is,
00:10:04.360 | everybody is gonna be in some kind of like distributed
00:10:06.640 | or hybrid state.
00:10:07.640 | So like, instead of like running away from this,
00:10:10.800 | like let's do a full send, let's really go into it.
00:10:13.160 | Let's live in the future a few years before our customers.
00:10:16.440 | Let's like turn Dropbox into a lab for distributed work.
00:10:19.920 | And we do that like quite literally,
00:10:21.320 | both with our working model
00:10:22.280 | and then increasingly with our products.
00:10:23.840 | And then absolutely, like we have products like Dropbox Dash,
00:10:27.920 | which is a universal search product.
00:10:30.600 | That was like very elevated in priority for me after COVID
00:10:33.640 | because like now you have,
00:10:35.800 | we're putting a lot more stress on the system
00:10:37.440 | and on our screens.
00:10:38.960 | It's a lot more chaotic and overwhelming.
00:10:40.680 | And so even just like getting the right information
00:10:42.600 | to the right person at the right time
00:10:43.800 | is a big fundamental challenge in knowledge work,
00:10:45.400 | and in the distributed world,
00:10:47.240 | that problem has just, you know,
00:10:49.560 | been getting bigger.
00:10:51.160 | And then for a lot of these other workflows,
00:10:52.600 | yeah, there's, we can both get a lot of natural
00:10:55.640 | like training data from just our own like strategy docs
00:10:58.360 | and processes.
00:10:59.800 | There's obviously a lot you can do with synthetic data.
00:11:01.960 | And, you know, actually like LLMs are pretty good
00:11:04.960 | at being like imitating generic knowledge workers.
00:11:07.920 | So it's kind of funny that way.
00:11:10.320 | But yeah, the way I look at it is like really turn Dropbox
00:11:13.960 | into a lab for distributed work.
00:11:15.480 | You think about things like,
00:11:17.200 | what are the big problems we're going to have?
00:11:18.640 | The complexity on our screens just keeps growing.
00:11:20.960 | And the whole environment gets kind of more out of sync
00:11:23.560 | with what makes us like cognitively productive and engaged.
00:11:28.000 | And then even something like Dash was initially seeded.
00:11:30.440 | I made a little personal search engine
00:11:31.920 | 'cause I was just like personally frustrated
00:11:33.280 | with not being able to find my stuff.
00:11:35.520 | And along that whole learning journey with AI,
00:11:37.760 | like the vector search or semantic search,
00:11:40.160 | the tooling for that had just matured.
00:11:42.960 | The open source stuff had finally gotten to a place
00:11:44.600 | where it was a pretty good developer experience.
00:11:45.920 | And so, you know, in a few days I had sort of a
00:11:48.800 | hello world type search engine.
00:11:50.280 | I'm like, oh my God, like this completely works.
00:11:53.080 | You don't even have to get the keywords right.
00:11:54.920 | The relevance and ranking is super good,
00:11:57.440 | even untuned.
00:11:59.200 | So I guess that's to say, like,
00:12:00.600 | I've been pretty surprised by it.
00:12:01.600 | If you choose like the right algorithm and the right approach
00:12:03.600 | you can actually get like super good results
00:12:05.120 | without having like a ton of data.
00:12:07.000 | And even with LLMs,
00:12:08.440 | you can apply all these other techniques to give them,
00:12:11.080 | kind of bootstrap kind of like task maturity pretty quickly.
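
A hello-world semantic search in that spirit might look like the following, assuming the sentence-transformers package; the model name and documents are illustrative.

```python
# Hello-world semantic search, assuming sentence-transformers.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small open-source embedder

docs = [
    "Q3 planning notes: headcount and budget",
    "Dash launch announcement draft",
    "Notes from the distributed-work offsite",
]
doc_vecs = model.encode(docs, convert_to_tensor=True)

# The query shares no keywords with the best document;
# embeddings capture the meaning anyway.
query = "how much are we spending next quarter"
query_vec = model.encode(query, convert_to_tensor=True)

scores = util.cos_sim(query_vec, doc_vecs)[0]
print(docs[scores.argmax().item()])  # likely the Q3 planning doc
```
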
00:12:14.680 | - Before we jump into Dash,
00:12:16.200 | let's talk about your AI engineering setup.
00:12:18.840 | So IDE, let's break that down.
00:12:21.360 | What IDE do you use?
00:12:22.480 | Do you use Cursor, VS Code?
00:12:24.000 | Do you use any coding assistant?
00:12:25.840 | Like with chat, is it just auto-complete?
00:12:27.720 | - Yeah, yeah, both.
00:12:29.040 | So I use VS Code as like my daily driver,
00:12:30.920 | although I'm like super excited about things like Cursor
00:12:32.840 | or the AI agents.
00:12:34.280 | I have my own like stack underneath that.
00:12:36.720 | I mean, some off the shelf parts, some pretty custom.
00:12:39.200 | So I use the continue.dev,
00:12:41.280 | just like AI chat UI basically as just the UI layer,
00:12:44.280 | but I also proxy the requests.
00:12:47.240 | I proxy requests to my own backend,
00:12:49.800 | which is sort of like a router.
00:12:50.720 | You can use any backend.
00:12:52.840 | I mean, Sonnet 3.5 is probably the best all around.
00:12:55.920 | But then these things are like pretty limited
00:12:57.800 | if you don't give them the right context.
00:12:58.960 | And so part of what the proxy does is like,
00:13:01.000 | there's a separate thing where I can say like,
00:13:02.640 | include all these files by default with the request.
00:13:05.720 | Then it becomes a lot easier
00:13:07.360 | and like without like cutting and pasting.
00:13:09.040 | And I'm building mostly like prototype toy apps,
00:13:11.080 | so it's like a front end React thing
00:13:12.760 | and a Python backend thing.
00:13:15.080 | And so it can do these like end to end diffs basically.
00:13:18.280 | And then I also like love being able
00:13:20.280 | to host everything locally or do it offline.
00:13:22.560 | So I have my own, when I'm on a plane or something,
00:13:24.640 | or where like you don't have access
00:13:26.600 | or the internet's not reliable,
00:13:28.240 | I actually bring a gaming laptop on the plane with me.
00:13:31.480 | It's like a little like blue briefcase looking thing.
00:13:33.520 | And then I like literally hook up a GPU
00:13:35.280 | like into one of the outlets.
00:13:36.680 | And then I have, I can do like transcription.
00:13:39.080 | I can do like auto-complete.
00:13:41.040 | Like an 8 billion parameter Llama will run fine.
00:13:44.680 | - And are you using like Ollama to run the model?
00:13:47.160 | - No, I use, I have my own like LLM inference stack.
00:13:50.640 | I mean, I use this, the backend is somewhat interchangeable.
00:13:53.440 | So everything from like ExLlama to vLLM or SGLang,
00:13:58.080 | there's a bunch of these different backends you can use.
00:14:00.440 | And then I started like working on stuff
00:14:01.760 | before all this tooling was like really available.
00:14:03.960 | So, you know, over the last several years,
00:14:05.840 | I've built like my own like whole crazy environment
00:14:08.280 | and stack here.
00:14:09.960 | So I'm a little nuts about it.
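
A rough sketch of the proxy setup described above, assuming Flask in front of any OpenAI-compatible backend (vLLM, SGLang, etc.); the URLs, port, and default file list are placeholders.

```python
# Sketch of a context-injecting LLM proxy, assuming Flask and an
# OpenAI-compatible backend. URLs and file list are placeholders.
from pathlib import Path

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
BACKEND_URL = "http://localhost:8000/v1/chat/completions"  # vLLM, SGLang, ...
DEFAULT_CONTEXT_FILES = ["backend/app.py", "frontend/src/App.tsx"]

def context_blob() -> str:
    """Concatenate the default project files into one context string."""
    parts = []
    for name in DEFAULT_CONTEXT_FILES:
        path = Path(name)
        if path.exists():
            parts.append(f"--- {name} ---\n{path.read_text()}")
    return "\n\n".join(parts)

@app.route("/v1/chat/completions", methods=["POST"])
def proxy():
    body = request.get_json()
    # Prepend the project files as a system message, so the chat UI
    # gets full context without any cutting and pasting.
    body["messages"] = [{"role": "system", "content": context_blob()}] + body["messages"]
    resp = requests.post(BACKEND_URL, json=body, timeout=600)
    return jsonify(resp.json()), resp.status_code

if __name__ == "__main__":
    app.run(port=9000)
```

Pointing a chat UI like Continue at the proxy instead of the backend is then just a base-URL change.
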
00:14:12.280 | - What's the state-of-the-art for,
00:14:14.720 | I guess not state-of-the-art,
00:14:15.800 | but like when it comes to like frameworks
00:14:17.160 | and things like that, do you like using them?
00:14:19.160 | I think maybe a lot of people say,
00:14:20.320 | hey, things change so quickly.
00:14:21.720 | They're like trying to abstract things.
00:14:23.240 | - Yeah. - It's maybe too early.
00:14:24.680 | - As much as I do a lot of coding,
00:14:26.840 | I have to be pretty surgical with my time.
00:14:28.360 | Like I don't have that much time,
00:14:29.960 | which means I have to sort of like scope my innovation
00:14:32.120 | to like very specific places or like my time.
00:14:35.520 | So for the front end, it'll be like a pretty vanilla stack,
00:14:38.280 | like, you know, Next.js, React based thing.
00:14:41.720 | And then these are toy apps.
00:14:43.320 | So it's like Python, Flask, SQLite,
00:14:45.640 | and then there's all the different,
00:14:46.840 | there's a whole other thing on like the backend,
00:14:48.640 | like how do you run all these models locally
00:14:51.440 | or with a local GPU.
00:14:53.240 | The sort of scaffolding on the front end
00:14:54.760 | is pretty straightforward.
00:14:55.640 | The scaffolding on the backend is pretty straightforward,
00:14:57.560 | but then a lot of it is just like the LLM inference
00:15:00.160 | and control over like fine grain aspects
00:15:02.000 | of how you do generation, caching, things like that.
00:15:05.840 | And then a lot of the work is,
00:15:07.720 | how do you go to, say, IMAP
00:15:09.680 | and take an email, or a document,
00:15:12.960 | or a spreadsheet, or, you know,
00:15:14.440 | any of these kinds of primitives that you work with,
00:15:16.920 | and then translate them, you know,
00:15:18.440 | render them in a format that an LLM can understand.
00:15:20.880 | So there's like a lot of work that goes into that too.
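
A sketch of that translation step for email, using only Python's standard-library imaplib and email modules; the server, credentials, and mailbox are placeholders.

```python
# Fetch the latest email over IMAP and render it as plain text
# an LLM can read. Server and credentials are placeholders.
import email
import imaplib
from email.policy import default

with imaplib.IMAP4_SSL("imap.example.com") as imap:
    imap.login("user@example.com", "app-password")
    imap.select("INBOX")
    _, data = imap.search(None, "ALL")
    latest_id = data[0].split()[-1].decode()
    _, msg_data = imap.fetch(latest_id, "(RFC822)")

msg = email.message_from_bytes(msg_data[0][1], policy=default)
body = msg.get_body(preferencelist=("plain",))

# Flatten headers plus body into a prompt-friendly format.
rendered = (
    f"From: {msg['From']}\n"
    f"Subject: {msg['Subject']}\n"
    f"Date: {msg['Date']}\n\n"
    f"{body.get_content() if body else '(no plain-text part)'}"
)
print(rendered)
```
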
00:15:24.160 | - Yeah, I built a kind of like email triage assistant.
00:15:27.600 | And like, I would say 80% of the code is like Google
00:15:29.920 | OAuth, and like pulling the emails,
00:15:31.600 | and then the actual AI part is pretty easy.
00:15:33.960 | - Yeah, and even, same experience.
00:15:36.320 | And then I tried to do all these like NLP things,
00:15:39.440 | and then to my dismay, like a bunch of regexes
00:15:42.640 | got you like 95% of the way there.
00:15:45.360 | So I still leave it running.
00:15:46.920 | I just haven't really built
00:15:49.000 | the LLM-powered version of it yet.
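
For flavor, a regex-first triage pass along those lines can be surprisingly small; the patterns and labels here are illustrative.

```python
# Sketch of regex-first email triage; patterns are illustrative.
import re

RULES = [
    (re.compile(r"\bunsubscribe\b", re.I), "newsletter"),
    (re.compile(r"\b(invoice|receipt|payment)\b", re.I), "billing"),
    (re.compile(r"\b(urgent|asap|eod)\b", re.I), "needs_reply"),
]

def triage(subject: str, body: str) -> str:
    text = f"{subject}\n{body}"
    for pattern, label in RULES:
        if pattern.search(text):
            return label
    return "unclassified"  # the last ~5% is where an LLM pass would help

print(triage("Your invoice for May", "Payment is due in 30 days"))  # billing
```
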
00:15:50.560 | - Yeah, yeah, yeah.
00:15:51.400 | Any thoughts on RAG versus long context,
00:15:53.880 | especially, I mean, with Dropbox, you know,
00:15:55.880 | do you just want to shove things in?
00:15:57.160 | Like, have you seen that be a lot better?
00:15:59.360 | - Well, they kind of have different strengths and weaknesses.
00:16:00.840 | So you need both for different use cases.
00:16:03.240 | I mean, it's been awesome in the last 12 months.
00:16:05.200 | Like now you have these like long context models
00:16:07.360 | that can actually do a lot.
00:16:09.000 | You know, you can put a book in Sonnet's context.
00:16:12.240 | And then now with the later versions of Llama,
00:16:14.480 | you can have 128K context.
00:16:16.320 | So that's sort of the new normal, which is awesome.
00:16:18.000 | And that wasn't even the case a year ago.
00:16:21.400 | That said, models don't always use
00:16:24.000 | the full context well, and certainly
00:16:25.120 | local models don't, fully, yet.
00:16:27.880 | And actually, if you provide too much irrelevant context,
00:16:30.440 | the quality degrades a lot.
00:16:32.440 | And so I say in the open source world,
00:16:34.040 | like we're still just getting to the cusp
00:16:36.240 | of like the full context is usable.
00:16:38.560 | And then of course, like when you're
00:16:40.600 | something like Dropbox Dash,
00:16:42.360 | it's basically building this whole like brain
00:16:44.120 | that's like read everything your company's ever written.
00:16:46.840 | And so that's not going to fit into your context window.
00:16:48.880 | So you need RAG just as a practical reality.
00:16:51.400 | And for a lot of similar reasons,
00:16:52.480 | you need like RAM and hard disk
00:16:54.280 | in conventional computer architecture.
00:16:56.680 | And I think these things will keep like horse trading.
00:16:58.640 | Like maybe if, you know, a million or 10 million
00:17:00.920 | tokens is the new context length,
00:17:03.440 | maybe that shifts.
00:17:05.000 | Maybe the bigger picture is like,
00:17:06.920 | it's super exciting to talk about the LLM
00:17:08.440 | and like that piece of the puzzle,
00:17:10.080 | but there's this whole other scaffolding
00:17:12.280 | of more conventional like retrieval
00:17:14.360 | or conventional machine learning,
00:17:16.160 | especially because you have to scale up products
00:17:18.280 | to like millions of people.
00:17:20.240 | You do, and your toy app is not going to scale to that
00:17:22.960 | from a cost or latency or performance standpoint.
00:17:25.440 | So I think you really need these like hybrid architectures
00:17:28.880 | where you have very purpose-fit tools.
00:17:32.000 | You're probably not using Sonnet 3.5
00:17:33.960 | for all of your normal product use cases.
00:17:36.760 | You're going to use like a fine-tuned 8 billion model
00:17:38.960 | or sort of the minimum model that gets you the right output.
00:17:42.640 | And then a smaller model also has
00:17:44.920 | much better cost and latency
00:17:46.840 | characteristics on that front.
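
A sketch of that RAG-versus-long-context decision; `count_tokens` and `vector_index` are assumed helpers, and the window sizes are illustrative.

```python
# Stuff documents directly when they fit the context window ("RAM"),
# fall back to retrieval over the corpus ("hard disk") when they don't.
# `count_tokens` and `vector_index` are assumed, illustrative helpers.

CONTEXT_WINDOW = 128_000  # e.g., a recent Llama-class model
RESERVED = 8_000          # room for the question and the answer

def build_context(docs: list[str], query: str, count_tokens, vector_index) -> str:
    total = sum(count_tokens(d) for d in docs)
    if total <= CONTEXT_WINDOW - RESERVED:
        # Long-context path: everything fits, so include it all.
        return "\n\n".join(docs)
    # RAG path: retrieve only the relevant chunks, since too much
    # irrelevant context degrades quality.
    top_chunks = vector_index.search(query, k=20)
    return "\n\n".join(chunk.text for chunk in top_chunks)
```
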
00:17:48.400 | - Yeah. Let's jump into the Dropbox AI story.
00:17:50.720 | So your initial prototype was FileGPT.
00:17:54.920 | - Yeah.
00:17:55.800 | - How did it start?
00:17:56.720 | And then how did you communicate that internally?
00:17:59.440 | You know, I know you have a pretty strong like memo culture.
00:18:02.040 | - Sure.
00:18:02.880 | - One where you're like, okay, hey,
00:18:04.360 | we got to really take this seriously.
00:18:05.680 | - Yeah.
00:18:06.520 | Well, on the latter, I'll say
00:18:09.800 | taking AI seriously at Dropbox
00:18:11.400 | as a company started
00:18:14.360 | kind of around that honeymoon time, unfortunately.
00:18:17.680 | In January, I wrote this like memo to the company,
00:18:19.880 | like around basically like how we need to play offense
00:18:23.720 | in 23 and that most of the time,
00:18:26.280 | the kind of concrete is set and like the winners
00:18:29.080 | are the winners and things are kind of frozen.
00:18:32.120 | But then with these new eras of computing,
00:18:34.800 | like the PC or the internet or the phone
00:18:37.040 | or the concrete unfreezes and you can sort of build,
00:18:39.360 | do things differently and have a new set of winners.
00:18:41.760 | It's sort of like a new season starts.
00:18:43.520 | As a result of a lot of that sort of personal hacking
00:18:45.880 | and just like thinking about this, I'm like, yeah,
00:18:47.600 | this is an inflection point in the industry.
00:18:49.040 | Like we really need to change
00:18:50.640 | how we think about our strategy.
00:18:52.360 | And then becoming an AI first company
00:18:55.440 | was probably the headline thing that we did.
00:18:58.760 | And then calling on everybody
00:19:02.000 | in the company to really think about in your world,
00:19:03.520 | how is AI going to reshape your workflows
00:19:05.440 | or what's sort of the AI native way
00:19:07.320 | of thinking about your job.
00:19:08.640 | And FileGPT, which is sort of this Dropbox AI
00:19:12.360 | kind of initial concept that actually came
00:19:14.320 | from our engineering team as we like called on everybody
00:19:17.320 | to like really think about what we should be doing
00:19:20.080 | that's new or different.
00:19:21.200 | So it was kind of organic and bottoms up
00:19:23.040 | like a bunch of engineers just kind of hacked that together.
00:19:25.880 | And then that materialized as basically
00:19:27.840 | when you preview a file on Dropbox,
00:19:29.600 | you can have kind of the most straightforward
00:19:31.520 | possible integration of AI, which is a good thing.
00:19:34.240 | Like basically you have a long PDF,
00:19:36.840 | you want to be able to ask questions of it.
00:19:38.680 | And so like a pretty basic implementation of RAG
00:19:41.200 | and being able to do that
00:19:42.760 | when you preview a file on Dropbox.
00:19:44.800 | So that was the origin of that.
00:19:46.280 | And that was like back in 2023,
00:19:47.880 | when we released it, just as the starting engines
00:19:50.000 | had, you know, gotten going.
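
The "most straightforward possible integration" described above is roughly this shape; a sketch assuming pypdf and sentence-transformers, with the final LLM call left as a placeholder.

```python
# Basic RAG over one PDF: extract, chunk, embed, retrieve, then prompt.
# Libraries and the file path are illustrative; the LLM call is omitted.
from pypdf import PdfReader
from sentence_transformers import SentenceTransformer, util

reader = PdfReader("long_document.pdf")
text = "\n".join(page.extract_text() or "" for page in reader.pages)

# Naive fixed-size chunking; a real system would split more carefully.
chunks = [text[i:i + 1500] for i in range(0, len(text), 1500)]

model = SentenceTransformer("all-MiniLM-L6-v2")
chunk_vecs = model.encode(chunks, convert_to_tensor=True)

question = "What are the termination clauses?"
scores = util.cos_sim(model.encode(question, convert_to_tensor=True), chunk_vecs)[0]
top = [chunks[i] for i in scores.topk(min(3, len(chunks))).indices.tolist()]

prompt = "Answer using only this context:\n\n" + "\n---\n".join(top) + f"\n\nQ: {question}"
# send `prompt` to whatever chat model you use
```
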
00:19:53.080 | - It's funny where you're basically like these files
00:19:55.600 | that people have, they really don't want them in a way,
00:19:57.760 | you know, like you're storing all these files
00:19:59.240 | and like you actually don't want to interact with them.
00:20:00.760 | You want a layer on top of it.
00:20:02.800 | And that's kind of what also takes you to Dash eventually,
00:20:05.480 | which is like, hey,
00:20:06.320 | you actually don't really care where the file is.
00:20:08.440 | You just want to be the place that aggregates it.
00:20:10.880 | How do you think about what people will know about files?
00:20:14.160 | You know, are files the actual file?
00:20:16.720 | Are files like the metadata
00:20:18.200 | and they're just kind of like a pointer that goes somewhere
00:20:20.040 | and you don't really care where it is?
00:20:21.720 | Yeah, any thoughts about?
00:20:22.760 | - Totally, yeah.
00:20:23.720 | I mean, there's a lot of potential complexity
00:20:26.120 | in that question, right?
00:20:27.040 | Or is it a, you know,
00:20:28.160 | what's the difference between a file and a URL?
00:20:30.600 | And you can go into the technicals,
00:20:31.720 | it's like pass-by-value, pass-by-reference.
00:20:34.160 | Okay, what's the format like?
00:20:35.600 | All right, well, now if it's real-time collaborative,
00:20:37.240 | it's not really a flat file.
00:20:38.680 | It's like structured data you're sort of collaborating on,
00:20:41.360 | you know, that's kept in sync, blah, blah, blah.
00:20:43.240 | I actually don't start there at all.
00:20:44.960 | I just start with like, what do people,
00:20:46.520 | like what do humans,
00:20:47.600 | let's work back from like how humans think about this stuff
00:20:49.560 | or how they should think about this stuff.
00:20:51.360 | Meaning like, I don't think about,
00:20:53.160 | oh, here are my files and here are my links or cloud docs.
00:20:56.640 | I'm just sort of like, oh, here's my stuff.
00:20:58.960 | Here's sort of my documents.
00:21:00.040 | Here's my media.
00:21:00.880 | Here's my projects.
00:21:01.720 | Here are the people I'm working with.
00:21:03.160 | So it starts from primitives more like those,
00:21:05.480 | like how do humans think about these things?
00:21:07.680 | And then start from like a more ideal experience.
00:21:11.520 | 'Cause if you think about it,
00:21:12.760 | we kind of have this situation that we'll look like,
00:21:14.720 | particularly medieval in hindsight,
00:21:16.360 | where, all right, how do you manage your work stuff?
00:21:18.520 | Well, on one side of your screen,
00:21:20.640 | you have this file browser
00:21:21.920 | that literally hasn't changed since the early '80s, right?
00:21:24.440 | You could take someone from the original Mac
00:21:26.640 | and sit them in front of a computer
00:21:28.200 | and they'd be like, this is it.
00:21:29.600 | And it's been 40 years, right?
00:21:31.920 | Then on the other side of your screen,
00:21:33.120 | you have Chrome or a browser that has so many tabs open,
00:21:36.320 | you can no longer see text or titles.
00:21:39.280 | This is the state of the art
00:21:40.680 | for how we manage stuff at work.
00:21:43.040 | Interestingly, neither of those experiences
00:21:45.200 | was purpose-built to be the home for your work stuff
00:21:48.680 | or even anything related to it.
00:21:51.040 | And so it's important to remember
00:21:52.960 | we get stuck in these local maxima pretty often in tech,
00:21:56.680 | where we're obviously aware that files are not going away,
00:22:00.200 | especially in certain domains,
00:22:02.000 | like the format really matters
00:22:03.520 | and where files are still gonna be the tool you use
00:22:07.120 | if there's something big, right?
00:22:09.080 | If you have a big video file,
00:22:11.800 | that kind of format in a file makes sense.
00:22:14.840 | There's a bunch of industries
00:22:15.960 | where it's like construction or architecture
00:22:17.600 | or sort of these domain-specific areas.
00:22:20.600 | Media generally, if you're making music or photos or video,
00:22:24.200 | that all kind of fits in the big file zone
00:22:26.160 | where Dropbox is really strong
00:22:27.680 | and that's what customers love us for.
00:22:29.600 | But it's also pretty obvious
00:22:31.800 | that a lot of stuff that used to be in Word docs
00:22:35.200 | or Excel files, all that has tilted towards the browser
00:22:38.080 | and that tilt is gonna continue.
00:22:40.040 | So with Dash, we wanted to make something
00:22:41.080 | that was really cloud-native, AI-native,
00:22:44.960 | and deliberately not be tied down
00:22:47.720 | to the abstractions of the file system.
00:22:50.760 | Now, on the other hand, it would be ironic and bad
00:22:54.120 | if we then fractured the experience
00:22:55.960 | that you're like, well, if it touches a file,
00:22:57.280 | it's a syncing metaphor to this app.
00:22:59.360 | And if it's a URL,
00:23:00.960 | it's this completely different interface.
00:23:02.920 | So there's a convergence that I think makes sense over time.
00:23:06.560 | But I think you have to start
00:23:08.600 | from not so much the technology,
00:23:10.720 | start from what do the humans want?
00:23:13.200 | And then what's the idealized product experience?
00:23:15.280 | And then what are the technical underpinnings of that
00:23:18.240 | that can make that good experience?
00:23:20.200 | - I think it's counterintuitive that in Dash,
00:23:22.240 | you can connect Google Drive, right?
00:23:24.000 | Because you think about Dropbox as well,
00:23:25.960 | it's file storage,
00:23:26.800 | you really don't want people to store a file somewhere,
00:23:28.520 | but the reality is that they do.
00:23:30.480 | How do you think about the importance of storage?
00:23:32.440 | And do you kind of feel storage is almost solved,
00:23:34.440 | where it's like, hey,
00:23:35.280 | you can kind of store these files anywhere?
00:23:36.680 | What matters is access?
00:23:38.200 | - It's a little bit nuanced in that
00:23:40.640 | if you're dealing with large quantities of data,
00:23:42.680 | it actually does matter.
00:23:43.680 | The implementation matters a lot
00:23:44.960 | or if you're dealing with 10 gig video files like that.
00:23:48.280 | Then you sort of inherit all the problems of sync
00:23:50.160 | and have to go into a lot of the challenges
00:23:52.400 | that we've solved.
00:23:53.320 | Touching on a pretty important question,
00:23:54.920 | what is the value we provide?
00:23:56.320 | What does Dropbox do?
00:23:58.360 | And probably like most people,
00:23:59.680 | I would have said, well, Dropbox syncs your files.
00:24:03.040 | And we didn't even really have a mission of the company
00:24:05.280 | in the beginning.
00:24:06.120 | I'm just like, yeah,
00:24:06.960 | I just don't want to carry my thumb drive around
00:24:07.920 | and life would be a lot better
00:24:09.720 | if our stuff just lived in the cloud
00:24:11.160 | and I just didn't have to think about
00:24:13.200 | what device is the thing on
00:24:14.480 | or why are these operating systems
00:24:16.560 | fighting with each other and incompatible?
00:24:18.800 | I just want to abstract all of that away.
00:24:21.680 | But then so we thought,
00:24:22.840 | even we were like, all right, Dropbox provides storage.
00:24:25.320 | But when we talked to our customers,
00:24:26.520 | they're like, that's not how we see this at all.
00:24:29.880 | Actually, Dropbox is not just like a hard drive in the cloud.
00:24:33.600 | It's like the place where I go to work
00:24:35.680 | or it's a place like I started a small business.
00:24:37.240 | It's a place where my dreams come true
00:24:39.000 | or it's like, yeah, it's not keeping files in sync.
00:24:41.160 | It's keeping people in sync.
00:24:42.880 | It's keeping my team in sync.
00:24:44.160 | And so they're using this kind of language
00:24:46.120 | where we're like, wait, okay.
00:24:47.680 | Yeah, 'cause I don't know,
00:24:49.000 | storage probably is a commodity
00:24:50.440 | or what we do is a commodity.
00:24:52.600 | But then we talked to our customers,
00:24:53.720 | they're like, no, we're not buying the storage.
00:24:55.560 | We're buying the ability to access
00:24:57.080 | all of our stuff in one place.
00:24:58.560 | We're buying the ability to share everything
00:25:00.160 | and sort of, in a lot of ways,
00:25:01.560 | people are buying the ability to work from anywhere
00:25:04.160 | and Dropbox was kind of,
00:25:05.280 | the fact that it was like file syncing
00:25:06.440 | was an implementation detail
00:25:08.320 | of this higher order need that they had.
00:25:10.720 | So I think that's where we start too,
00:25:13.280 | which is like, what is the sort of higher order thing,
00:25:16.120 | the job the customer's hiring Dropbox to do?
00:25:19.480 | Storage in the new world is kind of incidental to that.
00:25:21.800 | I mean, it still matters for things like video
00:25:23.360 | or those kinds of workflows.
00:25:24.920 | The value of Dropbox has never been,
00:25:26.080 | we provide you like the cheapest bits in the cloud.
00:25:28.720 | But it is a big pivot from Dropbox is the company
00:25:33.400 | that syncs your files to now where we're going
00:25:35.560 | is Dropbox is the company that kind of helps you organize
00:25:38.080 | all your cloud content.
00:25:39.520 | I started the company 'cause I kept forgetting
00:25:41.120 | my thumb drive, but with the question I was really asking,
00:25:44.120 | I was like, why is it so hard to like find my stuff,
00:25:46.680 | organize my stuff, share my stuff, keep my stuff safe?
00:25:50.720 | You know, I'm always like one washing machine.
00:25:53.600 | I would leave like my little thumb drive
00:25:54.880 | with all my prior company stuff in the pocket of my shorts
00:25:58.080 | and then almost wash it and destroy it.
00:25:59.880 | And so I was like, why do we have to,
00:26:01.720 | this is like medieval that we have to think about this.
00:26:03.640 | So that same mindset is how I approach where we're going.
00:26:07.360 | But I think, and then unfortunately,
00:26:09.920 | we're sort of back to the same problems.
00:26:11.960 | Like it's really hard to find my stuff.
00:26:13.400 | It's really hard to organize myself.
00:26:14.720 | It's hard to share my stuff.
00:26:16.240 | It's hard to secure my content at work.
00:26:20.000 | Now, the problem is the same.
00:26:21.520 | The shape of the problem and the shape of the solution
00:26:23.720 | is pretty different.
00:26:24.560 | You know, instead of 100 files on your desktop,
00:26:26.280 | it's now 100 tabs in your browser, et cetera.
00:26:28.640 | But I think that's the starting point.
00:26:30.160 | - How has the idea of a product evolved for you?
00:26:32.720 | So, you know, famously Steve Jobs said about Dropbox,
00:26:35.600 | and he's like, you know, this is just a feature.
00:26:37.120 | It's not a product.
00:26:37.960 | Can you build like a $10 billion feature?
00:26:40.760 | How, in the age of AI, how do you think about, you know,
00:26:43.640 | maybe things that used to be a product are now features
00:26:46.120 | because the AI on top of it, it's like the product,
00:26:48.280 | like what's your mental model to think about it?
00:26:50.240 | - Yeah.
00:26:51.080 | So I don't think there's really like a bright line.
00:26:53.680 | I don't know if like I use the word features and products
00:26:56.720 | in my mental model that much of how I break it down.
00:26:59.680 | 'Cause it's kind of a, it's a good question.
00:27:02.400 | I mean, it's not that I don't think about features
00:27:04.560 | or don't think about products,
00:27:06.080 | but it does start from that place of like, all right,
00:27:08.520 | we have all these new colors we can paint with.
00:27:10.640 | And all right, what are these higher order needs
00:27:13.440 | that are sort of evergreen, right?
00:27:15.520 | So people will always have stuff at work.
00:27:17.800 | They're always need to be able to like find it,
00:27:19.320 | or, you know, all the verbs I just mentioned.
00:27:21.840 | It's like, okay, how can we make like a better painting
00:27:24.080 | and how can we,
00:27:24.920 | and then how can we use some of these new colors?
00:27:27.120 | And then, yeah, it's like pretty clear
00:27:28.920 | that after the large models,
00:27:31.160 | the way you find stuff, organize stuff, share stuff,
00:27:33.760 | it's gonna be completely different.
00:27:34.840 | After COVID, it's gonna be completely different.
00:27:37.440 | So that's the starting point.
00:27:38.600 | But I think it is also important to, you know,
00:27:41.800 | you have to do more than just work back from the customer
00:27:43.920 | and like what they're trying to do.
00:27:45.000 | Like, you have to think about,
00:27:46.480 | and we've learned a lot of this the hard way sometimes.
00:27:49.520 | Okay, you might start with a customer,
00:27:50.560 | you might start with a job to be done.
00:27:51.760 | Then you're like, all right,
00:27:52.600 | what's the solution to their problem?
00:27:54.360 | Or like, can we build the best product
00:27:57.320 | that solves that problem, right?
00:27:58.400 | So can we build the best way to find your stuff
00:28:01.400 | in the modern world?
00:28:02.240 | Like, well, yeah, right now the status quo
00:28:04.240 | for the vast majority of the billion knowledge workers
00:28:07.600 | is they have like 10 search boxes at work
00:28:09.540 | that each search 10% of your stuff.
00:28:10.880 | Like, that's clearly broken.
00:28:12.520 | Obviously, you should just have like one search box,
00:28:14.760 | all right, so we can do that.
00:28:15.960 | And that also has to be like,
00:28:17.680 | I'll come back to defensibility in a second.
00:28:19.160 | But like, can we build the right solution
00:28:20.480 | that is like meaningfully better from the status quo?
00:28:22.520 | Like, yes, clearly.
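
Architecturally, "one search box" implies fanning a query out to every silo and merging the results; a sketch, with an illustrative connector interface.

```python
# Sketch of federated "one search box": query every connector
# concurrently and merge. The Connector interface is illustrative.
import asyncio

class Connector:
    name: str
    async def search(self, query: str) -> list[dict]: ...

async def federated_search(query: str, connectors: list[Connector]) -> list[dict]:
    # Hit every silo (email, docs, chat, cloud drives...) in parallel.
    batches = await asyncio.gather(*(c.search(query) for c in connectors))
    merged = [hit for batch in batches for hit in batch]
    # A real system would re-rank across sources with one relevance model;
    # here we just sort by each connector's own score.
    return sorted(merged, key=lambda hit: hit["score"], reverse=True)
```
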
00:28:23.760 | Okay, then can we like get distribution and growth?
00:28:27.200 | Like, that's sort of the next thing you learn.
00:28:29.040 | As a founder, you start with like,
00:28:30.120 | what's the product, what's the product, what's the product?
00:28:31.920 | Then you're like, wait, wait, we need distribution
00:28:33.600 | and we need a business model.
00:28:34.880 | So those are the next kind of two dominoes
00:28:36.600 | you have to knock down,
00:28:37.440 | or sort of needles you have to thread at the same time.
00:28:40.720 | So, all right, how do we grow?
00:28:42.240 | I mean, with Dropbox 1.0,
00:28:44.000 | there's really this like self-serve viral model
00:28:46.160 | where there's a lot
00:28:47.540 | we sort of took,
00:28:48.380 | borrowed from the consumer internet playbook
00:28:50.840 | and like what Facebook and social media were doing
00:28:52.680 | and then translated that to sort of the business world.
00:28:55.280 | But how do you get distribution,
00:28:56.880 | you know, especially as a startup?
00:28:58.140 | And then a business model like, all right,
00:28:59.600 | storage happened to be some, in the beginning,
00:29:01.080 | happened to be something people were willing to pay for.
00:29:02.720 | They recognized that, you know, okay,
00:29:04.560 | if I don't buy something like Dropbox,
00:29:05.800 | I'm going to have to buy an external hard drive.
00:29:07.320 | I'm going to have to buy a thumb drive.
00:29:09.360 | I'm going to have to pay for something one way or another.
00:29:11.200 | People were already paying for things like backup.
00:29:12.820 | So we felt good about that.
00:29:14.280 | But then the last domino is like defensibility.
00:29:16.720 | Okay, so you build this product,
00:29:18.160 | so you get the business model,
00:29:19.080 | but then, you know, what do you do when the incumbents,
00:29:21.520 | the next chess move for them is just like copy, bundle, kill.
00:29:25.440 | So they're going to copy your product.
00:29:26.900 | They'll bundle it with their platforms
00:29:28.160 | and they'll like give it away for free or no added cost.
00:29:30.720 | And, you know, we had a lot of, you know,
00:29:32.920 | scar tissue from being on the wrong side of that.
00:29:35.640 | Now you don't need to solve
00:29:37.280 | for all four or five variables or whatever at once,
00:29:40.320 | you can sort of have, you know, some flexibility,
00:29:42.600 | but the more of those gates that you get through,
00:29:46.760 | you sort of add a 10X to your valuation.
00:29:49.680 | And so with AI, I think, you know,
00:29:52.280 | there's been a lot of focus on the large language model,
00:29:55.600 | but it's like large language models
00:29:57.320 | are a pretty bad business if, you know,
00:30:00.000 | you sort of take off your tech lens
00:30:01.280 | and put on your business lens.
00:30:02.240 | Like there's sort of this weirdly self-commoditizing thing
00:30:05.400 | where, you know, models only have value
00:30:07.240 | if they're kind of on this like Pareto frontier
00:30:09.120 | of size and quality and cost.
00:30:11.240 | Being number two, you know, if you're not on that frontier,
00:30:14.160 | the second the frontier moves out,
00:30:16.080 | which it moves out every week,
00:30:17.460 | like your model literally has zero economic value
00:30:19.840 | 'cause it's dominated by the new thing.
00:30:21.880 | LLMs generate output that can be used to train or improve other models.
00:30:25.320 | So there's this weird, peculiar things
00:30:27.760 | that are specific to the large language model.
00:30:30.040 | And then you have to like be like,
00:30:31.080 | where's the value gonna accrue in the stack
00:30:33.760 | or the value chain?
00:30:34.600 | And, you know, certainly at the bottom with NVIDIA
00:30:37.680 | and the semiconductor companies,
00:30:38.960 | and then it's gonna be at the top,
00:30:40.760 | like the people who have the customer relationship,
00:30:42.960 | who have the application layer.
00:30:45.000 | Those are a few of the like lenses
00:30:46.480 | that I look at a question like that through.
00:30:48.720 | - Do you think AI is making people more careful
00:30:52.200 | about sharing the data at all?
00:30:54.240 | People are like, oh, data's important,
00:30:55.760 | but it's like, whatever, I'm just throwing it out there.
00:30:57.560 | But now everybody's like,
00:30:58.840 | but are you gonna train on my data?
00:31:00.200 | And like your data is actually not that good
00:31:01.680 | to train on anyway.
00:31:02.600 | But like, how have you seen, especially customers,
00:31:05.200 | like think about what to put in, what to not?
00:31:07.040 | - I mean, everybody should be,
00:31:08.600 | well, everybody is concerned about this,
00:31:10.040 | and everybody should be, right?
00:31:11.880 | Because nobody wants their personal,
00:31:14.880 | or for companies their company, information to be kind of ground up
00:31:17.000 | into little pellets to like sell you ads
00:31:19.840 | or train the next foundation model.
00:31:22.160 | I think it's like massively top of mind
00:31:24.320 | for every one of our customers,
00:31:25.880 | and me personally, and with my Dropbox hat on.
00:31:28.720 | It's like so fundamental.
00:31:30.400 | And you know, we had experience with this too
00:31:32.280 | at Dropbox 1.0, the same kind of resistance.
00:31:34.320 | Like, wait, I'm gonna take my stuff on my hard drive
00:31:37.000 | and put it on your server somewhere.
00:31:39.200 | Are you serious?
00:31:40.640 | What could possibly go wrong?
00:31:42.360 | And you know, before that I was like, wait,
00:31:43.920 | are you gonna sell me?
00:31:44.760 | I'm gonna put my credit card number into this website.
00:31:47.480 | And before I was like, hey, you're gonna,
00:31:48.720 | I'm gonna take all my cash
00:31:49.680 | and put it in a bank instead of under my mattress.
00:31:51.880 | You know, so there's a long history of like tech
00:31:54.280 | and comfort, so in some sense AI is kind of another round
00:31:57.360 | of the same thing, but the issues are real.
00:31:59.160 | And then when I think about like defensibility for Dropbox,
00:32:01.480 | like that's actually a big advantage that we have
00:32:03.360 | is one, our incentives are very aligned
00:32:05.720 | with our customers, right?
00:32:06.600 | We only get, we only make money if you pay us,
00:32:08.880 | and you only pay us if we do a good job.
00:32:10.840 | So we don't have any like side hustle.
00:32:13.280 | We're not training the next foundation model.
00:32:15.360 | You know, we're not trying to sell you ads.
00:32:18.400 | Actually, we're not even trying to lock you
00:32:19.520 | into an ecosystem, like the whole point of Dropbox
00:32:21.480 | is it works, you know, everywhere.
00:32:23.520 | Because I think one of the big questions,
00:32:24.960 | and we've been circling around it sort of like,
00:32:26.280 | all right, in the world of AI, where should our lane be?
00:32:28.200 | Like every startup has to ask,
00:32:29.680 | or in every big company has to ask like,
00:32:31.320 | where can we really win?
00:32:33.440 | But to me, it was like a lot of the like trust advantages
00:32:36.080 | about being platform agnostic,
00:32:37.360 | having like a very clean business model,
00:32:38.800 | not having these other incentives.
00:32:40.760 | And then we also are like super transparent.
00:32:43.280 | We were transparent early on.
00:32:44.440 | We're like, all right, we're gonna establish
00:32:45.280 | these AI principles, very table stakes stuff of like,
00:32:48.280 | here's transparency, we wanna give people control,
00:32:50.600 | we wanna cover privacy, safety, bias,
00:32:53.720 | like fairness, all these things.
00:32:55.480 | And we put that out up front
00:32:56.480 | to put some sort of explicit guardrails out.
00:32:58.800 | We're like, hey, we're, you know,
00:32:59.840 | 'cause everybody wants like a trusted partner
00:33:02.040 | as they sort of go into the wild world of AI.
00:33:04.080 | And then, you know, you also see people cutting corners
00:33:06.880 | and, you know, or just there's a lot of uncertainty
00:33:08.920 | or, you know, moving the pieces around after the fact,
00:33:12.600 | which no one feels good about.
00:33:14.240 | - I mean, I would say the last 10, 15 years,
00:33:16.080 | the race was kind of being the system of record,
00:33:18.160 | being the storage provider.
00:33:19.960 | I think today it's almost like, hey,
00:33:21.600 | if I can use Dash to like access my Google Drive file,
00:33:24.400 | why would I pay Google for like their AI feature?
00:33:26.680 | So like vice versa, you know,
00:33:27.760 | if I can connect my Dropbox storage
00:33:29.120 | to this other AI system.
00:33:30.680 | How do you kind of think about that?
00:33:31.960 | About, you know, not being able to capture all the value
00:33:34.920 | and how open people will stay?
00:33:37.320 | I think today things are still pretty open,
00:33:39.000 | but I'm curious if you think things will get more closed
00:33:41.360 | or like more open later?
00:33:42.240 | - Yeah, well, I think you have to get
00:33:43.640 | the value exchange right.
00:33:45.120 | And I think you have to be like a trustworthy partner
00:33:47.640 | or like no one's gonna partner with you
00:33:48.840 | if they think you're gonna eat their lunch, right?
00:33:51.040 | Or if you're gonna disintermediate them.
00:33:52.840 | And like all the companies are quite sophisticated
00:33:54.600 | with how they think about that.
00:33:55.640 | So we try to, like we know that's gonna be the reality.
00:33:58.280 | So we're actually not trying to eat
00:34:00.520 | anyone's like Google Drive's lunch or anything.
00:34:03.520 | Actually we'll like integrate with Google Drive,
00:34:05.400 | we'll integrate with OneDrive,
00:34:06.880 | really any of the content platforms,
00:34:08.840 | even if they compete with file syncing.
00:34:10.360 | So that's actually a big strategic shift.
00:34:12.360 | We're not really reliant on being like the store of record.
00:34:15.320 | And there are pros and cons to this decision.
00:34:17.560 | But if you think about it,
00:34:18.920 | we're basically like providing
00:34:20.040 | all these apps more engagement.
00:34:21.400 | We're like helping users do
00:34:22.880 | what they're really trying to do,
00:34:23.760 | which is to get, you know, that Google Doc or whatever.
00:34:25.840 | And we're not trying to be like,
00:34:26.800 | oh, by the way, use this other thing.
00:34:28.680 | This is all part of our like brand reputation.
00:34:30.920 | It's like, no, we give people freedom
00:34:32.720 | to use whatever tools or operating system they want.
00:34:35.320 | We're not taking anything away from our partners.
00:34:37.000 | We're actually like making their thing more useful
00:34:39.360 | or routing people to those things.
00:34:41.720 | I mean, on the margin there might be something like,
00:34:43.000 | well, okay, to the extent you do rag and summarize things,
00:34:45.200 | maybe that doesn't generate a click.
00:34:46.800 | Okay.
00:34:47.640 | You know, we also know there's like infinity investment
00:34:49.880 | going into like the work agents.
00:34:52.160 | So we're not really building like a Copilot
00:34:55.080 | or Gemini competitor.
00:34:56.600 | Not because we don't find
00:34:58.440 | those things captivating.
00:35:00.040 | Yeah, of course.
00:35:00.880 | But just like, you know,
00:35:01.720 | you learn after some time in this business that like,
00:35:04.520 | yeah, there's some places that are just gonna be
00:35:05.880 | such kind of red oceans
00:35:07.800 | or just like super big battlefields.
00:35:10.920 | Everybody's kind of trying to solve the same problem
00:35:12.600 | and they just start duplicating each other's efforts.
00:35:14.600 | And then meanwhile, you know,
00:35:16.080 | I think the concern would be is like,
00:35:17.200 | well, there's all these other problems
00:35:18.480 | that aren't being properly addressed by AI.
00:35:20.720 | And I was concerned that like,
00:35:21.960 | yeah, and everybody's like fixated on the agent
00:35:24.760 | or the chatbot interface,
00:35:25.920 | but forgetting that like, hey guys,
00:35:27.400 | like we have the opportunity to like really fix search
00:35:30.080 | or build a self-organizing Dropbox or environment
00:35:32.680 | or there's all these other things that can be a complement.
00:35:34.840 | 'Cause we don't really want our customers to be thinking like,
00:35:38.280 | oh, do I use Dash or do I use Copilot?
00:35:40.160 | And frankly, none of them do.
00:35:41.840 | In a lot of ways, actually some of the things that we do
00:35:43.360 | on the security front with Dash for Business
00:35:44.960 | are a good complement to Copilot
00:35:47.440 | because as part of Dash for Business,
00:35:49.240 | we actually give admins and IT universal visibility
00:35:53.440 | and control over
00:35:54.680 | what's being shared in your company
00:35:55.840 | across all these different platforms.
00:35:57.560 | And as a precondition to installing something like Copilot
00:36:02.320 | or Dash or Glean or any of these other things, right?
00:36:05.560 | You know, IT wants to know like,
00:36:06.560 | hey, before we like turn on all the lights in here,
00:36:09.240 | like let's do a little cleaning first
00:36:11.000 | before we let everybody in.
00:36:12.400 | And there just haven't been good tools to do that.
00:36:14.440 | And post AI, you would do it completely differently.
00:36:16.320 | And so that's like a big,
00:36:17.160 | that's a cornerstone of what we do
00:36:18.160 | and what sets us apart from these tools.
00:36:20.480 | And actually, in a lot of cases,
00:36:21.800 | we will help those tools be adopted
00:36:24.680 | because we actually help them do it safely.
00:36:26.800 | - Yeah.
00:36:27.640 | How do you think about building for AI versus people?
00:36:29.400 | It's like, when you mentioned cleaning up,
00:36:31.280 | it's because maybe before you were like,
00:36:32.840 | well, humans can have some common sense
00:36:35.040 | when they look at data on what to pick
00:36:36.760 | versus models are just kind of like ingesting.
00:36:39.200 | Do you think about building products differently,
00:36:41.040 | knowing that a lot of the data
00:36:42.680 | will actually be consumed by LLMs
00:36:44.320 | and like agents and whatnot versus like just people?
00:36:47.040 | - Well, I think it'll always be,
00:36:48.720 | I aim a little bit more for like, you know,
00:36:50.360 | level three, level four kind of automation.
00:36:52.280 | 'Cause even if the LLM is like capable
00:36:54.400 | of completely autonomously organizing your environment,
00:36:57.120 | it probably would do a reasonable job.
00:36:58.640 | But like, I think you build bad UI
00:37:00.880 | when the sort of user has to fit themselves to the computer
00:37:04.120 | versus something that you're, you know,
00:37:06.320 | it's like an instrument you're playing
00:37:07.440 | or something where you have some kind of good partnership.
00:37:10.640 | And, you know, and on the other side,
00:37:12.280 | you don't have to do all this like manual effort.
00:37:14.200 | And so like the command line was sort of subsumed
00:37:16.800 | by like, you know, graphical UI.
00:37:19.240 | We'll keep toggling back and forth.
00:37:20.520 | Maybe chat,
00:37:22.720 | especially when you bring in voice,
00:37:23.920 | will be an increasing part of the puzzle.
00:37:25.960 | But I don't think we're gonna go back
00:37:26.840 | to like a million command lines either.
00:37:29.440 | And then as far as like the sort of plumbing of like,
00:37:32.080 | well, is this gonna be consumed by an LLM or a human?
00:37:34.320 | Like fortunately, like you don't really have to
00:37:36.440 | design it that differently.
00:37:37.640 | I mean, you have to make sure
00:37:38.480 | everything's legible to the LLM,
00:37:39.880 | but it's like quite tolerant of, you know,
00:37:42.280 | malformed everything.
00:37:43.960 | And actually the more,
00:37:45.360 | the easier you make something to read for a human,
00:37:47.040 | the easier it is for an LLM to read to some extent as well.
00:37:49.800 | But we really think about what's that kind of right,
00:37:51.800 | how do we build the right, like, human-machine interface
00:37:54.240 | where you're still in control and driving,
00:37:56.760 | but then it's super easy to translate your intent
00:37:59.520 | into, you know, however you want your folders,
00:38:02.480 | your environment, or your preferences set up.
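To make that last point concrete, here's a minimal sketch in Python of the idea that plumbing legible to a human tends to be legible to an LLM too: the same plain-text rendering serves both readers. The document names, owners, and URLs are invented for illustration and are not anything from Dropbox's actual stack.

```python
from dataclasses import dataclass

@dataclass
class Doc:
    title: str
    source: str  # e.g. "Google Drive", "OneDrive", "Dropbox"
    owner: str
    url: str

def render_index(docs: list[Doc]) -> str:
    """Render a simple, human-readable index of documents.

    The exact same string can be dropped into an LLM prompt as context:
    models are tolerant of loose formatting, and text that is easy for
    a person to scan tends to be easy for a model to parse as well.
    """
    lines = ["Workspace index:", ""]
    for d in docs:
        lines.append(f"- {d.title} ({d.source}, owned by {d.owner}): {d.url}")
    return "\n".join(lines)

# Hypothetical example data, for illustration only.
docs = [
    Doc("Q3 board deck", "Google Drive", "drew@example.com",
        "https://example.com/doc/1"),
    Doc("Kitchen remodel budget", "Dropbox", "drew@example.com",
        "https://example.com/doc/2"),
]

index = render_index(docs)
print(index)  # readable by a person...
prompt = f"{index}\n\nWhich doc covers the board meeting?"  # ...and by a model
```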
00:38:05.440 | - What's the most underrated thing about Dropbox
00:38:07.880 | that maybe people don't appreciate?
00:38:09.480 | - Well, I think this is just such a natural evolution
00:38:12.160 | for us.
00:38:13.040 | It's pretty true.
00:38:14.080 | Like when people think about the world of AI,
00:38:17.400 | file syncing is not like the next thing
00:38:20.400 | you would auto-complete mentally.
00:38:22.320 | And I think we also did like our first thing so well
00:38:24.840 | that there were a lot of benefits to that.
00:38:26.520 | But I think there also are like,
00:38:28.240 | we hit it so hard with our first product
00:38:30.000 | that it was like pretty tough to come up with a sequel.
00:38:32.840 | And we had a bit of a sophomore slump and, you know,
00:38:35.680 | I think actually a lot of kids do use Dropbox
00:38:38.440 | through high school or things like that,
00:38:39.800 | but you know,
00:38:41.720 | they're a lot more in the browser
00:38:42.640 | than in their file system, right?
00:38:43.800 | And we know all this,
00:38:45.360 | but still like we're super well positioned
00:38:49.680 | to like help a new generation of people
00:38:52.480 | with these fundamental problems and these like,
00:38:54.320 | that affect, you know, a billion knowledge workers
00:38:56.240 | around just finding, organizing, sharing your stuff
00:38:59.720 | and keeping it safe.
00:39:00.960 | And there's a ton of unsolved problems in those four verbs.
00:39:04.640 | We've talked about search a little bit,
00:39:06.080 | but just even think about like a whole new generation
00:39:08.520 | of people like growing up
00:39:10.160 | without the ability to like organize their things.
00:39:12.600 | And yeah, search is great.
00:39:13.760 | And if you just have like a giant infinite pile of stuff,
00:39:16.600 | then search does make that more manageable.
00:39:18.520 | But you know, you do lose some things
00:39:20.920 | that were pretty helpful in prior decades, right?
00:39:23.680 | So even just the idea of persistence,
00:39:25.720 | stuff still being there when you come back,
00:39:27.760 | like when I go to sleep and wake up,
00:39:29.320 | my physical papers are still on my desk.
00:39:31.280 | When I reboot my computer,
00:39:32.360 | the files are still on my hard drive.
00:39:33.880 | But then when in my browser,
00:39:35.080 | like if my operating system updates the wrong way
00:39:37.000 | and closes the browser,
00:39:38.080 | or if I just more commonly just declared tab bankruptcy,
00:39:41.000 | it's like your whole workspace just clears itself out
00:39:43.560 | and starts from zero.
00:39:45.000 | And you're like, on what planet is this a good idea?
00:39:47.800 | Like, you know, there's no like concept of like,
00:39:50.200 | oh, here's the stuff I was working on.
00:39:51.640 | Yeah, let me get back to it.
00:39:53.080 | And so that was like a big motivation for things like Dash.
00:39:56.160 | Huge problems with sharing, right?
00:39:57.480 | If I'm remodeling my house
00:39:59.400 | or if I'm getting ready for a board meeting,
00:40:02.360 | you know, what do I do if I have a Google doc
00:40:03.840 | and an Airtable and a 10 gig 4K video?
00:40:06.320 | There's no collection that holds mixed format things.
00:40:10.920 | And so it's another kind of hidden problem,
00:40:12.560 | hidden in plain sight, like it's missing primitives.
00:40:14.480 | Like yeah, files have folders, songs have playlists,
00:40:17.160 | but links have, you know, nothing; somehow we missed that.
00:40:20.520 | And so we're building that with stacks in Dash,
00:40:22.720 | where it's like a mixed format, smart collection
00:40:24.840 | that you can then, you know,
00:40:26.200 | just share whatever you need internally, externally,
00:40:28.520 | and have it be like a really well-designed experience
00:40:31.240 | and platform agnostic
00:40:32.240 | and not tying you to any one ecosystem.
00:40:34.440 | We're super excited about that.
00:40:36.280 | You know, we talked a little bit about security
00:40:37.640 | in the modern world.
00:40:38.480 | Like IT signs all these compliance documents,
00:40:40.680 | but in reality has no way of knowing where anything is
00:40:43.320 | or what's being shared.
00:40:44.520 | It's actually better for them to not know about it
00:40:46.360 | than to know about it and not be able to do anything about it.
00:40:48.560 | And when we talked to customers,
00:40:49.400 | we found that there were like literally people in IT
00:40:52.520 | whose job it is to like manually go through,
00:40:55.080 | log into each, like log into Office, log into Workspace,
00:40:58.480 | log into each tool and like go comb through one by one
00:41:02.000 | the links that people have shared and like un-share.
00:41:03.960 | There's like an un-share guy in all these companies.
00:41:06.360 | And that job is probably about as fun as it sounds.
00:41:08.560 | Like, my God.
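As a rough illustration of what automating that job could look like, here's a minimal sketch in Python. The Connector interface, the FakeDrive class, and every field name are hypothetical, invented for this example; none of this is a real platform API or Dropbox's implementation.

```python
from typing import Iterable, Protocol

class Connector(Protocol):
    """Hypothetical interface over a content platform (Office, Workspace, ...)."""
    name: str
    def shared_links(self) -> Iterable[dict]: ...
    def unshare(self, link_id: str) -> None: ...

def audit_sharing(connectors: list[Connector], allowed_domains: set[str]) -> None:
    """One pass over every connected platform, flagging externally shared
    links, instead of logging into each tool and combing through by hand."""
    for c in connectors:
        for link in c.shared_links():
            external = [r for r in link["recipients"]
                        if r.split("@")[-1] not in allowed_domains]
            if external:
                print(f"[{c.name}] {link['title']!r} shared with {external}")
                # A real tool would queue this for human review
                # rather than auto-revoking:
                # c.unshare(link["id"])

class FakeDrive:
    """Toy in-memory connector so the sketch actually runs."""
    name = "FakeDrive"
    def shared_links(self) -> Iterable[dict]:
        return [{"id": "1", "title": "Q3 plan",
                 "recipients": ["a@corp.example", "x@gmail.com"]}]
    def unshare(self, link_id: str) -> None:
        pass

audit_sharing([FakeDrive()], allowed_domains={"corp.example"})
```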
00:41:10.080 | So there's, you know, fortunately,
00:41:12.600 | I guess what makes technology a good business
00:41:14.280 | is for every problem it solves, it like creates a new one.
00:41:17.920 | So there's always like a sequel that you need.
00:41:20.160 | And so, you know, I think the happy version of our Act Two
00:41:23.280 | is kind of similar to Netflix,
00:41:25.080 | where I look at a lot of these companies
00:41:26.640 | that really had multiple acts.
00:41:28.440 | And, you know, Netflix had the vision
00:41:29.920 | to be streaming from the beginning,
00:41:31.280 | but broadband and everything wasn't ready for it.
00:41:33.280 | So they started by mailing you DVDs,
00:41:35.000 | but then went to streaming and then,
00:41:36.400 | but the value prop the whole time was just like,
00:41:37.760 | let me press play on something I want to see.
00:41:40.280 | And they did a really good job about bringing people along
00:41:42.240 | from the DVD mailing era.
00:41:43.720 | You would think like, oh, the DVD mailing piece
00:41:45.760 | is like this burning platform,
00:41:47.960 | or it's like legacy, you know, ankle weight.
00:41:51.360 | And they did have some false starts in that transition.
00:41:54.160 | But when you really think about it,
00:41:55.440 | they were able to take that DVD mailing audience,
00:41:58.320 | move, like migrate them to streaming
00:42:00.440 | and actually bootstrap a, you know,
00:42:02.960 | take their season one people
00:42:04.120 | and bootstrap a victory in season two,
00:42:06.640 | because they already had, you know,
00:42:08.440 | they weren't starting from scratch.
00:42:10.400 | And like both of those worlds were like super,
00:42:12.480 | it's super easy to sort of forget
00:42:14.840 | and be like, oh, it was all kind of destiny.
00:42:17.000 | But like, no, that was like
00:42:18.560 | an incredibly competitive environment.
00:42:21.080 | And Netflix did a great job of like activating
00:42:23.280 | their Act One advantages and winning in Act Two
00:42:26.240 | because of it.
00:42:27.080 | So I don't think people see Dropbox that way.
00:42:28.600 | I think people are sort of thinking about it
00:42:29.800 | as just in terms of our Act One.
00:42:31.240 | And they're like, yeah, Dropbox is fine.
00:42:32.360 | I used it 10 years ago,
00:42:33.360 | but like, what have they done for me lately?
00:42:35.120 | And I don't blame them.
00:42:36.040 | So fortunately we have like better and better answers
00:42:38.560 | to that question every year.
00:42:39.720 | - And you call it like the Silicon Brain.
00:42:41.680 | So you see like Dash and Stacks
00:42:43.480 | being like the Silicon Brain interface basically for people.
00:42:46.520 | - I mean, that's part of it, yeah.
00:42:47.480 | And writ large, I mean, I think what's so exciting
00:42:50.160 | about AI and everybody's got their own kind of take on it,
00:42:53.360 | but if you like really zoom out civilizationally
00:42:56.080 | and like what allows humans to make progress
00:42:59.440 | and what sort of is above the fold
00:43:01.640 | in terms of what's really mattered.
00:43:03.320 | Certainly one of the, I mean, there are a lot of points,
00:43:05.160 | but some that come to mind are like,
00:43:06.600 | you think about things like the Industrial Revolution,
00:43:08.960 | like before that, like mechanical energy,
00:43:10.640 | like the only way you could get it
00:43:11.720 | was like by your own hands, maybe an animal,
00:43:15.680 | maybe some like clever sort of machines
00:43:19.040 | or machines made of like wood or something, right?
00:43:22.040 | But you were quite like energy limited.
00:43:24.600 | And then suddenly, you know, the Industrial Revolution,
00:43:27.360 | things like electricity, it suddenly it's like,
00:43:28.840 | all right, mechanical energy is now available on demand.
00:43:32.440 | It's like very fungible kind of.
00:43:33.880 | And then suddenly we consume a lot more of it.
00:43:36.360 | And then the standard of living goes way, way, way, way up.
00:43:38.800 | That's been pretty limited to the physical realm.
00:43:40.840 | And then I believe that the large models,
00:43:43.760 | that's really the first time we can kind of bottle up
00:43:46.560 | cognitive energy and offload.
00:43:49.200 | You know, if we started by offloading
00:43:50.320 | a lot of our mechanical or physical busy work to machines
00:43:53.720 | that freed us up to make a lot of progress in other areas.
00:43:56.160 | And then with AI and computing, we're like,
00:43:57.800 | now we can offload a lot more of our cognitive
00:43:59.800 | busy work to machines.
00:44:00.760 | And then we can create a lot more of it.
00:44:03.120 | Price of it goes way down.
00:44:04.560 | Importantly, like, it's not like humans
00:44:06.320 | never did anything physical again.
00:44:08.040 | It's sort of like, no, but we're more leveraged.
00:44:09.600 | We can move a lot more earth with a bulldozer than a shovel.
00:44:13.040 | And so that's like what is at the most fundamental level,
00:44:15.600 | what's so exciting to me about AI.
00:44:17.480 | And so what's the silicon brain?
00:44:18.520 | It's like, well, we have our human brains
00:44:19.840 | and then we're going to have this other,
00:44:21.240 | like half of our brain that's sort of coming online,
00:44:23.960 | like our silicon brain.
00:44:25.280 | And it's not like one or the other,
00:44:26.760 | they complement each other.
00:44:27.600 | They have very complementary strengths and weaknesses.
00:44:30.200 | And that's a good thing.
00:44:31.880 | There's also this weird tangent we've gone on as a species
00:44:35.080 | to like where knowledge workers have this like epidemic
00:44:38.600 | of burnout, great resignation, quiet quitting.
00:44:42.800 | And there's a lot going on there,
00:44:44.000 | but I think that's one of the biggest problems we have
00:44:46.440 | is that people deserve like meaningful work
00:44:48.600 | and, you know, we can't solve all of it.
00:44:50.640 | But like, at least in knowledge work,
00:44:52.440 | there's a lot of own goals, you know,
00:44:54.760 | unforced errors that we're doing where it's like,
00:44:57.600 | you know, on one side with brain science,
00:44:59.160 | like we know what makes us like productive.
00:45:02.280 | And fortunately it's also what makes us engaged.
00:45:04.600 | It's like when we can focus
00:45:05.840 | or when we're some kind of flow state,
00:45:07.600 | but then we go to work and then increasingly going to work
00:45:10.800 | is like going to a screen.
00:45:12.120 | And you're like, if you wanted to design an environment
00:45:14.920 | that made it impossible to ever get into a flow state
00:45:18.320 | or ever be able to focus, like what we have is that.
00:45:21.240 | And that was the thing that just like seven, eight years ago
00:45:23.080 | it just blew my mind.
00:45:23.920 | I'm just like, I cannot understand
00:45:25.040 | why like knowledge work is so jacked up on this adventure.
00:45:28.400 | It was like, we put ourselves
00:45:29.640 | in like the most cognitively polluted environment possible.
00:45:32.880 | And we put so much more stress on the system
00:45:34.480 | when we're working remotely and things like that.
00:45:36.240 | And, you know, all of these problems
00:45:38.000 | are just like going in the wrong direction.
00:45:39.480 | And I just, I just couldn't understand
00:45:41.240 | why this was like a problem that wasn't fixing itself.
00:45:43.800 | And I'm like, maybe there's something Dropbox
00:45:45.120 | can do with this.
00:45:45.960 | And, you know, things like Dash are the first step,
00:45:47.680 | but then, well, so like what?
00:45:49.600 | Well, I mean, now, like, well, why are humans
00:45:52.760 | in this like polluted state?
00:45:54.040 | It's like, well, we're just, all of the tools we have today,
00:45:56.320 | like this generation of tools just passes on
00:45:58.120 | all of the weight, the burden to the human, right?
00:46:01.040 | So it's like, here's a bajillion,
00:46:03.480 | you know, 80,000 unread emails, cool.
00:46:05.760 | Here's 25 unread Slack channels.
00:46:08.880 | Here's, you know, I get, like,
00:46:11.400 | jittery just thinking about it.
00:46:13.880 | And then you look at that, you're like, wait,
00:46:15.200 | I'm looking at my phone, it says like 80,000 unread things.
00:46:17.360 | There's like no question, product question
00:46:19.440 | for just the right answer.
00:46:21.000 | Fortunately, that's why things like our silicon brain
00:46:23.800 | are pretty helpful, 'cause like,
00:46:25.320 | they can serve as like an attention filter,
00:46:27.760 | where it's like, actually, computers have no problem
00:46:30.040 | reading a million things.
00:46:31.800 | Humans can't do that, but computers can.
00:46:33.960 | And to some extent, this was already happening
00:46:36.240 | with computing, you know, Excel is a version
00:46:38.200 | of your silicon brain, or, you know,
00:46:39.680 | you could draw the line arbitrarily.
00:46:42.120 | But with larger models, like now,
00:46:43.280 | so many of these little subtasks and tasks we do at work
00:46:46.080 | can be like fully automated.
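Here's a minimal sketch of that attention-filter idea in Python. The scoring function is a keyword placeholder standing in for an LLM call, and the messages are invented for illustration; it's a sketch of the pattern, not any shipped product.

```python
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    subject: str
    body: str

def urgency(msg: Message) -> float:
    """Placeholder for an LLM call rating urgency/relevance on 0..1.

    A real filter would prompt a small model with the message plus the
    user's current priorities; a keyword check stands in for it here."""
    hot = ("deadline", "outage", "board", "contract")
    text = (msg.subject + " " + msg.body).lower()
    return 1.0 if any(w in text for w in hot) else 0.1

def attention_filter(inbox: list[Message], budget: int = 5) -> list[Message]:
    """The machine reads everything; the human only sees the top few."""
    return sorted(inbox, key=urgency, reverse=True)[:budget]

# Hypothetical inbox, for illustration only.
inbox = [
    Message("ci-bot", "build #4512 passed", "all green"),
    Message("newsletter", "10 tips for spreadsheets", "..."),
    Message("cfo", "board deck deadline", "need the numbers by Friday"),
]
for m in attention_filter(inbox, budget=2):
    print(m.sender, "-", m.subject)
```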
00:46:48.640 | And I think, you know, I think it's like
00:46:50.560 | an important metaphor to me, 'cause it mirrors
00:46:52.640 | a lot of what we saw with computing,
00:46:54.120 | computer architecture generally.
00:46:55.160 | It's like, we started out with the CPU,
00:46:57.480 | very general purpose, then GPU came along
00:47:00.520 | much better at these like parallel computations.
00:47:03.440 | We talk a lot about like human versus machine
00:47:05.200 | being like substituting, where it's like,
00:47:06.800 | CPU, GPU, it's not like one is categorically
00:47:08.800 | better than the other, they're complements.
00:47:10.400 | Like if you have something really parallel,
00:47:12.200 | use a GPU, if not, use a CPU.
00:47:14.760 | That whole relationship, that symbiosis
00:47:16.840 | between CPU and GPU has obviously evolved
00:47:18.880 | a lot since, you know, playing Quake 2 or something.
00:47:21.840 | But right now, we have like the human CPU
00:47:24.320 | doing a lot of, you know, silicon CPU tasks.
00:47:27.160 | And so you really have to like redesign the work
00:47:28.920 | thoughtfully, such that, you know,
00:47:31.360 | probably not that different from how
00:47:32.720 | it's evolved in computer architecture,
00:47:34.320 | where the CPU is sort of an orchestrator
00:47:35.720 | of these really like heavy lifting GPU tasks.
00:47:39.320 | That dividing line does shift a little bit,
00:47:41.600 | you know, with every generation.
00:47:43.320 | And so I think we need to think about knowledge work
00:47:44.720 | in that context.
00:47:45.560 | What are human brains good at?
00:47:47.280 | What's our silicon brain good at?
00:47:48.400 | Let's re-segment the work.
00:47:49.440 | Let's offload all the stuff that can be automated.
00:47:51.720 | Let's go on a hunt for like anything
00:47:53.120 | that could save a human CPU cycle.
00:47:55.240 | Let's give it to the silicon one.
00:47:56.920 | And so I think we're in the early innings
00:47:58.280 | of actually being able to do something about it.
00:48:00.280 | - It's funny, I gave a talk to a few government people
00:48:02.840 | earlier this year with a similar point,
00:48:04.800 | where we used to make machines to replace human labor.
00:48:07.720 | And then the kilowatt hour was kind of like
00:48:09.560 | the unit of growth for a lot of countries.
00:48:11.400 | And now you're doing the same thing with the brain.
00:48:13.720 | And the data centers are kind of computational power plants,
00:48:16.760 | you know, they're kind of on demand tokens.
00:48:19.640 | You're on the board of Meta,
00:48:20.600 | which is the number one donor of FLOPs
00:48:22.640 | for the open source world.
00:48:24.440 | The thing about open source AI is like,
00:48:26.480 | the model can be open source,
00:48:27.680 | but you need to carry a briefcase
00:48:29.520 | to actually maybe run a model that is not even that good
00:48:32.040 | compared to some of the big ones.
00:48:33.600 | How do you think about some of the differences
00:48:35.760 | in the open source ethos with like traditional software,
00:48:38.640 | where it's like really easy to run and act on it
00:48:40.280 | versus like models where it's like,
00:48:41.640 | it might be open source,
00:48:42.480 | but like I'm kind of limited to what I can do with it.
00:48:45.280 | - Yeah, well, I think with every new era of computing,
00:48:48.280 | there's sort of a tug of war between,
00:48:50.520 | is this going to be like an open one or a closed one?
00:48:52.800 | And, you know, there's pros and cons to both.
00:48:54.560 | It's not like, oh, open is always better
00:48:57.200 | or open always wins.
00:48:58.400 | But, you know, I think you look at how the mobile,
00:49:01.120 | like the PC era and the internet era
00:49:03.440 | started out being more on the open side,
00:49:04.920 | like it's very modular.
00:49:06.160 | It's sort of a party that everybody could, you know,
00:49:08.960 | come to.
00:49:09.800 | Some downsides of that, security.
00:49:11.520 | But I think, you know, at the advent of AI,
00:49:13.240 | I think there's a real question,
00:49:14.160 | like given the capital intensity of what it takes
00:49:16.720 | to train these foundation models,
00:49:18.000 | like, are we going to live in a world
00:49:19.080 | where an oligopoly or cartel, you know,
00:49:21.760 | a few companies, have the keys
00:49:24.640 | and we're all just like paying them rent.
00:49:26.560 | You know, that's one future.
00:49:27.720 | Or is it going to be, you know, more open and accessible?
00:49:31.360 | And I'm like super happy with how that's just,
00:49:33.720 | I find it exciting on many levels
00:49:35.280 | with all the different hats I wear about it.
00:49:38.120 | You know, fortunately you're seeing, you've seen
00:49:39.880 | in real life, yeah, even if people aren't, you know,
00:49:42.240 | bringing GPUs on a plane or something,
00:49:45.480 | you've seen like the price performance of these models
00:49:48.840 | improve 10 or 100X year over year,
00:49:51.200 | which is, it's sort of like many Moore's laws
00:49:53.160 | compounded together for a bunch of reasons.
00:49:55.960 | Like that wouldn't have happened without open source, right?
00:49:58.920 | You know, for a lot of the same reasons,
00:49:59.920 | it's probably better that
00:50:01.080 | anyone can sort of spin up a website
00:50:03.040 | without having to buy an Internet Information Server license.
00:50:05.840 | Like there was some alternative future.
00:50:07.240 | So, like, things like Linux are really good.
00:50:09.320 | And there was a good balance of trade too,
00:50:10.840 | where like people would contribute their code
00:50:12.360 | and then also benefit from the community
00:50:15.920 | returning the favor.
00:50:16.880 | I mean, you're seeing that with open source.
00:50:18.240 | So you wouldn't see all this like, you know,
00:50:20.320 | this flourishing of research
00:50:22.640 | and of just sort of the democratization of access to compute
00:50:26.000 | without open source.
00:50:27.040 | And so I think it's been like phenomenally successful
00:50:29.400 | in terms of just moving the ball forward.
00:50:31.120 | And pretty much anything you care about, I believe,
00:50:33.160 | even like safety, you can have a lot more eyes on it
00:50:35.640 | and transparency, instead of it just being something happening
00:50:37.800 | in, like, three places with nuclear power plants
00:50:40.320 | attached to them, right?
00:50:41.760 | So I think it's, you know, it's been awesome to see.
00:50:44.000 | And then, and again, for like wearing my Dropbox hat,
00:50:47.360 | like anybody who's like scaling a service
00:50:49.880 | to millions of people, again,
00:50:51.000 | I'm probably not using like frontier models
00:50:52.880 | for every request.
00:50:53.760 | It's, you know, there are a lot of different configurations,
00:50:56.840 | mostly with smaller models.
00:50:58.240 | And even before you even talk about getting on the device,
00:51:00.520 | like, you know, you need this whole kind of constellation
00:51:03.160 | of different options.
00:51:05.000 | So open source has been great for that.
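To make the "constellation of different options" concrete, here's a minimal routing sketch in Python. The model names, task labels, and thresholds are all hypothetical, invented for illustration; this is not Dropbox's actual configuration.

```python
import re

# Hypothetical model names: stand-ins for any cheap smaller model
# and any expensive frontier model.
SMALL_MODEL = "small-model"
FRONTIER_MODEL = "frontier-model"

def pick_model(task: str, prompt: str) -> str:
    """Route a request to the cheapest model that can handle it.

    At millions of requests, most traffic (classification, extraction,
    short summaries) can go to a small model; only requests that look
    like they need heavy reasoning get the frontier model."""
    cheap_tasks = {"classify", "extract", "summarize_short"}
    if task in cheap_tasks and len(prompt) < 4000:
        return SMALL_MODEL
    if re.search(r"\b(plan|prove|multi-step|analyze)\b", prompt.lower()):
        return FRONTIER_MODEL
    return SMALL_MODEL

print(pick_model("classify", "Is this file a contract?"))        # small-model
print(pick_model("answer", "Plan a multi-step data migration"))  # frontier-model
```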
00:51:06.440 | - And you were one of the first companies
00:51:08.200 | in the cloud repatriation.
00:51:10.720 | You kind of brought back all the storage
00:51:12.840 | into your own data centers.
00:51:14.360 | Where are we in the AI wave for that?
00:51:16.160 | I don't think people really care today
00:51:17.960 | to bring the models in-house.
00:51:19.840 | Do you think people will care in the future?
00:51:21.600 | Like, especially as you have more small models,
00:51:23.320 | like you want to control more of the economics,
00:51:25.000 | or are the tokens so subsidized
00:51:27.400 | that like it just doesn't matter?
00:51:28.600 | It's more like a principle setting?
00:51:30.520 | - Yeah, I mean, I think there's another one
00:51:32.200 | where like thinking about the future is a lot easier,
00:51:34.480 | or if you start with the past.
00:51:36.200 | So, I mean, there's definitely this like big surge
00:51:39.960 | in demand as like, there's sort of this FOMO-driven bubble
00:51:43.280 | of like all of big tech taking their earnings
00:51:45.800 | and shipping them to Jensen for a couple years.
00:51:48.640 | And then you're like, all right, well, first of all,
00:51:51.120 | we've seen this kind of thing before.
00:51:52.920 | And you know, in the late '90s with like Fiber,
00:51:55.600 | you know, this huge race to like own the internet,
00:51:58.200 | own the information superhighway, literally,
00:52:00.440 | and then way overbuilt.
00:52:02.120 | And then there was this like crash.
00:52:03.920 | I don't know to what extent,
00:52:05.160 | like maybe it is really different this time,
00:52:06.800 | or maybe if we create AGI,
00:52:08.720 | that will sort of solve the rest of the,
00:52:09.920 | or we'll just have a different set of things
00:52:11.640 | to worry about.
00:52:12.640 | But, you know, the simplest way I think about it is like,
00:52:15.120 | this is sort of a rent-not-buy phase,
00:52:18.000 | 'cause, you know, I wouldn't want to be,
00:52:20.520 | we're still so early in the maturity.
00:52:22.120 | You know, I wouldn't want to be buying like pallets
00:52:23.920 | of 286s at a 5X markup,
00:52:28.560 | when like the 386 and 486 and Pentium and everything
00:52:31.400 | are like clearly coming, they're around the corner.
00:52:33.880 | And again, because of open source,
00:52:35.000 | there's just been a lot more competition
00:52:36.440 | at every layer in the stack.
00:52:37.480 | And so product developers
00:52:38.840 | are basically beneficiaries of that.
00:52:40.840 | You know, the things we can do,
00:52:42.000 | well, the sort of cost estimates
00:52:43.320 | I was looking at a year or two ago
00:52:44.880 | to like provide different capabilities in the product,
00:52:47.280 | you know, have been slashed by 10, 100, 1,000X.
00:52:51.640 | I think about coming back around,
00:52:52.720 | I mean, I think, you know, at some point,
00:52:55.080 | you have to believe that the sort of supply and demand
00:52:57.280 | will even out, as it always does.
00:52:58.960 | And then there's also like non-NVIDIA stacks,
00:53:01.240 | like Groq or Cerebras
00:53:02.880 | or some of these custom silicon companies
00:53:04.960 | that are super interesting and outperform the NVIDIA stack
00:53:07.800 | in terms of latency and things like that.
00:53:09.280 | So I guess it'd be a pretty exciting change.
00:53:11.440 | I think we're not close to the point
00:53:13.480 | where we were with like hard drives or storage
00:53:15.680 | when we sort of went back from the public cloud.
00:53:17.920 | 'Cause like there, it was like,
00:53:19.160 | yeah, the cost curves are super predictable.
00:53:21.840 | We know what the cost of a hard drive and a server
00:53:24.400 | and, you know, terabyte of bandwidth and all the inputs
00:53:27.240 | are gonna just keep going down,
00:53:28.640 | riding down this cost curve.
00:53:30.120 | But to like rely on the public cloud to pass that along
00:53:32.800 | is sort of, we need a better strategy
00:53:34.640 | than like relying on the kindness of strangers.
00:53:37.000 | So we decided to bring that in house and still do,
00:53:40.000 | and we still get a lot of advantages.
00:53:41.880 | That said, like the public cloud is like scaled
00:53:44.000 | and been like a lot more reliable
00:53:45.400 | and just good all around than we would have predicted.
00:53:48.000 | 'Cause actually back then we were worried like,
00:53:49.320 | is the public cloud gonna even scale fast enough
00:53:51.280 | to keep up with us?
00:53:53.720 | But yeah, I think we're in the early innings.
00:53:55.200 | It's a little too chaotic right now.
00:53:56.960 | So I think renting and not sort of preserving agility
00:53:59.560 | is pretty important in times like these.
00:54:01.160 | - Yeah, we just went to the Cerebras factory
00:54:03.120 | to do an episode there.
00:54:04.560 | We saw one of their data centers.
00:54:06.880 | Yeah, it's kind of like, okay, if this really works,
00:54:09.240 | you know, it kind of changes everything.
00:54:13.880 | - And that is one of those things,
00:54:15.040 | like this is one where you could just have these things
00:54:17.560 | that just like, okay, there's just like a new kind of piece
00:54:19.960 | on the chessboard, like recalc everything.
00:54:22.320 | So I think there's still, I mean, it's like not that likely,
00:54:25.000 | but I think this is an area where it actually could happen,
00:54:27.160 | where you have these sort of moments where,
00:54:28.520 | out of nowhere, suddenly, all of a sudden,
00:54:31.640 | everything's different.
00:54:32.600 | - Yeah, I know one of the management books
00:54:34.280 | you reference, Andy Grove's "Only the Paranoid Survive."
00:54:37.720 | Maybe if you look at Intel,
00:54:39.040 | they did a great job going from memory to chips,
00:54:41.360 | but then it's like maybe CPU to GPU,
00:54:43.480 | they kind of missed that thing.
00:54:45.520 | How do you think about staying relevant for so long?
00:54:47.840 | Now, it's been 17 years, you've been doing Dropbox.
00:54:50.320 | What's the secret?
00:54:51.160 | And maybe we can touch on founder mode and all of that.
00:54:55.000 | - Yeah, well, first, what makes tech exciting
00:54:57.960 | and also makes it hard is like,
00:54:59.080 | there's no standing still, right?
00:55:00.600 | And your customers never are like, oh no, we're good now.
00:55:03.240 | They always want more just,
00:55:05.600 | and then the ground is shifting under you,
00:55:07.000 | where it's like, oh yeah, well,
00:55:07.920 | files are not even that relevant to the modern,
00:55:10.720 | I mean, it's still important,
00:55:11.560 | but like, you know, so much is tilted elsewhere.
00:55:14.480 | So I think you have to like always be moving
00:55:16.120 | and think about, on the one level, like what is,
00:55:19.000 | and think of these different layers of abstraction.
00:55:20.600 | Like, well, yeah, the technical service we provide
00:55:23.120 | is file syncing and storage in the past,
00:55:25.800 | but in the future, it's gonna be different.
00:55:27.600 | The way Netflix had to like,
00:55:28.480 | well, technically we mail people physical DVDs
00:55:30.880 | and fulfillment centers,
00:55:31.760 | and then we have to switch like streaming and codecs
00:55:33.920 | and bandwidth and data centers.
00:55:35.720 | So you do have to think about that level,
00:55:38.040 | but then it's like,
00:55:38.880 | what's the evergreen problem we're solving
00:55:40.440 | is an important problem.
00:55:41.360 | Can we build the best product?
00:55:42.360 | Can we get distribution?
00:55:43.320 | Can we get a business model?
00:55:44.360 | Can we defend ourselves when we get copied?
00:55:46.760 | And then having like some context of like,
00:55:48.400 | history has always been like one of the,
00:55:50.240 | reading about the history, not just in tech,
00:55:52.160 | but of business or government or sports or military,
00:55:55.840 | these things that seem like totally new.
00:55:57.880 | You know, and to me,
00:55:58.720 | it would have been like totally new as a 25 year old,
00:56:00.640 | like, oh my God,
00:56:01.480 | the world's completely different now
00:56:02.320 | and everything's gonna change.
00:56:03.440 | You're like, well,
00:56:04.320 | there's not a lot of great things about getting older,
00:56:05.760 | but you do see like, well, no,
00:56:06.720 | this actually has like a million like precedents,
00:56:09.440 | and you can actually learn a lot from,
00:56:12.320 | you know, about like the future of GPUs from like,
00:56:15.400 | I don't know how, you know, how Formula One teams work,
00:56:18.280 | or you can draw all these like weird analogies
00:56:20.440 | that are super helpful in guiding you from first principles
00:56:24.040 | or through a combination of first principles
00:56:25.480 | and like past context.
00:56:27.240 | But like, you know, build shit we're really proud of.
00:56:29.480 | Like that's a pretty important first step
00:56:31.680 | and really think about like,
00:56:32.680 | you sort of become blind to like how technology works
00:56:35.200 | as that's just the way it works.
00:56:36.400 | And it's something like carrying a thumb drive.
00:56:38.240 | You're like, well,
00:56:39.080 | I'd much rather have a thumb drive
00:56:40.160 | than like literally not have my stuff
00:56:41.800 | or like have to carry a big external hard drive around.
00:56:43.640 | So you're always thinking like, oh, this is awesome.
00:56:45.440 | Like I ripped CDs and these like MP3s
00:56:47.560 | and these files and folders.
00:56:48.640 | This is the best.
00:56:49.800 | But then you miss on the other side.
00:56:51.640 | You're like, this isn't the end, right?
00:56:53.400 | MP3s and folders.
00:56:54.960 | And Apple comes along and is like, this is dumb.
00:56:56.320 | You should have like a catalog, artists, playlists.
00:56:58.720 | You know, then Spotify is like, hey, this is dumb.
00:57:00.720 | Like you should, why are you buying these things
00:57:02.360 | a la carte?
00:57:03.200 | It's the internet.
00:57:04.040 | You should have access to everything.
00:57:04.920 | And then by the way,
00:57:05.760 | why is this like such a single player experience?
00:57:07.200 | You should be able to share
00:57:08.280 | and there should be AI curated, et cetera, et cetera.
00:57:11.920 | And then a lot of it is also just like drawing,
00:57:14.840 | connecting dots between different disciplines, right?
00:57:16.680 | So a lot of what we did to make Dropbox successful
00:57:18.520 | is like we took a lot of the consumer internet playbook,
00:57:20.840 | applied it to business software from a virality
00:57:23.280 | and kind of ease of use standpoints.
00:57:25.440 | And then, you know, I think there's a lot of,
00:57:26.800 | you can draw from the consumer realm and what's worked there
00:57:29.120 | and that hasn't been ported over to business, right?
00:57:31.960 | So a lot of what we think about is like,
00:57:34.760 | yeah, when you sign into Netflix or Spotify or YouTube
00:57:38.440 | or any consumer experience, like what do you see?
00:57:41.080 | Well, you don't see like a bunch of titles
00:57:43.080 | starting with AA, right?
00:57:44.720 | You see like this whole, and it went on evolution, right?
00:57:48.040 | Like we talked about music.
00:57:49.520 | I mean, TV went through the same thing,
00:57:50.880 | like 10 channels over the air broadcast
00:57:53.080 | to 30 channels, 100, but then something like 1,000 channels,
00:57:56.160 | you're like, this has totally lost the plot.
00:57:58.240 | So we're sort of in the 1,000 channels era
00:58:00.480 | of productivity tools where we're just like,
00:58:02.160 | wait, wait, we just need to like rethink the system here.
00:58:04.760 | And we don't need another 1,000 channels.
00:58:06.600 | We need to redesign the whole experience.
00:58:08.600 | And so, you know, I think the consumer experiences
00:58:10.920 | that are like smart, you know, when you sign into Netflix,
00:58:13.680 | it's not like 1,000 channels.
00:58:15.000 | It's like, here are a bunch of smart defaults.
00:58:16.920 | Even if you're a new signup,
00:58:18.000 | we don't know anything about you,
00:58:19.480 | but because of what the world is watching,
00:58:21.000 | here are some, you know, reasonable suggestions.
00:58:23.480 | And then it's like, okay, I watched "Drive to Survive."
00:58:25.400 | I didn't watch "Squid Game."
00:58:26.920 | You know, the next time I sign in,
00:58:28.200 | it's like a complete, it's a learning system, right?
00:58:29.840 | So a combination of design, machine learning,
00:58:33.160 | and just like the courage to like rethink the whole thing.
00:58:35.440 | I think that's a pretty reliable recipe.
00:58:39.120 | And then you think, you're like, all right,
00:58:40.440 | there's all that intelligence.
00:58:42.120 | In the consumer experience, there's no filing things away.
00:58:44.240 | Everything's just all sort of auto-curated for you
00:58:47.600 | and sort of self-optimizing.
00:58:48.920 | Then you go to work and you're like,
00:58:49.960 | there's not even an attempt to incorporate
00:58:52.200 | any intelligence or organization
00:58:53.760 | anywhere in this experience.
00:58:55.600 | And so like, okay, can we do something about that?
00:58:57.960 | - You know, you're one of the last founder CEOs.
00:59:00.000 | Like, you know, you,
00:59:01.720 | Tobi Lütke, some of these folks.
00:59:03.480 | How does that change?
00:59:05.000 | - I'm like 300 years old, and why can't I be a founder?
00:59:07.000 | - No, but I'm saying like, when you run a company,
00:59:08.920 | like, you've had multiple executives over the years.
00:59:11.600 | Like, how important is that for the founder to be CEO
00:59:14.760 | and just say, hey, look,
00:59:15.960 | we're changing the way the company and the strategy works.
00:59:18.120 | It's like, we're really taking this seriously.
00:59:19.520 | Versus like, you could be a public CEO and be like,
00:59:21.600 | hey, I got my earnings call, and like, whatever,
00:59:24.200 | I just need to focus on getting the right numbers.
00:59:26.320 | Like, how does that change the culture in the company?
00:59:29.040 | - Yeah, well, I think it sort of dovetails
00:59:30.920 | with the founder mode whole thing.
00:59:32.920 | You know, I think founder mode's kind of this Rorschach test.
00:59:35.680 | It's sort of like ill-specified,
00:59:37.560 | so it's sort of like whatever you,
00:59:39.000 | you know, it is whatever you see in it.
00:59:41.000 | I think it's also like a destination you get to
00:59:44.200 | more than like a state of mind, right?
00:59:46.680 | So if you think about, you know, imagine someone,
00:59:48.840 | there was something called surgeon mode.
00:59:50.560 | You know, giving a med student a scalpel on day one,
00:59:53.120 | it's like, okay, hold up.
00:59:54.500 | You know, so there's something to be said
00:59:57.120 | for like experience and conviction,
00:59:59.400 | and you know, you're gonna do a lot better.
01:00:01.440 | A lot of things are a lot easier for me
01:00:03.160 | like 17 years into it than they were one year into it.
01:00:06.440 | I think part of why founder mode is so resonant is,
01:00:09.920 | or it's like striking such a chord with so many people is,
01:00:12.440 | yeah, there's a real power when you have like a directive,
01:00:15.160 | intuitive leader who can like decisively take the company
01:00:19.280 | like into the future.
01:00:20.760 | It's like, how the hell do you get that?
01:00:22.840 | And I think every founder who makes it this long,
01:00:25.720 | like kind of can't help it,
01:00:27.240 | but to learn a lot during that period.
01:00:29.220 | And you talk about the, you know,
01:00:32.440 | Steve Jobs or Elons of the world.
01:00:32.440 | They did go through like wandering,
01:00:34.400 | a period of like wandering in the desert,
01:00:36.460 | or like nothing was working and they weren't the cool kids.
01:00:39.320 | I think you either sort of like unsubscribe
01:00:40.960 | or kind of get off the train during that,
01:00:42.640 | and I don't blame anyone for doing that.
01:00:44.000 | There are many times where I thought about that,
01:00:45.760 | but I think at some point you sort of,
01:00:47.520 | it all comes together and you sort of start being able
01:00:49.480 | to see the matrix.
01:00:50.360 | So you've sort of seen enough and learned enough.
01:00:52.320 | And as long as you keep your learning rate up,
01:00:54.160 | you can kind of surprise yourself in terms of like
01:00:56.000 | how capable you can become over a long period.
01:00:58.460 | And so I think there's a lot of like founder CEO journey,
01:01:00.720 | especially as an engineer, like, you know,
01:01:02.680 | I never like set out to be a CEO.
01:01:05.240 | In fact, like the more I like understood
01:01:08.120 | in the early days what CEOs did,
01:01:09.260 | the more convinced I was that I was like
01:01:10.680 | not the right person actually.
01:01:12.840 | And it was only after some like shoving
01:01:14.400 | by a previous mentor.
01:01:15.720 | who was like, okay, just go try it.
01:01:17.400 | And if you don't like it,
01:01:18.240 | then you don't have to do it forever.
01:01:20.280 | So I think you start founder mode,
01:01:22.160 | you sort of default that because there's like,
01:01:25.160 | you realize pretty quickly like nothing gets done
01:01:27.200 | in this company unless the founders
01:01:28.520 | are literally doing it by hand.
01:01:30.200 | Then you scale and then you're like, you get,
01:01:32.440 | you know, a lot of actually pretty good advice
01:01:35.200 | that like you can't do everything yourself.
01:01:37.360 | Like you actually do need to hire people
01:01:38.760 | and like give them real responsibilities and empower people.
01:01:41.720 | And that's like a whole discipline called like management
01:01:43.800 | that, you know, we're not figuring out
01:01:45.600 | for the first time here.
01:01:47.040 | But then you, then there's a tendency
01:01:49.480 | to like lean too far back.
01:01:51.960 | You know, it's tough.
01:01:52.800 | And if you're like a 30-year-old
01:01:53.800 | and you hire a 45-year-old exec from, you know,
01:01:56.640 | high-flying company and a guy who was running
01:01:58.720 | like a $10 billion P&L and came to work for Dropbox
01:02:02.840 | where we were like a fraction of a billion dollar P&L.
01:02:05.560 | And, you know, what am I going to tell him about sales?
01:02:08.080 | Right?
01:02:08.920 | And so you sort of recognize pretty quickly,
01:02:10.360 | like I actually don't know a lot
01:02:11.520 | about all these different disciplines
01:02:12.800 | and like maybe I should lean back
01:02:14.160 | and like let people do their thing.
01:02:15.280 | But then you can create this,
01:02:16.520 | like if you lean too far back out,
01:02:18.080 | you create this sort of like vacuum,
01:02:19.680 | leadership vacuum where people are like,
01:02:21.840 | what are we doing?
01:02:23.360 | And then, you know, the system kind of like,
01:02:26.800 | nature abhors a vacuum.
01:02:26.800 | It builds all these like kind of weird structures
01:02:29.120 | just to keep the thing like standing up.
01:02:31.280 | And then at some point you learn enough of this
01:02:33.400 | that you're like, wait, this is not how,
01:02:34.800 | this should be designed.
01:02:36.120 | And you actually get like the conviction
01:02:38.440 | and you've learned enough to like know what to do
01:02:41.560 | and things like that.
01:02:42.400 | And then on the other side, you lean way back in.
01:02:44.480 | I think it's more of like a table flipping
01:02:45.880 | where you're like, hey,
01:02:46.720 | this company is like not running the way I want it.
01:02:48.800 | Like something, I don't know what happened,
01:02:50.160 | but it's going to be like this now.
01:02:52.120 | And I think that that's like an important
01:02:53.360 | developmental stage for a founder, CEO.
01:02:56.040 | And if you can do it right and like make it to that point,
01:02:58.040 | like then the job becomes like a lot of fun and exciting
01:03:02.960 | and good things happen for the company,
01:03:04.520 | good things happen for your customers.
01:03:05.840 | But it's not, it's like a really rough learning journey.
01:03:09.520 | - It is, it is.
01:03:10.400 | I've had many therapy sessions with founder CEOs.
01:03:13.880 | Let's go back to the beginning.
01:03:15.360 | Like today, the AI wave is like so big
01:03:17.760 | that like a lot of people are kind of scared
01:03:19.400 | to jump in the water.
01:03:20.800 | And when you started Dropbox,
01:03:22.720 | one article said,
01:03:23.560 | "Fortunately, the Dropbox founders are too stupid
01:03:25.840 | to know everyone's already tried this."
01:03:27.800 | In AI now, it kind of feels the same.
01:03:30.120 | You have a lot of companies that sound the same,
01:03:32.000 | but like none of them are really working.
01:03:34.360 | So obviously the problem is not solved.
01:03:36.600 | Do you have any advice for founders trying to navigate
01:03:39.160 | like the idea maze today and like what they should do,
01:03:41.680 | what are like counterintuitive things maybe to try?
01:03:45.280 | - Well, I think like, you know,
01:03:46.880 | bringing together some of what we've covered,
01:03:48.280 | I think there's a lot of very common kind of category errors
01:03:51.400 | that founders make.
01:03:52.680 | One is, you know, thinking starting from the technology
01:03:55.760 | versus starting from like a customer
01:03:57.160 | or starting from a use case.
01:03:58.800 | And I think every founder has to start with what you know.
01:04:01.240 | Like you're, yeah, you know, maybe if you're an engineer,
01:04:03.440 | you know how to build a product,
01:04:05.000 | but you don't know much
01:04:08.120 | about the next hurdles
01:04:09.360 | you have to go through.
01:04:10.200 | So I think the biggest lesson would be
01:04:13.960 | you have to keep your personal growth curve
01:04:16.040 | ahead of the company's growth curve.
01:04:17.920 | And for me, that meant you have to be like super systematic
01:04:20.040 | about training up what you don't know.
01:04:21.920 | 'Cause no one's gonna do that for you.
01:04:23.000 | Your investors aren't gonna do that.
01:04:24.120 | Like literally no one else will do that for you.
01:04:26.200 | And so then you have to have like, all right, well,
01:04:29.240 | and I think the most important,
01:04:30.360 | one of the most helpful questions to ask there is like,
01:04:32.720 | in five years from now,
01:04:33.760 | what do I wish I had been learning today?
01:04:35.160 | In three years from now? In one year?
01:04:37.200 | You know, how will my job be different?
01:04:38.960 | How do I work back from that?
01:04:40.520 | And so, for example, you know, when I was just starting
01:04:42.600 | in 2007, it really was just like coding
01:04:45.600 | and talking to customers.
01:04:46.960 | And it's sort of like the YC ethos.
01:04:49.360 | You know, make something people want
01:04:50.440 | and coding and talking to customers
01:04:51.800 | are really all you should be doing in that early phase.
01:04:53.440 | But then if I were like, all right,
01:04:54.960 | well, that's sort of YC phase.
01:04:57.800 | What are the next hurdles?
01:04:58.960 | Well, a year from now,
01:05:00.600 | to get people, we're gonna need to fundraise,
01:05:02.280 | like raise money.
01:05:03.520 | Okay, to raise money, we're gonna have to like,
01:05:04.720 | have to answer all these questions we have to,
01:05:06.440 | so you like work back from that and you're like,
01:05:07.760 | all right, we need to become like an expert
01:05:09.120 | in like venture capital financing.
01:05:10.240 | And then, you know, the circle keeps expanding.
01:05:11.840 | Then if we have a bunch of money,
01:05:12.760 | we're gonna need like accountants and lawyers and employees,
01:05:14.960 | and I'm gonna have to start managing people.
01:05:17.160 | Then two years would be like,
01:05:18.080 | well, we're gonna have this like product,
01:05:19.520 | but then we're gonna need users,
01:05:20.480 | we're gonna need money, revenue.
01:05:22.320 | And then in five years, it'd be like,
01:05:23.280 | yeah, we're gonna be like tangling
01:05:24.240 | with like Microsoft, Google, Apple, Facebook,
01:05:25.880 | but just like everybody.
01:05:27.840 | And like, somehow we're gonna have to like deal with that.
01:05:29.360 | And then that's like what the company's got to deal with.
01:05:31.520 | And as CEO, I'm gonna be responsible for all that.
01:05:33.280 | But then like my personal growth,
01:05:34.320 | or there's all these skills I'm gonna need.
01:05:35.480 | I'm gonna like need to know like what marketing is
01:05:37.920 | and like what finance is and how to manage people,
01:05:42.760 | how to be a leader, whatever that is.
01:05:44.480 | And so, and then I think one thing people often do
01:05:48.640 | is like, oof, like that,
01:05:50.000 | it's like imposter syndrome kind of stuff.
01:05:51.560 | You're like, oh, it seems so remote or far away
01:05:53.520 | that I'm not comfortable speaking publicly,
01:05:55.960 | or I've never managed people before.
01:05:57.240 | I haven't this, I haven't,
01:05:58.080 | and like even learning a little bit about it
01:06:00.960 | makes it feel even worse.
01:06:01.800 | You're like, before, I thought I didn't know a lot,
01:06:03.520 | now I know I don't know a lot.
01:06:06.080 | Part of it is more technical.
01:06:07.400 | Like how do I learn all these different disciplines
01:06:09.160 | and sort of train myself?
01:06:10.200 | And a lot of that's like reading,
01:06:11.880 | you know, having founders or community
01:06:13.640 | that are sort of going through the same things.
01:06:15.240 | That was how I learned.
01:06:16.600 | Maybe reading was the single most helpful thing
01:06:18.600 | more than any one person or talking to people,
01:06:22.160 | like reading books.
01:06:23.320 | But then there's a whole mindset piece of it,
01:06:24.880 | which is sort of like you have to cut yourself
01:06:26.320 | a little bit of slack.
01:06:27.320 | Like, you know, I wish someone had sort of sat me down
01:06:29.960 | and told me like, dude, you may be an engineer,
01:06:31.840 | but like look, all the tech founders that,
01:06:33.640 | you know, tech CEOs that you admire,
01:06:35.400 | like they actually all, you know,
01:06:36.600 | almost all of them started out as engineers.
01:06:38.240 | They learned the business stuff on the job.
01:06:40.760 | So like this is actually something that's normal
01:06:42.480 | and achievable.
01:06:43.320 | You're not like broken for not knowing.
01:06:45.360 | No, those people didn't, weren't like,
01:06:47.080 | didn't come out of the womb with like shiny hair
01:06:48.960 | and an Armani suit.
01:06:50.480 | You know, you can learn this stuff.
01:06:51.720 | So even just like knowing it's learnable.
01:06:53.400 | And then second, like,
01:06:54.240 | but I think there's a big piece of it around like discomfort
01:06:56.280 | where it's like, I mean,
01:06:57.360 | we're like kind of pushing the edges.
01:06:59.080 | I don't know if I want to be CEO,
01:07:00.320 | or I don't know if I'm ready for this, this, this.
01:07:01.960 | Like learning to like walk towards that
01:07:04.120 | when you want to run away from it.
01:07:06.080 | And then lastly, I think, you know,
01:07:07.880 | just recognizing the time constant.
01:07:09.200 | So in five weeks, you're not going to be a great leader,
01:07:12.080 | manager, or a great public speaker or whatever,
01:07:14.280 | you know, any more than you'll be
01:07:15.240 | a great guitar player or play a sport that well,
01:07:17.560 | or be a surgeon.
01:07:18.920 | But in like five years, like actually,
01:07:20.800 | you can be pretty good at any of those things.
01:07:23.760 | Maybe you won't be like fully expert,
01:07:25.360 | but you have like a lot more latent potential.
01:07:28.240 | I mean, people have a lot more latent potential
01:07:29.600 | than they fully appreciate,
01:07:30.920 | but it doesn't happen by itself.
01:07:32.280 | You have to like carve out time
01:07:33.320 | and really be systematic about unlocking it.
01:07:36.080 | - How do you think about that for building your team?
01:07:38.120 | I know you're a big Pats fan.
01:07:39.320 | Obviously, that's a great example of building a dynasty
01:07:42.760 | on like some building blocks
01:07:43.800 | and bringing people into the system.
01:07:45.520 | When you're building a company,
01:07:46.720 | like how much slack do you have people on?
01:07:49.640 | Hey, you're going to learn this
01:07:50.560 | versus like how do you measure like the learning rate
01:07:52.880 | of the people you hire?
01:07:53.720 | And like, how do you think about picking and choosing?
01:07:56.000 | - Great question.
01:07:56.840 | It's hard.
01:07:57.680 | What you want is a balance, right?
01:07:59.400 | And we've had a lot of success with great leaders
01:08:02.720 | who actually grew up with a company,
01:08:05.160 | started as an IC engineer or something,
01:08:07.320 | then made their way to whatever level.
01:08:09.440 | Our exec team is populated with a lot of those folks,
01:08:11.520 | but there's also a lot of benefit to experience
01:08:15.120 | and having seen different environments
01:08:16.520 | and kind of been there, done that.
01:08:18.080 | And there's a lot of drawbacks to kind of learning
01:08:19.640 | by trial and error only.
01:08:22.120 | And then even your high potential people
01:08:23.800 | like can go up the learning curve faster
01:08:25.360 | if they have like some experience to learn from.
01:08:27.280 | Now, like experience isn't a panacea.
01:08:28.840 | You can have organ rejection or misfit
01:08:32.320 | or like overfitting from their past experience
01:08:34.600 | or cultural mismatches or you name it.
01:08:37.040 | I've seen it all.
01:08:37.880 | I've kind of gotten all the mistake merit badges on that.
01:08:41.720 | But I think it's like constructing a team
01:08:43.120 | where there's a good balance.
01:08:44.000 | Like, okay, for the high potential folks
01:08:45.440 | who are sort of in the biggest jobs of their lives,
01:08:47.680 | do they have someone managing them
01:08:50.000 | that they can learn from? You know, as a CEO
01:08:52.520 | or as a manager,
01:08:54.680 | part of your job is to surround and support them.
01:08:57.920 | So getting the mentors or getting first time execs
01:09:00.400 | like mentors who have been there, done that,
01:09:01.880 | or getting them into, you know,
01:09:04.200 | the communities for their function.
01:09:05.480 | There's usually like a social group,
01:09:06.760 | like, oh, chiefs of staff of Silicon Valley.
01:09:08.920 | There's usually these informal
01:09:11.080 | kind of communities you can join.
01:09:13.480 | And then, yeah, you just don't want to be too rotated
01:09:15.840 | in one direction or the other.
01:09:16.960 | 'Cause we've done it.
01:09:17.800 | We've like overdone it on the high potential piece,
01:09:19.800 | but then like everybody's kind of making dumb mistakes.
01:09:22.360 | - Right, yeah, yeah, yeah.
01:09:23.200 | - The bad mistakes are the ones where you're like,
01:09:25.080 | either you're making it multiple times
01:09:26.480 | or like these are known knowns to the industry,
01:09:28.680 | but if they're not known,
01:09:30.200 | if they're like unknown unknowns to your team,
01:09:31.960 | then you have a problem.
01:09:34.200 | And then again,
01:09:35.840 | if you only hire external people,
01:09:38.000 | then you're sort of at the mercy
01:09:39.600 | of whatever random average
01:09:41.600 | of culture or practices they bring in,
01:09:44.200 | and it can create resentment or a lack of career opportunities.
01:09:47.920 | So it's really about balance, you know,
01:09:49.480 | it doesn't really matter if it's like exactly 50/50.
01:09:51.320 | I don't think there's a sort of perfect balance,
01:09:52.960 | but you just need to be sort of
01:09:54.360 | tending that garden continuously.
01:09:57.280 | - Also, Drew, just to wrap,
01:09:58.800 | do you have any call to actions?
01:10:00.760 | Like who should come work at Dropbox?
01:10:02.560 | Like who should use Dropbox?
01:10:03.920 | Anything you want to tell people?
01:10:06.640 | - Well, I mean,
01:10:07.480 | today is a super exciting day
01:10:09.000 | because we just launched Dash for business.
01:10:11.160 | And, you know, we've talked a little bit about the product.
01:10:13.160 | It's like universal search, universal access control,
01:10:16.080 | a lot of rethinking, sharing for the modern environment.
01:10:19.280 | But, you know, what's personally exciting,
01:10:20.960 | and we could talk about the product,
01:10:22.040 | but it's just really exciting for me because
01:10:24.480 | this is the first and most major,
01:10:26.160 | most public step we've taken
01:10:27.720 | from our kind of Dropbox 1.0 roots.
01:10:30.320 | And there's probably a lot of people out there
01:10:32.680 | who either like grew up not using Dropbox
01:10:35.800 | or like, yeah, I used Dropbox like 10 years ago
01:10:38.680 | and it was cool, but I don't use it that much.
01:10:40.800 | So I think there's a lot of new reasons
01:10:42.600 | to kind of tune into what we're doing.
01:10:44.760 | And it's been a lot of fun.
01:10:47.520 | I think the AI era
01:10:49.880 | has created all these new like paths forward for Dropbox
01:10:52.960 | that wouldn't have been here five years ago.
01:10:55.440 | And then, yeah, to the founders, like, you know,
01:10:57.240 | hang in there, do some reading.
01:10:58.800 | And don't be too stressed about it.
01:11:02.800 | So we're pretty lucky to get to do what we do.
01:11:04.760 | - Yeah, watch the Pats documentary on Apple TV.
01:11:08.400 | - Yeah, Bill Belichick, I'm still a Pats fan.
01:11:10.760 | Really got into F1.
01:11:12.200 | So we're technology partners to McLaren.
01:11:14.240 | They're doing super well.
01:11:15.080 | - So were you a McLaren fan
01:11:15.920 | before you were a technology partner?
01:11:17.080 | So did you become partners because--
01:11:19.080 | - It sort of like co-evolved, yeah.
01:11:20.920 | I mean, I was a fan beforehand,
01:11:22.040 | but I'm like a lot more of a fan now, as you'd imagine.
01:11:24.760 | - Awesome, well, thank you so much for your time, Drew.
01:11:26.880 | - Awesome. - This was great.
01:11:28.000 | - Lot of fun. - Yeah.
01:11:28.840 | - Thanks for having me.
01:11:29.680 | (upbeat music)