
Everything is ugly, so go build something that isn't — Raiza Martin, Huxe (ex NotebookLM)


Whisper Transcript | Transcript Only Page

00:00:00.000 | Really quick show of hands, how many folks actually work in product? Wow, okay. Engineering, UX. Okay, I feel like there is definitely some overlap there, right? But that's exactly what we're seeing happen right now.
00:00:28.340 | Sorry, I have to keep walking over here because I'm so short. If I stand here, I can't see you. But what's crazy is I feel like we're all doing all the jobs now, right?
00:00:39.000 | Like that's the crazy thing about AI. That's the crazy thing about right now. And I wonder how much longer it's going to be relevant to ask, you know, what do you do?
00:00:49.220 | What do you do at your company? Because chances are you're probably doing everything. And you can, right? With some ease. And so I think a lot of questions,
00:00:57.180 | people ask me this all the time. Students in particular ask me this all the time. So what does product do, like, in this new world?
00:01:04.120 | And so the way that I think about product is kind of like a multi-layer cake. And I think that it starts with how we think about ourselves.
00:01:12.700 | And I know this is like kind of a weird topic. It's like not super technical, but I think it's probably one of the most important,
00:01:19.180 | which is I remember when it was considered technical to be the type of PM that could write your own SQL queries, right?
00:01:26.840 | But, you know, if I said that to you now, like, it's kind of silly because everybody knows how to write their own SQL queries because ChatGPT does, right?
00:01:34.180 | Like, all you have to do is know what question you're asking and then you can do it, right? But it's not just like these little tasks that you can do.
00:01:41.180 | It's also like the entire roles themselves that are changing because it's easier than ever to access expertise or simulate it, right?
00:01:52.920 | So in a world where the jobs are blending together, I feel like it's even more important to understand, you know, where are you coming from?
00:02:00.180 | What is the value that you bring, you know, coming from yourself? And then there's this other thing that's really interesting that's happening,
00:02:08.180 | which is teams are becoming drastically reconfigured. And even at like a super basic level, like if you think about it,
00:02:15.180 | every team now has a bunch of like invisible participants, right? Like every doc, every slide, every little bit of thing that is passed around or created,
00:02:25.180 | there's like a ChatGPT or a Claude or a Gemini behind that thing, right? Like most of the time, even when I read something like with my own raw eyeballs,
00:02:34.180 | I'll still give it to ChatGPT and be like, what did I miss? Right? Like, here's my takeaways. What's yours? And that's crazy because teams are now fully augmented, right?
00:02:44.180 | We've got like five people on a team and we've got five ChatGPTs. And that's crazy. We don't really know what that means, right?
00:02:51.180 | But we've got superpowers now. How are we going to use it? And then there's this layer, which I think is like super interesting,
00:02:58.180 | which is this is where the title slide comes from, where I think products come from people, like deep within people, right?
00:03:05.180 | You pull an idea out of yourself and you translate it into technology and that's what a product is. But when you look at every product that we are using,
00:03:15.180 | you can tell we are in the clunky, awkward years, right? You can tell everything is about to change. And literally everything we are using is the ugliest that it will ever be.
00:03:27.180 | Because everything we use was created in like the pre-AI era where, you know, we had to imagine, well, how do I make this thing work? Like if I press this button, like it makes a J on my screen.
00:03:39.180 | Like now it feels like kind of strange. Like when you think about the richness of like the interactions that are possible, it now feels kind of strange to use like an everyday thing like a microwave or maybe it's just me.
00:03:51.180 | Maybe I'm the only one who wants like an AI microwave, right? Or whatever, whatever that thing does.
00:03:55.180 | But I think what I'm trying to say is like, we're starting to assemble the shapes of what we think AI is really capable of and what it can deliver.
00:04:04.180 | And I think that the best products out there haven't been discovered yet.
00:04:07.180 | How do we discover it? Well, I think the answer lies in the final layer and sort of typically, right, what a product manager would say, I think it's in the user layer.
00:04:17.180 | And what I mean by this is, I don't know if you all have noticed it, but there is this kind of like consumer unrest, right?
00:04:27.180 | Where as we use products like ChatGPT, Cursor, Claude, right, you start to see, well, it's super easy to use this shit.
00:04:37.180 | It's just like I have to say what I want and something magical comes back.
00:04:41.180 | And now I have to go use the rest of this dumb thing, like dumb products out there that don't do that.
00:04:46.180 | And so you have a little bit of this chasm now where you have these super powerful, really intuitive, really smart products.
00:04:54.180 | And you have the rest of the world, which is just like janky.
00:04:57.180 | And I think that what we're going to see is that there's going to be a phase where everything gets rebuilt, right?
00:05:06.180 | So how should we think about rebuilding?
00:05:10.180 | Well, first of all, I think we should not undersell the fact that there is a lot of chaos.
00:05:16.180 | Like all the time, I think I try to emphasize this to people, even though things are like really cool and pretty magical,
00:05:25.180 | it's also pretty fucking hard because each of these layers is being effectively rewritten.
00:05:31.180 | And that is not without cost, right?
00:05:33.180 | Like even the first question I asked was like, hey, what do you do for a living?
00:05:36.180 | Like, it's kind of weird.
00:05:37.180 | Like, it's really uncomfortable to be like, I don't know.
00:05:39.180 | Like, are they still going to be hiring product managers next year?
00:05:41.180 | Not sure.
00:05:42.180 | Engineers?
00:05:43.180 | Dunno.
00:05:44.180 | Designers?
00:05:45.180 | Maybe.
00:05:46.180 | Right?
00:05:47.180 | And that unrest lives inside of us at each of these layers.
00:05:50.180 | And that's crazy.
00:05:51.180 | And so I started out this spiel asking, you know, what is the role of product?
00:05:56.180 | And ultimately, I think that complementary to chaos is always opportunity.
00:06:02.180 | Right?
00:06:03.180 | And our job as product people.
00:06:06.180 | I don't say product managers.
00:06:07.180 | I think just like whoever you are.
00:06:09.180 | Right?
00:06:10.180 | Like if you embody sort of like the force behind a product, this applies to you.
00:06:14.180 | Your job is to find the nugget of opportunity out there and explode it like a popcorn kernel.
00:06:21.180 | Right?
00:06:22.180 | Like that is your singular job.
00:06:24.180 | And I think it's actually kind of exciting if you work at an organization where every person embraces this mission.
00:06:31.180 | Okay.
00:06:32.180 | So my talk is largely about this opportunity and how I think you should go about exploding it.
00:06:36.180 | I think there are a lot of talks that you could attend that tell you sort of practically how to, you know, technically build products.
00:06:42.180 | But like, I really want to talk through sort of the principles that I use to drive product building.
00:06:48.180 | Right?
00:06:49.180 | So let's jump in.
00:06:50.180 | What does it take to build a great AI product?
00:06:53.180 | So I think first I want to acknowledge that for folks who have shipped things, especially, you know, if you're part of a team or a company, I think that building a product is a forceful experience.
00:07:07.180 | Like, I tried to think about the right word to use here.
00:07:11.180 | I actually first was like, I think it's a violent experience.
00:07:14.180 | And my team was like, I don't know if you can say that.
00:07:17.180 | Right?
00:07:18.180 | Like, I'm pretty sure people don't think of a tech job as violent.
00:07:22.180 | I was like, okay, okay, okay.
00:07:23.180 | It's forceful.
00:07:24.180 | Right?
00:07:25.180 | And what I mean when I say that is I think that you almost have to force a product into existence.
00:07:32.180 | And I don't mean sort of like the hobby stuff.
00:07:34.180 | Right?
00:07:35.180 | Like I mean to truly build something that can meaningfully exist as a product that has a place in people's lives.
00:07:42.180 | Like that takes a lot of force.
00:07:44.180 | And I think that to do that, to do that particularly well, you need a lot of personal clarity.
00:07:51.180 | Right?
00:07:52.180 | And it's like, it's like this thing that is inside of like an individual person.
00:07:56.180 | Right?
00:07:57.180 | Sometimes we talk about clarity.
00:07:58.180 | We talk about like, oh, the team has to be clear on this.
00:08:01.180 | The org has to be clear on this.
00:08:04.180 | Right?
00:08:05.180 | Like I don't think that's where it starts.
00:08:06.180 | I actually think it has to start with sort of a singular individual that like carries this clarity
00:08:11.180 | with them.
00:08:12.180 | Right?
00:08:13.180 | Because once you know the what of what you're building and the why of it, that's real energy.
00:08:19.180 | And I think when people talk about technology, we're always talking about technology, tech, the stack, et cetera.
00:08:25.180 | Right?
00:08:26.180 | Hiring.
00:08:27.180 | I think everything we are talking about is just a transformed energy that comes from people.
00:08:32.180 | And so ultimately, this personal clarity is what's going to give you the energy to push your team,
00:08:38.180 | to push your stakeholders, and to push your users because it's like, it's really hard.
00:08:44.180 | And your primary role is just to cultivate that relentlessly.
00:08:49.180 | And I think it's three things, right?
00:08:50.180 | It's the clarity of your vision, the clarity of your purpose, and the clarity of taste, which I'll talk more about in just a little bit.
00:08:57.180 | I'll tell you a short story, which is, has anybody ever seen this or used this thing?
00:09:04.180 | Okay.
00:09:05.180 | Well, this was the first version of NotebookLM.
00:09:06.180 | It was called Tailwind.
00:09:07.180 | We announced it at Google I/O in 2023.
00:09:11.180 | And I will never forget the road to get to this thing, right?
00:09:15.180 | Like, I think the amount of people that told me it was stupid was actually, like, fascinatingly high.
00:09:22.180 | And it was really great that I was like, whoa, I think it might actually be stupid.
00:09:27.180 | But it's like that force, right, that personal clarity that gave me the energy to keep driving forward with it.
00:09:34.180 | And in reality, the reason why I had so much personal clarity, I don't think a lot of people know this, but I had dropped out of college.
00:09:40.180 | And I went back to school full time when I was working at Google.
00:09:44.180 | And so I was full time in college, I was full time building NotebookLM, and I was like, I don't know how else to explain this to you.
00:09:51.180 | But if I could just have a tool where I put a bunch of shit in it, right, like just a bunch of docs, a bunch of slides, and I just chat with it, and it does something for me, that seems really valuable.
00:10:03.180 | Like, I've never been able to do that before, right?
00:10:06.180 | And I'm not just bolting it on.
00:10:07.180 | Like, I want to build this thing from the ground up.
00:10:09.180 | And so every time somebody would tell me it was stupid, whether it was a user, a stakeholder, teammates even, would be like, I don't understand, right?
00:10:16.180 | I would say, no, I do, though, right?
00:10:19.180 | Like, I get it.
00:10:20.180 | I know why this thing is important.
00:10:22.180 | And so personal clarity will get you far.
00:10:25.180 | And it will get your team far.
00:10:27.180 | And so I highly recommend starting from this place of just, like, cultivating this energy.
00:10:34.180 | Okay.
00:10:35.180 | So now you have clarity.
00:10:36.180 | Great.
00:10:37.180 | How do you turn that into a real thing?
00:10:39.180 | Well, I think that the first thing that I always tell people to do is you have to start with the job and not the pixels.
00:10:44.180 | Right?
00:10:45.180 | Because when we talk about taste, I think sometimes people feel that it's about an aesthetic thing.
00:10:51.180 | But I actually think it's about an outcome, right?
00:10:54.180 | I think it's about the question of what is the single outcome that your product has to deliver every time for every user flawlessly.
00:11:05.180 | Like, that is purpose, right?
00:11:07.180 | Purpose is the North Star that tells you whether a feature is gold or if it's just baggage.
00:11:14.180 | And it's the antidote to a really common problem that I see all the time, which is AI demo disease, which is, hey, this thing is a cool demo.
00:11:24.180 | I made it.
00:11:25.180 | It demos really well.
00:11:26.180 | I made a really cool Twitter video or whatever.
00:11:28.180 | But these are not real products, right?
00:11:31.180 | If they're grounded in hype, if they're grounded in sort of like trying to ride the waves of chaos, you're not going to get anywhere.
00:11:38.180 | So purpose is what helps you say no to novelty when it's diluting your core job, right?
00:11:44.180 | Users literally do not give a shit if something is AI.
00:11:46.180 | I think people are actually kind of tired of the word.
00:11:49.180 | I think that what people care about ultimately is when they have an intent and you deliver it to them in a way that feels inevitable, right?
00:11:59.180 | And I think we go back to that energy, right?
00:12:01.180 | It takes energy to get there.
00:12:03.180 | It's very hard.
00:12:04.180 | And you need to be purpose-obsessed in order to get there.
00:12:06.180 | And here's another example that I want to give, which is I was singularly obsessed with this use case of, like, I want to put 50 things in the tool and I want to be able to do stuff with it.
00:12:17.180 | And summarization was a really big one because I figured if you could do this, you could certainly do a lot more things across data, right?
00:12:25.180 | And it was really hard to do this.
00:12:28.180 | It was really hard for UX reasons.
00:12:30.180 | It was really hard for sort of the actual way that we were able to generate this in a smart way.
00:12:36.180 | But the trade-off is that once we built this auto-summary into NotebookLM, it made it so much easier for people to understand a really foreign concept at the time.
00:12:46.180 | Right?
00:12:47.180 | Which is like the concept that I would put 50 files in one place and I would interact with it in this different way that we hadn't really done before.
00:12:54.180 | It was like a little token feature that was both a tutorial and was useful at a glance.
00:13:01.180 | But it's like you could not have arrived at this type of idea, which looks really basic, just like from the outside of it.
00:13:08.180 | You could not have arrived at it if you were not sort of obsessively trying to drive at, like, the value that you're trying to deliver to somebody.
00:13:16.180 | And so, I think that when we think about the value and what we're trying to give to users, I think that in reality, the value of a product is a promise.
00:13:28.180 | Right?
00:13:29.180 | Like, ultimately, any product is a promise to a user and you're making claims about what it can do.
00:13:36.180 | Right?
00:13:37.180 | So then the user tries it.
00:13:38.180 | But trust is oxygen.
00:13:40.180 | And using a product is like a transaction between the user and the company.
00:13:45.180 | And without it, you're nothing.
00:13:46.180 | Right?
00:13:47.180 | You have, like, a pretty limited credit.
00:13:48.180 | Like, most products get a credit of, like, negative one.
00:13:50.180 | Because not everybody's going to want to try your shit.
00:13:52.180 | And then, like, the person that does, like, they don't have the patience for whatever things you put in there.
00:13:58.180 | They only have patience for one thing.
00:14:00.180 | And so, I think one of the best ways that you can actually build trust is to expose the edges.
00:14:07.180 | Right?
00:14:08.180 | Show people where the model is dumb and make it seamless between the user and the product.
00:14:13.180 | Like, don't paper over it.
00:14:15.180 | Because even though we have these really smart, incredible models, the way that we are building them is still, like, a very human type of thing.
00:14:24.180 | So, when you give a product to a person, you are exposing your own process, your own thought process, your own workflow to them.
00:14:31.180 | And you're making a promise about how it works.
00:14:33.180 | And so, when it fails, when it doesn't work, think about how you go about it in the most human way possible.
00:14:40.180 | Like, I see this all the time where people are trying to instrument for, like, the best use case but not the worst case.
00:14:46.180 | And so, you kind of have this, like, weird half-baked product in space where it's purely machine and very little human involved.
00:14:53.180 | I think, on this note, I want to say something that sounds really dumb and basic.
00:14:59.180 | But I think that you have to nail the deterministic things before the delightful probabilistic bits.
00:15:06.180 | Because at the end of the day, a good app is still just a really good app.
00:15:12.180 | Like, it's just an app, right?
00:15:14.180 | And it's like, it doesn't matter what you jam in there, it's still an app.
00:15:20.180 | And this is another one of, like, sort of the older user interfaces we had in Notebook.
00:15:25.180 | But users routinely would do this thing where they would upload sources and they would enter a query like summarize this doc.
00:15:34.180 | Summarize this doc and this doc only.
00:15:37.180 | Summarize only this concept.
00:13:39.180 | And this was very hard to do in the early days of NotebookLM, particularly with the smaller context windows.
00:15:45.180 | And one of the things that we saw was the query type for summarization was, like, in terms of, like, the percent of, like, first queries, it was, like, 90%.
00:15:56.180 | 90% of, like, users, their first query was a summarization query.
00:16:00.180 | And it was this testing behavior, right?
00:16:03.180 | People were trying it out.
00:16:05.180 | They were like, oh, I heard about this thing.
00:16:07.180 | Okay, saw the website, cool, I'm gonna upload a thing.
00:16:10.180 | Like, think about all the steps the user went through to get there.
00:16:13.180 | User gets there, they upload something, enter summarize, and it borks.
00:16:19.180 | It didn't work.
00:16:20.180 | So the user leaves.
00:16:22.180 | They leave forever.
00:16:23.180 | But, like, think about, like, the amount of time that, like, that person put into it because you made a promise to them that it was gonna do exactly that.
00:16:31.180 | And so I think that one of the things I want to say about trust is it's not cheap.
00:16:36.180 | It's not cheap, right?
00:16:37.180 | If you get a user to try your product, make it as good as fucking possible the first time because they're not gonna come back.
00:16:42.180 | In fact, it was, like, so detrimental, this summarization use case, that it was, like, all I thought about for a very long time.
00:16:53.180 | Okay, so let's say you manage to earn the trust of your users.
00:16:58.180 | You've got a bunch of them.
00:16:59.180 | They love it.
00:17:00.180 | And really, to be honest with you, in real life, right, like, this flow that we're describing, it's actually, like, a matter of seconds for a person.
00:17:08.180 | This is, like, one minute, right, for you to deliver on this whole thing.
00:17:11.180 | But you get to earn the next thing after you have trust, which is the potential to delight.
00:17:17.180 | And I think with Notebook this was, like, really cool where I feel like the delightfulness was that it was unexpected and it was kind of funny.
00:17:26.180 | Where people would upload documents and then they would make a podcast and they were like, I don't know what it's going to say.
00:17:31.180 | Like, it could be kind of goofy.
00:17:33.180 | It could be pretty funny.
00:17:34.180 | And I think there's, like, an aspect in there that is, like, just very playful, right?
00:17:40.180 | Like, delightfulness is almost -- it's very similar to playfulness.
00:17:44.180 | And it sort of lives, I think, in this interesting space between technical capabilities and user expectations.
00:17:51.180 | Because it's, like, you kind of have to meet users where they are, but you have to push them just a little bit.
00:17:57.180 | You know, where once you've built trust, you can push just one step past what's familiar and not spook anybody.
00:18:05.180 | Because I think we've seen this too, right?
00:18:06.180 | Like, when things are, like, too weird, it's, like, spooky and people are like, I don't know, I'm not going to look at it anymore.
00:18:11.180 | I feel like people felt this way about robots for a long time.
00:18:14.180 | And so I think that in reality, kind of just going back to the trust piece, we get one chance to really make the machine feel like magic.
00:18:22.180 | And so my tip here is actually just, like, a very, like, kind of a tactical one, which is, I think you need to surface delightfulness through agency and not trickery.
00:18:32.180 | Right?
00:18:33.180 | Where it's, like, the user has to feel like they are a part of it, that they are steering, and it feels self-directed.
00:18:39.180 | And that was one of the big things about notebook.lm, which was, like, it wasn't, like, a random button, right?
00:18:44.180 | It was, like, I knew the specific document I had uploaded, I knew that it was going to make something magical for me, but it felt like equal parts me and the machine, right?
00:18:53.180 | It wasn't just the machine.
00:18:54.180 | Like, it felt like there was, like, a healthy tension between there where the machine had a real opportunity to delight me.
00:19:00.180 | I'm not going to show an example here in particular, but I want to make a point about delightfulness, which is, it is actually really hard to delight people if you're doing too much shit, right?
00:19:10.180 | And I really think that you're either shipping model capabilities or actual new outcomes.
00:19:16.180 | There is no real in-between here.
00:19:18.180 | And this is where, sort of, like, the stack will come to kind of haunt you, where do you know the outcome that you are optimizing for?
00:19:26.180 | Do you know how the user is supposed to get there?
00:19:28.180 | Do you know what is preventing users from getting there?
00:19:31.180 | You know, do you know how to show them what is possible with the system?
00:19:35.180 | And I think these are sort of, like, kind of deep, gnarly questions, and it's just, like, something that you will only get to by using the product a lot yourself.
00:19:43.180 | I think that that is the way to build a delightful product, right?
00:19:47.180 | Is to not test your product, but to sort of live and breathe it and live the life of your users.
00:19:53.180 | Like, who do you think is going to use this thing?
00:19:55.180 | Like, if you are not that person, then you have to become that person.
00:19:58.180 | Because that's the only way that I think you can start to feel at the borders of what people are ready for and what the technology is capable of, right?
00:20:06.180 | I think that's ultimately, like, the little trick to getting within that space and building something interesting.
00:20:13.180 | Okay, finally, I think that Delight shows us what the product can do and what's magical about it.
00:20:23.180 | But I think what a lot of people are not actually as judicious about is what it should do.
00:20:29.180 | And this is kind of, like, a weird piece of advice, especially because, like, I've made this mistake many, many, many times.
00:20:35.180 | Like, I only say this from sort of the tried and tested experience of being a kitchen sink person, right?
00:20:40.180 | And we've all used products like this where, when you look around you, there's plenty of examples where it kind of feels like you have not shipped your POV on the world.
00:20:50.180 | You actually just shipped a kitchen sink of model capabilities with your slight flare on it, right?
00:20:55.180 | Like, maybe you changed the color.
00:20:57.180 | And that's kind of cool because, you know, I think, like, we're all still living in the era of, like, research previews.
00:21:02.180 | Like, it's cool to know what the models can do, like, it's ever increasing in capability.
00:21:06.180 | But I think, like, to true end users, like, it's probably not that interesting, right?
00:21:10.180 | Because that is just ChatGPT. Like, I'll just use ChatGPT.
00:21:13.180 | I don't think you're gonna do a better job than them.
00:21:17.180 | And it's probably not right to jam everything in there and try to see what's sticking with who, right?
00:21:23.180 | So the barrier to shipping, especially nowadays, it's not about the capability, right?
00:21:29.180 | It's about judgment. And the thing that I try to ask myself is, you know, does this product respect user time, data, and agency?
00:21:38.180 | Because I really believe that restraint in particular, right?
00:21:43.180 | This is a new innovation multiplier.
00:21:45.180 | Like, if you yourself have that personal clarity, you have that purpose, and you are focused, you're driven, you're resilient, like, 100%.
00:21:52.180 | People are gonna tell you stuff like, "Well, this is, like, the simplest thing ever. How do you aim to compete?"
00:21:58.180 | Well, exactly, by being focused, right? By doing one thing incredibly well.
00:22:03.180 | And you will get to a point where the thing is so excellent that you are beginning to really delight people.
00:22:09.180 | Like, you will have more of an intuition about the borders that I was talking about in the previous slide.
00:22:14.180 | I think one last note I'll say on delightfulness is that -- I mean, on the kitchen sink -- is that users are not that different from you or me, right?
00:22:24.180 | Like, we are all users of something.
00:22:26.180 | And so think about the last time you tried something new.
00:22:28.180 | Like, think about how much patience you had for it.
00:22:31.180 | And think about how annoying it was when you tried it, and you're like, "I just don't know what I would do with this thing."
00:22:36.180 | Like, it does, like, 30 things, but I don't know when I would use it, right?
00:22:40.180 | And that's just, like, what happens when you sort of let the chaos overwhelm you, and you're not clear about what the whole point of, like, your thing is, right?
00:22:48.180 | And that comes from a place that I think is, like, the entire stack.
00:22:53.180 | Like, if you yourself don't know what you're trying to do with yourself, then you don't know what to do with your product.
00:22:59.180 | Then you don't know which model capabilities are actually useful, you know, in the context of your outcome.
00:23:04.180 | And you end up with a kitchen sink, which is not great.
00:23:07.180 | It's not a great experience.
00:23:09.180 | This actually happened to me recently.
00:23:11.180 | This is Huxe.
00:23:13.180 | I don't know -- I mean, I do know, actually.
00:23:16.180 | But we ended up in a place where we had an app that did all these things.
00:23:20.180 | And I was like, it's so beautiful, okay?
00:23:23.180 | It does this thing.
00:23:24.180 | You can chat with it.
00:23:25.180 | You can make podcasts.
00:23:26.180 | You can read the news.
00:23:27.180 | It generates images.
00:23:28.180 | It generates videos.
00:23:30.180 | I was so excited.
00:23:31.180 | It was, like, my dream product.
00:23:33.180 | Until it wasn't.
00:23:35.180 | Because then I started using it every day.
00:23:37.180 | And I'm like, whoa.
00:23:38.180 | I don't really use any of this.
00:23:39.180 | I just use one thing.
00:23:41.180 | And we gave it to a bunch of users.
00:23:42.180 | We gave it to several hundred people.
00:23:44.180 | And they also just used one thing in the product.
00:23:47.180 | And I was like, whoa.
00:23:48.180 | You know, I think in real life people only have the bandwidth to sort of be like, this is my one thing, right?
00:23:53.180 | Like, I'm sure Spotify does a lot of things.
00:23:55.180 | But I still use it for one thing, which is to play music.
00:23:58.180 | I don't know what else it really does.
00:24:00.180 | Right?
00:24:01.180 | I'm sure, like, it's a huge team.
00:24:02.180 | It's like a bajillion dollar industry.
00:24:03.180 | But that's what it does.
00:24:04.180 | And so when we were working on Huxe, I'm like, whoa.
00:24:07.180 | How crazy.
00:24:08.180 | Oh, I forgot.
00:24:09.180 | It does chat, too.
00:24:10.180 | It was wild.
00:24:11.180 | Except I made it, like, really goofy instead of ChatGPT.
00:24:13.180 | It's just crazy.
00:24:14.180 | But I think, like, to look at this product now and to see sort of, like, the lack of focus in the product direction, it's like, hey, this has a little bit of, like, that demo disease, right?
00:24:23.180 | Where it looks cool, but in real life, like, nobody's going to love that thing.
00:24:27.180 | And so I just want to reiterate kind of the message here, which is clarity is what is going to give you the energy for the job.
00:24:33.180 | And the job is hard.
00:24:34.180 | So you're going to need lots of that.
00:24:36.180 | Purpose is what keeps us focused on it and makes sure that we know what outcome we are marching towards.
00:24:42.180 | And it is the trust in that purpose that's going to earn us the belief, right, of our users.
00:24:48.180 | And the belief is what's going to get us to a place where we have an opportunity to delight, right?
00:24:54.180 | And that delight is going to prove potential.
00:24:56.180 | And the kitchen sink is just like a checks and balances kind of a thing.
00:25:03.180 | And it keeps us honest and focused about whether or not we're meeting the first four.
00:25:03.180 | And that's how I believe that we build something that isn't ugly.
00:25:07.180 | Thank you.