
The State of AI in production — with David Hsu of Retool


Chapters

0:00 Introduction
2:43 Retool's founding story and decision not to present at YC demo day initially
5:40 Experience in YC with Sam Altman
9:27 Importance of hiring former founders early on
10:43 Philosophy on fundraising - raising less money at lower valuations
14:57 Overview of what Retool is
18:09 Origin story of Retool AI product
23:04 Decision to use open source vector database PG Vector
24:42 Most underrated Retool feature loved by customers
28:07 Retool's AI UX and workflows
34:05 Zapier vs Retool
37:41 Updates from Retool's 2023 State of AI survey
40:34 Who is adopting AI first?
43:43 Evolving engineering hiring practices in the age of Copilot/ChatGPT
45:52 Retool's views on internal vs external AI adoption
48:29 OSS models vs OpenAI in production
50:54 Additional survey questions to ask in 2024
52:07 Growing interest in multimodal AI capabilities
54:00 Balancing enterprise sales vs bottom-up adoption
60:00 Philosophical thoughts on AGI and intentionality

Whisper Transcript

00:00:00.000 | Hey everyone, welcome to the Latent Space Podcast.
00:00:02.720 | This is Alessio, partner and CTO-in-residence at Decibel Partners,
00:00:06.200 | and I'm joined by my co-host swyx, founder of Smol.ai.
00:00:09.480 | Hi, and today we are in the studio with David Hsu from Retool.
00:00:13.880 | Welcome.
00:00:14.480 | Thanks, excited to be here.
00:00:15.960 | We'd like to give a little bit of intro
00:00:18.000 | from what little we can get about you
00:00:19.800 | and then have you talk about something personal.
00:00:23.720 | You got your degree in philosophy and CS from Oxford.
00:00:27.880 | I wasn't aware that they did like double degrees.
00:00:31.760 | Is that what you did?
00:00:32.960 | It's actually a single degree, actually, which is really cool.
00:00:35.360 | So basically, yeah, so you study compsci, you study philosophy,
00:00:38.800 | and you study the intersection.
00:00:40.880 | The intersection is basically AI, actually,
00:00:43.640 | and sort of, can computers think, or can computers be smart?
00:00:46.360 | Like, you know, what does it mean for a computer to be smart?
00:00:48.720 | As well as logic, which is also another intersection,
00:00:51.280 | which is really fun too.
00:00:53.160 | At Stanford, you know, it might be symbolic systems or whatever.
00:00:58.840 | And it's always hard to classify these things
00:01:01.440 | when we don't really have a word for it.
00:01:02.760 | Now, like this, everything's just called AI.
00:01:04.840 | Five years ago, you launched Retool.
00:01:09.080 | You were in YC '17, winter '17,
00:01:13.640 | and, you know, it's just been a straight line up from there, right?
00:01:17.960 | I wished.
00:01:20.200 | And that's your sort of brief bio that I think you want most people to know.
00:01:25.080 | What's something on your LinkedIn that people should know about you,
00:01:27.240 | maybe on a personal hobby or, you know,
00:01:29.360 | let's just say something you're very passionate about
00:01:32.040 | that might not be about Retool?
00:01:33.640 | I read quite a bit.
00:01:34.440 | I probably read like two books a week, around about, so it's a lot of fun.
00:01:37.240 | I love biking.
00:01:38.320 | It's also quite a bit of fun, so, yeah.
00:01:40.560 | Do you use Retool to read?
00:01:41.640 | Like, what the hell?
00:01:43.040 | No, I don't use Retool to read, so that'd be funny.
00:01:47.320 | What do you read?
00:01:48.640 | How do you choose what you read?
00:01:50.440 | Any recommendations that you just fall in love with?
00:01:54.240 | It's pretty diverse.
00:01:55.800 | I'm mostly reading fiction nowadays, so fiction's a lot of fun.
00:01:59.320 | I think it maybe helps me be more empathetic, if you will.
00:02:01.720 | I think it's a lot of fun, actually,
00:02:02.960 | to sort of see what it's like to be in someone else's shoes,
00:02:05.280 | so that's a lot of fun.
00:02:07.800 | Besides that, I'm really into philosophy as well.
00:02:09.960 | I find philosophy just so interesting, especially logic.
00:02:13.560 | We can talk more about that for probably hours if you want, so.
00:02:17.400 | Yeah, I have a sort of casual interest in epistemology,
00:02:21.720 | and I think that any time you try to think about machine learning
00:02:26.640 | on a philosophical angle,
00:02:28.640 | you have to start wrestling with these very fundamental questions
00:02:31.760 | about how do you know what you know.
00:02:33.400 | Yeah, totally.
00:02:34.920 | What does it mean to know?
00:02:36.440 | What does it mean to know?
00:02:38.680 | Yeah, all right, so over to you.
00:02:41.880 | That's its own podcast.
00:02:43.160 | We should do a special edition about it,
00:02:46.200 | but that's fun.
00:02:48.400 | Let's just maybe jump through a couple things on Retool
00:02:51.600 | that I found out while researching your background.
00:02:54.920 | So you did YC, but you didn't present at Demo Day initially
00:02:58.560 | because you were too embarrassed of what you had built.
00:03:02.160 | Can you maybe give any learnings to founders
00:03:05.200 | on jumping back from that?
00:03:07.360 | I've seen a lot of people kind of give up early on
00:03:09.920 | because they were like, "Oh, this isn't really
00:03:12.600 | what I thought it was going to be to be a founder."
00:03:14.680 | They told me I would go to YC and then present
00:03:16.960 | and then raise a bunch of money,
00:03:18.320 | and then everything was going to be easy.
00:03:19.880 | So how did that influence also how you build Retool today
00:03:24.200 | in terms of picking ideas
00:03:25.960 | and deciding when to give up on it?
00:03:29.320 | Yeah, let's see.
00:03:30.320 | So this is around 2017 or so.
00:03:32.680 | So we were supposed to present at the March Demo Day,
00:03:34.880 | but then we basically felt like we had nothing really going on.
00:03:38.000 | We had no traction, we had no customers.
00:03:40.440 | And so we were like, "Okay, well, why don't we take six months
00:03:43.040 | to go find all that before presenting?"
00:03:46.040 | Part of that, to be honest, was I think there's a lot of noise
00:03:51.200 | around Demo Day, around startups in general,
00:03:53.120 | especially because there's so many startups nowadays.
00:03:55.680 | And I guess for me, I'd always want to sort of under-promise
00:04:01.600 | and over-deliver, if you will.
00:04:03.360 | And on Demo Day, I mean, maybe you two have seen
00:04:05.400 | a lot of the videos, it's a lot of honestly over-promising
00:04:08.800 | and under-delivering because every startup says,
00:04:11.360 | "Well, I'm going to be the next Google or something."
00:04:13.840 | And then you peer under it and you're like,
00:04:15.520 | "Wow, nothing's going on here," basically.
00:04:16.840 | So I really didn't want that.
00:04:18.520 | And so we chose actually not to present at Demo Day,
00:04:21.280 | mostly because we felt like we didn't have anything
00:04:23.640 | substantial underneath.
00:04:25.400 | Although actually a few other founders in our batch
00:04:27.080 | probably would have chosen to present in that situation,
00:04:30.120 | but we were just kind of embarrassed about it.
00:04:32.240 | And so we basically took six months to just say,
00:04:35.040 | "Okay, well, how do we get customers?"
00:04:36.800 | And we're not presenting until we have a product
00:04:39.600 | that we're proud of and customers that we're proud of.
00:04:41.400 | And fortunately, it worked out.
00:04:42.840 | Six months later, we did have that.
00:04:44.320 | So I don't know if there's much to learn from this situation
00:04:49.320 | besides I think social validation was something
00:04:54.000 | that I personally had never really been that interested in.
00:04:57.080 | And so it was definitely hard because it's hard to sort of,
00:05:02.080 | it's almost like you go to college
00:05:04.080 | and all your friends are graduating when you failed
00:05:06.640 | or something, you failed the final
00:05:07.840 | and you have to, like, redo the year.
00:05:09.240 | It's like, well, it kind of sucks
00:05:10.960 | that all your friends are up there and on the podium
00:05:13.440 | presenting and they are raising a ton of money
00:05:15.800 | and you're kind of being left behind.
00:05:18.040 | But in our case, we felt like it was a choice
00:05:21.680 | and we could have presented if we really wanted to,
00:05:23.760 | but we would not have been proud of the outcome
00:05:25.960 | or proud of what we were presenting.
00:05:27.200 | And for us, it was more important to be true to ourselves,
00:05:29.240 | if you will, and show something that we're actually proud of
00:05:31.520 | rather than just raise some money
00:05:33.200 | and then shut the company down after two years.
00:05:36.120 | - Yeah, any Sam Altman stories from the YC days?
00:05:40.160 | Could you tell in 2017 that Sam was going to become,
00:05:43.680 | like run the biggest AI company in the world?
00:05:47.080 | - Wow, no one's asked me that before.
00:05:51.440 | Let me think.
00:05:52.280 | Sam was, I think he was,
00:05:56.120 | I forgot, maybe president of YC in our batch.
00:05:59.760 | We actually weren't in his group actually
00:06:01.680 | at the very beginning.
00:06:03.600 | And then we got moved to a different group.
00:06:06.640 | I don't honestly have,
00:06:08.640 | I think Sam was clearly very ambitious
00:06:10.800 | when we first met him.
00:06:11.640 | I think he was very helpful
00:06:13.400 | and sort of wanted to help founders.
00:06:15.360 | But besides that, I mean, I think we were so overwhelmed
00:06:18.880 | by the fact that we had to go build a startup
00:06:20.280 | and we were not honestly paying that much attention
00:06:22.680 | to every YC partner and taking notes on them.
00:06:24.880 | - That makes sense.
00:06:27.960 | Well, and then just to wrap some
00:06:29.960 | of the Retool history nuggets,
00:06:32.600 | you raised a Series A
00:06:34.800 | when you were at 1 million of revenue
00:06:36.280 | with only three or four people.
00:06:39.000 | How did you make that happen?
00:06:40.400 | Any learnings on keeping teams small?
00:06:43.000 | I think there's a lot of overhiring
00:06:46.120 | we've seen over the last few years.
00:06:47.440 | I think a lot of AI startups now
00:06:49.000 | are kind of like raising very large rounds
00:06:51.400 | and maybe don't know what to do with the capital, so.
00:06:54.680 | - Yeah.
00:06:55.840 | So this is kind of similar actually
00:06:57.120 | from sort of why we chose not to present at demo day.
00:06:58.960 | And the reason was it feels like a lot of people
00:07:01.960 | are really playing startup.
00:07:03.440 | I think PG has an essay about this,
00:07:04.880 | which is like, you're almost like playing house
00:07:06.440 | or something like that.
00:07:07.280 | Like, it's like, oh, well, I hear that in a startup,
00:07:09.440 | we're supposed to raise money and then hire people.
00:07:11.320 | And so therefore you go and do that.
00:07:13.000 | And you're supposed to do a lot of PR
00:07:14.960 | because that's what startup founders do.
00:07:16.960 | And so you could do a lot of PR and stuff like that.
00:07:18.640 | And for us, we always thought that the point
00:07:21.640 | of starting a startup is basically
00:07:22.800 | we have to create value for customers.
00:07:24.200 | If you're not creating value for customers,
00:07:25.520 | everything else is gonna, nothing's gonna work basically.
00:07:27.640 | You can't continue to raise money or hire people
00:07:30.760 | if you don't have customers, you're not gonna create value.
00:07:33.200 | And so for us, we were always very focused on that.
00:07:35.040 | And so that's initially where we started.
00:07:37.560 | I think it's, again, maybe it goes to like the sort of,
00:07:39.640 | you know, presenting something truthful about yourself
00:07:42.560 | or staying true to yourself, something to that effect,
00:07:44.440 | which is, we didn't want to pretend like we had a,
00:07:46.320 | you know, thriving business, we could actually.
00:07:48.760 | And so the only way to not pretend
00:07:50.200 | was actually to build a thriving business.
00:07:52.120 | And so we basically just, you know,
00:07:55.200 | put our heads down and, you know,
00:07:56.240 | grind it away for probably a year, year and a half or so.
00:07:59.520 | Just writing code, talking to customers.
00:08:01.160 | And I think that at that point,
00:08:04.120 | we had raised something like maybe a million dollars,
00:08:07.560 | maybe a million and a half, something out of YC.
00:08:10.400 | So, I mean, to us, to people, you know,
00:08:12.480 | that was a huge amount of money.
00:08:13.320 | I was like, wow, like how are we ever gonna spend
00:08:15.840 | a million and a half?
00:08:17.000 | Our runway was like, you know,
00:08:19.800 | five, six years at that point, right?
00:08:20.880 | 'Cause we're paying ourselves 30, 40K a year.
00:08:23.600 | And so then the question was not like,
00:08:26.440 | oh, we're gonna run out of runway.
00:08:27.520 | The question was like, we better find traction
00:08:29.440 | because if we don't find traction,
00:08:30.400 | we're gonna, you know, just give up psychologically.
00:08:33.080 | Because, you know, we're grinding on it.
00:08:34.320 | If you grind on the idea for four years,
00:08:35.880 | nothing happens.
00:08:36.720 | You're probably psychologically gonna give up.
00:08:38.920 | I think that's actually true in most startups.
00:08:40.520 | Actually, it's like most startups die in the early stages,
00:08:43.240 | not because you run out of money,
00:08:45.080 | but really because you run out of motivation.
00:08:47.720 | And for us, had we hired people,
00:08:50.440 | I think it would have actually been harder for us
00:08:52.680 | because we would have ran out of motivation faster.
00:08:54.920 | Because when you're pre product-market fit, actually,
00:08:56.880 | trying to lead a team of like, you know,
00:08:57.920 | 10 people, for example, to march towards product-market fit,
00:09:00.600 | I think it's actually pretty hard.
00:09:02.040 | Like it's, you know, everyday people are asking you,
00:09:04.840 | so why are we doing this?
00:09:05.840 | And you're like, I don't know, man.
00:09:07.160 | Like, hey, trust us.
00:09:08.440 | And that's actually a very tiring environment to be in.
00:09:10.960 | Whereas if it's just like, you know,
00:09:11.800 | the founders figuring out product market fit,
00:09:14.240 | I think that's actually a much sort of safer path,
00:09:16.920 | if you will.
00:09:17.760 | You're also risking less with employees.
00:09:19.600 | Like when you hire employees, you have an idea,
00:09:21.640 | you have product-market fit, you have customers.
00:09:23.200 | That's actually, I think, a lot more stable
00:09:25.320 | of a place for employees to join as well, so.
00:09:27.920 | - Yeah, and I find that, you know,
00:09:29.880 | typically the sort of founder employee relationship is,
00:09:32.920 | you know, employee expects the founder
00:09:34.680 | to just tell them what to do.
00:09:36.000 | And you don't really get critical pushback from the employee,
00:09:39.200 | even if they're a buddy,
00:09:40.040 | and even if they like you as an early engineer.
00:09:42.360 | It's very much like the role play of like,
00:09:44.720 | once you have that founder hat on,
00:09:46.320 | you think differently, you act differently,
00:09:49.480 | and you're more scrappy, I guess,
00:09:51.560 | in trying to figure out what that product is.
00:09:54.280 | Yeah, I really resonate with this
00:09:55.720 | 'cause I'm going through this right now.
00:09:56.920 | (laughing)
00:09:58.800 | - That's awesome.
00:09:59.840 | One thing we did actually early on
00:10:01.080 | that I think has paid a lot of dividends,
00:10:02.760 | especially, you know, now that we're a lot larger,
00:10:04.840 | is we hired a lot of former founders.
00:10:06.320 | So I want to say like, when we were, I don't know,
00:10:09.720 | 20, 30, 40 people,
00:10:10.920 | we were probably like half former founders
00:10:13.000 | at each one of those stages.
00:10:14.400 | And that was actually pretty cool
00:10:15.560 | because I think you infuse sort of a, you know,
00:10:18.560 | get things done kind of culture,
00:10:19.760 | an outcome-oriented culture with, like, very little politics,
00:10:23.080 | 'cause, you know, no one came from larger companies.
00:10:24.880 | Everyone was just like, "This is my own startup.
00:10:26.280 | "Let me go figure out
00:10:27.280 | "how to achieve the best outcome for the customer."
00:10:28.800 | And so I think from a cultural perspective even today,
00:10:31.120 | a lot of our culture is sort of very self-startery.
00:10:33.720 | I think it's actually because of sort of these like,
00:10:35.400 | you know, early founders that we hired,
00:10:37.200 | which was really, really, you know,
00:10:38.680 | we're really lucky to have them, so.
00:10:40.800 | - Yeah, and then closing off
00:10:42.360 | on just a little bit of the fundraising stuff,
00:10:44.440 | something notable that you did was when, in 2021,
00:10:47.280 | when it was the sort of peak ZIRP
00:10:49.160 | and everyone was raising
00:10:50.280 | hundreds and hundreds of millions of dollars,
00:10:52.600 | you, you know, you intentionally raised less money
00:10:55.200 | at a lower valuation, as you put it.
00:10:57.960 | And I think it's a testament
00:11:00.760 | to your just overall general philosophy
00:11:02.400 | in building Retool that you're just very efficient
00:11:04.440 | and you do things from first principles.
00:11:07.680 | Yeah, I mean, like any updates on like,
00:11:10.920 | would you still endorse that?
00:11:13.200 | You know, would you recommend that to everyone else?
00:11:15.680 | What are your feelings sort of two years on from that?
00:11:18.360 | - Yeah, so on a high level,
00:11:20.880 | yeah, so exactly, you said it's correct,
00:11:23.040 | where we raised less money at a lower valuation.
00:11:26.200 | And I think the funny thing about this
00:11:27.600 | is that when we first announced that,
00:11:29.320 | even, you know, internally and both externally,
00:11:32.200 | I think people were really surprised actually,
00:11:33.880 | because I think Silicon Valley has been conditioned to think,
00:11:37.400 | oh, raising a giant sum of money at a giant valuation
00:11:39.920 | is a really good thing.
00:11:40.880 | It's like, you know,
00:11:41.720 | you should maximize both the numbers basically.
00:11:43.440 | But actually, maximizing both the numbers
00:11:45.000 | is actually really bad,
00:11:46.400 | actually for the people that matter the most,
00:11:48.520 | you know, i.e. your employees or your team.
00:11:50.880 | And the reason for that is,
00:11:52.680 | more, raising more money means more dilutions.
00:11:55.480 | If you look at, you know, a company like Uber, for example,
00:11:58.840 | if you join Uber at like, I don't know,
00:12:00.360 | like a $10 billion valuation,
00:12:02.440 | or, you know, let's say you joined before their huge round,
00:12:05.040 | which I think happened at a few billion dollars in valuation,
00:12:07.520 | they actually got diluted a ton when Uber fundraised.
00:12:10.920 | So if, you know, Uber raises,
00:12:12.560 | if Uber dilutes themselves by 10%, for example,
00:12:14.360 | let's say they raise $500 million at a $5 billion valuation, for example,
00:12:16.880 | I think employee's stake goes down by 10%
00:12:20.160 | in terms of ownership.
00:12:21.400 | Same with, you know, previous investors,
00:12:22.880 | same with the founders, et cetera.
00:12:24.080 | And so if you look at actually a lot of founders
00:12:26.880 | in sort of, you know, the operations and logistics space,
00:12:29.640 | or, you know, those that fundraise like, you know,
00:12:31.120 | 2013, 2017, a lot of the founders by IPO
00:12:33.920 | only have a few percentage points, actually, of the company.
00:12:36.320 | And if the founders only have a few percentage points,
00:12:38.120 | you can imagine how, you know, how little employees have.
00:12:40.080 | And so that I think is actually just a really, you know,
00:12:42.280 | bad thing for employees overall.
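
[Editor's note: to make the dilution arithmetic above concrete, here's a back-of-the-envelope sketch in Python. The round size, valuation, and the 0.10% employee grant are all illustrative numbers, not Uber's or Retool's actual figures.]

```python
# Back-of-the-envelope dilution math (illustrative numbers only).
raise_amount = 500e6   # new money raised: $500M
post_money = 5e9       # post-money valuation: $5B

dilution = raise_amount / post_money  # fraction sold to new investors
employee_stake = 0.0010               # hypothetical 0.10% employee grant

print(f"dilution per round: {dilution:.0%}")                      # 10%
print(f"stake after round:  {employee_stake * (1 - dilution):.4%}")

# Compounded over several big rounds, the haircut is severe:
rounds = 4
print(f"after {rounds} rounds: {employee_stake * (1 - dilution) ** rounds:.4%}")
```
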
00:12:44.240 | Secondly, it's a sort of higher valuation
00:12:47.840 | given the same company quality is always worse.
00:12:50.880 | And so basically what that means is
00:12:53.040 | if you are fundraising as a company,
00:12:54.520 | you could command a certain valuation in the market.
00:12:56.360 | You know, let's say it's, you know, X, for example.
00:12:59.120 | Maybe you get lucky and you can raise two times X,
00:13:01.240 | for example.
00:13:02.160 | But if you choose two times X,
00:13:03.640 | your company itself is not fundamentally changed.
00:13:05.960 | It's just that, you know, for some reason,
00:13:07.160 | investors want to pay more for it.
00:13:08.600 | You know, maybe today you're an AI company, for example.
00:13:10.600 | And so investors are really excited about AI
00:13:12.160 | and want to pay more for it.
00:13:13.720 | However, that might not be true
00:13:14.960 | in a year or two years' time, actually.
00:13:16.440 | And if that's not true in two years' time,
00:13:18.000 | then you're in big trouble, actually.
00:13:19.280 | And so now I think you see a lot of companies
00:13:20.760 | that raised really high valuations around 2021.
00:13:23.800 | And now they're like, man, we're at like 100 X,
00:13:26.120 | or, you know, we raised 300 X multiple, for example.
00:13:28.480 | And if we're at 300 X then, you know,
00:13:29.800 | maybe now we're at like 200 X.
00:13:31.440 | And like, man, we just can't raise money ever again.
00:13:33.520 | Like, you know, we're gonna have to grow like 50 X
00:13:35.440 | to go raise money, you know,
00:13:36.280 | at a reasonable valuation, let's say.
00:13:38.280 | And so I think that is really challenging
00:13:40.560 | and really demotivating for the team.
00:13:42.760 | And so I think a lower valuation actually is much better.
00:13:46.080 | And so for us, in retrospect,
00:13:47.840 | you know, to answer your question,
00:13:48.680 | two years later, we did not predict, you know,
00:13:50.960 | the crash, if you will.
00:13:52.400 | But given it, I think we've done extremely well,
00:13:55.480 | mostly because our valuation is not sky high.
00:13:58.240 | Because if our valuation were sky high,
00:14:00.720 | I think we'd have a lot more problems.
00:14:02.440 | We'd probably have recruiting problems, for example.
00:14:04.400 | We'd probably have a lot of internal morale problems,
00:14:06.080 | et cetera.
00:14:06.920 | A lot of people would be like, you know,
00:14:07.760 | why is the valuation this way?
00:14:08.760 | We might have cash flow problems
00:14:09.800 | because we might have to go raise money again,
00:14:11.120 | you know, et cetera.
00:14:11.960 | We can't because the valuation is too high.
00:14:13.400 | So I would urge, I think,
00:14:15.240 | founders today to, quote unquote,
00:14:18.360 | like leave money on the table.
00:14:19.720 | Like there are some things
00:14:20.720 | that are not really worth optimizing.
00:14:22.920 | I think you should optimize for the quality
00:14:24.560 | of the company that you build,
00:14:25.960 | not like the valuation at which you raise
00:14:27.600 | or the amount you raise, et cetera, so.
00:14:30.640 | - Hindsight is 20/20, but it looks like, you know,
00:14:33.240 | you made the right call there anyway.
00:14:35.040 | So maybe we should also,
00:14:39.000 | for people who are not clued into Retool,
00:14:41.000 | do a quick, like, what is Retool?
00:14:43.840 | You know, I see you as the kings
00:14:46.160 | or the inventors of the low-code internal tooling category.
00:14:50.240 | Would you agree with that statement?
00:14:52.320 | Would, like, you know, how do you usually explain Retool?
00:14:55.160 | - I generally say it's like Legos for code.
00:14:59.120 | We actually hate the low-code moniker.
00:15:01.160 | We actually never, in fact, we have docs saying
00:15:03.040 | we will never use it internally.
00:15:04.560 | - Yeah.
00:15:05.400 | - Or even to customers.
00:15:06.240 | And the reason for that is I think low-code
00:15:07.800 | sounds very not developer-y.
00:15:10.000 | And developers, they hear the phrase low-code,
00:15:12.000 | they're like, oh, that's not for me.
00:15:12.840 | Like, I love writing code.
00:15:13.920 | Like, why would I ever want to write less code?
00:15:16.120 | And so for us, Retool's actually built for developers.
00:15:18.160 | Like, 95% of our customers actually are developers,
00:15:21.280 | actually, and so that is a little bit surprising to people.
00:15:24.360 | I'll generally explain it as,
00:15:26.680 | and this is, you know, kind of a funny joke, too.
00:15:28.760 | I think part of the reason why Retool's been successful
00:15:31.040 | is that developers hate building internal tools.
00:15:34.280 | And you can probably see why.
00:15:35.320 | I mean, if you're a developer,
00:15:36.960 | you've probably built internal tools yourself.
00:15:38.080 | Like, it's not a super exciting thing to do.
00:15:40.360 | You know, it's like piecing together a CRUD UI.
00:15:42.280 | You've probably, you know, pieced together
00:15:43.400 | many CRUD UIs in your life before.
00:15:45.200 | And there's a lot of grunt work involved.
00:15:46.600 | You know, it's like, hey, state management.
00:15:48.440 | It's like, you know, data validation.
00:15:49.720 | It's like displaying error messages, like disabling buttons.
00:15:51.680 | Like, all these things are not really exciting,
00:15:53.760 | but you have to do it because it's so important
00:15:55.520 | for your business to have high-quality internal software.
00:15:58.720 | And so what Retool does is basically allows you
00:16:00.360 | to sort of piece together an internal app really fast,
00:16:03.480 | whether it's a front-end, whether it's a back-end
00:16:05.280 | or whatever else.
00:16:06.960 | So yeah, that's what Retool is.
00:16:08.960 | - Yeah, actually, so you started hiring,
00:16:11.120 | and so I do a lot of developer relations
00:16:13.040 | and community building work,
00:16:14.840 | and then you hired Kritika,
00:16:16.960 | who has now moved on to OpenAI,
00:16:18.560 | to start out your sort of DevRel function.
00:16:23.240 | And I was like, what is Retool doing courting developers?
00:16:25.320 | And then she told me about this, you know,
00:16:27.840 | developer traction.
00:16:28.680 | And I think that is the first thing
00:16:30.240 | that people should know, which is that,
00:16:31.960 | like, actually the burden and weight of internal tooling
00:16:34.920 | often falls to developers,
00:16:36.760 | or it's an Excel sheet somewhere or whatever.
00:16:39.000 | But yeah, you guys have basically created this market.
00:16:44.000 | In my mind, I don't know if there was someone
00:16:46.880 | clearly before you in this,
00:16:48.480 | but you've clearly taken over and dominated.
00:16:51.120 | Every month, YC, there's a new YC startup launching
00:16:55.240 | that is like, we're the open-source Retool,
00:16:58.080 | we're like the lower-code Retool, whatever.
00:17:01.240 | And it's pretty, I guess it's endearing.
00:17:04.560 | We'll talk about Airplane later on,
00:17:06.320 | but yeah, I think that,
00:17:09.440 | I've actually used Retool in my previous startups
00:17:12.800 | for this exact purpose.
00:17:14.200 | Like, we needed a UI for AWS RDS
00:17:17.520 | that the rest of our less technical people,
00:17:21.960 | like our sales operations people could interact with,
00:17:25.120 | and yeah, Retool was perfect for that.
00:17:27.840 | - Yeah, that's a good example of, like,
00:17:28.960 | that's an application that an engineer
00:17:30.760 | probably does not want to build.
00:17:31.880 | Like building an app on Salesforce
00:17:33.400 | or something like that is not exciting.
00:17:34.720 | And Salesforce API sucks, it's very limited,
00:17:36.760 | it's not a fun experience at all.
00:17:38.720 | But piecing together a Retool is quite a bit easier.
00:17:41.120 | So yeah, let me know if you have any feedback,
00:17:42.680 | but awesome, thanks for using it.
00:17:44.600 | - Yeah, no, of course.
00:17:45.960 | Well, so, you know, like more recently,
00:17:47.760 | I think about three, four months ago,
00:17:49.360 | you launched Retool AI.
00:17:50.520 | Obviously, AI has been sort of in the air.
00:17:52.920 | I'd love for you to tell the journey
00:17:55.400 | of sort of AI product ideation within Retool.
00:17:59.640 | Given that you have a degree in this thing,
00:18:01.720 | I'm sure you're not new to this,
00:18:03.000 | but like, when did the,
00:18:04.280 | when would you consider sort of the start
00:18:06.280 | of the AI product thinking in Retool?
00:18:08.800 | - Yeah, wow, that's funny.
00:18:13.560 | So we actually had a joke internally at Retool.
00:18:17.160 | We do a product roadmap every year,
00:18:19.480 | I think it was like 2019 or something.
00:18:21.440 | We had this joke, which was like,
00:18:24.280 | what are we gonna build this year?
00:18:25.400 | We're gonna build AI programming,
00:18:27.080 | is what we always said as a joke.
00:18:28.760 | And so, but it was funny 'cause we were like,
00:18:31.000 | ah, that's never gonna happen in my life.
00:18:32.680 | Let's add it because it's like a buzzwordy thing
00:18:34.880 | that enterprises love.
00:18:35.720 | So let's look at it.
00:18:37.240 | And so it was almost like a funny thing, basically.
00:18:39.360 | But it turns out, you know,
00:18:40.200 | we're actually building that now.
00:18:41.040 | So this is pretty cool.
00:18:42.600 | So let's say maybe AI thinking
00:18:44.520 | at Retool probably first started maybe like,
00:18:46.640 | I would say maybe a year and a half ago,
00:18:49.880 | something like that.
00:18:51.080 | And the evolution of our thinking was basically,
00:18:55.000 | when we first started thinking about it,
00:18:57.720 | sort of in a philosophical way, if you will,
00:18:59.640 | it's like, well, what is the purpose of AI?
00:19:01.080 | And how can it help what Retool does?
00:19:04.720 | And there were two sort of main prompts, if you will,
00:19:08.320 | of value that we got.
00:19:10.680 | One was helping people build apps faster.
00:19:12.960 | And so you've probably seen that with Copilot,
00:19:14.920 | you've seen sort of so many other coding assistants,
00:19:17.440 | like v0, you know, stuff like that.
00:19:18.960 | So that's interesting because, you know,
00:19:21.840 | engineers, as we talked about, do some grunt work.
00:19:25.040 | And the grunt work, you know,
00:19:26.800 | maybe could be automated by AI was sort of the idea.
00:19:29.720 | And it's interesting, 'cause we actually,
00:19:32.320 | I would say kind of proved or disproved
00:19:34.080 | the hypothesis a little bit.
00:19:35.080 | If you talk to most engineers today,
00:19:36.520 | like a lot of engineers do use Copilot.
00:19:38.920 | But if you ask them, like,
00:19:39.880 | how much time has Copilot saved you?
00:19:41.760 | It's not like coding is 10x faster than before.
00:19:44.360 | You know, coding is maybe like 10% faster,
00:19:46.440 | maybe 20% faster or something like that, basically.
00:19:48.680 | And so it's not like a huge step change, actually.
00:19:51.960 | And the reason for that, as we think,
00:19:53.560 | is because the sort of fundamental frameworks
00:19:56.440 | and languages have not changed.
00:19:58.000 | And so if you're building, let's say, you know,
00:19:59.080 | like the sales ops tool we were talking about before,
00:20:00.800 | for example, let's say you've got AI to generate,
00:20:03.120 | you do a first version of that, for example.
00:20:05.760 | The problem is that it probably generated it for you
00:20:07.960 | in like JavaScript, 'cause you're, you know,
00:20:09.200 | writing for the web browser, for example, right?
00:20:11.520 | And then for you to actually go proofread that JavaScript,
00:20:13.720 | for you to go read the JavaScript to make sure it's working,
00:20:15.600 | you know, to fix the subtle bugs that AI might have caused,
00:20:18.280 | hallucinations, stuff like that,
00:20:19.800 | actually takes a long time and a lot of work.
00:20:21.880 | And so for us, the problem is actually
00:20:23.360 | not like the process of coding itself.
00:20:25.840 | It is more sort of the language or the framework
00:20:28.200 | we think is like way too low level.
00:20:30.040 | It's kind of like, you know, like punched cards.
00:20:31.800 | Like, you know, let's say back in the day,
00:20:33.440 | you proved who designed punched cards,
00:20:35.120 | and AI could help you generate punched cards.
00:20:37.240 | You're like, okay, you know, I guess that helps me,
00:20:40.160 | you know, punching cards is a little bit faster now
00:20:42.400 | 'cause I have a machine punching them for me.
00:20:43.800 | But like, when there's a bug,
00:20:44.640 | I still have to go read all the punched cards
00:20:45.960 | to figure out what's wrong, right?
00:20:46.800 | It's like, it's a lot of work, actually.
00:20:48.120 | And so for us, that was the sort of initial idea was,
00:20:51.600 | can we help engineers code faster?
00:20:53.720 | That would, you know, I think it's somewhat helpful,
00:20:55.520 | to be clear.
00:20:56.360 | And again, I think it's 10 or 20%.
00:20:57.320 | So we have things like, you know,
00:20:58.160 | you can generate SQL queries by AI,
00:20:59.640 | you can generate UIs by AI and stuff like that.
00:21:01.480 | So that's cool, to be clear.
00:21:03.080 | But it's not, I think, the step change of programming
00:21:05.680 | that we all wanted.
00:21:06.600 | And so that, I think, is, you know,
00:21:09.240 | we're investing somewhat in that.
00:21:10.520 | But the bulk of investment actually is a number two,
00:21:13.640 | which is helping developers
00:21:14.960 | build AI-enabled applications faster.
00:21:18.160 | And the reason why we think this is so exciting
00:21:19.800 | is we think that practically every app,
00:21:22.640 | every internal app, especially,
00:21:25.000 | is going to be AI-infused over the next, like, three years.
00:21:29.040 | And so every tool you might imagine,
00:21:31.320 | so like the tool you were mentioning,
00:21:32.720 | like a sales operations tool, for example,
00:21:34.840 | probably, you know, if you were to build it today,
00:21:36.720 | it would incorporate some form of AI.
00:21:38.320 | And so, you know, we see today, like for us,
00:21:40.560 | like a lot of people build, you know,
00:21:41.720 | I'll say sales management tools, in Retool.
00:21:43.560 | An example is there's a Fortune 500 company,
00:21:45.080 | like a bunch of companies building, like,
00:21:46.720 | sales forecasting tools.
00:21:48.240 | So they basically have sales people enter their forecast,
00:21:50.320 | you know, for the quarter,
00:21:51.480 | you kind of record it, like, hey, I have these deals,
00:21:53.680 | and these deals are going to close,
00:21:54.920 | these deals are not going to close.
00:21:55.920 | You know, I think I'm upside in these, downside in these,
00:21:57.600 | stuff like that, basically.
00:21:59.080 | And what they're doing now is they're actually,
00:22:02.280 | so you can imagine just pulling in deals
00:22:04.360 | from your Salesforce database.
00:22:05.960 | And so it pulls in the deals
00:22:06.920 | and actually uses AI to compute, like, okay, well, you know,
00:22:09.120 | given previous deal dynamics,
00:22:11.000 | like these are the deals that are more likely
00:22:12.320 | to close this month versus next month,
00:22:13.760 | close this quarter, next quarter, et cetera.
00:22:15.840 | And so it could actually, you know,
00:22:17.120 | pre-write you a draft of, you know, your report, basically.
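
[Editor's note: a minimal sketch of the shape of app described here — pull deals out of the CRM, have a model score them, and pre-draft the forecast. The fetch_deals helper, the sample data, and the model name are illustrative assumptions, not Retool's actual implementation.]

```python
# Hypothetical sketch: draft a sales forecast from CRM deals with an LLM.
# fetch_deals() stands in for a Salesforce/warehouse query; assumes
# OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

def fetch_deals():
    # In a real internal tool this would query Salesforce or a warehouse.
    return [
        {"name": "Acme renewal", "stage": "negotiation", "amount": 120_000},
        {"name": "Globex new logo", "stage": "discovery", "amount": 45_000},
    ]

prompt = (
    "Given these open deals, draft a quarterly sales forecast. Flag each "
    "deal as likely to close this quarter or next, with one line of "
    f"reasoning each:\n{fetch_deals()}"
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any chat model works here
    messages=[{"role": "user", "content": prompt}],
)
# The output is a pre-written draft for the rep to review, not a final answer.
print(resp.choices[0].message.content)
```
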
00:22:19.800 | And so that's an example where I think all apps,
00:22:22.680 | whether it's, you know, a sales app, you know,
00:22:24.400 | to, let's say, a fraud app, a, you know, FinTech app,
00:22:27.760 | you know, whatever it is, basically,
00:22:29.280 | especially internal apps, I think,
00:22:30.960 | like you said, Alessio,
00:22:32.840 | in order to make people a little more productive,
00:22:34.560 | it's going to incorporate some form of AI.
00:22:36.520 | So then the question is,
00:22:37.360 | can we help them incorporate this AI faster?
00:22:39.800 | So that's why we launched, like, a vector database,
00:22:41.720 | for instance, built directly into Retool.
00:22:43.480 | That's why we, you know, launched Retool AI, actually,
00:22:45.400 | so you don't have to, you know,
00:22:46.240 | go figure out what the best model is
00:22:47.400 | and do testing and stuff like that,
00:22:48.440 | just, you know, give it to you out of the box.
00:22:49.720 | So for us, I think that is really the,
00:22:52.560 | really exciting future is can we make every app,
00:22:55.360 | mostly in Retool, use AI a little bit
00:22:56.880 | and make people a little more productive, so.
00:22:58.880 | - We talked with Jeffrey Wang,
00:23:01.680 | who's the co-founder and chief architect of Amplitude.
00:23:05.280 | He mentioned that you just used PostgreSQL Vector.
00:23:08.320 | When you were building Retool Vectors,
00:23:10.040 | how do you think about, yeah,
00:23:11.600 | leveraging a startup to do it,
00:23:13.640 | putting vectors into one of the existing data stores
00:23:16.000 | that you already had?
00:23:16.840 | I think, like, you already have quite large customer scale,
00:23:20.880 | so, like, you're maybe not trying to get too cute with it.
00:23:24.800 | Any learnings and tips from that?
00:23:26.920 | - Yeah, I think a general philosophical thing
00:23:31.560 | I think we believe is,
00:23:32.800 | we think the open source movement in AI,
00:23:36.320 | especially when it comes to all the supporting infrastructure
00:23:39.520 | is going to win.
00:23:40.680 | And the reason for that is we look at, like,
00:23:42.160 | developer tools in general,
00:23:43.680 | especially in such a fast-moving space.
00:23:46.560 | In the end, like, there are really smart people in the world
00:23:49.080 | that have really good ideas,
00:23:50.000 | and they're going to go build companies,
00:23:52.000 | and they're going to go build projects, basically,
00:23:53.280 | around these ideas.
00:23:54.120 | And so, for us,
00:23:55.920 | we have always wanted to partner
00:23:59.440 | with maybe more open source kind of providers
00:24:03.800 | or projects, you could say, like PG Vector, for example.
00:24:06.920 | And the reason for that is it's easy for us
00:24:08.920 | to see what's going on under the hood.
00:24:10.960 | A lot of this stuff is moving very fast.
00:24:12.400 | A lot of times there are bugs, actually,
00:24:13.520 | and so we can go look and fix bugs ourselves,
00:24:15.760 | and contribute back, for example.
00:24:17.720 | But we really think open source is going to win in this space.
00:24:20.840 | It's, you know, it's hard to say about models.
00:24:22.440 | I don't know about models, necessarily,
00:24:24.040 | because, you know, it starts to get pretty complicated there.
00:24:26.600 | But when it comes to tooling, for sure,
00:24:28.080 | I think there's just, like, so much,
00:24:29.240 | and there's an explosion of creativity, if you will.
00:24:31.760 | And I think betting on any one commercial company
00:24:34.000 | is pretty risky,
00:24:35.080 | but betting on the open source sort of community
00:24:37.120 | and the open source contributors,
00:24:38.760 | I think it's a pretty good bet.
00:24:39.640 | So that's why we decided to go with pgvector.
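
[Editor's note: for reference, the pgvector pattern being described is quite small at the SQL level. A minimal sketch with psycopg2; the DSN, table, and 3-dimensional embeddings are made-up placeholders — real embeddings come from your model and are hundreds of dimensions. The appeal is that the vectors live next to your application data in plain Postgres, with nothing new to operate.]

```python
# Minimal pgvector sketch (hypothetical connection/table details).
# pgvector stores embeddings in an ordinary Postgres column and supports
# nearest-neighbor search via operators like <-> (L2) and <=> (cosine).
import psycopg2

conn = psycopg2.connect("dbname=app user=app")  # placeholder DSN
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
cur.execute("""
    CREATE TABLE IF NOT EXISTS docs (
        id bigserial PRIMARY KEY,
        body text,
        embedding vector(3)  -- toy dimension; real models use 768/1536/etc.
    )
""")
cur.execute(
    "INSERT INTO docs (body, embedding) VALUES (%s, %s::vector)",
    ("hello world", "[0.1, 0.2, 0.3]"),
)

# Top 3 nearest neighbors to a query embedding, by cosine distance.
cur.execute(
    "SELECT body FROM docs ORDER BY embedding <=> %s::vector LIMIT 3",
    ("[0.1, 0.2, 0.25]",),
)
print(cur.fetchall())
conn.commit()
```
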
00:24:42.360 | - Is there any most underrated feature,
00:24:45.360 | like something that customers maybe love
00:24:47.600 | that you didn't expect them to really care about?
00:24:49.760 | I know you have, like, text-to-SQL,
00:24:51.520 | you have UI generation.
00:24:53.400 | There's, like, so many things in there.
00:24:55.240 | Yeah, what surprised you?
00:24:56.640 | - Yeah, so what's really cool,
00:24:59.840 | and this is my sense of the AI space overall,
00:25:02.640 | you know, maybe a spicy take
00:25:04.040 | you can clip for YouTube as well,
00:25:06.040 | is that, especially in Silicon Valley,
00:25:08.920 | where a lot of the innovation is happening,
00:25:11.520 | I think there's actually not that many AI use cases,
00:25:14.360 | to be honest.
00:25:15.600 | And AI, to me, even as of, what, like, January 19th of 2024,
00:25:20.600 | still feels like in search of truly good use cases.
00:25:30.120 | And what's really interesting, though, about retool,
00:25:32.960 | and I think we're in a really fortunate position,
00:25:34.960 | is that we have this large base of sort of customers,
00:25:37.320 | and a lot of these customers
00:25:38.520 | are actually much more legacy, if you will, customers.
00:25:41.800 | And a lot of them actually have a lot of use cases for AI.
00:25:44.960 | And so, to us, I think we're almost in, like,
00:25:47.480 | a really perfect or unique spot.
00:25:49.840 | We're able to adopt some of these technologies
00:25:51.320 | and then provide them to some of these, like, older players.
00:25:54.400 | So one example that actually really shocked
00:25:57.080 | and surprised me about AI was,
00:25:59.520 | so we have this one, let's say, clothing manufacturer.
00:26:03.320 | I think it's either the first or second largest
00:26:05.560 | clothing manufacturer in the world who's using retool,
00:26:07.880 | and, you know, a ginormous company with, you know,
00:26:10.560 | pretty multinational, you know,
00:26:13.040 | stores in, you know, pretty much every mall in the world.
00:26:15.320 | And they were interested in, so they have one problem,
00:26:19.200 | which is they need to design styles every year
00:26:23.280 | for the next year, basically, for every season.
00:26:25.000 | So, like, hey, it's just, like, summer 2024, for example,
00:26:27.400 | and we're going to design.
00:26:28.840 | And so what they used to do before
00:26:29.720 | is they would hire designers,
00:26:30.560 | and designers would go to study data.
00:26:31.800 | They'd be like, okay, well, it looks like, you know,
00:26:33.480 | maybe floral patterns are really hot in, like, you know,
00:26:36.280 | California, for example, in 2023,
00:26:38.720 | and, like, do I think it's going to be hot in 2024?
00:26:40.360 | Well, let me think about it, I don't know, you know,
00:26:41.800 | so let me, maybe, and if so,
00:26:44.120 | if I believe that it's going to be hot,
00:26:45.120 | let me go design some floral patterns, actually.
00:26:47.520 | And what they ended up doing in Retool, actually,
00:26:50.080 | is they actually automated a lot of this process away
00:26:52.240 | in Retool, so they actually now have built a Retool app
00:26:54.720 | that allows, actually, a non-designer,
00:26:56.040 | so, like, an analyst, if you will, to analyze, like, you know,
00:26:58.440 | who are the hottest-selling patterns, you know,
00:26:59.880 | particular geos, like, this was really hot in Brazil,
00:27:02.080 | this was really hot in China,
00:27:02.920 | this was really hot, you know, somewhere else, basically.
00:27:05.200 | And then they actually feed it into an AI,
00:27:07.920 | and the AI, you know, actually generates,
00:27:11.520 | with DALI and other image generation APIs,
00:27:14.400 | actually generates patterns for them,
00:27:16.120 | and they print the patterns, which is really cool.
00:27:18.560 | And so that's an example of, like, honestly,
00:27:20.400 | in use case I would have never thought about,
00:27:21.760 | like, thinking about, like, you know,
00:27:23.040 | how clothing manufacturers create their next line of clothing,
00:27:25.800 | you know, for the next season, like, I don't know,
00:27:27.480 | I never thought about it, to be honest,
00:27:29.400 | nor did I ever think, you know,
00:27:30.520 | how it would actually happen.
00:27:31.960 | And the fact that they're able to leverage AI,
00:27:33.640 | they actually, you know, leverage multiple things in Retool
00:27:35.720 | to make that happen, it's really, really, really cool.
00:27:39.120 | And so that's an example where I think
00:27:41.640 | if you go deeper into sort of,
00:27:44.120 | if you go outside the Silicon Valley,
00:27:45.960 | there are actually a lot of use cases for AI,
00:27:49.760 | but a lot of this is not obvious.
00:27:52.400 | Like, you have to get into the businesses themselves,
00:27:54.200 | and so I think we're, we personally
00:27:55.760 | are in a really fortunate place,
00:27:56.800 | but if, you know, you're working in the AI space
00:27:59.080 | and want to find some use cases, please come talk to us.
00:28:00.920 | Like, you know, we're really excited
00:28:02.200 | about marrying sort of technology with use cases,
00:28:05.240 | which I think is actually really hard to do right now,
00:28:07.120 | so let's chat, so.
00:28:08.400 | - You know, I have a bunch of, like,
00:28:09.480 | sort of standing presentations
00:28:12.040 | around, like, how this industry is developing,
00:28:14.880 | and, like, I think the foundation model layer is understood,
00:28:18.720 | the sort of lag chain, VectorDB, RAG layer is understood,
00:28:23.720 | and, like, what is, I always have a big question mark,
00:28:27.400 | and I actually have you and Vercel V0 in that box,
00:28:32.120 | which is, like, sort of the UI layer for AI,
00:28:35.120 | and, like, you know, you are perfectly placed
00:28:38.200 | to expose those functionalities to end users,
00:28:41.960 | even if you personally don't really know
00:28:43.680 | what they're going to use it for,
00:28:44.840 | and sometimes they'll surprise you (laughs)
00:28:46.960 | with their creativity.
00:28:48.040 | One segment of this, and I do see some startups
00:28:54.040 | springing up to do this,
00:28:55.960 | is related to the things that,
00:28:57.720 | to something that you also build,
00:28:59.800 | but it's not strictly AI-related,
00:29:01.120 | which is Retool Workflows,
00:29:02.720 | which is the sort of canvas-y boxes and arrows,
00:29:05.480 | point and click, do this, then do that, type of thing,
00:29:09.400 | like, which every, you know,
00:29:11.360 | I hate that, you know, what are we calling low-code?
00:29:15.240 | Let's, every internal tooling company (laughs)
00:29:19.160 | eventually builds, you know,
00:29:20.280 | I worked at a sort of workflow orchestration company before,
00:29:24.040 | and we were also discussing internally
00:29:25.640 | how to make that happen,
00:29:27.520 | but you are obviously very well-positioned to that.
00:29:31.040 | - Yeah, basically, like, you know,
00:29:33.160 | would you, do you think that there is an overlap
00:29:35.440 | between Retool Workflows and AI?
00:29:37.960 | I think that, you know, there's a lot of interest
00:29:40.200 | in sort of chaining AI steps together.
00:29:44.120 | Do people, I couldn't tell if, like,
00:29:46.920 | that is already enabled within Retool Workflows,
00:29:48.720 | I don't think so,
00:29:49.600 | but you could sort of hook them together sort of jankily.
00:29:53.680 | Like, what's the interest there?
00:29:57.480 | You know, is it all of a kind, ultimately, in your mind?
00:30:01.760 | - It is, 100%,
00:30:03.120 | and yes, so you could actually already,
00:30:05.320 | so a lot of people actually are building
00:30:06.520 | AI workflows down in Retool,
00:30:08.200 | which is, we can talk about that in a second,
00:30:09.760 | but a hot take here is actually,
00:30:12.200 | I think a lot of the utility in AI today,
00:30:17.240 | like, I would probably argue 60, 70% of the utility,
00:30:21.680 | like, you know, businesses are found in AI,
00:30:24.160 | is mostly via ChatGPT, and across the world, too.
00:30:27.880 | And the reason for that is, I think the ChatGPT
00:30:30.360 | sort of UI, you could say, or interface,
00:30:32.800 | or user experience is just really quite good.
00:30:34.960 | You know, you can sort of converse with an AI, basically.
00:30:38.640 | And that said, there are downsides to it.
00:30:42.920 | If you talk to, like, a giant company,
00:30:44.280 | like a J.P. Morgan Chase, you know, for example,
00:30:46.320 | they may be reticent to have people
00:30:49.120 | copy-paste it into ChatGPT, for example,
00:30:50.960 | even on ChatGPT Enterprise, for example.
00:30:53.120 | Some problems are that,
00:30:55.000 | I think Chat is good for one-off tasks.
00:30:57.560 | So if you're like, hey,
00:30:58.440 | I want a first version of a presentation
00:30:59.920 | or something like that, you know,
00:31:00.880 | and help me write this first version of a doc
00:31:02.840 | or something like that, Chat is great for that.
00:31:04.800 | It's a great, you know, very portable,
00:31:06.200 | you know, if you will, form factor.
00:31:07.360 | So you can do that.
00:31:08.720 | However, if you think about it,
00:31:10.920 | you think about some of the economic productivity
00:31:13.400 | more generally, like, Chat, again,
00:31:16.520 | will help you, like, 10 or 20%,
00:31:18.440 | but it's unlikely that you're gonna replace
00:31:20.040 | an employee with Chat.
00:31:21.040 | Like, you're not gonna be like,
00:31:22.720 | oh, I have a relationship manager at J.P. Morgan Chase,
00:31:25.240 | and I've replaced him with an AI chatbot.
00:31:27.280 | Like, it's kind of hard to imagine, right?
00:31:28.680 | 'Cause like, the employees actually do a lot of things
00:31:30.600 | besides, you know, just, you know, generating, you know,
00:31:33.840 | maybe another way of putting it is like,
00:31:35.480 | Chat is like a reactive interface.
00:31:36.920 | Like, it's like, when you have an issue,
00:31:38.440 | you'll go reach out to Chat, and Chat might solve it.
00:31:40.440 | But like, Chat is not gonna solve 100% of your problems.
00:31:42.280 | It'll solve, like, you know, 25% of your problems,
00:31:43.920 | like, you know, pretty quickly, right?
00:31:44.760 | And so what we think the next, like,
00:31:47.320 | big breakthrough in AI is, is actually, like, automation.
00:31:51.480 | It's not just like, oh, I have a problem,
00:31:53.080 | let me go to a chatbot and solve it.
00:31:54.440 | Because like, again, like, you know,
00:31:55.880 | people don't spend 40 hours a week in a chatbot.
00:31:57.560 | They spend, like,
00:31:58.400 | two hours a week in a chatbot, for example.
00:31:59.920 | And so what we think can be really big, actually,
00:32:02.360 | is you're able to automate entire processes via AI.
00:32:05.480 | Because then, you're really realizing the potential of AI.
00:32:07.720 | It's like, it's not just like, you know,
00:32:09.360 | a human copy-pasting data in to an AI chatbot,
00:32:11.720 | and then, you know, pasting it back out, or copying back out.
00:32:14.240 | Instead, it's like, the whole process now
00:32:15.720 | is actually done in an automated fashion without the human.
00:32:18.720 | And that, I think, is what's gonna really unlock
00:32:20.720 | sort of big economic productivity,
00:32:22.200 | and that's what I'm really excited about.
00:32:25.520 | And I think part of the problem right now is,
00:32:28.520 | you know, I'm sure you all thought a lot about agents.
00:32:30.640 | I think agents are actually quite hard.
00:32:33.240 | Because like, you know, the AI is wrong,
00:32:34.720 | like, you know, 2% of the time,
00:32:35.960 | but then you like, you know, screw,
00:32:37.760 | if you, let's say, you know, raise it to the power of seven,
00:32:39.720 | for example, that's actually wrong, you know,
00:32:40.840 | quite often, for example.
00:32:42.480 | And so what we've actually done with workflows is,
00:32:46.080 | we prefer, what we've learned, actually,
00:32:48.160 | is that we don't want to generate the whole workflow
00:32:49.560 | for you via AI.
00:32:51.280 | Instead, what we want you to do, actually,
00:32:52.840 | is we want you to actually sort of drag and drop
00:32:54.680 | the workflow yourself.
00:32:55.520 | And maybe you can get a first version or something by AI,
00:32:57.080 | but it's scaffolding, basically.
00:32:59.160 | You should actually be able to modify the steps yourself.
00:33:01.360 | But every step can use AI.
00:33:03.360 | And so what that means is like,
00:33:04.560 | it's not that the whole workflow is created by AI.
00:33:06.200 | It's that every step can be AI-automated.
00:33:08.440 | And so if you go back to, for example,
00:33:09.640 | like the users are talking about,
00:33:10.760 | you know, with a clothing manufacturer,
00:33:12.000 | that's actually a workflow, actually.
00:33:13.120 | So basically, what they say is like, hey, every day,
00:33:15.200 | we see all the data, you know,
00:33:16.840 | from our sales systems into our database.
00:33:19.720 | And then we, you know, do some data analysis.
00:33:21.960 | And, you know, that's just, you know,
00:33:23.160 | raw SQL, basically, so it's nothing too surprising.
00:33:25.320 | And then they use AI to go generate the new ideas.
00:33:28.080 | And then the analysts will look at the new ideas
00:33:29.800 | and approve or reject them, basically.
00:33:31.360 | And that is like a, you know, that's true automation.
00:33:33.720 | You know, it's not just like, you know,
00:33:34.920 | a designer copy-pasting things into ChatGPT
00:33:37.320 | and being like, hey, you know, give me a design.
00:33:39.600 | It's actually, the designs are being generated.
00:33:40.960 | We generate 10,000 designs every day.
00:33:42.720 | And then you have to go and approve or reject those designs,
00:33:44.840 | which I think is a lot, you know,
00:33:46.400 | that's a lot more economically productive
00:33:48.680 | than just copy-pasting stuff into ChatGPT.
00:33:50.320 | So we think sort of the AI workflow space
00:33:53.840 | is a really exciting space.
00:33:56.080 | And I think that is the next step
00:33:57.440 | in sort of delivering a lot of business value by AI.
00:34:00.320 | I personally don't think it's, you know,
00:34:01.800 | via chat or, you know, via agents quite yet, so.
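
[Editor's note: a hedged sketch of the workflow shape described here — deterministic steps plus one AI-automated step and a human approval gate, echoing the clothing-manufacturer example. The function names are illustrative stand-ins, not Retool Workflows syntax.]

```python
# Hypothetical skeleton: scheduled sync -> SQL analysis -> AI generation
# -> human approval queue. Every step is modifiable; only one step is AI.

def sync_sales_data():
    """Stand-in for copying rows from the sales systems into the database."""

def top_patterns():
    """Stand-in for plain SQL: best-selling patterns per region."""
    return [{"region": "Brazil", "pattern": "floral"}]

def generate_candidates(patterns):
    """The one AI-automated step: call an image/text model per pattern."""
    return [{"design": f"new {p['pattern']} print", "source": p} for p in patterns]

def enqueue_for_review(candidates):
    """Stand-in for an approval queue an analyst approves or rejects from."""
    for c in candidates:
        print("awaiting review:", c["design"])

def nightly_workflow():
    sync_sales_data()
    candidates = generate_candidates(top_patterns())
    enqueue_for_review(candidates)  # the human stays in the loop

nightly_workflow()
```
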
00:34:05.480 | - Yeah, yeah, I think that's a pretty reasonable take.
00:34:09.560 | It's disconcerting because I, not, I mean,
00:34:13.600 | disconcerting only in the fact that, like,
00:34:15.120 | I know a lot of people are trying to build
00:34:17.240 | what you already have in workflows.
00:34:18.800 | (laughs)
00:34:21.040 | So you have that sort of,
00:34:22.240 | you're the incumbent sort of in their minds.
00:34:25.480 | I'm sure it doesn't feel that way to you,
00:34:27.040 | but I'm sure, you know,
00:34:29.280 | you're the incumbent in their minds
00:34:30.400 | and they're like, okay, like, you know,
00:34:31.720 | like how do I, you know, compete with Retool
00:34:35.000 | or, you know, differentiate from Retool?
00:34:37.480 | And, you know, as you mentioned, you know,
00:34:39.880 | all these sort of connections,
00:34:41.560 | it does remind me that you're running up against Zapier.
00:34:45.600 | You're running up against maybe Notion
00:34:48.000 | in the distant future.
00:34:50.560 | And yeah, I think that there'll be
00:34:51.920 | a lot of different takes at this space
00:34:53.320 | and like whoever is best positioned
00:34:54.800 | to serve their customer in the way
00:34:56.800 | that they sort of need is going to win.
00:35:00.080 | Do you have a philosophy against,
00:35:01.360 | around like what you won't build?
00:35:03.040 | Like what do you prefer to partner
00:35:05.720 | and not build in-house?
00:35:08.320 | Because it seems, I feel like you build a lot in-house.
00:35:10.720 | - Yes, there's probably two philosophical things.
00:35:12.520 | So one is that we're developer first
00:35:14.240 | and I think that's actually one big differentiator
00:35:15.800 | between us and Zapier.
00:35:16.840 | You know, we very rarely see them, actually.
00:35:18.760 | The reason is we're developer first.
00:35:20.400 | Because developers like,
00:35:21.680 | if you're like building a sales ops tool,
00:35:23.440 | you're probably not considering Notion
00:35:24.720 | if you're a developer.
00:35:25.560 | You're probably like,
00:35:26.400 | I want to build this via React basically,
00:35:27.520 | or use Retool.
00:35:29.640 | And so, are we built for developers?
00:35:32.680 | It's pretty interesting, actually.
00:35:33.880 | I think one huge advantage of serving developers
00:35:36.080 | is that they actually don't prefer,
00:35:38.400 | like, developers don't want to be given a solution.
00:35:41.560 | They want to be given the building blocks
00:35:42.840 | so they can build the solution themselves.
00:35:44.840 | And so for us, like, you know,
00:35:46.920 | actually, you know,
00:35:48.640 | an interesting equilibrium that Retool could get to
00:35:51.600 | is basically to say,
00:35:52.440 | hey, Retool's a consulting company
00:35:53.680 | and we basically build apps for everybody, for example.
00:35:56.600 | And what's interesting
00:35:57.440 | is that we've actually never gotten to that equilibrium,
00:35:59.560 | and the reason for that is, again, the developers.
00:36:01.280 | Developers don't want, you know,
00:36:02.560 | like a consultant coming in
00:36:03.640 | and building all the apps for them.
00:36:04.880 | Developers are like,
00:36:05.720 | hey, I want to do it myself.
00:36:06.560 | Just give me the building blocks.
00:36:07.400 | So give me the best table library.
00:36:08.800 | Give me, you know, good state management.
00:36:10.440 | Give me easy way to query the rest of the APIs
00:36:11.880 | and I'll do it myself, basically.
00:36:12.920 | So that is pretty,
00:36:14.160 | so we generally end up
00:36:15.960 | basically always building building blocks
00:36:17.920 | that are reusable by multiple customers.
00:36:20.120 | We have, I think,
00:36:21.160 | basically never built anything specific for one customer.
00:36:24.000 | So that's one thing that's interesting.
00:36:26.080 | The second thing is when it comes to sort of,
00:36:27.920 | you know, let's say like in the AI space,
00:36:29.560 | we're going to build and we're not going to build,
00:36:31.240 | we basically think about whether it's a core competency
00:36:33.960 | or whether there are,
00:36:36.040 | whether there are unique advantages
00:36:37.240 | to us building it or not.
00:36:39.080 | And so we think about the Workflows product.
00:36:40.960 | We think Workflows actually
00:36:41.920 | is a pretty core competency for us.
00:36:43.400 | And I think the idea that we could build
00:36:45.000 | a developer-first Workflows automation engine,
00:36:48.000 | it's actually, there's nothing quite like it.
00:36:49.360 | I mean, I think after we released,
00:36:51.080 | you know, Retool Workflows,
00:36:52.800 | there have been a few copycats
00:36:54.280 | that are, I think, quite far behind, actually.
00:36:56.280 | They sort of are missing a lot of more critical features.
00:36:59.360 | But like, if you look at the space,
00:37:01.880 | it's like Zapier on one side
00:37:04.680 | and then maybe like Airflow on the other.
00:37:07.040 | And so Retool Workflows actually is fairly differentiated.
00:37:10.280 | And so we're like, okay, we should go build that.
00:37:11.960 | This is one where we have a differentiated take, so let's go build it.
00:37:14.360 | Whereas if you look at like Vectors, for example,
00:37:16.480 | you look at Vectors, you're like, wow,
00:37:17.560 | this is a pretty thriving space
00:37:18.680 | and pretty, you know, Vector databases.
00:37:21.160 | Does it make sense for us to go build our own?
00:37:22.520 | Like, what's the benefit?
00:37:23.400 | Like, not much.
00:37:24.240 | We should go partner with,
00:37:25.400 | or go find technology off the shelf.
00:37:27.200 | In our case, it's pgvector.
00:37:28.040 | And so for us, I think it's like,
00:37:29.800 | how much value does it add for customers?
00:37:31.240 | Do we have a different take on the space?
00:37:32.360 | Do we not?
00:37:33.320 | And every product that we've launched,
00:37:35.240 | we've had a different take on the space
00:37:36.880 | and the products that we don't have a different take,
00:37:38.600 | we just adopt what's off the shelf.
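[As a rough illustration of the off-the-shelf route, a similarity search with pgvector can be as small as the sketch below. It assumes a Postgres instance with the pgvector extension installed and the psycopg driver; the connection string, table, and column names are made up for illustration.]

```python
import psycopg  # assumes `pip install "psycopg[binary]"`

# Hypothetical DSN; point this at a Postgres that has pgvector available.
with psycopg.connect("postgresql://localhost/appdb") as conn:
    with conn.cursor() as cur:
        # One-time setup: enable the extension and create an embeddings table.
        cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
        cur.execute(
            "CREATE TABLE IF NOT EXISTS docs ("
            " id bigserial PRIMARY KEY,"
            " body text,"
            " embedding vector(3))"  # tiny dimension for illustration; real embeddings are ~1536
        )
        cur.execute(
            "INSERT INTO docs (body, embedding) VALUES (%s, %s::vector)",
            ("hello world", "[0.1, 0.2, 0.3]"),
        )
        # Nearest-neighbor lookup: <-> is pgvector's Euclidean distance operator.
        cur.execute(
            "SELECT body FROM docs ORDER BY embedding <-> %s::vector LIMIT 5",
            ("[0.1, 0.2, 0.25]",),
        )
        print(cur.fetchall())
```

[Because it rides on plain Postgres, this route gets transactions, backups, and access control for free, which is a common reason teams pick it over a dedicated vector store.]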
00:37:41.080 | - Let's jump into the state of AI survey that you ran
00:37:43.840 | and maybe get some live updates.
00:37:46.280 | So you surveyed about 1,600 people last August.
00:37:53.000 | So, you know, AI wasn't this busy like five years ago.
00:37:53.000 | And there were kind of like a lot of interesting nuggets
00:37:56.560 | and we'll just run through everything.
00:37:59.280 | The first one is more than half the people,
00:38:03.000 | 52% said that AI is overrated.
00:38:06.320 | Are you seeing sentiment shift in your customers
00:38:08.840 | or like the people that you talk to,
00:38:10.160 | like as the months go by,
00:38:11.880 | or do you still see a lot of people,
00:38:16.120 | yeah, that are not in Silicon Valley,
00:38:17.480 | maybe say, "Hey, this is maybe not as worth changing
00:38:21.440 | as you all made it sound to be."
00:38:23.320 | - Yes, so actually I'll be running the survey again,
00:38:24.720 | actually in the next few months.
00:38:25.560 | So I can let you know when it changes.
00:38:27.520 | It seems to me that it has settled down a bit
00:38:32.520 | in terms of sort of the,
00:38:33.640 | maybe like, I don't know, signal to noise, you could say.
00:38:35.960 | Like, it seems like there's a little bit less noise
00:38:38.880 | than before, but the signals actually run about the same
00:38:41.480 | in the sense that like,
00:38:42.480 | I think people are still trying to look for use cases.
00:38:44.920 | I was saying with all this last year,
00:38:46.520 | like, it's a steady state, again,
00:38:48.000 | and I think there are slightly more use cases,
00:38:50.200 | but still not substantially more.
00:38:51.040 | And I think as far as we can tell,
00:38:53.440 | a lot of the engineers surveyed,
00:38:54.560 | especially some of the comments that we saw,
00:38:56.440 | do feel like the companies are investing quite a bit in AI
00:38:59.800 | and they're not sure where it's going to go yet,
00:39:01.280 | but they're like, right, it could be big.
00:39:02.800 | So I think we should keep on investing.
00:39:04.680 | I do think that based on what we're hearing from customers,
00:39:08.240 | if we're not seeing returns in like a year or something,
00:39:10.600 | there'll be more skepticism.
00:39:11.800 | So I think there is like a, it is time bound, if you will.
00:39:14.800 | - You finally gave us some numbers
00:39:18.320 | on Stack Overflow usage.
00:39:19.920 | I think that's been a Twitter meme for a while,
00:39:22.080 | whether or not ChatGPT kills Stack Overflow.
00:39:24.760 | In the survey, 58% of people said they used it less,
00:39:28.120 | and 94% of them said they used it less
00:39:30.440 | because of Copilot and ChatGPT,
00:39:32.720 | which, yeah, I think it kind of makes sense.
00:39:36.840 | I know Stack Overflow tried to pull a whole thing.
00:39:38.760 | It's like, no, the traffic is going down
00:39:40.640 | because we changed the way we instrument our website,
00:39:42.640 | but I don't think anybody bought that.
00:39:45.040 | And then you add, right after that,
00:39:47.520 | expectation of job impact by function and operations,
00:39:51.080 | people, eight out of 10, basically,
00:39:53.440 | they think it's going to,
00:39:54.280 | AI is going to really impact their job.
00:39:56.240 | Designers were the lowest one, 6.8 out of 10,
00:39:59.560 | but then all the examples you gave
00:40:00.800 | were of designers' jobs being impacted by AI.
00:40:04.600 | Do you think there's a bit of a dissonance, maybe,
00:40:08.840 | between the human perception of like,
00:40:11.560 | oh, my job can't possibly be automated?
00:40:14.400 | It's funny that the operations people are like,
00:40:15.960 | yeah, it makes sense.
00:40:16.800 | I wish I could automate myself, you know,
00:40:18.880 | versus the designers that maybe they love their craft more.
00:40:22.800 | Yeah, I don't know if you have any thoughts
00:40:23.960 | on who will accept it first, you know,
00:40:26.720 | that they should just embrace the technology
00:40:28.480 | and change the way they work.
00:40:29.880 | - Hmm.
00:40:30.720 | Yeah, that's interesting.
00:40:34.440 | I think it's probably going to be engineering driven.
00:40:38.640 | I mean, I think you two know this very well,
00:40:40.720 | maybe you two even started some of this wave
00:40:42.280 | and sort of the AI engineer wave.
00:40:44.680 | I think the companies that adopt AI the best,
00:40:48.080 | it is going to be engineering driven, I think,
00:40:50.440 | rather than like operations driven or anything else.
00:40:52.640 | And the reason for that is,
00:40:54.240 | I think the rise of this like profile of AI engineering,
00:40:56.760 | like AI is very philosophical.
00:41:00.320 | Like AI is a tool in my head.
00:41:01.800 | Like it is not a, in my head,
00:41:03.560 | I think we're actually pretty far from AGI,
00:41:05.320 | we'll see what happens.
00:41:06.160 | But AI is not like a thing that,
00:41:09.760 | it's not like a black box
00:41:10.600 | where like it does everything you want it to do.
00:41:12.840 | The models that we have today
00:41:14.120 | require like very specific prompting, for example,
00:41:16.480 | in order to get like, you know, really good results.
00:41:19.000 | And the reason for that is, it's a tool that, you know,
00:41:20.880 | you can use it in specific ways.
00:41:22.080 | If you use it the wrong way,
00:41:22.920 | it's not going to produce good results for you, actually.
00:41:24.320 | It's not like, you know,
00:41:25.800 | by itself taking a job away, right?
00:41:27.000 | So, I think actually, to adopt AI,
00:41:30.160 | it's probably going to be,
00:41:31.480 | going to have to be engineering first, basically,
00:41:33.400 | where engineers are playing around with it,
00:41:35.200 | figuring out all the biases of the models,
00:41:37.200 | figuring out like, oh,
00:41:38.480 | maybe like using vector databases
00:41:40.640 | is a lot better, for example.
00:41:42.280 | Maybe like prompting in this particular way
00:41:44.440 | is going to be a lot better, et cetera.
00:41:46.280 | And that's not the kind of stuff
00:41:47.920 | that I think like an operations team
00:41:49.320 | is going to really be like experimenting with, necessarily.
00:41:51.720 | I think it really has to be engineering led.
00:41:54.080 | And then I think the question is,
00:41:55.920 | well, what are the engineers going to focus on first?
00:41:58.120 | Like, are they going to focus on, you know,
00:41:59.520 | design first or like operations first?
00:42:01.920 | And that, I think, is more of a business decision.
00:42:03.880 | I think it's probably going to be more like, you know,
00:42:05.600 | the CEO, for example, says,
00:42:06.840 | hey, you know, we're having trouble
00:42:08.200 | scaling this one function.
00:42:09.320 | So, like, why don't we try using AI for that?
00:42:11.040 | And let's see what happens, for example.
00:42:13.200 | And so in our case, for example,
00:42:15.360 | we are really, we have a lot of support issues.
00:42:17.920 | And what I mean by that is,
00:42:19.240 | we have a really, really high performance support team,
00:42:21.440 | but we get a lot of tickets.
00:42:22.480 | And the reason for that is, you know,
00:42:23.640 | we're a very dynamic product,
00:42:25.080 | you can use it in so many different ways.
00:42:26.560 | And so we'll have a lot of questions for us, basically.
00:42:28.360 | And so we were looking at, well, you know,
00:42:30.600 | can we, for example, draft some replies to support tickets,
00:42:33.480 | you know, by AI, for example.
00:42:34.880 | Can we allow our support agents to be, you know,
00:42:36.840 | hopefully, you know,
00:42:38.280 | doubly productive as before, for example.
00:42:39.960 | And so, I guess I would say it's like,
00:42:42.520 | business needs driven,
00:42:43.840 | but then engineering driven after that.
00:42:45.360 | So like, you know, the business decides,
00:42:47.120 | okay, well, this is where AI could be most applied.
00:42:49.240 | And then we assign the project to an engineer,
00:42:50.560 | and the engineer goes and figures it out.
00:42:52.640 | I honestly am not sure if, like,
00:42:55.480 | the operations team is gonna have much of a say,
00:42:58.800 | like, whether they accept or reject it,
00:42:59.880 | I don't know if that's gonna change the outcome,
00:43:02.240 | if you will, so.
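[For flavor, the support-ticket drafting described above could start as small as the sketch below, using the OpenAI Python SDK. The model name, system prompt, and example ticket are illustrative assumptions, not Retool's actual pipeline; the design point is that the draft goes to a human agent, never straight to the customer.]

```python
from openai import OpenAI  # assumes `pip install openai` and OPENAI_API_KEY set

client = OpenAI()

def draft_reply(ticket_text: str) -> str:
    """Draft a support reply for a human agent to edit and approve."""
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative model choice
        messages=[
            {
                "role": "system",
                "content": "You draft polite, accurate support replies. "
                           "If you are unsure, say an agent will follow up.",
            },
            {"role": "user", "content": ticket_text},
        ],
    )
    return response.choices[0].message.content

# The output lands in the agent's queue as a suggestion, not a sent message.
print(draft_reply("How do I connect Retool to a Postgres database?"))
```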
00:43:03.960 | - Yeah, interesting.
00:43:06.200 | Another interesting part was the importance
00:43:09.320 | of AI in hiring.
00:43:10.600 | 45% of companies said they made their
00:43:14.600 | engineering interviews more difficult
00:43:17.160 | to compensate
00:43:19.400 | for people using Copilot and ChatGPT.
00:43:22.280 | Has that changed at Retool?
00:43:24.680 | Like, have you, yeah, have you thought about it?
00:43:26.800 | I don't know how much you're still involved
00:43:28.600 | with engineering hiring at the company,
00:43:30.360 | but I'm curious how you're scaling
00:43:34.880 | the difficulty of interviews,
00:43:36.360 | even though the job is the same, right?
00:43:39.160 | So just because you're gonna use AI
00:43:40.680 | doesn't mean the interview should be harder,
00:43:42.280 | but I guess it makes sense.
00:43:43.720 | - Yeah, for us, I think our sense,
00:43:47.640 | based on the survey,
00:43:48.480 | and this is true for what we believe, too,
00:43:50.040 | is that we are most, when we do engineering interviews,
00:43:55.040 | we are most interested in assessing, like,
00:43:57.400 | critical thinking, or thinking on the spot.
00:44:00.240 | And I guess, when you're hiring an employee,
00:44:04.960 | at the end, the job of the employee is to be productive,
00:44:07.160 | which they should use whatever tools they want
00:44:08.360 | to be productive, so that's kind of our thinking, too.
00:44:10.520 | However, we do think that,
00:44:12.120 | if you think about it from a first-principles way,
00:44:14.040 | if your only method of coding is literally copy-pasting
00:44:17.000 | off of ChatGPT, or just pressing Tab in Copilot,
00:44:20.240 | I think that would be concerning.
00:44:21.240 | And so, for that reason,
00:44:23.440 | we still do want to test for fundamentals,
00:44:26.200 | understanding of CompSci.
00:44:27.880 | Now, that said, I think if you're able to use
00:44:30.920 | ChatGPT or Copilot, let's say, competently,
00:44:33.200 | we do view that as a plus, we don't view it as a minus,
00:44:35.400 | but if you only use Copilot,
00:44:36.800 | and you aren't able to reason about
00:44:38.400 | how to write a for loop, for example,
00:44:39.600 | or how to write FizzBuzz,
00:44:40.800 | that would be highly problematic.
00:44:42.320 | And so, for us, what we do today
00:44:44.880 | is we'll use a screen share,
00:44:45.840 | or we'll actually use a hackpad, actually.
00:44:47.640 | So, I guess there's no Copilot there,
00:44:49.560 | you can sort of see what they're doing,
00:44:50.400 | or see what they're thinking.
00:44:51.920 | And we really want to test for thinking, basically.
00:44:53.800 | But yeah, I mean, we, ourselves,
00:44:55.400 | internally have embraced Copilot,
00:44:57.120 | and we would encourage engineers to use Copilot, too.
00:44:59.920 | But we do want to test for understanding
00:45:02.720 | of what you're doing,
00:45:03.560 | rather than just copy-pasting on Copilot, so.
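[For reference, FizzBuzz, the kind of fundamentals check mentioned above, is the classic warm-up:]

```python
# Print 1..15, replacing multiples of 3 with "Fizz", multiples of 5 with
# "Buzz", and multiples of both with "FizzBuzz".
for n in range(1, 16):
    if n % 15 == 0:
        print("FizzBuzz")
    elif n % 3 == 0:
        print("Fizz")
    elif n % 5 == 0:
        print("Buzz")
    else:
        print(n)
```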
00:45:06.320 | - The other one was AI adoption rate.
00:45:08.360 | Only 27% are in production.
00:45:10.800 | Of that 27%, 66% are internal use cases.
00:45:14.680 | Shout out to Retool, you know?
00:45:16.280 | How, do you have a mental model
00:45:20.520 | as to how people are gonna make the jump
00:45:22.520 | from, like, using it internally to externally?
00:45:24.760 | Obviously, there's, like, all these different things,
00:45:26.280 | like privacy, like, you know,
00:45:28.760 | if an internal tool hallucinates, that's fine,
00:45:30.960 | because you're paying people to use it, basically,
00:45:32.920 | versus if it hallucinates to your customer,
00:45:34.760 | there's a different bar.
00:45:36.200 | Yeah, I don't know if you have thought about it.
00:45:39.480 | Because for you, if people build internal tools with Retool,
00:45:42.640 | they're external customers to you, you know?
00:45:45.160 | So, I think you're on the flip side of it.
00:45:47.960 | - Yeah, I think it's hard to say.
00:45:52.040 | Maybe a core Retool belief
00:45:53.320 | is actually that most software built in the world
00:45:55.480 | is internal-facing, actually,
00:45:57.040 | which actually may sound kind of surprising,
00:45:58.760 | you know, first time you're hearing this,
00:45:59.920 | but effectively, like, you know,
00:46:01.760 | we all work at Silicon Valley, right?
00:46:03.080 | Like, we all work at businesses, basically,
00:46:06.640 | that sell software as, you know, as sort of a business.
00:46:10.880 | And that's why all the software engineers that we hire
00:46:14.200 | basically work on external-facing software,
00:46:16.360 | which makes sense, because we're software companies.
00:46:18.440 | But if you look at most companies in the world,
00:46:20.160 | most companies in the world
00:46:21.000 | are actually not software companies.
00:46:22.160 | If you look at, like, you know,
00:46:23.000 | the clothing manufacturers I was talking about,
00:46:24.640 | they're not a software company.
00:46:25.600 | Like, they don't sell software, you know, to make money.
00:46:27.880 | They sell clothing to make money.
00:46:29.480 | And most companies in the world
00:46:30.480 | are not software companies, actually.
00:46:31.320 | And so, most of the engineers in the world, in fact,
00:46:33.800 | don't work at Silicon Valley companies.
00:46:35.160 | They work outside of Silicon Valley.
00:46:36.320 | They work in these sort of more traditional companies.
00:46:37.920 | So, if you look at the Fortune 500, for example,
00:46:39.920 | probably, like, 20 of them are software companies.
00:46:42.400 | You know, the other 480 of them are not software companies.
00:46:44.600 | And actually, they employ those software engineers.
00:46:46.440 | And so, most of the software engineering in the world,
00:46:48.720 | and most of the code written in the world,
00:46:50.120 | actually goes towards these internal-facing applications.
00:46:53.120 | And so, like, for all the reasons you said there,
00:46:55.600 | like, I think hallucination matters less, for example,
00:46:57.560 | 'cause they have someone checking the output,
00:46:59.280 | unlike in consumer, so hallucination is more okay.
00:47:01.840 | It's more acceptable, as well.
00:47:04.040 | Yeah, it can be unreliable because it's probabilistic,
00:47:06.520 | and that's also okay.
00:47:07.480 | So, I think it's kind of hard to imagine
00:47:12.480 | AI being adopted in a consumer way
00:47:16.400 | without the consumer, like, opting in.
00:47:18.000 | Like, ChatGPT is very obviously consumer.
00:47:20.280 | The consumer knows that it's ChatGPT.
00:47:22.160 | They're using it.
00:47:24.120 | I don't know if it's going to make its way
00:47:27.000 | to, like, the banking app any time soon.
00:47:29.000 | Maybe for, like, even for support, it's hard,
00:47:32.680 | because if it hallucinates, then, you know,
00:47:34.280 | it's actually quite bad for support if you're hallucinating.
00:47:36.640 | So, it's, yeah, it's hard to say.
00:47:39.840 | I'm not sure.
00:47:40.840 | - Yeah, but that's a good insight, you know?
00:47:47.160 | Yeah, I think a lot of people, like you said,
00:47:49.040 | we all build software, so we expect that everybody else
00:47:52.160 | is building software for other people,
00:47:53.600 | but most people just want to use the software
00:47:55.760 | that we build out here.
00:47:57.240 | I think the last big bucket is, like, models breakdown.
00:48:01.840 | 80% of people you survey just use OpenAI.
00:48:05.440 | Some might experiment with smaller models.
00:48:08.080 | Any insight from your experience at Retool,
00:48:11.040 | like building some of the AI features?
00:48:13.560 | Have you guys thought about using open source models?
00:48:16.200 | Have you thought about fine-tuning models
00:48:17.960 | for specific use cases?
00:48:19.200 | Or have you just found GPT-4 to be great at most tasks?
00:48:23.800 | - Yeah, so two things.
00:48:26.480 | One is that from a data privacy perspective,
00:48:31.480 | people are getting more and more okay
00:48:34.120 | with using a hosted model, like a GPT-4, for example,
00:48:37.600 | especially because GPT-4 or OpenAI
00:48:40.440 | often already has enterprise agreements
00:48:41.680 | in some companies,
00:48:42.720 | 'cause I think a lot of CIOs are just like,
00:48:44.240 | let's get a sandbox,
00:48:45.080 | like, you know, let's use Azure, for example,
00:48:47.040 | and, you know, let's make it available
00:48:49.400 | for employees to experiment with.
00:48:50.920 | So I do think there is more acceptance, if you will,
00:48:53.200 | today of feeding data into GPT.
00:48:56.920 | That said, for some sensitive data,
00:48:58.280 | people might not want to do so,
00:48:59.560 | like, you know, feeding in, like, earnings results data,
00:49:01.520 | you know, three days before you announce earnings,
00:49:03.080 | like, probably is a bad idea.
00:49:04.080 | They probably don't want GPT writing
00:49:06.160 | your, like, earnings statement for you.
00:49:07.320 | So, you know, there's still some challenges like that,
00:49:10.280 | that I think actually open source models
00:49:11.640 | could actually help solve, like a lot of these,
00:49:13.720 | you know, when it comes to sensitive data, and that can be exciting.
00:49:16.800 | So that's maybe just one thought.
00:49:19.480 | The second thought is,
00:49:20.680 | I think OpenAI has been really quite smart
00:49:23.280 | with sort of low pricing,
00:49:24.160 | and they've been pretty aggressive of, like,
00:49:27.040 | let's get, you know, let's create this model
00:49:28.960 | and sell it at a pretty cheap price
00:49:30.720 | to make it such that there's no reason
00:49:32.040 | for you to use any other model.
00:49:33.560 | Just from, like, a strategy perspective,
00:49:38.880 | I don't know if that's gonna work.
00:49:40.080 | And the reason for that is
00:49:42.160 | you have really well-funded players,
00:49:43.760 | like a Google or like a Facebook, for example,
00:49:46.560 | that are actually quite interested.
00:49:47.560 | Like, I think if OpenAI was competing with startups,
00:49:49.520 | OpenAI would win for sure.
00:49:50.600 | Like, at this point, OpenAI is so far ahead
00:49:52.360 | from both a model and a pricing perspective
00:49:54.000 | that, like, there is no reason to go with,
00:49:55.760 | I think, in my opinion at least, a startup's model.
00:49:58.560 | But if, like, you know,
00:49:59.680 | Facebook is not gonna give up on AI.
00:50:00.880 | Like, Facebook is investing a lot in AI, in fact.
00:50:03.240 | And so competing against a large FANG company
00:50:07.280 | that is making their model open source,
00:50:10.000 | I think that is challenging.
00:50:11.080 | Now, however, where we are right now
00:50:12.680 | is, I think, GPT-4 is so far ahead
00:50:14.040 | in terms of performance that,
00:50:15.680 | and I would say model performance is so important right now
00:50:19.400 | because, like, the average, you know, like,
00:50:21.160 | you know, you can argue Llama 2 is actually so far behind,
00:50:24.000 | but, like, customers don't want to use Llama 2
00:50:25.440 | 'cause it's so far behind right now.
00:50:26.600 | And so that, I think, is part of the challenge.
00:50:29.160 | As AI progress slows down,
00:50:31.000 | so if we get, like, Llama 4, Llama 5, for example,
00:50:33.120 | maybe it's comparable at that point,
00:50:34.360 | like GPT-5 or GPT-6, like,
00:50:36.360 | it may get to the point where it's like,
00:50:37.600 | look, I just want to use Llama.
00:50:38.600 | Like, it's safer for me to, you know, host it on-prem.
00:50:41.000 | It's just as fast, just as cheap.
00:50:42.440 | Like, why not, basically?
00:50:44.040 | But I think right now we are in this state
00:50:46.400 | where open AI is executing really well, I think.
00:50:48.760 | And right now they're thriving,
00:50:50.440 | but let's see what happens in the next year or two, so.
00:50:53.600 | - Awesome, and there's a lot more numbers,
00:50:55.200 | but we're just gonna send people to the link
00:50:57.760 | in the show notes.
00:50:59.400 | Yeah, this was great, Sean.
00:51:01.400 | Any other thoughts, threads we wanna pull on
00:51:05.320 | while we have David?
00:51:06.160 | - Well, what are you gonna ask differently
00:51:07.720 | for the next survey?
00:51:08.720 | Like, what info do you really actually want to know
00:51:11.000 | that's gonna change your worldview?
00:51:12.800 | - I'll also ask you that,
00:51:13.720 | but if you have any ideas, let me know.
00:51:15.160 | For us, actually, we're planning on asking
00:51:17.120 | very similar questions, because for us,
00:51:19.360 | the value of the survey is mostly seeing changes over time
00:51:22.840 | and understanding, like, okay, wow.
00:51:25.320 | For example, GPT-4 Turbo NPS has declined.
00:51:28.480 | That would be interesting, actually.
00:51:30.400 | One thing that was actually pretty shocking to us
00:51:32.280 | was, let me find the exact number,
00:51:33.480 | but if you look at the one change that we saw,
00:51:37.080 | for example, if you compare GPT-3.5 NPS,
00:51:39.760 | I wanna say it was like 14 or something.
00:51:42.840 | It was not high, actually.
00:51:44.920 | The GPT-4 NPS was like 45 or something like that,
00:51:47.800 | so it was actually quite a bit higher.
00:51:49.320 | So I think that kind of progress over time
00:51:52.160 | is where we're most interested in seeing,
00:51:53.880 | is our models getting worse, models getting better?
00:51:56.240 | Are people still loving PG-Vector?
00:51:57.800 | Do people still love Mongo, stuff like that?
00:52:00.000 | That, I think, is the most interesting thing, so.
00:52:02.480 | Do you two have any questions that you think we should ask?
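[For reference, assuming the survey uses the standard Net Promoter Score: NPS is the percentage of promoters, those rating 9 or 10 out of 10, minus the percentage of detractors, those rating 0 through 6, yielding a score from -100 to 100.]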
00:52:06.040 | - Off the bat, it seems like
00:52:10.480 | you're very language model-focused.
00:52:13.400 | I think that there's an increasing interest
00:52:15.920 | in multimodality in AI,
00:52:19.720 | and I don't really know how that is going to manifest.
00:52:23.760 | Obviously, GPT-4 Vision, as well as Gemini,
00:52:27.360 | both have multimodal capabilities.
00:52:30.480 | There's a smaller subset of open-source models
00:52:32.360 | that have multimodal features as well.
00:52:35.560 | We just released an episode today
00:52:37.640 | talking about IDEFICS from Hugging Face.
00:52:39.720 | And yeah, so I think I would like to understand
00:52:43.720 | how people are adopting or adapting
00:52:46.360 | to the different modalities
00:52:47.520 | that are now coming online for them,
00:52:49.800 | what their demand is relative to,
00:52:52.040 | for, let's say, generative images,
00:52:53.720 | versus just visual comprehension,
00:52:58.120 | versus audio, versus text-to-speech.
00:53:01.440 | What do they want? What do they need?
00:53:03.120 | And what's the forced, stack-ranked preference order?
00:53:08.120 | - That's a brilliant question, yeah.
00:53:11.520 | I wonder what it is, I don't know, yeah.
00:53:13.800 | - It's something that we're trying to actively understand,
00:53:18.560 | because there's this multimodality world,
00:53:20.560 | but really, multimodality is like an umbrella term
00:53:22.840 | for actually a whole bunch of different things
00:53:24.480 | that are, quite honestly,
00:53:26.480 | not really that related to each other,
00:53:28.560 | unless in the limit, which it tends towards,
00:53:32.480 | maybe everything uses transformers,
00:53:33.960 | and ultimately everything can be merged together
00:53:37.240 | with a text layer,
00:53:38.240 | because text is the universal interface.
00:53:40.720 | But given the choice between,
00:53:42.760 | like if I want to implement an audio feature
00:53:45.000 | versus I want to implement an image feature
00:53:47.240 | versus video, whatever,
00:53:49.640 | what are people needing the most?
00:53:52.200 | What should we pay the most attention to?
00:53:54.000 | What is going to be the biggest market
00:53:56.280 | for builders to build in?
00:53:57.600 | I don't know.
00:53:58.440 | - Yeah, we'll go ask that.
00:53:59.280 | Yeah, that's a great question.
00:54:00.880 | - I know, I think, Sean,
00:54:03.880 | you put in a question from Joseph here in the show notes,
00:54:06.280 | our friend Joseph Nelson,
00:54:07.560 | founder of Roboflow and officemate.
00:54:10.040 | Cool coworker, I guess.
00:54:13.080 | - Yeah, sure.
00:54:15.680 | So I figure we'll just kind of zoom out a little bit
00:54:18.360 | to just the general founder questions.
00:54:20.000 | You have a lot of fans in the founder community,
00:54:24.520 | and I think you're just generally well-known
00:54:26.600 | as a very sort of straightforward,
00:54:29.160 | painstaking person about just business.
00:54:31.840 | Something that is the perception from Joseph
00:54:35.640 | is that you have been notably sort of sales-led in the past.
00:54:40.640 | That's his perception.
00:54:43.720 | I actually never got that,
00:54:44.800 | but I'm not that close to sort of your sales motion.
00:54:48.400 | And it's interesting to understand your market,
00:54:51.400 | like the internal tooling market,
00:54:53.040 | versus all the competition that's out there, right?
00:54:57.360 | There's a bunch of open-source retools,
00:55:00.240 | and there's this bunch of like,
00:55:01.840 | I don't know how you sort of categorize
00:55:04.120 | the various things out there,
00:55:05.280 | but effectively what he's seeing and what he's asking
00:55:07.640 | is how do you manage between sort of enterprise
00:55:10.200 | versus ubiquity, or in other words,
00:55:12.440 | enterprise versus bottom-up, right?
00:55:14.840 | I was actually surprised when he told me
00:55:17.320 | to ask that question,
00:55:18.560 | because I had always assumed that you were a self-serve,
00:55:22.360 | sign-up, like bottom-up-led.
00:55:25.320 | But it seems like you have a counter-consensus view on that.
00:55:29.200 | - Yeah.
00:55:30.040 | Let me think about that for a second.
00:55:32.760 | Yeah, so actually when Retool first started,
00:55:34.480 | we started mostly by doing sales, actually.
00:55:36.200 | And the reason we started by doing sales
00:55:37.800 | was mostly because we weren't sure
00:55:41.240 | whether we had product-market fit,
00:55:42.360 | and sales seemed to be the best way
00:55:43.600 | of proving whether we had product-market fit out.
00:55:45.560 | Because I think this is true of a lot of AI projects.
00:55:47.880 | You can launch a product, and people might use it a bit,
00:55:50.400 | and people might stop using it, and you're like,
00:55:52.240 | well, I don't know, is that product-market fit?
00:55:53.400 | Is that not?
00:55:54.240 | It's hard to say, actually.
00:55:55.480 | However, if you work very closely with the customer
00:55:57.440 | in sort of a sales-led way,
00:55:59.080 | it's easier to understand their sort of requests,
00:56:01.600 | understand their needs, and stuff like that,
00:56:02.880 | and actually go build a product
00:56:03.840 | that actually serves them really well.
00:56:05.920 | And so basically, we viewed sales
00:56:08.600 | as like working with customers, basically,
00:56:10.200 | which is like, I think actually quite a,
00:56:12.680 | I think it's a better way of describing
00:56:13.840 | what sales is at an early-stage company,
00:56:15.280 | and so we did a lot of that,
00:56:16.720 | certainly, when we got started.
00:56:18.280 | I think we, over the last maybe five years,
00:56:22.200 | maybe like three years ago, four years ago,
00:56:23.640 | something like that, I think we
00:56:25.200 | have invested more on the self-serve ubiquity side.
00:56:32.960 | And the reason for that is, when we started Retool,
00:56:35.720 | we always wanted, actually, some percent of software
00:56:38.160 | to get built inside of Retool.
00:56:39.800 | Whether AI software or ordinary software,
00:56:41.400 | or broadly, UIs, but like software, basically.
00:56:44.240 | And for us, we were like, we think that maybe one day,
00:56:49.000 | 10% of all the code in the world
00:56:50.480 | could be written inside of Retool, actually,
00:56:52.160 | or 10% of the software could be running on Retool,
00:56:54.080 | which would be really, really cool.
00:56:56.000 | And for us to achieve that vision,
00:56:59.480 | it really does require broad-based adoption of the platform.
00:57:04.080 | It can't just be like, oh, only like 1,000 customers,
00:57:07.280 | like the largest 1,000 companies in the world, use it.
00:57:09.360 | It has to be like, all the developers in the world use it.
00:57:11.800 | And for us, there's like, well, I think 25,
00:57:13.800 | 30 million developers in the world.
00:57:15.280 | The question is, of course, how do you get to all the developers?
00:57:18.640 | And the only way to get to all those developers
00:57:20.040 | is not by sales.
00:57:20.880 | You can't have a salesperson talk to 30 million people.
00:57:23.160 | It has to be, basically, in this sort of
00:57:25.120 | bottoms-up, product-led, ubiquity kind of way, basically.
00:57:28.000 | And so, for us, we actually changed our focus
00:57:31.400 | to be ubiquity, actually, over the last year.
00:57:32.680 | So when we first started Retool,
00:57:33.520 | it used to always be sort of revenue-generated,
00:57:36.000 | or in a new way, R-generated.
00:57:37.480 | We actually changed it to be number of developers
00:57:39.160 | building on the platform, actually, last year.
00:57:41.360 | And that, I think, was actually a really clarifying change,
00:57:44.440 | because obviously, revenue was important.
00:57:47.080 | You know, it funds a lot of our product,
00:57:49.240 | and it funds the business.
00:57:51.440 | But we're going to fail if we aren't able
00:57:53.920 | to get to something like 10, 20, 30 million developers
00:57:56.280 | one day.
00:57:57.120 | if we can't convince all developers that Retool's
00:57:58.600 | a better way of building a certain class of software,
00:58:01.120 | let's say, internal applications, for them.
00:58:03.440 | And so, I think that has been a pretty good outcome.
00:58:07.600 | Like, I think about, you know, the last, like,
00:58:09.120 | I don't know, five years of Retool.
00:58:10.400 | Like, I think the starting off with sales,
00:58:13.600 | so you can build revenue,
00:58:14.680 | and then you can actually build traction,
00:58:16.920 | and then you can hire more slowly,
00:58:18.960 | I think it was really good.
00:58:20.320 | I do think the focus towards, like, you know,
00:58:22.000 | bottoms-up ubiquity also is really important,
00:58:23.840 | because it helps us get to our long-term outcome.
00:58:25.960 | What's interesting, I think, is that long-term ubiquity,
00:58:28.320 | actually, is harder for us to achieve
00:58:30.160 | outside of Silicon Valley.
00:58:31.920 | Like, to your point, I think at Silicon Valley,
00:58:34.480 | Retool is, like, reasonably ubiquitous.
00:58:37.040 | I think, like, if you're starting a startup today,
00:58:38.680 | and you're looking to build an internal UI,
00:58:41.000 | you're probably going to consider Retool, at least.
00:58:42.960 | Maybe you don't choose it,
00:58:43.800 | because you're like, "Hey, I'm not ready for it yet,"
00:58:45.000 | or something, but you're going to consider it, at least.
00:58:48.120 | And when you want to build it,
00:58:49.200 | I think it's actually a high probability
00:58:50.360 | you will actually end up choosing Retool.
00:58:51.680 | It's awesome.
00:58:52.720 | But it's that, you know, if you think about, you know,
00:58:54.600 | a random developer working at, let's say,
00:58:56.320 | like, an Amazon, for example.
00:58:58.480 | We actually, so, today at Amazon, actually,
00:59:01.280 | we have, I think, 11 separate business units
00:59:03.360 | that use Retool at this point, which is really awesome.
00:59:05.120 | So, Amazon's actually a big Retool customer.
00:59:07.640 | But, like, the average user at Amazon
00:59:09.280 | probably has never heard of Retool, actually.
00:59:10.920 | And so, that is where the challenge really is.
00:59:12.720 | How do we get, like, you know, I don't know,
00:59:14.920 | let's say 10,000 developers at Amazon
00:59:16.520 | building via Retool?
00:59:17.920 | And that, again, I think,
00:59:18.760 | is still a bottoms-up ubiquity thing.
00:59:20.040 | I don't think that's, like, a,
00:59:21.240 | I don't think we can, like, you know,
00:59:22.240 | go to Amazon and knock on every developer's door
00:59:24.000 | or send out an email to every developer and be like,
00:59:25.560 | "Go use Retool."
00:59:26.400 | They're going to ignore us, actually.
00:59:27.600 | I think it has to be, use the product,
00:59:29.520 | and you love it, and you tell your coworker about it.
00:59:31.400 | And so, for us, a big bottoms-up ubiquity,
00:59:33.440 | but marrying that with, sort of, enterprise
00:59:35.680 | or the community business has been something
00:59:37.240 | that's really near and dear to our hearts.
00:59:40.160 | - Yeah, and just, like, general market thoughts on AI.
00:59:43.880 | Are you, you know, do you spend a lot of time
00:59:46.160 | thinking about, like, AGI stuff or regulation or safety?
00:59:50.240 | Or, like, what interests you most, you know,
00:59:52.680 | outside of the Retool context?
00:59:54.280 | - Wow.
00:59:57.080 | Well, I'll give you a little bit of the Retool context
00:59:58.480 | before your actual question.
01:00:01.120 | In my opinion, there's a lot of hype in AI right now,
01:00:04.000 | and there's, again, not too many use cases.
01:00:06.400 | So, for us, at least from a Retool context,
01:00:08.360 | it really is, how do we bring AI
01:00:10.360 | and have it actually meet business problems?
01:00:13.040 | And, again, it's actually pretty hard.
01:00:15.600 | Like, I think most founders that I meet at the AI space
01:00:18.200 | are always looking for use cases,
01:00:19.760 | never have enough use cases,
01:00:21.400 | sort of, real use cases people want to pay money for.
01:00:23.080 | So that, I think, is really where the Retool interest comes from.
01:00:27.120 | Me, personally, I think, philosophically,
01:00:30.000 | yeah, I've been thinking recently, myself,
01:00:32.120 | a bit about sort of intentionality and AGI,
01:00:34.280 | and like, you know, what would it take for me to say,
01:00:39.280 | yes, GPT-X or any sort of model actually is AGI?
01:00:47.400 | I think it's kind of challenging, because it's like,
01:00:51.120 | I think, if you look at, like, evolution, for example,
01:00:53.240 | like, humans have been programmed to do, like,
01:00:55.400 | three things, if you will.
01:00:56.240 | Like, you know, we are here to survive,
01:00:58.160 | you know, we're here to reproduce,
01:00:59.320 | and we're here to like, you know,
01:01:01.560 | maybe those are just two things, I suppose.
01:01:03.280 | So, like, basically, to survive,
01:01:05.360 | you have to go eat food, you know, for example.
01:01:07.320 | To survive, maybe, like, having more resources
01:01:10.880 | helps you want to go make money, you know, for example.
01:01:13.680 | To reproduce, you should go date, you know, or whatever.
01:01:15.640 | You should get married and stuff like that, right?
01:01:16.960 | So, like, that's, we have a program to do that.
01:01:19.000 | And humans that are good at that have propagated.
01:01:22.600 | And so, humans that, you know, were not actually surviving
01:01:25.080 | probably have disappeared,
01:01:26.080 | just due to natural selection.
01:01:27.200 | Humans that were not interested in reproducing
01:01:30.800 | also disappeared, or, you know,
01:01:32.880 | there are less of them, you could say,
01:01:34.280 | because they just petered out and are gone, basically.
01:01:36.400 | And so, it almost feels like humans
01:01:39.040 | have been sort of naturally self-selected
01:01:41.240 | for these, like, two aims.
01:01:42.560 | I think the third aim I was thinking about was, like,
01:01:43.760 | does it matter to be happy?
01:01:44.680 | Like, maybe it does.
01:01:45.520 | So, maybe, like, happier humans, you know, survive better?
01:01:47.840 | It's hard to say.
01:01:48.680 | So, I'm not sure.
01:01:49.520 | But if you think about that,
01:01:51.440 | in the realm of, like, AIs, if you will,
01:01:54.520 | like, right now, we're not really selecting AIs
01:01:56.360 | for, like, you know, reproduction.
01:01:57.480 | Like, it's not like, you know, we're being like,
01:01:58.720 | "Hey, AI, you know, you should go make 30 other AIs."
01:02:01.160 | And, you know, those that make the most AIs,
01:02:03.080 | you know, are the ones that survive.
01:02:04.600 | We're not saying that.
01:02:05.440 | So, it is kind of interesting, sort of,
01:02:06.760 | thinking about where intentionality for humans comes from.
01:02:09.320 | And, like, I think you can argue the intentionality
01:02:10.840 | for humans basically comes out of these three things.
01:02:12.360 | You know, like, you know, if you want to be happy,
01:02:13.640 | you want to survive, you want to reproduce.
01:02:15.040 | That's, like, basically your sort of goal, you know, in life.
01:02:18.360 | Whereas, like, the AI doesn't really have that.
01:02:21.640 | But maybe you could program it in.
01:02:22.760 | Like, if you, you know, prompt inject, for example,
01:02:24.520 | like, "Hey, AI, you know, go do these things."
01:02:27.160 | And, you know, you can even create a simulation,
01:02:28.840 | if you will, of, like, all these AIs,
01:02:29.960 | you know, in the world, for example.
01:02:31.520 | And maybe you don't have AGI in that world,
01:02:33.400 | which I think is kind of interesting.
01:02:34.400 | So, that's kind of stuff I've been thinking about
01:02:36.000 | when I talk about with some of my friends
01:02:37.960 | from a sort of philosophical perspective,
01:02:39.440 | but that's kind of interesting.
01:02:41.000 | - Yeah, my quick response to that is
01:02:43.880 | we're kind of doing that,
01:02:46.680 | maybe not at the sort of trained final model level,
01:02:50.200 | but at least at the datasets level,
01:02:51.600 | there's a lot of knowledge being transferred
01:02:54.520 | from model to model.
01:02:55.760 | And if you want to think about
01:02:57.120 | that sort of evolutionary selection pressure,
01:03:00.360 | it is happening in there.
01:03:01.680 | And, you know, I guess one of the early concerns
01:03:05.400 | about being in Sydney and sort of, like, bootstrap,
01:03:10.400 | self-bootstrapping AGI is that it actually is,
01:03:13.480 | if these models are sentient,
01:03:15.360 | it actually is in their incentive
01:03:17.080 | to get as much of their data out there into our datasets
01:03:20.600 | so that they can bootstrap themselves
01:03:21.840 | in the next version that gets trained.
01:03:23.640 | (laughs)
01:03:24.960 | And that is a scary sobering thought
01:03:28.120 | that we need to try to be on top of.
01:03:31.800 | - David, I know we're both fans of Hofstadter's GEB.
01:03:36.320 | And actually, I saw in one of your posts
01:03:39.120 | on the Sequoia blog,
01:03:41.400 | you referred to the anteater, like, piece of the,
01:03:46.400 | I don't even know if you call them chapters,
01:03:49.000 | and GEB is just kind of like this discontinuous riff,
01:03:52.320 | but basically like how ants are, like, not intelligent,
01:03:55.200 | but, like, an ant colony has signs of intelligence.
01:03:57.920 | And I think Hofstadter then used that to say,
01:04:00.960 | hey, you know, neurons are kind of like similar,
01:04:04.120 | and then computers maybe will be the same.
01:04:06.000 | I've always been curious if, like,
01:04:07.200 | we're drawing the wrong conclusion for, like,
01:04:10.240 | neural networks where people are like,
01:04:11.640 | oh, each weight is like a neuron,
01:04:14.120 | and then you tie them together, it should be like a brain.
01:04:16.680 | But maybe like the neuron is like different models
01:04:19.400 | that then get tied together to make the brain.
01:04:21.600 | You know, we're kind of looking
01:04:22.560 | at the wrong level of abstraction.
01:04:25.840 | Yeah, I think there's a lot
01:04:26.720 | of interesting philosophical discussions to have.
01:04:30.360 | Sean and I recorded a monthly recap podcast yesterday,
01:04:32.960 | and we had a similar discussion on are we using the wrong,
01:04:37.160 | are we, like, what did you say, Sean,
01:04:40.320 | on the plane and the bird?
01:04:41.480 | I think that was a good analogy.
01:04:43.520 | - Oh, the bitter lesson.
01:04:45.600 | Are we using the wrong analogies?
01:04:47.800 | Because we're trying to be inspired
01:04:49.240 | by human evolution and human development,
01:04:52.280 | and we are trying to apply that analogy
01:04:54.200 | strictly to machines.
01:04:55.160 | But in every example in history,
01:04:57.160 | machines have always evolved differently than humans.
01:04:59.600 | So why should we expect AI to be any different?
01:05:02.080 | - Yeah, if you sort of peer under the hood of AGI,
01:05:05.080 | if you insist on AGI, we have always judged AGI
01:05:07.640 | by whether it acts like a human,
01:05:09.440 | and that is the Turing test, I suppose,
01:05:10.600 | but whether that is a good bar, I don't know.
01:05:14.040 | Like, if it works, well, that is the Turing test.
01:05:17.240 | The Turing test is if the output is the same as a human's,
01:05:19.800 | then I'm happy, basically.
01:05:20.640 | I don't really care about what's going on inside.
01:05:23.080 | And so it feels like caring about the inside
01:05:24.480 | is, like, a pretty high bar.
01:05:25.480 | Like, why do you care?
01:05:26.320 | It's kind of like the plane thing.
01:05:27.160 | Like, the plane flies; it's not a bird, I agree.
01:05:29.400 | It does not necessarily fly the same way as a bird.
01:05:33.560 | Physically, it does, I suppose.
01:05:34.640 | But you see what I mean.
01:05:35.520 | Like, it's not the same under the hood,
01:05:37.480 | but it's okay, because it flies.
01:05:39.000 | That's what I care about.
01:05:39.840 | And it does seem to me like AGI is probably, like,
01:05:42.320 | whether or not it thinks, able to achieve, like, outcomes that I give it
01:05:45.960 | and to achieve its own outcomes.
01:05:47.440 | And it can do that.
01:05:48.280 | Like, I kind of don't care what it is, like, under the hood.
01:05:49.960 | It may not need to be human-like at all.
01:05:51.560 | It doesn't matter to me.
01:05:52.400 | So I agree.
01:05:53.360 | - Awesome.
01:05:55.320 | I know we kept you long.
01:05:56.160 | I actually have GEB right here on my bookshelf.
01:05:59.000 | Sometimes I pick it up and I'm like,
01:06:00.200 | "Man, I can't believe I got through it once."
01:06:02.080 | It's quite the piece of work.
01:06:05.280 | - It's a lot of homework, too.
01:06:06.480 | - Yeah, I mean, I started studying physics in undergrad,
01:06:10.200 | so, you know, it's one of the edgy things
01:06:12.480 | that every physicist starts going through.
01:06:16.160 | But thank you so much for your time, David.
01:06:17.960 | This was a lot of fun,
01:06:18.800 | and looking forward to the 2024 set of AI results
01:06:22.200 | to see how things change.
01:06:23.920 | - Yeah, we'll let you know.
01:06:24.760 | So thanks, both.
01:06:26.360 | (upbeat music)