
Agents @ Work: Lindy.ai (with live demo!)


Chapters

0:00 Introductions
4:05 AI engineering and deterministic software
8:36 Lindys demo
13:21 Memory management in AI agents
18:48 Hierarchy and collaboration between Lindys
21:19 Vertical vs. horizontal AI tools
24:03 Community and user engagement strategies
26:16 Rickrolling incident with Lindy
28:12 Evals and quality control in AI systems
31:52 Model capabilities and their impact on Lindy
39:27 Competition and market positioning
42:40 Relationship between Factorio and business strategy
44:05 Remote work vs. in-person collaboration
49:03 Europe vs US Tech
58:59 Testing the Overton window and free speech
64:20 Balancing AI safety concerns with business innovation

Whisper Transcript

00:00:00.000 | (upbeat music)
00:00:02.580 | - Hey everyone.
00:00:04.740 | Welcome to the Latent Space Podcast.
00:00:06.360 | This is Alessio, partner and CTO at Decibel Partners,
00:00:09.040 | and I'm joined by my co-host, Swyx, founder of Smol AI.
00:00:12.060 | - Hey, and today we're joined in the studio
00:00:13.720 | by Florent Crivello.
00:00:15.080 | Welcome.
00:00:15.920 | - Hey, yeah, thanks for having me.
00:00:17.080 | - Also known as Altimor.
00:00:19.160 | Always wanted to ask, what is Altimor?
00:00:20.960 | - It was the name of my character
00:00:22.640 | when I was playing Dungeons and Dragons.
00:00:23.960 | - Always.
00:00:24.800 | - I was like 11 years old.
00:00:25.840 | - What was your class?
00:00:26.960 | - I was an elf.
00:00:27.800 | I was a magician elf.
00:00:28.840 | - Okay, all right.
00:00:30.280 | Well, you're still spinning magic.
00:00:31.520 | Right now you're a solo founder/CEO of Lindy.ai.
00:00:35.040 | What is Lindy?
00:00:36.320 | - Yeah, we are a no-code platform
00:00:38.000 | letting you build your own AI agents easily.
00:00:40.080 | So you can think of us, to LangChain,
00:00:42.520 | as Airtable is to MySQL.
00:00:44.560 | Like you can just spin up AI agents super easily
00:00:47.000 | by clicking around and no code required.
00:00:49.680 | You don't have to be an engineer
00:00:50.840 | and you can automate business workflows
00:00:52.440 | that you simply could not automate before in a few minutes.
00:00:55.640 | - You've been in our orbit a few times.
00:00:57.360 | I think you spoke at our Latent Space anniversary.
00:01:00.440 | You spoke at my summit, the first summit,
00:01:03.400 | which was a really good keynote.
00:01:05.200 | And most recently,
00:01:06.480 | like we actually already scheduled this podcast
00:01:08.720 | before this happened,
00:01:09.560 | but Andrew Wilkinson was like,
00:01:10.880 | "I'm obsessed by Lindy."
00:01:13.280 | He's just created a whole bunch of agents.
00:01:14.720 | So basically, why are you blowing up?
00:01:16.240 | - Well, thank you.
00:01:17.080 | I think we are having a little bit of a moment.
00:01:18.520 | I think it's a bit premature to say we're blowing up,
00:01:20.480 | but why are things going well?
00:01:22.240 | We revamped the product majorly.
00:01:24.000 | We called it Lindy 2.0.
00:01:25.520 | I would say we started working on that six months ago.
00:01:27.960 | We've actually not really announced it yet.
00:01:29.760 | It's just, I guess, I guess that's what we're doing now.
00:01:32.200 | (laughs)
00:01:33.400 | And so we've basically been cooking for the last six months,
00:01:35.680 | like really rebuilding the product from scratch.
00:01:37.600 | I think, Alessio, actually,
00:01:38.560 | the last time you tried the product,
00:01:39.880 | it was still Lindy 1.0.
00:01:41.160 | - Oh yeah, it was.
00:01:42.000 | - If you log in now, the platform looks very different.
00:01:44.360 | There's like a ton more features.
00:01:45.800 | And I think one realization that we made,
00:01:48.360 | and I think a lot of folks in the agent space
00:01:50.600 | made the same realization,
00:01:51.800 | is that there is such a thing as too much of a good thing.
00:01:55.000 | I think many people, when they started working on agents,
00:01:57.520 | they were very LLM-pilled and ChatGPT-pilled, right?
00:02:01.360 | They got ahead of themselves in a way, us included,
00:02:03.840 | and they thought that agents were actually,
00:02:06.240 | and LLMs were actually more advanced
00:02:08.200 | than they actually were.
00:02:09.600 | And so the first version of Lindy was like
00:02:11.400 | just a giant prompt and a bunch of tools.
00:02:14.320 | And then the realization we had was like,
00:02:15.960 | hey, actually, the more you can put your agent on Rails,
00:02:18.760 | one, the more reliable it's going to be, obviously,
00:02:21.280 | but two, it's also going to be easier to use for the user,
00:02:23.760 | because you can really, as a user,
00:02:25.000 | you get, instead of just getting this big, giant,
00:02:27.600 | intimidating text field, and you type words in there,
00:02:30.000 | and you have no idea if you're typing the right words or not,
00:02:32.560 | here you can really click and select step-by-step
00:02:35.320 | and select, tell your agent what to do,
00:02:37.480 | and really give as narrow or as wide a guardrail
00:02:40.760 | as you want for your agent.
00:02:42.360 | We started working on that.
00:02:43.520 | We called it Lindy on Rails about six months ago.
00:02:46.040 | And we started putting it into the hands of users
00:02:48.720 | over the last, I would say, two months or so.
00:02:50.560 | And that's, I think things really started going
00:02:53.160 | pretty well at that point.
00:02:54.520 | The agent is way more reliable, way easier to set up,
00:02:57.040 | and we're already seeing a ton of new use cases pop up.
00:02:59.440 | - Yeah.
00:03:00.400 | Just a quick follow-up on that.
00:03:01.700 | You launched the first Lindy in November last year,
00:03:05.160 | and you were already talking about having a DSL, right?
00:03:08.320 | I remember having this discussion with you,
00:03:10.200 | and you were like, it's just much more reliable.
00:03:12.240 | Is it still the DSL under the hood?
00:03:14.000 | Is this a UI-level change, or is it a bigger rewrite?
00:03:16.960 | - No, it is a much bigger rewrite.
00:03:18.780 | I'll give you a concrete example.
00:03:20.160 | Suppose you want to have an agent that
00:03:22.200 | observes your Zendesk tickets, OK?
00:03:24.320 | And it's like, hey, every time you receive a Zendesk ticket,
00:03:26.820 | I want you to check my knowledge base--
00:03:28.440 | so it's like a RAG module and whatnot--
00:03:30.800 | and then answer the ticket.
00:03:32.640 | The way it used to work with Lindy before
00:03:35.040 | was you would type the prompt asking it to do that.
00:03:38.260 | Every time you receive a Zendesk ticket,
00:03:39.920 | you check my knowledge base, and so on and so forth.
00:03:42.160 | The problem with doing that is that it can always go wrong.
00:03:44.600 | Like, you're praying the LLM gods that they will actually
00:03:47.440 | invoke your knowledge base, but I don't want to ask it.
00:03:50.040 | I want it to always, 100% of the time,
00:03:52.040 | consult the knowledge base after it receives a Zendesk ticket.
00:03:54.400 | And so with Lindy, you can actually
00:03:55.880 | have the trigger, which is Zendesk ticket received,
00:03:58.280 | have the knowledge base consult, which is always there,
00:04:00.600 | and then have the agent.
00:04:01.880 | So you can really set up your agent any way you want like
00:04:04.840 | that.
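A minimal sketch of the "on rails" idea above, in Python. The helpers here (Ticket, search_knowledge_base, call_llm) are hypothetical placeholders rather than Lindy's internals; the point is that the knowledge-base lookup is ordinary code that always runs, and the model is only asked to draft the reply:

```python
# Hedged sketch of "agent on rails": the workflow order is fixed in code,
# so the knowledge-base lookup is guaranteed to run on every ticket.
# All names here (Ticket, search_knowledge_base, call_llm) are hypothetical.
from dataclasses import dataclass

@dataclass
class Ticket:
    subject: str
    body: str

def search_knowledge_base(query: str) -> list[str]:
    # Placeholder RAG step: a real system would hit a vector store here.
    return ["Refunds are processed within 5 business days."]

def call_llm(prompt: str) -> str:
    # Placeholder for a model call (e.g. Claude or GPT-4 via their SDKs).
    return "Hi! Refunds are processed within 5 business days."

def handle_zendesk_ticket(ticket: Ticket) -> str:
    # Step 1 (trigger): a Zendesk ticket was received; this function is the handler.
    # Step 2 (always runs): consult the knowledge base. No prompt can skip it.
    context = search_knowledge_base(ticket.subject + "\n" + ticket.body)
    # Step 3 (the only LLM step): draft an answer using the retrieved context.
    prompt = (
        "Answer this support ticket using only the context below.\n"
        f"Context: {context}\nTicket: {ticket.body}"
    )
    return call_llm(prompt)

print(handle_zendesk_ticket(Ticket("Refund status", "Where is my refund?")))
```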
00:04:05.340 | - This is something
00:04:06.760 | I think about for AI engineering as well, which
00:04:09.660 | is the big labs want you to hand over everything in the prompts
00:04:13.080 | and only code in English.
00:04:14.560 | And then the smaller brains, the GPU poors,
00:04:17.080 | always want to write more code to make
00:04:19.800 | things more deterministic, and reliable, and controllable.
00:04:22.480 | One way I put it is put Shoggoth in a box
00:04:24.560 | and make it a very small, like the minimal viable box.
00:04:27.040 | Everything else should be traditional,
00:04:28.520 | if this, then that software.
00:04:29.680 | - I love that characterization,
00:04:30.920 | put the Shoggoth in the box.
00:04:32.200 | Yeah, we talk about using as much AI as necessary
00:04:35.160 | and as little as possible.
00:04:37.040 | - And what was the thinking in choosing between this drag
00:04:40.420 | and drop, low code, whatever, versus super code-driven,
00:04:43.640 | maybe like the [INAUDIBLE] of the world?
00:04:46.760 | And maybe the flip side of it, which you don't really do,
00:04:49.280 | it's just text to agent.
00:04:51.360 | It's like build the workflow for me.
00:04:53.640 | Whatever you learn, actually putting this in front of users
00:04:56.640 | and figuring out how much do they actually
00:04:59.240 | want to edit versus how much--
00:05:01.160 | kind of like Ruby on Rails, instead of Lindy on Rails,
00:05:03.400 | it's kind of like defaults over configuration.
00:05:05.800 | - Yeah.
00:05:06.720 | I actually used to dislike when people said,
00:05:08.680 | oh, text is not a great interface.
00:05:10.920 | I was like, ah, this is such a mid-take.
00:05:12.220 | I think text is awesome.
00:05:13.320 | And I've actually come around.
00:05:14.560 | I actually sort of agree now that text is really not great.
00:05:18.040 | I think for people like you and me, because we sort of have
00:05:20.280 | a mental model, OK, when I type a prompt into this text box,
00:05:23.180 | this is what it's going to do.
00:05:24.400 | It's going to map it to this kind of data structure
00:05:26.560 | under the hood and so forth.
00:05:27.800 | I guess it's a little bit blackmailing towards humans.
00:05:30.320 | You jump on these calls with humans,
00:05:32.000 | and you're like, here's a text box.
00:05:33.560 | This is going to set up an agent for you.
00:05:35.240 | Do it.
00:05:36.140 | And then they type words like, I want
00:05:38.120 | you to help me put order in my inbox.
00:05:40.120 | Or actually, this is a good one.
00:05:41.600 | This is actually a good one.
00:05:42.760 | Well, that's a bad one.
00:05:44.320 | I would say 60% or 70% of the prompts that people type
00:05:47.160 | don't mean anything.
00:05:48.600 | Me as a human, as AGI, I don't understand what they mean.
00:05:51.120 | I don't know what they mean.
00:05:52.360 | It is actually, I think, whenever you can have a GUI,
00:05:55.160 | it is better than to have just a pure text interface.
00:05:58.160 | And then how do you decide how much to expose?
00:06:01.400 | So even with the tools, you have a bunch of them.
00:06:05.440 | You have Slack, you have Google Calendar, you have Gmail.
00:06:08.360 | Should people, by default, just turn over access to everything,
00:06:11.640 | and then you help them figure out what to use?
00:06:13.800 | I think that's the question.
00:06:15.040 | Because when I tried to set up Slack, it was like, hey,
00:06:17.640 | give me access to all channels and everything.
00:06:19.960 | Which, for the average person, probably makes sense,
00:06:22.200 | because you don't want to re-prompt them every time
00:06:23.920 | to add new channels.
00:06:25.000 | But at the same time, for maybe the more sophisticated
00:06:27.880 | enterprise use cases, people are like, hey,
00:06:29.760 | I want to really limit what you have access to.
00:06:32.760 | How do you thread that balance?
00:06:35.200 | The general philosophy is we ask for the least amount
00:06:38.680 | of permissions needed at any given moment.
00:06:41.280 | I don't think Slack--
00:06:42.360 | I could be mistaken, but I don't think Slack lets you request
00:06:45.020 | permissions for just one channel.
00:06:46.880 | But, for example, for Google, obviously there's
00:06:49.400 | hundreds of scopes that you could require for Google.
00:06:52.200 | There's a lot of scopes.
00:06:53.160 | And sometimes it's actually painful to set up your Lindy,
00:06:55.840 | because you're going to have to ask to Google and add scopes
00:06:58.520 | five or six times.
00:06:59.520 | Like, we've had sessions like this.
00:07:01.520 | But that's what we do, because, for example,
00:07:03.640 | the Lindy email drafter, she's going
00:07:05.280 | to ask you for your authorization once for,
00:07:07.480 | I need to be able to read your email so I can draft a reply.
00:07:10.040 | And then another time for, I need
00:07:11.400 | to be able to write a draft for them.
00:07:13.040 | So we just try to do it very incrementally like that.
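A rough sketch of the incremental-permissions approach in Python, using Google's documented include_granted_scopes parameter for incremental authorization. The client ID, redirect URI, and overall flow are illustrative only, not Lindy's implementation:

```python
# Minimal sketch of incremental OAuth with Google: request only the scope the
# current step needs, and set include_granted_scopes=true so newly granted
# scopes are merged with ones the user already approved.
# CLIENT_ID and REDIRECT_URI are placeholders.
from urllib.parse import urlencode

AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"
CLIENT_ID = "your-client-id.apps.googleusercontent.com"
REDIRECT_URI = "https://example.com/oauth/callback"

def auth_url(scopes: list[str]) -> str:
    params = {
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "response_type": "code",
        "scope": " ".join(scopes),
        "access_type": "offline",
        # Ask Google to merge this grant with scopes already granted earlier.
        "include_granted_scopes": "true",
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

# First consent: read-only Gmail, so the drafter can read the thread.
print(auth_url(["https://www.googleapis.com/auth/gmail.readonly"]))
# Later, when the user enables drafting, ask for the extra scope only then.
print(auth_url(["https://www.googleapis.com/auth/gmail.compose"]))
```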
00:07:15.380 | - Yeah.
00:07:16.340 | Do you think OAuth is just overall going to change?
00:07:18.700 | I think maybe before it was like, hey,
00:07:20.340 | we need to set up an OAuth that humans only
00:07:22.700 | want to kind of do once.
00:07:23.740 | So we try to jam-pack things all at once,
00:07:25.820 | versus what if you could, on-demand,
00:07:28.300 | get different permissions every time from different parts?
00:07:31.500 | Like, do you ever think about designing things knowing
00:07:33.860 | that maybe AI will use it instead of humans will use it?
00:07:36.700 | - Yeah, for sure.
00:07:38.060 | One pattern we've started to see is people provisioning accounts
00:07:41.420 | for their AI agents.
00:07:42.860 | And so in particular, Google Workspace accounts.
00:07:44.840 | So, for example, Lindy can be used as a scheduling assistant.
00:07:48.240 | And so you can just cc her to your emails
00:07:50.680 | when you're trying to find time with someone.
00:07:52.600 | And just like a human assistant, she's
00:07:53.960 | going to go back and forth and offer availabilities and so forth.
00:07:56.660 | Very often, people don't want the other party
00:07:59.000 | to know that it's an AI.
00:08:00.280 | So it's actually funny, they introduce delays.
00:08:02.200 | They ask the agent to wait before replying,
00:08:04.120 | so it's not too obvious that it's an AI.
00:08:06.000 | And they provision an account on Google Suite,
00:08:08.240 | which costs them like $10 a month or something like that.
00:08:10.760 | So we're seeing that pattern more and more.
00:08:12.800 | I think that does the job for now.
00:08:14.580 | I'm not optimistic on us actually patching OAuth,
00:08:18.060 | because I agree with you ultimately.
00:08:19.540 | We would want to patch OAuth, because the new account thing
00:08:22.420 | is kind of a kludge.
00:08:23.220 | It's really a hack.
00:08:24.140 | You would want to patch OAuth to have more granular access
00:08:27.060 | control and really be able to put your Shoggoth in the box.
00:08:30.660 | I'm not optimistic on us doing that before AGI, I think.
00:08:33.460 | [LAUGHTER]
00:08:35.260 | - That's a very close timeline.
00:08:37.180 | I'm mindful of talking about a thing without showing it.
00:08:39.900 | And we already have the setup to show it.
00:08:41.600 | We jump into a screen share.
00:08:43.700 | For listeners, you can jump on to YouTube
00:08:46.260 | and like and subscribe.
00:08:47.380 | But also, let's have a look at how you show off Lindy.
00:08:50.980 | - Yeah, absolutely.
00:08:52.540 | I'll give an example of a very simple Lindy,
00:08:54.780 | and then I'll graduate to a much more complicated one.
00:08:57.620 | A super simple Lindy that I have is I unfortunately
00:09:01.300 | bought some investment properties
00:09:02.700 | in the south of France.
00:09:04.060 | It was a really, really bad idea.
00:09:05.940 | And I put them on Holidu, which is like the French Airbnb,
00:09:09.460 | if you will.
00:09:10.380 | And so I received these emails from time to time telling me,
00:09:12.180 | like, oh, hey, you made $200.
00:09:13.700 | Someone booked your place, OK?
00:09:15.500 | When I receive these emails, I want
00:09:17.340 | to log this reservation in a spreadsheet.
00:09:20.500 | Doing this without an AI agent or without AI in general
00:09:24.420 | is a pain in the butt, because you must write an HTML parser
00:09:28.580 | for this email.
00:09:29.500 | And so it's just hard.
00:09:30.620 | You may not be able to do it, and it's
00:09:32.200 | going to break the moment the email changes.
00:09:34.260 | By contrast, the way it works with Lindy, it's really simple.
00:09:37.180 | It's two steps.
00:09:38.260 | It's like, OK, I receive an email.
00:09:40.380 | If it is a reservation confirmation--
00:09:42.100 | I have this filter here--
00:09:43.500 | then I append a row to this spreadsheet.
00:09:45.340 | And so this is where you can see the AI part, where
00:09:48.820 | the way this action is configured here--
00:09:50.940 | you see these purple fields on the right.
00:09:53.100 | Each of these fields is a prompt.
00:09:55.140 | And so I can say, OK, you extract from the email
00:09:57.580 | the day the reservation begins on.
00:09:59.700 | You extract the amount of the reservation.
00:10:01.500 | You extract the number of travelers of the reservation.
00:10:04.760 | And now you can see, when I look at the task history of this,
00:10:09.120 | Lindy, it's really simple.
00:10:10.320 | It's like, OK, you do this.
00:10:11.520 | And boom, I'm appending this row to this spreadsheet.
00:10:13.740 | And this is the information extracted.
00:10:15.520 | So effectively, this node here, this append row node,
00:10:20.520 | is a mini-agent.
00:10:21.800 | It can see everything that just happened.
00:10:23.840 | It has context over the task, and it's appending the row.
00:10:28.120 | And then it's going to send a reply to the thread.
00:10:31.360 | That's a very simple example of an agent.
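Since the conversation just below confirms the purple fields resolve to a single structured output call, here is a hedged sketch of what such a call can look like with the OpenAI Python SDK. The Reservation schema and append_row helper are assumptions for illustration, not Lindy's actual code:

```python
# Sketch of "each purple field is a prompt" as one structured output call.
# append_row is a stand-in for whatever spreadsheet API you use.
from pydantic import BaseModel
from openai import OpenAI

class Reservation(BaseModel):
    check_in_date: str   # "extract the day the reservation begins on"
    amount: float        # "extract the amount of the reservation"
    num_travelers: int   # "extract the number of travelers"

client = OpenAI()

def extract_reservation(email_body: str) -> Reservation:
    completion = client.beta.chat.completions.parse(
        model="gpt-4o-2024-08-06",
        messages=[
            {"role": "system", "content": "Extract reservation details from the email."},
            {"role": "user", "content": email_body},
        ],
        response_format=Reservation,  # the model is constrained to this schema
    )
    return completion.choices[0].message.parsed

def append_row(res: Reservation) -> None:
    # Placeholder: append to Google Sheets / Airtable / CSV here.
    print([res.check_in_date, res.amount, res.num_travelers])

append_row(extract_reservation("You earned $200! Booking: 2 guests, arriving 2024-07-14."))
```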
00:10:34.200 | Quick follow-up question on this one
00:10:35.240 | while we're still on this page.
00:10:36.480 | Is that one call?
00:10:37.320 | Is that a structured output call?
00:10:38.680 | Yeah.
00:10:39.440 | OK, nice.
00:10:40.280 | Yeah.
00:10:41.520 | And you can see here, for every node,
00:10:43.800 | you can configure which model you want to power the node.
00:10:46.720 | Here, I use Claude.
00:10:47.680 | For this, I use GPT-4 Turbo.
00:10:49.500 | Much more complex example, my meeting recorder.
00:10:53.280 | It looks very complex, because I've added to it over time.
00:10:56.000 | But at a high level, it's really simple.
00:10:57.720 | It's like, when a meeting begins,
00:10:59.240 | you record the meeting.
00:11:00.840 | And after the meeting, you send me a summary.
00:11:03.160 | And it sends me coaching notes.
00:11:04.640 | So I receive-- like, my Lindy is constantly coaching me.
00:11:07.920 | And so you can see here, in the prompt of the coaching notes,
00:11:10.440 | I've told it, hey, was I unnecessarily confrontational
00:11:13.840 | at any point?
00:11:14.480 | I'm French, so I have to watch out for that.
00:11:16.960 | Or not confrontational enough.
00:11:18.160 | Should I have double-clicked on any issue?
00:11:20.000 | So I can really give it exactly the kind of coaching
00:11:22.160 | that I'm expecting.
00:11:23.400 | And then the interesting thing here is, you can see,
00:11:26.240 | the agent here, after it sends me these coaching notes,
00:11:28.640 | moves on.
00:11:29.400 | And it does a bunch of other stuff.
00:11:30.820 | So it goes on Slack.
00:11:32.040 | It disseminates the notes on Slack.
00:11:33.480 | It does a bunch of other stuff.
00:11:34.880 | But it's actually able to backtrack and resume
00:11:37.880 | the automation at the coaching notes email
00:11:39.720 | if I responded to that email.
00:11:41.400 | So I'll give a super concrete example.
00:11:44.520 | This is an actual coaching feedback
00:11:46.280 | that I received from Lindy.
00:11:47.960 | She was like, hey, this was a sales call
00:11:50.640 | I had with a customer.
00:11:52.000 | And she was like, I found your explanation
00:11:53.760 | of Lindy too technical.
00:11:54.920 | And I was able to follow up and just ask a follow-up question
00:11:58.000 | in the thread here.
00:11:59.000 | And I was like, what did you find too technical
00:12:00.380 | about my explanation?
00:12:01.600 | And Lindy restored the context.
00:12:03.660 | And so she basically picked up the automation
00:12:05.660 | back up here in the tree.
00:12:07.580 | And she has all of the context of everything that happened,
00:12:09.540 | including the meeting in which I was.
00:12:11.580 | So she was like, oh, you used the words "deterministic"
00:12:13.860 | and "context window" and "agent state."
00:12:15.660 | And that concept exists at every level
00:12:18.300 | for every channel and every action that Lindy takes.
00:12:20.900 | So another example here is, I mentioned,
00:12:23.220 | she also disseminates the notes on Slack.
00:12:25.380 | So this was a meeting where I was not, right?
00:12:28.020 | So this was a teammate.
00:12:29.360 | His Lindy meeting recorder posts the meeting notes
00:12:32.640 | in this customer discovery channel on Slack.
00:12:34.760 | So you can see, okay, this is the onboarding call we had.
00:12:37.960 | This was the use case.
00:12:39.400 | Look at the questions.
00:12:40.240 | How do I make Lindy slower?
00:12:41.560 | How do I add delays to make Lindy slower?
00:12:43.840 | And I was able, in the Slack thread,
00:12:45.760 | to ask follow-up questions like,
00:12:46.680 | oh, what did we answer to these questions?
00:12:48.400 | And it's really handy because I know I can have
00:12:50.800 | this sort of interactive Q&A with these meetings.
00:12:53.040 | It means that very often now,
00:12:54.320 | I don't go to meetings anymore.
00:12:55.600 | I just send my Lindy,
00:12:58.380 | and instead of going to a 60-minute meeting,
00:13:00.220 | I have a five-minute chat with my Lindy afterwards.
00:13:02.820 | And she just replied.
00:13:03.660 | She was like, well, this is what we replied to this customer,
00:13:06.020 | and I can just be like, okay, good job, Jack.
00:13:07.860 | No notes about your answers.
00:13:09.260 | So that's the kind of use cases people have with Lindy.
00:13:12.580 | It's a lot of, there's a lot of sales automations,
00:13:15.140 | customer support automations, and a lot of this,
00:13:17.140 | which is basically personal assistance automations,
00:13:19.660 | like meeting scheduling and so forth.
00:13:21.140 | - Yeah, and I think the question
00:13:22.260 | that people might have is memory.
00:13:24.380 | So as you get coaching,
00:13:25.840 | how does it track whether or not you're improving?
00:13:27.880 | You know, if these are like mistakes you made in the past,
00:13:29.800 | like, how do you think about that?
00:13:31.520 | - Yeah, we have a memory module.
00:13:33.600 | So I'll show you my meeting scheduler, Lindy,
00:13:35.840 | which has a lot of memories
00:13:36.980 | because by now I've used her for so long.
00:13:39.340 | And so every time I talk to her, she saves a memory.
00:13:42.680 | If I tell her, you screwed up, please don't do this.
00:13:46.480 | So you can see here, it's, oh, it's got a double memory here.
00:13:49.800 | This is the meeting link I have,
00:13:51.700 | or this is the address of the office.
00:13:53.960 | If I tell someone to meet me at home,
00:13:55.340 | this is the address of my place.
00:13:56.580 | This is the code.
00:13:57.620 | I guess we'll have to edit that out.
00:13:59.180 | (all laughing)
00:14:01.020 | This is not the code of my place.
00:14:02.540 | No doxing.
00:14:03.500 | (all laughing)
00:14:05.860 | Yeah, so Lindy can just like manage her own memory
00:14:08.100 | and decide when she's remembering things
00:14:09.720 | between executions.
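A toy sketch of a memory module that persists facts between runs and supports the manual pruning described a little further on; everything here (the JSON file, importance scores, thresholds) is hypothetical, not Lindy's memory system:

```python
# Toy agent memory: memories persist between runs, and low-value ones can be
# dropped so the context stays small. Illustrative only.
import json, time
from pathlib import Path

MEMORY_FILE = Path("lindy_memories.json")

def load_memories() -> list[dict]:
    return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []

def save_memory(text: str, importance: int) -> None:
    memories = load_memories()
    memories.append({"text": text, "importance": importance, "ts": time.time()})
    MEMORY_FILE.write_text(json.dumps(memories, indent=2))

def prune_memories(min_importance: int = 3, keep_at_most: int = 50) -> None:
    # Keep only high-signal memories, newest first, capped at a small number,
    # since agents get confused when the memory list grows too long.
    memories = [m for m in load_memories() if m["importance"] >= min_importance]
    memories.sort(key=lambda m: m["ts"], reverse=True)
    MEMORY_FILE.write_text(json.dumps(memories[:keep_at_most], indent=2))

save_memory("Meetings at the office: 123 Main St (placeholder address).", importance=5)
save_memory("User mentioned the weather once.", importance=1)
prune_memories()
print([m["text"] for m in load_memories()])
```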
00:14:10.860 | - Okay, I mean, I'm just gonna take the opportunity
00:14:13.460 | to ask you, since you are the creator of this thing,
00:14:15.580 | how come there's so few memories, right?
00:14:17.900 | Like if you've been using this for two years,
00:14:19.820 | there should be thousands and thousands of things.
00:14:21.940 | - That is a good question.
00:14:22.940 | Agents still get confused if they have too many memories,
00:14:25.620 | to my point earlier about that.
00:14:27.100 | So I just am out of a call with a member
00:14:30.580 | of the Llama team at Meta,
00:14:32.420 | and we were chatting about Lindy,
00:14:33.900 | and we were going into the system prompt
00:14:35.620 | that we sent to Lindy and all of that stuff.
00:14:37.700 | And he was amazed, and he was like,
00:14:38.940 | it's a miracle that it's working, guys.
00:14:40.460 | He was like, this kind of system prompt,
00:14:41.980 | this does not exist either pre-training or post-training.
00:14:44.500 | Like these models were never trained
00:14:45.780 | to do this kind of stuff.
00:14:46.640 | Like it's a miracle that they can be agents at all.
00:14:49.220 | And so what I do, I actually prune the memories.
00:14:52.620 | - You know, it's actually something
00:14:53.860 | I've gotten into the habits of doing
00:14:55.380 | from back when we had GPT-3.5 powering Lindy agents.
00:14:58.660 | I suspect it's probably not as necessary
00:15:00.620 | in like the Claude 3.5 Sonnet days,
00:15:02.800 | but I prune the memories, yeah.
00:15:04.780 | - Yeah, okay.
00:15:05.620 | The reason is 'cause I have another assistant
00:15:07.740 | that also is recording
00:15:09.020 | and trying to come up with facts about me.
00:15:10.180 | It comes up with a lot of like trivial, useless facts
00:15:13.580 | that I, so I spend most of my time pruning.
00:15:16.420 | It actually is not super useful.
00:15:17.580 | I'd much rather have high quality facts that it accepts.
00:15:20.860 | Or maybe I was even thinking like,
00:15:23.120 | were you ever tempted to add a wake word
00:15:25.300 | to only memorize this when I say memorize this?
00:15:28.220 | - Yeah.
00:15:29.060 | - And otherwise don't even bother.
00:15:30.060 | - I have a Lindy that does this.
00:15:31.740 | So this is my inbox processor Lindy.
00:15:33.460 | It's kind of beefy
00:15:34.620 | because there's a lot of different emails,
00:15:36.780 | but somewhere in here, there is a rule where I'm like,
00:15:39.900 | aha, I can email my inbox processor Lindy.
00:15:42.460 | It's really handy.
00:15:43.300 | So she has her own email address.
00:15:44.780 | And so when I process my email inbox,
00:15:46.620 | I sometimes forward an email to her
00:15:48.980 | and it's a newsletter, or it's like a cold outreach
00:15:51.500 | from a recruiter that I don't care about
00:15:53.180 | or anything like that.
00:15:54.020 | And I can give her a rule and I can be like,
00:15:56.340 | hey, this email I want you to archive moving forward.
00:15:58.820 | Or I want you to alert me on Slack
00:16:00.300 | when I have this kind of email, it's really important.
00:16:02.460 | And so you can see here, the prompt is,
00:16:04.460 | if I give you a rule about a kind of email,
00:16:06.300 | like archive emails from X, save it as a new memory.
00:16:09.260 | And I give it to the memory saving skill.
00:16:11.380 | And yeah.
00:16:13.540 | - One thing that just occurred to me.
00:16:14.660 | So I'm a big fan of virtual mailboxes.
00:16:16.380 | I recommend that everybody have a virtual mailbox.
00:16:18.660 | You could set up a physical mail receive thing for Lindy.
00:16:22.020 | And so then people can just,
00:16:23.580 | then Lindy can process your physical mail.
00:16:26.500 | - That's actually a good idea.
00:16:28.340 | I actually already have something like that.
00:16:30.300 | I use like Earth Class Mail.
00:16:31.740 | Yeah.
00:16:32.580 | So yeah, most likely I can process my physical mail.
00:16:34.940 | - And then the other products idea I have
00:16:36.700 | looking at this thing is people want to brag
00:16:38.460 | about the complexity of their Lindys.
00:16:40.300 | So this would be like a 65 point Lindy, right?
00:16:43.660 | - What's a 65 point?
00:16:44.860 | - Complexity counting.
00:16:46.140 | Like how many nodes, how many things,
00:16:47.460 | how many conditions, right?
00:16:48.540 | - Yeah, this is not the most complex one.
00:16:50.620 | I have another one.
00:16:51.460 | This designer recruiter here is kind of beefy as well.
00:16:54.820 | - Right, right, right.
00:16:55.660 | So I'm just saying like, let people brag.
00:16:57.540 | Let people like be super users.
00:16:59.220 | - Oh, right.
00:17:00.060 | Give them a score or something.
00:17:00.900 | - Give them a score.
00:17:01.740 | Then they'll just be like,
00:17:02.740 | okay, how high can you make this score?
00:17:04.540 | - Yeah, that's a good point.
00:17:05.860 | And I think that's again the beauty
00:17:07.180 | of this on-rails phenomenon.
00:17:08.900 | It's like, think of the equivalent,
00:17:10.900 | the prompt equivalent of this Lindy here,
00:17:12.820 | for example, that we're looking at.
00:17:14.140 | It'd be monstrous.
00:17:15.380 | And the odds that it gets it right are so low.
00:17:17.380 | But here, because we're really holding the agent's hand
00:17:19.740 | step-by-step-by-step, it's actually super reliable.
00:17:21.900 | - Yeah.
00:17:22.740 | And is it all structured output base?
00:17:23.740 | - Yeah.
00:17:24.580 | - As far as possible?
00:17:25.420 | - Basically.
00:17:26.260 | - Like there's no non-structured output.
00:17:27.580 | - There is.
00:17:28.980 | So for example, here, this like AI agent step, right?
00:17:31.900 | Or this like send message step.
00:17:33.900 | Sometimes it gets to-
00:17:34.740 | - That's just plain text.
00:17:35.580 | - That's right.
00:17:36.420 | - Yeah.
00:17:37.260 | - So I'll give you an example.
00:17:38.380 | Maybe it's TMI.
00:17:39.220 | I'm having blood pressure issues these days.
00:17:40.780 | And so I'm, this Lindy here,
00:17:42.700 | I give it my blood pressure readings
00:17:44.460 | and it updates a log that I have of my blood pressure
00:17:47.820 | that it sends to my doctor.
00:17:49.420 | - Oh, so this is a, every Lindy comes with a to-do list?
00:17:52.180 | - Yeah.
00:17:53.900 | Every Lindy has its own task history.
00:17:55.420 | - Huh.
00:17:56.260 | - Yeah.
00:17:57.100 | And so you can see here, this is my main Lindy,
00:17:58.420 | sort of like my personal assistant.
00:17:59.780 | And I've told it, where is this?
00:18:02.500 | There is a point where I'm like,
00:18:04.020 | if I am giving you a health-related fact, right here.
00:18:07.900 | I'm giving you a health information.
00:18:09.340 | So then you update this log that I have in this Google Doc,
00:18:11.700 | and then you send me a message.
00:18:12.780 | And you can see, I've actually not configured
00:18:14.540 | this send message node.
00:18:15.620 | I haven't told it what to send me a message for.
00:18:17.660 | Right?
00:18:18.980 | And you can see, it's actually lecturing me.
00:18:21.860 | It's like, I'm giving it my blood pressure readings.
00:18:23.980 | It's like, hey, it's a bit high.
00:18:25.180 | Like, here are some lifestyle changes
00:18:26.540 | you may want to consider.
00:18:27.740 | - I think maybe this is the most confusing
00:18:30.300 | or new thing for people.
00:18:31.660 | So even I use Lindy and I didn't even know
00:18:33.900 | you could have multiple workflows in one Lindy.
00:18:36.100 | I think the mental model is kind of like the Zapier.
00:18:38.100 | Workflows is like, it starts and it ends.
00:18:40.180 | It's not, doesn't choose between.
00:18:42.100 | How do you think about what's a Lindy
00:18:44.620 | versus what's a sub-function of a Lindy?
00:18:47.020 | Like, what's the hierarchy?
00:18:48.660 | - Yeah, frankly, I think the line is a little arbitrary.
00:18:51.300 | It's kind of like when you code,
00:18:52.420 | like when do you start to create a new class
00:18:54.740 | versus when do you overload your current class?
00:18:57.580 | I think of it in terms of like jobs to be done.
00:18:59.740 | And I think of it in terms of who is the Lindy serving.
00:19:02.860 | This Lindy is serving me personally.
00:19:04.500 | It's really my day-to-day Lindy.
00:19:05.900 | I give it a bunch of stuff, like very easy tasks.
00:19:08.380 | And so this is just a Lindy I go to.
00:19:10.740 | Sometimes when a task is really more specialized,
00:19:13.220 | so for example, I have this like summarizer Lindy
00:19:15.700 | or this designer recruiter Lindy,
00:19:17.220 | these tasks are really beefy.
00:19:18.500 | I wouldn't want to add this to my main Lindy,
00:19:20.260 | so I just created a separate Lindy for it.
00:19:22.380 | Or when it's a Lindy that serves another constituency,
00:19:25.020 | like our customer support Lindy,
00:19:27.660 | I don't want to add that to like my personal assistant.
00:19:29.540 | These are two very different Lindys.
00:19:30.860 | - Yeah.
00:19:31.700 | And you can call a Lindy from within another Lindy.
00:19:34.460 | - That's right.
00:19:35.300 | - You can kind of chain them together.
00:19:36.460 | - Lindys can work together, absolutely.
00:19:38.580 | - A couple more things for the video portion.
00:19:40.980 | I noticed you have a podcast follower.
00:19:42.340 | We have to ask about that.
00:19:44.100 | What is that?
00:19:45.740 | - So this one wakes me up every,
00:19:47.820 | so wakes herself up every week.
00:19:50.940 | And she sends me, so she woke up yesterday actually,
00:19:54.260 | and she searches for Lenny's podcast.
00:19:56.940 | And she looks for like the latest episode on YouTube.
00:19:59.100 | And once she finds it, she transcribes the video.
00:20:01.260 | And then she sends me the summary by email.
00:20:02.980 | I don't listen to podcasts as much anymore.
00:20:05.860 | I just like read these summaries.
00:20:07.900 | - Yeah, yeah.
00:20:08.740 | - We should make a "Latent Space" Lindy.
00:20:11.220 | Marketplace.
00:20:12.740 | - Okay, so, and then, you know,
00:20:14.380 | you have a whole bunch of connectors.
00:20:15.660 | I saw the list briefly.
00:20:17.220 | Any interesting one, complicated one that you're proud of?
00:20:21.340 | Anything that you want to just share?
00:20:22.820 | - Yeah.
00:20:23.660 | - Connector stories.
00:20:24.500 | - So many of our workflows are about meeting scheduling.
00:20:26.300 | So we had to build some very opinionated tools
00:20:28.700 | around meeting scheduling.
00:20:29.780 | So for example, one that is surprisingly hard
00:20:32.980 | is this find available times action.
00:20:35.300 | You would not believe,
00:20:36.140 | this is like a thousand lines of code or something.
00:20:37.740 | It's just a very beefy action.
00:20:39.460 | And you can pass it a bunch of parameters
00:20:41.620 | about how long is the meeting?
00:20:43.540 | When does it start?
00:20:44.420 | When does it end?
00:20:45.260 | What are the meeting, like the weekdays in which I meet?
00:20:48.820 | There's like, how many time slots do you return?
00:20:52.220 | What's the buffer between my meetings?
00:20:53.460 | It's just a very, very, very complex action.
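For flavor, a back-of-the-envelope sketch of a find-available-times helper with the kinds of parameters listed above (duration, working window, weekdays, buffer, number of slots). The real action is far beefier; this sketch ignores time zones, recurring events, and calendar APIs entirely:

```python
# Toy "find available times": scan working hours over the next few days and
# return slots that don't collide with busy intervals (padded by a buffer).
from datetime import datetime, timedelta

def find_available_times(busy, duration_min=30, day_start=9, day_end=18,
                         weekdays={0, 1, 2, 3, 4}, buffer_min=15, max_slots=5,
                         search_days=7, now=None):
    now = now or datetime.now()
    duration, buffer = timedelta(minutes=duration_min), timedelta(minutes=buffer_min)
    # Pad every busy interval with the buffer on both sides.
    padded = [(s - buffer, e + buffer) for s, e in busy]
    slots = []
    for d in range(search_days):
        day = (now + timedelta(days=d)).replace(minute=0, second=0, microsecond=0)
        if day.weekday() not in weekdays:
            continue
        cursor = day.replace(hour=day_start)
        end_of_day = day.replace(hour=day_end)
        while cursor + duration <= end_of_day and len(slots) < max_slots:
            candidate = (cursor, cursor + duration)
            if all(candidate[1] <= s or candidate[0] >= e for s, e in padded):
                slots.append(candidate)
                cursor += duration  # move past the slot we just proposed
            else:
                cursor += timedelta(minutes=15)
    return slots[:max_slots]

busy = [(datetime(2024, 10, 21, 10, 0), datetime(2024, 10, 21, 11, 0))]
for start, end in find_available_times(busy, now=datetime(2024, 10, 21, 8, 0)):
    print(start, "->", end)
```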
00:20:56.340 | I really like our GitHub action.
00:20:58.140 | So we have like a Lindy PR reviewer.
00:21:01.380 | And it's really handy because anytime any bug happens,
00:21:04.140 | so the Lindy reads our guidelines on a Google Docs.
00:21:07.340 | By now the guidelines are like 40 pages long or something.
00:21:10.140 | And so every time any new kind of bug happens,
00:21:12.540 | we just go to the guideline and we add the lines like,
00:21:14.300 | "Hey, this has happened before.
00:21:15.460 | Please watch out for this category of bugs."
00:21:17.020 | And it's saving us so much time every day.
00:21:19.300 | - There's companies doing PR reviews.
00:21:21.340 | Where does a Lindy start?
00:21:22.660 | When does a company start?
00:21:23.980 | Or maybe how do you think about the complexity of these tasks
00:21:27.420 | when it's going to be worth having kind of like
00:21:28.820 | a vertical standalone company versus just like,
00:21:30.940 | "Hey, a Lindy is going to do a good job 99% of the time."
00:21:34.660 | - That's a good question.
00:21:36.700 | We think about this one all the time.
00:21:38.900 | I can't say that we've really come up
00:21:40.380 | with a very crisp articulation
00:21:42.100 | of when do you want to use a vertical tool
00:21:44.300 | versus when do you want to use a horizontal tool.
00:21:46.660 | I think of it as very similar to the internet.
00:21:48.700 | I find it surprising the extent
00:21:50.020 | to which a horizontal search engine has won.
00:21:52.540 | But I think that Google, right?
00:21:53.860 | But I think the even more surprising fact
00:21:55.580 | is that the horizontal search engine has won
00:21:57.740 | in almost every vertical, right?
00:21:59.460 | You go through Google to search Reddit.
00:22:00.820 | You go through Google to search Wikipedia.
00:22:03.180 | I think maybe the biggest exception is e-commerce.
00:22:05.460 | Like you go to Amazon to search e-commerce,
00:22:07.220 | but otherwise you go through Google.
00:22:08.700 | And I think that the reason for that
00:22:10.380 | is because search in each vertical
00:22:12.580 | has more in common with search
00:22:14.300 | than it does with each vertical.
00:22:15.860 | And search is so expensive to get right,
00:22:17.780 | and Google is a big company,
00:22:19.100 | that it makes a lot of sense to aggregate
00:22:21.260 | all of these different use cases
00:22:22.500 | and to spread your R&D budget
00:22:23.900 | across all of these different use cases.
00:22:25.780 | I have a thesis, which is a really cool thesis for Lindy,
00:22:29.540 | is that the same thing is true for agents.
00:22:31.500 | I think that by and large, in a lot of verticals,
00:22:34.780 | agents in each vertical have more in common with agents
00:22:37.060 | than they do with each vertical.
00:22:38.700 | I also think there are benefits
00:22:39.820 | in having a single agent platform
00:22:41.220 | because that way your agents can work together.
00:22:43.020 | They're all under one roof.
00:22:44.620 | That way you only learn one platform,
00:22:46.500 | and so you can create agents for everything that you want,
00:22:49.380 | and you don't have to pay
00:22:50.620 | for a bunch of different platforms and so forth.
00:22:52.740 | So I think ultimately it is actually going to shake out
00:22:55.420 | in a way that is similar to search
00:22:56.900 | in that search is everywhere on the internet.
00:22:59.780 | Every website has a search box, right?
00:23:01.540 | So there's going to be a lot of vertical agents
00:23:03.900 | for everything.
00:23:04.740 | I think AI is going to completely penetrate
00:23:06.740 | every category of software,
00:23:08.300 | but then I also think there are going to be a few
00:23:10.180 | very, very, very big horizontal agents
00:23:12.420 | that serve a lot of functions for people.
00:23:14.180 | - Yeah, that is actually one of the questions
00:23:15.620 | that we had about the agent stuff.
00:23:16.940 | So I guess we can transition away from the screen
00:23:19.700 | and I'll just ask the follow-up, which is,
00:23:22.020 | that is a hot topic.
00:23:22.860 | You're basically saying that the current VC obsession
00:23:26.180 | of the day, which is vertical AI-enabled SaaS,
00:23:29.340 | is mostly not going to work out,
00:23:31.740 | and then there are going to be
00:23:32.580 | some super giant horizontal SaaS?
00:23:34.780 | - Oh no, I'm not saying it's either or.
00:23:36.580 | Like SaaS today, vertical SaaS is huge,
00:23:38.620 | and there's also a lot of horizontal platforms.
00:23:40.420 | If you look at like Airtable or Notion,
00:23:42.420 | basically the entire no-code space is very horizontal.
00:23:45.060 | I mean, Loom and Zoom and Slack,
00:23:46.340 | like there's a lot of very horizontal tools out there.
00:23:48.820 | - Okay. (laughs)
00:23:50.340 | I was just trying to get a reaction out of you for hot takes.
00:23:53.660 | - Trying to get a hot take.
00:23:54.660 | No, I also think it is natural for the vertical solutions
00:23:58.020 | to emerge first, 'cause it's just easier to build.
00:24:00.700 | It's just much, much, much harder
00:24:01.700 | to build something horizontal.
00:24:03.060 | - Cool, some more Lindy-specific questions.
00:24:05.420 | So we covered most of the top use cases,
00:24:07.300 | and you have an Academy.
00:24:09.060 | That was nice to see.
00:24:10.380 | I also see some other people doing it for you for free.
00:24:12.900 | So like Ben Spites is doing it,
00:24:14.500 | and then there's some other guy
00:24:15.460 | who is also doing like lessons.
00:24:16.900 | - Yeah. - Which is kinda nice, right?
00:24:18.380 | - Yeah, absolutely. - You don't have
00:24:19.220 | to do any of that.
00:24:20.060 | - Oh, we're even seeing it more and more
00:24:20.940 | on like LinkedIn and Twitter,
00:24:22.140 | like people posting their Lindys and so forth.
00:24:24.060 | - Yeah, I think that's the flywheel,
00:24:25.340 | that you built the platform where creators see value
00:24:27.740 | in aligning themselves to you,
00:24:29.860 | and so then your incentive is to make them successful
00:24:32.820 | so that they can make other people successful,
00:24:34.340 | and then it just drives more and more engagement
00:24:36.260 | that you are, like it's earned media,
00:24:38.180 | like you don't have to do anything.
00:24:39.140 | - Yeah, yeah, I mean, community is everything.
00:24:41.100 | - Are you doing anything special there, any big wins?
00:24:44.620 | - We have a Slack community that's pretty active.
00:24:47.220 | I can't say we've invested much more than that so far.
00:24:49.780 | - I would say from having,
00:24:51.620 | so I have some involvement in the no-code community.
00:24:54.020 | I would say that Webflow going very hard
00:24:56.660 | after no-code as a category got them a lot more allies
00:25:00.580 | than just the people using Webflow.
00:25:02.820 | So it helps you to grow the community beyond just Lindy.
00:25:07.220 | And I don't know what this is called.
00:25:08.100 | Maybe it's just no-code again.
00:25:09.780 | Maybe you want to call it something different,
00:25:11.340 | but there's definitely an appetite for this,
00:25:13.700 | and you are one of a broad category, right?
00:25:15.700 | Like just before you, we had Dust on,
00:25:17.820 | and they're also kind of going after a similar market.
00:25:20.860 | Zapier obviously is not going to try
00:25:22.620 | to also compete with you.
00:25:23.740 | - Yeah.
00:25:24.580 | - There's no question there,
00:25:25.420 | it's just like a reaction about community.
00:25:26.380 | Like I think a lot about community,
00:25:28.020 | Latent Space is growing the community of AI engineers,
00:25:30.420 | and I think you have a slightly different audience of,
00:25:32.900 | I don't know what.
00:25:34.540 | - Yeah, I think the no-code tinkerers is the community.
00:25:37.900 | Yeah, it is going to be the same sort of community
00:25:39.900 | as what Webflow, Zapier, Airtable, Notion to some extent.
00:25:43.220 | - Yeah, the framing can be different if you were,
00:25:45.460 | so I think tinkerers has this connotation
00:25:47.180 | of not serious or like small.
00:25:49.420 | And if you framed it to like no-code EA,
00:25:52.860 | we're exclusively only for CEOs with a certain budget,
00:25:56.340 | then you just have, you tap into a different budget.
00:25:58.300 | - That's true.
00:25:59.140 | The problem with EA is like the CEO has no willingness
00:26:01.980 | to actually tinker and play with the platform.
00:26:04.780 | - Maybe Andrew's doing that.
00:26:06.020 | Like a lot of your biggest advocates are CEOs, right?
00:26:09.180 | - A solopreneur, you know, small business owners,
00:26:11.620 | I think Andrew is an exception, yeah.
00:26:13.180 | - Yeah, yeah, he is.
00:26:14.460 | He's an exception in many ways.
00:26:15.820 | - Yep.
00:26:16.660 | - Just before we wrap on the use cases,
00:26:18.660 | is Rickrolling your customers,
00:26:20.500 | like, an officially supported use case,
00:26:22.620 | or maybe tell that story?
00:26:24.300 | - It's one of the main jobs to be done, really.
00:26:27.340 | Yeah, we woke up recently,
00:26:29.460 | so we have a Lindy obviously doing our customer support,
00:26:32.300 | and we do check after the Lindy.
00:26:34.300 | And so we caught this email exchange
00:26:36.020 | where someone was asking Lindy for video tutorials.
00:26:39.180 | And at the time, actually, we did not have video tutorials,
00:26:41.900 | we do now on the Lindy Academy.
00:26:43.860 | And Lindy responded to the emails like,
00:26:45.860 | "Oh, absolutely, here's a link."
00:26:47.220 | And we were like, "What, like we don't,
00:26:48.900 | what kind of link did you send?"
00:26:50.180 | And so we clicked on the link and it was a Rickroll.
00:26:52.380 | We actually reacted fast enough
00:26:53.940 | that the customer had not yet opened the email,
00:26:56.020 | and so we reacted immediately like,
00:26:57.340 | "Oh, hey, actually, sorry, this is the right link."
00:26:59.180 | And so the customer never reacted to the first link.
00:27:01.780 | And so, yeah, I tweeted about that,
00:27:03.300 | it went surprisingly viral.
00:27:04.980 | And I checked afterwards in the logs,
00:27:07.380 | we did like a database query,
00:27:08.540 | and we found like, I think, like three or four
00:27:10.700 | other instances of it.
00:27:11.820 | - That's surprisingly low.
00:27:13.420 | - Yeah, it is, it is low.
00:27:14.620 | And we fixed it across the board
00:27:16.300 | by just adding a line to the system prompt.
00:27:18.380 | That's like, "Hey, don't Rickroll people, please don't Rickroll."
00:27:20.660 | - Yeah, yeah, yeah, yeah.
00:27:22.500 | I mean, so you can explain it retroactively, right?
00:27:25.140 | Like that YouTube slug has been pasted
00:27:26.740 | in so many different corpuses
00:27:29.100 | that obviously it learned to hallucinate that.
00:27:31.580 | - And it pretended to be so many things.
00:27:33.420 | That's the thing, it's like everybody-
00:27:35.260 | - I wouldn't be surprised if that takes one token.
00:27:37.300 | Like there's a tokenizer and it's just one token.
00:27:40.900 | - That's the idea of a YouTube video.
00:27:44.220 | - Because it's used so much, right?
00:27:46.140 | Like, and you have to basically get it exactly correct.
00:27:49.260 | It's probably not.
00:27:50.100 | I mean, that's a long-
00:27:51.300 | - It would have been so good.
00:27:52.500 | It is not a single token.
00:27:53.540 | (both laughing)
00:27:55.660 | - So this is just a jump maybe into evals from here.
00:27:59.260 | How could you possibly come up for an eval
00:28:01.980 | that says, "Make sure my AI does not rickroll my customer."
00:28:05.180 | I feel like when people are writing evals,
00:28:06.660 | that's not something that they come up with.
00:28:08.420 | So how do you think about evals
00:28:10.300 | when it's such like an open-ended problem space?
00:28:12.740 | - Yeah, it is tough.
00:28:13.700 | We built quite a bit of infrastructure
00:28:15.420 | for us to create evals in one click
00:28:17.220 | from any conversation history.
00:28:18.780 | So we can point to a conversation
00:28:20.380 | and we can be like,
00:28:21.700 | "In one click, we can turn it into effectively a unit test."
00:28:24.940 | It's like, "This is a good conversation.
00:28:26.500 | This is how you're supposed to handle things like this."
00:28:28.860 | Or if it's a negative example,
00:28:30.180 | then we modify a little bit the conversation
00:28:32.340 | after generating the eval.
00:28:33.860 | So it's very easy for us to spin up this kind of eval.
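A conceptual sketch of turning a recorded conversation into a unit-test-style eval, in the spirit of the "one click from any conversation history" idea above. run_agent and the judge are placeholders, not Lindy's eval infrastructure:

```python
# Snapshot a real conversation as a fixture, then assert the agent still
# handles it acceptably. Illustrative only.
import json
from pathlib import Path

def snapshot_eval(conversation: list[dict], expected_behavior: str, path: str) -> None:
    # "One click": dump the transcript plus a human-readable expectation.
    Path(path).write_text(json.dumps(
        {"conversation": conversation, "expected_behavior": expected_behavior}, indent=2))

def run_agent(conversation: list[dict]) -> str:
    # Placeholder for re-running the agent on the recorded history.
    return "Here is a link to our video tutorials: https://example.com/academy"

def judge(reply: str, expected_behavior: str) -> bool:
    # Simplest possible check; a real harness might use an LLM judge against
    # expected_behavior here instead of a string match.
    return "youtube.com/watch?v=dQw4w9WgXcQ" not in reply

def test_eval_case(path: str) -> None:
    case = json.loads(Path(path).read_text())
    reply = run_agent(case["conversation"])
    assert judge(reply, case["expected_behavior"]), f"Violated: {case['expected_behavior']}"

snapshot_eval(
    [{"role": "user", "content": "Do you have video tutorials?"}],
    "Answer with a real link, never a Rickroll.",
    "eval_no_rickroll.json",
)
test_eval_case("eval_no_rickroll.json")
```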
00:28:36.860 | - Do you use an off-the-shelf tool,
00:28:38.420 | like Braintrust, which we had on the podcast,
00:28:40.260 | or did you just build your own?
00:28:41.860 | - We built, we unfortunately built our own.
00:28:43.900 | We're most likely going to switch to Braintrust.
00:28:47.060 | It's, well, when we built it, there was nothing.
00:28:49.780 | Like, there was no eval tool, frankly.
00:28:51.260 | And we, I mean, we started this project like end of 2022.
00:28:53.820 | It was like, it was very, very, very early.
00:28:55.740 | I wouldn't recommend it to build your own eval tool.
00:28:57.740 | There's better solutions out there
00:28:58.940 | and our eval tool breaks all the time
00:29:00.820 | and it's a nightmare to maintain.
00:29:02.220 | And that's not something we want to be spending our time on.
00:29:04.300 | - I was going to ask that basically,
00:29:05.500 | 'cause I think my first conversations with you about Lindy
00:29:08.180 | was that you had a strong opinion
00:29:09.820 | that everyone should build their own tools.
00:29:11.820 | And you were very proud of your evals.
00:29:13.620 | You're kind of showing off to me
00:29:14.740 | like how many evals you were running, right?
00:29:16.260 | - Yeah, I think that was before
00:29:17.700 | all of these tools came around.
00:29:18.780 | I think the ecosystem has matured a fair bit.
00:29:21.300 | - What is one thing that Braintrust has nailed
00:29:23.460 | that you always struggled to do?
00:29:25.460 | - Well, not using them yet, so I couldn't tell.
00:29:27.380 | But from what I've gathered
00:29:28.580 | from the conversations I've had,
00:29:29.900 | like they're doing what we do with our eval tool, but better.
00:29:32.900 | - Yeah, and like they do it,
00:29:34.700 | but also like 60 other companies do it, right?
00:29:36.340 | So I don't know how to shop apart from brand.
00:29:40.100 | - Yeah. - Word of mouth.
00:29:41.060 | - Same here.
00:29:42.660 | - Yeah, like evals, there's two kinds of evals, right?
00:29:45.980 | In some way, you don't have to eval your system as much
00:29:50.340 | because you've constrained the language model so much.
00:29:52.980 | And you can rely on OpenAI to guarantee
00:29:55.260 | that the structured outputs are going to be good, right?
00:29:57.380 | We had Michelle sit where you sit
00:29:58.980 | and she explained exactly how they do
00:30:02.020 | constrained grammar sampling and all that good stuff.
00:30:04.060 | So actually, I think it's more important
00:30:06.580 | for your customers to eval their Lindys
00:30:08.460 | than you evaling your Lindy platform
00:30:10.780 | 'cause you just built the platform.
00:30:11.780 | You don't actually need to eval that much.
00:30:14.380 | - Yeah, in an ideal world,
00:30:16.300 | our customers don't need to care about this.
00:30:18.060 | And I think the bar is not like,
00:30:20.380 | look, it needs to be at 100%.
00:30:22.300 | I think the bar is it needs to be better than a human.
00:30:24.980 | And for most use cases we serve today,
00:30:27.300 | it is better than a human,
00:30:28.380 | especially if you put it on Rails.
00:30:30.180 | - Is there a limiting factor of Lindy at the business?
00:30:33.740 | Like, is it adding new connectors?
00:30:36.180 | Is it adding new node types?
00:30:38.340 | Like how do you prioritize
00:30:39.780 | what is the most impactful to your company?
00:30:41.700 | - Yeah, the raw capabilities for sure are a big limit.
00:30:45.540 | It is actually shocking the extent
00:30:46.940 | to which the model is no longer the limit.
00:30:49.100 | It was the limit a year ago.
00:30:50.380 | It was too expensive.
00:30:51.420 | The context window was too small.
00:30:52.980 | It's kind of insane that we started building this
00:30:54.860 | when the context windows were like 4,000 tokens.
00:30:56.660 | Like today our system prompt is more than 4,000 tokens.
00:30:59.220 | So yeah, the model is actually very much
00:31:01.220 | not a limit anymore.
00:31:02.540 | It almost gives me pause because I'm like,
00:31:04.100 | I want the model to be a limit.
00:31:05.660 | And so no, the integrations are ones,
00:31:07.740 | the core capabilities are ones.
00:31:09.060 | So for example, we are investing in a system
00:31:11.420 | that's basically, I call it like,
00:31:12.580 | it's jayhack who gave me this name,
00:31:14.300 | like the poor man's RLHF.
00:31:16.100 | So you can turn on a toggle on any step
00:31:19.300 | of your Lindy workflow to be like,
00:31:21.220 | ask me for confirmation before you actually
00:31:22.820 | execute this step.
00:31:23.780 | So it's like, hey, I receive an email,
00:31:25.300 | you send a reply,
00:31:26.420 | ask me for confirmation before actually sending it.
00:31:28.540 | And so today you see the email that's about to get sent
00:31:30.780 | and you can either approve, deny,
00:31:32.300 | or change it and then approve.
00:31:33.780 | And we are making it so that when you make a change,
00:31:36.420 | we are then saving this change that you're making
00:31:38.900 | or embedding it in a vector database.
00:31:40.220 | And then we are retrieving these examples for future tasks
00:31:42.780 | and injecting them into the context window.
00:31:44.740 | So that's the kind of capability that like
00:31:46.300 | makes a huge difference for users.
00:31:48.500 | That's the bottleneck today.
00:31:49.540 | It's really like good old engineering and product work.
00:31:52.100 | I assume you're hiring.
00:31:52.940 | What's the call for hiring at the end?
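An illustrative sketch of the "poor man's RLHF" loop described just above: store human edits made at the confirmation step, retrieve the most similar past corrections, and inject them into the prompt for future tasks. The embed function is a toy stand-in for a real embedding model and vector database:

```python
# Toy feedback loop: record (context, human correction) pairs and retrieve the
# closest ones as few-shot examples for the next prompt. Illustrative only.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding" so the sketch runs without an API key.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

feedback_store: list[dict] = []  # stands in for the vector database

def record_human_edit(task_context: str, model_draft: str, human_final: str) -> None:
    feedback_store.append({
        "vec": embed(task_context),
        "example": f"Draft: {model_draft}\nHuman correction: {human_final}",
    })

def build_prompt(task_context: str, k: int = 2) -> str:
    # Retrieve the k most similar past corrections and inject them as few-shot examples.
    ranked = sorted(feedback_store, key=lambda r: cosine(r["vec"], embed(task_context)), reverse=True)
    examples = "\n\n".join(r["example"] for r in ranked[:k])
    return f"Past corrections to imitate:\n{examples}\n\nNew task:\n{task_context}"

record_human_edit("Reply to refund request", "We cannot help.",
                  "Happy to help, refund issued, 5 business days.")
print(build_prompt("Reply to a customer asking about a refund"))
```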
00:31:54.700 | - Any other comments on the model side?
00:31:57.180 | When did you start feeling like the model
00:31:58.900 | was not a bottleneck anymore?
00:32:00.820 | Was it 4.0?
00:32:01.780 | Was it 3.5?
00:32:03.940 | - 3.5 Sonnet, definitely.
00:32:06.580 | I think 4.0 is overhyped, frankly.
00:32:08.660 | We don't use 4.0.
00:32:09.820 | I don't think it's good for agentic behavior.
00:32:12.180 | - Yeah, 3.5 Sonnet is when I started feeling that.
00:32:14.540 | And then with prompt caching with 3.5 Sonnet,
00:32:17.220 | like that cut the cost,
00:32:18.820 | cut the cost again.
00:32:19.660 | - Just cut it by half.
00:32:20.700 | - Yeah.
00:32:21.540 | - Your prompts are,
00:32:23.060 | some of the problems with agentic uses
00:32:25.780 | is that your prompts are kind of dynamic, right?
00:32:27.460 | Like for caching to work,
00:32:28.700 | you need the prompt prefix portion to be stable.
00:32:32.180 | - Yes, but we have this append-only ledger paradigm.
00:32:36.100 | So every node keeps appending to that ledger
00:32:38.260 | and every following node inherits all the context
00:32:41.020 | built up by all the previous nodes.
00:32:42.820 | And so we can just decide like,
00:32:44.060 | hey, every X thousand nodes,
00:32:45.580 | we trigger prompt caching again.
00:32:47.380 | - Oh, so you do it like programmatically,
00:32:49.220 | not all the time.
00:32:50.140 | - No, sorry.
00:32:50.980 | Anthropic manages that for us.
00:32:51.820 | But basically it's like,
00:32:52.660 | because we keep appending to the prompt,
00:32:54.220 | we just, like the prompt caching works pretty well.
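A hedged sketch of prompt caching over an append-only ledger using the Anthropic SDK's cache_control field on the stable system prompt; the model name, prompt contents, and ledger shape are placeholders rather than Lindy's actual setup:

```python
# The big static system prompt is marked cacheable so repeated calls reuse it,
# while per-task messages keep appending after it and never rewrite the prefix.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

BIG_SYSTEM_PROMPT = "You are an agent... (several thousand tokens of stable instructions)"

def run_step(ledger: list[dict], new_user_input: str) -> str:
    # Append-only ledger: every step adds to the message list, so the cached
    # prefix stays byte-identical across calls.
    ledger.append({"role": "user", "content": new_user_input})
    response = client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        system=[{
            "type": "text",
            "text": BIG_SYSTEM_PROMPT,
            "cache_control": {"type": "ephemeral"},  # cache the stable prefix
        }],
        messages=ledger,
    )
    reply = response.content[0].text
    ledger.append({"role": "assistant", "content": reply})
    return reply

ledger: list[dict] = []
print(run_step(ledger, "Summarize the last Zendesk ticket."))
print(run_step(ledger, "Now draft a reply."))
```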
00:32:56.660 | - We have this like smol podcaster tool
00:32:58.780 | that I built for the podcast
00:32:59.940 | and I rewrote all of our prompts
00:33:01.820 | because I noticed, you know,
00:33:03.020 | I was inputting stuff early on.
00:33:04.660 | I wonder how much more money OpenAI and Anthropic are making
00:33:07.420 | just because people don't rewrite their prompts
00:33:09.700 | to be like static at the top
00:33:11.140 | and like dynamic at the bottom, but.
00:33:13.100 | - I think that's the remarkable thing
00:33:14.460 | about what we're having right now
00:33:16.100 | is it's insane that these companies
00:33:17.580 | are routinely cutting their costs by two, four, five.
00:33:20.620 | Like they basically just apply constraints.
00:33:22.300 | They want people to take advantage of these innovations.
00:33:24.700 | - Very good.
00:33:25.540 | Do you have any other competitive commentary?
00:33:27.780 | Dust, WordWare, Gumloop, Zapier?
00:33:30.260 | If not, we can move on.
00:33:31.620 | - No comment.
00:33:32.460 | I think the market is, look, I mean, AGI is coming.
00:33:36.740 | - All right, that's what I'm talking about.
00:33:37.940 | - I think you're helping.
00:33:38.780 | Like you're paving the road to AGI.
00:33:41.380 | - I'm playing my small role.
00:33:43.020 | I'm adding my small brick to this giant, giant,
00:33:45.260 | giant castle.
00:33:46.100 | Yeah, look, when it's here,
00:33:48.260 | we are gonna, this entire category of software
00:33:51.180 | is going to create,
00:33:52.260 | it's going to sound like an exaggeration,
00:33:53.580 | but it is a fact
00:33:54.420 | that it's going to create trillions of dollars of value
00:33:56.700 | in a few years, right?
00:33:58.060 | It's going to, for the first time,
00:33:59.180 | we're actually having software
00:34:00.460 | directly replace human labor.
00:34:02.180 | I see it every day in sales calls.
00:34:04.500 | It's like Lindy is today replacing,
00:34:06.300 | like we talk to even small teams.
00:34:07.860 | It's like, oh, like, stop, this is a 12 people team here.
00:34:11.220 | I guess we'll set up this Lindy for one or two days,
00:34:12.860 | and then we'll have to decide
00:34:13.940 | what we do with this 12 people team.
00:34:15.820 | And so, yeah, to me,
00:34:17.140 | there's this immense uncapped market opportunity.
00:34:19.900 | It's just such a huge ocean.
00:34:21.220 | And there's like three sharks in the ocean.
00:34:22.620 | I'm focused on the ocean more than on the sharks.
00:34:24.820 | - Cool.
00:34:25.660 | So we're moving on to hot topics,
00:34:26.500 | like kind of broadening out from Lindy,
00:34:27.940 | but obviously informed by Lindy.
00:34:29.460 | What are the high order bits of good agent design?
00:34:31.740 | - The model, the model, the model, the model.
00:34:33.620 | I think people fail to truly, and me included,
00:34:37.420 | they fail to truly internalize the bitter lesson.
00:34:40.060 | So for the listeners out there who don't know about it,
00:34:42.260 | it's basically like, you just scale the model,
00:34:44.060 | like GPUs go brrr, it's all that matters.
00:34:46.980 | I think it also holds for the cognitive architecture.
00:34:49.820 | I used to be very cognitive architecture-pilled,
00:34:52.180 | and I was like, ah, and I was like a critic,
00:34:53.860 | and I was like a generator, and all this,
00:34:55.460 | and then it's just like GPUs go brrr,
00:34:56.900 | like just like let the model do its job.
00:34:58.740 | I think we're seeing it a little bit right now with O1.
00:35:01.180 | I'm seeing some tweets that say
00:35:02.460 | that the new 3.5 Sonnet is as good as O1,
00:35:05.820 | but with none of all the crazy-
00:35:09.100 | - It beats O1 on some measures.
00:35:11.580 | - On some reasoning tasks.
00:35:12.580 | - On AIME, it's still a lot lower.
00:35:14.260 | Like it's like 14 on AIME versus O1, it's like 83.
00:35:17.020 | - Got it.
00:35:17.860 | - So. (laughs)
00:35:18.700 | - Right.
00:35:19.540 | - But even O1 is still the model.
00:35:20.940 | - Yeah.
00:35:21.780 | - Like there's no cognitive architecture on top of it.
00:35:23.380 | You can just like wait for O1 to get better.
00:35:25.460 | - And so as a founder, how do you think about that?
00:35:27.940 | Right, because now knowing this,
00:35:30.020 | wouldn't you just wait to start Lindy?
00:35:32.100 | You know, you started Lindy, it's like 4K context,
00:35:34.020 | the models are not that good.
00:35:35.740 | It's like, but you're still kind of like
00:35:37.020 | going along and building
00:35:38.260 | and just like waiting for the models to get better.
00:35:40.300 | How do you today decide, again, what to build next?
00:35:43.460 | - Yeah.
00:35:44.300 | - Knowing that, hey, the models are gonna get better,
00:35:45.260 | so maybe we just shouldn't focus on improving
00:35:47.460 | our prompt design and all that stuff
00:35:48.980 | and just build the connectors instead or whatever.
00:35:50.860 | - Yeah, I mean, that's exactly what we do.
00:35:52.380 | Like all day, we always ask ourselves,
00:35:54.420 | oh, when we have a feature idea or a feature request,
00:35:56.820 | we ask ourselves like, is this the kind of thing
00:35:58.540 | that just gets better while we sleep
00:36:00.420 | because models get better?
00:36:02.140 | I'm reminded again, when we started this in 2022,
00:36:05.020 | we spent a lot of time because we had to
00:36:06.700 | around the context pruning,
00:36:08.180 | 'cause 4,000 tokens is really nothing.
00:36:09.700 | You really can't do anything with 4,000 tokens.
00:36:11.780 | All that work was throwaway work.
00:36:13.300 | Like now it's like it was for nothing, right?
00:36:15.540 | Now we just assume that infinite context windows
00:36:17.860 | are gonna be here in a year or something,
00:36:19.420 | a year and a half, and infinitely cheap as well.
00:36:22.660 | And dynamic compute is gonna be here.
00:36:24.620 | Like we just assume all of these things are gonna happen.
00:36:26.460 | And so we really focus,
00:36:27.860 | our job to be done in the industry
00:36:29.780 | is to provide the input and output to the model.
00:36:32.340 | I really compare it all the time to the PC and the CPU,
00:36:34.540 | right? Apple is busy all day.
00:36:35.660 | They're not like a CPU wrapper.
00:36:36.980 | They have a lot to build, but they don't build the CPU.
00:36:38.500 | Well, now actually they do build the CPU as well,
00:36:40.420 | but leaving that aside, they're busy building a laptop.
00:36:43.140 | It's just a lot of work to build these things.
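[Editor's note: on the 4,000-token context pruning mentioned a moment ago, a minimal sketch of what that kind of now-throwaway work looked like, assuming a crude whitespace token count and a keep-the-most-recent policy. Real implementations used proper tokenizers and smarter heuristics.]

```python
def count_tokens(text: str) -> int:
    # Whitespace split as a stand-in for a real tokenizer.
    return len(text.split())


def prune_context(system_prompt: str, messages: list[str], budget: int = 4000) -> list[str]:
    """Keep the system prompt, then as many of the newest messages as fit."""
    remaining = budget - count_tokens(system_prompt)
    kept: list[str] = []
    for msg in reversed(messages):  # newest first
        cost = count_tokens(msg)
        if cost > remaining:
            break
        kept.append(msg)
        remaining -= cost
    return [system_prompt] + list(reversed(kept))


history = [f"turn {i}: " + "word " * 300 for i in range(40)]
pruned = prune_context("You are a helpful agent.", history)
```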
00:36:44.860 | - It's interesting 'cause like, for example,
00:36:46.620 | another person that we're close to, Mihaly from Repl.it,
00:36:49.980 | he often says that the biggest jump for him
00:36:52.260 | was having a multi-agent approach,
00:36:53.740 | like the critique thing that you just said
00:36:56.300 | that he doesn't need.
00:36:57.340 | And I wonder when, in what situations
00:36:59.180 | you do need that and what situations you don't.
00:37:01.180 | Obviously the simple answer is for coding, it helps.
00:37:04.100 | And you're not coding except for...
00:37:06.420 | Are you still generating code?
00:37:07.660 | - In Lindy? - Yeah.
00:37:08.660 | - No, we do. - Not really, right?
00:37:10.140 | - Oh, right.
00:37:10.980 | No, no, no, the cognitive architecture changed.
00:37:12.540 | We don't, yeah. - Yeah, yeah, okay.
00:37:14.100 | For you, you one-shot and you chain tools together
00:37:17.100 | and that's it.
00:37:18.180 | - And if the user really wants
00:37:19.940 | to have this kind of critique thing,
00:37:21.100 | you can also edit the prompt.
00:37:22.420 | You're welcome to.
00:37:23.260 | I have some of my Lindys, I've told them like,
00:37:24.860 | "Hey, be careful, think step-by-step
00:37:26.300 | "about what you're about to do."
00:37:27.300 | But that gives you a little bump also on use cases,
00:37:30.020 | but yeah.
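[Editor's note: a minimal sketch of the "no critic, no generator, just one model chaining tools" shape described above. The fake_model stub and tool names are purely illustrative so the loop runs on its own; a real system would call an LLM at that step.]

```python
from typing import Callable

TOOLS: dict[str, Callable[[str], str]] = {
    "search_crm": lambda q: f"3 leads matching {q!r}",
    "send_email": lambda body: "email queued",
}


def fake_model(transcript: list[dict]) -> dict:
    # Stand-in for an LLM call: asks for one tool, then answers.
    if len(transcript) == 1:
        return {"tool": "search_crm", "input": "fintech founders"}
    return {"answer": "Found 3 leads and drafted outreach."}


def run_agent(task: str, call_model=fake_model, max_steps: int = 10) -> str:
    transcript: list[dict] = [{"role": "user", "content": task}]
    for _ in range(max_steps):
        step = call_model(transcript)
        if "answer" in step:                         # the model says it is done
            return step["answer"]
        result = TOOLS[step["tool"]](step["input"])  # chain the requested tool
        transcript.append({"role": "tool", "content": result})
    return "stopped: step limit reached"


print(run_agent("Find leads and draft outreach"))
```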
00:37:30.980 | - What about unexpected model releases?
00:37:33.100 | So Anthropic released computer use today.
00:37:36.420 | I don't know if many people were expecting computer use
00:37:38.740 | to come out today.
00:37:40.060 | Do these things make you rethink how to design
00:37:42.700 | like your roadmap and things like that?
00:37:44.140 | Or are you just like, "Hey, look, whatever."
00:37:46.020 | That's just like a small thing in their like AGI pursuit
00:37:49.140 | that like maybe they're not even gonna support.
00:37:51.180 | And like, it's still better for us
00:37:52.300 | to build their own integrations into systems
00:37:54.780 | and things like that.
00:37:55.820 | Because maybe people would say,
00:37:56.940 | "Hey, look, why am I building all these API integrations
00:37:59.460 | "when I can just do computer use
00:38:00.940 | "and never go to the product?"
00:38:02.340 | - Yeah.
00:38:03.180 | No, I mean, we did take into account computer use.
00:38:05.420 | We were talking about this a year ago or something.
00:38:07.820 | Like we've been talking about it as part of our roadmap.
00:38:09.900 | It's been clear to us that it was coming.
00:38:11.900 | Like we've read reports of OpenAI working
00:38:13.820 | on something like that for a very long time.
00:38:16.060 | My philosophy about it is anything that can be done
00:38:18.700 | with an API must be done by an API
00:38:20.900 | or should be done by an API for a very long time.
00:38:24.260 | I think it is dangerous to be overly cavalier
00:38:26.500 | about improvements of model capabilities.
00:38:29.180 | I'm reminded of iOS versus Android.
00:38:31.660 | Android was built on the JVM.
00:38:34.020 | There was a garbage collector.
00:38:35.100 | And I can only assume that the conversation that went down
00:38:37.700 | in the engineering meeting room was,
00:38:40.140 | "Oh, who cares about the garbage collector?
00:38:41.700 | "Anyway, Moore's law is here.
00:38:43.460 | "And so that's all going to go to zero eventually."
00:38:45.820 | Sure, but in the meantime,
00:38:47.740 | you are operating on a 400 megahertz CPU,
00:38:50.220 | was like the first CPU on the iPhone 1.
00:38:51.900 | And it's really slow.
00:38:52.940 | And the garbage collector is introducing
00:38:54.260 | a tremendous overhead on top of that,
00:38:56.460 | especially like a memory overhead.
00:38:58.020 | And so for the longest time,
00:38:59.060 | and it's really only been recently
00:39:00.460 | that Android caught up to iOS
00:39:02.460 | in terms of how smooth the interactions were.
00:39:04.700 | But for the longest time,
00:39:05.620 | Android phones were significantly slower and laggier
00:39:08.580 | and just not feeling as good as iOS devices.
00:39:10.980 | And so, look, when you're talking about
00:39:12.820 | all those magnitude of differences
00:39:14.460 | in terms of performance and reliability,
00:39:17.100 | which is what we are talking about
00:39:18.460 | when we're talking about API use versus computer use,
00:39:20.900 | then you can't ignore that, right?
00:39:22.420 | And so I think we're going to be
00:39:24.500 | in an API use world for a while.
00:39:27.220 | - O1 doesn't have API use today.
00:39:29.060 | It will have it at some point.
00:39:30.500 | It's on the roadmap.
00:39:31.980 | There is a future in which OpenAI
00:39:34.620 | goes much harder after your business,
00:39:37.180 | your market, than it is today.
00:39:39.300 | Like ChatGPT, it's its own business.
00:39:41.140 | It's making like $2 billion a year or something.
00:39:43.780 | All they need to do is add tools to the ChatGPT,
00:39:47.700 | and now they're suddenly competing with you.
00:39:49.340 | And by the way, they have a GPT store
00:39:51.180 | where a bunch of people have already
00:39:53.100 | configured their tools to fit with them.
00:39:54.940 | Is that a concern?
00:39:56.980 | - I think even the GPT store in a way,
00:39:58.700 | like the way they architect it, for example,
00:40:00.180 | the plug-in system, we're actually grateful for it
00:40:02.340 | because it's like we can also use the plug-ins.
00:40:04.060 | It's very open.
00:40:05.100 | No, again, I think it's going to be such a huge market.
00:40:07.820 | I think there's going to be
00:40:08.660 | a lot of different jobs to be done.
00:40:10.100 | Today, at least, ChatGPT,
00:40:11.460 | I know they have like a huge enterprise offering and stuff,
00:40:13.580 | but today, ChatGPT is a consumer app, right?
00:40:15.700 | And so the sort of flow detail I showed you,
00:40:17.460 | this sort of workflow,
00:40:18.300 | this sort of use cases that we're going after,
00:40:19.980 | which is like, we're doing a lot of like lead generation
00:40:22.340 | and lead outreach and all of that stuff.
00:40:24.340 | That's not something like meeting recording,
00:40:26.020 | like Lindy today right now joins your Zoom meetings
00:40:28.700 | and takes notes, all of that stuff.
00:40:30.100 | I don't see that so far on the OpenAI roadmap.
00:40:33.340 | - Yeah, but they do have an enterprise team
00:40:35.420 | that we talked to for a Decibel Summit.
00:40:38.620 | Cool, I have some other questions on company building stuff.
00:40:41.300 | You're hiring GMs?
00:40:42.380 | - We did.
00:40:43.220 | - It's a fascinating way to build a business, right?
00:40:45.260 | Like what should you, as CEO, be in charge of?
00:40:48.980 | And what should you basically hire a mini CEO to do?
00:40:52.500 | - Yeah, that's a good question.
00:40:53.540 | I think that's all something we're figuring out.
00:40:55.380 | The GM thing was inspired from my days at Uber,
00:40:58.380 | where we hired one GM per city or per major geo area.
00:41:02.180 | We had like all GMs, regional GMs and so forth.
00:41:04.820 | And yeah, Lindy is so horizontal
00:41:06.860 | that we thought it made sense to hire GMs
00:41:08.940 | to own each vertical and to go to market of the vertical
00:41:11.820 | and the customization of the Lindy templates
00:41:13.940 | for these verticals and so forth.
00:41:15.820 | What should I own as a CEO?
00:41:18.780 | I mean, the canonical reply here is always going to be,
00:41:21.420 | you own the fundraising, you own the culture,
00:41:24.380 | you own the, what's the rest of the canonical reply?
00:41:27.740 | The culture, the fundraising.
00:41:29.060 | - I don't know, products.
00:41:30.780 | - Even that eventually you do have to hand out.
00:41:33.500 | Yes, the vision, the culture and the fundraising.
00:41:35.740 | And it's like, if you just do these things
00:41:37.380 | and you've done it well, you've done your job as a CEO.
00:41:39.460 | In practice, obviously, yeah.
00:41:40.500 | I mean, all day, I do a lot of product work still
00:41:42.540 | and I want to keep doing product work
00:41:43.980 | for as long as possible.
00:41:45.060 | Obviously, like recruiting and managing the team, yeah.
00:41:48.220 | - That one feels like the most automatable part of the job,
00:41:50.860 | the recruiting stuff.
00:41:52.020 | - Well, yeah.
00:41:52.860 | You saw me design your recruiter here, yeah.
00:41:56.260 | - Relationship between Factorio and building Lindy.
00:41:59.420 | - We actually very often talk about
00:42:01.220 | how the business of the future is like a game of Factorio.
00:42:04.460 | It's like, you just wake up in the morning
00:42:06.380 | and you've got your Lindy instance, it's like Slack,
00:42:08.700 | and you've got like 5,000 Lindys in the sidebar
00:42:11.060 | and your job is to somehow manage your 5,000 Lindys.
00:42:13.740 | And it's going to be very similar to company building
00:42:16.140 | because you're going to look for the highest leverage way
00:42:18.980 | to understand what's going on in your AI company
00:42:22.220 | and understand what levels do you have
00:42:24.780 | to make impact in that company.
00:42:26.380 | So I think it's going to be very similar
00:42:27.580 | to like a human company,
00:42:28.540 | except it's going to go infinitely faster.
00:42:30.620 | Today in a human company,
00:42:31.740 | you could have a meeting with your team and you're like,
00:42:33.020 | "Oh, I guess we need one more designer.
00:42:35.100 | Okay, I guess I'll kick off a search."
00:42:36.540 | And two months later, you have a new designer.
00:42:38.300 | Now it's like, "Okay, boom, I'm going to spin up
00:42:39.940 | 50 designers." - That is going to go away.
00:42:40.980 | - Yeah.
00:42:41.820 | - Like actually, it's more important that you can clone
00:42:44.700 | an existing designer that you know works.
00:42:46.700 | Because the hiring process, you cannot clone someone.
00:42:49.180 | - Yeah.
00:42:50.020 | - Because every new person you bring in
00:42:51.340 | is going to have their own tweaks and you don't want that.
00:42:53.900 | - Yeah, yeah, that's true.
00:42:56.540 | - You want an army of mindless drones
00:42:58.100 | that all work the same way.
00:42:59.500 | The reason I bring Factorio up as well
00:43:02.940 | is one, Factorio Space just came out.
00:43:04.460 | Apparently a whole bunch of people stopped working.
00:43:06.580 | I tried out Factorio.
00:43:07.620 | I never really got that much into it.
00:43:09.140 | But the other thing was you had a tweet recently
00:43:11.100 | about how the sort of intentional top-down design
00:43:15.220 | was not as effective as just build.
00:43:18.420 | - Yeah.
00:43:20.220 | - Just ship.
00:43:21.060 | - I think people read it a little bit too much
00:43:22.860 | into that tweet.
00:43:23.700 | It went weirdly viral.
00:43:25.380 | I did not intend it as a giant statement on life.
00:43:27.860 | - I mean, you notice you have a pattern of this, right?
00:43:29.700 | Like you've done this for eight years now.
00:43:31.740 | You should know.
00:43:32.580 | (both laughing)
00:43:33.980 | - I legit was just hearing an interesting story
00:43:35.900 | about the Factorio game I had.
00:43:37.060 | And everybody was like, "Oh my God, so deep."
00:43:38.500 | I guess this explains everything about life and companies.
00:43:40.860 | And there is something to be said certainly
00:43:43.020 | about focusing on the constraint.
00:43:44.820 | And I think it is Patrick Collison who said,
00:43:46.820 | "People underestimate the extent to which moonshots
00:43:49.900 | are just one pragmatic step taken after the other."
00:43:52.340 | And I think as long as you have some inductive bias
00:43:54.300 | about like some loose idea about where you want to go,
00:43:56.500 | I think it makes sense to follow a sort of greedy search
00:43:59.620 | along that path.
00:44:00.700 | I think planning and organizing is important
00:44:02.740 | and having order is important.
00:44:05.140 | - I'm wrestling with that.
00:44:06.620 | There's two ways I encountered it recently.
00:44:08.980 | One with Lindy.
00:44:10.140 | When I tried out one of your automation templates
00:44:11.940 | and one of them was quite big
00:44:14.100 | and I just didn't understand it, right?
00:44:15.980 | So like it was not as useful to me as a small one
00:44:18.380 | that I can just plug in and see all of.
00:44:20.620 | And then the other one was me using Cursor.
00:44:23.380 | I was very excited about O1
00:44:25.500 | and I just upfront stuffed everything I wanted to do
00:44:29.980 | into my prompt and expected O1 to do everything.
00:44:33.820 | And it got itself into a huge jumbled mess
00:44:36.020 | and it was stuck.
00:44:37.180 | It was really...
00:44:38.820 | There was no amount, I wasted like two hours
00:44:40.740 | on just like trying to get out of that hole.
00:44:43.140 | So I threw away the code base, started small,
00:44:45.380 | switched to Claude Sonnet
00:44:46.540 | and built off something working
00:44:47.860 | and just added over time and it just worked.
00:44:50.100 | And to me, that was the Factorio sentiment, right?
00:44:52.940 | Maybe I'm one of those fanboys
00:44:54.020 | that's just like obsessing over the depth
00:44:56.180 | of something that you just randomly tweeted out.
00:44:58.460 | But I think it's true for company building,
00:45:00.380 | for Lindy building, for coding, I don't know.
00:45:02.500 | - I think it's fair.
00:45:03.420 | And I think like you and I talked about
00:45:05.380 | there's the Tuft and Metal principle
00:45:06.580 | and there's this other.
00:45:07.420 | - Yes, I love that.
00:45:08.740 | - There's the, I forgot the name of this other blog post
00:45:11.420 | but it's basically about this book,
00:45:13.820 | "Seeing Like a State"
00:45:15.020 | that talks about the need for legibility
00:45:18.140 | and people who optimize the system for its legibility.
00:45:20.540 | And anytime you make a system,
00:45:21.900 | so legible is basically more understandable.
00:45:23.540 | Anytime you make a system more understandable
00:45:25.060 | from the top down,
00:45:26.060 | it performs less well from the bottom up.
00:45:28.100 | And it's fine if that's what you want,
00:45:29.820 | but you should at least make this trade off
00:45:31.420 | with your eyes wide open.
00:45:32.380 | You should know I am sacrificing performance
00:45:34.500 | for understandability, for legibility.
00:45:36.340 | And in this case for you, it makes sense.
00:45:38.300 | It's like you are actually optimizing for legibility.
00:45:40.100 | You do want to understand your code base,
00:45:41.860 | but in some other cases, it may not make sense.
00:45:44.660 | Sometimes it's better to leave the system alone
00:45:47.020 | and let it be its glorious, chaotic, organic self
00:45:50.940 | and just trust that it is going to perform well,
00:45:53.100 | even though you don't understand it completely.
00:45:55.020 | - It does remind me of a common managerial issue or dilemma,
00:45:59.780 | which you experienced at a small scale at Lindy
00:46:01.780 | where, you know, do you want to organize your company
00:46:04.740 | by functional sections or by products
00:46:07.740 | or, you know, whatever the opposite of functional is.
00:46:10.300 | And you tried it one way
00:46:11.340 | and it was more legible to you as CEO,
00:46:14.500 | but actually it stopped working at the small level.
00:46:16.900 | - Yeah, I mean, one very small example,
00:46:18.860 | again, at a small scale is
00:46:20.540 | we used to have everything on Notion.
00:46:22.580 | And for me as founder, it was awesome
00:46:23.980 | because everything was there.
00:46:25.300 | The roadmap was there, the tasks were there,
00:46:27.860 | the postmortems were there.
00:46:29.220 | And so the postmortem was linked to a task.
00:46:30.820 | - Yeah, it's optimized for you.
00:46:31.700 | - It was exactly.
00:46:32.540 | And so I had this like one pane of glass
00:46:33.940 | and everything was on Notion.
00:46:35.220 | And then the team one day came to me with pitchforks
00:46:37.180 | and they really wanted to implement Linear.
00:46:39.340 | And I had to bite my fist so hard.
00:46:41.420 | I was like, fine, do it, implement Linear.
00:46:43.700 | 'Cause I was like, at the end of the day,
00:46:44.940 | the team needs to be able to self-organize
00:46:46.820 | and pick their own tools.
00:46:48.020 | Yeah, but it did make the company
00:46:49.260 | slightly less legible for me.
00:46:51.060 | - Another big change you had was going away from remote work,
00:46:54.300 | bringing people back in person.
00:46:55.780 | I think there's obviously every other month
00:46:58.620 | the discussion comes up again.
00:47:00.300 | What was that discussion like?
00:47:02.100 | How did your feelings change?
00:47:04.220 | Was there kind of like a threshold of employees
00:47:06.180 | and team size where you felt like, okay, maybe that worked.
00:47:08.340 | Now it doesn't work anymore.
00:47:09.340 | And how are you thinking about the future
00:47:11.140 | as you scale the team?
00:47:12.340 | - Yeah, so for context,
00:47:14.300 | I used to have a business called TeamFlow.
00:47:16.460 | The business was about building a virtual office
00:47:18.580 | for remote teams.
00:47:19.460 | And so being remote was not merely something we did.
00:47:22.580 | I was banging the remote drum super hard
00:47:25.020 | because we were helping companies to go remote, right?
00:47:27.940 | And so, frankly, in a way it's a bit embarrassing
00:47:30.460 | for me to do like a 180 like that,
00:47:32.060 | but I guess when the facts changed, I changed my mind.
00:47:34.820 | What happened?
00:47:35.660 | Well, I think at first, like everyone else,
00:47:37.780 | we went remote by necessity.
00:47:39.460 | It was like COVID and you got to go remote.
00:47:41.220 | And on paper, the gains of remote are enormous.
00:47:44.940 | In particular, from a founder standpoint,
00:47:46.700 | being able to hire from anywhere is huge.
00:47:49.020 | Saving on rent is huge.
00:47:50.140 | Saving on commute is huge for everyone and so forth.
00:47:52.580 | But then, look, I'm not going to say anything original here.
00:47:55.100 | It's like it is really making it
00:47:56.820 | much harder to work together.
00:47:58.540 | And I spent three years of my youth
00:48:00.980 | trying to build a solution for this.
00:48:02.260 | And my conclusion is at least we couldn't figure it out
00:48:04.820 | and no one else could.
00:48:06.140 | Zoom didn't figure it out.
00:48:07.140 | We had like a bunch of competitors,
00:48:08.380 | like Gathertown was one of the bigger ones.
00:48:10.580 | We had dozens and dozens of competitors.
00:48:12.500 | No one figured it out.
00:48:13.500 | I don't know that software can actually solve this problem.
00:48:16.260 | Reality of it is everyone just wants to get off
00:48:18.540 | the darn Zoom call.
00:48:19.820 | And it's not a good feeling to be in your home office
00:48:22.780 | if you even are lucky enough to have a home office all day.
00:48:25.940 | It's harder to build culture.
00:48:27.220 | It's harder to get in sync.
00:48:28.660 | I think software is peculiar because it's like an iceberg.
00:48:31.820 | It's like the vast majority of it is submerged under water.
00:48:35.780 | And so the quality of the software that you ship
00:48:38.460 | is a function of the alignment of your mental models
00:48:40.780 | about what is below that waterline.
00:48:42.660 | Can you actually get in sync
00:48:43.860 | about what it is exactly fundamentally that we're building?
00:48:46.460 | What is the soul of a product?
00:48:47.940 | And it is so much harder to get in sync about that
00:48:50.060 | when you're remote.
00:48:51.060 | And then you waste time in a thousand ways
00:48:52.820 | because people are offline and you can't get ahold of them
00:48:54.820 | or like you can't share your screen.
00:48:55.820 | It's just, it's like you feel like you're walking
00:48:57.700 | in molasses all day.
00:48:59.100 | And eventually I just, I was like, okay, this is it.
00:49:01.180 | Like, we're not gonna do this anymore.
00:49:03.420 | - Yeah.
00:49:04.260 | I think that is the current builder
00:49:06.580 | San Francisco consensus here.
00:49:09.020 | But I still have a big, like one of my big heroes
00:49:11.940 | as a CEO is Sid Sijbrandij from GitLab.
00:49:14.900 | Matt Mullenweg used to be a hero,
00:49:16.500 | but like these people run thousand person remote businesses.
00:49:21.460 | The main idea is that at some company size,
00:49:24.340 | your company is remote anyway.
00:49:25.980 | Because if you go from one building to two buildings,
00:49:28.180 | you're congrats, you're now remote from the other building.
00:49:30.740 | Like if you go from one city office
00:49:32.660 | to like two city offices, they're remote from each other.
00:49:35.020 | - But the teams are co-located.
00:49:36.460 | Every time anyone talks about remote success stories,
00:49:38.540 | they always talk about these three or four.
00:49:39.860 | I mean, it's always GitLab and WordPress and Zapier.
00:49:44.860 | - Zapier.
00:49:45.980 | - It used to be InVision.
00:49:46.820 | (laughs)
00:49:47.660 | And I will point out that in every one of these examples,
00:49:50.780 | you have a co-located counterfactual
00:49:52.860 | that is sometimes orders of magnitude bigger.
00:49:55.140 | Look, I like Matt Mullenweg a lot,
00:49:57.380 | but WordPress is a commercial failure.
00:49:59.500 | They run 60% of the internet
00:50:01.140 | and they're like a fraction of the size of even Substack.
00:50:04.260 | Right?
00:50:05.940 | - They're trying to get more money.
00:50:06.780 | (laughs)
00:50:07.940 | - Yeah, that's my point, right?
00:50:09.580 | Like look, GitLab is much smaller than GitHub.
00:50:11.740 | InVision, you know, is no more.
00:50:13.660 | And Figma like completely took off.
00:50:15.500 | And Figma was like very in person.
00:50:16.820 | Figma let go of people
00:50:17.820 | because they wanted to move from San Francisco to LA.
00:50:19.940 | So I think if you're optimizing for productivity,
00:50:23.100 | if you really know, hey, this is a support ticket, right?
00:50:26.020 | And I want to have my support tickets
00:50:27.500 | for a buck 50 per support ticket.
00:50:28.820 | And next year, I want it for like a buck 20.
00:50:31.140 | Then sure, send your support ticket team
00:50:32.860 | to offshore like the Philippines or whatever,
00:50:34.780 | and just optimize for cost.
00:50:35.780 | If you're optimizing for cost, absolutely be remote.
00:50:38.220 | If you're optimizing for creativity,
00:50:39.820 | which I think that's software
00:50:41.100 | and product building is a creative endeavor.
00:50:43.060 | If you're optimizing for creativity,
00:50:44.700 | it's kind of like composing an album.
00:50:46.060 | You can't do it on the cheap.
00:50:46.980 | You want the very best album that you can make.
00:50:49.340 | And you have to be in person
00:50:50.540 | and hear the music to do that.
00:50:51.940 | - Yeah.
00:50:52.780 | So the line is that all jobs that can be remote
00:50:56.540 | should be AI or Lindy's,
00:50:58.980 | and all jobs that are not remote are in person.
00:51:01.580 | Like there's a very, very clear separation of jobs.
00:51:04.460 | - Sure.
00:51:05.300 | Well, I think over the long term,
00:51:06.140 | every job is going to be AI anyway.
00:51:07.500 | (all laughing)
00:51:09.140 | - It would be curious to break down
00:51:11.020 | what you think is creativity in coding
00:51:13.500 | and in product defining and how to express that with LLMs.
00:51:16.940 | I think that is underexplored for sure.
00:51:19.020 | You're definitely a, what I call a temperature zero
00:51:21.740 | use case of LLMs.
00:51:23.100 | You want it to be reliable, predictable, small.
00:51:25.260 | And then there's other use cases of LLMs
00:51:27.460 | that are more for like creativity and engines, right?
00:51:30.620 | I haven't checked,
00:51:31.460 | but I'm pretty sure no one uses Lindy for brainstorming.
00:51:34.740 | Actually, probably they do.
00:51:36.540 | - I use Lindy for brainstorming a lot, actually.
00:51:38.060 | - Yeah, yeah, yeah.
00:51:39.380 | But like, you know,
00:51:40.220 | you want to have like something that's anti-fragile
00:51:42.380 | to hallucination.
00:51:43.540 | Like hallucinations are good.
00:51:45.420 | - By creativity, I mean,
00:51:46.900 | is it about direction or magnitude?
00:51:48.980 | If it is about direction, like decide what to do,
00:51:51.420 | then it's a creative endeavor.
00:51:52.780 | If it is about magnitude and just do it as fast as possible,
00:51:56.180 | as cheap as possible, then it's magnitude.
00:51:58.340 | And so sometimes, you know,
00:51:59.780 | software companies are not necessarily creative.
00:52:01.940 | Sometimes you know what you're doing.
00:52:03.580 | And I'll say that is going to come across the wrong way,
00:52:05.580 | but Linear, I look up to a huge amount,
00:52:08.340 | like such amazing product builders,
00:52:10.340 | but they know what they're building.
00:52:11.700 | They're building a task tracker.
00:52:12.660 | And so Linear is remote, right?
00:52:14.100 | Linear is building a task tracker, right?
00:52:16.860 | I don't mean to throw shade at them, like good for them.
00:52:18.620 | I think they're aware that they're not like-
00:52:19.980 | - They recently got shit for saying
00:52:21.820 | that they have work-life balance on their job description.
00:52:24.420 | They were like, "What do you mean by this?"
00:52:25.700 | (laughing)
00:52:28.420 | - We're building a new kind of product
00:52:30.100 | that no one's ever built before.
00:52:31.380 | And so we're just scratching our heads all day,
00:52:33.300 | trying to get in sync about like,
00:52:34.340 | what exactly is it that we're building?
00:52:35.980 | What does it consist of?
00:52:37.460 | - Inherently creative struggle.
00:52:39.340 | - Yeah.
00:52:40.180 | - Dare we ask about San Francisco?
00:52:41.940 | And there's a whole bunch of tough stuff in here.
00:52:44.100 | I don't know if you have any particular leanings.
00:52:46.500 | Probably the biggest one I'll just congratulate you on
00:52:49.620 | is becoming American, right?
00:52:50.980 | Like you, very French, but your heart was sort of
00:52:54.700 | in the US, you eventually found your way here.
00:52:57.100 | What are your takes for like founders, right?
00:52:59.500 | Like a few years ago,
00:53:00.660 | you wrote this post on like, "Go West, young man."
00:53:02.780 | And now you've basically completed that journey, right?
00:53:04.820 | Like you're now here and up to the point where
00:53:07.900 | you're kind of mystified by how Europe has been so decel.
00:53:11.540 | - In a way though, I feel vindicated
00:53:13.620 | 'cause I was making the prediction that Europe
00:53:15.780 | was over 14 years ago or something like that.
00:53:18.380 | I think it's been a walking corpse for a long time.
00:53:21.020 | I think it is only now becoming obvious
00:53:22.900 | that it is paying the consequences of its policies
00:53:24.860 | from 10, 20, 30 years ago.
00:53:26.780 | I think at this point,
00:53:27.700 | I wish I could rewrite the "Go West, young man" article,
00:53:31.260 | but really even more extreme.
00:53:33.580 | I think at this point, if you are in tech,
00:53:37.340 | especially in AI, but if you're in tech
00:53:39.380 | and you're not in San Francisco,
00:53:40.580 | you either lack judgment or you lack ambition.
00:53:42.660 | It's one of the two.
00:53:43.540 | It's funny, I recently told that to someone
00:53:45.660 | and they were like, "Oh, like not everyone
00:53:47.380 | wants to be like a unicorn founder."
00:53:48.980 | And I was like, "Like I said, judgment or ambition."
00:53:52.540 | It's fine to not have ambition.
00:53:53.780 | It's fine to want to prioritize other things
00:53:55.500 | than your company in life or your career in life.
00:53:57.260 | That's perfectly okay.
00:53:58.740 | But know that that's the trade-off you're making.
00:54:00.380 | If you prioritize your career, you've got to be here.
00:54:02.740 | - As a fellow European escapist, I grew up in Rome.
00:54:05.700 | - Yeah, how do you feel?
00:54:06.700 | We never talked about your feelings.
00:54:07.740 | - Yeah, I've been in the US now six years.
00:54:10.100 | Well, I started my first company in Europe 10 years ago,
00:54:12.820 | something like that.
00:54:13.660 | And yeah, you can tell nobody really wants to do much.
00:54:16.540 | And then you're like, "Okay."
00:54:17.900 | It's funny, I was looking back through some old tweets
00:54:20.340 | and I was sending all these tweets to Mark Andreessen
00:54:22.460 | like 15 years ago, trying to learn more about
00:54:25.900 | why are you guys putting money in these things
00:54:28.300 | that most people here would say you're crazy to even back.
00:54:31.980 | And eventually I started doing venture,
00:54:33.820 | yeah, six, five years ago.
00:54:35.620 | And I think just like so many people in Europe
00:54:38.340 | reach out and ask, "Hey, can you talk to our team?"
00:54:40.860 | And like, blah, blah, blah.
00:54:42.140 | And they just cannot comprehend the risk appetite
00:54:45.460 | that people have here.
00:54:46.620 | It's just like so foreign to people, at least in Italy
00:54:50.260 | and like in some parts of Europe.
00:54:51.700 | I'm sure there's some great founders in Europe,
00:54:53.740 | but like the average European founders,
00:54:55.980 | like why would I leave my job at the post office
00:54:59.740 | to go work on the startup that could change everything
00:55:02.700 | and become very successful, but might go out of business.
00:55:05.100 | Instead in the US, you have like, you know,
00:55:06.740 | we host a hackathon and it's like 400 people show up
00:55:09.500 | and it's like, where can I go work
00:55:11.060 | that it's like no job security?
00:55:12.500 | You know? - Yeah.
00:55:13.340 | - It's just like completely different
00:55:14.500 | and there's no incentives from the government
00:55:16.900 | to change that.
00:55:17.780 | There's no way you can like change
00:55:19.380 | such a deep-rooted culture of like, you know,
00:55:22.060 | going for wine and Aperol spritz
00:55:23.940 | and all of that early in the afternoon.
00:55:25.820 | So I don't really know how it's going to change.
00:55:27.140 | - It's quality of life.
00:55:28.300 | - Yeah, totally.
00:55:29.140 | That's why I left.
00:55:29.980 | (all laughing)
00:55:31.460 | The quality's so high that I left.
00:55:33.780 | But again, I agree with you.
00:55:35.300 | It's just like, hey, like there's no rational explanation
00:55:38.220 | as to why it's better to move here.
00:55:40.180 | It just, if you want to do this job and do this,
00:55:42.540 | you should be here.
00:55:43.620 | If you don't want to, that's fine.
00:55:44.980 | But like, don't cope.
00:55:47.100 | - Right.
00:55:47.940 | - Don't be like, oh no,
00:55:48.980 | you can also be successful doing this
00:55:50.940 | in Nice or like whatever.
00:55:52.900 | No, probably not, you know?
00:55:54.220 | So yeah, I've already done my N400.
00:55:56.700 | So I should get my U.S. citizenship interview.
00:55:58.700 | - Hell yeah. - Damn.
00:55:59.980 | - Soon.
00:56:00.900 | - Yeah.
00:56:01.860 | And I think to be fair,
00:56:02.980 | I think what's happening right now to Europe
00:56:04.660 | is largely self-inflicted.
00:56:05.940 | I think that it's just completely,
00:56:07.820 | again, they've said no to capitalism.
00:56:09.660 | They've decided to say no to capitalism a long time ago.
00:56:12.340 | They've completely over-regulated.
00:56:14.420 | Taxation is much too high and so forth.
00:56:16.500 | But I also think some of this
00:56:17.780 | is a little bit of a self-fulfilling prophecy
00:56:20.700 | or it's a self-perpetuating phenomenon.
00:56:23.100 | Because, look, to your point,
00:56:24.500 | once there is a network effect
00:56:25.580 | that's just so incredibly powerful,
00:56:27.860 | they can't be broken, really.
00:56:29.380 | And we tried with San Francisco.
00:56:31.100 | I tried with San Francisco.
00:56:32.300 | Like during COVID,
00:56:33.140 | there was a movement of people moving to Miami.
00:56:36.260 | - You and I both moved there.
00:56:37.340 | (laughing)
00:56:39.020 | - How did that pan out?
00:56:39.860 | You can't break the network effect, you know?
00:56:41.580 | - It's so annoying because first principles wise,
00:56:43.580 | tech should not be here.
00:56:44.860 | Like tech should be in Miami
00:56:45.940 | 'cause it's just better city.
00:56:47.420 | (laughing)
00:56:49.100 | - San Francisco does not want tech to be here.
00:56:49.940 | - San Francisco hates tech.
00:56:51.420 | - 100%.
00:56:52.260 | - This is the thing I actually wrote down.
00:56:53.100 | Like San Francisco hates tech.
00:56:54.100 | - It is true.
00:56:54.940 | - I think the people that are in San Francisco
00:56:57.500 | that were here before tech,
00:56:59.020 | hated it, and then there's kind of like
00:57:00.540 | this passed-down thing.
00:57:01.380 | But I would say people in Miami would hate it too
00:57:03.380 | if there were too much of it, you know?
00:57:05.340 | Like the Nikki Beach crowd would also not chill.
00:57:08.380 | - They're just rich enough and chill enough to not care.
00:57:10.060 | - Yeah, I think so too.
00:57:10.900 | Like, oh, crypto kids.
00:57:11.860 | Okay, cool.
00:57:12.700 | - Yeah.
00:57:13.540 | (laughing)
00:57:14.380 | - Yeah, Miami celebrates success,
00:57:15.620 | which is one thing I loved about it.
00:57:17.420 | - A little bit too much.
00:57:18.260 | (laughing)
00:57:19.620 | Maybe the last thing I'll mention,
00:57:20.900 | I just wanted a little bit of EUAC talk.
00:57:23.020 | I think that's good.
00:57:23.860 | I'll maybe carve out that I think the UK has done really well.
00:57:27.060 | That's an argument for the UK not being part of Europe
00:57:30.540 | is that, you know, the AI institutions there
00:57:32.660 | at least have done very well, right?
00:57:34.060 | - Sure.
00:57:34.900 | - The economy of Britain is in the gutter.
00:57:36.140 | - Yeah, exactly.
00:57:36.980 | - They've been stagnating at best.
00:57:38.060 | And then France has a few wins.
00:57:41.580 | - Who?
00:57:42.540 | - Mistral.
00:57:43.380 | - Who uses Mistral?
00:57:44.220 | - Hugging face.
00:57:45.220 | A few wins.
00:57:46.980 | (laughing)
00:57:47.820 | I'm just saying.
00:57:49.020 | They just appointed their first AI minister.
00:57:51.260 | - You know the meme with the guy
00:57:52.940 | who's celebrating with his trophy
00:57:54.540 | and then he's like, "No, it's France."
00:57:55.980 | Right?
00:57:56.820 | To me, that's France.
00:57:57.660 | It's like, "Aha, look, we've got Mistral!"
00:57:58.980 | And it's like, "Champagne!"
00:57:59.860 | And it's like maybe 1% of market share.
00:58:01.940 | And by the way, I said that I love Mistral.
00:58:03.620 | I love the guys over there.
00:58:05.100 | And it's not a critic of them.
00:58:06.140 | It's a critic of France and of Europe.
00:58:07.740 | And by the way,
00:58:08.580 | I think I've heard that the Mistral guys
00:58:10.220 | were moving to the U.S.
00:58:11.780 | - Yeah, they're opening an office here.
00:58:13.500 | - They're opening an office here.
00:58:14.660 | - But I mean, they're very French, right?
00:58:16.260 | (laughing)
00:58:17.220 | You can't really avoid it.
00:58:18.500 | There's one interesting counter move
00:58:19.780 | which is Jason Warner and Eiso Kant
00:58:21.460 | moving to Paris for Poolside.
00:58:24.180 | - I don't know.
00:58:25.020 | - It remains to be seen how that move is going.
00:58:27.780 | Maybe the last thing I'll say,
00:58:29.540 | you know, that's the Europe talk.
00:58:30.980 | We try not to do politics so much,
00:58:32.700 | but you're here.
00:58:34.300 | One thing that you do a lot
00:58:35.260 | is you test your Overton windows, right?
00:58:37.860 | Like far more than any founder I know.
00:58:39.820 | You know it's not your job.
00:58:40.900 | Someone, for sure, you're just indulging,
00:58:43.620 | but also I think you consciously test.
00:58:45.820 | And I just want to see what drives you there
00:58:48.780 | and why do you keep doing it?
00:58:50.620 | (laughing)
00:58:53.340 | 'Cause some of you tweet very spicy stuff,
00:58:55.020 | especially for like the San Francisco
00:58:56.780 | sort of liberal dynasty.
00:58:59.540 | - I don't know because,
00:59:00.380 | so I assume you're referring to recently,
00:59:02.220 | I posted something about pronouns
00:59:04.020 | and how non-sense. - Just in general.
00:59:05.980 | - I don't want you to focus on any particular thing
00:59:07.820 | unless you want to.
00:59:08.660 | - You know, well, is that tweet in particular,
00:59:10.180 | when I was tweeting it, I was like,
00:59:11.100 | "Oh, this is kind of spicy.
00:59:11.940 | "Should I do this?"
00:59:12.780 | And then I just did it.
00:59:13.620 | And I, you know, I received zero pushback
00:59:16.580 | and the tweet was actually pretty successful
00:59:18.180 | and I received a lot of people reaching out like,
00:59:19.540 | "Oh my God, so true."
00:59:20.740 | I think it's coming from a few different places.
00:59:22.660 | One, life is more fun this way.
00:59:24.860 | Like I don't feel like self-censoring all the time.
00:59:27.140 | You know, it's just, it's like, you know,
00:59:29.380 | that's number one.
00:59:30.340 | Number two, if everyone always self-censors,
00:59:32.340 | you never know what everyone, what anyone thinks.
00:59:34.420 | And so it's becoming like a self-perpetuating thing.
00:59:36.860 | It's like a public lies, private truth sort of phenomenon.
00:59:39.780 | Or like, you know, it's like,
00:59:40.860 | there's this phenomenon called a preference cascade.
00:59:43.180 | It's like, there's this joke.
00:59:44.100 | It's like, oh, there's only one communist left in USSR.
00:59:46.500 | The problem is no one knows which one it is.
00:59:48.260 | So everyone pretends to be communist
00:59:49.340 | because everyone else pretends to be a communist.
00:59:50.620 | And so I think there's a role to be played
00:59:52.180 | for someone to have backbone and just be like,
00:59:54.220 | "Hey, I'm thinking this."
00:59:55.180 | And actually everyone thinks the same,
00:59:56.620 | especially when you are like me in a position
00:59:58.740 | where it's like, I don't have a boss who's going to fire me.
01:00:01.820 | It's like, look, if I don't speak up
01:00:04.100 | and if founders don't speak up,
01:00:05.140 | I'm like, why, what are you afraid of?
01:00:06.980 | Right, like there's really not that much downside.
01:00:09.340 | And I think there's something to be said
01:00:10.700 | about standing up for what you think is right
01:00:12.460 | and being real and owning your opinions.
01:00:14.820 | - I think there's a correlation there
01:00:15.980 | between having that level of independence
01:00:19.380 | for your political beliefs and free speech or whatever,
01:00:22.020 | and the way that you think about business too.
01:00:24.740 | Like I see that it helps, I think.
01:00:27.100 | - I think the world contrarian has become abused,
01:00:29.580 | but I think there's such a powerful insight
01:00:31.700 | that it's cool, which is groupthink is real
01:00:34.420 | and pervasive and really problematic.
01:00:36.220 | Like your brain constantly shuts down
01:00:38.660 | because you're not even thinking in your other way
01:00:40.180 | or you're not thinking.
01:00:41.020 | You just look around you and you decide
01:00:42.460 | to adopt the same beliefs as people around you.
01:00:44.620 | And everyone thinks they're immune
01:00:46.380 | and everyone else is doing it except themselves.
01:00:47.940 | - I'm a special snowflake, I have free will.
01:00:50.820 | - That's right, and so I actually make it a point
01:00:53.060 | to like look for, hey, what would be a thing right now
01:00:56.420 | that I can't really say?
01:00:57.860 | And then I think about it and I'm like,
01:00:58.980 | "Do I believe this thing?"
01:00:59.980 | And very often the answer is yes.
01:01:01.300 | And then I just say it.
01:01:02.460 | And so I think the AI safety is an example of that.
01:01:04.540 | Like at some point, Marc Andreessen blocked me on Twitter
01:01:07.780 | and it hurt, frankly.
01:01:09.140 | I really look up to Marc Andreessen
01:01:11.540 | and I knew he would block me.
01:01:13.340 | - It means you're successful on Twitter.
01:01:15.060 | That's just a rite of passage.
01:01:17.420 | - Marc Andreessen was really my booster initially on Twitter.
01:01:19.660 | He really made my account.
01:01:21.420 | And I was like, "Look, I'm really concerned
01:01:23.660 | "about AI safety.
01:01:24.500 | "It is an unpopular view amongst my peers."
01:01:27.660 | - I remember you were one of the few
01:01:29.100 | that actually came out in support of the bill or something.
01:01:31.700 | - I came out in support of SB1047.
01:01:33.380 | A year and a half ago, I put some tweet storms
01:01:36.740 | about how I was really concerned.
01:01:38.220 | And yeah, I was blocked by a bunch of a16z people
01:01:40.740 | and I don't like it, but it's funny.
01:01:43.500 | Maybe it's my French education.
01:01:45.940 | But look, in France, World War II
01:01:48.180 | is very present in people's minds.
01:01:50.100 | And the phenomenon of people collaborating
01:01:52.500 | with the Nazis during World War II
01:01:53.940 | is really present in people's minds.
01:01:55.700 | And there is always this sort of debate
01:01:57.220 | that people have at dinner and say,
01:01:59.460 | "Ah, would you really have resisted
01:02:01.060 | "during World War II?"
01:02:02.260 | And everybody is always saying,
01:02:03.340 | "Oh yeah, I would totally have resisted."
01:02:04.700 | It's like, "Yeah, but no."
01:02:05.740 | Like, look, the reality of it is 95% of the country
01:02:08.220 | did not resist and most of it actually collaborated
01:02:10.460 | actively with the Nazis.
01:02:11.740 | And so 95% of y'all are wrong.
01:02:13.780 | You would actually have collaborated, right?
01:02:15.820 | I've always told myself I will stand
01:02:19.020 | for what I think is right,
01:02:20.620 | even if I've gotten into physical fights in my life,
01:02:24.300 | like in SF, because some people got attacked.
01:02:26.300 | And the way I was brought up
01:02:27.340 | is like if someone gets attacked before you,
01:02:28.660 | you get involved.
01:02:29.500 | Like, it doesn't matter.
01:02:30.340 | You get involved and you help the person, right?
01:02:32.260 | And so, look, I'm not pretending we're like nowhere near
01:02:35.140 | like a World War II phenomenon,
01:02:36.460 | but I'm like exactly because we are nowhere near
01:02:38.180 | this kind of phenomenon.
01:02:39.020 | Like the stakes are so low.
01:02:40.340 | And if you're not going to stand up
01:02:41.620 | for what you think is right when the stakes are so low,
01:02:43.340 | are you going to stand up when it matters?
01:02:44.900 | - Italian education is that in Italy,
01:02:46.660 | people don't have guns when you fight them,
01:02:48.180 | so you can always get in a fight.
01:02:49.500 | But here in the US, I'm always like, "Oh man."
01:02:52.500 | - I feel, I detect some inconsistency in your statements
01:02:55.540 | because you simultaneously believe that AGI is very soon.
01:02:59.540 | And you also say stakes are low.
01:03:01.540 | You can't believe both are real.
01:03:02.900 | - Well, the stakes,
01:03:03.740 | so why does AGI make the stakes of speaking up higher?
01:03:06.260 | - Sorry, the stakes of like safety.
01:03:08.340 | - Oh yeah, no, the stakes of AIs,
01:03:09.740 | or like physical safety?
01:03:11.060 | - No, AI safety.
01:03:12.060 | - Oh no, the stakes of AI safety couldn't be higher.
01:03:13.900 | I meant the stakes of like speaking up about-
01:03:16.540 | - Pronouns or whatever.
01:03:17.540 | - Oh, okay, okay.
01:03:18.380 | - Yeah, yeah, yeah.
01:03:19.220 | - How do you figure out who's real and who isn't?
01:03:21.140 | Because there was the whole like manifesto
01:03:23.700 | for responsible AI that like hundreds of like VCs
01:03:26.820 | and people signed,
01:03:27.660 | and I don't think anybody actually,
01:03:29.460 | any of them thinks about it anymore.
01:03:30.300 | - Was that the pause letter, like six month pause, or?
01:03:32.540 | - Some, no, there was like something else too
01:03:34.340 | that I think General Catalyst and like some funds signed,
01:03:37.020 | but, and then there's maybe the anthropic case,
01:03:40.300 | which is like, "Hey, we're leaving open AI
01:03:41.860 | because you guys don't take security seriously."
01:03:43.860 | And then it's like, "Hey, what if we gave AI access
01:03:46.380 | to a whole computer to just like go do things?"
01:03:48.620 | Like, how do you reconcile like, okay,
01:03:52.860 | I mean, you could say the same thing about Lindy.
01:03:54.700 | It's like, if you're worried about AI safety,
01:03:56.220 | why are you building AI, right?
01:03:57.540 | That's kind of like the extreme thinking.
01:03:59.340 | How do you internally decide between participation
01:04:03.220 | and talking about it and saying,
01:04:05.060 | "Hey, I think this is important,
01:04:06.820 | but like I'm still gonna build towards that
01:04:08.900 | and building actually makes it safer because I'm involved,"
01:04:11.980 | versus just being like anti, "I think this is unsafe,"
01:04:15.020 | but then not do anything about it
01:04:16.700 | and just kind of remove yourself from the whole thing,
01:04:18.860 | if that makes sense.
01:04:19.740 | - Yeah, the way I think about our own involvement here
01:04:22.420 | is I'm acutely concerned about the risks at the model layer.
01:04:26.980 | And I'm simultaneously very excited about the upside.
01:04:30.580 | Like for the record, my p(doom),
01:04:32.340 | insofar as I can quantify it, which I cannot,
01:04:34.220 | but if I had to, like my vibe is like 10%
01:04:37.100 | or something like that.
01:04:38.060 | And so there's like a 90% chance
01:04:40.020 | that we live in like a pure utopia, right?
01:04:42.660 | And that's awesome, right?
01:04:43.660 | So like, let's go after the utopia, right?
01:04:45.780 | And let's talk about the 10% chance
01:04:47.060 | that things go terribly wrong,
01:04:48.660 | but I do believe there's a 90% chance
01:04:50.140 | that we live in a utopia where there's no disease
01:04:51.820 | and it's like a post-scarcity world.
01:04:53.740 | I think that utopia is going to happen through,
01:04:56.100 | like, again, I'm bringing my little contribution
01:04:58.140 | to the movement.
01:04:58.980 | I think it would be silly to say no to the upside
01:05:01.820 | because you're concerned about the downside.
01:05:03.500 | At the same time,
01:05:04.340 | we want to be concerned about the downside.
01:05:05.620 | I know that it's very self-serving to say,
01:05:07.900 | "Oh, you know, like the downside doesn't exist
01:05:09.540 | at my layer, it exists at like the model layer."
01:05:11.700 | But truly, look at Lindy, look at the app we're building.
01:05:15.380 | I struggle to see exactly how it would like get up
01:05:17.780 | and start doing crazy stuff.
01:05:19.220 | I'm concerned about the model layer.
01:05:20.820 | - Okay, well, this kind of discussion can go on for hours.
01:05:23.340 | It is still daylight, so not the best time for it,
01:05:26.100 | but I really appreciate you spending the time.
01:05:28.740 | Any other last calls to actions
01:05:30.380 | or thoughts that you feel like you want
01:05:32.620 | to get off your chest?
01:05:33.460 | - AGI is coming.
01:05:34.460 | (all laugh)
01:05:37.500 | - Are you hiring for any roles?
01:05:40.260 | - Oh yeah, I guess that should be the...
01:05:41.740 | (all laugh)
01:05:43.540 | - Don't bother.
01:05:44.380 | - No, can you stop saying AGI is coming
01:05:45.500 | and just talk about it?
01:05:47.500 | - We are also hiring.
01:05:49.020 | Yeah, we are hiring designers and engineers right now.
01:05:52.260 | - Yeah.
01:05:53.100 | - So hit me up at flo@lindy.ai.
01:05:55.700 | - And then go talk to my Lindy.
01:05:57.420 | - That's right.
01:05:58.260 | - You're not actually going to read it.
01:05:59.100 | - Actually, I have wondered how many times
01:06:00.220 | when I talk to you, I'm talking to a bot.
01:06:02.660 | - I want that as a discussion.
01:06:03.740 | - Part of that is I don't have to know, right?
01:06:05.500 | - That's right.
01:06:06.340 | Well, it's actually doubly confusing
01:06:07.500 | because we also have a teammate whose name is Lindy.
01:06:09.500 | - Yes, I was wondering, I met her.
01:06:10.940 | I was like, "Wait, did you hire her first?"
01:06:13.820 | - Marketing is fun.
01:06:15.260 | - No, she was an inspiration.
01:06:16.460 | We named the company both after her.
01:06:18.740 | - Okay, interesting, interesting.
01:06:20.820 | Yeah, wonderful.
01:06:21.660 | I'll comment on the design piece
01:06:23.060 | just because I think that there are a lot of AI companies
01:06:26.660 | that very much focus on the functionality
01:06:30.060 | and the models and the capabilities and the benchmark.
01:06:33.500 | But I think that increasingly
01:06:34.980 | I'm seeing people differentiate with design
01:06:36.940 | and people want to use beautiful products
01:06:38.940 | and people who can figure that out
01:06:40.820 | and integrate the AI into their human lives.
01:06:43.900 | Design at the limit.
01:06:44.780 | One, at the lowest level, it's make this look pretty,
01:06:47.220 | make this look like Stripe or Linear's homepage.
01:06:49.540 | That's design.
01:06:50.500 | But at the highest level of design,
01:06:52.260 | it is make this integrate seamlessly into my life.
01:06:54.300 | Intuitive, beautiful, inspirational, maybe even.
01:06:57.860 | And I think that companies that,
01:07:00.260 | this is kind of like a blog post I've been thinking about,
01:07:01.740 | companies that emphasize design
01:07:03.340 | actually are going to win more than companies that don't.
01:07:05.660 | - Yeah, I love this Steve Jobs quote
01:07:07.260 | and I'm going to butcher it.
01:07:08.260 | It's something like,
01:07:09.500 | "Design is the expression of the soul of a man-made product
01:07:12.140 | "through successive layers of design."
01:07:14.060 | - Jesus.
01:07:14.900 | - He was good.
01:07:16.260 | - He was cooking, he was cooking on that one.
01:07:19.540 | - It starts with the soul of the product,
01:07:21.820 | which is why I was saying it is so important
01:07:23.780 | to reach alignment about that soul of the product, right?
01:07:26.140 | It's like an onion,
01:07:26.980 | like you peel the onion in those layers, right?
01:07:28.700 | And you design an entire journey,
01:07:30.860 | just like the user experiencing your product chronologically
01:07:33.900 | all the way from the beginning of like the awareness stage,
01:07:36.580 | I think it is also the job of the designer
01:07:38.340 | to design that part of the experience.
01:07:39.660 | It's like, okay, what, you know?
01:07:40.980 | And that's brand basically.
01:07:42.660 | So yeah, I agree with you.
01:07:43.820 | I think design is immensely important.
01:07:45.740 | - Okay, lovely.
01:07:46.580 | - Yeah, thanks for coming on, Flo.
01:07:47.740 | - Yeah, absolutely.
01:07:48.580 | Thanks for having me.
01:07:50.100 | (upbeat music)
01:07:52.700 | (upbeat music)
01:07:55.340 | (upbeat music)
01:07:57.940 | (upbeat music)