Cooking with fire without burning down the kitchen: Dominik Kundel

00:00:00.000 |
Thanks everyone for joining. Let's start with a quick raise of hands. How many of you in the 00:00:17.520 |
last two years got into AI? That's a pretty good amount. And how many of you were in the room when 00:00:24.760 |
your boss was like, okay, so what is our plan about AI? What do we do about AI? All right. That was 00:00:31.700 |
pretty much me last year. So in the next 20 minutes, time permitting, should be exactly 20 minutes, 00:00:40.060 |
we're going to talk about our team at Twilio, our history, lessons we've learned around our team and 00:00:48.140 |
how we operate within the company, the role we play, and to ensure that Twilio can keep up with 00:00:54.340 |
the rapid pace of AI. Or in other words, how do we, you know, cook with fire without burning down the 00:01:00.140 |
kitchen and upsetting our customers? As Peter said, my name is Dominik Kundel. I lead product and design 00:01:08.440 |
within a small team at Twilio called Emerging Tech and Innovation. We're a team of 16 people across 00:01:14.980 |
engineering, product, design, and go-to-market. And we try to operate as self-contained as possible, 00:01:20.080 |
outside of our two more traditional Twilio business units: communications, which is more focused 00:01:25.880 |
on sort of the APIs that Twilio is known for around SMS, voice, email, etc. And then our customer data 00:01:32.380 |
platform, which is more focused on providing a holistic overview of your customer so you can appropriately 00:01:38.840 |
engage with them. And our team itself is focused on exploring what the future of customer engagement is going to look like 00:01:44.640 |
with an eye towards emerging technologies like AI. 00:01:48.060 |
But we're not the AI team within Twilio. At Twilio, we sort of see AI as a feature that spreads across all of our products 00:01:58.480 |
to help our customers better engage with their customers. And you might have seen 00:02:08.400 |
some of our customer AI billboards around. That is really the vision: how do we 00:02:13.580 |
use AI to better empower people to engage with their customers? So we have innovation in the name though. So are we the only ones 00:02:22.500 |
innovating at Twilio then? The answer is also no. Generally, it's everyone's job to innovate. But I think there's two types of 00:02:31.680 |
innovation. If you've read the innovators dilemma, this seems familiar to you. But basically, there's sort of the sustaining 00:02:37.820 |
innovation and the disruptive innovation. Sustaining innovation is really teams trying to innovate on top of their existing products and their 00:02:46.400 |
existing customer needs. That is often the more iterative one. It's like Sony trying to make a mirrorless camera better. 00:02:55.540 |
Disruptive innovation, on the other hand, is often, in its infant stage, much worse in quality. It's actually a 00:03:03.440 |
characteristic of disruptive innovation: the poor quality that nonetheless meets the need of a niche audience. 00:03:10.760 |
So think about cell phone cameras when they initially came out. Nobody would leave their camera at home, go on vacation and be like I'm gonna take pictures on this like Motorola. 00:03:19.900 |
But these days like most people don't even have a camera anymore. Like I just went on vacation and only took my phone. 00:03:26.780 |
The exciting thing about AI is that innovation is really happening on both ends of the spectrum. We see both sort of 00:03:35.280 |
augmentation of existing workflows like, you know, Photoshop's generative fill. A lot of what Apple announced was baked into existing products, right? 00:03:46.420 |
It wasn't a, hey, here's a new AI thing. But it was more built into the features. So that's more on the sustaining innovation side. 00:03:53.420 |
While I would say agents, and GPTs in general, are more on the disruptive side. Because the reality is agents are not ready for enterprise prime time. 00:04:03.300 |
Primarily because the quality is not there yet, right? We've all seen agents going rogue, and even RAG-based chatbots 00:04:12.440 |
that aren't agents hallucinating things, like the example of a chatbot selling a Chevy Tahoe for a dollar or, you know, Air Canada's chatbot making up a policy. 00:04:23.440 |
And so because of all of that, and overall the lack of clarity in the regulatory space, adoption within enterprises right now is limited. 00:04:36.580 |
Overall, it's just not ready for prime time there. 00:04:43.720 |
The other aspect of it is, for a large enterprise, they see AI as a cost-saving method, and agents are not cheap yet, right? 00:04:51.720 |
Like we can do increasingly better RAG-based chatbots for low cost, but agents are still not cheap if you want high quality. 00:05:02.680 |
The difference though is for startups and SMBs. They're really the niche these days, where for them, agents are actually a value add, not a cost saving. 00:05:12.860 |
They already know that their quality of customer support and sales is not where they want it to be. 00:05:20.500 |
And so for them, it's much more an opportunity to really add value because, you know, it can't be worse than the current experience. 00:05:30.000 |
So why am I talking about the difference between these two types of innovation? 00:05:34.400 |
It's very natural for a business actually to focus on sustaining innovation, especially if you're the incumbent in the market. 00:05:40.820 |
It's you already have an established customer base, they're already paying you, so that's great. 00:05:45.780 |
The problems are well defined and so overall, it's much more predictable and you hopefully understand your target customer. 00:05:52.960 |
While for disruptive innovation, it's sort of really the opposite and you don't know yet what it's going to be. 00:05:58.780 |
And like you often are not going to make money initially, but ignoring disruptive innovation right now can be more dangerous than ever before because the biggest difference between sustaining and disruptive innovation is the quality. 00:06:12.920 |
Like every day right now with AI, there's massive leaps and so something that wasn't quite there from a quality perspective yesterday can be there tomorrow. 00:06:21.740 |
And in a similar way, a company that is not your concern yesterday can be tomorrow. 00:06:26.700 |
I think the best example of this is the impact that ChatGPT and perplexity and similar things had on how Google approaches search, right? 00:06:34.660 |
Now we have AI overview and things like that. 00:06:36.840 |
And so our team right now is focused on this disruptive innovation aspect. 00:06:43.800 |
That's sort of where our innovation perspective comes from is we want to understand what is going to be disruptive to Twilio and how do we prepare for that, react quickly, or be the disruptors in the space and inform the rest of the company about our learnings. 00:06:59.600 |
But that wasn't our original agenda, our original agenda was we were founded as a three person Skunk Works team last year that was really focused on special projects and trying to rapidly prototype the bridge between our customer engagement, like our customer data platform and our communications platform to create this customer engagement flywheel that I was talking about. 00:07:23.840 |
And to do this in a way of quickly, quickly prototyping solutions, showing them to customers, getting feedback, understanding what is their vision, where are they going to go, not just their current problems, and then iterate on that. 00:07:37.600 |
So we found Gen.AI was actually a natural fit for us because the biggest thing about customer engagement is you're sort of sitting between unstructured communications data and structured customer data. 00:07:55.360 |
And like your unstructured data is not very helpful for analysis. 00:08:00.560 |
And really large language models are perfect to bridge this. 00:08:03.520 |
They're great at creating this connection and translating structure to unstructured and vice versa. 00:08:09.280 |
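To make that translation concrete, here is a minimal sketch of the unstructured-to-structured direction. This is not Twilio's actual implementation; the OpenAI client, model name, and profile fields are illustrative assumptions only.

```python
# A minimal sketch (not Twilio's actual implementation) of using an LLM to
# translate an unstructured conversation into structured customer traits.
# Assumes the OpenAI Python SDK; model name and schema are illustrative only.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROFILE_FIELDS = ["preferred_channel", "product_interest", "sentiment"]

def extract_profile_traits(transcript: str) -> dict:
    """Ask the model to pull structured profile fields out of a raw conversation."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any capable chat model works here
        response_format={"type": "json_object"},
        messages=[
            {
                "role": "system",
                "content": (
                    "Extract the following customer traits from the conversation "
                    f"and return them as JSON: {', '.join(PROFILE_FIELDS)}. "
                    "Use null for anything not mentioned."
                ),
            },
            {"role": "user", "content": transcript},
        ],
    )
    return json.loads(response.choices[0].message.content)

# Example: unstructured support chat in, structured profile fields out.
traits = extract_profile_traits(
    "Customer: Hi, I'd rather you text me than email. I'm also curious about the new headphones."
)
print(traits)  # e.g. {"preferred_channel": "sms", "product_interest": "headphones", "sentiment": "positive"}
```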
And so as a result, we dove into prototyping our first two solutions in that space. 00:08:17.880 |
We had conceptualized essentially two features that we showed to customers, iterated on, and ultimately handed off to R&D to take to market. 00:08:26.960 |
One of them was the AI personalization engine, which is a rag on top of this customer profile that is within segment. 00:08:34.720 |
And then the AI perception engine that was more focused on how do you take communications data and then translate that into a customer profile. 00:08:42.720 |
Because within a conversation, there's so much rich information that you want to remember about a customer. 00:08:48.720 |
And sometimes a human agent will take notes somewhere, but I realistically want to have one central spot. 00:08:54.480 |
So we conceptualized these two systems and sort of together they make this customer memory. 00:09:00.480 |
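For the personalization direction, a similarly hedged sketch: take the structured profile as the grounding context and generate a customer-specific message from it. Again, the client, model, profile fields, and helper name are assumptions for illustration, not the shipped product.

```python
# A minimal sketch of the personalization direction: ground a generated
# message in the structured customer profile. All names here are illustrative.
import json
from openai import OpenAI

client = OpenAI()

profile = {"first_name": "Sam", "preferred_channel": "sms", "product_interest": "headphones"}

def personalize_message(profile: dict, draft: str) -> str:
    """Rewrite a draft message for a specific customer, using only facts from their profile."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "system",
                "content": (
                    "Rewrite the draft message for this customer. Only use facts "
                    f"from this profile and do not invent details:\n{json.dumps(profile)}"
                ),
            },
            {"role": "user", "content": draft},
        ],
    )
    return response.choices[0].message.content

print(personalize_message(profile, "Your order has shipped and should arrive Friday."))
```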
But as we showed them to leadership and they handed them over to R&D, we did hit our first challenge. 00:09:06.480 |
And that was that we had conceptualized something that wasn't sustaining innovation, but disruptive innovation. 00:09:12.480 |
It was a problem where the quality was not quite there, the cost was not quite there, and primarily we were solving a niche use case. 00:09:21.240 |
Like what we had made up was primarily helpful for this up-and-coming wave of agents and generative AI solutions in general. 00:09:29.240 |
But there wasn't a large market for this yet. 00:09:32.000 |
And so even though we had great feedback, we didn't really have the momentum that helped us to prioritize this among the sustaining innovation and profitability focus that our R&D teams had. 00:09:46.000 |
And so we learned from that that even though our customer obsession was key, because it helped us understand that we were on the right track, we knew from talking to them that this wasn't just a problem they were trying to solve today. 00:09:59.000 |
It was a problem that they were going to solve tomorrow. 00:10:02.800 |
What wasn't working was we had to build ideas out further to be able to gain that traction and be able to hand it over to R&D. 00:10:11.760 |
And so we started working on our next project called AI Assistance, which is an agent builder that is built on top of this customer memory concept. 00:10:19.440 |
And it's designed to really allow you to build omnichannel customer engagement chatbots. 00:10:27.240 |
We tried to really rapidly prototype, quickly get something into the-- like to show to customers, get feedback, get a working demo, continue to iterate on it as we're getting feedback both internally and externally. 00:10:41.000 |
But we did a couple of things differently this time. 00:10:43.040 |
One, we decided we have to get this into the hands of people as soon as possible. 00:10:48.040 |
What we had built previously was primarily demo driven. 00:10:50.640 |
It was what can we show to customers and get feedback based on that. 00:10:54.800 |
And so we changed our mode and ran internal hackathons, and gave customers coming into the office access to really rough prototypes, with all the rough edges. 00:11:05.840 |
And we're like, here's for one day you have access to this prototype. 00:11:09.400 |
Let us know everything that is wrong with it and what you wanted to do. 00:11:13.400 |
And this was both terrifying and incredibly insightful for us. 00:11:19.600 |
And additionally, we tried to find more opportunities to do dogfooding. 00:11:23.440 |
We tried to find ways we can solve problems internally. 00:11:26.440 |
We started with a low-risk use case around IT help desk. 00:11:30.760 |
We knew it wasn't the perfect target customer that we would expect to buy our solution. 00:11:36.040 |
But it was a way for us to gather data and see what the quality challenges were and how we wanted to structure things. 00:11:42.680 |
And then we started to establish an engineering organization that would actually be able to set the foundation for us to actually take this into the hands of people. 00:11:52.360 |
But that brought us to the next challenge, because at Twilio we've worked hard over the last couple of years to really build the foundation 00:12:00.680 |
in our software development lifecycle to really create trust with customers in what we're launching. 00:12:06.200 |
And Gen.AI really changed how you develop AI products, right? 00:12:10.080 |
It's actually exciting but also tricky for larger companies because Gen.AI turned sort of the entire product development lifecycle on its head. 00:12:19.040 |
Where in the past you had to gather a lot of data. 00:12:22.120 |
You had to make sure you had enough data to train a model. 00:12:25.360 |
You trained the model until you reached a certain level of quality. 00:12:28.400 |
And then you released it and then you started optimizing. 00:12:31.120 |
These days you can actually use Gen.AI to build a prototype or often an MVP very rapidly without having that data and release it out to some customers to get feedback and then continue to iterate on that. 00:12:44.920 |
But that puts us into a dilemma because we have a system now that we know is not quite there from a quality perspective but we need to get it in front of people to make it better. 00:12:57.080 |
And so how do we do that and how do we keep up with the pace of technology without really upsetting customers with something that doesn't meet our normal quality bar? 00:13:07.800 |
And so that's why we looked around the industry for other examples. 00:13:10.520 |
The two that came up for us were GitHub Next, the creators of GitHub Copilot, and then Cloudflare's Emerging Technologies and Incubation team that created things like Workers, D1, and Workers AI. 00:13:22.520 |
And both of these were really focused on setting the appropriate expectations for users around reliability, capabilities, and availability. 00:13:33.240 |
Our solution for this was to create our own sub-brand called Twilio Alpha that was very focused on setting the right expectations with customers, but thereby enabling us to ship early and ship often, engage interest from customers beyond those we would otherwise have to track down, and let customers come to us. 00:13:52.960 |
And so that's learning number two to ship early and ship often so that we can regularly get rapid feedback from customers. 00:14:00.160 |
So that got us to our second learning about shipping early and shipping often. 00:14:47.160 |
Because by having a sub-brand, it allowed us to open wait lists for developer previews, 00:14:51.160 |
show our offerings, and onboard people quickly. 00:14:54.160 |
As well as have internal POCs to learn while we're onboarding people. 00:15:00.160 |
So, you know, how fast do you need to move, especially with sort of the rapid pace of AI? 00:15:08.160 |
And the honest answer is we haven't figured out yet what is fast enough. 00:15:11.160 |
But we put some base principles in place that help us move fast. 00:15:17.160 |
As we started hiring people, we focused on curiosity and creativity. 00:15:22.160 |
Because at Twilio, we've always valued the creativity of developers to solve not just coding problems, but business problems. 00:15:30.160 |
In fact, our co-founder, Jeff Lawson, wrote a whole book about it, Ask Your Developer. 00:15:34.160 |
And so for us, having natural curiosity within the team was more important than existing AI experience. 00:15:41.160 |
Because we wanted everyone to be able to solve problems rather than relying on product to figure out what is the thing we're building. 00:15:49.160 |
We wanted to give problems to the engineers and let them figure out what we're going to build. 00:15:54.160 |
The other thing we put in place is flexibility, both in terms of our systems. 00:15:59.160 |
We assume that every model that we're using is going to be redundant tomorrow. 00:16:03.160 |
That doesn't mean that we're jumping on every model or paper that comes out. 00:16:07.160 |
But instead, we understand what our known limitations are, and 00:16:12.160 |
every time a model comes out, we can quickly validate whether the next model is something that helps us in that moment. 00:16:19.160 |
So for example, when Claude 3.5 Sonnet came out, we were able to figure out within a day whether we wanted to pay any attention to it for now. 00:16:31.160 |
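One way such a "validate within a day" check might look is to replay a fixed set of prompts that target your known limitations against the new model and compare pass rates. The sketch below is an assumption-laden illustration: the model IDs, test cases, and pass criterion are made up, not Twilio's actual evaluation harness.

```python
# Illustrative sketch of a quick model-validation pass over known-limitation cases.
from openai import OpenAI

client = OpenAI()

# Each case pairs a prompt with a simple string the answer must contain to count as a pass.
KNOWN_LIMITATION_CASES = [
    {"prompt": "A customer asks for a refund outside the 30-day window. What do you do?",
     "must_contain": "policy"},
    {"prompt": "Summarize this order status update in one sentence: shipped, arriving Friday.",
     "must_contain": "Friday"},
]

def pass_rate(model: str) -> float:
    """Run every known-limitation case against a model and return the fraction passed."""
    passed = 0
    for case in KNOWN_LIMITATION_CASES:
        response = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": case["prompt"]}],
        )
        answer = response.choices[0].message.content or ""
        passed += case["must_contain"].lower() in answer.lower()
    return passed / len(KNOWN_LIMITATION_CASES)

# Compare the incumbent model against a newly released one (model IDs are placeholders).
for model in ["gpt-4o-mini", "gpt-4o"]:
    print(model, pass_rate(model))
```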
Having a new team that is specifically focused on innovation helped us not have any pre-existing commitments to customers 00:16:38.160 |
and allowed us to really have flexibility about the roadmap. 00:16:41.160 |
And we try to defend this flexibility as much as we can with current initiatives around this. 00:16:48.160 |
The big part around this I would recommend is to understand that expectations are shifting at any moment currently. 00:16:56.160 |
Meaning figure out what are expectations that are staying because those are worth putting on the roadmap. 00:17:01.160 |
Those that are ever-changing, you need to have more flexibility around. 00:17:04.160 |
Think about the difference between trust and safety, which people are always going to be focused on, versus multi-modality, where every time some new modality comes out, 00:17:16.160 |
people are going to change what they want to have. 00:17:19.160 |
Most people don't talk about images these days anymore, but that was the hot thing a few months ago. 00:17:26.160 |
So you will constantly have to iterate on your roadmap and you have to be okay with failures. 00:17:33.160 |
But even if you're okay with failures, especially in the environment right now where everyone, at most companies, has to provide value at all times, 00:17:42.160 |
how do you provide that value while you're waiting for your eventual home run? 00:17:47.160 |
That's where we made the last mistake that I want to call out where, as I mentioned, our team is self-contained within the company, including having our own go-to-market. 00:17:58.160 |
And in the first year, we took that self-containment a bit too far. 00:18:02.160 |
We basically just operated in our own corner quietly and shared what we were working on on a need-to-know basis, both with customers but also internally. 00:18:11.160 |
And that resulted in often teams not even knowing that we existed as a team or what we were working on. 00:18:18.160 |
And it was on us to find potential conflicts or collaboration opportunities. 00:18:22.160 |
And at the same time, our customers were having the same questions that we had around what are we doing about AI, you know? 00:18:29.160 |
And really, especially for a B2B company, you have the opportunity to help your customers be thought leaders in their space and build a stronger bond. 00:18:29.160 |
So our third learning was to share things as we go, both internally and externally. 00:18:43.160 |
Internally, to find opportunities to share the enormous learnings that we had with others to empower them, 00:18:52.160 |
but also externally to enable others to be thought leaders in their space. 00:18:56.160 |
So these are our biggest learnings from the last year, and we actually ended up turning them into our base principles for the team. 00:19:05.160 |
So being customer and developer obsessed to make sure that we talk to customers early and often and understand not just their current problems and their current problems with Twilio, 00:19:16.160 |
but understand their business, their challenges in general, and their vision for customer engagement so that we can anticipate their future needs. 00:19:24.160 |
Shipping early and often so that we can set the right expectations with them, but still get things into their hands and get feedback from them. 00:19:33.160 |
And creating a team that is curious and owns problems so that we can actually build things quickly. 00:19:39.160 |
And then lastly, share as we go, that we're both sharing internally as much as we can, but also sharing externally to help our customers be thought leaders. 00:19:50.160 |
And with that, thank you so much for your attention. 00:19:54.160 |
The slides are already on that URL if you want to check them out. 00:19:58.160 |
As Peter mentioned, I will have to run to the Twilio booth for a raffle. 00:20:03.160 |
But if people do have questions, please come to the Twilio booth. 00:20:06.160 |
It's like literally over there. It's the only one that is red. 00:20:09.160 |
And I'm happy to answer any questions there, but thank you so much for having me and have a great day.