SRABH SINGH: Cool. I'd originally titled this talk, "Connect real-time data to your AI," et cetera, et cetera. But really, it's more existential, right? The AI overlords are coming for us. And to help them be good rulers to us, let's just give them the data they need so that they can do a good job, right?
Hopefully, this talk is going to be the simplest talk that you heard at this conference. If it's not, I'll go back to using GPT-4 for coding instead of Sonnet. But the real pain that I have as I work with LLMs is that they can write a Flappy Bird for me, with my face going up and down, in 30 seconds.
But they can't talk to my data intelligently. It's really stupid. If I want to connect it to my calendar and I just want to say, how many one-on-ones did I have last week? What's a good number to have with my team given their roles? Help me stagger them better and plan it out.
I want to connect it to my Salesforce and say, why is this deal with Acme stuck in stage three? And I need it to do the right thing. I need it to figure out the things between stage two and stage three in my sales pipeline and tell me why that particular deal is blocked.
I want to connect it to my tickets and my product data and say, is this ticket from an enterprise customer? What's the name of their project? Can you tell me what the status of that project is and what part of the product funnel this project is in? I went to Amazon today in the morning and they have this Rufus thing.
And I was like, OK, cool. Is this product - I'm going to tell you what that product is in a second - but is this product available for one-day delivery at my Harrison Street address? And it just doesn't work. Like, what is this, right? The data's right there. Just do it.
And it doesn't work. And you all know why it doesn't work, right? There's like a death by a thousand cuts and it's not secure, and I don't want to connect my calendar and make it into a GPT. Who even knows what the GPT is doing with this, right? Like, it's scary, and it doesn't work.
So we solved this with a pretty simple idea, which is that you take your live data and business logic and you make that available as a tool to your LLM. No shit. It's not surprising, right? It's easy. Because - and we did a bunch of things - that makes it work really, really well, right?
Let's see if we have time for a quick live demo here. Let me see if I'm connected to the internet, which I am. All right. I'm going to zoom this up. All right. So I am Blockbuster because, obviously, services businesses are the most important businesses now, and, like, movie streaming businesses are going to go nowhere in the AI world that is to come.
And so, in my Blockbuster database and transactions and all of this stuff that I have going on, I want to ask my data a question and say, "Help me write an email to my top customer, thanking them for their patronage. Mention some recent movies they watched." Right? Straightforward request.
"I have all this data. I just needed to do the right things. And I needed to write an email for me." Right? And it works. And it works despite the fact that it's going to two or three different places and getting data from them. And it works pretty well.
It handles all kinds of situations. And I'm going to talk to you about three key ideas about how it works, and hopefully that's going to be useful to you as well. So, the first is this idea of a unified query language. Whether you're talking to structured data, or unstructured data, or APIs, what if your LLM could talk to everything the same way?
Right? LLMs don't know what your API is. If you're a little honest with yourselves, you probably don't know what your API does. But LLMs know what SQL is. Right? Because when you say SELECT * FROM x WHERE id > 1, "greater than" has a semantic meaning that is embedded in the language.
But that URL param in your API, who knows what it means? Is it greater than, or is it greater than or equal to? Is it greater than, but actually only works with Booleans? I don't know, right? But it works with SQL, because LLMs know what SQL is, right? So the first part of this is: let's just make everything one query language and deal with that.
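Here's a minimal sketch of that first idea, using sqlite3 with a made-up table. The point is that the `>` in the WHERE clause is unambiguous to an LLM in a way an API's `min_id` query param never is:

```python
import sqlite3

# Hypothetical data that might otherwise sit behind an endpoint like
# GET /orders?min_id=1 -- where "min_id" could mean > or >= depending
# on whoever wrote the endpoint.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, "Acme"), (2, "Globex"), (3, "Initech")],
)

# In SQL, ">" is strictly greater than -- a semantic fixed by the
# language itself, so an LLM knows exactly what this returns.
rows = conn.execute("SELECT customer FROM orders WHERE id > 1").fetchall()
print(rows)  # [('Globex',), ('Initech',)]
```

The table and column names here are illustrative, not from the demo.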
The second is an object model for authorization, right? Which, again, it kind of blows my mind why this is so complicated. Look, I don't care where the data is coming from. The data has a schema, right? Authorization is a property of the data, and it's a property of the session.
And then just run the rule. And maybe there's 100 rules, but it should just work. And then however it gets accessed, it's fine, right? I should be able to use this wherever it's used. However it's accessed, the same authorization should be applied. So that's idea number two. And that's kind of embedded there as well.
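A rough sketch of that second idea, assuming nothing about any particular product: a rule is just a predicate over (row, session), and every access path funnels through the same filter, so the same authorization applies however the data is reached. All names here are hypothetical:

```python
# Illustrative authorization rule: a predicate over the data and the
# session, independent of how the data is accessed.
def can_read(row: dict, session: dict) -> bool:
    # Example rule: users see only rows from their own org,
    # unless they are admins.
    return session.get("role") == "admin" or row["org_id"] == session.get("org_id")

def query(rows: list[dict], session: dict) -> list[dict]:
    # Every access path (API, chat, dashboard) goes through the same
    # filter, so identical rules apply everywhere.
    return [r for r in rows if can_read(r, session)]

tickets = [
    {"id": 1, "org_id": "acme"},
    {"id": 2, "org_id": "globex"},
]
print(query(tickets, {"role": "user", "org_id": "acme"}))  # only Acme's ticket
print(query(tickets, {"role": "admin"}))                   # everything
```

In practice there might be a hundred such rules, but each one is still just data-plus-session in, yes-or-no out.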
The third, and this is kind of interesting, is to get the LLM to figure out the plan to access data by itself. We don't have to hard code it, and we don't have to do the work. And then you're like, Tanmay, listen, what are you smoking, man? LLMs can't even reason.
I can't even get it to count the number of R's in strawberry. How are you going to make it fetch all of this data from three or four different places and whatnot? And we're like, you know what? There's a really simple fix to this problem.
But let me ask you a live question. How many of you can count the number of I's in supercalifragilisticexpialidocious? Can you? You can't, right? You're being mean to the LLM by asking it such questions. Don't be mean to the LLM. Set it up for success. Ask it to write Python code to solve the problem, and it works.
And that's it. So when we're asking our LLMs to figure out how to retrieve data, we just ask them to write Python code to fetch the data that we want. So if the AI singularity is coming, get ready for the data singularity. Put everything together.
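To make that concrete, here's the shape of a plan an LLM might emit for the Blockbuster email demo. The `run_sql` connector and the queries are hypothetical stand-ins, with canned data so the sketch is self-contained; the point is that the "plan" is ordinary Python: fetch, then combine.

```python
def run_sql(query: str) -> list[dict]:
    """Stand-in for a connector that would run SQL against live data.
    Returns canned rows here so the sketch runs on its own."""
    fake_db = {
        "customers": [{"name": "Pat", "email": "pat@example.com"}],
        "rentals": [{"title": "The Matrix"}, {"title": "Heat"}],
    }
    return fake_db["customers"] if "customers" in query else fake_db["rentals"]

# Step 1: find the top customer.
customer = run_sql(
    "SELECT name, email FROM customers ORDER BY spend DESC LIMIT 1"
)[0]

# Step 2: find their recent rentals.
movies = [r["title"] for r in run_sql(
    "SELECT title FROM rentals ORDER BY rented_at DESC LIMIT 2"
)]

# Step 3: combine into the email the user asked for.
email = (
    f"Hi {customer['name']}, thanks for your patronage! "
    f"Hope you enjoyed {', '.join(movies)}."
)
print(email)
```

Because the plan is code, it can hit two or three different sources, branch on what comes back, and still be something you can read and audit afterwards.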
If you're doing AI, you need access to data. If you're doing data, you wish it could talk to your AI. If you have AI and data and you need to get them to talk to each other, come visit us at our booth. Everything's in the open at hasura/pasture.ai.
Talk to you folks soon. Thank you for your time. Bye.