Okay, so we're gonna go through putting together our reader model and the FastAPI wrapper around all of this. So originally I was just gonna do the reader model in this video, but I realized it's actually super easy, so we'd almost be done already if we were just doing the reader model.
So what we're gonna do is wrap everything up in the API, and then we'll add the reader model into that. So this is where we got to before. We have our Jupyter notebook, and this has everything. So it has our document store, the retriever, and we're just retrieving a few things.
So we've got a few contexts here. Now, obviously notebooks are fine for when we're putting things together and testing things, but we need to switch over to actual Python files. So we're gonna switch across to VS Code over here. So this is our project directory on the left here.
So we have the code and the labs in here, which we already put together. We have our data, which is Meditations. And I also put together this requirements.txt file. So in here, these are all the libraries that we need to pip install. So we have fastapi, farm-haystack, which we already installed, and uvicorn.
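So as a rough sketch, requirements.txt is just something like this (unpinned here, so your versions may differ):

```
fastapi
farm-haystack
uvicorn
```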
FastAPI and Uvicorn are the two that I want to focus on. So FastAPI, we're obviously using that to build our API, and Uvicorn we're going to use to spin up the server that our API will be hosted on. So that's our requirements.txt. I'm gonna create a new folder in here. For now, I'm just gonna call it QA service.
I mean, we can change these later on; I'm not really sure what to call them all. So that goes in the code directory. Okay, and then in here, we'll create our first Python file, and this is just going to be our API, so we'll call it api.py. And now we'll start putting together our actual API.
So let's switch across to that window. And the first thing we need to do is import FastAPI. So we do from fastapi import FastAPI. And then we want to initialize our API here, so let's call it app equals FastAPI(). I just need to fix that there.
Okay, and that initializes our API. And then whenever we want to create an HTTP method, all we need to do is this. So we do app dot, and then in here, we write the method that we'd like to add to our API. So the only one that I think we're gonna need, at least at the moment, is a GET method.
So we'll add that. And then we also need to add a path to this method. And in this case, we're going to be querying our Q&A model, so we'll just use /query as the path. And then we do async, and here we have our function. So this is just going to be the query function.
And we want this query function to, at some point, accept a query. But for now, we'll just leave it, and we're gonna add that in a moment. Now, this will just return 'hello world', so that we know that it's working. So let's spin this up. So I'm opening a terminal in this folder at the moment.
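Just to recap before we run it, api.py at this point looks something like this as a minimal sketch:

```python
from fastapi import FastAPI

# initialize the API
app = FastAPI()

@app.get('/query')
async def query():
    # placeholder response so we can check the API is working
    return 'hello world'
```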
So at the moment, we're at the top level of the project, so we'll go down a few levels. So we want to go into the code directory, and then in the code directory it's QA service, which we made just a moment ago. And from inside here, we start our Uvicorn server, which will host our API.
So for that, we just write uvicorn, and then we want our file name, which is api, and then whatever we've called our FastAPI instance in our code, which is app up here. And then we'll use the reload flag as well. So this means that whenever we edit our code, we don't need to restart the server in order for changes to load.
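So from inside the QA service directory, the command is just:

```
uvicorn api:app --reload
```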
Okay. So now we can see the address our API is being hosted on here, so I'm gonna copy that and open it in my browser. So over here, enter that. And we see this 'detail: not found'. So that's because we just went straight to the index. So let me just add the /query endpoint, and there we get hello world.
So this is the endpoint that we're gonna use for our FastAPI app. So we know that's working. And now let's add in the rest of our code so that we can begin querying our QA stack. So if we just check what we did in our Jupyter notebook, all we actually need to do is basically copy most of this across.
So let me just create a new cell here, and I'm gonna take all of this and copy it across. (keyboard clicking) And honestly, this should really be all we need. Okay, so we'll copy this and take it into our code here. So before we initialize our API, we're going to initialize the rest of our code.
So we have our document store, we initialize that, and it accesses the aurelius index. And then we have our retriever here, so we need to import those as well. And then as well as the retriever, we also have the reader model, which is the next step. So first, obviously, we need to import that, and we do from haystack.reader.farm import FARMReader.
And then when we put all of these together, we're also going to use something called a pipeline. And to use that, we just want to write from haystack.pipeline import ExtractiveQAPipeline. So the reason it's extractive QA is because it's question answering and we're extracting the answer from the text rather than generating it with a model.
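As a rough sketch, the imports at the top of api.py end up looking something like this (the haystack module paths here assume an older 0.x farm-haystack release, so they may differ depending on your version):

```python
from fastapi import FastAPI
from haystack.document_store.elasticsearch import ElasticsearchDocumentStore
from haystack.retriever.dense import DensePassageRetriever
from haystack.reader.farm import FARMReader
from haystack.pipeline import ExtractiveQAPipeline
```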
So that's why we're using the ExtractiveQAPipeline. Now, that should be everything that we need to import. And then all we need to do here is initialize our reader model, which is FARMReader. And then in here, we want to include our model name or path, and that's going to be equal to deepset.
Let me do this on another line. So deepset/bert-base-cased-squad2. And I don't know if we covered this already; I don't think we did. So this is a pretty standard Q&A model that I think a lot of people use, so it's a pretty safe one to start with.
And then after we've initialized these three parts of our stack, we need to initialize our pipeline object. So for that, we take our ExtractiveQAPipeline, and then we have our reader, which is going to be equal to our reader, and we also need to pass the retriever, which is equal to our retriever.
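Putting all of that together, the initialization looks roughly like this. Treat it as a sketch: the 'aurelius' index name carries over from the notebook, the DPR retriever falls back to its default query and passage encoder models, and the variable names are just placeholders:

```python
# connect to the existing Elasticsearch index we built in the notebook
doc_store = ElasticsearchDocumentStore(host='localhost', index='aurelius')

# dense (DPR) retriever, using the default encoder models
retriever = DensePassageRetriever(document_store=doc_store)

# extractive reader model pulled from the Hugging Face model hub
reader = FARMReader(model_name_or_path='deepset/bert-base-cased-squad2')

# tie the reader and retriever together into a single Q&A pipeline
pipeline = ExtractiveQAPipeline(reader=reader, retriever=retriever)
```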
So that should be everything that we need to do in order to initialize our model. So now what we want to do is query our data. So we do pipeline.run, and then in here, I'm just going to put a random query. So what was the one we used before?
It was, what did your grandfather teach you? Okay, and we just pass that as the query. And this is going to return a dictionary, so we can just return it straight away. Now, let's see if our API is still running. I think it should be.
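So at this point the endpoint is just running a hard-coded query, something like this:

```python
@app.get('/query')
async def query():
    # hard-coded query for now; we swap this for a real query parameter shortly
    return pipeline.run(query='What did your grandfather teach you?')
```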
(keyboard clicking) Okay. Yep, so it's just starting up again now. Okay, looks good. So I think that's good. And let's try and open it up. Over here, we should get loads of text. Yeah. So that's cool. So now we're getting the query, what did your grandfather teach you? And it's pretty hard to read like this.
We get this list, and then this goes all the way through here and we get all of these different answers. So the top-rated answer to 'what did your grandfather teach you?' is 'act and speak', pulled from a passage about sleep and about children and their parents.
Okay, well, it is interesting, but it's not amazing. So instead of using the DPR retriever, let's try swapping it out for a BM25 retriever, which is basically just an algorithmic retriever, and it should actually run faster as well. That's the ElasticsearchRetriever.
And all we do is swap those around, and I think the only parameter we need is the document store. Yeah. So, okay. Let's let that start up again. Okay, just waiting. Let's try this one and see how it compares, and then we'll go with one of these, I think. So let's go.
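The swap itself is just a change of retriever class, something like this (again assuming the older 0.x module layout):

```python
from haystack.retriever.sparse import ElasticsearchRetriever

# BM25-based retriever; only needs access to the document store
retriever = ElasticsearchRetriever(document_store=doc_store)
```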
(mouse clicking) And just refresh the page again. Okay. Yeah, this is a lot better, which is weird. I kind of thought the other one would do better, but that's fine. Maybe we need to fine-tune it a bit more than what we have. So what we have here is, what did your grandfather teach you?
And we get 'good morals and the government of my temper', which is, I think, pretty much exactly what I wanted it to return. So that's good. I see here there's a newline character as well, which means I think we need to format the text to remove those before we index it.
But that's cool, it's pretty cool. So that's really good. It's working, but we're not actually making our own queries yet, so we need to add that. And to do that, all we need to do is add q as a parameter here. This is our query string, and then in pipeline.run we just make the query equal to q.
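So the endpoint ends up looking something like this, with q picked up as a query-string parameter by FastAPI:

```python
@app.get('/query')
async def query(q: str):
    # q comes in on the query string, e.g. /query?q=...
    return pipeline.run(query=q)
```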
And now if we do that, rather than opening it in the browser, I'm going to open Insomnia here. So Insomnia, as you can see, is just an HTTP client. It allows us to send requests really easily and format everything in a nice way. I should have opened this before.
I'm not sure why I didn't; I kind of just forgot about it. So localhost 8000, and we're going to the query endpoint. And then in our query parameters, we have the q parameter, and in here I want to say, what did your grandfather teach you? Okay, and then we send this and we should get a response on the right.
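So the request we're sending is roughly this (Insomnia takes care of URL-encoding the spaces for us):

```
GET http://localhost:8000/query?q=what did your grandfather teach you
```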
Cool. And this is a lot better, so that's good. And let's try something else. So from what I've read, a lot of the time he just talks about the universe. So, what is the universe? And send that. It takes a while to run at the moment. Cool. So we get our query and then we get the answers.
And it's 'a well-arranged universe', 'the universe loves to make whatever is about to be'. Interesting. 'That which knows beginning and end, and knows the reason which pervades all substance and through all time by fixed periods (revolutions) administers the universe.' Yeah, it's pretty deep. Yeah, that was cool. So I think that's everything.
I don't want to include anything else, so that's pretty cool. So now if we have a look at this again, we've just done this entire section here and the API. I mean, we'll probably make changes in the future, but that's kind of all done, so we can cross that off.
And then the next bit is going to be probably the most difficult bit, at least for me, which is going to be the front end in Angular. So it should be pretty interesting. So let's see how that goes. But for now, I think that's it. So I'll see you again in the next one.