LangSmith 101 for AI Observability | Full Walkthrough

Chapters
0:00 LangChain's LangSmith
0:24 LangSmith Setup
2:37 LangSmith Tracing
4:54 Custom LangSmith Traceables
8:18 LangSmith Conclusion
Okay, so now we're going to take a look at AI observability using LangSmith. LangSmith is another piece of the broader LangChain ecosystem, and its focus is on allowing us to see what our LLMs, agents, and so on are actually doing. It's something we would definitely recommend using if you're going to be working with LangChain and LangGraph. Now let's take a look at how we would set LangSmith up, which is incredibly simple. I'm going to open this in Colab and install the prerequisites. You'll see these are all the same as before, but we now have the langsmith library here as well. We are going to be using LangSmith throughout the course, so in all the following chapters we'll be importing LangSmith and it will be tracking everything we're doing, but you don't need LangSmith to go through the course. It's an optional dependency, but as mentioned, I would recommend it.
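As a rough sketch, the install cell looks something like this (the exact package list and version pins in the course notebook may differ):

```python
# Course prerequisites plus the langsmith library; package names and
# versions here are illustrative, the course notebook pins its own.
!pip install -qU langchain langchain-openai langsmith
```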
So we'll come down here, and the first thing we will need is a LangSmith API key. We do need an API key, but it comes with a reasonable free tier. You can see here they have each of the plans, and this is the one we're on by default: free for one user, with up to 5,000 traces per month. If you're building out an application, I think it's fairly easy to go beyond that, but it really depends on what you're building. So it's a good place to start, and of course you can upgrade as required. We go to smith.langchain.com, and you can see that this logs me in automatically. I have all of these tracing projects; these are all from me running the various chapters of the course. If you do use LangSmith throughout the course, your LangSmith dashboard will end up looking something like this. Now, what we need is an API key, so we go over to Settings, then API Keys, and we're just going to create an API key. Because we're just doing some personal learning right now, I would go with a personal access token. We can give it a name or description if we want. Okay, we'll just copy that, then come over to our notebook and enter the API key there. And that is all we actually need to do; that's absolutely everything. The one thing to be aware of is that you should set your LangSmith project to whatever project you're working within. Within the course we have individual project names for each chapter, but for your own projects you should make sure it's something that you recognize and is useful to you.
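A minimal sketch of that setup, assuming the standard LangChain/LangSmith environment variables; the project name below is just an example, not the course's:

```python
import os
from getpass import getpass

# Enable tracing and provide the LangSmith API key we just created.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = getpass("Enter LangSmith API key: ")

# Use a project name you will recognize in the dashboard; the course
# uses a different project name per chapter.
os.environ["LANGCHAIN_PROJECT"] = "langsmith-openai-intro"
```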
So LangSmith actually does a lot without us needing to do anything more. Let's just initialize our LLM and start invoking it to see what LangSmith gives us. We'll need our OpenAI API key, so enter it here, and then let's just invoke "hello".
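A sketch of that step using ChatOpenAI from langchain-openai; the model named here is an assumption, not necessarily the one used in the course:

```python
import os
from getpass import getpass
from langchain_openai import ChatOpenAI

# OpenAI API key for the LLM calls themselves.
os.environ["OPENAI_API_KEY"] = getpass("Enter OpenAI API key: ")

# Model choice is illustrative only.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# A single call is enough to send a trace through to LangSmith.
llm.invoke("hello")
```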
Okay, so nothing has changed on this end; from the perspective of running the code, there's nothing different here. However, if we now go to LangSmith, I'm going to go back to my dashboard. You can see that the order of these projects has changed a little, because the most recently used project, the one at the top, is the LangSmith project for the current chapter; that was just triggered. So I can go into it and, look at this, we actually have something in the LangSmith UI, and all we did was enter our LangSmith API key and set some environment variables. That's it. We can click through to this and it will give us more information: you can see what the input was, what the output was, and some other metadata. There's not that much in here, but when we do the same for agents we'll get a lot more information. I can even show you a quick example from a future chapter. If we come through to the agents intro, for example, and take a look at one of these, we have the input and output, but on the left we get all this extra information. The reason we get all of it is that agents perform multiple LLM calls, tool calls, and so on, so there's a lot more going on. We can see the first LLM call, then the tool-use traces, another LLM call, more tool use, and another LLM call. All of that is incredibly useful and incredibly easy to get, because all I did when setting up that agent chapter was set the API key and the environment variables, exactly as we have done just now. So you get a lot out of very little effort with LangSmith, which is great.
So let's return to our project here and invoke some more. I've already shown that we get a lot just by default, but we can also trace things that LangSmith wouldn't typically trace. To do that, we just import the traceable decorator from LangSmith and make these otherwise ordinary functions traceable within LangSmith. So we'll run those. We have three here: one generates a random number, one delays for a random amount of time before returning a string, and one either returns with no error or raises an error, so we can see how LangSmith handles these different scenarios. Then let's just iterate through and run each of them ten times and see what happens.
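A sketch of what those three traceables and the loop might look like; the function bodies, and the names apart from generate_string_delay (which is mentioned later), are assumptions based on the descriptions above:

```python
import random
import time

from langsmith import traceable


@traceable
def generate_random_number() -> int:
    # LangSmith records the inputs and the returned value for each run.
    return random.randint(0, 100)


@traceable
def generate_string_delay(input_str: str) -> str:
    # Sleep for a random number of seconds so the trace latencies vary.
    delay = random.randint(1, 5)
    time.sleep(delay)
    return f"{input_str} (delayed {delay} seconds)"


@traceable
def random_error() -> str:
    # Roughly half the runs raise, so we can see how errors are traced.
    if random.random() < 0.5:
        raise ValueError("Random error")
    return "No error"


# Run each traceable ten times; errors are caught locally but still
# show up as failed runs in the LangSmith UI.
for _ in range(10):
    generate_random_number()
    generate_string_delay("hello")
    try:
        random_error()
    except ValueError:
        pass
```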
While they're running, let's go over to the LangSmith UI and see what's happening. We can see that everything is updating and the information is coming through, and if we go into a couple of these we get a little more detail: the input, the output, and that this one took three seconds. We can also see the random error function here; in this case it passed without any issues. Now we have the rest of that information, and we can see that occasionally, when the random error function does raise, the run is flagged as an error and we can see the traceback that was returned, which is useful. So if an error has been raised, we're able to see what that error is. We can also see the varying latencies of these functions, along with the inputs to each one of our functions and, of course, the outputs. So we can see a lot in there, which is pretty good.
Another thing we can do is filter. If we come up here we can add a filter; let's filter for errors, which in this case would be ValueError. Then we get all of the cases where one of our functions raised an error, a ValueError specifically. That's useful, and there are various other filters we can add too. We could filter by name, for example, if we wanted to look at the generate_string_delay function only, and then we can see the varying latencies of that function as well. Cool, so we have that.
One final thing we might want to do is make those function names a bit more descriptive, or easier to search for. We can do that by setting the name of the traceable decorator, like so.
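For example, reusing one of the sketch functions from above; the exact display name used in the video is hard to make out, so "Chitchat Maker" is a guess:

```python
import random
import time

from langsmith import traceable


# The name argument overrides the function name shown in LangSmith,
# which makes runs easier to recognize and search for.
@traceable(name="Chitchat Maker")
def generate_string_delay(input_str: str) -> str:
    delay = random.randint(1, 5)
    time.sleep(delay)
    return f"{input_str} (delayed {delay} seconds)"


# Send a few more runs through so the new name shows up in the UI.
for _ in range(10):
    generate_string_delay("hello")
```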
So let's run that, run it a few times, and then jump over to LangSmith again, into the LangSmith project. You can see those coming through as well, and we can now search for them based on that new name, "Chitchat Maker", like so, and see all that information being streamed through to LangSmith.
That is our introduction to LangSmith. There is really not all that much to go through here; it's very easy to set up and, as we've seen, it gives us a lot of observability into what we are building. We will be using it throughout the course, though we don't rely on it too much. It's a completely optional dependency, so if you don't want to use LangSmith you don't need to, but it's there and I would recommend doing so. That's it for this chapter; we'll move on to the next one.