Claudette source walk-thru - Answer.AI dev chat #1
And this is the first of our Answer.ai developer chats, 00:00:17.040 |
who hopefully this will be a useful little summary of what 00:00:20.840 |
But we thought we'd also make it public because why not? 00:00:33.400 |
And we're going to be talking about a new library I've 00:00:37.720 |
been working on called Claudette, which is Claude's 00:00:45.440 |
And Jono has been helping me a bit with the library, 00:00:49.360 |
but he's going to, I think, largely feign ignorance 00:00:51.640 |
about it today in order to be an interviewer to attempt 00:01:11.520 |
And then I think there's a few different directions 00:01:16.040 |
One is the specifics of this library, how does it work? 00:01:20.000 |
But maybe also, since especially this is the first developer 00:01:22.640 |
chat and preview, we can also go into some of the meta questions 00:01:25.720 |
like, when does something become a library like this? 00:01:45.640 |
And in the top right is a link to the documentation. 00:01:51.160 |
And the documentation that you see here, the index, 00:01:57.880 |
But it's better to read it here because it looks a bit better. 00:02:03.840 |
And this is a library that people can pip install? 00:02:31.760 |
So you could also open that up in Colab, for example. 00:02:43.800 |
What is the point of this library in a nutshell? 00:02:47.360 |
OK, so by way of background, I started working on it 00:03:02.440 |
felt like-- still feel a bit like Claude is a bit underrated 00:03:08.160 |
I think most people use OpenAI because that's 00:03:20.560 |
And the nice thing about some of these models 00:03:26.320 |
have their things they're better at and things 00:03:30.000 |
So for example, I'm pretty interested in Haiku 00:03:49.320 |
which is where you can help it with not knowing stuff. 00:03:52.840 |
So yeah, I was pretty interested, particularly 00:04:01.840 |
did this video called A Hacker's Guide to LLMs last year. 00:04:08.760 |
really just to help out a friend, to be honest. 00:04:22.000 |
And one of the things I did in that was to say, like, oh, look, 00:04:24.720 |
you can create something that has the kind of behavior 00:04:32.000 |
you're used to seeing in things like Instructor and LangChain 00:04:38.800 |
So you don't have to always use big, complex frameworks. 00:04:44.640 |
would love to be able to use a library that's that small 00:04:51.200 |
try and build something that's super simple, very 00:05:04.080 |
And that way, they still don't have to write their own. 00:05:06.400 |
But they also don't have to feel like it's a mysterious thing. 00:05:11.240 |
to be this fairly minimal, like no really very few abstractions 00:05:27.680 |
but also to be pretty capable and pretty convenient. 00:05:35.640 |
You think it would be fair to say that this is more 00:05:38.040 |
replacing the maybe more verbose code that we've copied 00:05:40.960 |
and pasted from our own implementations a few times 00:05:43.200 |
versus introducing too many completely new abstractions? 00:05:47.720 |
Yeah, I think a lot of people got their start with LLMs 00:05:55.840 |
is a really good way to get started in some ways 00:06:01.480 |
But a lot of people kind of come away feeling like, 00:06:09.360 |
And I don't feel like I'm really learning at this point. 00:06:21.320 |
So partly, it's kind of for those folks to be like, OK, 00:06:24.160 |
here's how you can do things a bit closer to the bone 00:06:28.160 |
without doing everything yourself from scratch. 00:06:30.160 |
And for people who are already reasonably capable Python 00:06:35.160 |
programmers feel like, OK, I want to leverage that, 00:06:37.760 |
jump into LLMs and leverage my existing programming knowledge. 00:06:41.600 |
This is a path that doesn't involve first learning 00:06:44.880 |
some big new framework full of lots of abstractions 00:06:47.120 |
and tens of thousands of lines of code to something with, 00:06:50.760 |
I don't know what it is, maybe a couple of hundred lines of code, 00:06:58.240 |
And you can see step-by-step exactly what it's doing. 00:07:04.040 |
Do you want to start with a demo of what it does, 00:07:05.520 |
or do you want to start straight with those hundred lines of code 00:07:10.440 |
I'm inclined-- normally, I'd say do the demo, 00:07:12.760 |
but I'm actually inclined to step through the code 00:07:15.040 |
because the code's a bit, as you know, weird in that the code is-- 00:07:33.800 |
And that's because I tried something slightly different 00:07:38.400 |
to what I've usually done in the past, which I've tried to create 00:07:44.040 |
So the source code of this is something that we can 00:07:52.160 |
but it also is designed to teach you about the API 00:07:59.800 |
that it's doing to build on top of that and so forth. 00:08:02.000 |
So I think the best way to show you what it does 00:08:18.440 |
is a blogging platform that, amongst other things, 00:08:21.640 |
So we're just seeing the rendered version of this notebook. 00:08:56.200 |
You can close them all at once from this menu here, 00:09:00.000 |
And that basically will get rid of all the bits that 00:09:07.920 |
And if we say show all code, then you can see, yeah, 00:09:23.200 |
OK, so the idea of this notebook is, as I said, 00:09:30.120 |
as well as being the entire source code of the library, 00:09:32.400 |
it's also by stepping through it, we'll see how Claude works. 00:09:37.240 |
And so Claude has three models, Opus, Sonnet, and Haiku. 00:10:01.720 |
again, is actually a rendered version of a notebook. 00:10:09.920 |
And so you can see, for example, if I say models, 00:10:12.560 |
it shows me the same models that came from here. 00:10:18.640 |
It ends up as part of this Claudette notebook. 00:10:21.160 |
So that's the best Claude model, then middle, then worst. 00:10:33.440 |
with how much more you can do with these fast, cheap models 00:10:43.880 |
Any questions or comments so far from anybody? 00:10:58.800 |
But this is just trying to make it as smooth as possible, 00:11:03.800 |
Yeah, I don't want to have to remember these things. 00:11:05.960 |
And obviously, I wouldn't remember those dates 00:11:17.680 |
and definitely, this tiny minor thing is something I found nice 00:11:20.720 |
is to not have to think about model names ever again, 00:11:23.400 |
and also know that it goes, like, best, middle, worst. 00:11:27.480 |
I can just go straight to, like, OK, worst one. 00:11:45.040 |
If you pip install Claudette, you'll get this for free. 00:11:49.600 |
So I think it's nice if you're going to show somebody 00:11:54.760 |
how to use your code, you should, first of all, 00:11:56.800 |
show how to use the things that your code uses. 00:12:04.960 |
So the way it works is that you create the client. 00:12:16.080 |
So I'm going to pass in a message, I'm Jeremy. 00:12:19.440 |
Each message has a role of either user or assistant. 00:12:28.360 |
it's actually unnecessary, because they always 00:12:30.600 |
have to go user assistant, user assistant, user assistant. 00:12:35.960 |
So if you pass in the wrong one, you get an error. 00:12:39.160 |
So strictly speaking, they're kind of redundant. 00:12:42.160 |
So in this case, and they're just dictionaries, right? 00:12:49.200 |
So this is something I've said, whereas the assistant is 00:12:57.400 |
And then you can pass in various other things. 00:12:59.160 |
As you can see, there's a number of other things 00:13:05.560 |
that you can pass in, like a system prompt, stop sequences, 00:13:17.280 |
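The raw request being described here can be sketched as a plain dict of keyword arguments (the model name and values below are illustrative, not taken from the video):

```python
# The general shape of a raw Messages API request as described above
# (model name and values are illustrative):
msgs = [{"role": "user", "content": "I'm Jeremy"}]
req = dict(
    model="claude-3-haiku-20240307",        # one of Opus / Sonnet / Haiku
    max_tokens=100,
    system="You are a concise assistant.",  # optional system prompt
    stop_sequences=["\n\nHuman:"],          # optional stop sequences
    messages=msgs,
)
```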
So if I go ahead and run that, I get back a message. 00:13:28.880 |
And on the whole, it doesn't really matter which you choose. 00:13:32.360 |
When you build them, it's easier just to make them dictionaries. 00:13:51.920 |
If you're not sure what tokens are and basics like that, 00:13:57.440 |
then check out this Hacker's Guide to Language Models, 00:14:10.520 |
It could also reply with images, for instance. 00:14:14.160 |
And the text is, that's what it has to say for me. 00:14:26.920 |
The OpenAI one is more complicated to work with, 00:14:31.200 |
because they didn't decide on this basic idea of like, oh, 00:14:41.760 |
So one thing I know, and I'm sure lots of other people 00:14:45.120 |
do as well, is that often when you interact with an assistant, 00:14:50.920 |
about how the assistant should see their role. 00:14:55.360 |
You just started right off with a role from yourself as a user. 00:15:02.080 |
already starts with the default guidance to the assistants? 00:15:14.800 |
happy to talk to you without a system prompt. 00:15:22.200 |
But they went through instruction fine-tuning 00:15:29.320 |
So they know how to have some kind of default personality, 00:15:45.840 |
And I also know it's going to end up rendered 00:15:47.800 |
as our documentation and as our kind of rendered source. 00:15:51.920 |
So I don't really want things to look like this. 00:15:56.040 |
So the first thing I did was to format the output. 00:16:19.880 |
So this is just something that finds the first text block. 00:16:32.800 |
the bit I normally care about, because I don't normally 00:16:37.600 |
I know what the role's going to be, et cetera. 00:16:41.640 |
And then so from the text block, I want to pull out the text. 00:16:46.120 |
So this is just something that pulls out the text. 00:16:48.760 |
And so now from now on, I can always just say contents 00:16:55.040 |
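The two helpers just described can be sketched like this (a minimal illustration, assuming the response holds a list of typed content blocks with a `.text` attribute; the class names here are stand-ins, not the SDK's real types):

```python
class TextBlock:                      # stand-in for the SDK's text block type
    def __init__(self, text): self.text = text

def find_block(r, blk_type=TextBlock):
    "Return the first content block of the given type, if any."
    return next((b for b in r.content if isinstance(b, blk_type)), None)

def contents(r):
    "Pull the text out of the first text block of a response."
    blk = find_block(r)
    return blk.text.strip() if blk else ""

class FakeResp:                       # minimal response holding content blocks
    def __init__(self, content): self.content = content

demo = contents(FakeResp([TextBlock("  Hello Jeremy!  ")]))
```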
So something I really like, though, is like, OK, 00:16:58.920 |
to know the extra information, like the stop sequence 00:17:03.000 |
So in Jupyter, if you create this particular named method 00:17:14.000 |
then it displays that object using that Markdown. 00:17:28.520 |
And so you can see what that looks like here. 00:17:31.280 |
There's the contents, and there's the details. 00:17:38.160 |
how did Jeremy add this behavior to Anthropic's class? 00:17:45.520 |
Now, this is a nice little fastcore thing called patch, 00:17:48.800 |
where if you define a function, and you say patch, 00:17:53.400 |
it changes those existing types to give it this behavior. 00:17:56.920 |
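The idea behind fastcore's `patch` can be shown with a minimal re-implementation (the real fastcore version does considerably more; `Message` here is just a stand-in class, and `_repr_markdown_` is the method name Jupyter genuinely looks for when rendering an object):

```python
def patch(f):
    "Minimal re-implementation of the idea behind fastcore's `patch`."
    cls = f.__annotations__["self"]   # the target class comes from the `self` annotation
    setattr(cls, f.__name__, f)       # attach the function to the existing class
    return f

class Message:                        # stand-in for the SDK's response type
    def __init__(self, text): self.text = text

@patch
def _repr_markdown_(self: Message):
    # Jupyter calls this method to render the object as Markdown in a notebook
    return f"**{self.text}**"
```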
So this is now, if we look at ToolsBetaMessage 00:18:10.560 |
- I was going to say, there's like a trade-off 00:18:13.200 |
in terms of time here, where if you only ever 00:18:15.400 |
had to look at something once, you just manually type out 00:18:18.760 |
response dot messages zero dot block whatever dot choices 00:18:31.440 |
Like, every time I want to show what the response is, 00:18:47.120 |
is not the exact order it happened in in my head. 00:18:50.040 |
Because, yeah, it wasn't until I did this a couple of times 00:18:52.800 |
and was trying to find the contents and blah, blah, blah, 00:18:59.080 |
This is probably like 15 minutes later, I went back. 00:19:07.240 |
going to be important, like how much money you're spending 00:19:12.360 |
So I decided to make it easy to keep track of that. 00:19:34.280 |
so now if I say usage, I can see all that information. 00:19:37.960 |
And then since we want to be able to track usage, 00:19:40.000 |
we have to be able to add usage things together. 00:19:42.720 |
So if you override dunder add (`__add__`) in Python, it lets you use plus. 00:19:54.200 |
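The usage-tracking idea can be sketched with a tiny dataclass (illustrative, standing in for the SDK's usage object; the field names mirror what the Messages API reports):

```python
from dataclasses import dataclass

@dataclass
class Usage:                          # illustrative stand-in for the SDK's usage object
    input_tokens: int = 0
    output_tokens: int = 0
    def __add__(self, other):
        # Overriding __add__ is what lets us accumulate with plain `+`
        return Usage(self.input_tokens + other.input_tokens,
                     self.output_tokens + other.output_tokens)

total = Usage(10, 5) + Usage(3, 2)    # running total across two calls
```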
like all these basic things I'm working with all the time, 00:20:08.600 |
the Anthropic documentation, manually writes these. 00:20:14.720 |
But it doesn't take very long to write this once either. 00:20:17.760 |
And now if you just-- something as simple as defaulting 00:20:34.960 |
And so then since it always goes user, assistant, user, 00:20:41.000 |
you should be able to just send in a list of strings. 00:21:04.920 |
I can now pass in a list of messages a bit more easily. 00:21:24.520 |
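Because roles must strictly alternate, tagging a plain list of strings is mechanical. A sketch of the idea (illustrative; not necessarily Claudette's exact implementation):

```python
def mk_msgs(texts):
    "Tag a list of strings with alternating user/assistant roles."
    roles = ("user", "assistant")
    return [{"role": roles[i % 2], "content": t} for i, t in enumerate(texts)]

msgs = mk_msgs(["I'm Jeremy", "Nice to meet you!", "What's my name?"])
```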
So if I pass in something which has a content attribute, 00:21:38.560 |
And so that way, you can see the messages now. 00:21:54.160 |
watched my LLM Hacker's Guide, you know this. 00:22:01.440 |
Like when you chat with ChatGPT, it looks like it has state. 00:22:12.320 |
So when I say I forgot my name, can you remind me, please? 00:22:14.720 |
I also have to pass it all of my previous questions 00:22:33.320 |
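The statelessness being described can be shown with a toy stub in place of the real model (no API calls; `fake_model` is hypothetical and only "remembers" what it is sent):

```python
def fake_model(msgs):
    "Stub model: it can only 'remember' what is in the messages it is sent."
    said_name = any("I'm Jeremy" in m["content"] for m in msgs)
    return "Your name is Jeremy." if said_name else "I don't know your name."

history = [{"role": "user", "content": "I'm Jeremy"},
           {"role": "assistant", "content": "Nice to meet you."}]
history.append({"role": "user", "content": "I forgot my name. Can you remind me?"})
reply = fake_model(history)   # the full history goes along with every request
```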
OK, so I feel like something as simple as this 00:22:36.360 |
is already useful for experimenting and playing 00:22:40.520 |
And for me, I would rather generally use something 00:22:45.200 |
like a notebook interface for interacting with a model 00:22:48.080 |
than the kind of default ChatGPT thing or Claude thing. 00:23:00.880 |
So this was a big kind of thing for me is like, OK, I want to-- 00:23:05.600 |
I want to make a notebook at least as ergonomic as ChatGPT 00:23:11.680 |
plus all of the additional usability of a notebook. 00:23:14.560 |
So these are the little things that I think help. 00:23:22.760 |
because I generally pick the model once per session. 00:23:37.440 |
So maybe the transition that I see happening right now 00:23:40.680 |
is everything up until this point was like housekeeping 00:23:48.000 |
making my own convenience functions for that. 00:23:52.280 |
give you tracking usage over multiple conversations, 00:23:54.520 |
keeping track of the history and all of that. 00:23:56.400 |
So it seems like now we're shifting to like, OK, 00:23:58.560 |
I can do the same things that the API allows me to do, 00:24:05.280 |
But now it's like, OK, what else would I like to do? 00:24:12.000 |
because I don't want to spend all my money unknowingly. 00:24:18.280 |
And so what I used to do was to always go back 00:24:22.800 |
and check the billing because you can actually 00:24:28.000 |
So this way, it's just like I just saying like, OK, well, 00:24:35.040 |
And then I just wrote this tiny little private thing here. 00:24:38.640 |
We'll ignore prefill for now, which just stores 00:24:46.760 |
So now when I call it a few times, each time I call it, 00:24:54.600 |
And so again, I was going to ignore stream for a moment. 00:25:01.960 |
So dunder call is the thing that you basically 00:25:13.800 |
I'll come back to some of the details in a moment. 00:25:15.920 |
But the main thing it does is it calls make messages 00:25:27.480 |
And then it remembers the result and keeps track of the usage. 00:25:38.240 |
I do something, and I've now tracked the usage. 00:25:43.360 |
And so if I call it again, that 20 should be higher, now 40. 00:25:48.320 |
So it's still not remembering my chat history or anything. 00:26:00.320 |
So you'll see this is like a large function by my standards. 00:26:05.120 |
It's like 1, 2, 3, 4, 5, 6, 7, 8 whole lines. 00:26:16.600 |
So there's a couple of other things we do here. 00:26:23.960 |
is one of the few companies to officially support, 00:26:27.440 |
which is called prefill, which is where you can say 00:26:44.320 |
It literally has to start its answer with this. 00:26:59.400 |
So yeah, so basically, when you call this little tracking 00:27:09.400 |
And so if you want some prefill, then as you can see, 00:27:15.000 |
And the way it also-- so that's just to the answer 00:27:17.080 |
because Anthropic doesn't put it in the answer. 00:27:37.480 |
So basically, you pass in an assistant message at the end. 00:27:44.320 |
This isn't documented necessarily in their API 00:27:46.640 |
because it's like, oh, this is how you send a user message 00:27:51.400 |
And you have to kind of dig a little bit more to say, oh, 00:27:53.760 |
if I send you an assistant message as the last message 00:27:56.040 |
in the conversation, this is how we'll interpret it. 00:28:04.360 |
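The prefill mechanism just described can be sketched as follows (illustrative helper names; the key facts from the discussion are that the final message has the assistant role, and that the API's reply does not repeat the prefill text, so the caller glues it back on):

```python
def mk_prefill_msgs(prompt, prefill):
    "End the conversation with an *assistant* message Claude must continue."
    return [{"role": "user", "content": prompt},
            {"role": "assistant", "content": prefill}]

msgs = mk_prefill_msgs("Tell me a joke.", "Why did the chicken")
# The reply does not include the prefill, so prepend it to the returned text:
model_text = " cross the road? To get to the other side."   # pretend reply
full_answer = msgs[-1]["content"] + model_text
```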
They actually understand that prefill is incredibly powerful. 00:28:14.240 |
Claude does not listen to system prompts much at all. 00:28:50.040 |
I mean, they support tool calling or whatever. 00:29:00.320 |
That forces it to start answering the Python code. 00:29:04.000 |
So yeah, lots of useful things you can do with prefill. 00:29:08.240 |
Have you noticed personally that the improvement is significant 00:29:15.280 |
But I'm just curious what your anecdotal impression is. 00:29:21.520 |
So yeah, if you want it to answer with that start, 00:29:25.080 |
Unfortunately, GPT-4o doesn't generally do it properly. 00:29:38.080 |
So they're a bit all over the place at the moment. 00:29:46.120 |
Now, streaming is a bit hard to see with a short question. 00:29:56.000 |
And so you can see it generally comes out, bop, bop, bop, bop, 00:29:59.600 |
I don't have any pre-written funny-- that's terrible. 00:30:28.720 |
I don't think I want to wait to do the whole thing anyway. 00:31:19.940 |
which, without the framework, would be annoying. 00:31:25.020 |
So with this little tiny framework, it's automatic. 00:31:28.020 |
And you see this in the notebook in its final form, right? 00:31:30.400 |
But this call method was first written without any streaming. 00:31:34.020 |
Like, get it working on the regular case first. 00:31:52.140 |
And then we can test the stream function by itself 00:31:55.020 |
and test it out with the smaller primitives first 00:31:58.180 |
and then put it into a function that then finally 00:32:05.940 |
And this is like one of these little weird complexities 00:32:15.060 |
It's like, oh, we could just refactor this and move this 00:32:19.060 |
And then we don't need a whole separate method. 00:32:22.380 |
As soon as there's a yield anywhere in a function, 00:32:31.700 |
you have to pull your yields out into separate methods. 00:32:49.180 |
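The "weird complexity" of `yield` can be demonstrated directly: the mere presence of `yield` anywhere in a function body turns the whole function into a generator, so the non-streaming branch can't return a plain value. A small illustration:

```python
import types

def call_with_yield_inline(stream):
    # Because `yield` appears somewhere in the body, calling this function
    # ALWAYS returns a generator -- even when stream is False.
    if stream:
        yield from "abc"
    else:
        return                      # in a generator, `return` just stops iteration

def _stream():
    yield from "abc"                # the yield lives in its own method

def call(stream):
    # With the yield pulled out, the non-streaming path returns a plain value.
    return _stream() if stream else "abc"
```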
And we can add the two together, prefill and streaming. 00:33:02.500 |
And is there a way to try to reset the counter if you wanted 00:33:10.020 |
I mean, the way I would do it was I would just 00:33:17.020 |
But you could certainly go C.use equals usage 0, 0. 00:33:22.260 |
In fact, 0, 0 is the default. So actually, now I 00:33:25.540 |
think about it, we could slightly improve our code 00:33:32.300 |
would be a big benefit because we don't like characters. 00:33:39.460 |
So yeah, you could just say C.use equals usage. 00:33:54.580 |
do this manipulating the attributes of objects 00:34:04.940 |
people would often create some kind of set usage method 00:34:30.500 |
If I've had a multi-turn exchange with an assistant, 00:34:38.220 |
to convince the assistant that it said something it didn't? 00:34:43.140 |
Yeah, because we don't have any state in our class, right? 00:34:51.980 |
so here, we're passing in a single string, right? 00:35:02.740 |
So I said hi, and the model said hi to you too. 00:35:30.820 |
but we're just convincing it that this is a conversation 00:35:34.740 |
So now Claude is probably going to be slightly confused 00:35:37.780 |
by the fact that it reported itself not to be Claude. 00:35:42.340 |
No, I mean, I don't-- we haven't set a system message 00:35:53.500 |
So as I said, Claude's very happy to be told what it said, 00:36:05.620 |
Oh, you may be surprised to hear that I am actually Australian. 00:36:21.900 |
we get sidetracked and talk to Rupes for a good long time. 00:36:38.980 |
is the conversation, because it's got no state. 00:36:42.220 |
Now it's forgotten everything it's just said. 00:36:49.940 |
So remembering-- oh, actually, so before we do that, 00:37:01.740 |
got into multi-turn dialogue automatic stuff, 00:37:04.100 |
I wanted to have the basic behavior that Anthropic's SDK 00:37:39.380 |
is going through a lot of variations of the same thing. 00:37:46.180 |
And before you write code to churn out variations, 00:37:54.780 |
if I had a client and a bit of an exchange already built up, 00:37:59.180 |
and then I wanted to fork that and create five of them 00:38:02.020 |
and then continue them in five different ways, 00:38:16.900 |
would be the fluid way of doing it with this API? 00:38:32.200 |
Options equals-- how do you spell Zimbabwean, Johnno? 00:39:00.160 |
This is boring again, because I think we've gone back 00:39:12.240 |
Haiku is just really doesn't like pretending. 00:39:22.200 |
All right, anyway, Claude is being a total disappointment. 00:39:37.240 |
So the fact that it's reasonable to do this just 00:39:44.960 |
which is that there's no state hiding inside the C function 00:39:47.640 |
that we need to worry about mangling when we do that. 00:39:57.960 |
Everyone is a totally independent REST call to a-- 00:40:02.480 |
There's no nothing to tie these things together. 00:40:07.680 |
All right, we probably just spent cents on that question. 00:40:12.480 |
It's so funny, they're like a few dollars per million tokens 00:40:20.200 |
I look at this and like, whoa, all those tokens. 00:40:25.440 |
I've got to get used to not being too scared. 00:40:43.760 |
It would call some other weather API or something. 00:40:53.160 |
are specified using this particular format, which 00:40:57.960 |
And my goal is that you should never have to think about that. 00:41:14.160 |
So instead, we're going to create a very complicated tool. 00:41:17.240 |
It's something that adds two things together. 00:41:20.160 |
And so I think the right way to create a tool 00:41:26.600 |
So the thing about these tools, as you see from their example, 00:41:33.280 |
is they really care a lot about descriptions of the tool, 00:41:40.520 |
And they say quite a lot in their documentation 00:41:44.720 |
about how important all these details are to provide. 00:41:48.640 |
So luckily, I wrote a thing a couple of years 00:41:56.040 |
ago called docments that makes it really easy to add 00:42:15.920 |
Or if you want more room, you can put the comment before it. 00:42:23.600 |
And you can also put a description of the result. 00:42:27.280 |
You can also put a description of the function. 00:42:29.760 |
And so if you do all those things, then you can see here. 00:42:38.040 |
So this is the thing that creates the JSON schema. 00:42:42.680 |
You can see it's created the JSON schema from that, 00:42:50.760 |
And the return comment ends up in this returns. 00:43:04.880 |
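The schema generation being shown can be sketched with `inspect` (a simplified illustration only: Claudette uses fastcore's docments machinery, which also parses the per-parameter comments into descriptions; this sketch handles just types, defaults, and the docstring):

```python
import inspect

TYPE_MAP = {int: "integer", float: "number", str: "string", bool: "boolean"}

def sums(
    a: int,       # First thing to sum
    b: int = 1,   # Second thing to sum
) -> int:
    "Adds a + b."
    return a + b

def get_schema(f):
    "Build a tool schema of the general shape the Messages API expects."
    sig = inspect.signature(f)
    props = {n: {"type": TYPE_MAP[p.annotation]}
             for n, p in sig.parameters.items()}
    required = [n for n, p in sig.parameters.items()
                if p.default is inspect.Parameter.empty]
    return {"name": f.__name__, "description": f.__doc__,
            "input_schema": {"type": "object", "properties": props,
                             "required": required}}

schema = get_schema(sums)
```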
The model would probably still be able to use it. 00:43:21.120 |
so you would have to somehow create that JSON schema. 00:43:24.480 |
I don't know if it's got some default thing that 00:43:28.360 |
Oh, I'm more thinking like if we don't follow the documents 00:43:35.360 |
So if we got rid of these, so if we got rid of the documents, 00:43:44.440 |
You could get rid of the types and the defaults. 00:44:07.720 |
It appears like you have to annotate it there. 00:44:17.960 |
OK, so currently, that's the minimum that it needs. 00:44:25.600 |
And I don't know if it actually requires a description. 00:44:28.560 |
I suspect it probably does, because otherwise, maybe 00:44:31.360 |
I guess it could guess what it's for from the name. 00:44:34.560 |
But yeah, it wouldn't be particularly useful. 00:44:37.640 |
So OK, so now that we've got a tool, when we call Claude, 00:44:50.800 |
And now we're also going to add a system prompt. 00:44:53.560 |
And I'm just going to use that system prompt. 00:45:08.240 |
I also think user-facing, I think it's weird the way 00:45:11.880 |
Claude tends to say, OK, I will use the sum tool 00:45:21.160 |
like, they haven't got as much user-facing stuff. 00:45:23.160 |
They don't have any user-facing tool use yet. 00:45:25.680 |
So yeah, I don't think their tool use is quite 00:45:31.320 |
So if we pass in this prompt, what is that plus that? 00:46:12.800 |
And it grabs the name of the function to call. 00:46:16.280 |
And it grabs that function from your symbol table. 00:46:21.320 |
And it calls that function with the input that was requested. 00:46:26.720 |
So when I said the symbol table or the namespace, 00:46:32.000 |
from the name of the tool to the definition of the tool. 00:46:38.320 |
So if you don't pass one, it uses globals, which, 00:46:59.520 |
And it just creates a mapping from the name to the function. 00:47:06.800 |
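The dispatch step just described can be sketched in a few lines (illustrative; the real helper handles more cases): the tool-use request carries a function *name* and an input dict, and we look the name up in a namespace and call it.

```python
def sums(a, b):
    return a + b

def call_func(name, inputs, ns=None):
    "Look the requested tool up in a namespace and call it with its inputs."
    if ns is None:
        ns = globals()             # default to this module's symbol table
    return ns[name](**inputs)      # `**` unpacks the input dict as kwargs

result = call_func("sums", {"a": 604542, "b": 6458932})
```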
I'm just going to say, if I'm a somewhat beginner, 00:47:18.120 |
And I might not know what mappings are or callables. 00:47:20.840 |
There's namespaces and getattrs and dicts and isinstance. 00:47:25.520 |
How should I approach this code versus maybe the examples that 00:47:29.560 |
Because this is the source code of this library. 00:47:32.480 |
But you're not writing this with lots of comments or explanations. 00:47:37.080 |
So what should I-- like, if I come to this library 00:47:41.120 |
should I be focusing on the deep Python internals 00:47:44.280 |
versus the usage versus like the big picture? 00:48:05.520 |
And none of that-- like, you can see in the docs, 00:48:27.880 |
And yeah, the fact that I'm skipping over these details 00:48:36.000 |
or that everybody should understand them or any of that. 00:48:49.240 |
So these are all things that are built into Python. 00:48:54.440 |
But yeah, that'd probably be part of something 00:48:59.080 |
So one of the things a lot of intermediate Python programmers 00:49:05.440 |
to learn about bits of Python they didn't know about. 00:49:07.960 |
And then they use it as jumping off points to study. 00:49:13.280 |
And that's also why, like, OK, why do I not have many comments? 00:49:21.760 |
why you're doing something, not what you're doing. 00:49:25.880 |
So for something that you could answer, like, oh, 00:49:35.200 |
And so in this case, all of the things I'm doing, 00:49:38.760 |
once you know what the code does, why is it doing it 00:49:43.760 |
Like, why do we get the name of the function from the object? 00:49:53.720 |
They're things you call them, and you pass in the input. 00:50:04.320 |
you actually don't need to know any of these details. 00:50:11.720 |
yeah, the reason I'm using these features of the language 00:50:19.240 |
it's because I'm using them in a really normal, idiomatic way 00:50:27.640 |
that's a perfectly useful thing to learn about. 00:50:32.120 |
And I'll add that, like, I'm learning this stuff 00:50:35.960 |
Like, you don't have to know any of this to be a good programmer, 00:50:40.200 |
And I think, like, some of these things we wrote multiple ways, 00:50:43.360 |
maybe one that was more verbose first, and then we say, 00:50:45.600 |
oh, I think we can do this in this more clever way 00:50:48.800 |
So if you are watching this and you are wanting to learn 00:50:57.120 |
But it's also, like, it's totally OK if you're not, 00:51:02.880 |
is the way I write all of my code, pretty much, 00:51:10.000 |
I write nearly all of it outside of a function in cells. 00:51:19.240 |
So, like, let's set ns to none, so then I can run this. 00:51:27.400 |
It's like, oh, wow, everything in Python is a dictionary. 00:51:41.120 |
But just offer one perspective to maybe make a little bridge 00:51:45.320 |
to the why these internals might be unfamiliar point of view, 00:51:49.000 |
just to recap and make sure I understand it right. 00:51:51.800 |
From the user point of view here, when we use tools, 00:51:59.000 |
in the way we're doing it now, that describes a function 00:52:09.600 |
So with this library, I can write a function in Python 00:52:13.600 |
and then tell Claude to call the function that's 00:52:21.000 |
For that to work, if it wants to, if it chooses to. 00:52:25.800 |
needs to do the magic of reading a text string that 00:52:35.240 |
but having that become Python code that runs in Python. 00:52:38.720 |
And that's a somewhat unfamiliar thing to do in Python. 00:52:44.520 |
or back in Lisp, where a lot of this stuff got started. 00:52:59.400 |
So in the end, this is the function we want to call. 00:53:07.920 |
In Python, this is just syntax sugar, basically, 00:53:13.160 |
for passing in a dictionary and dereferencing it. 00:53:27.280 |
So we were never passed a string of code to eval or execute. 00:53:57.120 |
Yeah, it's the name that came from our schema, which 00:54:04.880 |
Yeah, so if you look back at our tool schema, 00:54:22.160 |
So the flow is, we write our function in Python. 00:54:26.760 |
The library automatically knows how to interpret the Python 00:54:29.760 |
and turn it into a structured representation, 00:54:35.880 |
We're also feeding it the name for the function 00:54:38.880 |
that it's going to use when it wants to come back to us 00:54:43.200 |
When it comes back to us and says, hey, call the function, 00:54:46.440 |
We look up the original function, and then we execute. 00:54:50.360 |
It knows it's got a function that can do this 00:54:56.360 |
And so then if it gets a request that can use that tool, 00:55:06.720 |
I'm going to call the function that Jeremy provided, 00:55:11.960 |
Yeah, so we'll see a bunch of examples of this. 00:55:14.400 |
And this is generally part of what's called the ReAct 00:55:17.560 |
framework, nothing to do with React, the JavaScript GUI 00:55:21.280 |
thing, but ReAct was a paper that basically said, like, hey, 00:55:28.720 |
And again, my LLM Hackers video is the best place 00:55:36.360 |
And so here we're implementing the ReAct pattern, 00:55:38.400 |
or at least we're implementing the things necessary for Claude 00:56:11.600 |
is in a notebook, which means you can play with it, which 00:56:16.320 |
I think is fantastically powerful because you never 00:56:21.320 |
You literally can copy and paste it into a cell and experiment. 00:56:25.400 |
And it's also worth learning these keyboard shortcuts 00:56:28.200 |
like C and V to copy and paste a cell, and like Cmd-A, 00:56:35.200 |
Cmd-left-square-bracket, Ctrl-Shift-hyphen. 00:56:39.640 |
There's all these nice things worth learning, 00:56:42.360 |
all these keyboard shortcuts to be able to use this Jupyter 00:56:50.880 |
we've now got this thing called call function, which 00:56:54.800 |
can take the tool use request from Claude, this function call 00:57:01.440 |
And it passes back a dictionary with the result of the tool 00:57:08.000 |
call, and when it asked us to make this call, 00:57:24.320 |
And that's the bit that says this is the answer 00:57:37.800 |
and Claude will say, oh, great, I got the answer, 00:57:44.040 |
So I put all that together here and make a tool response 00:57:47.640 |
where you pass in the tool use request from Claude, 00:58:03.240 |
We call that call function for every tool use request. 00:58:07.360 |
There can be more than one, and we add that to their response. 00:58:12.880 |
And so if you have a look now here, when we call that, 00:58:18.800 |
it calculates the sum, and it's going to pass back the-- 00:58:24.720 |
going to add in the tool use request and the response 00:58:36.840 |
and you can see Claude returns the string, the response. 00:58:43.240 |
So it's turned the result of the tool request into a response. 00:58:48.240 |
And so this is how stuff like Code Interpreter in ChatGPT 00:58:53.020 |
So it might be easier to see it all in one place, 00:58:58.400 |
and this is like another demo of how we can use it. 00:59:00.920 |
Instead of calling functions, we can also call methods. 00:59:08.440 |
So we can do the same thing, get schema dummy dot sums. 00:59:13.320 |
Yeah, so we make the message containing our prompt. 00:59:16.680 |
So that's the question, what's this plus this? 00:59:22.320 |
Claude decides that it wants to make a tool use request. 00:59:25.720 |
We make the tool request, calculate the answer, 00:59:29.800 |
add that to the messages, and put it all together. 00:59:54.120 |
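The whole round trip just summarized can be put in one place with a stub standing in for Claude (illustrative only; `fake_claude` is a hypothetical stand-in and no API calls are made):

```python
def sums(a, b):
    return a + b

def fake_claude(msgs):
    "Stub standing in for Claude: first asks for the tool, then answers."
    if len(msgs) == 1:                     # turn 1: request a tool call
        return {"type": "tool_use", "name": "sums", "input": {"a": 2, "b": 3}}
    return {"type": "text", "text": f"The answer is {msgs[-1]['content']}."}

msgs = [{"role": "user", "content": "What is 2 + 3?"}]
req = fake_claude(msgs)                    # model: "please run sums(a=2, b=3)"
out = {"sums": sums}[req["name"]](**req["input"])   # we run the tool locally
msgs += [{"role": "assistant", "content": repr(req)},
         {"role": "user", "content": str(out)}]     # tool result goes back in
final = fake_claude(msgs)                  # model turns the result into prose
```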
So if you're not comfortable and familiar with the ReAct 01:00:04.520 |
Definitely worth spending time learning about, 01:00:07.560 |
because it's an incredibly powerful technique 01:00:22.160 |
feel this way, that there's so many things that language 01:00:33.440 |
If you tell it like, oh, you've got access to this proof 01:00:37.960 |
checking tool, or you've got access to this account creation 01:00:45.240 |
And those tools could be things like reroute this code 01:01:03.000 |
you're not under obligation to send the response back 01:01:10.280 |
at this query from a customer and then respond appropriately. 01:01:21.320 |
that could be like, oh, I should exit this block, 01:01:33.600 |
Yeah, we're going to see a bunch more examples 01:01:36.520 |
in the next section because there's a whole module called 01:01:40.560 |
tool loop, which has a really nice example, actually, 01:01:46.520 |
to use this for customer service interaction. 01:01:56.960 |
going to go on to something much more familiar to everybody, 01:02:11.720 |
And it's going to start out as an empty list. 01:02:14.480 |
And it's also going to contain the client, which 01:02:28.200 |
it'll just pass it along to the client to get its use. 01:02:37.440 |
OK, so the system prompt, pass it in, no tools, no usage, 01:02:49.080 |
Again, there's a stream version and a non-stream version. 01:02:55.520 |
If you pass in stream, it'll use the stream version. 01:03:13.320 |
We could have put these methods directly inside here. 01:03:13.320 |
But I feel like I really prefer to do things much more 01:03:25.280 |
And then I can just gradually add a little bit to it 01:03:28.920 |
And I can also document it a little bit at a time, 01:03:57.560 |
So I just call get schema for you automatically. 01:03:59.800 |
And then at the end, we'll add to the history 01:04:21.040 |
And the reason why is because each time it calls the client, 01:04:29.640 |
So again, we can also add pre-fill, just like before. 01:04:41.560 |
So you can see adding chat required almost no code. 01:04:47.680 |
Really, it's just a case of adding everything 01:04:50.600 |
to the history, and every time you call the client, 01:04:54.320 |
So that's all a stateful-seeming language model is. 01:05:11.160 |
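That idea can be sketched in a few lines: a history list wrapped around a stateless client. The client call signature here is assumed for illustration, not Claudette's actual interface:

```python
class Chat:
    "Sketch: a stateful-seeming chat is just a history list plus a stateless client."
    def __init__(self, client, sp=""):
        self.client, self.sp, self.history = client, sp, []

    def __call__(self, prompt):
        # Append the user turn, send the *whole* history, append the reply.
        self.history.append({"role": "user", "content": prompt})
        reply = self.client(self.history, sp=self.sp)  # assumed client signature
        self.history.append({"role": "assistant", "content": reply})
        return reply
```

Because every call resends the full history, the model appears to remember earlier turns even though each API call is independent.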
And the nice thing, the kind of interesting thing 01:05:13.160 |
here is that because the tool use request and response are 01:05:19.680 |
both added to the history, to do the next step, 01:05:27.000 |
So I just run it, and it goes ahead and tells me the answer. 01:05:41.760 |
is, would you like to go ahead with this tool activation? 01:05:46.720 |
with the tool use block, like it would like to use this tool. 01:05:51.440 |
before it actually runs the code that you gave it? 01:05:53.720 |
Maybe you want to check the inputs or something like that? 01:05:57.720 |
So you would need to put that into your function. 01:06:06.280 |
So one of the things-- in fact, we'll see it shortly. 01:06:33.800 |
So I had my earlier question before we introduced chat, 01:06:40.360 |
just by forcing stuff into earlier exchanges. 01:06:49.240 |
Now that we have these tool interactions with tool IDs, 01:06:54.600 |
Like, let's say I had a sequence of interactions 01:07:24.020 |
So at this point, it's now going to be very confused, 01:08:29.920 |
and is that added as a tool block to the history? 01:08:42.080 |
It's just the tool block as part of the history. 01:08:58.160 |
So I was delighted to discover how straightforward images are 01:09:51.400 |
And the source is a dictionary containing base64, 01:09:57.760 |
Anyway, you don't have to worry about any of that 01:10:16.040 |
You can have multiple images and multiple pieces of text 01:10:21.920 |
So to do that, it means you can't just pass in strings. 01:10:30.680 |
So here we can say, all right, let's create-- 01:10:33.560 |
this is a single message containing multiple parts. 01:10:37.760 |
So maybe these functions should be called image part and text 01:10:44.240 |
A single message contains an image and this prompt. 01:10:55.880 |
And the first message contains a list of parts. 01:11:07.120 |
no particular reason to have to manually do these things. 01:11:09.920 |
We can perfectly well just look and see, oh, it's a string. 01:11:13.080 |
We should make it a text message, or it's bytes. 01:11:23.160 |
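The dispatch just described can be sketched like this. The helper names (`text_part`, `img_part`, `mk_content`) are illustrative; the image block shape follows the Anthropic Messages API:

```python
import base64

def text_part(s: str) -> dict:
    return {"type": "text", "text": s}

def img_part(data: bytes, media_type: str = "image/jpeg") -> dict:
    # Anthropic expects images as base64 inside a `source` dict.
    return {"type": "image",
            "source": {"type": "base64", "media_type": media_type,
                       "data": base64.b64encode(data).decode()}}

def mk_content(o):
    # Dispatch on type: strings become text blocks, bytes become images,
    # and anything else is assumed to already be a content block.
    if isinstance(o, str):   return text_part(o)
    if isinstance(o, bytes): return img_part(o)
    return o
```

With a dispatcher like this, a caller can mix raw strings and image bytes in one message and have the right block types built automatically.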
This is something I remember Jono and I talked about. 01:11:26.320 |
I think you said you feel like this is kind of like part 01:11:29.440 |
of the Jeremy way of coding is I don't go back and refactor 01:11:33.280 |
things, but I just redefine them later in my notebook. 01:11:36.960 |
And so I previously hadn't exported mk_msg. 01:11:44.160 |
now going to actually call mk_content to automatically 01:11:52.160 |
And so now we can just pass in-- we can call our client. 01:11:58.120 |
The list of one message contains a list of parts, 01:12:06.400 |
So behind the scenes, when we then run the last cell, 01:12:15.040 |
it actually generates a Python file containing 01:12:29.240 |
particularly when you look at how much empty space there is. 01:12:32.160 |
And all these things say which cell it comes from and so forth. 01:12:37.960 |
OK, so that is the first of two modules to look at. 01:12:47.200 |
Any thoughts or questions before we move on to the tool loop? 01:12:59.680 |
when you started this, if it was beyond what you've already 01:13:08.920 |
Is there plans for this to grow into a fully stateful chat 01:13:13.680 |
thing that can offer up different functionality? 01:13:15.880 |
What's the journey of, oh, I should write this thing that's 01:13:21.720 |
I mean, I imagine I must be a very frustrating person 01:13:24.800 |
to work with because I'd never have any plans, really. 01:13:32.360 |
that maybe I should do something in this general direction. 01:13:34.920 |
And then people ask me, like, oh, why are you doing that? 01:13:44.120 |
So yeah, I don't think I had any particular plans 01:13:57.800 |
including in the Anthropic documentation for Claude, 01:14:04.400 |
think people should have to write stuff like that. 01:14:07.840 |
And then when I started to write my own thing using 01:14:16.400 |
I looked at some of the things that are out there, 01:14:20.040 |
kind of general LLM toolkits, APIs, libraries. 01:14:25.120 |
And on the whole, I found them really complicated, too long, 01:14:34.200 |
taking advantage of my existing Python knowledge. 01:14:42.160 |
called LLM, which Jono and I started looking at together. 01:14:47.200 |
But it was missing a lot of the features that we wanted. 01:14:55.400 |
But yeah, in the end, I guess the other thing about-- 01:15:00.680 |
so the interesting thing about Simon's approach with LLM 01:15:03.840 |
is it's a general front end to dozens of different LLM 01:15:15.160 |
And as a result, he kind of has to have this lowest common 01:15:20.280 |
denominator API of like, oh, they all support this. 01:15:26.800 |
So this was a bit of an experiment in being like, 01:15:28.840 |
OK, I'm going to make this as Claude-friendly as possible. 01:15:32.440 |
Which is why I even gave it a name based on Claude. 01:15:38.440 |
you know, that's why I said this is Claude's friend. 01:15:44.800 |
And I didn't know ahead of time whether that would turn out 01:16:12.600 |
Like, the platforms are getting-- which is nice, 01:16:20.840 |
will be GPT's friend and Gemini's friend as well. 01:16:30.240 |
Maybe they'll have an entirely consistent API. 01:16:41.680 |
to be as good as possible to work with that LLM. 01:16:46.480 |
is it possible to make them compatible with each other 01:16:52.560 |
I mean, I'd be interested to hear your thoughts, Jono. 01:16:54.760 |
But like, when we wrote the GPT version together, 01:17:01.360 |
and we literally just duplicated the original Claudette notebook, 01:17:06.640 |
started at the top cell, and just changed the-- 01:17:10.880 |
and did a search and replace of Anthropic with OpenAI, 01:17:15.080 |
and of Claude with GPT, and then just went through each cell one 01:17:18.760 |
at a time to see how do you port that to the OpenAI API. 01:17:24.200 |
And I found that it took us, what, a couple of hours? 01:17:35.040 |
is that this is not the full and only output of the AnswerAI 01:17:40.000 |
This is like, oh, you saw things Jeremy is tinkering with just 01:17:44.320 |
So maybe, yeah, it's good to set expectations appropriately. 01:17:46.800 |
But also, yeah, it did really feel like it was pretty easy, 01:17:50.520 |
been inspired by, is the generous way of saying it, 01:18:07.920 |
You know, because it's so sequential, in some ways, 01:18:12.480 |
But it doesn't mean you can do it from the top 01:18:16.720 |
and you go all the way through until you get to the bottom. 01:18:21.560 |
And the only part that was even mildly tricky 01:18:25.960 |
made for the OpenAI one, which instantly got mirrored back 01:18:29.120 |
And then, again, because they were built in the same way, 01:18:30.880 |
it was like, oh, we've tweaked the way we do. 01:18:37.960 |
It was very easy to just go and find the equivalent function, 01:18:44.720 |
to write software, especially for this kind of like-- 01:18:49.240 |
beyond what is one or two notebooks of stuff. 01:18:52.600 |
If it did, I would add another project or another notebook. 01:18:58.560 |
These are kind of like the bases which we can build on. 01:19:20.560 |
in their documentation of a customer service agent. 01:19:24.200 |
And again, there's a lot of this boilerplate. 01:19:30.960 |
And then it's all a second time, because it's now the functions. 01:19:41.680 |
there's like a little pretend bunch of customers 01:19:59.760 |
And then each customer has a number of orders. 01:20:03.440 |
So it's a kind of a relational, but it's more like MongoDB 01:20:13.400 |
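The shape of that mock data might look something like this; the names and values here are made up for illustration, in the style of Anthropic's customer-service example:

```python
# Hypothetical mock data: each customer carries a nested list of orders,
# more document-store (MongoDB-like) than relational.
customers = {
    "C1": {"name": "John Doe", "email": "john@example.com",
           "orders": [{"id": "A1", "product": "Widget X", "status": "Shipped"},
                      {"id": "A2", "product": "Widget Y", "status": "Processing"}]},
}
```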
Yeah, so they basically describe this rather long, complex 01:20:20.320 |
And as you can see, they do absolutely everything 01:20:25.440 |
if you're really trying to show people the very low level 01:20:29.800 |
But I thought it'd be fun to do exactly the same thing, 01:20:39.720 |
So the first feature they implement is getCustomerInfo. 01:20:44.400 |
You pass in a customer ID, which is a string, 01:20:55.720 |
And so you'll see here we've got the documents, 01:20:59.160 |
we've got the doc string, and we've got the type. 01:21:14.400 |
And then something that they didn't quite implement 01:21:30.360 |
If it is there, then we'll set the status to cancel 01:21:51.240 |
is from their descriptions in the doc string here. 01:21:55.600 |
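A simplified sketch of what that schema generation might look like, building the JSON from the signature and docstring (the real toolslm `get_schema` handles many more cases, such as defaults and docments-style parameter comments):

```python
import inspect

PY2JSON = {int: "integer", str: "string", float: "number", bool: "boolean"}

def get_schema(f) -> dict:
    "Sketch: build an Anthropic-style tool schema from a function's signature and docstring."
    props = {name: {"type": PY2JSON.get(p.annotation, "string")}
             for name, p in inspect.signature(f).parameters.items()}
    return {"name": f.__name__,
            "description": inspect.getdoc(f) or "",
            "input_schema": {"type": "object",
                             "properties": props,
                             "required": list(props)}}

def get_customer_info(customer_id: str) -> dict:
    "Retrieves a customer's information and their orders based on the customer ID."
    ...
```

The point is that the docstring and type annotations you'd write anyway are enough to generate the JSON schema the API wants, with no hand-written boilerplate.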
So if we now go chat.tools, because we passed it in, 01:22:05.120 |
And so when it calls them, it's going to, behind the scenes, 01:22:13.560 |
But to see what that looks like, we could just do it here. 01:22:24.200 |
in a different library, which we created called toolslm. 01:22:34.600 |
OK, get_schema, oops, o for o in chat.tools, there you go. 01:22:48.400 |
pretty similar to what Anthropic's version had manually. 01:22:54.160 |
So yeah, we can say, tell me the email address of customer C1. 01:23:13.400 |
And so remember, with our thing, that's already now 01:23:20.120 |
And it automatically calls it on our history. 01:23:31.240 |
So you can see, as soon as we got that request, 01:23:45.600 |
our dear friend Hamel has a thing about saying, 01:23:45.600 |
I always want to be able to inspect what's going on. 01:23:49.080 |
Maybe we could do this at a couple of different levels. 01:24:08.320 |
was a tool use block asking for calling this function 01:24:16.080 |
And then we passed back-- and that had a particular ID. 01:24:18.480 |
We passed back saying, oh, that tool ID request, 01:24:33.440 |
OK, there's-- it's just telling us what we told it. 01:24:40.840 |
like I wanted to see the actual tool definitions and things, 01:24:58.320 |
So that has to be done before you import Anthropic. 01:25:17.440 |
And so here is the request method post URL, headers, 01:25:38.040 |
So now this is including all of that same information again, 01:25:44.160 |
because the model on Anthropic's side is not stateful. 01:25:48.400 |
We can see, OK, we've still got all of the tools, 01:26:03.320 |
can trust that the library does what you want. 01:26:08.920 |
Thank you, Anthropic, for having that environment variable. 01:26:13.080 |
Yeah, because in the end, if you're stuck on something, 01:26:15.680 |
then all that's happening is that those pieces of text 01:26:36.480 |
So this is interesting, because it can't be done in one go. 01:26:41.760 |
So the answer it gave us was, OK, tell me about customer C1. 01:26:54.720 |
So if we pass it back, then it says, OK, there are two orders. 01:27:15.560 |
So it's passed back some text and a tool use request 01:27:27.200 |
would have had both tool use requests in one go. 01:27:50.120 |
So this is something interesting that it does. 01:27:57.960 |
That's something that Opus, in particular, does. 01:28:00.560 |
So then-- no, OK, it's still only doing one at a time. 01:28:16.360 |
I haven't seen them before when I do API access. 01:28:24.280 |
going to have to-- given that it's only doing one at a time, 01:28:28.640 |
one to get the information about the customer, 01:28:30.840 |
then to cancel order A1, and then to cancel order A2. 01:28:33.920 |
But each time that we get back another tool use request, 01:28:42.400 |
So we've added a thing here called tool loop. 01:28:46.280 |
And for up to 10 steps, it'll check whether it's 01:29:03.520 |
Just like we just called-- because self is chat, right? 01:29:09.880 |
Optionally, I added a function that you could just 01:29:16.480 |
And I also added a thing called continuation func, which 01:29:21.600 |
So if these are both empty, then nothing happens. 01:29:24.160 |
It's just doing that again and again and again. 01:29:30.280 |
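The loop might be sketched like this. The chat interface assumed here (`stop_reason` in the response, a `run_tools` method, the hook names) is illustrative and differs in detail from Claudette's actual `toolloop`:

```python
def toolloop(chat, prompt, max_steps=10, trace_func=None, cont_func=None):
    "Sketch: keep calling the model for as long as it keeps requesting tools."
    resp = chat(prompt)
    for _ in range(max_steps):
        if trace_func: trace_func(resp)              # optional per-step hook
        if resp["stop_reason"] != "tool_use": break  # model has finished
        if cont_func and not cont_func(resp): break  # optional early exit
        resp = chat(chat.run_tools(resp))            # run tools, send results back
    return resp
```

If both hooks are left empty, it really is just "call, run tools, call again" up to the step limit.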
So now, if we say, can you tell me the email address 01:29:46.840 |
It's like, OK, please cancel all orders for customer C1. 01:30:17.960 |
You have a for loop that calls the thing 10 times. 01:30:35.680 |
And remember, it's got the whole history, right? 01:31:04.920 |
I think it should get back false, and it should know. 01:31:29.480 |
So I created this little library called toolslm. 01:31:29.480 |
which is get_schema, which is this little thing here. 01:31:40.200 |
And it's actually got a little example of a Python Code 01:31:55.760 |
Yeah, it's also got this little thing called Shell. 01:32:09.880 |
So get_shell is just a little Python interpreter. 01:32:09.880 |
And CodeChat is going to have one extra method in it, 01:32:34.880 |
So this is important to tell it all this stuff, 01:32:39.480 |
And the result of the expression on the last line 01:32:43.360 |
If the user declines request to execute, then it's declined. 01:32:51.080 |
And so you can see here, I have this little confirmation 01:32:57.560 |
And if they say no thank you, then I return declined. 01:33:08.800 |
And I have a list of imports that I do automatically. 01:33:20.240 |
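The confirmation step might be sketched like this; `shell` and `confirm` are parameters here purely so the shape is visible without the actual CodeChat class from the notebook:

```python
def run_cell(code: str, shell=eval, confirm=input):
    # Ask the user before executing model-written code; entering
    # anything at all (rather than just pressing Enter) declines.
    if confirm(f"Press Enter to execute, or enter anything else to skip:\n{code}\n"):
        return "Declined request to execute code."
    return repr(shell(code))  # result of the last expression, as a string
```

Returning a "declined" string (rather than raising) matters: the refusal goes back to the model as the tool result, so it knows the code was not run.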
And just a little reminder, Haiku is not so smart. 01:33:22.960 |
So I tend to be a little bit more verbose about reminding it 01:33:27.920 |
And I wanted to see if it could combine the Python 01:33:33.600 |
So I created a simple little thing here called getUser. 01:33:50.600 |
So in trying to figure out how to get Haiku to work-- 01:34:01.440 |
was to give it more examples of how things ought to work. 01:34:32.160 |
that would include the actual tool calling syntax or anything? 01:34:39.000 |
It seems to be enough for it to know what I'm talking about. 01:34:53.120 |
So for the OpenAI one, I just added today, actually, 01:34:59.640 |
is a function you can call because GPT does care. 01:35:03.560 |
So we might add the same thing here, mockToolUse. 01:35:10.480 |
yeah, here's the function you're pretending to call. 01:35:12.680 |
Here's the result we're pretending that function had. 01:35:38.040 |
that multiplies together the ASCII values of each character 01:35:52.160 |
So that's just coming from this input with this message. 01:35:57.120 |
So it's actually, if you enter anything at all, it'll stop. 01:36:08.240 |
We'll press N. OK, so it responded with a tool use 01:36:19.920 |
And because it's in the tool loop, it did run that code. 01:36:31.000 |
So it's responded with both text as well as a tool use request. 01:36:45.840 |
So all that's happened is that behind the scenes, 01:37:18.800 |
So if I just write checksum, that should show it to me. 01:37:41.080 |
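A function matching that description, multiplying together the ASCII values of each character; this is a reconstruction from the description in the chat, not necessarily the model's literal output:

```python
def checksum(s: str) -> int:
    "Multiply together the ASCII (code point) values of each character."
    result = 1
    for ch in s:
        result *= ord(ch)
    return result

checksum("ab")  # 97 * 98 = 9506
```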
So you can play around with the interpreter yourself. 01:37:43.880 |
So you can see it has the interpreter has now 01:37:46.960 |
And so this is where it gets quite interesting. 01:37:48.920 |
Use it to get the checksum of the username of this session. 01:37:51.960 |
So it knows that one of the tools it's been told exists 01:38:01.880 |
added a set_tools that appends self.run_cell automatically. 01:38:01.880 |
So it now knows that it can get the user and it can run a cell. 01:38:18.080 |
So if I now call that, you can see it's called getUser. 01:38:31.120 |
use with multiple tools, including our code interpreter, 01:38:34.760 |
which I think is a pretty powerful concept, actually. 01:38:43.440 |
And if you wanted to see the actual code it was writing, 01:38:47.720 |
or look at the history, or inspect that in some other way. 01:38:55.960 |
So we've used showContents, which is specifically just 01:39:01.440 |
If we change it to print, it'll show everything. 01:39:05.120 |
Or yeah, you can do whatever you like in that trace function. 01:39:10.400 |
And of course, we could also set the ANTHROPIC_LOG=debug environment variable. 01:39:10.400 |
So yeah, none of this needs to be mysterious. 01:39:33.480 |
where the only thing I bother documenting on the home page 01:39:36.240 |
is chat, because that's what 99.9% of people use. 01:39:39.760 |
You just call chat, you just pass in a system prompt, 01:39:52.000 |
So for the user, there's not much to know, really. 01:39:57.200 |
It's only if you want to mess around making your own code 01:39:59.640 |
interpreter, or trying something that you'd even 01:40:11.680 |
I mean, I feel like I'm quite excited about this way 01:40:22.120 |
Like, all I did just now was walk through the source 01:40:33.040 |
might have done it all already, but if you didn't, 01:40:38.280 |
and the Anthropic API, and the ReAct pattern, 01:40:38.280 |
And I remember I asked you, Jono, when I first 01:40:50.160 |
could you read this notebook and tell me what you think? 01:40:52.480 |
And you said the next day, OK, I read the notebook. 01:40:54.600 |
I now feel ready that I could both use and work 01:41:06.960 |
where you're coming from, how comfortable you 01:41:10.040 |
I think using it, it's very nice, very approachable. 01:41:12.920 |
You get the website with the documentation right there. 01:41:15.240 |
The examples that you use to develop the pieces 01:41:20.440 |
Yeah, reading through the source code definitely felt like, OK, 01:41:23.520 |
I think I could grasp where I needed to make changes. 01:41:26.320 |
I had to add something for a different project 01:41:32.880 |
Yeah, so it's quite a fun way to build stuff. 01:41:48.320 |
One is you have some personal metaprogramming practices that 01:41:55.960 |
So patching into classes instead of defining in classes, 01:41:59.960 |
liberal use of delegates, liberal use of double star 01:42:12.280 |
but eval-ish metaprogramming around interpreting-- 01:42:14.960 |
I mean, it's using the symbol table as a dictionary. 01:42:33.880 |
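fastcore's @patch is a good example of those practices: it attaches a method to an existing class, using the annotation on `self` to find which class. A minimal re-implementation of the idea (the real fastcore version does considerably more, e.g. handling multiple classes and properties):

```python
def patch(f):
    "Minimal sketch of fastcore-style @patch: attach `f` to the class its `self` is annotated with."
    cls = f.__annotations__["self"]  # the type annotation on `self` names the target class
    setattr(cls, f.__name__, f)
    return f

class Demo: pass

@patch
def greet(self: Demo, name):
    return f"hello, {name}"
```

This is what lets methods be defined cell by cell in a notebook, each documented where it is introduced, instead of all at once inside one class body.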
can help people to create stuff that otherwise 01:42:38.680 |
might be hard to create or might have otherwise 01:42:44.400 |
is to not write stuff that the computer should 01:43:00.760 |
instead of importing from FastCore and from the tools 01:43:06.640 |
it's like, oh, often I'll have a utils notebook, right? 01:43:10.880 |
that are completely orthogonal to what I'm actually doing, 01:43:15.760 |
It's like, oh, I have a thing for getting a list 01:43:21.200 |
into a different format or a thing for reading files 01:43:23.800 |
and changing the image type and resizing it to a maximum size. 01:43:27.480 |
You can still have the same idea where, like, 01:43:30.400 |
This is me implementing the pieces one by one. 01:43:32.360 |
Maybe there's somewhere else where, like, oh, there's 01:43:34.560 |
a few utility functions that are defined and documented 01:43:37.080 |
in their own separate place so they don't clutter things up. 01:43:39.640 |
This more general literate notebook-driven small example 01:43:45.440 |
doesn't require that you've written the FastCore library, 01:43:48.280 |
but it's helped along by having those pieces at hand. 01:43:51.960 |
And also, in general, FastCore, particularly FastCore.basics, 01:43:56.520 |
is designed to be the things I feel like maybe 01:44:06.880 |
without importing from that and without using stuff from that 01:44:15.000 |
I'll say, like, it's something I've definitely noticed. 01:44:17.520 |
There's two reactions I see to people reading my code, which, 01:44:30.840 |
And it's very intentionally not the same as everybody else's. 01:44:36.960 |
And they're like, why don't you just do everything the same way 01:44:42.880 |
And some people go, like, wow, there's a bunch of things here 01:44:51.560 |
And so, like, our friend Hamel, in particular, 01:44:51.560 |
He's just like, oh, I was just reading your source code 01:44:57.560 |
to that thing you did, and I learned about this whole thing. 01:45:11.840 |
Otherwise, they would have given up long ago. 01:45:19.840 |
If people can relate it to things in other languages, 01:45:22.600 |
Like, every time you do patch, I'm just like, OK, 01:45:25.520 |
Or every time you, you know, a lot of these things 01:45:30.280 |
Or in Ruby, they even call it monkey patching. 01:45:35.840 |
into the standard library, some Python programmers are like, 01:45:38.360 |
no, you're not allowed to use the dynamic features 01:45:41.960 |
of this dynamic language that were created to make it dynamic. 01:45:54.280 |
If you were reading the source code and you wanted more, 01:45:56.840 |
well, I think you got all the more you could want. 01:46:00.480 |
Yeah, hopefully we might try and do these for a few other 01:46:02.920 |
projects as well and work through the backlog 01:46:04.920 |
of the little side things that haven't really been documented. 01:46:08.000 |
But yeah, is there anything else you wanted to add for this 01:46:14.360 |
I mean, series is probably too grand a word, you know? 01:46:18.280 |
I think it's just like, from my point of view, 01:46:38.000 |
will feel some social pressure to do the same thing. 01:46:40.400 |
And I will then get to learn more about their work. 01:46:44.800 |
And Alexis and I have had a similar conversation. 01:46:46.840 |
And he's promised to teach me about some of his work 01:47:06.280 |
And hopefully this is something that provides some level 01:47:08.720 |
of public benefit to at least some people is-- 01:47:13.560 |
this is probably something that would be a private, secret, 01:47:19.680 |
And here, it's like, no, there's no need to do that. 01:47:23.920 |
Other people can benefit, too, if they want to. 01:47:26.120 |
Did any of you have anything else to add to that? 01:47:35.200 |
My one thought is I think these are a good idea. 01:47:41.320 |
So it's good to do deep dives on something that has depth to it. 01:47:45.000 |
It's also good to do shallow dives on something 01:47:48.400 |
that is quick and might be trivial to the person 01:47:52.680 |
explaining it, but is totally unfamiliar and, therefore, 01:47:55.800 |
high value to the person who hasn't seen it before. 01:48:01.680 |
Let's see who achieves the first TikTok-length explainer.