
Git Push, Get an AI API: Ryan Fox-Tyler


Chapters

0:00 Intro
0:22 AI platform
0:33 Getting started
0:57 Agenda
1:29 Hyper Categories
2:21 Game Play
2:33 Game Start
4:58 Results
7:12 Concepts
7:49 Why AI
8:25 Documentation
9:37 GitHub issue triage
10:04 Connecting to GitHub
13:56 Using the template
25:19 Inference History
26:04 System Logs
31:49 Hyper Mode
39:29 Deployment

Whisper Transcript

00:00:00.560 | At Hypermode, we believe iteration is everything, right? That's why we built a platform that makes it really easy to iterate over AI features. And over the course of today, over the next 50 minutes, really, you're going to be able to build a couple of AI features and incrementally iterate on them, right?
00:00:28.760 | And we think that incremental iteration is everything.
00:00:30.740 | And so that's why we're going to walk through it.
00:00:33.580 | So let's get started.
00:00:35.780 | As we dig in, there's a few things to be aware of.
00:00:38.100 | We are just launching.
00:00:39.180 | So I'm sure there'll be a couple of things that break.
00:00:41.040 | So bear with us on that, but we'll work through it.
00:00:44.320 | We are going to skip some steps to just condense this.
00:00:47.060 | So you're going to push right to main.
00:00:49.060 | That's OK.
00:00:49.760 | That's always fun.
00:00:51.740 | But let's really just kind of start to dig in.
00:00:55.140 | And over on the agenda, we're going to start with really kind of showing
00:01:00.020 | you a few different ways where this plays out.
00:01:01.620 | So we're going to start by playing a familiar game where we're using AI
00:01:05.600 | to scale the format.
00:01:07.120 | Then we're going to understand how that game's constructed
00:01:09.060 | and kind of break it down to the building blocks,
00:01:11.120 | and then start to apply those into triaging GitHub issues.
00:01:14.120 | We pick that because there's open data that we can easily use on that.
00:01:17.180 | You could think about applying that to a lot of different contexts,
00:01:19.600 | whether that's customer records or product records or other things
00:01:22.220 | in your application.
00:01:23.560 | But we'll start with GitHub issues and talk
00:01:25.400 | about how we can generalize those concepts to add AI
00:01:27.320 | to your application.
00:01:29.540 | So we've created a game that we call Hyper Categories.
00:01:33.620 | It's like the familiar game Scategories,
00:01:35.520 | but it works for a bigger format of people.
00:01:37.660 | The traditional game doesn't really work for more than six
00:01:40.540 | or eight people.
00:01:41.680 | So we're going to invite you all to play with us.
00:01:43.560 | So you'll need your phones in a second for a QR code to kind of join
00:01:45.900 | into this, but the prompt really is pretty simple.
00:01:48.600 | Given a starting letter, provide the most unique entry
00:01:51.180 | that matches each category.
00:01:53.100 | You'll get a point if you actually match the category
00:01:56.200 | and you start with the right letter,
00:01:57.420 | but you'll share that point with anyone else
00:01:58.960 | that says something similar.
00:02:00.540 | So if the category was furniture and the letter was T,
00:02:03.280 | and you said table, anyone else that says table,
00:02:05.580 | you're going to share that point around that way.
00:02:07.780 | So you need to be unique.
00:02:08.780 | And we'll talk about how we're using AI to kind of power this.
00:02:11.580 | But it also helps you to start to think about the building
00:02:15.900 | blocks that you're going to use.
00:02:17.100 | So let's actually jump in.
00:02:18.600 | We'll play, and then we'll kind of break it down
00:02:19.980 | afterwards with how it's actually running.
00:02:21.880 | We do.
00:02:27.280 | We have a Hypermode jacket for whoever wins.
00:02:29.940 | So Matt's going to get that built and running.
00:02:35.180 | Here we go.
00:02:35.680 | Can everybody see that?
00:02:45.060 | All right.
00:02:45.560 | We are playing with things that start with the letter T.
00:02:49.260 | So please scan that QR code if you can.
00:02:52.460 | Hopefully, everybody can.
00:02:54.620 | If not, there is a link to it in the doc that accompanies
00:02:59.040 | this workshop, which they can get to... how?
00:03:04.540 | Hype.foo/workshop, right?
00:03:10.960 | We'll let it go a little longer than the timer here.
00:03:17.460 | It won't cut off.
00:03:18.560 | So if you haven't joined in yet, feel free.
00:03:20.260 | There is a bot playing against you as well.
00:03:38.540 | I'm curious to see the creativity that people come up with here.
00:03:44.860 | The bot tends to pick the obvious answers.
00:03:46.900 | So it doesn't do very well in this one.
00:03:48.500 | Need some music.
00:04:09.560 | Jeopardy music.
00:04:14.460 | There we go.
00:04:15.500 | Thank you.
00:04:17.840 | A little improv there.
00:04:18.640 | Yeah.
00:04:22.500 | I see people typing still.
00:04:25.940 | So we'll give it a little bit more time.
00:04:27.400 | We can end it whenever you like.
00:04:42.480 | I see most phones down.
00:04:50.020 | So another five seconds.
00:04:53.960 | Five.
00:04:54.500 | Why don't we see the results?
00:04:55.900 | Three, two, one.
00:04:57.860 | All right.
00:05:00.060 | Where is Kirk Byers?
00:05:03.000 | Congratulations.
00:05:03.860 | Come see us after.
00:05:04.500 | We'll grab you a jacket.
00:05:07.140 | Awesome to see.
00:05:07.700 | So you see the bot there in fourth place.
00:05:09.660 | So great job for three of you for beating the bot.
00:05:12.940 | And we have a tie.
00:05:14.500 | Oh, yeah.
00:05:14.800 | Sorry.
00:05:15.080 | I missed.
00:05:16.080 | Look at it.
00:05:16.980 | Francisco.
00:05:19.180 | Awesome.
00:05:19.680 | So come see us after.
00:05:20.580 | We have two jackets.
00:05:21.620 | We can make that work.
00:05:22.140 | Shall we look at the results?
00:05:23.340 | Yeah, let's look at it.
00:05:24.380 | All right.
00:05:26.460 | Things you wear.
00:05:27.420 | Tevas.
00:05:29.440 | Oh, that got an invalid.
00:05:30.600 | So we'll dig into that.
00:05:31.600 | AI is not always perfect in the evaluation.
00:05:34.640 | Still one.
00:05:35.100 | We need to train the model better.
00:05:36.340 | Yeah.
00:05:37.240 | Cool.
00:05:37.740 | So why don't we jump over to show you how this is actually
00:05:39.380 | working behind the scenes.
00:05:40.580 | This is a template that you can deploy.
00:05:42.200 | You can play it back at your company or wherever you're at.
00:05:45.940 | But really, we start to think about these type of AI
00:05:48.180 | applications.
00:05:50.220 | Can we get the screens lined up?
00:05:51.820 | The one on the left is a little off there.
00:05:54.620 | Oh, there we go.
00:05:56.020 | As these building blocks, right, that you're
00:05:57.760 | starting to assemble these into your application, right?
00:06:00.560 | So when you submit that response or when the bot
00:06:02.140 | submits the response, we score that as well.
00:06:04.240 | We validate the starting letter.
00:06:05.840 | We validate the category, right?
00:06:07.440 | And so we are using a model to make sure that you are hitting
00:06:09.600 | that category as we saw in the results there.
00:06:11.440 | It's not always perfectly right.
00:06:12.540 | We can continue to train that.
00:06:13.940 | But one of the things we learned as we were building this
00:06:15.940 | was if the letter was P and the thing was occupation,
00:06:20.260 | you could type in P doctor and the model would think
00:06:22.900 | that that was valid, right?
00:06:24.760 | So that's why we added a dictionary validation as well.
00:06:27.960 | And we really see this as illustrating the way we see the
00:06:30.540 | world of AI applications being developed.
00:06:32.440 | It's this mix of models and traditional programming paradigms
00:06:35.580 | that come together that actually make it useful, right?
00:06:38.940 | So we have that filtering stage and then we go into the scoring
00:06:41.680 | stage and we start to cluster those responses.
00:06:43.760 | We look at it and say if you said table and someone else said tables,
00:06:46.980 | we consider those the same answer, right?
00:06:48.680 | So we're using those processes to make sure that those things are matched
00:06:52.620 | and it's not just an exact text search match.
00:06:54.620 | These are ways you could start to scale this game to a size of this group, right?
00:06:58.260 | And then that goes to the leaderboard.
00:07:00.300 | And we show all this just so you start to get a feel for how these things are assembled.
00:07:03.960 | It's a set of building blocks.
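To make that flow concrete, here is a rough sketch of the filter-then-score pipeline just described, in TypeScript. The helper names (isDictionaryWord, matchesCategory, areSimilar) are hypothetical stand-ins for the dictionary check, the model-backed category check, and the similarity clustering; this is not the template's actual code.

```typescript
type Entry = { player: string; answer: string };

// Hypothetical helpers standing in for the pieces described above.
declare function isDictionaryWord(word: string): boolean;                     // dictionary validation
declare function matchesCategory(answer: string, category: string): boolean;  // model-backed check
declare function areSimilar(a: string, b: string): boolean;                   // e.g. "table" vs "tables"

function scoreRound(entries: Entry[], category: string, letter: string): Map<string, number> {
  // Filtering stage: keep entries that start with the letter, are real words,
  // and actually match the category.
  const valid = entries.filter(
    (e) =>
      e.answer.toLowerCase().startsWith(letter.toLowerCase()) &&
      isDictionaryWord(e.answer) &&
      matchesCategory(e.answer, category)
  );

  // Scoring stage: cluster near-duplicate answers so "table" and "tables"
  // share a single point among everyone who gave them.
  const scores = new Map<string, number>();
  for (const entry of valid) {
    const clusterSize = valid.filter((other) => areSimilar(entry.answer, other.answer)).length;
    scores.set(entry.player, (scores.get(entry.player) ?? 0) + 1 / clusterSize);
  }
  return scores;
}
```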
00:07:06.800 | Behind this is a set of functions, models, and data sources, right?
00:07:13.380 | So you start to think about this at a deeper level.
00:07:15.720 | It's basic programming concepts, right?
00:07:17.580 | We have a set of functions.
00:07:19.320 | We have a collection of user responses that we're doing similarity search across.
00:07:22.960 | A set of models that help power both those functions directly as well as the embeddings
00:07:26.720 | for the models or for the collections.
00:07:28.660 | And we're going to walk through all this in the next example and you're actually going to get hands on it
00:07:31.600 | and build these pieces into some connections, right?
00:07:34.240 | Whether that's within hypermode itself, out to your database, to a traditional API, to a model host.
00:07:40.640 | We make it really easy to stitch all that together.
00:07:43.420 | And so that's what we want to show you now is really to apply these concepts into triaging a GitHub issue.
00:07:49.660 | So if anyone has ever worked on an open source project, you know that you get GitHub issues
00:07:53.660 | from a wide variety of sources and quality is not always great, right?
00:07:57.700 | So we thought, what if we could actually apply AI to make it easier to triage those issues,
00:08:02.700 | to understand what type of issue it was, whether similar issues that may be already reported
00:08:07.280 | or trends that you're starting to see, and actually be able to summarize trends coming out of the repo, right?
00:08:11.340 | So that's the part that we would love for you to build with us.
00:08:15.480 | So you guys ready?
00:08:16.620 | We're going to build this, actually build it.
00:08:19.820 | If you've got laptops out, that'd be good.
00:08:22.320 | I'm going to get out of here and we're going to kind of work together.
00:08:25.080 | All this is documented, by the way, kind of chronologically, the way we're going to follow in the workshop.
00:08:31.460 | And to get there, there's a short link, which is... what is it?
00:08:36.040 | Hype.foo/workshop that will just take you into our Docs link, or you can go to docs.hypermode.com
00:08:43.040 | and go find the AI World's Fair tab here on the left.
00:08:47.080 | And feel free to browse around the rest of the docs and learn about Hypermode as you go.
00:08:51.040 | This is the platform we're using to build this, but also the concepts we're going to go through here.
00:08:54.320 | It really could work for any way that you're building an application that's going to leverage AI and functions and data and all that stuff here.
00:09:01.900 | Let's see, we already kind of went through a lot of this basic stuff with Hyper Categories.
00:09:05.900 | I will just say for prereqs, if you hadn't seen this earlier, hopefully you've got some kind of text editor, I like to use VS Code.
00:09:12.900 | You'll need Node.js 20 or higher, and that's pretty much it.
00:09:17.480 | Obviously, Git in some way to clone a repo and so forth.
00:09:21.480 | I like to use GitHub desktop, so you might see that flash on and off, but you can use the command line or whatever you're comfortable with.
00:09:27.220 | So, let's start with GitHub issue triage.
00:09:32.800 | We're going to go to this website we set up called JustShip.ai and this is something we just launched, so we'd love to hear any feedback you may have from it.
00:09:45.140 | There are a lot of other templates on here, but we're going to work with this GitHub issue triage template for the sake of this workshop.
00:09:51.180 | And if you're following along, so just click on the GitHub issue triage and we're going to click, it's going to tell you a little bit about the template, you could click to go view the template, but we're just going to go right ahead and deploy it.
00:10:03.420 | Now, for most of you, this will be the first time that you've been into Hypermode, so you will get a screen that looks like this, but when you go to connect it to your GitHub account, it's going to ask you to sign in.
00:10:15.640 | And that would be a normal GitHub login, so if you don't have a GitHub account, you would need to create one in order to continue.
00:10:22.780 | Question in the front?
00:10:24.180 | If I sign in, I need to give it access to all my current and future repositories.
00:10:29.780 | If I only select one, it seems to be one that doesn't appear.
00:10:32.880 | You should only need to give it access to the repository that we're working with today.
00:10:38.260 | If you're saying something different, we'd love to see what that looks like.
00:10:42.100 | If you have any concerns about that and don't want to continue, you could just watch along or we could check afterwards.
00:10:52.100 | I know that, I can't remember the exact detail, but I know that there's something on GitHub where some of the wording that GitHub gives you on that first link is a little confusing.
00:11:00.560 | It's only requesting repo permissions. That screen is a little confusing the first time if you don't have any existing repos.
00:11:08.100 | So you're going to basically, this is me here, Matt Johnson-Pint, and if you have other orgs, of course, you could pick them here.
00:11:18.100 | And you can name your repo anything you want. You can make it private or public.
00:11:23.100 | But we're going to create that, and I'm going to go to the one that I pre-created.
00:11:28.100 | It should land you, when all is said and done -- let me find it -- on something that looks like this.
00:11:38.100 | So, we'll go through this a little bit more later, but this is basically setting up, cloning a template off of GitHub.
00:11:47.100 | And if you look on GitHub, you'll see that there is a template, let me zoom in, that was cloned from the ship issue triage template repo.
00:11:57.100 | And it's pretty bare bones.
00:11:59.100 | It's got some boilerplate setup, but most everything we're working with here today is going to be in this functions directory.
00:12:05.100 | Let me go back to the doc and just see where we're at for moving things along.
00:12:14.100 | This will give them a little time to do that.
00:12:19.100 | Okay, so once we have that, we're going to want to clone it, right?
00:12:30.100 | So, you can git clone it.
00:12:32.100 | I have already git cloned that repo.
00:12:36.100 | So, if I open it in Visual Studio Code, I will be looking at something that looks like this.
00:12:45.100 | I'll just pause for just a few minutes to make sure that everybody can clone the starter template.
00:12:50.100 | You're going to see me work with it for a little bit, too.
00:12:52.100 | So, if you're a little behind, that's fine.
00:12:54.100 | You can always catch up later or just continue to watch.
00:12:59.100 | I will point out that you may see some things in red, even though everything is fine.
00:13:05.100 | And I'm going to just show you how to get rid of that.
00:13:07.100 | When you first clone the repo, you want to cd into the functions directory.
00:13:12.100 | And you're going to do an npm install.
00:13:15.100 | It's going to go download all the dependencies off of npm.
00:13:19.100 | And notice they're still in red.
00:13:21.100 | There is a little quirk in VS Code that we haven't quite figured out how to avoid.
00:13:25.100 | At this stage, you need to open a TypeScript file.
00:13:29.100 | But you can do a few different things.
00:13:31.100 | You can just restart the TypeScript language server and the red will go away.
00:13:34.100 | This is a VS Code issue that I've got an open issue on with that team.
00:13:40.100 | But it does install correctly.
00:13:42.100 | Two questions here.
00:13:43.100 | So when you're on JustShipAI, should you deploy on this?
00:13:49.100 | Sure.
00:13:50.100 | The question was if you are on JustShipAI, should you deploy or just go to GitHub?
00:13:54.100 | If you view it on GitHub, you're just going to see the template.
00:13:57.100 | And GitHub will also let you just use this template directly.
00:14:01.100 | It's up in the upper right-hand corner.
00:14:03.100 | But you can go through that stuff.
00:14:04.100 | But it's a little easier if you just go through the deploy and we'll automate that for you.
00:14:08.100 | Because we're also installing the HyperMode application.
00:14:11.100 | If you go through the green button on GitHub, then you'll have to import your project into HyperMode after the fact.
00:14:18.100 | Yeah.
00:14:19.100 | If you click on the deploy, that's creating your org and everything on HyperMode?
00:14:23.100 | Yeah.
00:14:24.100 | Jay, can you take a look at what they're saying?
00:14:32.100 | If people are having difficulties, we're going to send you somebody over and hopefully it's not everybody in the room.
00:14:39.100 | But yes, please, we can make this interactive within certain time constraints.
00:14:45.100 | There's nobody in the room after us, but I know you have other sessions to go to also.
00:14:49.100 | So we want to keep this on track.
00:14:52.100 | Shall I continue?
00:14:57.100 | Okay.
00:14:58.100 | So, we've installed the project.
00:15:01.100 | I've installed the dependencies.
00:15:03.100 | Let's build it.
00:15:04.100 | So we're going to use npm here.
00:15:06.100 | If you're used to yarn or bun or pnpm, those should all work.
00:15:10.100 | But I'm going to do npm run build.
00:15:13.100 | And you're going to get some output that has our nice little pretty ASCII art that I'm wearing right here.
00:15:22.100 | And it's going to give you some metadata about your library.
00:15:24.100 | And it's going to describe that there are two functions in this starter template.
00:15:28.100 | There's one called classify issue and there's another one called trend summary.
00:15:32.100 | So together we're going to take a look at those two functions.
00:15:35.100 | And so we can understand how it works and what we're actually doing here.
00:15:39.100 | And then I'm going to actually run some of these for you so you can see them in action.
00:15:43.100 | So let's just go look at...
00:15:46.100 | I'm actually going to take them in backwards order here.
00:15:48.100 | They're listed alphabetically.
00:15:49.100 | But we're going to look at the trend summary first.
00:15:53.100 | So this particular one, if we look at this, I'll make one other quick callout.
00:15:59.100 | So you may, if you're a web developer at all, you may see this and say, oh, we're in a TS file and it looks like we're writing TypeScript.
00:16:06.100 | We're actually writing something called AssemblyScript, which is a TypeScript-like language.
00:16:10.100 | It uses a lot of the same features of TypeScript and a lot of the same ecosystem as TypeScript.
00:16:14.100 | The reason we're using AssemblyScript is because all of your HyperMode functions are getting compiled to WebAssembly.
00:16:19.100 | And that's how we actually execute all of this code that you're writing in a nice, secure, fast, performant manner as it's running as part of the HyperMode platform.
00:16:28.100 | Won't dwell on that too much, but that's the intent here.
00:16:32.100 | HyperMode will get support for other languages in the future.
00:16:35.100 | This is what we're using today.
00:16:37.100 | So what are we using from here?
00:16:41.100 | For this first one, we're going to use OpenAI.
00:16:44.100 | And you'll see that we import an OpenAI chat model and a couple system message and user message objects so we know how to use it.
00:16:50.100 | I've got another little helper function that I wrote here.
00:16:54.100 | Let me just close that.
00:16:56.100 | That is going to go get some issues from GitHub.
00:17:00.100 | And if I drill into that, this is not AI at all.
00:17:03.100 | This is just an HTTP REST call.
00:17:05.100 | GitHub has a nice API that you can use to go fetch data.
00:17:08.100 | And I'm going to fetch data.
00:17:11.100 | Pass in some parameters into a template.
00:17:13.100 | I can even do a little logging so I can see what's going on.
00:17:17.100 | Requires some headers.
00:17:19.100 | There's a little comment here that, like, we have a little secrets management system, so you don't have to put secrets in your code.
00:17:24.100 | And we can still pass your GitHub API tokens through securely and so forth.
00:17:28.100 | Ultimately, you get your data.
00:17:30.100 | We come back and say, hey, there's some issue data.
00:17:32.100 | We're going to throw it in an issues object so we can start working with it in the rest of your code.
00:17:36.100 | And that's what you're getting back here in this issues.
00:17:39.100 | It's an issues object.
00:17:40.100 | It's an array of issues, right?
00:17:42.100 | Hey, go get some GitHub issues.
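For reference, the GitHub side of that helper is just the standard REST endpoint for listing issues. This is a plain Node-style sketch of that call, not the template's AssemblyScript helper; the real one routes the request and the token through HyperMode's hosts and secrets configuration, and the owner, repo, and token here are placeholders.

```typescript
// Plain-fetch sketch of the GitHub call the helper makes (Node 18+).
interface Issue {
  title: string;
  created_at: string;
  user: { login: string };
}

async function fetchIssues(owner: string, repo: string, token: string): Promise<Issue[]> {
  const url = `https://api.github.com/repos/${owner}/${repo}/issues?state=open&per_page=100`;
  const res = await fetch(url, {
    headers: {
      Accept: "application/vnd.github+json",
      Authorization: `Bearer ${token}`,     // in the template, the token comes from secrets management
      "User-Agent": "issue-triage-demo",    // GitHub's API requires a User-Agent header
    },
  });
  if (!res.ok) throw new Error(`GitHub API returned ${res.status}`);
  return (await res.json()) as Issue[];
}
```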
00:17:45.100 | So once we have the issues, what are we going to do next?
00:17:48.100 | We're going to need to build a prompt to send off for analysis.
00:17:53.100 | Because what we want here is we're going to take a whole bunch of information from the issues.
00:17:56.100 | And we're going to say, hey, OpenAI, in this case, we're going to use GPT-4o.
00:18:01.100 | Would you summarize these for us?
00:18:03.100 | What's the overall trend?
00:18:05.100 | Is that what we're asking for?
00:18:07.100 | So it's not just summarize the issues, but like, give me a trend.
00:18:10.100 | So that's part of my instruction off to there.
00:18:13.100 | Provide a summary of the trends in their repository based on the issues created.
00:18:17.100 | And we're going to just send OpenAI, like a constructed string that includes like the timestamp,
00:18:24.100 | the user handle so we can kind of get some analysis, and like the issue title.
00:18:30.100 | And we're passing all that through into OpenAI.
00:18:35.100 | The other thing I'll call out for you here is like, now we're into some hypermode-specific code
00:18:41.100 | that we call our model interface.
00:18:43.100 | And this model that I'm looking at, it is an OpenAI chat model.
00:18:46.100 | So it knows everything about the OpenAI API.
00:18:49.100 | You don't have to go over and read the OpenAI docs in order to construct this API call.
00:18:54.100 | You just say, hey, I'm going to use the OpenAI chat model.
00:18:57.100 | And when I create input, it's telling me right here in the IntelliSense and the type-ahead.
00:19:01.100 | I can just say, create some messages.
00:19:03.100 | It's going to take some messages.
00:19:04.100 | I've got a bunch of different messages of different types.
00:19:07.100 | It even goes as far as like optional parameters.
00:19:09.100 | Here I show temperature.
00:19:10.100 | So a lot of people, when they're brand new to AI, they're like, they never heard this term before.
00:19:14.100 | What is temperature?
00:19:15.100 | I don't know what that is.
00:19:16.100 | And then you see top P and you're like, I don't get that.
00:19:19.100 | It's really intuitive for a developer to just say, you know, I can just say input dot.
00:19:24.100 | And I get like, here are all the different options.
00:19:27.100 | There's like log probs and n and parallel tool calls and all the different options that GitHub has -- or sorry, that OpenAI has.
00:19:34.100 | And, you know, I can read them individually.
00:19:37.100 | If I use them incorrectly, it won't compile.
00:19:40.100 | All that kind of stuff that you would expect from a strongly typed client library.
00:19:45.100 | And we've baked that into Hypermode.
00:19:47.100 | Not just for OpenAI, but for other models as well.
00:19:50.100 | We've got some out of the box and we're going to be expanding on them.
00:19:53.100 | And we've designed the system so that if we don't support a certain model's interface, you can write it yourself.
00:19:58.100 | There's nothing that prevents you from doing that.
00:20:00.100 | You can just inherit from a base class and write the implementation.
00:20:03.100 | I'll pause there for a second because I felt like that was a lot of words.
00:20:08.100 | This is a pretty basic RAG use case, right?
00:20:12.100 | We're fetching some data.
00:20:13.100 | We're passing it to the model to ask for summarization.
00:20:15.100 | You could do this a lot of ways, right?
00:20:17.100 | At Hypermode, we make it really easy to stitch all those pieces together, right?
00:20:20.100 | And you can start to iterate through that, right?
00:20:22.100 | To Matt's point, you can try new models, right?
00:20:24.100 | And you don't have to go learn a new model and understand that a different model uses temperature in a different way or has different thresholds on it.
00:20:29.100 | Right here, when you invoke a new model, do input dot temperature.
00:20:32.100 | You're going to see that right there in context.
00:20:34.100 | And so we find that it makes it a lot more productive for users as they're starting to stitch these
00:20:37.100 | things together, not having to make any of these really kind of hard choices up front, knowing that you can always swap them out really easily.
00:20:44.100 | One of the things we found when we were designing this is that a lot of people seem to believe that all models are interchangeable.
00:20:53.100 | And we found just through testing that it's just not the case.
00:20:55.100 | Not only are prompts different and outputs are different, but also the different hyperparameters are different.
00:21:00.100 | The different ways you can control the APIs are different.
00:21:04.100 | Like everybody's got their own little custom snowflake tweaks and we needed a way to consolidate that so that the user can very easily get to all the different options that are available.
00:21:14.100 | So this is the API we came up with.
00:21:15.100 | And I'd love feedback about that for any API design engineers that are in the room.
00:21:20.100 | Come talk to me after.
00:21:21.100 | All right, so the OpenAI call returns me some complex things.
00:21:27.100 | There is the ability in OpenAI to have multiple choices.
00:21:29.100 | We're just going to take the first one and we're going to take that message.
00:21:32.100 | And this may seem like a little thing, but like once I get the content of that string, I could do other things with it.
00:21:39.100 | In this case, all I'm going to do is just trim it because like sometimes it comes back with extra characters.
00:21:43.100 | But say I wanted to throw it in a database or I wanted to log it or I wanted to go call another model with that input.
00:21:49.100 | As soon as I say that people are like, oh, like LangChain.
00:21:51.100 | Well, kind of.
00:21:52.100 | It's just programming.
00:21:53.100 | It's just like you're taking one input and you're manipulating it to do the next thing.
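Putting those pieces together, the whole function is roughly the shape below. This is a condensed sketch, not the template verbatim: the import paths, the model name "text-generator", and the exact SDK signatures are approximations of what's shown on screen.

```typescript
// Approximate shape of trendSummary -- SDK names and field names are assumptions
// based on the walkthrough, not verified code.
import { models } from "@hypermode/functions-as";
import {
  OpenAIChatModel,
  SystemMessage,
  UserMessage,
} from "@hypermode/models-as/models/openai/chat";

export function trendSummary(owner: string, repo: string): string {
  // The REST helper described earlier returns an array of issue objects.
  const issues = fetchIssues(owner, repo);

  // Flatten timestamp, user handle, and title into one text block for the model.
  let lines = "";
  for (let i = 0; i < issues.length; i++) {
    lines += issues[i].createdAt + " " + issues[i].user + ": " + issues[i].title + "\n";
  }

  const model = models.getModel<OpenAIChatModel>("text-generator"); // name from hypermode.json
  const input = model.createInput([
    new SystemMessage(
      "Provide a summary of the trends in the repository based on the issues created."
    ),
    new UserMessage(lines),
  ]);
  input.temperature = 0.7; // optional hyperparameters are typed properties on the input

  const output = model.invoke(input);

  // OpenAI can return multiple choices; take the first message and trim it.
  return output.choices[0].message.content.trim();
}
```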
00:21:57.100 | So let's just run this.
00:22:00.100 | So what does that mean when we run it?
00:22:02.100 | In this case, it's already in HyperMode because I cloned from the template.
00:22:05.100 | But say I've just written this, all I have to do, and this is where the title comes from, is just commit, git push.
00:22:11.100 | And we pick it up automatically and roll that out into your HyperMode console, which I'll go back to.
00:22:17.100 | This is the one we're looking at.
00:22:19.100 | And you'll see that I've got trend summary on here.
00:22:23.100 | Now, I've already configured this back end to have an API key off to OpenAI, which you would need.
00:22:30.100 | But if you were doing it yourself, you would go over to hosts and you would need a GitHub token as well,
00:22:35.100 | because we're going to go pull issues off GitHub, right?
00:22:37.100 | So let's run this.
00:22:39.100 | We've got a little query tab here that uses GraphiQL.
00:22:41.100 | It's a built-in thing.
00:22:43.100 | We may replace that with something a little more interactive later, but it works for now.
00:22:47.100 | And I usually use Hugging Face Transformers, but let's have somebody throw out a
00:22:53.100 | favorite repository.
00:22:54.100 | I need the org name and the repo name. Anybody have another one?
00:22:59.100 | Anyone?
00:23:00.100 | Anyone?
00:23:01.100 | I'll pick something at random.
00:23:03.100 | Okay.
00:23:04.100 | I used to be, for a long time, I was a C# developer.
00:23:07.100 | Don't hold that against me.
00:23:09.100 | So I'm going to go, let's go see what's going on in the .NET Runtime.
00:23:12.100 | And I haven't tested this, so we'll just see what happens here.
00:23:15.100 | It's going to go fetch the top 100 issues off of .NET Runtime Repo.
00:23:19.100 | And within a few seconds here, it has summarized it and said, hey, there's the trend summary.
00:23:24.100 | And you notice we made a GraphQL call, but I didn't have to write any GraphQL schema.
00:23:30.100 | HyperMode automatically generates the GraphQL schema based on the functions that you export.
00:23:35.100 | So all I had to do was make this function trend summary and tell it what its inputs are.
00:23:39.100 | And it now has a fully lit-up GraphQL endpoint.
00:23:42.100 | I've just been calling it from the HyperMode console, but you could use another tool like
00:23:47.100 | Postman, which has a really good GraphQL interface, or any GraphQL client you want,
00:23:51.100 | or you could just use curl on the command line and write it into a script or your favorite
00:23:56.100 | HTTP client.
00:23:57.100 | You can use anything you want.
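If you do call it from your own client, it's just a normal GraphQL query against the project's endpoint, something like the following, where the field and argument names come from the exported function signature (owner and repo are assumed here) -- check the generated schema in the console for the exact names.

```graphql
# Hypothetical query shape for the trend summary function.
query {
  trendSummary(owner: "dotnet", repo: "runtime")
}
```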
00:23:58.100 | Matt, we've got a question?
00:24:00.100 | Yeah.
00:24:01.100 | Question.
00:24:02.100 | Will it create GraphQL types from inferred types?
00:24:07.100 | It does.
00:24:08.100 | Yeah, I'll repeat the question.
00:24:10.100 | It says, does HyperMode create the GraphQL types inferred from the TypeScript types?
00:24:16.100 | It does.
00:24:17.100 | That is part of the HyperMode experience.
00:24:19.100 | It will even create custom scalars for you if you have those.
00:24:23.100 | There are some limitations, but we are actively engineering to try to overcome them.
00:24:28.100 | The biggest limitation you'll see right away is, at the moment, input types are not
00:24:33.100 | supported.
00:24:34.100 | You can have structured output types.
00:24:35.100 | So input types at the moment have to be all scalars.
00:24:37.100 | Or arrays.
00:24:38.100 | But say you have your custom object in GraphQL, we'll let you supply that.
00:24:43.100 | We need to do some engineering work to make that work.
00:24:47.100 | Yeah.
00:24:48.100 | It's a really nice way to define your code once.
00:24:54.100 | And you have a single source of truth, and it can be in a file you import.
00:24:58.100 | If you're in a corporation that has lots of libraries, you can make separate libraries for
00:25:01.100 | this.
00:25:02.100 | And it's just the NPM package ecosystem.
00:25:04.100 | So it just folds in like you would expect.
00:25:06.100 | When we support other languages, it will be whatever the package ecosystem is for those
00:25:09.100 | languages.
00:25:10.100 | We're not trying to do anything special here.
00:25:12.100 | So, Matt here, why don't we show what you get as well when you run that query, right?
00:25:16.100 | So if you go to the inferences tab, you'll see actually a history of this model run, right?
00:25:21.100 | Because we know that when you start to actually run models, you need more observability into
00:25:24.100 | that, right?
00:25:25.100 | Your app may be triggering things from the front end, dynamically prompting that model.
00:25:29.100 | So we make it really easy to see every time you hit a model, you're going to get one of these
00:25:32.100 | entries to understand how long it took, what was the input, and what was the output, right?
00:25:36.100 | And so you can start to debug this with additional visibility.
00:25:39.100 | And we find that makes it really easy, especially if you're trying to chain multiple models together
00:25:43.100 | in a single function.
00:25:44.100 | Yeah, like if I wanted to know, well, how many tokens did OpenAI consume for that?
00:25:48.100 | It did return that to me, but I didn't use it in my code.
00:25:51.100 | I could have, but even though I didn't use it in my code, I do have it in our inference history.
00:25:55.100 | So you can go back and say, oh, that prompt, you know, took 3,000 tokens.
00:26:00.100 | And then you also have access to raw logs of the functions as they run, right?
00:26:05.100 | So you can log things out to that.
00:26:06.100 | You'll see system logs are part of that, and they're all kind of oriented based on each individual
00:26:12.100 | function execution, right?
00:26:13.100 | So you can have high volumes of overlapping functions.
00:26:16.100 | It makes it really easy to debug here to understand what's going on, how models are being invoked,
00:26:20.100 | what the specific inputs and outputs of those are, so that you can continue to iterate and
00:26:23.100 | understand what's going on.
00:26:24.100 | This is the one I just ran.
00:26:26.100 | These are some ones I was running earlier, but we'll show you those in a minute.
00:26:30.100 | There's a whole bunch of other stuff on here.
00:26:32.100 | One of the interesting ones is like deployments.
00:26:34.100 | I've only done one on this repo.
00:26:36.100 | It was just the initial commit.
00:26:37.100 | But as we push, you'll see that, like, you have the git hash and you can click to link over
00:26:41.100 | there, and it makes it really easy to, like, see who changed what over time.
00:26:47.100 | Okay.
00:26:48.100 | Let's go on to -- so this one we're not going to have you guys implement yourself, because
00:26:54.100 | I don't want anybody in here to have to supply an OpenAI key or GitHub token right now.
00:26:58.100 | But like I said, they are in the starter template and workshop.
00:27:01.100 | You can do that on your own time, however you like.
00:27:03.100 | We're going to go on to the next one, which I think everybody can do here together, because
00:27:07.100 | we're going to actually run the model on hypermode rather than externally.
00:27:10.100 | Is that the right spot in the sequence?
00:27:14.100 | Okay.
00:27:15.100 | Yeah.
00:27:16.100 | Bear with us if we're just going through presentation coordination here.
00:27:21.100 | Okay.
00:27:22.100 | So the next one we're going to look at -- I guess there's two functions that are in this starter
00:27:26.100 | template.
00:27:27.100 | The next one is classify issue.
00:27:28.100 | And we're going to use this at first, and then I'm going to have you guys modify it.
00:27:31.100 | We're going to actually put your code to work here.
00:27:33.100 | If you've never written AssemblyScript or anything before, just pretend it's TypeScript.
00:27:37.100 | If you run into a thing where it says, hey, I don't know what type that is, that's where
00:27:40.100 | you get into the AssemblyScript.
00:27:41.100 | And I can answer those questions, but it's pretty straightforward.
00:27:46.100 | So what does this function do?
00:27:49.100 | Well, in this case -- so this is going to be a little disconnected from the last one we
00:27:52.100 | did, because it's not going to go query GitHub.
00:27:54.100 | It's just going to take as inputs the title and the description and an issue ID.
00:27:59.100 | And we're going to log some information on what we're doing.
00:28:02.100 | And we're going to just build a really dumb prompt here, or input anyway, of summarizing
00:28:09.100 | the title and the description concatenated together.
00:28:12.100 | In a real app, you'd want to play with this a little bit and figure out the right
00:28:15.100 | exact input that matches the model you're using.
00:28:17.100 | Oh, let's talk about the model.
00:28:19.100 | So this model is called DistilBERT MNLI GitHub Issues.
00:28:23.100 | We found it on Hugging Face.
00:28:25.100 | This is a community-contributed model, I believe.
00:28:30.100 | But it's pre-trained on GitHub Issues.
00:28:32.100 | So it's a purpose-built model.
00:28:33.100 | We don't need to spend time model training here.
00:28:35.100 | We don't need to start with anything big.
00:28:36.100 | This is a really small model.
00:28:38.100 | It's good for just deciding whether a GitHub issue is -- I believe it tells you whether it's
00:28:43.100 | an issue, a bug, a feature request.
00:28:45.100 | Or a question.
00:28:48.100 | Yeah.
00:28:49.100 | That's what it's going to do.
00:28:50.100 | The interface for this, you'll see we import the classification model from our models-as library.
00:28:56.100 | This word experimental, don't be scared off by that.
00:28:59.100 | It's just we're trying to make sure that we are careful about when we finally call it
00:29:03.100 | non-experimental.
00:29:04.100 | But this will match most of the classification models that are on Hugging Face at the moment.
00:29:09.100 | And yeah.
00:29:12.100 | So what are we going to do?
00:29:14.100 | We're going to build a prompt.
00:29:15.100 | We're going to say, hey, get that classification model.
00:29:17.100 | And now I've got an input that in this case requires one or more input strings as an array.
00:29:23.100 | And the reason it's one or more is because you can classify things in batches.
00:29:26.100 | I could say, hey, there are 10 GitHub issues I'd like you to classify.
00:29:29.100 | I only need to make one model call for that.
00:29:32.100 | And that's why I've got prediction zero, the first prediction coming back because I'm only
00:29:37.100 | asking for one.
00:29:38.100 | But those would match the inputs and outputs.
00:29:42.100 | And again, if there were other options here, you could say input dot.
00:29:45.100 | In this case, it's the only one.
00:29:47.100 | But we would wrap that up for you in the interface.
00:29:50.100 | And then we'll just log output.
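In outline, the function looks something like this. Again, the import path, the model name, and the shape of the prediction object (label plus confidence) are assumptions drawn from the description, so treat it as a sketch rather than the template's exact source.

```typescript
// Sketch of classifyIssue -- names are approximations, not verified SDK code.
import { models } from "@hypermode/functions-as";
import { ClassificationModel } from "@hypermode/models-as/models/experimental/classification";

export function classifyIssue(id: string, title: string, description: string): string {
  console.log("Classifying issue " + id);

  // Deliberately simple input: title and description concatenated together.
  const text = title + "\n" + description;

  const model = models.getModel<ClassificationModel>("issue-classifier"); // name from hypermode.json
  // The input is an array because issues can be classified in batches;
  // we pass one item and read back the first prediction.
  const input = model.createInput([text]);
  const output = model.invoke(input);

  const prediction = output.predictions[0];
  console.log("Predicted label: " + prediction.label);
  return prediction.label;
}
```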
00:29:52.100 | So this is really simple.
00:29:53.100 | Again, I've already got it running, so let's just run it.
00:29:56.100 | And the queries for these, by the way, are pasted into the docs.
00:30:00.100 | If you're following along in the docs, let me just make sure I show you where we are.
00:30:04.100 | We are at issue type classification.
00:30:06.100 | And there's some description there, but basically this is the code we're working on.
00:30:10.100 | Here's the example query we're going to run.
00:30:12.100 | I'm going to paste this query.
00:30:14.100 | And this query's got a lot of what looks like junk in it, but it's because we just took literally
00:30:20.100 | the entire text of the issue off of GitHub and made sure it was properly escaped and stuff.
00:30:26.100 | So if you were using the GitHub API, it would work.
00:30:29.100 | If you're literally with your mouse copying and pasting text, you may find getting the new lines in there
00:30:35.100 | and the quotes escaped might just be a little tricky on the fly.
00:30:39.100 | But if you had input from an app, it would be doing that for you.
00:30:41.100 | It's just a string in and a string out.
00:30:45.100 | But let's go run that guy.
00:30:48.100 | Where am I?
00:30:49.100 | Make sure I'm in the right area.
00:30:50.100 | I think I've already got it in here.
00:30:52.100 | Yeah, I'll just paste it anyway, just make sure.
00:30:54.100 | Is that the one?
00:30:57.100 | It didn't paste correctly.
00:30:59.100 | There might be a docs issue there.
00:31:00.100 | This one I know works.
00:31:02.100 | Maybe something I need to fix in the docs.
00:31:07.100 | I don't know if Jay can fix that.
00:31:09.100 | The one that's in the docs isn't escaped properly.
00:31:12.100 | So when I copy and paste it, it's got new lines in there.
00:31:15.100 | Weird.
00:31:16.100 | But basically everything I'm putting in should be in the description.
00:31:19.100 | If you're trying this out and you get some weird error like this doesn't work, just reduce
00:31:26.100 | it down to some string in here just so you can follow along.
00:31:31.100 | We'll make sure we clean up the example later.
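A reduced version of the query looks something like this, with the argument names assumed to mirror the function's id, title, and description parameters -- plain strings, so there's nothing tricky to escape:

```graphql
# Minimal stand-in for the docs query.
query {
  classifyIssue(
    id: "1234"
    title: "App crashes on startup"
    description: "After upgrading, the app throws a null reference error on launch."
  )
}
```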
00:31:34.100 | Ideally it would be the entire thing of the issue.
00:31:36.100 | Indeed, this is the classification category that came out because it's a real issue.
00:31:42.100 | So what's happening here, just to kind of highlight some of the contrast here, right?
00:31:46.100 | When you deployed this project, this small model, different than kind of OpenAI's models,
00:31:50.100 | right?
00:31:51.100 | This small model actually got provisioned automatically in Hypermode.
00:31:53.100 | So when you're running this query, you're running against a dedicated instance of this model just
00:31:57.100 | for your project, and you can start to customize and adjust that, right?
00:32:01.100 | And so really wanted to show you these two functions to kind of show you the contrast, but
00:32:04.100 | also the similarities, right?
00:32:05.100 | The model interfaces are relatively similar and pretty intuitive to make it easy to kind of
00:32:09.100 | continue to iterate through that.
00:32:11.100 | Why don't we jump ahead slightly just in the interest of time?
00:32:14.100 | Sure.
00:32:15.100 | Start building a natural language search.
00:32:19.100 | Well, one thing I would like to do is just give people a few minutes to try this themselves.
00:32:24.100 | We don't have time.
00:32:25.100 | Sorry.
00:32:26.100 | Your projects will continue to run.
00:32:28.100 | If you reference the docs, you're welcome to continue to iterate with this.
00:32:31.100 | We can spend all the time with you after if you'd like to.
00:32:34.100 | We won't do it, but the suggestion in the docs and there's a walkthrough is like,
00:32:38.100 | how would I add a threshold to this function?
00:32:40.100 | And say, hey, I need to know based on its confidence.
00:32:43.100 | Like, how accurate is this label?
00:32:45.100 | And that's something you don't need to train a model on.
00:32:48.100 | It's just code.
00:32:49.100 | You just write it.
00:32:50.100 | There's a -- instead of just the label, there's a confidence score.
00:32:53.100 | So you can decide what confidence score you want and do some comparisons and just write
00:32:58.100 | that code.
00:33:00.100 | But we'll let people do that on their own.
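For anyone trying that exercise later, one possible shape is below. The confidence field on the prediction and the import names are assumptions based on what's described; the cutoff value is arbitrary.

```typescript
// One way the threshold exercise could look -- purely illustrative.
import { models } from "@hypermode/functions-as";
import { ClassificationModel } from "@hypermode/models-as/models/experimental/classification";

const MIN_CONFIDENCE: f32 = 0.6; // arbitrary cutoff; tune it for your triage flow

export function classifyIssueWithThreshold(id: string, title: string, description: string): string {
  const model = models.getModel<ClassificationModel>("issue-classifier");
  const input = model.createInput([title + "\n" + description]);
  const output = model.invoke(input);

  const prediction = output.predictions[0];
  // If the model isn't confident enough, hand the issue back for human review
  // instead of trusting a shaky label.
  if (prediction.confidence < MIN_CONFIDENCE) {
    return "needs-human-review";
  }
  return prediction.label;
}
```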
00:33:03.100 | And then we jump ahead to building a natural language search to identify similar issues.
00:33:07.100 | Right?
00:33:08.100 | Oftentimes, people report repeated issues.
00:33:10.100 | And wouldn't it be great if there could be a bot that responds to every issue saying,
00:33:13.100 | that sounds a lot like these three other issues.
00:33:16.100 | That's what we're going to build in the next 12 minutes.
00:33:18.100 | Okay.
00:33:19.100 | Sorry for the rush, guys.
00:33:21.100 | Like I said, there's nobody in the room after.
00:33:24.100 | We'll be around.
00:33:25.100 | So if you want to walk through this slower, if you have questions, we will help you.
00:33:29.100 | Okay.
00:33:30.100 | So we're going to go-- we're going to start writing some new stuff.
00:33:34.100 | Let's just take this directly from the docs so that we can walk through together.
00:33:37.100 | The first thing we're going to do is we're going to create what we call a hypermode collection.
00:33:41.100 | And basically, it's like, where are you going to store your data?
00:33:45.100 | Hypermode isn't a database, but we give you a very simple way to have an in-memory key
00:33:50.100 | value store that we can apply an index to.
00:33:52.100 | And that way, we can basically do your vector search in memory, in hypermode.
00:33:57.100 | So first, we have to go to our manifest and say, hey, there is a collection.
00:34:02.100 | So far, we haven't talked about the manifest, but you'll see in the root, there's a file
00:34:05.100 | called hypermode.json.
00:34:06.100 | And it has models.
00:34:07.100 | It has the two models we've been using so far.
00:34:09.100 | It has hosts, which were both for the first example.
00:34:14.100 | That's how we get out to other places.
00:34:17.100 | And I'm going to add that collection, so I'm just going to drop it right in here.
00:34:22.100 | So we have our collection defined.
00:34:24.100 | That's all we have to do.
00:34:25.100 | And there's some additional options and stuff, but we're not going to go through them all right now.
00:34:32.100 | The next thing we're going to do is we're going to add another model.
00:34:34.100 | So one of the things with -- if we're going to do a vector search, we need to know, well,
00:34:38.100 | what model are we creating our embeddings off of, right?
00:34:41.100 | And we don't want you to think about that too much, but you do have to supply a model.
00:34:45.100 | So we're going to take another one from Hugging Face, and we're going to go add it to the
00:34:49.100 | model section.
00:34:50.100 | Boom.
00:34:55.100 | We're going to use MiniLM from Sentence Transformers.
00:34:58.100 | And you'll see that syntax in the documentation, but basically that's the source of the model
00:35:04.100 | we're using.
00:35:05.100 | It's going to be hosted on hypermode, and we got that from Hugging Face.
00:35:08.100 | That's it.
00:35:09.100 | Save that.
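The manifest additions end up looking roughly like this. The key names below are a guess at the structure being described (a collection, plus a Hugging Face-sourced embedding model hosted on Hypermode); the docs have the exact syntax, so don't copy this verbatim.

```json
{
  "models": {
    "embedder": {
      "sourceModel": "sentence-transformers/all-MiniLM-L6-v2",
      "provider": "hugging-face",
      "host": "hypermode"
    }
  },
  "collections": {
    "issues": {
      "searchMethods": {
        "searchMethod1": {
          "embedder": "embed"
        }
      }
    }
  }
}
```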
00:35:11.100 | And I'm going to do a -- I'm going to commit that for a minute.
00:35:18.100 | So let's do command line: git add everything, and git commit, "add collection and model."
00:35:33.100 | Okay.
00:35:34.100 | And then we've got nice git signing on, so it's asking me to authenticate.
00:35:39.100 | But once that's in, I can git push.
00:35:42.100 | And just from the push, it's going to start spinning up that model in hypermode.
00:35:47.100 | Now, you didn't have to do that at this step.
00:35:48.100 | You could do the code and then do it all at once, but this just gives it time to warm up
00:35:51.100 | a little bit since we're under a time constraint.
00:35:55.100 | Okay.
00:35:56.100 | The next thing I'm going to do is I'm going to go grab the embedder function.
00:36:02.100 | So we need an embedder function.
00:36:05.100 | You know how that -- we showed you the model interface a couple times.
00:36:08.100 | Well, this time we're going to use an embeddings model.
00:36:10.100 | And I'm going to go create a new file.
00:36:14.100 | embedder.ts and I'm going to paste that in here.
00:36:21.100 | And I won't walk the whole thing, but basically embedders need a text in and an array of vectors
00:36:27.100 | out, and the vectors are float32 arrays.
00:36:29.100 | So it looks like a two-dimensional array, but it's an array of float32s.
00:36:34.100 | And that's an AssemblyScript thing, where we have a concept of a float32 instead of just
00:36:39.100 | a number.
00:36:40.100 | But that's basically how it works.
00:36:41.100 | And we're going to create it, and we're going to just return all the predictions.
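A sketch of that embedder, with the same caveat that the import path and embeddings-model interface are approximated from the walkthrough rather than copied from the template:

```typescript
// Text in, vectors out: one f32 array per input string.
import { models } from "@hypermode/functions-as";
import { EmbeddingsModel } from "@hypermode/models-as/models/experimental/embeddings";

export function embed(texts: string[]): f32[][] {
  // "embedder" is whatever name the MiniLM model was given in hypermode.json.
  const model = models.getModel<EmbeddingsModel>("embedder");
  const input = model.createInput(texts);
  const output = model.invoke(input);
  return output.predictions; // all predictions, as described above
}
```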
00:36:44.100 | One thing I have to do also is I have to -- the way that these exports work is I have to go
00:36:50.100 | to my index file and actually just make sure I include it.
00:36:54.100 | Otherwise, it's not going to leave the module when I build.
00:36:57.100 | At this stage, I should be able to -- I have to go to the functions directory.
00:37:02.100 | And I can say npm run build.
00:37:05.100 | And we should see it.
00:37:07.100 | Yeah.
00:37:08.100 | That embedder is in the output.
00:37:11.100 | It's still not searchable, though.
00:37:13.100 | If I were to run it at this stage, you'd see the array of all those floats.
00:37:16.100 | And it's kind of not very usable from a user's perspective.
00:37:20.100 | So let's grab the actual function that's going to do the search.
00:37:24.100 | And that's also in the docs.
00:37:26.100 | So I'm just going to copy that in.
00:37:27.100 | And I'm going to do the same sort of thing.
00:37:31.100 | I'm going to come here and say new file.
00:37:34.100 | Search.
00:37:35.100 | We'll call it.
00:37:36.100 | It can be called anything.
00:37:37.100 | I'm going to paste that in.
00:37:38.100 | And before I show it to you, I'm going to make sure I don't forget to export it.
00:37:41.100 | Sometimes I forget to -- not from GitHub.
00:37:44.100 | We're going to search.
00:37:46.100 | And I know we're tight on time, but just real quick.
00:37:51.100 | What does this look like?
00:37:52.100 | It's going to do a few things.
00:37:56.100 | It's going to define -- it's going to import our collections object.
00:38:01.100 | It's going to define a class that's going to be for our similar issues.
00:38:04.100 | This is -- you're asking, like, can we return structured objects?
00:38:07.100 | Yes, I'm doing it right here.
00:38:08.100 | I return a similar issues array.
00:38:10.100 | And we automatically get GraphQL types generated in our schema.
00:38:13.100 | I'm going to search the collection using the information we have in the manifest.
00:38:18.100 | And this time I'm going to say just give me the top three.
00:38:20.100 | And I do want the text and the results.
00:38:23.100 | Often you don't.
00:38:24.100 | Often you just say give me the ID back because I'm going to go query a database using that ID.
00:38:28.100 | And so for you to return me the text if it's big could be an extra step you don't need to take.
00:38:34.100 | That's it.
00:38:35.100 | Return them.
00:38:36.100 | Return me an object using the output of that search.
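The search function itself is roughly this shape. The collections API surface shown here (search arguments and result fields) is approximated from the demo rather than copied from the docs, so treat it as a sketch:

```typescript
// Sketch of the similar-issues search against the "issues" collection.
import { collections } from "@hypermode/functions-as";

class SimilarIssue {
  id: string = "";
  title: string = "";
  score: f64 = 0;
}

export function findSimilarIssues(query: string): SimilarIssue[] {
  // Top three matches, returning the stored text along with each key.
  const results = collections.search("issues", "searchMethod1", query, 3, true);

  const similar: SimilarIssue[] = [];
  for (let i = 0; i < results.objects.length; i++) {
    const obj = results.objects[i];
    const issue = new SimilarIssue();
    issue.id = obj.key;
    issue.title = obj.text;
    issue.score = obj.score;
    similar.push(issue);
  }
  return similar;
}
```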
00:38:40.100 | So let's do the git push: git add dot, git commit dash m, "add search functions,"
00:38:55.100 | and git push.
00:38:57.100 | And -- oh, I didn't do -- I probably should have made sure it builds locally.
00:39:01.100 | It's going to build it in -- oh, I'm not in the right directory.
00:39:05.100 | Functions.
00:39:06.100 | npm run build.
00:39:08.100 | Because if I had a compile error or something it would happen here.
00:39:11.100 | And we are working on improving our local dev experience as well.
00:39:14.100 | Because ideally we'd like you to be able to run it here too.
00:39:16.100 | But we don't have that just yet.
00:39:17.100 | So for now you just push it and we run it there.
00:39:20.100 | Notice it says hey there's a custom data type similar issue.
00:39:23.100 | That's going to be in our new GraphQL schema.
00:39:25.100 | We'll go back to Hypermode.
00:39:27.100 | And are we in time?
00:39:29.100 | Good.
00:39:30.100 | Eight minutes.
00:39:31.100 | Okay.
00:39:32.100 | Thanks for hanging with me here.
00:39:33.100 | Let's look at deployments for a second.
00:39:34.100 | I'm interested there already.
00:39:35.100 | So those are the two pushes that I did.
00:39:37.100 | Git push.
00:39:38.100 | It just works.
00:39:39.100 | I can now go to the home page and I should see hey there's an issues collection.
00:39:44.100 | And I've got four functions.
00:39:46.100 | And I've got that new model that I added.
00:39:48.100 | All right.
00:39:49.100 | So everything we just did is already in Hypermode.
00:39:51.100 | One thing I want to do just for sake of time is I'm going to go pre-seed this collection
00:39:57.100 | with some existing data.
00:39:58.100 | And I've got a link to that -- not there -- in the doc.
00:40:04.100 | Down at the bottom there is a link here.
00:40:08.100 | Example CSV file.
00:40:09.100 | So we're going to take these issues which we scraped earlier off of I think transformers.
00:40:14.100 | I think that's where we grabbed it from.
00:40:16.100 | And I'm just going to download that file from GitHub issues.csv and I'm going to go over
00:40:22.100 | to Hypermode and upload them there.
00:40:25.100 | Now you don't have to do it this way.
00:40:27.100 | You could write yourself a function that says hey, take this other API input and go upload it.
00:40:32.100 | Or you could call the GitHub API and upload it that way.
00:40:34.100 | So there's a lot of different ways you can do it.
00:40:36.100 | But we're going to put it in manually right now.
00:40:39.100 | There it is.
00:40:40.100 | And what's going to happen when Matt does this is because of the way that we've set up the
00:40:44.100 | collection and the embedding function, every time you add something new to the collection,
00:40:48.100 | whether that's via your code or via this upload, it's going to automatically embed it.
00:40:52.100 | It's going to use that function, automatically embed it and make it available for your search.
00:40:56.100 | It's going to use that same embedding function in the search so that you know that what you're
00:41:00.100 | searching and what you're searching against have it embedded the same way.
00:41:03.100 | So you don't need to bring three different systems together to get this working.
00:41:07.100 | All in one box, you get natural language search off of these couple of functions.
00:41:12.100 | There's a little bit of delay as it's working here.
00:41:14.100 | But if I'm just patient for a minute and refresh, eventually we'll see the inference history from
00:41:20.100 | those embedding functions should pop up.
00:41:23.100 | Those are processing in the background here.
00:41:25.100 | These are the ones that we ran when we were classifying, but we should see here.
00:41:30.100 | Let's just pick the model.
00:41:33.100 | Why do I not see that model?
00:41:34.100 | Shouldn't I see the embedder?
00:41:35.100 | Yeah.
00:41:37.100 | Did the function runs work?
00:41:39.100 | Let me just try a search and see what happens.
00:41:40.100 | I think.
00:41:41.100 | Probably still queueing up.
00:41:42.100 | All right.
00:41:43.100 | I'm going to search.
00:41:44.100 | We'll see if it's still going.
00:41:45.100 | I think we're getting no results here.
00:41:46.100 | We can just wait a little more.
00:41:47.100 | But we're going to see if, hey, is anybody -- we're going to search a new issue.
00:42:03.100 | Like imagine the workflow here.
00:42:04.100 | Somebody writes an issue and says, hey, I want to add a Spanish version of your README file.
00:42:09.100 | And so our app that's using these APIs is going to be like a bot that says, hey, we may already
00:42:14.100 | have that.
00:42:15.100 | You know, maybe other people have asked.
00:42:18.100 | And indeed -- okay, so it did finish uploading from the collection.
00:42:21.100 | We can go figure out why it's not showing the logs.
00:42:23.100 | It should have showed in our logs.
00:42:24.100 | But indeed we can say, hey, there was a Turkish README and there's some other bits about languages
00:42:30.100 | and a French version of this.
00:42:31.100 | But I don't see anything with the Spanish README.
00:42:33.100 | So it's probably okay.
00:42:35.100 | There aren't any similar issues.
00:42:37.100 | These are the most similar that are available.
00:42:40.100 | I am curious why that didn't come through.
00:42:44.100 | There they are.
00:42:45.100 | Just took a second.
00:42:46.100 | Yeah.
00:42:47.100 | Okay.
00:42:48.100 | We'll work on that.
00:42:49.100 | These are batches, by the way.
00:42:50.100 | So they're a little long here.
00:42:51.100 | But I think what we did is we split them up to like 25 at a time.
00:42:54.100 | And so they take like, you know, two seconds for that one and one second for that one.
00:42:57.100 | But we can see the history.
00:42:59.100 | And like there's stuff you really don't want to look at, all of the float arrays, right?
00:43:04.100 | So that's your vector embeddings.
00:43:06.100 | And then the search just uses those vector embeddings.
00:43:09.100 | That's it.
00:43:11.100 | I think that's -- I know we kind of rushed through that.
00:43:14.100 | I'm sorry.
00:43:15.100 | Hopefully some of you got a chance to code some of it.
00:43:17.100 | But like I said, feel free to stick around or to continue to walk through the doc on your own time.
00:43:24.100 | And I'm really excited to see what else you guys want to build.
00:43:28.100 | We have a lot of other documentation off to the side here that you can see like other stuff we can do.
00:43:32.100 | You can make HTTP calls.
00:43:34.100 | You can -- what else can you do?
00:43:36.100 | You can work with the collections more shortly.
00:43:38.100 | Not yet, but shortly.
00:43:39.100 | We'll have the ability to go and connect to any Postgres database that you want to.
00:43:43.100 | That's coming soon.
00:43:45.100 | And yeah, there's like the sky's the limit, right?
00:43:50.100 | Pick your models and write your functions and away you go.
00:43:53.100 | Anything else we want to show?
00:43:55.100 | Yeah, no, that's what we had to show you, right?
00:43:57.100 | It was like these are a set of building blocks that you can apply in your applications across context, across different points of data.
00:44:03.100 | Hopefully you start to understand how these pieces work, right?
00:44:06.100 | We see this everywhere.
00:44:07.100 | Like when we start to look around and say there are so many places where you want to do that natural language sorting or search or bringing these pieces in.
00:44:14.100 | And we just need people like you to help bring that into those applications, right?
00:44:17.100 | And so that's what we really get excited about.
00:44:20.100 | We don't want you to have to start from scratch.
00:44:22.100 | So as Matt said, we launched Just Ship AI.
00:44:24.100 | Those are templates.
00:44:25.100 | You can deploy those right away.
00:44:27.100 | You all have access to Hypermode now.
00:44:29.100 | You've got a 14-day trial.
00:44:30.100 | Feel free to respond to that email and you can claim the credits that Kevin mentioned this morning.
00:44:34.100 | Happy to get your team set up and run through this in a more contextual way for you as useful.
00:44:39.100 | But thank you.
00:44:40.100 | Thank you.