
AI Engineer World’s Fair 2025 — GraphRAG



00:00:00.360 | - Quick question, everyone who attended
00:00:01.840 | the keynote this morning,
00:00:03.740 | we got a very good question from the organizer.
00:00:06.680 | Not a question, a statement.
00:00:08.400 | Is RAG dead, or are agents taking over RAG?
00:00:12.440 | How many of you have implemented RAG in production?
00:00:16.340 | Oh, plenty.
00:00:18.840 | So I can say with high confidence, RAG is not dead.
00:00:21.240 | My take on RAG is,
00:00:25.680 | I think if RAG can solve the problem
00:00:29.780 | that you're working on in production,
00:00:31.520 | you don't need agents, and vice versa.
00:00:34.380 | Maybe it's a rough analogy, but why build a network tower
00:00:38.960 | when you can get the job done with a smaller,
00:00:41.180 | minuscule version of it?
00:00:42.440 | So I think that is how I see RAG being important.
00:00:45.500 | And there are a lot of use cases
00:00:46.700 | where RAG has found its application.
00:00:49.460 | So I'll stop there, we'll talk more in my presentation,
00:00:53.960 | but I'm waiting for 11:15 because this session
00:00:58.060 | is streamed live on YouTube for our friends
00:01:01.880 | who couldn't attend this conference.
00:01:04.180 | So when he gives me a heads up, I'll start.
00:01:08.820 | Should we start?
00:01:11.340 | Okay. Oh, thank you.
00:01:12.960 | All right. So thanks for coming.
00:01:15.120 | To quickly introduce myself, my name is Mitesh.
00:01:16.980 | I lead the developer advocate team at NVIDIA.
00:01:19.580 | And the goal of my team is to create technical workflows,
00:01:22.860 | notebooks for different applications,
00:01:25.140 | and then we release that code base on GitHub.
00:01:27.920 | So developers in general, which is me and you, all of us together,
00:01:32.980 | we can harness that knowledge and take it further for the application
00:01:37.100 | or use case that you're working on.
00:01:38.720 | So that is what my team does, including myself.
00:01:41.340 | In today's talk, I'm going to talk about a project that we did
00:01:45.280 | with one of our partners and some of my colleagues at NVIDIA,
00:01:49.540 | about how we can create a GraphRAG system,
00:01:53.480 | what its advantages are, and if we add a hybrid nature to it,
00:01:57.720 | how that is helpful.
00:01:58.600 | So that's what my talk is going to be on.
00:02:02.560 | I will not be able to give you a 10-feet view where I dive
00:02:08.420 | with you into the code base, but there is a GitHub link at the end
00:02:11.140 | of this talk which you can scan, and all these notebooks,
00:02:15.580 | whatever I'm going to talk about, are available for you to take home.
00:02:19.580 | But I'll give you a 10,000-feet view: if you are trying to build
00:02:22.080 | your own GraphRAG system, how can you build it?
00:02:26.180 | So a quick refresher: what is a knowledge graph, and why is it important?
00:02:32.920 | It is a network that represents relationships between different entities,
00:02:37.960 | and those entities can be anything.
00:02:39.200 | They can be people, places, concepts, events.
00:02:42.600 | A simple example would be me being here, what is my relationship
00:02:46.320 | to AI World Fair Conference, AI Engineers World Fair Conference,
00:02:49.980 | and my relationship is I'm a speaker at this conference.
00:02:52.560 | What is my relationship to anyone who's attending here?
00:02:55.300 | Well, our relationship is you attended my session.
00:02:58.560 | So this edge of relationship between the two entities becomes very important,
00:03:03.840 | and only a graph-based network, or knowledge graph, can exploit it.
00:03:09.520 | And that is the reason why there's a lot of active research happening in this domain
00:03:13.920 | of how you can harness GraphRAG, how you can harness a knowledge graph and put it into a RAG-based system.
00:03:22.660 | So the goal is to create triplets, which define the relationships between entities,
00:03:31.180 | because that is what a graph-based system or knowledge graph is really good at exploiting.
00:03:36.400 | And that's what is unique about this knowledge graph.
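To make the triplet idea concrete, here is a minimal sketch in Python; the entities and relations are illustrative examples drawn from the speaker's own analogy, not from any real dataset.

```python
from typing import NamedTuple

class Triplet(NamedTuple):
    """A (head entity, relation, tail entity) edge in a knowledge graph."""
    head: str
    relation: str
    tail: str

# Illustrative triplets modeled on the speaker's example (names are ours)
triplets = [
    Triplet("Mitesh", "speaker_at", "AI Engineer World's Fair"),
    Triplet("attendee", "attended", "GraphRAG session"),
]

# A knowledge graph is then just the set of such edges, indexed by entity
graph: dict[str, list[tuple[str, str]]] = {}
for head, relation, tail in triplets:
    graph.setdefault(head, []).append((relation, tail))
```

Traversing `graph["Mitesh"]` then yields the relations this entity participates in, which is exactly the structure later retrieval steps walk over.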
00:03:41.780 | So if you think about why they can work better than a semantic RAG system,
00:03:50.120 | well, a knowledge graph captures the information between entities in much more detail.
00:03:53.660 | So those connections can provide a very comprehensive view of the knowledge that you are creating in your RAG system,
00:04:03.040 | and that becomes very important to exploit when you are retrieving some of that information
00:04:07.700 | and converting it into a response for the user who is asking the question.
00:04:12.920 | And it gives you the ability to organize your data from multiple sources.
00:04:18.300 | I mean, that's a given, no matter what kind of RAG system you're building.
00:04:22.300 | So how do we create a GraphRAG or a hybrid system?
00:04:27.260 | So this is the high-level diagram of what it entails.
00:04:30.600 | So I broke it down into four components.
00:04:32.380 | The very first thing is your data, you need to process your data.
00:04:35.480 | The better you process your data, the better is the knowledge graph.
00:04:38.120 | The better is the knowledge graph, the better is the retrieval.
00:04:40.580 | So four components: data, data processing, your graph creation
00:04:44.120 | or your semantic embedding vector database creation.
00:04:49.380 | Those are the first three steps, and then the last step is, of course,
00:04:51.960 | inferencing, when you're asking questions to your RAG pipeline.
00:04:56.620 | And at a higher level, this can be broken down into two big pieces:
00:05:03.000 | offline, online.
00:05:04.840 | So all your data processing work, which is a one-time process, is offline.
00:05:10.580 | And once you have created your knowledge graph, which is your triplet entity
00:05:16.120 | relationship, entity two, or your semantic vector database, once you have it,
00:05:21.760 | then it's all about querying it and converting that information into a response
00:05:27.340 | that is readable to the user.
00:05:29.340 | It cannot be something like: here are the three relationships, and then we,
00:05:33.800 | as the users, have to go figure out what this exactly means.
00:05:39.340 | So the top part of this flow diagram is where you build your semantic vector database,
00:05:47.980 | which is you pick your documents, and then you convert them into vector embeddings,
00:05:53.860 | and you store them into a vector database.
00:05:56.720 | So that piece is how you create your semantic vector database.
00:06:02.100 | And then the piece below is how you create your knowledge graph.
00:06:05.500 | And there are many more steps that you have to follow,
00:06:11.360 | and care that you have to take, when you're creating your knowledge graph.
00:06:19.040 | So diving into the first step, creating your knowledge graph: how can you create those triplets
00:06:23.420 | out of documents that are not that structured?
00:06:26.540 | So creating triplets, which expose the information between two entities,
00:06:31.620 | and picking those entities so that the information becomes helpful, is very important.
00:06:36.460 | Here's a simple example.
00:06:37.560 | This document is of ExxonMobil's results, I think, their quarterly results.
00:06:43.300 | And we try to pick up the relationship or create the knowledge graph using an LLM.
00:06:50.780 | And if you look at the first line, there's ExxonMobil, which is a company; that's the first entity.
00:06:56.240 | 'Cut' is the relationship between ExxonMobil
00:07:03.980 | and 'spending on oil and gas exploration activity',
00:07:05.460 | and that activity, 'spending on oil and gas exploration',
00:07:10.260 | is the second entity of the triplet.
00:07:16.240 | So this is how the relationship needs to be exploited.
00:07:19.060 | Now, the question that comes to our mind is that sounds very difficult to do.
00:07:24.440 | And exactly, it is difficult to do.
00:07:26.540 | And that is the reason why we need to harness or we need to use LLMs to figure out a way
00:07:31.420 | to extract this information and structure it for us so that we can save it in a triplet format.
00:07:39.120 | And how can we do that?
00:07:41.760 | Prompt engineering, but we need to be much more deliberate about it.
00:07:46.920 | So based on the use case that you are working on, you can define your ontology.
00:07:53.240 | And once you have defined your ontology, you can put it in your prompt and then ask the LLM
00:07:59.100 | to go extract the information that is ontology-specific from the documents and then structure
00:08:05.280 | it in a way that can be stored in the form of a triplet.
00:08:09.340 | This step is very important.
00:08:11.040 | You might be spending a lot of time here to make sure your prompt is doing the right thing
00:08:15.740 | and it is creating the right ontology for you.
00:08:18.380 | If your ontology is not right, if your triplets are not right, if they are noisy, your retrieval
00:08:23.540 | will be noisy.
00:08:24.540 | So this is where you will be going back and forth figuring out how to get a better ontology.
00:08:30.380 | This is where you will spend 80% of your time, to make sure
00:08:38.020 | you get the ontology right, and you will be going back and forth in an iterative manner to see
00:08:41.720 | how you can make it better over time.
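As a sketch of the ontology-guided extraction step: the prompt itself is use-case specific, but the post-processing that keeps only ontology-conformant triplets can look like the following. The line format and the relation set here are hypothetical assumptions, not the actual prompt or code from the talk's notebooks.

```python
import re

# Hypothetical ontology: the relation types allowed for this use case
ONTOLOGY_RELATIONS = {"cut", "increased", "acquired", "reported"}

def parse_triplets(llm_output: str) -> list[tuple[str, str, str]]:
    """Parse lines like '(ExxonMobil | cut | spending on oil and gas exploration)'
    emitted by a prompted LLM, dropping anything outside the ontology.
    Noisy triplets lead to noisy retrieval, so malformed lines are skipped."""
    triplets = []
    for line in llm_output.splitlines():
        m = re.match(r"\s*\(([^|]+)\|([^|]+)\|([^|)]+)\)", line)
        if not m:
            continue  # malformed line: discard rather than pollute the graph
        head, rel, tail = (part.strip() for part in m.groups())
        if rel in ONTOLOGY_RELATIONS:
            triplets.append((head, rel, tail))
    return triplets

sample = """(ExxonMobil | cut | spending on oil and gas exploration)
(ExxonMobil | likes | something irrelevant)"""
# only the ontology-conformant triplet survives
clean = parse_triplets(sample)
```

Iterating on the ontology then amounts to adjusting `ONTOLOGY_RELATIONS` and the prompt together, and re-checking what survives this filter.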
00:08:45.020 | And then the next step for a hybrid RAG system is to create the semantic vector
00:08:51.760 | database.
00:08:52.760 | And that is reasonably straightforward, and it is well studied.
00:08:56.640 | So you pick your document.
00:08:57.640 | This is the first page of the "Attention Is All You Need" research paper.
00:09:00.680 | And you break it into chunks, and you have another factor called overlap.
00:09:06.400 | Chunk sizes are important because what a semantic vector database does is pick up
00:09:11.880 | each chunk, use the embedding model to convert it into an embedding vector,
00:09:18.120 | and store it in the vector database.
00:09:19.820 | And if you don't have an overlap, then any context shared between the previous and the next
00:09:25.000 | chunk will be lost.
00:09:27.260 | So you try to be smart about how much overlap you need between the previous chunk and the next
00:09:32.860 | chunk, and what chunk size you should use when you're chunking your documents
00:09:37.980 | into different paragraphs.
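A minimal sketch of chunking with overlap, assuming character-based chunks (real pipelines often chunk by tokens or sentences instead):

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks; each chunk repeats the last
    `overlap` characters of the previous one so cross-chunk context survives."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap  # step forward, keeping the overlap
    return chunks

doc = "Attention is all you need. " * 40
pieces = chunk_text(doc, chunk_size=100, overlap=20)
# every chunk after the first starts with the tail of its predecessor
assert pieces[1][:20] == pieces[0][-20:]
```

Each chunk would then be passed through the embedding model and stored in the vector database; tuning `chunk_size` and `overlap` is exactly the trade-off the talk describes.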
00:09:39.280 | That is where the advantage of GraphRAG comes into play.
00:09:43.180 | Because if you think about it, the important information, which is the relationships between
00:09:48.120 | different entities, is not exploited by your semantic vector database, but it is exploited
00:09:55.160 | really well when you're using a knowledge graph or creating a knowledge-graph-based system.
00:10:02.280 | So once you have created this knowledge graph, what is the next step?
00:10:06.280 | Now comes the retrieval piece, which is you ask a question.
00:10:11.180 | What is ExxonMobil's cut looking like this quarter?
00:10:17.120 | And knowledge graph will help you figure out how to retrieve those nodes or those entities and the relationship between them.
00:10:26.260 | If you do a very flat retrieval, which is a single hop, you are missing the most important piece that a graph allows you, which is exploration through the multiple nodes that you can reach.
00:10:41.100 | That becomes very, very, very important.
00:10:44.000 | I cannot stress how important that becomes.
00:10:46.000 | So think of different strategies.
00:10:47.000 | Again, you will spend a lot of time optimizing this: whether you should look at single hop or double hop, how deep you want to go, so that the relationships from your first node to the second node, and your second node to the third node, are exploited well.
00:11:01.000 | And the deeper you go, the better context you will get, but there's a disadvantage to that.
00:11:06.900 | The deeper you go, the more time you're going to spend retrieving that information.
00:11:10.900 | So then latency becomes a factor as well, especially when you're working in a production environment.
00:11:15.900 | So there is a sweet spot that you have to hit between how deep you want to go, how many hops into your graph, versus what latency you can survive.
00:11:29.800 | So that becomes very, very important.
00:11:32.800 | And some of those searches can be accelerated.
00:11:32.800 | So we created a library called cuGraph, which is available or integrated in a lot of libraries out there, like NetworkX and whatnot.
00:11:45.700 | That acceleration becomes important because it gives you the flexibility to go deeper into your graph through multiple hops, while at the same time reducing the latency.
00:11:55.700 | So the performance of your graph improves a lot.
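A toy sketch of multi-hop retrieval over a triplet store, using a plain breadth-first search with a hop limit; the graph contents are illustrative, and a production system would use NetworkX/cuGraph rather than this hand-rolled traversal:

```python
from collections import deque

# Toy knowledge graph as an adjacency list: entity -> [(relation, entity), ...]
# (entities and relations are illustrative, not from a real dataset)
kg = {
    "ExxonMobil": [("cut", "exploration spending")],
    "exploration spending": [("affects", "quarterly results")],
    "quarterly results": [("reported_in", "Q1 filing")],
}

def retrieve(graph: dict, start: str, max_hops: int) -> list[tuple[str, str, str]]:
    """Breadth-first traversal up to `max_hops` edges from `start`.
    max_hops=1 is the 'flat' single-hop retrieval; each extra hop adds
    context at the cost of latency."""
    seen, frontier, facts = {start}, deque([(start, 0)]), []
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # hop budget exhausted on this branch
        for rel, nbr in graph.get(node, []):
            facts.append((node, rel, nbr))
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, depth + 1))
    return facts
```

Sweeping `max_hops` is one way to find the depth-versus-latency sweet spot the talk mentions: measure answer quality and retrieval time at each depth and pick the knee of the curve.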
00:12:02.700 | So this is where the retrieval piece comes into play, where you can have different strategies defined so that when you're querying your data and getting the responses, you can have better responses.
00:12:16.600 | And the other important piece, I personally worked on this piece, so I can talk at length on this, but I'm going to give you a very high level, is evaluating the performance.
00:12:26.600 | And there are multiple factors that you can evaluate on: faithfulness, answer relevancy, precision and recall; and if you use an LLM as a judge, helpfulness, correctness, coherence, complexity, verbosity. All these factors become very important.
00:12:40.500 | So there's a library called RAGAS.
00:12:44.400 | It is meant to evaluate your RAG workflow end to end.
00:12:49.400 | Anyone who used RAGAS for evaluating your graph?
00:12:52.400 | All right, a few of them, thank you.
00:12:54.400 | But it is an amazing library that you can use to evaluate your RAG pipeline end to end, because it evaluates the response, it evaluates the retrieval, and it evaluates what the query is.
00:13:06.400 | So it will evaluate your pipeline end to end, which becomes very handy when you're trying to test whether my retrieval is doing the right thing or whether the question that I'm asking is the LLM interpreting it in the right way or not.
00:13:19.300 | So you can break down your responses; the RAGAS pipeline will evaluate all those pieces and show you what your eventual score is.
00:13:27.300 | So it is a pip install library.
00:13:29.300 | The other piece is the LLM: RAGAS under the hood uses an LLM, no surprises there.
00:13:35.300 | By default, it is integrated with GPT, but it provides you the flexibility that if you have your own model, you can bring it in as well, wire it up with your API, and use that LLM to score on these four evaluation parameters that RAGAS offers.
00:13:56.300 | So it's quite, I wouldn't say it's comprehensive, but it's really good in terms of giving you that flexibility.
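RAGAS computes its metrics with LLM judges, but the shape of two of them, context precision and context recall, can be illustrated with a simplified, non-LLM stand-in (this is not the RAGAS implementation, just the underlying arithmetic):

```python
def context_precision(retrieved: list[str], relevant: set[str]) -> float:
    """Fraction of retrieved chunks that are actually relevant."""
    if not retrieved:
        return 0.0
    return sum(chunk in relevant for chunk in retrieved) / len(retrieved)

def context_recall(retrieved: list[str], relevant: set[str]) -> float:
    """Fraction of the relevant chunks that were actually retrieved."""
    if not relevant:
        return 0.0
    return sum(chunk in set(retrieved) for chunk in relevant) / len(relevant)

# Hypothetical retrieval result for one query
retrieved = ["chunk_a", "chunk_b", "chunk_c"]
relevant = {"chunk_a", "chunk_c", "chunk_d"}
# both come out to 2/3 here: one retrieved chunk was irrelevant,
# and one relevant chunk was missed
precision = context_precision(retrieved, relevant)
recall = context_recall(retrieved, relevant)
```

In RAGAS itself, the "relevant" judgment comes from an LLM comparing retrieved contexts against the question and ground truth rather than from a hand-labeled set.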
00:14:02.300 | The other part is using a model that is meant to specifically evaluate the response coming out of an LLM.
00:14:10.300 | And that is where this model comes in, the Nemotron 340-billion-parameter reward model that we released, I think, a few years ago.
00:14:15.300 | At that time, it was a really good reward model.
00:14:17.300 | It's a 340-billion-parameter model, so reasonably big, but it's a reward model.
00:14:24.300 | So it will go and evaluate the response of another LLM and judge how the responses look on these five parameters.
00:14:34.300 | It is meant to go and judge other LLMs.
00:14:36.300 | That is how it was trained.
00:14:42.300 | So moving further, I would like to use this analogy: to create a GraphRAG system, which is 80% of the job, will take you 20% of your time.
00:14:54.300 | But then to make it better, which is the last 20%, will take 80% of your time; that's the 80-20 rule.
00:15:05.300 | Because now you are in the process of optimizing it further to make sure it works well enough for the use case and application that you're working on.
00:15:16.300 | And there are some strategies there which I would like to walk you through.
00:15:20.300 | So one, as I said before, which I couldn't stress enough, the way you are creating your knowledge graph out of your unstructured data becomes very important.
00:15:28.300 | The better your knowledge graph, the better results you're going to get.
00:15:32.300 | And something that we did as experimentation through this use case that we were exploring with one of our partners
00:15:39.300 | was: can we fine-tune an LLM to improve the quality of the triplets that we are creating?
00:15:48.300 | And does that improve results?
00:15:50.300 | Can we do a better job at data processing, like removing, with regex, apostrophes, brackets, characters that don't matter?
00:15:58.300 | If we remove them, does it give you better results?
00:16:01.300 | So these are like small things that you can think about, but it gives you, it improves the performance of your overall system.
00:16:08.300 | So that is what I mean by 80% of your time:
00:16:11.300 | the small nitty-gritty things, the knobs that you are fine-tuning slowly and steadily, to make sure your performance gets better and better.
00:16:20.300 | And I would like to share a few strategies that we tried, which led us to better results.
00:16:29.300 | So the very first thing is regex, or just cleaning up your data.
00:16:35.300 | We removed apostrophes and other characters that are not that important if you think about triplet generation.
00:16:44.300 | That led us to better results.
00:16:49.300 | We then implemented another strategy of constraining longer outputs, making them smaller.
00:16:57.300 | That got us better results.
00:16:59.300 | And we also fine-tuned the Llama 3.3 model, or 3.2 model, and that got us better results.
00:17:05.300 | So if you look at the last three columns, you'll see that by using Llama 3.3 as is, we got 71% accuracy.
00:17:13.300 | So this was tested on 100 triplets to see how it is performing.
00:17:18.300 | And as it, sorry, 100 documents.
00:17:21.300 | So as it got better, and as we introduced LoRA, we fine-tuned the Llama 3.1 model, and our accuracy or performance went up from 71% to 87%.
00:17:31.300 | And then we did those small tweaks, and that improved the performance further.
00:17:34.300 | Again, remember, this is 100 documents, so the accuracy looks high, but if your document pool increases, that will come down a bit.
00:17:40.300 | But in comparison to where we were before, we saw improvement.
00:17:43.300 | And that is where the small tweaks come into play, which will be very, very helpful to you when you're putting a GraphRAG or RAG system into production.
00:17:56.300 | The other is from a latency standpoint.
00:17:58.300 | So if your graph gets bigger and bigger, now you're talking about a network that goes into millions or billions of nodes.
00:18:08.300 | Now, how do you do search in a graph that has millions or billions of nodes?
00:18:18.300 | And that is where acceleration comes into play.
00:18:20.300 | So with cuGraph, which is now available through NetworkX; NetworkX is also a pip-install library.
00:18:27.300 | Anyone here who uses NetworkX?
00:18:29.300 | All right, a few, okay.
00:18:31.300 | So NetworkX is a pip-install library.
00:18:33.300 | Under the hood, it can use cuGraph for acceleration.
00:18:36.300 | And for a few of the algorithms, we did a performance test.
00:18:42.300 | And you can see the overall execution latency reducing drastically.
00:18:48.300 | So that is where you can, again, small tweaks, which will lead you to better results.
00:18:53.300 | So these are two things that we experimented, which led us to better results in terms of accuracy, as well as reducing the overall latency.
00:19:00.300 | And these are small tweaks, and it leads us to better results.
00:19:07.300 | So then the question, obviously, is: should I use a graph-based system, or a semantic-based RAG system, or a hybrid?
00:19:15.300 | And I'm going to give you the diplomatic answer.
00:19:17.300 | It depends.
00:19:18.300 | But there are a few things I would like you guys to take home, which will help you to come up to a decision.
00:19:26.300 | So that you can make an educated guess that for this use case that I'm working on, a RAG system would solve the problem, I don't need a graph, and vice versa.
00:19:33.300 | Or I need a hybrid approach.
00:19:35.300 | So it depends on two factors.
00:19:37.300 | One is your data.
00:19:39.300 | Traditionally, if you look at retail data, if you look at FSI data, if you look at employee database of companies, those have a really good structure defined.
00:19:48.300 | So those kind of data set becomes really good use cases for graph-based system.
00:19:55.300 | And the other thing you think about is even if you have unstructured data, can you create a good graph, a knowledge graph out of it?
00:20:02.300 | If the answer is yes, then it's worthwhile experimenting to go the graph path.
00:20:09.300 | And it will depend on the application and use case.
00:20:12.300 | So if your use case requires understanding complex relationships and then extracting that information for the questions that you are asking, only then does it make sense to use a graph-based system.
00:20:25.300 | Because remember, these are compute-heavy systems.
00:20:28.300 | So you need to make sure that these things are taken care of.
00:20:31.300 | I am running out of time, I think.
00:20:33.300 | But as I said before, for all these things that I talked about, I gave you a 10,000-feet view.
00:20:38.300 | If you want to get a 100-feet view where you are coding these things, it is all available on GitHub,
00:20:43.300 | even the fine-tuning of the Llama 3.1 LoRA model.
00:20:46.300 | And we had a two-hour workshop.
00:20:48.300 | So I gave you a 20-minute talk, but this whole content is covered in a two-hour workshop as well.
00:20:52.300 | And lastly, join our developer programs.
00:20:55.300 | We do release all these things on a regular basis.
00:20:58.300 | If you join the mailing list, you get this information based on your interest.
00:21:02.300 | And as my colleague mentioned, I will be across the hall at Neo4j booth to answer questions if any.
00:21:09.300 | I would love to interact with you and see if you have any questions and I can answer those questions.
00:21:14.300 | Thank you for your time.
00:21:23.300 | Thank you.
00:21:24.300 | Thank you, Mitesh.
00:21:25.300 | That was fantastic.
00:21:26.300 | We've got another great talk coming up here.
00:21:29.300 | Chen, you can come on up.
00:21:33.300 | And if I get this right, you're going to take a philosophical perspective on this.
00:21:37.300 | Yes, yes.
00:21:40.300 | Hello.
00:21:41.300 | Hello.
00:21:41.300 | Yeah, thanks.
00:21:42.300 | Five.
00:21:43.300 | Four.
00:21:44.300 | Three.
00:21:47.300 | Hey, thanks.
00:21:48.300 | two one
00:21:55.340 | Hey, where's the note there?
00:22:45.340 | Okay, so I don't know if there's a way to get speaker notes onto the screens at the bottom. You guys know?
00:23:02.780 | Yes, you have to go on full screen. Do you need the notes? Yeah, I need the notes.
00:23:12.540 | I see the mode. You can also just walk through it on the side and just...
00:23:17.180 | Yeah, sure, like that. Can you collapse this side?
00:23:20.940 | Yeah, yeah, okay. So hi, hi everybody. My name is Chin Young Lam. I'm a founder and CEO of patho.ai.
00:23:37.660 | A bit of background on my company: patho.ai started two years ago with an invitation from the
00:23:43.340 | National Science Foundation, through SBIR grant funding, investigating LLMs. We did an LLM-
00:23:50.940 | driven discovery application. Since then we branched out to leverage what we learned about building AI
00:23:57.660 | systems for large corporations. We are currently building expert AI systems for several clients. The
00:24:04.700 | systems we build go beyond RAG systems. Many of our clients are asking for AI systems that perform
00:24:11.580 | tasks like a research and advisory role based on their area of interest. Today the talk is about
00:24:18.940 | sharing with our fellow AI engineers what we have learned so far building this kind of system.
00:24:24.300 | Okay, what is knowledge? Generally, philosophically, I say knowledge is the understanding
00:24:29.660 | and awareness gained through experience, education, and a comprehension of facts and principles.
00:24:34.300 | And that leads to the next question: what is a knowledge graph? A knowledge graph is a systematic
00:24:41.500 | method of preserving wisdom by connecting pieces of it and creating a network of interconnected relationships. That's
00:24:47.500 | important. The graph represents the thought process and comprehensive taxonomy of a specific domain of expertise.
00:24:55.980 | That's why this is very important for people moving forward: it's about AI systems that think a lot and return
00:25:03.340 | advice, instead of just retrieving data from your database. So that brings us to the development of
00:25:11.340 | KAG. Okay, what is KAG? KAG stands for Knowledge Augmented Generation, and it's different from RAG.
00:25:18.220 | It enhances language models by integrating structured knowledge graphs for more accurate and insightful responses,
00:25:25.180 | making it a smarter, more structured approach than simple RAG. KAG doesn't just retrieve;
00:25:32.860 | remember, you understand, this is different. Okay, after interviewing a lot of my clients, who are also
00:25:42.220 | experts in certain areas of skill, I found that there are common ways in their thinking and decision-making
00:25:48.380 | process, the ways that make them experts in their area. A knowledge graph seems to be a perfect fit.
00:25:53.820 | So here is the graph, or state diagram if you are a computer engineering grad like me. It shows the wisdom
00:26:02.140 | node, which, as you can see, is at the core. This wisdom
00:26:09.660 | isn't static; it actively guides decisions and is fed by the other elements.
00:26:15.260 | The output from the wisdom actually goes to decision making, in blue. Wisdom isn't passive; it guides
00:26:25.260 | decisions, helping us choose wisely. Okay, and then the decision making analyzes the situation,
00:26:32.060 | given in the circle in green. And decisions aren't made, you know, in a vacuum; they
00:26:40.460 | analyze real-world situations. That's the difference. Okay, so look at the wisdom input; look at the
00:26:46.540 | relationship feedback from knowledge to wisdom, in gold. An example of that, knowledge to wisdom,
00:26:54.540 | is all your book smarts, encyclopedias, Wikipedia, whatever you store. Plus, once that data gets absorbed
00:27:05.420 | by whatever model you use up there, it needs to regurgitate and understand it. That's why it's very
00:27:11.100 | important that wisdom is able to synthesize the data after you have ingested knowledge. That's kind of
00:27:16.460 | abstract, but I'll come to that later. Okay, from insight: an example of that is
00:27:22.460 | wisdom deriving patterns from chaos. Some of my clients have a lot of social media around their product. How
00:27:30.300 | do they, you know, track their product sentiment from social media? It's very chaotic, and from X
00:27:36.380 | tweets, from that, you can see some patterns of their competitors versus what my current product
00:27:42.060 | is. That is an example of that, and I will get to it later. Okay, when all these connected nodes come
00:27:49.420 | together, why do they matter? All the nodes relate to one another, ever enriching our wisdom-
00:27:57.100 | storing system. Okay, this talk is about storing wisdom, right? So knowledge tells you what it is, and
00:28:04.540 | experience tells you what worked before; insight invents what to try next. Like a pizza: knowledge is the
00:28:15.500 | recipe, experience is knowing your oven burnt the crust, insight is, hey, adding, you know, honey to
00:28:24.860 | the crust makes it caramelize perfectly. So the most important part of the knowledge graph is the
00:28:30.460 | feedback loop. Feedback isn't a one-way street; it learns from itself. Look at the feedback
00:28:37.340 | going back to all the nodes, from insight to wisdom. Situations inform future wisdom,
00:28:45.020 | experience deepens it, insight sharpens it, like a tree growing roots: the more feedback, the stronger it gets.
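The feedback loop described above can be sketched as a small directed graph; the node names follow the slide, while the edge labels are our paraphrase:

```python
# The wisdom state diagram, sketched as an adjacency list
# (edge comments paraphrase the slide; exact wording is ours)
wisdom_graph = {
    "knowledge":  ["wisdom"],    # book smarts feed wisdom
    "experience": ["wisdom"],    # what worked before deepens it
    "insight":    ["wisdom"],    # patterns from chaos sharpen it
    "wisdom":     ["decision"],  # wisdom guides decision making
    "decision":   ["situation"], # decisions act on real-world situations
    "situation":  ["knowledge", "experience", "insight"],  # the feedback loop
}

def reachable(graph: dict, start: str) -> set:
    """All nodes reachable from `start`, via depth-first search."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(graph.get(node, []))
    return seen
```

Because of the feedback edges, every node can reach every other node, which is the "it learns from itself" property: the loop closes on itself rather than flowing one way.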
00:28:53.580 | Now I want to ask you a question: in general, where do you see this circle in your life? Maybe a tough
00:29:00.380 | decision that, you know, taught you something. So one practical application, for leadership: wisdom avoids knee-jerk
00:29:10.860 | reactions by learning from feedback. As for personal growth: ever notice how past mistakes make you wiser?
00:29:18.140 | That's the loop in action. So the takeaway from this slide is: wisdom isn't
00:29:25.820 | a trophy you earn, it is a muscle you exercise. The more you feed it knowledge, experience, insight, the more
00:29:33.660 | they guide you. Now I will show you how this is mapped to my current client, because all this is, like, very
00:29:41.900 | abstract, right? So, one of my clients is actually doing competitive analysis. They used to have a
00:29:48.380 | marketing department doing that, but they want AI to do that. They asked me to build the system.
00:29:53.900 | This is exactly what I did, with the same taxonomy of storing all this.
00:30:00.140 | Later on I'll talk about how multi-agents handle all that. Here is one of the chatbots that I
00:30:06.220 | built for my client. It's, you know, not just some chatbot; it's our wisdom-graph-powered AI
00:30:14.460 | designed to turn data into strategy. So what kind of questions are we talking about? How do I win against
00:30:19.980 | my competitor in this market space? That's a very sophisticated question, right? If
00:30:24.860 | you simply just do RAG, like my first speaker talked about, it's not going to cut it; it's not
00:30:29.980 | going to be able to answer that kind of question. Okay, what I did is this: we retained the same taxonomy, and
00:30:35.580 | the wisdom is then mapped to the same engine, the wisdom engine. This engine is like an orchestration
00:30:41.180 | agent that does a lot of decision making, including advising what the LLMs are able to see based on the
00:30:47.580 | current situation, and what to do next. So what I did is, for the decision making, I map it to
00:30:55.420 | a strategy generator, since these customers are talking about competitive analysis, right? I map the
00:31:00.700 | knowledge; in terms of knowledge, what do they have? They have market data. So I
00:31:05.500 | map the experience to past campaigns; they have run a lot of campaigns, doing a
00:31:13.260 | lot of marketing. And then the insight is actually mapped to industry insight; they have a
00:31:20.300 | database storing that. And then, of course, the most important is the situation. The situation is:
00:31:27.020 | how am I doing, how is my product selling? So that is the situation, and then I map that to
00:31:33.740 | competitor weakness. That means, if you make the LLM aware of that, you'll probably get a very good
00:31:41.260 | answer, and then, you know, the chatbot will probably do the right thing advising. So from here, a very
00:31:47.500 | high-level, you know, state diagram, how do I map it to a system that runs? Well, here comes the trick.
00:31:55.180 | so anybody here heard of any all right all right all right so so so i i first encounter similar situation
00:32:02.940 | when i passed iot project which is not great developed by uh ibm right so it's the same kind of thing it's
00:32:09.820 | like no code but but underneath the hood there's a bunch of code okay it's all node.js code okay so uh but
00:32:15.820 | but for the for for proving your concept and all that it's very very very flexible and i right highly recommend
00:32:21.820 | that and and and here here you can take a look at the the workflow the break from i enable the
00:32:26.860 | implementation of this complicated state diagram with um what i say is there is a different community
00:32:33.420 | you know one of the very powerful node is the ai agent well previously nann is just workflow automation
00:32:38.620 | too i'm not selling for anything i'm just telling you i'm using it uh for prototyping uh further down
00:32:44.380 | the road maybe the client say us too like i i really need to you know go is now have the option
00:32:50.140 | to drive uh different model like open ai model and tropic model and even on-prem model and then that
00:32:56.220 | is the key to making the same state machine work. So, in a GraphRAG track, why are we talking
00:33:01.900 | about Fusion-in-Decoder? Well, I'm glad you asked, because the next big breakthrough was knowledge graphs with
00:33:06.780 | Fusion-in-Decoder. So you can use knowledge graphs with Fusion-in-Decoder as a technique, and this
00:33:13.100 | improves upon the Fusion-in-Decoder paper by using knowledge graphs to understand the relationships
00:33:18.620 | between the retrieved passages, so it helps with this efficiency bottleneck and improves the process.
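The graph-aware re-ranking of retrieved passages described here can be sketched roughly as follows. This is an illustrative toy, not the paper's actual algorithm: the passage IDs, scores, edge list, and boost weight are all invented. The idea is simply that a passage with a low vector-similarity score can still be promoted if the knowledge graph links it to the top-scoring results.

```python
from collections import defaultdict

def kg_rerank(passages, edges, top_seed_count=2):
    """Re-rank passages: start from vector-search scores, then boost
    passages that the knowledge graph connects to the top-scoring seeds."""
    neighbors = defaultdict(set)
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    ranked = sorted(passages, key=lambda p: p["score"], reverse=True)
    seeds = {p["id"] for p in ranked[:top_seed_count]}

    def boosted(p):
        # 0.1 per graph link to a seed passage (arbitrary illustrative weight)
        return p["score"] + 0.1 * len(neighbors[p["id"]] & seeds)

    return sorted(passages, key=boosted, reverse=True)

passages = [
    {"id": "p1", "score": 0.91},
    {"id": "p2", "score": 0.88},
    {"id": "p3", "score": 0.70},  # low vector score...
    {"id": "p4", "score": 0.72},
]
edges = [("p1", "p3"), ("p2", "p3")]  # ...but linked to both top seeds
print([p["id"] for p in kg_rerank(passages, edges)])  # p3 jumps to second place
```

The point of the sketch: the relationships, not the raw similarity scores alone, decide which passages reach the decoder.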
00:33:25.580 | I'm not going to walk through this diagram step by step, but this is the diagram in the paper of the
00:33:29.260 | architecture, where it uses the graph and then does this kind of two-stage re-ranking of the passages, and
00:33:35.340 | it helps improve the efficiency while also lowering the cost. And so the team took all this
00:33:40.700 | research and came together to build their own implementation of Fusion-in-Decoder, since
00:33:45.420 | we actually build our own models, to make that kind of the final piece of the puzzle. And it really
00:33:50.860 | helped our hallucination rate; it really drove it down. And then we published a white paper with our own
00:33:55.180 | findings. And so then we kind of had that piece of the puzzle. And there are a few other techniques
00:34:00.460 | that we don't have time to go over, but the point being, we're assembling multiple techniques based on
00:34:05.820 | research to get the best results we can for our customers. So that's all well and good, but
00:34:10.860 | does it actually work? That's the important part, right? So we did some benchmarking last year. We
00:34:15.180 | used Amazon's RobustQA dataset and compared our retrieval system, with the knowledge graph and Fusion-in-
00:34:20.780 | Decoder and everything, with seven different vector search systems, and we found
00:34:27.740 | that we had the best accuracy and the fastest response time. So I encourage you to check that out and
00:34:32.860 | kind of check out this process. Benchmarks are really cool, but what's even cooler is what it unlocks
00:34:38.700 | for our customers, which is various features in the product. For one, because, like most graph
00:34:45.580 | structures, we can actually expose the thought process, because we have those relationships and the additional
00:34:51.260 | context, where you can show the snippets and the sub-queries and the sources for how the RAG system is
00:34:56.940 | actually getting the answers. And we can expose this in the API to developers as well as in the product.
00:35:01.900 | And then we're also able to have knowledge graphs excel at multi-hop questions, where we can
00:35:08.060 | reason across multiple documents and multiple topics without any struggles. And then lastly, it can handle
00:35:15.740 | complex data formats where vector retrieval struggles, where an answer might be split across multiple pages,
00:35:21.500 | or maybe there's a similar term that doesn't quite match what the user is looking for. But because we have that
00:35:27.340 | graph structure and Fusion-in-Decoder, with the additional context and relationships, we're
00:35:31.980 | able to formulate these correct answers. So again, my main takeaway here is that there are many ways
00:35:40.300 | you can get the benefits of knowledge graphs in RAG. That could be through a graph database, it could be
00:35:45.340 | through doing something creative with Postgres, it could be through a search engine. But you
00:35:49.820 | take advantage of the relationships that you can build with knowledge graphs in your RAG system.
00:35:55.980 | And as you get there, you can challenge your assumptions and focus on the customers to be
00:35:59.900 | able to get to the end result, to make the team successful. And so for our team, it was focusing on
00:36:05.020 | customer needs instead of what was hyped, staying flexible based on the expertise of the team, and letting
00:36:11.020 | research challenge their assumptions. So if you want to join this amazing team, we're hiring across
00:36:16.780 | research, engineering, and product. We would love to talk to you about any of our open roles, and I'm
00:36:21.900 | available for questions. You can come find me in the hallway or reach out to me on Twitter or LinkedIn.
00:36:27.340 | And that's all I've got for you. Thank you so much.
00:37:14.300 | Can you hear me now?
00:37:36.160 | There I am.
00:37:36.980 | Liza on.
00:37:37.680 | Okay.
00:37:38.580 | In the giant umbrella that is GraphRAG, there are many techniques, many approaches, many ways
00:37:43.680 | to get things done. There's knowledge graph construction, there's retrieval, but then
00:37:48.620 | there's the notion of going post-rag and thinking about different ways of thinking about what
00:37:53.700 | knowledge is, what we're actually doing in the first place.
00:37:56.920 | So next up is my good friend Daniel from Zep to lead us through that.
00:38:01.440 | Daniel.
00:38:13.060 | Not yet?
00:38:16.280 | Oh, here we go.
00:38:17.440 | Five, four, three, two, one.
00:38:19.260 | Great.
00:38:20.840 | Well, welcome, everybody.
00:38:22.620 | Thank you, Andrea, Andreas, as well, for the intro.
00:38:26.020 | I'm Daniel, the founder of Zep AI, and we build memory infrastructure for AI agents.
00:38:32.900 | And I'm going to tell you that you're doing memory all wrong.
00:38:38.420 | Well, it may not be you directly, but it may be the framework that you're using to build
00:38:43.520 | your agents.
00:38:43.980 | I also think that knowledge graphs are awesome.
00:38:48.640 | Otherwise, why would we be here, right?
00:38:50.740 | And you should be using them for agent memory, not just for GraphRAG.
00:38:57.740 | So before I dive into expanding on my hot takes, I want to touch on why memory is so important.
00:39:03.340 | So we're routinely building agents that forget important context about our users.
00:39:09.440 | All that dynamic data that we're gathering from conversations between the agent and the
00:39:14.800 | user, all the data, business data from our applications, line of business applications,
00:39:20.180 | et cetera, there's so much richness about who the user is, and yet we're not enabling our
00:39:25.520 | agents with that data.
00:39:26.800 | And our agents respond as a result generically or hallucinate even worse.
00:39:32.540 | And this definitely isn't the path to AGI, or more concretely, retaining our customers.
00:39:40.760 | So memory isn't about semantically similar content.
00:39:44.300 | RAG does that really well.
00:39:46.340 | And when I talk about RAG here, I'm primarily talking about vector database-based RAG, not
00:39:53.480 | necessarily GraphRAG.
00:39:56.060 | But consider the stylized example where we have learned a brand preference for Adidas shoes.
00:40:01.420 | And unfortunately, Robbie's Adidas shoes fall apart, so he's rather unhappy.
00:40:07.280 | So the preference changes.
00:40:09.820 | However, Robbie's follow-up question to the agent, where he asked what sneakers he should
00:40:16.980 | purchase, is most similar to the first Adidas fact.
00:40:21.580 | And so if we're using a vector database, that fact may be at the top of the search results,
00:40:27.240 | and the agent responds incorrectly.
00:40:29.500 | So when using RAG, each fact is an isolated and immutable piece of content.
00:40:37.340 | And this is a real problem.
00:40:39.300 | The three facts on the left exist with no understanding of causality.
00:40:45.620 | Semantic search can't reason about why things change over time.
00:40:51.940 | And this is why RAG approaches fail as memory.
00:40:55.480 | RAG lacks native temporal and relational reasoning.
00:41:01.580 | And none of this should be a surprise: under the hood,
00:41:04.640 | we're just working with similarity in an abstract space.
00:41:08.060 | There are no explicit relationships between these embeddings, these vector representations
00:41:13.140 | of the facts that we've generated for our memory.
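The failure mode Daniel describes can be reproduced in a few lines. This is a toy illustration, not any vendor's retrieval code: the "embeddings" are tiny hand-made vectors, chosen only so that the stale preference is closest to the query. Pure cosine similarity has no notion of which fact came later.

```python
import math

def cosine(a, b):
    """Plain cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Illustrative fact store: the first fact is outdated, but its vector
# happens to sit closest to the sneaker-purchase query.
facts = {
    "Robbie loves Adidas shoes": [0.9, 0.1, 0.0],
    "Robbie's Adidas shoes fell apart": [0.6, 0.6, 0.1],
    "Robbie now prefers Puma": [0.3, 0.2, 0.9],
}
query = [0.95, 0.1, 0.05]  # "what sneakers should Robbie buy?"

best = max(facts, key=lambda f: cosine(facts[f], query))
print(best)  # the stale "loves Adidas" fact wins on similarity alone
```

There is nothing wrong with the math here; the problem is that similarity is the only signal, so the agent answers from an invalidated fact.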
00:41:18.260 | However, when we look at knowledge graphs, we can define explicit relationships.
00:41:24.100 | Graphs can model the why, and at Zep, we've got them to model the when as well behind the
00:41:30.780 | preference change, which adds a temporal dimension that your agent can reason over.
00:41:37.040 | And this structural difference is fundamental to how memory should work.
00:41:41.520 | So which is a good segue to Graphiti.
00:41:45.780 | Graphiti is Zep's open-source framework for building real-time, dynamic temporal graphs.
00:41:51.920 | And it addresses these exact problems.
00:41:55.200 | Graphiti is temporally aware and graph-relational.
00:42:01.080 | You can find it on GitHub; go to git.new/graphiti.
00:42:05.620 | It has 10,000-plus stars, almost 11,000, having quadrupled within the last six weeks.
00:42:12.700 | So thank you, everybody who's tried out Graphiti and loved it.
00:42:15.540 | So let's dive into how each of these attributes of Graphiti works.
00:42:20.540 | So this is the secret sauce.
00:42:24.320 | Graphiti extracts and tracks multiple temporal dimensions for each fact.
00:42:30.520 | It identifies when a fact becomes valid and when it becomes invalid.
00:42:35.880 | On the right-hand side, you can see, using the example that I illustrated
00:42:42.820 | a few slides back, how Graphiti would parse those different time frames.
00:42:47.900 | And this enables temporal reasoning.
00:42:51.120 | What did the user prefer in February?
00:42:55.080 | And it enables your agent to answer questions
00:42:58.420 | that RAG simply cannot handle.
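The temporal reasoning described here ("what did the user prefer in February?") can be sketched with a point-in-time filter over facts carrying validity timestamps. The `valid_at` / `invalid_at` field names are borrowed from Graphiti's model, but the data and query function below are made up for illustration.

```python
from datetime import date

# Each fact carries the window in which it was true.
facts = [
    {"fact": "Robbie loves Adidas",
     "valid_at": date(2025, 1, 5), "invalid_at": date(2025, 3, 2)},
    {"fact": "Robbie prefers Puma",
     "valid_at": date(2025, 3, 2), "invalid_at": None},
]

def facts_as_of(when):
    """Point-in-time query: which facts were true on a given date?"""
    return [
        f["fact"] for f in facts
        if f["valid_at"] <= when
        and (f["invalid_at"] is None or when < f["invalid_at"])
    ]

print(facts_as_of(date(2025, 2, 15)))  # ['Robbie loves Adidas']
print(facts_as_of(date(2025, 6, 1)))   # ['Robbie prefers Puma']
```

A plain vector store holds both facts side by side with no way to express that one window closed when the other opened; the timestamps are what make the February question answerable.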
00:43:06.160 | And so when we look at what RAG can do, we actually sit with a bunch of contradictory embeddings
00:43:16.900 | with no resolution in the vector database.
00:43:19.460 | So if we're updating the brand preference, we'll have a new brand preference fact in the vector database.
00:43:30.080 | However, Graphiti understands that broken shoes invalidate the love relationship,
00:43:35.240 | which creates a causal relationship between those three events in the previous slide.
00:43:40.960 | Broken shoes result in disappointment, which results in a brand preference change.
00:43:49.500 | Graphiti doesn't delete the history of facts as they change, as they're invalidated, but rather marks them invalid.
00:43:58.240 | And so we store a sequence of state changes on the graph, which allows your agent to then reason with those state changes over time.
00:44:09.680 | So for example, the next time I come back to the e-commerce agent to purchase shoes is not going to recommend the Adidas shoes to me.
00:44:18.400 | And here's the resulting graph, a closer approximation to how humans might process and recall changing state over time.
00:44:29.740 | On the graph, we can see that the existing Adidas brand preference is still there, but it now has an expired_at date.
00:44:39.780 | We also see that there's a new brand preference for Puma shoes, which is valid, and it doesn't have an invalid_at date.
00:44:53.740 | So Graphiti doesn't abandon embeddings.
00:44:57.280 | They're still incredibly useful.
00:44:59.140 | Graphiti uses semantic search and BM25 full-text retrieval to identify sub-graphs within the broader Graphiti graph.
00:45:08.960 | And these can be traversed using graph traversal techniques to develop a richer understanding of memory.
00:45:17.200 | So we can find adjacent facts that might fill out the agent's understanding of memory.
00:45:24.060 | And the results are then each fused together.
00:45:28.420 | And so this offers a very fast, accurate retrieval approach.
00:45:33.200 | And Graphiti has a number of different search recipes built into it.
00:45:37.760 | So you can really explore how to take different approaches to retrieving data for your particular agent.
00:45:46.100 | So a little bit of a bonus.
00:45:50.960 | When we look at recent changes that we've added to Graphiti, we allow developers now to model their business domain on the graph.
00:46:01.260 | Because a mental health application will have very different types of things it needs to store and recall from memory than an e-commerce agent does.
00:46:12.100 | And so Graphiti allows you to build constructs, custom entities, and edges that represent the business objects within your particular business or application.
00:46:27.180 | And so here we have an example of a media preference where we've been learning all about a user's preferred podcasts and music.
00:46:39.660 | And we have defined an actual structure here for media preference.
00:46:44.340 | And what this does is it allows us to then also retrieve, explicitly retrieve, media preferences from the graph, rather than a bunch of other noise that we might have added to memory.
00:46:58.320 | And this ontology really enables you to bring a lot of depth to how memory operates.
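The custom-entity idea above can be sketched as a typed model plus a typed retrieval filter. Daniel later mentions defining the ontology with Pydantic, Zod, or Go structs; a stdlib dataclass stands in here, and the field names are illustrative rather than Graphiti's actual schema.

```python
from dataclasses import dataclass, asdict

@dataclass
class MediaPreference:
    """Hypothetical domain entity for a user's media tastes."""
    media_type: str   # e.g. "podcast" or "music"
    title: str
    sentiment: str    # e.g. "likes" / "dislikes"

def retrieve_media_preferences(nodes):
    """Typed retrieval: pull only MediaPreference entities,
    ignoring whatever other noise has been added to memory."""
    return [n for n in nodes if isinstance(n, MediaPreference)]

memory = [
    MediaPreference("podcast", "Hard Fork", "likes"),
    {"kind": "unrelated_note", "text": "meeting at 3pm"},  # noise
    MediaPreference("music", "jazz", "likes"),
]
print([asdict(p) for p in retrieve_media_preferences(memory)])
```

Defining the shape of a media preference up front is what lets retrieval ask for exactly that type, rather than hoping a similarity search surfaces the right nodes.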
00:47:07.280 | So I'm not advocating that you replace RAG everywhere.
00:47:11.860 | RAG, GraphRAG in its various forms, and Graphiti each have their strengths and ideal use cases.
00:47:20.300 | The key is recognizing when you need each.
00:47:23.620 | Graphiti is really strong when you want to incrementally integrate dynamic data into a graph without significant recomputation.
00:47:34.160 | It's really strong when you want to model your business domain.
00:47:37.860 | It's strong where you need very low-latency retrieval.
00:47:42.060 | There's no LLM in the path.
00:47:43.540 | If you've tried GraphRAG systems, they often have an LLM in the path, incrementally summarizing the output from the graph.
00:47:51.300 | It can take tens of seconds.
00:47:52.900 | Graphiti operates in hundreds of milliseconds or less.
00:47:57.000 | And so the key is recognizing what each solution offers to your business.
00:48:06.100 | And most agentic applications could use both a RAG approach and Graphiti.
00:48:14.360 | So just summing it all up, agent memory is not about knowledge retrieval.
00:48:19.180 | Temporal and relational reasoning is so critical to coherent memory.
00:48:24.920 | We need to track state changes over time.
00:48:27.520 | We need to understand how something like preferences or user traits might change over time.
00:48:35.100 | And that's something that contemporary RAG solutions lack.
00:48:41.480 | So we published a paper earlier this year describing Zep's use of Graphiti.
00:48:47.900 | And it's a deep dive into the Graphiti architecture and how Zep performs as a consequence of using Graphiti under the hood.
00:48:55.660 | So you can follow the link below to land at the arXiv preprint if you'd like to take a look.
00:49:00.600 | And I'm sure the slides will be available after the talk, so you can go to the paper.
00:49:05.820 | So a quick plug for Zep.
00:49:10.460 | Zep goes beyond simple agent memory to build a unified customer record derived from both chat history and business data.
00:49:20.720 | So you can stream in user chat conversations, but also stream in business data from your SaaS application, from line of business applications like CRM or billing systems.
00:49:31.820 | And it builds this unified, holistic view of the user, really enabling your agent to have an accurate and very comprehensive real-time understanding of the user so it can solve complex problems for that user.
00:49:48.560 | So stick around for the agent memory lunch and learn, which is the next session.
00:49:54.000 | It's being led by Mark Bain.
00:49:56.020 | And in it, amongst other folks, I'll be demoing Zep's approach to domain-specific memory, built on Graphiti's custom entities and edges.
00:50:06.420 | So thanks for listening to me.
00:50:09.000 | We have a few minutes, so I'm happy to answer questions.
00:50:12.560 | And I will, yeah, if there are any.
00:50:18.400 | No questions.
00:50:20.720 | Oh, there's one over there.
00:50:22.000 | The question was, do you need to use Zep to use Graphiti?
00:50:29.220 | Graphiti is open source.
00:50:30.920 | It's available on GitHub.
00:50:32.340 | You can go to the link, git.new/graphiti.
00:50:36.180 | And all you'll need today is Neo4j.
00:50:41.680 | So our partners, Neo4j, can assist you with a community edition install.
00:50:46.380 | And strongly recommend their desktop product.
00:50:49.960 | It's wonderful.
00:50:50.540 | And you can get going very easily.
00:50:52.960 | Another question here?
00:50:56.000 | So underneath the hood, how do you invalidate graph edges?
00:51:12.080 | Are we using LLMs?
00:51:16.320 | So Graphiti makes extensive use of LLMs to intelligently parse incoming data, which could be unstructured or structured.
00:51:27.520 | So the unstructured conversation, unstructured emails, structured data in JSON format, and fuse it together on the graph.
00:51:37.220 | And as part of integrating, we're using LLMs to identify, in a pipeline, identify conflicting facts.
00:51:45.880 | And so that's where we get this ability to go from broken shoes to disappointment to a switch in brand preferences.
00:51:56.540 | So the LLM is able to understand the emotional valence of the events that it is seeing.
00:52:03.300 | One more question?
00:52:07.060 | How do we handle revalidation of facts?
00:52:20.500 | Yeah, depending on the context.
00:52:22.620 | So the question was, how do we handle revalidation of facts if a state flips back to a prior state?
00:52:31.040 | And so it depends on the context.
00:52:33.660 | A new edge might be created that represents a successor fact.
00:52:41.100 | Or the invalidation might be nullified.
00:52:45.840 | Yeah, that's a really good question.
00:52:58.020 | So why can Graphiti do real-time updates but Microsoft GraphRAG cannot?
00:53:03.220 | So Microsoft GraphRAG, as an oversimplification, is summarizing document chunks or documents at many different levels,
00:53:15.400 | And creating repeated summarizations at different levels.
00:53:18.380 | So a summary of the summaries of the summaries, et cetera.
00:53:21.940 | And that's very computationally expensive.
00:53:24.420 | So if any of the underlying data changes, you're ending up with a cascading number of summarizations.
00:53:31.020 | It's expensive and complicated.
00:53:33.800 | Graphiti is designed to identify the specific nodes and edges that are implicated in an update,
00:53:44.960 | and is then able to invalidate, with surgical precision, the edges that are implicated in the conflict.
00:53:56.360 | Or we just add new edges or nodes into the graph where it's relevant.
00:54:05.120 | And so we're able to use a variety of search pipelines as well as a number of different heuristics to really make very focused changes in the graph, which are lightweight and cheap.
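The surgical update Daniel describes can be sketched as: find only the live edges that conflict with an incoming fact, mark them invalid, and append the new edge, with no global re-summarization. The conflict check below is a trivial predicate standing in for the LLM-based pipeline he mentions; the edge data is invented.

```python
from datetime import datetime, timezone

edges = [
    {"subject": "Robbie", "predicate": "prefers_brand", "object": "Adidas", "invalid_at": None},
    {"subject": "Robbie", "predicate": "lives_in", "object": "Austin", "invalid_at": None},
]

def apply_update(edges, new_edge, now):
    """Invalidate only the conflicting edges, then append the new one."""
    for e in edges:
        conflicts = (
            e["invalid_at"] is None
            and e["subject"] == new_edge["subject"]
            and e["predicate"] == new_edge["predicate"]
            and e["object"] != new_edge["object"]
        )
        if conflicts:
            e["invalid_at"] = now  # mark invalid; never delete history
    edges.append({**new_edge, "invalid_at": None})
    return edges

apply_update(
    edges,
    {"subject": "Robbie", "predicate": "prefers_brand", "object": "Puma"},
    datetime(2025, 3, 2, tzinfo=timezone.utc),
)
# Only the Adidas edge is touched; the unrelated lives_in edge is untouched.
print([(e["object"], e["invalid_at"] is None) for e in edges])
```

Contrast this with hierarchical summarization, where one changed chunk can force recomputing every summary above it; here the cost of an update is proportional to the handful of edges it actually conflicts with.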
00:54:21.600 | Here we go.
00:54:22.260 | How do you get to a normalized ontology?
00:54:25.980 | I'm guessing that data is very dynamic.
00:54:27.920 | How do you know what that is?
00:54:28.960 | Yeah.
00:54:29.500 | That's a good question as well.
00:54:31.020 | So there are two ways that Graphiti operates.
00:54:36.660 | The first is that Graphiti will build the ontology for you and will very carefully try to deduplicate edge and node types.
00:54:50.820 | Secondly, as I mentioned a little bit earlier, we allow you to define an ontology using Pydantic, Zod, Go structs, et cetera.
00:55:05.220 | All right.
00:55:05.620 | I think we're at time.
00:55:06.580 | So thank you, everybody.
00:55:07.420 | And as Daniel mentioned, there's lunch being served outside right now, actually.
00:55:15.500 | So I encourage you all to go get a bite to eat.
00:55:17.340 | At 1 o'clock, come back into the room.
00:55:19.740 | We're going to have a panel discussion about overall agentic memory, different implementations of it.
00:55:24.780 | And you're going to be part of that panel as well, right?
00:55:26.540 | So it'll be a great session.
00:55:27.720 | So come back at 1 o'clock for a longer session on agentic memory.
00:55:32.000 | I have a question.
00:55:33.720 | Why don't I hop down?
00:55:35.960 | Yeah.
00:55:36.180 | Are there any limitations on the datasets Graphiti can handle?
00:55:42.360 | Because I have...
00:55:44.060 | Thank you.
00:55:47.180 | I will come back about 10 minutes before 1.
00:55:51.540 | Okay.
00:56:10.220 | All right, now we have a lunch break.
00:56:19.540 | Yeah, okay, cool.
00:56:21.040 | What's going on in this room?
00:56:35.120 | That's why I love our room.
00:57:05.960 | Thank you.
00:57:35.940 | Thank you.
00:58:05.920 | Check, check, check, check, check, check, check, check, check, one, two.
00:58:17.780 | Okay.
00:58:18.440 | Thank you.
00:58:18.720 | Thank you.
00:58:18.980 | Thank you.
00:58:19.020 | Thank you.
00:58:21.020 | Thank you.
00:58:22.460 | Thank you.
00:58:24.460 | Thank you.
00:58:25.460 | Thank you.
00:58:27.460 | Thank you.
00:58:29.460 | Thank you.
00:58:30.460 | Thank you.
00:58:30.460 | Thank you.
00:58:31.460 | Thank you.
00:58:31.460 | Thank you.
00:58:31.460 | Thank you.
00:58:32.460 | Thank you.
00:58:32.460 | Thank you.
00:58:33.460 | Thank you.
00:58:33.460 | Okay.
00:58:34.460 | Thank you.
00:58:34.460 | Thank you.
00:58:34.460 | Thank you.
00:58:35.460 | Thank you.
00:58:35.460 | Thank you.
00:58:35.460 | Thank you.
00:58:36.460 | Thank you.
00:58:36.460 | Thank you.
00:58:37.460 | Thank you.
00:58:37.460 | Thank you.
00:58:38.460 | Thank you.
00:58:39.460 | Thank you.
00:58:40.460 | Thank you.
00:58:40.460 | Thank you.
00:58:41.460 | Thank you.
00:58:41.460 | Thank you.
00:58:42.460 | Thank you.
00:58:42.460 | Thank you.
00:58:43.460 | Thank you.
00:58:43.460 | Thank you.
00:58:44.460 | Thank you.
00:58:45.460 | Thank you.
00:58:46.460 | Thank you.
00:58:47.460 | Thank you.
00:58:48.460 | Thank you.
00:58:49.460 | Thank you.
00:58:50.460 | Thank you.
00:58:51.460 | Thank you.
00:58:52.460 | Thank you.
00:58:53.460 | Thank you.
00:58:54.460 | Thank you.
00:58:55.460 | Thank you.
00:58:56.460 | Thank you.
00:58:57.460 | Thank you.
00:58:58.460 | Thank you.
00:58:59.460 | Thank you.
00:59:00.460 | Thank you.
00:59:01.460 | Thank you.
00:59:02.460 | Thank you.
00:59:03.460 | Thank you.
00:59:04.460 | Thank you.
00:59:05.460 | Thank you.
00:59:06.460 | Thank you.
00:59:07.460 | Thank you.
00:59:08.460 | Thank you.
00:59:09.460 | Thank you.
00:59:10.460 | Thank you.
00:59:11.460 | Thank you.
00:59:12.460 | Thank you.
00:59:13.460 | Thank you.
00:59:14.460 | Thank you.
00:59:15.460 | Thank you.
00:59:16.460 | Thank you.
00:59:17.460 | Thank you.
00:59:18.460 | Thank you.
00:59:19.460 | Thank you.
00:59:20.460 | Thank you.
00:59:21.460 | Thank you.
00:59:22.460 | Thank you.
00:59:23.460 | Thank you.
00:59:24.460 | Thank you.
00:59:25.460 | Thank you.
00:59:26.460 | Thank you.
00:59:27.460 | Thank you.
00:59:28.460 | Thank you.
00:59:29.460 | Thank you.
00:59:30.460 | Thank you.
00:59:31.460 | Thank you.
00:59:32.460 | Thank you.
00:59:33.460 | Thank you.
00:59:34.460 | Thank you.
00:59:35.460 | Thank you.
00:59:36.460 | Thank you.
00:59:37.460 | Thank you.
00:59:38.460 | Thank you.
00:59:39.460 | Thank you.
00:59:40.460 | Thank you.
00:59:41.460 | Thank you.
00:59:42.460 | Thank you.
00:59:43.460 | Thank you.
00:59:44.460 | Thank you.
00:59:45.460 | Thank you.
00:59:46.460 | Thank you.
00:59:46.460 | Thank you.
00:59:47.460 | Thank you.
00:59:48.460 | Thank you.
00:59:49.460 | Thank you.
00:59:50.460 | Thank you.
00:59:51.460 | Thank you.
00:59:52.460 | Thank you.
00:59:53.460 | Thank you.
00:59:54.460 | Thank you.
00:59:55.460 | Thank you.
00:59:56.460 | Thank you.
00:59:57.460 | Thank you.
00:59:58.460 | Thank you.
00:59:59.460 | Thank you.
01:00:00.460 | Thank you.
01:00:01.460 | Thank you.
01:00:02.460 | Thank you.
01:00:03.460 | Thank you.
01:00:04.460 | Thank you.
01:00:05.460 | Thank you.
01:00:06.460 | Thank you.
01:00:07.460 | Thank you.
01:00:08.460 | Thank you.
01:00:09.460 | Thank you.
01:00:10.460 | Thank you.
01:00:11.460 | Thank you.
01:00:12.460 | Thank you.
01:00:13.460 | Thank you.
01:00:14.460 | Thank you.
01:00:15.460 | Thank you.
01:00:16.460 | Thank you.
01:00:17.460 | Thank you.
01:00:18.460 | Thank you.
01:00:19.460 | Thank you.
01:00:20.460 | Thank you.
01:00:21.460 | Thank you.
01:00:22.460 | Thank you.
01:00:23.460 | Thank you.
01:00:24.460 | Thank you.
01:00:25.460 | Thank you.
01:00:26.460 | Thank you.
01:00:27.460 | Thank you.
01:00:28.460 | Thank you.
01:00:29.460 | Thank you.
01:00:30.460 | Thank you.
01:00:31.460 | Thank you.
01:00:32.460 | Thank you.
01:00:33.460 | Thank you.
01:00:34.460 | Thank you.
01:00:35.460 | Thank you.
01:00:36.460 | Thank you.
01:00:37.460 | Thank you.
01:00:37.460 | Thank you.
01:00:38.460 | Thank you.
01:00:39.460 | Thank you.
01:00:40.460 | Thank you.
01:00:41.460 | Thank you.
01:00:42.460 | Thank you.
01:00:43.460 | Thank you.
01:00:44.460 | Thank you.
01:00:45.460 | Thank you.
01:00:46.460 | Thank you.
01:00:47.460 | Thank you.
01:00:48.460 | Thank you.
01:00:49.460 | Thank you.
01:00:50.460 | Thank you.
01:00:51.460 | Thank you.
01:00:52.460 | Thank you.
01:00:53.460 | Thank you.
01:00:54.460 | Thank you.
01:00:55.460 | Thank you.
01:00:56.460 | Thank you.
01:00:57.460 | Thank you.
01:00:58.460 | Thank you.
01:00:59.460 | Thank you.
01:01:00.460 | Thank you.
01:01:01.460 | Thank you.
01:01:02.460 | Thank you.
01:01:03.460 | Thank you.
01:01:04.460 | Thank you.
01:01:05.460 | Thank you.
01:01:06.460 | Thank you.
01:01:07.460 | Thank you.
01:01:08.460 | Thank you.
01:01:09.460 | Thank you.
01:01:10.460 | Thank you.
01:01:11.460 | Thank you.
01:01:12.460 | Thank you.
01:01:13.460 | Thank you.
01:01:14.460 | Thank you.
01:01:15.460 | Thank you.
01:01:16.460 | Thank you.
01:01:17.460 | Thank you.
01:01:18.460 | Thank you.
01:01:19.460 | Thank you.
01:01:20.460 | Thank you.
01:01:21.460 | Thank you.
01:01:22.460 | Thank you.
01:01:23.460 | Thank you.
01:01:24.460 | Thank you.
01:01:25.460 | Thank you.
01:01:26.460 | Thank you.
01:01:27.460 | Thank you.
01:01:28.460 | Thank you.
01:01:29.460 | Thank you.
01:01:30.460 | Thank you.
01:01:31.460 | Thank you.
01:01:32.460 | Thank you.
01:01:33.460 | Thank you.
01:01:34.460 | Thank you.
01:01:35.460 | Thank you.
01:01:36.460 | Thank you.
01:01:37.460 | Thank you.
01:01:38.460 | Thank you.
01:01:39.460 | Thank you.
01:01:40.460 | Thank you.
01:01:41.460 | Thank you.
01:01:42.460 | Thank you.
01:01:43.460 | Thank you.
01:01:44.460 | Thank you.
01:01:45.460 | Thank you.
01:01:46.460 | Thank you.
01:01:47.460 | Thank you.
01:01:48.460 | Thank you.
01:01:48.460 | Thank you.
01:01:49.460 | Thank you.
01:01:50.460 | Thank you.
01:01:51.460 | Thank you.
01:01:52.460 | Thank you.
01:01:53.460 | Thank you.
01:01:54.460 | Thank you.
01:01:55.460 | Thank you.
01:01:56.460 | Thank you.
01:01:57.460 | Thank you.
01:01:58.460 | Thank you.
01:01:59.460 | Thank you.
01:02:00.460 | Thank you.
01:02:01.460 | Thank you.
01:02:02.460 | Thank you.
01:02:03.460 | Thank you.
01:02:04.460 | Thank you.
01:02:05.460 | Thank you.
01:02:06.460 | Thank you.
01:02:07.460 | Thank you.
01:02:08.460 | Thank you.
01:02:09.460 | Thank you.
01:02:10.460 | Thank you.
01:02:11.460 | Thank you.
01:02:12.460 | Thank you.
01:02:13.460 | Thank you.
01:02:14.460 | Thank you.
01:02:15.460 | Thank you.
01:02:16.460 | Thank you.
01:02:17.460 | Thank you.
01:02:18.460 | Thank you.
01:02:19.460 | Thank you.
01:02:20.460 | Thank you.
01:02:21.460 | Thank you.
01:02:22.460 | Thank you.
01:02:23.460 | Thank you.
01:02:24.460 | Thank you.
01:02:25.460 | Thank you.
01:02:26.460 | Thank you.
01:20:25.460 | You can wear this on your belt.
01:20:54.420 | You can wear it on your belt.
01:20:55.420 | You can do whatever you want.
01:20:56.420 | You can put it in your pocket.
01:20:57.420 | Could you help me?
01:20:58.420 | Yeah.
01:20:59.420 | I think pocket's the best.
01:21:00.420 | I usually put it in pockets.
01:21:01.420 | You gotta hide it.
01:21:02.420 | There's a camera on it.
01:21:03.420 | You'd like the cable to be invisible too.
01:21:05.420 | So, could you help me?
01:21:06.420 | Yeah, sure.
01:21:07.420 | Just tuck it in.
01:21:08.420 | Of course.
01:21:09.420 | There you go.
01:21:10.420 | Feel free.
01:21:11.420 | You're welcome.
01:21:12.420 | All set?
01:21:13.420 | All right.
01:21:14.420 | You're welcome.
01:21:15.420 | Thanks.
01:21:21.380 | Can we do a test for all of us before we head to the stage?
01:21:26.380 | Sure.
01:21:27.380 | Thanks.
01:21:28.380 | Is the computer all set?
01:21:30.380 | Everything fine?
01:21:31.380 | Perfect.
01:21:32.380 | I think I need to mirror, right?
01:21:33.380 | Mirror the screens.
01:21:34.380 | Yeah.
01:21:35.380 | Okay.
01:21:35.380 | I'll do that.
01:21:36.380 | Okay.
01:21:37.380 | Okay.
01:21:50.380 | And the ethernet is already here.
01:22:04.380 | Okay.
01:22:05.380 | I think I need like five minutes if we have three minutes.
01:22:31.380 | That's the right amount.
01:22:33.380 | All right.
01:22:35.380 | Okay.
01:22:36.380 | All right.
01:22:38.380 | All right.
01:22:40.380 | All right.
01:23:09.380 | Can we do a test?
01:23:16.380 | He's going to do that?
01:23:18.380 | I see.
01:23:19.380 | I'm video.
01:23:20.380 | I see.
01:23:21.380 | He's audio.
01:23:22.380 | I see.
01:23:23.380 | He said something about mirroring your screen up there, so we have to, like, set that up.
01:23:38.380 | Yeah.
01:23:39.380 | Yeah.
01:23:40.380 | You can practice that.
01:23:42.380 | You could actually tell that to Daniel and Vasilija just in case.
01:23:45.380 | Okay.
01:23:46.380 | So that there is no, I mean, there are always glitches, but less of them.
01:23:51.380 | Yeah.
01:24:09.380 | Can I do this test or?
01:24:10.380 | You need a moment.
01:24:11.380 | You need a moment.
01:24:12.380 | Okay.
01:24:13.380 | Okay.
01:24:14.380 | Nope.
01:24:15.380 | Hello.
01:24:16.380 | Hello.
01:24:17.380 | Hello.
01:24:18.380 | Hello.
01:24:19.380 | Hello.
01:24:20.380 | All right.
01:24:21.380 | Okay.
01:24:22.380 | Okay.
01:24:23.380 | Thank you.
01:24:24.380 | Let me straighten that out.
01:24:25.380 | Come on back to the tent.
01:24:26.380 | All right.
01:24:27.380 | Okay.
01:24:28.380 | Underneath.
01:24:29.380 | Beautiful.
01:24:30.380 | All right.
01:24:31.380 | All right.
01:24:32.380 | Once you get up there, you're going to cut that from five.
01:24:36.380 | All right.
01:24:37.380 | Okay.
01:24:38.380 | Are you done?
01:24:39.380 | Nope.
01:24:40.380 | Hello.
01:24:41.380 | Hello.
01:24:42.380 | Hello.
01:24:43.380 | That's right.
01:24:48.380 | This is the musical section of the afternoon.
01:24:51.380 | Hi, everyone.
01:24:52.380 | Welcome to the graph rag track.
01:24:54.380 | And we're having a lunch and learn.
01:24:55.380 | And I should remember this myself, but I've told the other speakers to stay here in the middle
01:25:00.380 | of the stage.
01:25:01.380 | For the lunch and learn, we have the great treat that my good friend, Mark Bain, Mark, who's
01:25:06.380 | over there, Mark, who's going to be taking us through agentic memory, doing a kind of a broad
01:25:12.380 | sweep, like, you know, dive into agentic memory, but then we're also going to have a panel discussion
01:25:15.380 | around it and a couple of demos.
01:25:17.380 | This is going to be a longer session than the rest of the graph rag track.
01:25:20.380 | It's about 45 minutes.
01:25:21.380 | Totally worth staying for the entire time.
01:25:23.380 | It should be amazing.
01:25:24.380 | Mark, are you ready to talk?
01:25:26.380 | Of course.
01:25:27.380 | My friend, Mark Bain, please.
01:25:30.380 | All right.
01:25:31.380 | One, one, one.
01:25:35.380 | All right.
01:25:36.380 | How is everyone doing here?
01:25:38.380 | Woo-hoo!
01:25:39.380 | I'm super excited to be here with you.
01:25:42.380 | This is my first time speaking at AI Engineer.
01:25:47.380 | And we have an amazing group of speakers, guest speakers.
01:25:54.380 | Vasilija Markovic from Cognee.
01:25:56.380 | Vasilija.
01:25:57.380 | Oh, there is Vasilija.
01:26:00.380 | Daniel Chalef from Graphiti and Zep AI.
01:26:03.380 | And Alex Gilmore from Neo4j.
01:26:07.380 | The plan looks like this.
01:26:12.380 | I will do a very quick power talk about a topic that I'm super passionate about:
01:26:18.380 | AI memory.
01:26:20.380 | Next, we'll have four live demos.
01:26:23.380 | And we'll move on to some new solution that we are proposing.
01:26:28.380 | A GraphRAG chat arena that I will be able to demonstrate.
01:26:33.380 | And I would like you to follow along once it's being demonstrated.
01:26:40.380 | And at the very end, we'll have a very short Q&A session.
01:26:47.380 | There is a Slack channel that I would like you to join.
01:26:52.380 | So, please scan the QR code right now before we begin.
01:26:57.380 | And let's make sure that everyone has access to this material.
01:27:02.380 | There is a walkthrough sheet on the channel that we will go through closer to the end of our workshop.
01:27:13.380 | But I would like you to start setting it up, if you may, on your laptops if you want to follow along.
01:27:20.380 | All right.
01:27:30.380 | It's workshop-graphrag-chat.
01:27:32.380 | You can also find it on Slack.
01:27:34.380 | And you can join the channel.
01:27:36.380 | So, a little bit about myself.
01:27:39.380 | So, hi, everyone again.
01:27:41.380 | I'm Mark Bain.
01:27:42.380 | And I'm very passionate about memory.
01:27:46.380 | What is memory?
01:27:47.380 | The deep physics.
01:27:49.380 | And applications of memory across different technologies.
01:27:54.380 | You can find me at Mark and Bain on social media or on my website.
01:27:59.380 | And let me tell you a little bit of a story about myself.
01:28:06.380 | So, when I was 16 years old, I was very good at maths.
01:28:11.380 | And I did math olympiads with many brilliant minds, including Wojciech Zaremba, the co-founder of OpenAI.
01:28:19.380 | And thanks to that deep understanding of maths and physics, I did have many great opportunities to be exposed to the problem of AI memory.
01:28:33.380 | So, first of all, I would like to recall two conversations that I had with Wojciech and Ilya in 2014 in September.
01:28:45.380 | When I came here to study at Stanford, at one party, we met with Ilya and Wojciech, who back then worked at Google.
01:28:55.380 | And they were kind of trying to pitch me that there will be a huge revolution in AI.
01:29:02.380 | And I kind of like followed that.
01:29:04.380 | I was a little bit unimpressed back then.
01:29:06.380 | Right now, I probably kind of take it as a very big excitement when I look back to the times.
01:29:14.380 | And I was really wishing good luck to the guys who were doing deep learning.
01:29:20.380 | Because back then, I didn't really see this prospect of GPUs giving that huge edge in compute.
01:29:29.380 | And so, during that conversation, it was like 20 minutes.
01:29:34.380 | At the very end, I asked Ilya, all right, so there is going to be a big AI revolution.
01:29:42.380 | But how will these AI systems communicate with each other?
01:29:48.380 | And the answer was very perplexing and kind of sets the stage to what's happening right now.
01:29:55.380 | Ilya simply answered, I don't know.
01:29:58.380 | I think they will invent their own language.
01:30:01.380 | So, that was 11 years ago.
01:30:04.380 | Fast forward to now.
01:30:06.380 | The last two years, I've spent doing very deep research on physics of AI.
01:30:12.380 | And kind of like delved into all of these most modern AI architectures.
01:30:18.380 | Including attention, diffusion models, VAEs, and many other ones.
01:30:23.380 | And I realized that there is something critical.
01:30:29.380 | Something missing.
01:30:31.380 | And this power talk is about this missing thing.
01:30:35.380 | So, over the last two years, I kind of followed on from my last years of doing a lot of research
01:30:46.380 | in physics, computer science, information science.
01:30:49.380 | And I came to this conclusion that memory, AI memory, in fact, is any data in any format.
01:31:00.380 | And this is important.
01:31:02.380 | Including code, algorithms, and hardware, and any causal changes that affect them.
01:31:12.380 | That was something very mind-blowing to reach that conclusion.
01:31:16.380 | And that conclusion sets the tone to this whole track.
01:31:21.380 | The GraphRAG track.
01:31:26.380 | In fact, I was also perplexed by how biological systems use memory.
01:31:33.380 | And how different cosmological structures or quantum structures, they, in fact, have a memory.
01:31:39.380 | They kind of remember.
01:31:41.380 | And...
01:31:46.380 | Let's get back to maths and to physics.
01:31:49.380 | And geometry.
01:31:50.380 | When I was doing science olympiads, I was really focused on two, three things.
01:31:55.380 | Geometry, trigonometry, and algebra.
01:31:58.380 | And I realized in the last year that more or less the volume of laws in physics perfectly matches the volume of laws in mathematics.
01:32:13.380 | And also the constants in mathematics.
01:32:16.380 | If you really think deeply through geometry, they match the constants both in mathematics and in physics.
01:32:23.380 | And if you really think even deeper, they kind of like transcend over all the other disciplines.
01:32:29.380 | So that made me think a lot.
01:32:36.380 | And I found out that the principles that govern LLMs are the exact same principles that govern neuroscience.
01:32:48.380 | And they are the exact same principles that govern mathematics.
01:32:52.380 | I studied...
01:32:54.380 | I studied papers of...
01:32:58.380 | Perelman.
01:32:59.380 | I don't know if you've heard who Perelman is.
01:33:01.380 | Perelman is this mathematician who refused to take a $1 million award for proving
01:33:10.380 | one of the most important conjectures,
01:33:19.380 | about the symmetries of the three-sphere.
01:33:23.380 | And once I realized that this deep math of...
01:33:31.380 | spheres and circles is very much linked with how attention and diffusion models work.
01:33:43.380 | Basically the formulas that Perelman reached are linking entropy with curvature.
01:33:53.380 | And curvature, basically if you think of curvature, it's attention.
01:33:57.380 | It's gravity.
01:33:58.380 | So in a sense there are multiple disciplines where the same things are appearing multiple
01:34:05.380 | times.
01:34:07.380 | And I will be publishing a series of papers with some amazing supervisors who are co-authors
01:34:19.380 | of two of these methodologies:
01:34:26.380 | transformers and VAEs.
01:34:28.380 | And I came to this realization that this equation governs everything.
01:34:34.380 | Governs mass.
01:34:35.380 | Governs physics.
01:34:36.380 | Governs our AI memory.
01:34:38.380 | Governs neuroscience.
01:34:40.380 | Biology.
01:34:41.380 | Physics.
01:34:42.380 | Chemistry.
01:34:43.380 | And so on and so forth.
01:34:44.380 | So...
01:34:45.380 | I came to this equation that memory times compute would like to be a squared imaginary unit circle.
01:35:00.380 | If that ever existed, we would have perfect symmetries and we would kind of not exist,
01:35:07.380 | because for us to exist, these asymmetries need to show up.
01:35:11.380 | And in a sense, every single LLM through weights and biases, the weights are giving the structure.
01:35:20.380 | The compute that comes and transforms the data in sort of the raw format, the compute turns
01:35:27.380 | it into weights.
01:35:29.380 | The weights are basically, if you take these billions of parameters, the weights are the
01:35:34.380 | sort of like matrix structure of how this data looks like when you really find relationships
01:35:41.380 | in the raw data.
01:35:42.380 | So...
01:35:43.380 | All right.
01:35:44.380 | And then there are these biases, these tiny shifts that are kind of like trying to like
01:35:50.380 | in a robust way adapt to this model so that it doesn't break apart but still is...
01:35:57.380 | Still is very well reflecting the reality.
01:36:00.380 | So something is missing.
01:36:02.380 | So when we take weights and biases and we apply scaling laws and we keep adding more data, more
01:36:07.380 | compute, we kind of get a better and better and better understanding of the reality.
01:36:12.380 | In a sense, if we had infinite data, we wouldn't have any biases.
01:36:18.380 | And this understanding is, again, the principle of this track, of GraphRAG.
01:36:27.380 | The disappearance of biases is what we are looking for when we are scaling our models.
01:36:34.380 | So in a sense, the amount of memory and compute should be exactly the same.
01:36:40.380 | It's just slightly expressed in a different way.
01:36:44.380 | But if there are some...
01:36:46.380 | There are any imbalances, then something important happens.
01:36:53.380 | And I came to another conclusion that our universe is basically a network database.
01:36:59.380 | It has a graph structure and it's a temporal structure.
01:37:03.380 | So it keeps on moving, following some certain principles and rules.
01:37:09.380 | And these principles and rules are not necessarily fuzzy.
01:37:16.380 | They have to be fuzzy because otherwise everything would be completely predictable.
01:37:24.380 | But if it would be completely predictable, it means that me, myself, would know everything
01:37:30.380 | about every single one of you, about myself from the past and myself from the future.
01:37:35.380 | So in a sense, it's impossible.
01:37:37.380 | And that's why we have this sort of like heat diffusion entropy models.
01:37:43.380 | They allow us to exist.
01:37:45.380 | But something is preserved.
01:37:48.380 | Any single asymmetry that happens at the quantum level, any single tiny asymmetry that happens
01:38:05.380 | preserves causal links.
01:38:08.380 | And these causal links are the exact thing that I would like you to have as a takeaway from this workshop.
01:38:20.380 | The difference between simple RAG, hybrid RAG, any type of RAG, and GraphRAG
01:38:27.380 | is that we have the ability to keep these causal links in our memory systems.
01:38:36.380 | Basically, the relationships are what preserves causality.
01:38:41.380 | That's why we can solve hallucinations.
01:38:46.380 | That's why we can optimize hypothesis generation and testing.
01:38:56.380 | So we will be able to do amazing research in biosciences, chemical sciences,
01:39:04.380 | just because of understanding that this causality is preserved within the relationships.
01:39:11.380 | And these relationships, when there are these asymmetries that are needed,
01:39:16.380 | they kind of create this curvature, I would say.
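The point about causal links can be made concrete with a toy sketch (an editorial illustration, not code from the talk): a memory that stores explicit relationships can walk a causal chain, which a flat pile of similar-looking facts cannot.

```python
# Toy graph memory: relationships preserve causality, so a chain of
# "caused_by" links can be recovered by traversal.
from collections import defaultdict

class TinyGraphMemory:
    def __init__(self):
        self.edges = defaultdict(list)  # subject -> [(relation, object)]

    def add(self, subject, relation, obj):
        self.edges[subject].append((relation, obj))

    def causal_chain(self, start, relation):
        """Follow one relation type transitively from `start`."""
        chain, node = [start], start
        while True:
            nxt = [o for r, o in self.edges[node] if r == relation]
            if not nxt:
                return chain
            node = nxt[0]
            chain.append(node)

memory = TinyGraphMemory()
memory.add("bug_report", "caused_by", "null_pointer")
memory.add("null_pointer", "caused_by", "missing_check")

print(memory.causal_chain("bug_report", "caused_by"))
# ['bug_report', 'null_pointer', 'missing_check']
```

A vector store holding the same three facts as unrelated strings could retrieve each one individually, but the two-hop chain only falls out because the links are explicit.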
01:39:21.380 | So we intuitively feel it. Every single one of you is choosing some specific workshops and talks that you go to.
01:39:31.380 | Right now, all of you are attending to the talk and workshop that we are giving.
01:39:38.380 | It means that it matters to you.
01:39:41.380 | And it means that potentially you see value.
01:39:46.380 | And this value, this information, is transcended through space and time.
01:39:51.380 | It's very subjective to you or any other object.
01:39:56.380 | And I think we really need to understand this.
01:40:04.380 | So LLMs are basically these weights and biases or correlations.
01:40:09.380 | They give us this opportunity to be fuzzy.
01:40:12.380 | You know, actually one thing that I learned from Wojciech some 10 or 11 years ago was that hallucinations are exactly the necessary thing to be able to solve a problem where you have too little memory or too little compute for the combinatorial space of the problem you are solving.
01:40:33.380 | So you're basically imagining.
01:40:36.380 | We are taking some hypothesis based on your history and you are kind of trying to project it into the future.
01:40:42.380 | But you have too little memory, too little compute to do that.
01:40:45.380 | So you can be as good as the amount of memory and compute you have.
01:40:48.380 | So it means that the missing part is something that you kind of can curve thanks to all of these causal relationships and this fuzziness.
01:41:01.380 | And reasoning is reading of these asymmetries and the causal links.
01:41:13.380 | Hence, I really believe that agentic systems are sort of the next big thing right now because they are following the network database principle.
01:41:29.380 | But to be causal, to recover this causality from our fuzziness, we need graph databases.
01:41:39.380 | We need causal relationships.
01:41:41.380 | And that's the major thing in this emerging trend of GraphRAG that we are here to talk about.
01:41:51.380 | And I would like to, at this moment, invite on stage our three amazing guest speakers.
01:42:03.380 | And I would like to start with Vasilije.
01:42:06.380 | Vasilije, please come over to the stage.
01:42:06.380 | Next will be Alex and Daniel.
01:42:11.380 | And I will present something myself.
01:42:14.380 | All right.
01:42:15.380 | So Vasilije will show us how to search and optimize memory based on the use case at hand.
01:42:24.380 | All right.
01:42:33.380 | So let's just make sure this works.
01:42:50.380 | Nice to meet you all.
01:42:51.380 | And I'm Vasilije.
01:42:52.380 | I'm originally from Montenegro, a small country in the Balkans.
01:42:55.380 | Beautiful.
01:42:56.380 | So if you want to go there, my cousins Igor and Miloš are going to welcome you.
01:43:00.380 | Everyone knows everyone.
01:43:01.380 | So, you know, in case you're just curious about memory, I'm building a memory tool on top
01:43:07.380 | of graph and vector databases.
01:43:10.380 | My background's in business, big data engineering and clinical psychology.
01:43:14.380 | So a lot what Mark talked about kind of connects to that.
01:43:18.380 | I'm going to show you a small demo here.
01:43:20.380 | The demo is to do a Mexican standoff between two developers where we are analyzing their
01:43:25.380 | GitHub repositories.
01:43:26.380 | And the data from these GitHub repositories is in the graph.
01:43:31.380 | And this Mexican standoff means that we will let the crew of agents go analyze, look
01:43:37.380 | at their data and try to compare them against each other and give us a result that should
01:43:41.380 | represent who we should hire, let's say, ideally out of these two people.
01:43:46.380 | So what we're seeing here currently is how Cognify works in the background.
01:43:50.380 | So Cognify is working by adding some data, turning that into a semantic graph, and then we can
01:43:56.380 | search it with a wide variety of options.
01:43:58.380 | We plugged in CrewAI on top of it so we can pretty much do this on the fly.
01:44:01.380 | So here in the background I have a client running.
01:44:04.380 | This client is connected to the system.
01:44:07.380 | So it's now currently searching the data sets and starting to build the graphs.
01:44:13.380 | So let's see.
01:44:14.380 | It takes a couple of seconds.
01:44:16.380 | But in the background we are effectively ingesting the GitHub data from the GitHub API,
01:44:23.380 | building the semantic structure, and then letting the agents actually search it and make decisions on top of it.
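As a rough sketch of the ingest, cognify, search loop described here (a self-contained toy for illustration, not the actual Cognee API; the real library's entity extraction and graph building are far richer):

```python
# Toy pipeline: "cognify" raw documents into a co-occurrence graph of
# entities, then let a query walk the graph's direct neighbors.
import re
from itertools import combinations

class ToyCognify:
    def __init__(self):
        self.graph = set()  # (entity, "co_occurs_with", entity) triples

    def add_and_cognify(self, documents):
        # Naive extraction: capitalized tokens become entities, and
        # entities in the same document get linked.
        for doc in documents:
            entities = set(re.findall(r"\b[A-Z][a-zA-Z]+\b", doc))
            for a, b in combinations(sorted(entities), 2):
                self.graph.add((a, "co_occurs_with", b))

    def search(self, entity):
        # Everything directly connected to the entity.
        return sorted({b for a, _, b in self.graph if a == entity}
                      | {a for a, _, b in self.graph if b == entity})

mem = ToyCognify()
mem.add_and_cognify([
    "Laszlo contributed commits to RepoA",
    "Mira reviewed RepoA and RepoB",
])
print(mem.search("RepoA"))  # ['Laszlo', 'Mira', 'RepoB']
```

The agents in the demo then run searches like this over the real semantic graph and score the results against preconfigured benchmarks.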
01:44:29.380 | So every time with live demos things might go wrong.
01:44:33.380 | So I have a video version in case this does.
01:44:36.380 | Let's see.
01:44:41.380 | And I'll switch to the video.
01:44:42.380 | Oh, here we go.
01:44:43.380 | So the semantic graph starts generating.
01:44:45.380 | And as you can see we have activity log where the graph is being continuously updated on the fly.
01:44:51.380 | Data is being stored in memory.
01:44:53.380 | And then data is being enriched and the agents are going and making decisions on top.
01:44:58.380 | So what you can see here on the side is effectively the agentic logic that is reading, writing, analyzing,
01:45:06.380 | and using all of this, let's say, preconfigured set of weights and benchmarks to analyze any person here.
01:45:13.380 | So Cognee is a framework that's modular.
01:45:15.380 | You can build these tasks.
01:45:16.380 | You can ingest from any type of a data source.
01:45:18.380 | 30 plus data sources supported now.
01:45:20.380 | You can build any type of a custom graph.
01:45:22.380 | You can build graphs from relational databases, semi-structured data.
01:45:25.380 | And we also have these memory association layers inspired by the cognitive science approach.
01:45:30.380 | And then effectively, as we kind of build and enrich this graph on the fly, we see that, you know, it's getting bigger.
01:45:38.380 | It's getting more populated.
01:45:39.380 | And then we're storing the data back into the graph.
01:45:41.380 | So this is the stateful temporal aspect of it.
01:45:45.380 | We kind of build the graph in a way that we can add the data back, that we can analyze these reports,
01:45:50.380 | that we can search them, and that we can let other agents access them on the fly.
01:45:54.380 | The idea for us was let's have a place where agents can write and continuously add the data in.
01:45:59.380 | So I'll have a look at the graph now so we can inspect the bits.
01:46:04.380 | So if we click on any node, we can see the details about the commits, about the information from the developers, the PRs, whatever they did in the past, and which report they contributed to.
01:46:17.380 | And then at the end, as the graph is pretty much filled, we will see the final report kind of starting to come in.
01:46:24.380 | So let's see how far we got with this.
01:46:26.380 | So it's taking, it's preparing now the final output for the hiring decision task.
01:46:32.380 | So let's have a look at that when it gets loaded.
01:46:35.380 | We just finished this this morning.
01:46:38.380 | I hope to have a hosted version for you all today, but it didn't work.
01:46:42.380 | because AI is causing some trouble.
01:46:44.380 | So let's, we have to resolve this one.
01:46:48.380 | So let's see.
01:46:51.380 | Yes, so I will just show you the video with the end so we don't wait for it.
01:47:04.380 | So here you can see that towards the end, we can see the graph.
01:47:11.380 | And we can see the final decision, which is a green node.
01:47:18.380 | And in the green node, we can see that we decided to hire Laszlo, our developer, who has a PhD in graphs.
01:47:25.380 | So it's not really difficult to make that call.
01:47:27.380 | And we see why and we see the numbers and the benchmarks.
01:47:31.380 | So thank you.
01:47:32.380 | This has been very fast three minute demo.
01:47:34.380 | So hope you enjoyed.
01:47:35.380 | And if you have some questions, I'm here afterwards.
01:47:37.380 | We have, we are open source.
01:47:38.380 | So happy to see new users.
01:47:40.380 | And if you're interested, try it.
01:47:41.380 | Thanks.
01:47:42.380 | Woohoo!
01:47:43.380 | Thank you.
01:47:46.380 | Thank you, Vasilije.
01:47:47.380 | Next up is Alex.
01:47:49.380 | So Vasilije showed us something I call semantic memory.
01:47:54.380 | So basically you take your data, you load it and cognify it, as they like to say.
01:48:01.380 | Come on, come on up, Alex.
01:48:03.380 | And that's the base.
01:48:07.380 | That's something we already are doing.
01:48:10.380 | I think.
01:48:11.380 | And next up, Alex will show us the Neo4j MCP server.
01:48:17.380 | The stage is yours.
01:48:19.380 | Okay.
01:48:20.380 | Yeah.
01:48:21.380 | I'm going to go ahead.
01:48:22.380 | Okay.
01:48:22.380 | Okay.
01:48:23.380 | So, hi everyone.
01:48:24.380 | My name's Alex.
01:48:25.380 | I'm an AI architect at Neo4j.
01:48:26.380 | I'm going to demo the memory MCP server that we have available.
01:48:33.380 | So there is this walkthrough document that I have.
01:48:34.380 | We'll make this available in the Slack or by some means so that you can do this on your own.
01:48:40.380 | But it's pretty simple to set up.
01:48:41.380 | And what we're going to showcase today is really like the foundational functionality that we would
01:48:48.380 | like to see in an agentic memory sort of application.
01:48:51.380 | Primarily we're going to take a look at semantic memory in this MCP server.
01:49:06.380 | And we're going to add additional memory types as well, which we'll discuss probably
01:49:15.380 | later on in the presentation.
01:49:17.380 | So, in order to do this, we will need a Neo4j database.
01:49:22.380 | Neo4j is a graph native database that we'll be using to store our knowledge graph that we're
01:49:26.380 | creating.
01:49:27.380 | They have an Aura option which is hosted in the cloud.
01:49:31.380 | Or we can just do this locally with the Neo4j desktop app.
01:49:35.380 | Additionally, we're going to do this via Claude Desktop.
01:49:38.380 | And so, we just need to download that.
01:49:41.380 | And then we can just add this config to the MCP configuration file in Claude.
01:49:47.380 | And this will just connect to the Neo4j instance that you create.
01:49:50.380 | And what's happening here is we're going to,
01:49:53.380 | Claude will pull down the memory server from PyPI.
01:49:57.380 | And it will host it in the back end for us.
01:49:59.380 | And then it will be able to use the tools that are accessible via the MCP server.
01:50:03.380 | And the final thing that we're going to do before we can actually have the conversation
01:50:07.380 | is we're just going to use this brief system prompt.
01:50:09.380 | And what this does is just ensure that we are properly recalling and then logging memories
01:50:15.380 | after each interaction that we have.
01:50:16.380 | So, with that, we can take a look at a conversation that I had in Claude Desktop using this memory server.
01:50:25.380 | And so, this is a conversation about starting an agentic AI memory company.
01:50:31.380 | And so, we can see all these tool calls here.
01:50:34.380 | And so, initially, we have nothing in our memory store, which is as expected.
01:50:38.380 | But as we kind of progress through this conversation, we can see that at each interaction,
01:50:44.380 | it tries to recall memories that are related to the user prompt.
01:50:48.380 | And then, at the end of this interaction, it will create new entities in our knowledge graph and relationships.
01:50:56.380 | And so, in this case, an entity is going to have a name, a type, and then a list of observations.
01:51:03.380 | And these are just facts that we know about this entity.
01:51:05.380 | And this is what is going to be updated as we learn more.
01:51:10.380 | In terms of the relationships, these are just identifying how these entities relate to one another.
01:51:16.380 | And this is really the core piece of why using a graph database as sort of the context layer here is so important.
01:51:23.380 | Because we can identify how these entities are actually related to each other.
01:51:28.380 | It provides a very rich context.
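A minimal in-memory sketch of the data model just described: entities with a name, a type, and a list of observations, plus typed relationships. The method names echo the MCP memory-server tool style, but this stand-in is illustrative, not the server's actual implementation.

```python
# Toy context layer: entities carry observations (facts), and typed
# relations record how entities connect, mirroring the model above.
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str
    type: str
    observations: list = field(default_factory=list)

class MemoryStore:
    def __init__(self):
        self.entities = {}
        self.relations = []  # (source, relation, target)

    def create_entity(self, name, type_, observations=()):
        self.entities[name] = Entity(name, type_, list(observations))

    def add_observation(self, name, fact):
        self.entities[name].observations.append(fact)

    def create_relation(self, source, relation, target):
        self.relations.append((source, relation, target))

    def read_graph(self):
        # Read the whole graph back, as in the "review what we have" step.
        return {"entities": list(self.entities.values()),
                "relations": self.relations}

store = MemoryStore()
store.create_entity("Neo4j", "Technology", ["graph native database"])
store.create_entity("MemoryCo", "Company")
store.create_relation("MemoryCo", "USES", "Neo4j")
store.add_observation("Neo4j", "stores the knowledge graph")

graph = store.read_graph()
print(len(graph["entities"]), graph["relations"])
# 2 [('MemoryCo', 'USES', 'Neo4j')]
```

"MemoryCo" and the observation strings are invented for the example; in the demo, Claude fills these in from the conversation after each interaction.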
01:51:31.380 | And so, as this goes on, we can see that we have quite a few interactions.
01:51:35.380 | We are adding observations, creating more entities.
01:51:38.380 | And at the very end here, we can see we have quite a lengthy conversation.
01:51:43.380 | We can say, you know, let's review what we have so far.
01:51:46.380 | And so, we can read the entire knowledge graph back as context.
01:51:50.380 | And Claude can then summarize that for us.
01:51:52.380 | And so, we have all the entities we've found, all the relationships that we've identified,
01:51:56.380 | and all the facts that we know about these entities based on our conversation.
01:52:00.380 | And so, this provides a nice review of what we've discussed about this company and our ideas about how to create it.
01:52:07.380 | Now, we can also go into Neo4j browser.
01:52:10.380 | This is available both in Aura and local.
01:52:12.380 | And we can actually visualize this knowledge graph.
01:52:14.380 | And we can see that we discussed Neo4j.
01:52:16.380 | We discussed MCP and LangGraph.
01:52:18.380 | And if we click on one of these nodes, we can see that there is a list of observations that we have.
01:52:23.380 | And this is all the information that we've tracked throughout that conversation.
01:52:27.380 | And so, it's important to know that, like, even though this knowledge graph was created with a single conversation,
01:52:32.380 | we can also take this and use it in additional conversations.
01:52:35.380 | We can use this knowledge graph with other clients such as cursor IDE or Windsurf.
01:52:40.380 | And so, this is really a powerful way to create a, like, memory layer for all of your applications.
01:52:49.380 | And so, with that, I'll pass it on.
01:52:53.380 | Thank you.
01:52:54.380 | All right.
01:52:55.380 | Give a round of applause to Alex.
01:52:57.380 | Thank you, Alex.
01:53:01.380 | The next up is Daniel.
01:53:03.380 | I'll just share my personal beliefs about MCPs.
01:53:08.380 | I was testing the MCPs of Neo4j, Graphiti, Cognee, and Mem0 just before the workshop.
01:53:16.380 | And I'm a strong believer that this is our future.
01:53:19.380 | We'll have to work on that.
01:53:21.380 | And in a second, I will be showing a mini GraphRAG chat arena.
01:53:26.380 | And next up, something very, very important that Daniel does is temporal graphs.
01:53:31.380 | Daniel is co-founder of Graphiti and Zep.
01:53:34.380 | They have 10,000 stars on GitHub and growing very fast.
01:53:38.380 | The stage is here, Daniel.
01:53:39.380 | Please show us what you do.
01:53:41.380 | Thank you.
01:53:42.380 | So, five, four, three, two, one.
01:53:47.380 | Did that work?
01:53:49.380 | It seems to have, right?
01:53:52.380 | So, I'm here today to tell you that there's no one-size-fits-all memory.
01:54:02.380 | And why you need to model your memory after your business domain.
01:54:07.380 | So, if you saw me a little bit earlier, and I was talking about Graphiti, Zep's open-source temporal graph framework, you might have seen me just speak to how you can build custom entities and edges in the Graphiti graph for your particular business domain.
01:54:27.380 | So, business objects from your business domain.
01:54:31.380 | What I'm going to demo today is actually how Zep implements that and how easy it is to use from Python, TypeScript, or Go.
01:54:39.380 | And what we've done here is we've solved the fundamental problem plaguing memory.
01:54:44.380 | And we're enabling developers to build out memory that is far more cogent and capable for many different use cases.
01:54:57.380 | So, I'm going to just show you a quick example of where things go really wrong.
01:55:03.380 | So, many of you might have used ChatGPT before.
01:55:06.380 | It generates facts about you in memory.
01:55:09.380 | And you might have noticed that it really struggles with relevance.
01:55:13.380 | Sometimes it just pulls out all sorts of arbitrary facts about you.
01:55:17.380 | And, unfortunately, when you store arbitrary facts and retrieve them as memory, you get inaccurate responses or hallucinations.
01:55:26.380 | And the same problem happens when you're building your own agents.
01:55:30.380 | So, here we go.
01:55:32.380 | We have an example media assistant.
01:55:35.380 | And it should remember things about jazz music, NPR, podcasts, the daily, et cetera.
01:55:41.380 | All the things that I like to listen to.
01:55:43.380 | But, unfortunately, because I'm in conversation with the agent or it's picking up my voice when I'm, you know, it's a voice agent.
01:55:50.380 | It's learning all sorts of irrelevant things.
01:55:53.380 | Like, I wake up at 7:00 a.m., my dog's name is Melody, et cetera.
01:55:57.380 | And the point here is that irrelevant facts pollute memory.
01:56:04.380 | They're not specific to the media player business domain.
01:56:08.380 | And so, the technical reality here is, as well, that many frameworks take this really simplistic approach to generating facts.
01:56:17.380 | If you're using a framework that has memory capabilities, agent framework, it's generating facts and throwing it into a vector database.
01:56:24.380 | And, unfortunately, the facts dumped into the vector database or Redis mean that when you're recalling that memory, it's difficult to differentiate what should be returned.
01:56:33.380 | We're going to return what is semantically similar.
01:56:37.380 | And here we have a bunch of facts that are semantically similar to my request for my favorite tunes.
01:56:45.380 | We have some good things and, unfortunately, Melody is there as well because Melody is a dog named Melody.
01:56:52.380 | And that might be something to do with tunes.
01:56:54.380 | And so, a bunch of irrelevant stuff.
01:57:01.380 | So, basically, semantic similarity is not business relevance.
01:57:08.380 | And this is not unexpected.
01:57:10.380 | I was speaking a little bit earlier about how vectors are just basically projections into an embedding space.
01:57:17.380 | There's no causal or relational relations between them.
01:57:23.380 | And so, we need a solution.
01:57:25.380 | We need domain aware memory, not better semantic search.
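A toy contrast of the two retrieval modes (crude token overlap stands in for embedding similarity, and the fact texts are invented for illustration): similarity alone lets the dog fact leak into a music query, while a domain type filter keeps recall on-topic.

```python
# Facts tagged with domain entity types. Similarity-only recall pulls in
# whatever shares surface vocabulary; filtering by type restores relevance.
FACTS = [
    {"text": "user likes jazz tunes on NPR", "type": "MediaPreference"},
    {"text": "user's dog is named Melody, loves tunes", "type": "Pet"},
    {"text": "user wakes at 7am", "type": "Routine"},
]

def similarity(query, text):
    # Stand-in for cosine similarity over embeddings.
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t)

def recall(query, allowed_types=None):
    hits = [f for f in FACTS
            if allowed_types is None or f["type"] in allowed_types]
    hits = [f for f in hits if similarity(query, f["text"]) > 0]
    return [f["text"] for f in sorted(
        hits, key=lambda f: -similarity(query, f["text"]))]

query = "favorite tunes"
print(recall(query))                       # the dog fact leaks in
print(recall(query, {"MediaPreference"}))  # the domain filter removes it
```

The metric here is deliberately crude; the point is that no amount of better scoring fixes the problem, because "Melody loves tunes" genuinely is similar to the query. Only the domain schema knows it is irrelevant.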
01:57:29.380 | So, with that, I am going to, unfortunately, be showing you a video because the Wi-Fi has been absolutely terrible.
01:57:40.380 | And let me bring up the video.
01:57:49.380 | Okay.
01:57:50.380 | So, I built a little application here.
01:57:54.380 | And it is a finance coach.
01:57:58.380 | And I've told it I want to buy a house.
01:58:01.380 | And it's asking me, well, how much do I earn a year?
01:58:07.380 | It's asking me about what student loan debt I might have.
01:58:10.380 | And we'll see that on the right-hand side, what is stored in Zep's memory are some very explicit business objects.
01:58:24.380 | We have financial goals, debts, income sources, et cetera.
01:58:30.380 | These are defined by the developer.
01:58:33.380 | And they're defined in a way which is really simple to understand.
01:58:38.380 | We can use Pydantic or Zod or Go structs.
01:58:43.380 | And we can apply business rules.
01:58:45.380 | So, let's go take a look at some of the code here.
01:58:48.380 | We have a TypeScript financial goal schema using Zepp's underlying SDK.
01:58:54.380 | We can define these entity types.
01:58:57.380 | We can give a description to the entity type.
01:58:59.380 | We can even define fields, the business rules for those fields and the values that they take on.
01:59:05.380 | And then we can build tools for our agent to retrieve a financial snapshot, which runs multiple Zep searches concurrently and filters by specific node types.
01:59:19.380 | And when we start our Zep application, what we're going to do is we're going to register these particular goals, sorry, objects with Zep.
01:59:32.380 | So, it knows to build this ontology in the graph.
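In the same spirit, a hedged sketch of defining typed entities and registering them as an ontology, using plain dataclasses rather than Zep's actual Pydantic/Zod/struct SDK; the class names, fields, and the registration helper are illustrative, not Zep's API.

```python
# Domain entity schemas with a simple business rule, plus a toy
# "register the ontology" step so the memory layer knows which
# node types to extract from conversation.
from dataclasses import dataclass

@dataclass
class FinancialGoal:
    """A goal the user is saving toward."""
    description: str
    target_amount: float  # business rule: must be positive

    def __post_init__(self):
        if self.target_amount <= 0:
            raise ValueError("target_amount must be positive")

@dataclass
class DebtAccount:
    """A recurring or outstanding debt."""
    description: str
    monthly_payment: float

ONTOLOGY = {}

def register_entity_types(*types):
    # Registration tells the memory layer which node types exist.
    for t in types:
        ONTOLOGY[t.__name__] = t

register_entity_types(FinancialGoal, DebtAccount)
goal = FinancialGoal("buy a house", 50_000)
print(sorted(ONTOLOGY), goal.target_amount)
# ['DebtAccount', 'FinancialGoal'] 50000
```

With the ontology registered, a search can then be restricted to specific node types, which is exactly the filtering the demo shows next.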
01:59:37.380 | So, let's do a quick little addition here.
01:59:40.380 | I'm going to say that I have $5,000 a month rent.
01:59:47.380 | I think it's rent.
01:59:50.380 | And in a few seconds, we see that Zep's already parsed that new message and has captured that $5,000.
01:59:59.380 | So, we can go look at the graph.
02:00:01.380 | This is the Zep front end.
02:00:03.380 | And we can see the knowledge graph for this user has got a debt account entity.
02:00:10.380 | It's got fields on it that we've defined as a developer.
02:00:15.380 | And so, again, we can get really precise about what we retrieve from Zep by filtering.
02:00:20.380 | Okay.
02:00:21.380 | So, we're at time.
02:00:22.380 | So, just very quickly, we wrote a paper about how all of this works.
02:00:26.380 | You can get to it by that link below and appreciate your time today.
02:00:33.380 | You can look me up afterwards.
02:00:35.380 | Great paper, really.
02:00:38.380 | Thank you.
02:00:39.380 | All right.
02:00:40.380 | So, once I'm getting ready, I would appreciate if you confirm with me whether you have access
02:00:46.380 | to Slack.
02:00:47.380 | Is the Slack working for you?
02:00:49.380 | The Slack channel?
02:00:50.380 | All right.
02:00:51.380 | I think we are slowly running out of time.
02:00:52.380 | So, I'd appreciate if you have any questions to any of the speakers.
02:00:57.380 | Please write these questions on Slack.
02:00:59.380 | And we will be outside of this room.
02:01:02.380 | And we are happy to answer more of these questions just after the workshop.
02:01:06.380 | I'll now move on to a use case that I developed, and to this GraphRAG chat arena.
02:01:15.380 | To be specific: before delving into agentic memory and knowledge graphs, I led a private cybersecurity lab
02:01:33.380 | and worked for defense clients.
02:01:35.380 | A very big client with very serious problems on the security side.
02:01:41.380 | In one project, I had to navigate between something like 27 or 29 different terminals.
02:01:51.380 | And I realized that LLMs are not only amazing at translating between these languages, but they are also very good at creating a new type of shell, a human language shell.
02:02:25.060 | There are such shells.
02:02:26.920 | But such shells, they would really be excellent
02:02:31.600 | if they have episodic memory, the sort of temporal memory
02:02:36.600 | of what was happening in this shell historically.
02:02:39.840 | And if we have access to this temporal history, the events,
02:02:44.940 | we kind of know what the users were doing,
02:02:47.520 | what their behaviors are.
02:02:49.260 | We kind of can control every single code execution function
02:02:52.680 | that's running, including the ones of agents.
02:02:55.820 | So I spotted with some investors and advisors of mine,
02:03:00.240 | I spotted a niche, something we call agentic firewall.
02:03:04.940 | And I wanted to do a super quick demo of how it would work.
02:03:08.540 | So basically you would run commands and type pwd,
02:03:16.520 | and in a sense, I suppose lots of us had computer science classes
02:03:22.120 | or we worked in shell.
02:03:24.140 | And we have to remember all of these commands.
02:03:26.900 | Like, show me running Docker containers.
02:03:32.120 | Like, it's Docker PS, right?
02:03:33.620 | But if you go for more advanced commands...
02:03:37.080 | I think it's for a reason, yeah.
02:03:41.700 | I think it's for a reason.
02:03:43.160 | Oh, let me see, one second.
02:03:45.960 | Sorry about that.
02:03:48.960 | All right, it's there.
02:03:50.300 | Okay.
02:03:51.300 | Thank you.
02:03:51.300 | In general, I would need to know right now some command that can extract me,
02:04:09.900 | for instance, the name of the container that's running and its status.
02:04:14.180 | Show me just image and status.
02:04:19.200 | I can make mistakes, like human language, fuzzy mistakes.
02:04:23.620 | Show if Apache is running.
02:04:28.540 | All right, show the command we did three commands ago.
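The demo above can be caricatured in a few lines: a tiny phrase table stands in for the LLM translation step, and a deque provides the episodic history that "show the command we did three commands ago" needs. Everything here, the phrase mappings and the digit-based parsing, is illustrative, not the actual product.

```python
from collections import deque

# Toy "human language shell" with episodic memory. A real system
# would call an LLM to translate; a phrase table stands in here.
PHRASES = {
    "show me running docker containers": "docker ps",
    "show if apache is running": "docker ps --filter name=apache",
}

history = deque()  # episodic memory: every command actually run

def translate(request: str) -> str:
    request = request.lower().strip()
    if request.startswith("show the command we did") and "ago" in request:
        # Recall from episodic memory, e.g. "... 3 commands ago"
        # (this sketch expects a digit rather than the word "three").
        n = int(next(w for w in request.split() if w.isdigit()))
        return history[-n]
    cmd = PHRASES.get(request, request)  # fall back to raw shell input
    history.append(cmd)
    return cmd

translate("show me running docker containers")
translate("ls")
translate("pwd")
print(translate("show the command we did 3 commands ago"))  # prints "docker ps"
```

The episodic log is what makes recall queries possible at all; it is also exactly the audit trail the agentic-firewall idea later relies on.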
02:04:39.760 | So basically, if you plug in the agentic memory to things like that -- I think it got it wrong,
02:04:52.580 | but you get me, right?
02:04:54.520 | So if I get through, like, different shells and terminals, and I have this textual context
02:05:01.600 | of what was done, and the context of the certain machine of what is happening here, and it kind
02:05:10.020 | of spans across all the machines, all the users, and all the sessions in PTYs, TTYs, I think
02:05:17.360 | that we can really have a very good context also for security.
02:05:22.660 | So that space, the temporal logs, the episodic logs, is something that I see will boom and
02:05:28.840 | emerge.
02:05:30.020 | So I believe that all of our agents that will be executing codes in terminals will be executing
02:05:38.620 | it through, maybe not all, but the ones that are running on the enterprise gate.
02:05:44.540 | They will be going through agentic firewalls.
02:05:47.760 | I'm close to sure about that.
02:05:50.940 | So that's my use case.
02:05:53.400 | And now let's move on to the GraphRAG chat arena.
02:05:57.020 | So you have on Slack a link to this doc.
02:06:02.400 | And this doc is allowing you to set up a repo that we've created for this workshop.
02:06:08.320 | And we'll be promoting it afterwards.
02:06:10.880 | So about a year ago, I met with Jerry Liu from LlamaIndex, and we were chatting quite a while
02:06:16.280 | about how to evolve this conversational memory.
02:06:20.760 | And he gave me two pieces of advice.
02:06:22.340 | One of them, think about data abstractions.
02:06:25.180 | The other, think about evals.
02:06:27.340 | Data abstractions, I kind of quickly solved within, like, two months.
02:06:30.960 | Evals, I realized that there wouldn't be any evals in form of a benchmark.
02:06:36.540 | All of these HotpotQA-style benchmarks and all of that, it's fun.
02:06:39.040 | I know that there are great papers written by our guest speakers and other folks about HotpotQA.
02:06:45.100 | But it's not the thing.
02:06:46.420 | You can't do a benchmark for a thing that doesn't exist.
02:06:50.060 | Basically, the agentic GraphRAG memory will be this type of memory that evolves.
02:06:57.380 | So you don't know what will evolve.
02:06:59.300 | So if you don't know what will evolve, you will need a simulation arena.
02:07:02.640 | And that will be the only right eval.
02:07:08.420 | So one year, fast forward, and we've created a prototype of such agentic memory arena.
02:07:15.840 | Think about it like WebArena, but for memory.
02:07:18.200 | And let me quickly show you that.
02:07:20.320 | You can go to this repository.
02:07:22.020 | I did a fork of that.
02:07:23.760 | There is Mem0.
02:07:24.760 | There is Graphiti.
02:07:25.760 | There is Cognee.
02:07:28.500 | And there will be two approaches.
02:07:29.840 | One approach will be sort of the repo, the library itself, and the other is through MCPs.
02:07:37.820 | Because we don't really know what will work out better.
02:07:40.620 | So whether repos or the MCPs will work out better.
02:07:42.920 | So we need to test these different approaches.
02:07:45.260 | We need to create this arena for that.
02:07:48.220 | So we basically cloned that repo.
02:07:50.340 | And we use ADK for that.
02:07:53.980 | So we get this nice chat where you can talk to these agents.
02:08:00.200 | And you can switch between agents.
02:08:03.080 | So I want to talk with Neo.
02:08:05.160 | And there is a Neo4j agent running behind the scenes.
02:08:08.720 | There is a Cypher graph agent running behind the scenes.
02:08:12.680 | And I can kind of, for now, switch between these agents.
02:08:16.060 | Maybe I'll increase the font size a little bit.
02:08:18.440 | So the Neo agent is basically answering the questions about this amazing technology, graphs, specifically Neo4j.
02:08:26.160 | And I can switch to Cypher.
02:08:27.160 | And then an agent that is excellent at running Cypher queries talks with me.
02:08:35.400 | And I'm writing: add to graph that Mark
02:08:40.360 | is passionate about memory architectures.
02:08:42.580 | And basically, what it does is it runs these layers that are created by Cognee, by Mem0, by Graphiti, and all the other vendors of semantic and temporal memory solutions.
02:08:56.240 | Or, specifically, created by an MCP server that Alex was demonstrating, the Neo4j MCP server.
02:09:05.720 | So I'm really looking forward to how this technology evolves.
02:09:11.780 | But what I really -- what I quickly wanted to show you is that it already works.
02:09:15.780 | It has the essence of being this agentic memory arena.
02:09:20.440 | So I can ask my graph through questions, and the agent goes to the connection.
02:09:27.180 | This is just one -- you know what's amazing?
02:09:29.660 | It's just one Neo4j graph.
02:09:33.160 | It's just one Neo4j graph on the backend, and all of these technologies that can be tested.
02:09:38.860 | How the graphs are being created and retrieved.
02:09:42.100 | It's like -- when I think of that, it's like the most brilliant idea that we can do with agentic memory simulations.
02:09:50.300 | So I get answers from the graph.
02:09:53.200 | Here is the graph.
02:09:54.700 | I can basically rerun the commands to see what's happening on this graph.
02:10:00.820 | And let me just move on.
02:10:04.140 | And the next thing is, I would like to add to the graph that Vasilije will show how to integrate
02:10:10.180 | Cognee, and so on; so I add new information.
02:10:13.660 | And the Cypher agent writes it to the graph.
02:10:16.880 | And then I want to do something else.
02:10:18.920 | It's super early stage still.
02:10:20.600 | But then I transfer it to Graphiti, and I can repeat the exact same process.
02:10:26.000 | So I can right now, using Graphiti, search what I just added.
02:10:30.780 | And I can switch between these different memory solutions.
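The backend-switching demonstrated here can be sketched as a common interface over swappable memory providers. This is a toy illustration: the provider names follow the talk, but the stub functions are not the real SDKs, and in the actual arena all providers write to a single Neo4j graph rather than separate stores.

```python
# Toy arena: one chat front end, swappable memory backends behind a
# shared add/search interface. Separate lists stand in for the real
# providers' storage, purely for illustration.
memories: dict[str, list[str]] = {"mem0": [], "graphiti": [], "cognee": []}
active = "mem0"

def add(fact: str) -> None:
    memories[active].append(fact)

def search(term: str) -> list[str]:
    return [f for f in memories[active] if term.lower() in f.lower()]

add("Mark is passionate about memory architectures")
active = "graphiti"   # switch backends; the interface stays the same
add("Mark is passionate about memory architectures")
print(search("memory"))
```

Because every backend sits behind the same interface, the arena can replay the same conversation through each provider and compare how the resulting graphs are created and retrieved.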
02:10:33.680 | So that's why I'm so excited about that.
02:10:36.320 | And we do not have time to, like, practice it together, do the workshop, but I'm sure we'll
02:10:41.760 | write some articles.
02:10:43.440 | So please follow us.
02:10:45.580 | And I would appreciate, if you have any questions, pass them on to Slack.
02:10:50.520 | I will ask Andreas whether we have time for a short Q&A or do we need to move it to, like,
02:10:57.600 | breakout or outside of the room.
02:10:59.600 | It could take, like, five minutes.
02:11:01.600 | Five minutes.
02:11:02.600 | All right.
02:11:03.600 | So that's all for now for today.
02:11:06.680 | I really would like Vasilije, Daniel, and Alex to come back to the stage.
02:11:18.240 | Please direct your questions to any of us.
02:11:18.240 | And we'll try to answer them.
02:11:19.880 | Yeah.
02:11:20.880 | Let's go.
02:11:22.880 | I'm Lucas.
02:11:23.880 | I want to ask a fundamental question.
02:11:25.760 | How do you decide what is a bad memory over time?
02:11:30.440 | Because you could, like, as a developer and as a person, we evolve the line of thought, right?
02:11:36.920 | So one thing that you thought was good, like, three years, ten years ago may not be good right
02:11:41.880 | today.
02:11:43.100 | So how do you decide?
02:11:44.380 | Sure.
02:11:45.380 | A very good question.
02:11:46.780 | So I will answer in -- maybe you guys can help.
02:11:50.680 | I will answer in a very scientific way.
02:11:53.260 | So basically the one that causes a lot of noise.
02:11:56.220 | The noisy one doesn't make a lot of sense.
02:11:59.260 | So you decrease noise by redundancy and by relationships.
02:12:05.160 | So the fewer the relationships and the more the noisiness -- in a sense, a not-well-connected node
02:12:13.740 | has the potential of not being correct.
02:12:16.840 | But there are other ways to validate that.
02:12:20.540 | Would you like to follow on?
02:12:21.840 | Yeah.
02:12:22.840 | Sure.
02:12:23.840 | A practical way.
02:12:24.840 | We will let you model the data with Pydantic so you can kind of load the data you need and
02:12:29.340 | add weights to the edges and nodes so you can do something like temporal weighting.
02:12:33.880 | You can add your custom logic and then effectively you would know how your data is kind of evolving
02:12:38.880 | in time and then how it's becoming less or more relevant and what is the set of algorithms
02:12:43.980 | you would need to apply.
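One common way to implement the temporal weighting described here is exponential decay on edge weights, so that older, unrefreshed facts fade in relevance. A minimal sketch, assuming a made-up half-life and edge list (not Cognee's actual API):

```python
import math

# Illustrative temporal decay for graph edge weights: an edge's
# relevance halves every `half_life_days` days since it was last seen.
def decayed_weight(base_weight: float, age_days: float,
                   half_life_days: float = 30.0) -> float:
    return base_weight * math.exp(-math.log(2) * age_days / half_life_days)

edges = [
    {"rel": "LIKES", "weight": 1.0, "age_days": 0},
    {"rel": "WORKED_ON", "weight": 1.0, "age_days": 30},
    {"rel": "MENTIONED", "weight": 1.0, "age_days": 90},
]
for e in edges:
    e["score"] = decayed_weight(e["weight"], e["age_days"])

# A 30-day-old edge keeps half its weight; a 90-day-old one an eighth.
print([round(e["score"], 3) for e in edges])  # [1.0, 0.5, 0.125]
```

Custom logic like this is what lets you see how data becomes more or less relevant over time, per the answer above.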
02:12:44.980 | So this is the idea.
02:12:45.980 | Not solve it for you, but let's help you solve the tooling.
02:12:48.980 | But, yeah, there is -- depends on the use case, I would say.
02:12:52.080 | Yeah.
02:12:53.080 | I have nothing to add.
02:12:54.080 | I think that's a great explanation.
02:12:55.080 | I think what I would add is that there are missing causal links.
02:12:59.280 | Missing causal links are most probably a good indicator of fuzziness.
02:13:06.080 | Yeah.
02:13:07.080 | Next question.
02:13:08.080 | Can you hear me?
02:13:11.180 | How would you embed in the security or privacy into the network or the application layer?
02:13:17.180 | If there's a corporate, they have top secret data, or I have personal data that is a graph.
02:13:21.180 | I want to share that, but not all of it.
02:13:24.180 | Oh, that's a really good one.
02:13:26.480 | I think I'll answer that very briefly.
02:13:29.880 | So basically, you do have to have that context.
02:13:32.600 | You do have to have these decisions, intentions of colonels, of majors, and anyone like in
02:13:38.700 | the enterprise -- like CISOs and anyone in the enterprise stack.
02:13:43.160 | And in a sense, it also gets kind of like fuzzy and complex, so I expect this to be a very
02:13:48.640 | big challenge.
02:13:49.640 | That's why I want to work on that.
02:13:50.640 | But I'm sure that applying ontologies, the right ontologies, first of all, to this enterprise
02:13:55.920 | cybersecurity stack really kind of provides these guardrails for navigating this challenging
02:14:03.580 | problem and decreasing this fuzziness and these errors.
02:14:06.480 | Thank you.
02:14:07.480 | Yeah.
02:14:08.480 | I would also just add, like, all these applications are built on Neo4j.
02:14:12.260 | And so in Neo4j, you can, like, do role-based access controls, and so you can prevent users
02:14:18.700 | from accessing data that they're not allowed to see.
02:14:21.100 | So it's something that you can configure with that.
02:14:22.960 | And one more thing.
02:14:24.960 | This question is for Mark.
02:14:25.960 | Yeah.
02:14:26.960 | Yeah.
02:14:27.960 | Go on.
02:14:30.960 | You were about to say something?
02:14:31.960 | Please go ahead first.
02:14:32.960 | Yeah.
02:14:33.960 | Just one thing.
02:14:34.960 | Kind of keep it, like, very physically separate.
02:14:36.920 | For us, it really works well.
02:14:39.040 | People react to that really well.
02:14:40.620 | So that's one way.
02:14:42.620 | Independent graphs.
02:14:43.620 | Personal graphs.
02:14:44.620 | Yeah.
02:14:45.620 | Mark, in your earlier presentation, you mentioned this equation relating gravity, entropy, and something,
02:14:51.060 | and also memory and compute.
02:14:53.820 | Could you show those two again and explain them again?
02:14:56.280 | Of course.
02:14:57.280 | Yeah.
02:14:58.280 | If we have time.
02:14:59.280 | Other than that, it's probably for a series of papers to properly explain that.
02:15:04.140 | So that's one.
02:15:05.140 | Memory times compute
02:15:06.140 | equals size squared.
02:15:07.320 | The other one is that if you take all the attention, diffusion, and VAEs, which are doing
02:15:11.380 | the smoothing, it preserves the sort of asymmetries.
02:15:15.520 | So very briefly speaking, let's set up the vocabulary.
02:15:18.580 | So first of all, curvature equals attention equals gravity.
02:15:22.800 | This is the very simple, most important principle here.
02:15:26.060 | I will need to, when writing these papers, we are really tightly trying to define these three.
02:15:30.640 | Next.
02:15:31.640 | Diffusion, heat, entropy.
02:15:32.640 | It's the exact same thing.
02:15:34.640 | We just need to align definitions.
02:15:36.640 | And if it's not the exact same thing, if there are other definitions, we need to show what's
02:15:40.560 | really different.
02:15:42.600 | And now, if you think about attention, it kind of shows the sort of like pathways towards certain
02:15:49.640 | asymmetries.
02:15:50.880 | If you take a sphere, if you start bending that sphere and make it like, you know, like
02:15:54.880 | you kind of try to extend it, two things happen.
02:15:59.440 | Entropy increases and curvature increases, in a sense.
02:16:03.080 | And Perelman, what he did, he proved that you can, like, bend these spheres in any way. 3D
02:16:07.880 | spheres -- 4D and 5D and higher-dimensional spheres were already solved.
02:16:12.880 | So he solved it for the 3D sphere.
02:16:14.880 | And these equations are proving that basically there won't be any other architectures for
02:16:19.880 | LLMs.
02:16:20.880 | It will be just attention, diffusion models, and VAEs.
02:16:22.880 | Maybe not just VAEs, but kind of like something that smooths, leaves room for biases.
02:16:29.880 | All right.
02:16:30.880 | Thank you all.
02:16:31.880 | I really appreciate you coming.
02:16:33.880 | I hope it was helpful.
02:16:34.880 | Thank you, the guest speakers.
02:16:37.880 | And we'll answer the questions outside of the room.
02:16:40.880 | I appreciate that.
02:16:41.880 | Thank you.
02:16:46.880 | We've got about maybe a 10-minute break before the next speaker is up.
02:16:50.880 | But we've got a bit of setup to do.
02:16:52.880 | So this is a great time to grab a coffee.
02:16:53.880 | Michael is going to be talking to us next.
02:16:56.880 | A practical graph rack, right?
02:16:59.880 | Yeah.
02:17:00.880 | Hi, if you are staying for the next session, I believe you have to go out and get your badge
02:17:04.880 | scanned because that's how they keep track of how many people are at each session.
02:17:09.880 | So that was the directive.
02:17:11.880 | Thank you.
02:17:12.880 | Thank you.
02:17:13.880 | Thank you.
02:17:14.880 | We'll hack later.
02:17:17.880 | Thanks.
02:17:18.880 | Thanks.
02:17:19.880 | Thank you.
02:17:21.880 | Thank you.
02:17:22.880 | Yeah.
02:17:23.880 | Thank you, everyone.
02:17:24.880 | We're closing out this room for a turnover.
02:17:27.880 | I'll be around.
02:17:28.880 | Appreciate you.
02:17:29.880 | You have a very nice presentation.
02:17:30.880 | I like your way of presenting from another perspective.
02:17:33.880 | Please go out and get your badge scanned if you're going to stay in the room.
02:17:36.880 | Let's head out.
02:17:37.880 | Thank you.
02:17:38.880 | Yeah.
02:17:39.880 | They need to get ready.
02:17:40.880 | So I will be there.
02:17:41.880 | Let's talk.
02:17:42.880 | I will be there in 30 seconds.
02:17:43.880 | Don't worry.
02:17:44.880 | in these directions.
02:17:45.880 | Am I a robot?
02:17:45.880 | Of course.
02:17:46.880 | Let's go out.
02:17:47.880 | Yes, we will.
02:17:48.880 | Go on.
02:17:49.880 | Love the physics connection you built there.
02:17:50.880 | Can I, do you write blogs about this?
02:17:51.880 | Can I discover some of your content?
02:17:52.880 | I will be writing such like deep science, like theoretical physics papers.
02:17:53.880 | First theoretical physics; then, I mean, I have drafts that are being, like, reviewed.
02:17:59.880 | It's really, like -- one second.
02:18:08.880 | So it's really challenging to kind of question general relativity, and it's, like, built on Perelman
02:18:28.880 | and all of these like quantum physics.
02:18:32.880 | It's just like, I, I only feel comfortable doing that when I have very good supervisor.
02:18:36.880 | So it takes time.
02:18:37.880 | I basically -- so the way it was, is I was starting with, like, Cognee and Graphiti and Mem0.
02:18:43.880 | We were like kind of like building things, but I was like, hi, I want to get into science.
02:18:47.880 | So follow me on LinkedIn and on the website.
02:18:52.880 | Okay.
02:18:53.880 | I can probably write some posts about it briefly.
02:18:57.880 | But what we are trying to do is like do this deep theoretical physics papers first.
02:19:03.880 | And after that, so, so papers like one, two, three will be about that.
02:19:08.880 | Papers like three, five will be about relating that to transformers, diffusion models, heat transfer, and all of these other things.
02:19:19.880 | And in a sense, I feel like, for doing popular science, someone will take care of that.
02:19:26.880 | So we are trying to do like real research.
02:19:30.880 | It's more campaign.com.
02:19:32.880 | I can show it to you.
02:19:33.880 | One second.
02:19:34.880 | Um, could take a picture.
02:19:39.880 | Of course.
02:19:40.880 | Yeah.
02:19:41.880 | Of course.
02:19:42.880 | Can you, can you add my, one second.
02:19:43.880 | Can you take it?
02:19:44.880 | Yeah.
02:19:45.880 | Thanks.
02:19:46.880 | Yeah.
02:19:53.320 | Yeah. Yeah, we'll take care of it.
02:20:11.120 | Okay. Did you get mic two back?
02:20:40.140 | He's ready for your...
02:20:43.140 | Oh, okay. Cool.
02:20:46.140 | Okay, so, he's gonna be on one, and he's on two.
02:20:50.140 | Okay, that's good.
02:20:52.140 | Okay, go ahead and do the count back from five.
02:20:58.140 | Hello, hello? Say five, four, three, two, one.
02:21:01.140 | Michael!
02:21:02.140 | Count back from five, please.
02:21:05.140 | A little slower.
02:21:07.140 | All good?
02:21:10.140 | Give me a countdown back from ten?
02:21:12.140 | Ten, nine, eight, seven, five, six, five, four, three, two, one.
02:21:16.140 | All right, you're good.
02:21:17.140 | All right, you're good.
02:21:18.140 | Thanks, Mike.
02:21:19.140 | So, go ahead and take this down.
02:21:21.140 | Oh, I'm sorry.
02:21:23.140 | How's that?
02:21:24.140 | This part, I'll fix this one.
02:21:30.140 | because we have to have to raise a certain way.
02:21:37.140 | so it doesn't sound off my phone.
02:21:38.140 | Sure.
02:21:38.140 | Okay, count back from five for me.
02:21:38.140 | Five, four, three, two, one, zero.
02:21:42.140 | You're good.
02:21:43.140 | Is that okay?
02:21:43.140 | Yeah, yeah, that's fine.
02:21:53.140 | Okay.
02:21:54.140 | Okay, can I add some five for me? Five, four, three, two, one, zero. Okay, you know best. All right, thank you very much.
02:22:22.900 | Thank you.
02:22:29.560 | Thank you.
02:22:59.540 | At the beginning you have all these slides about like the papers. Do we want to skip some of them? Because I had this like summary slide that had all the search stuff on one slide. How do you want to do it? Because it's like six slides.
02:23:15.700 | Yes. I mean, I'm just going to click through them. Okay. So I'll go here, right? So from here, switch. She said we need to stay at the podium. So you have to step back and then I go here because of the live streaming. Okay.
02:23:29.480 | Okay, so I'll just intro. I think we should be up here together for the intro. Yeah. But then if you don't want to stand here and watch me, then you can sit down.
02:23:46.340 | Yeah. Do you have a clicker? No, I forgot mine at home. Do you have one?
02:24:01.340 | No, I got one. Yeah, I got one. I mean, if the...
02:24:08.340 | Yeah, that's true.
02:24:11.340 | Yeah, that's true.
02:24:16.340 | Oh, I think. Just one USB stick. One. Oh, USB A.
02:24:23.340 | We're always ready. Good.
02:24:31.340 | Oh, yeah, Oregon. That's good.
02:24:36.340 | Hello, everyone. Hope you had some good coffee. Please come in. We are talking about GraphRAG today. That's the GraphRAG track.
02:24:45.340 | The GraphRAG track, of course. And we want to look at patterns for successful GraphRAG applications for making LLMs a little bit smarter by putting a knowledge graph into the picture.
02:24:56.340 | My name is Michael Hunger. I'm VP of product innovation at Neo4j.
02:25:00.340 | My name is Stephen Chin. I lead developer relations at Neo4j. And actually, we're co-authoring a book. This is fun because we're both already authors, we've been friends for years, and we finally get to co-author together.
02:25:14.340 | We're also authoring GraphRAG: The Definitive Guide for O'Reilly. So basically, we didn't sleep this past weekend because we had a book deadline.
02:25:22.340 | So I'm going to talk a little bit about, at a high level, what GraphRAG is, why it's important, what we're seeing in the media. And then Michael's going to drill down into all of the details and patterns and give you a bunch of takeaways and things you can do.
02:25:36.340 | This is probably -- if you want to know how to do GraphRAG, Michael's quick deep dive on this is the best introduction you can get. So I'm also excited.
02:25:47.340 | Awesome. That's good going.
02:25:48.340 | Okay. So the case for GraphRAG is where we're going to start. And the challenge with using LLMs and other patterns for this is basically that they don't have the enterprise domain knowledge.
02:26:01.340 | They don't verify or explain the answers. They're subject to hallucinations. And they have ethical and data bias concerns. And you can see that, very much like our friendly parrot here, they behave and act in all the ways parrots do, except being a cute bird.
02:26:18.340 | So we want to do better than this with GraphRAG and figure out how we can use domain-specific knowledge to get accurate, contextual, and explainable answers. And really, I think what a lot of companies and the industry are figuring out is that it's really a data problem.
02:26:34.340 | You need good data. You need to have data you can power your system with. One of the patterns you can do this with is RAG. So you can stick your external data into a RAG system and get stuff back from a database for the pattern.
02:26:49.340 | But vector databases and RAG fall short because they're lacking kind of your full data set. They're only pulling back a fraction of the information by vector similarity algorithms.
02:27:01.340 | Typically, a lot of these modern vector databases, which everyone's using, are easy to start with, but they're not robust. They're not mature. They're not something which has scalability and fallback and gives you what you need to build a strong, robust enterprise system.
02:27:17.340 | And vector similarity is not the same as relevance. So results you get back from using a basic RAG system. They give you back things which are related to the topic, but it's not complete.
02:27:29.340 | And it's typically also not very relevant. And then it's very hard to explain what's coming out of the system.
02:27:35.340 | So we need an answer. A lifeline: GraphRAG. And what GraphRAG is, is we're bringing the knowledge and the context and the environment to what LLMs are good at.
02:27:49.340 | So you can think of this kind of like the human brain. Our right brain is more creative. It does more, like, building things. It does more extrapolation of information.
02:28:01.340 | Whereas our left brain is the logical part. That's what actually has reasoning, has facts, and can enrich data. And it's built off of knowledge graphs.
02:28:09.340 | So a knowledge graph is a collection of nodes, relationships, and properties.
02:28:15.340 | Here's a really simple example of a knowledge graph where you have two people. They live together. You have a car.
02:28:21.340 | But when you look into the details, it's actually like a little bit more complex than it seems at first.
02:28:25.340 | Because they both have a car, but the owner of the car is not the person who drives it.
02:28:31.340 | This is kind of like my family. My wife does all the bills, but then she hands me the keys whenever we get on the freeway. She hates driving.
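That owner-versus-driver distinction is exactly what typed relationships capture. A dependency-free sketch of the slide's graph as triples (the names and relationship types are made up for illustration):

```python
# Minimal knowledge graph as (subject, relationship, object) triples:
# two people who live together, and a car whose owner is not its driver.
triples = [
    ("Ann", "LIVES_WITH", "Dan"),
    ("Ann", "OWNS", "Car"),
    ("Dan", "DRIVES", "Car"),
]

def objects(subject: str, relation: str) -> list[str]:
    """All objects connected to `subject` via `relation`."""
    return [o for s, r, o in triples if s == subject and r == relation]

# The typed relationships carry semantics that pure vector similarity
# would blur: owning and driving are different edges, not "related to".
print(objects("Ann", "OWNS"))    # ['Car']
print(objects("Ann", "DRIVES"))  # []
```

A vector search for "Ann car" would likely return both facts indiscriminately; the graph keeps them distinct and queryable.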
02:28:40.340 | So knowledge graphs also are a great way of getting really rich data. Here's an example of the Stack Overflow graph built into a knowledge graph where you can see all of the rich metadata and the complexity of the results.
02:28:52.340 | And we can use this to evolve RAG into a more complex system, basically graph RAG, where we get better relevancy.
02:28:59.340 | We're getting more relevant results. We get more context because now we can actually pull back all of the related information by graph closeness algorithms.
02:29:07.340 | We can explain what's going on because it's no longer just vectors. It's no longer statistical probabilities coming out of a vector database.
02:29:15.340 | We actually have nodes. We have structure. We have semantics we can look at.
02:29:18.340 | And we can add in security and role-based access on top of this.
02:29:22.340 | So it's context-rich. It's grounded. This gives us a lot of power.
02:29:26.340 | And it gives us the ability to start explaining what we're doing, where now we can visualize it, we can analyze it, and we can log all of this.
02:29:34.340 | Now, this is one of the initial papers, the graph RAG paper from Microsoft Research, where they went through this and they showed that you could actually get not only better results, but less token costs.
02:29:45.340 | It was actually less expensive to do a graph RAG algorithm.
02:29:49.340 | There have been a lot of papers since then which show all of the different research and interesting work which is going on in the graph RAG area.
02:29:59.340 | And this is just a quick view of the different studies and results which are coming out.
02:30:03.340 | But even from the early Data.World study where they showed a three times improvement in graph RAG capabilities.
02:30:09.340 | And the analysts are even showing how graph RAG is trending up.
02:30:14.340 | So this is the Gartner kind of hype cycle from 2024.
02:30:19.340 | And you can see generative AI is kind of, you know, on the downtrend.
02:30:23.340 | RAG is getting over the hump.
02:30:24.340 | But graph RAG and a bunch of these things actually are providing and breathing more life into the AI ecosystem.
02:30:30.340 | So a lot of great reports from Gartner showing that it's grounded in facts.
02:30:35.340 | It resolves hallucinations.
02:30:37.340 | Together knowledge graphs and AI are solving these problems.
02:30:40.340 | And it's getting a lot of adoption by different industry leaders.
02:30:43.340 | By big organizations who are taking advantage of this and actually producing production applications.
02:30:49.340 | And making it work like LinkedIn customer support where they actually wrote this great research paper.
02:30:55.340 | Where they showed that using a knowledge graph for customer support scenarios actually gave them better results.
02:31:01.340 | And allowed them to improve the quality and reduce the response time for getting back to customers.
02:31:08.340 | Median per issue resolution time was reduced by 28.6%.
02:31:13.340 | I mentioned the data.world study which basically was a comparison of doing RAG on SQL versus RAG on graph databases.
02:31:20.340 | And they showed a three times improvement in accuracy of LLM responses.
02:31:24.340 | And let's chat about patterns, Michael.
02:31:26.340 | Because I think everyone's here to learn how to do this.
02:31:29.340 | Exactly.
02:31:30.340 | So let's look at how to do this actually.
02:31:32.340 | Right?
02:31:33.340 | So if you look at graph RAG, there are actually two sides to the coin.
02:31:38.340 | So one, of course, you don't start in a vacuum.
02:31:40.340 | You have to create your knowledge graph, right?
02:31:42.340 | And we see basically multiple steps to get there.
02:31:45.340 | Initially, you get unstructured information.
02:31:47.340 | You structure it.
02:31:48.340 | You put it into a lexical graph, which represents documents, chunks, and their relationships.
02:31:53.340 | And the second step, you can then extract entities using, for instance, LLMs with this graph schema
02:31:59.340 | to extract entities and the relationships from that graph.
02:32:02.340 | And in the third phase, you would enrich this graph, for instance, with graph algorithms,
02:32:06.340 | doing things like, you know, page rank, community summarization, and so on.
02:32:11.340 | And then when you have this built-up knowledge graph, then you do graph RAG as the search
02:32:17.340 | mechanism.
02:32:18.340 | Either with local search or global search and other ways.
02:32:22.340 | Right?
02:32:23.340 | So let's first look at the first phase of, like, knowledge graph construction a little bit.
02:32:28.340 | So like, always in data engineering, there is, if you want to have higher quality outputs,
02:32:32.340 | you have to put in more effort at the beginning.
02:32:34.340 | Right?
02:32:35.340 | So nothing comes for free.
02:32:36.340 | There's no free lunch after all.
02:32:37.340 | But what you do at the beginning basically pays off multiple times, because what you get
02:32:42.340 | out of your unstructured documents is actually high-quality, highly structured information, which
02:32:47.340 | you then can use to extract contextual information for your queries, which allows
02:32:52.340 | for rich retrieval at the end.
02:32:55.340 | Okay.
02:32:56.340 | And so after seeing graph RAG being used by a number of users and customers, and after we
02:33:02.340 | looked at research papers,
02:33:04.340 | we saw a number of patterns emerging in terms of how we structure our graphs,
02:33:09.340 | how we query these graphs, and so on.
02:33:11.340 | And so we started to collect these patterns and put them on graphrag.com.
02:33:15.340 | And we want to -- I wanted to show what this looks like.
02:33:18.340 | So we have basically example graphs in the pattern.
02:33:23.340 | The pattern has a name, description, context, and we see also queries that are used for extracting
02:33:29.340 | this information.
02:33:30.340 | Right?
02:33:31.340 | Let's look at the three steps in a little bit more detail on the graph model side.
02:33:42.340 | So on one side, for the lexical graph, you represent documents and their elements.
02:33:47.340 | So that could be something simple as a chunk.
02:33:49.340 | But if you have structured elements and documents, you can also do something like, okay, I have
02:33:53.340 | a book which has chapters, which have sections, which have paragraphs, where the paragraph is
02:33:58.340 | the semantically cohesive unit that you would use to, for instance, create a vector embedding
02:34:02.340 | that you can use later for vector search.
02:34:05.340 | But what's really interesting in the graph is that you can connect these things all up, right?
02:34:09.340 | So you know exactly who is the predecessor, who is the successor to a chunk, who is the parent
02:34:13.340 | of an element.
02:34:14.340 | And using something like vector or text similarity, you can also connect these chunks
02:34:21.340 | to each other as well.
02:34:22.340 | And then you can use all these relationships when you extract the context and the retrieval
02:34:37.340 | phase to find what are related chunks by document, by temporal sequence, by similarity and other
02:34:43.340 | things.
02:34:44.340 | Right?
02:34:45.340 | So that's only on the lexical side.
02:34:46.340 | This looks like this.
02:34:47.340 | So for instance, you have an RFP and you want to break it up in a structured way.
02:34:51.340 | Then you basically create the relationships between these chunks or subsections, attach
02:34:57.340 | vector embeddings to the text, and then you do it at scale and you get a full lexical
02:35:03.340 | graph out of that.
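The lexical graph described here can be sketched in a few lines; names like `Chunk`, `PART_OF`, and `NEXT_CHUNK` are illustrative, not a fixed schema.

```python
# Sketch of building a lexical graph in memory before loading it into a graph DB.
def build_lexical_graph(doc_id: str, text: str, chunk_size: int = 100):
    """Split a document into chunks and link them with ordering relationships."""
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    nodes = [{"id": f"{doc_id}-{n}", "label": "Chunk", "text": c, "index": n}
             for n, c in enumerate(chunks)]
    # Each chunk belongs to its document.
    rels = [{"type": "PART_OF", "from": node["id"], "to": doc_id} for node in nodes]
    # Predecessor/successor links let a retriever expand to neighbouring chunks.
    rels += [{"type": "NEXT_CHUNK", "from": nodes[n]["id"], "to": nodes[n + 1]["id"]}
             for n in range(len(nodes) - 1)]
    return nodes, rels

nodes, rels = build_lexical_graph("doc-1", "x" * 250)
print(len(nodes), len(rels))  # 3 chunks, 3 PART_OF + 2 NEXT_CHUNK relationships
```

In a real pipeline the nodes and relationships would be written to the database, and a vector embedding would be attached to each chunk for later similarity search.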
02:35:05.340 | Next phase is entity extraction, which is also something that has been around for quite
02:35:12.340 | some time with NLP.
02:35:13.340 | But LLMs actually take this to the next level with the multi-language understanding, with
02:35:17.340 | their high flexibility, good language skills for extraction.
02:35:20.340 | So you basically provide a graph schema and an instruction prompt to the LLM plus your pieces
02:35:28.340 | of information, pieces of text.
02:35:30.340 | Now with large context windows, you can even put in 10,000 or 100,000 tokens for extraction.
02:35:37.340 | If you have them, you can also put in already existing ground truths.
02:35:40.340 | So for instance, if you have existing structure data where your entities, let's say products
02:35:45.340 | or genes or partners or clients are already existing, then you can also put this in as part
02:35:51.340 | of the prompt.
02:35:52.340 | So that the LLM doesn't do an extraction, but more a recognition and finding approach,
02:35:59.340 | where you find the entities and then you extract the relationships from them and then you can
02:36:03.340 | store additional facts and additional information that you store as part of relationships and
02:36:08.340 | entities as well.
02:36:09.340 | So basically in the first part you have the lexical graph, which is based on the document structure,
02:36:13.340 | but in the second part you extract the relevant entities and their relationships.
02:36:18.340 | If you have already an existing knowledge graph, you can also connect this to an existing
02:36:22.340 | knowledge graph.
02:36:23.340 | So imagine you have a CRM where you already have customer clients and leads in your knowledge
02:36:28.340 | graph, but then you want to enrich this with, for instance, information from call transcripts
02:36:33.340 | and then you basically connect this to your existing structure data as well.
02:36:37.340 | So that's also a possibility.
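The schema-plus-prompt extraction step can be sketched as follows; the schema, the prompt wording, and the "known entities" list are invented for illustration, not the speaker's actual prompt.

```python
# Illustrative only: composing an entity-extraction prompt from a graph schema
# and a text chunk.
GRAPH_SCHEMA = {
    "nodes": ["Person", "Company", "Technology"],
    "relationships": [("Person", "WORKS_FOR", "Company"),
                      ("Company", "DEVELOPS", "Technology")],
}

def build_extraction_prompt(schema: dict, text: str, known_entities=None) -> str:
    """Build an LLM prompt that constrains extraction to the given schema.

    Passing known_entities turns extraction into recognition: the LLM matches
    against existing ground-truth entities instead of inventing new ones."""
    lines = ["Extract entities and relationships from the text below.",
             "Allowed node types: " + ", ".join(schema["nodes"]),
             "Allowed relationships: " +
             "; ".join(f"({a})-[{r}]->({b})" for a, r, b in schema["relationships"])]
    if known_entities:
        lines.append("Match against these existing entities where possible: " +
                     ", ".join(known_entities))
    lines.append("Text:\n" + text)
    return "\n".join(lines)

prompt = build_extraction_prompt(GRAPH_SCHEMA, "Demis Hassabis works for DeepMind.",
                                 known_entities=["DeepMind"])
```

The same prompt skeleton works whether the text is one chunk or a long-context batch of many chunks.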
02:36:38.340 | And then in the next phase what you can do is you can run graph algorithms for enrichment,
02:36:43.340 | which then, for instance, can do clustering on the entity graph and then you generate something
02:36:48.340 | like communities where an LLM can generate summaries across them and such.
02:36:55.340 | And for especially the last one, it's interesting because what you identify is actually cross document
02:37:00.340 | topics, right?
02:37:01.340 | Because each document is basically a vertical representation of information,
02:37:08.340 | while this actually looks at which topics are occurring across many different
02:37:13.340 | documents.
02:37:14.340 | So you find these kind of topic clusters across documents as well.
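As a toy stand-in for this enrichment step, the sketch below groups entities into clusters whose members could then be summarized by an LLM. Real pipelines would use a proper community-detection algorithm (e.g. Louvain or Leiden from a graph library); connected components over co-occurrence edges is the simplest possible proxy.

```python
from collections import defaultdict

def communities(edges):
    """Union-find over entity co-occurrence edges -> clusters of entity ids."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    def union(a, b):
        parent[find(a)] = find(b)
    for a, b in edges:
        union(a, b)
    groups = defaultdict(set)
    for node in parent:
        groups[find(node)].add(node)
    return sorted(groups.values(), key=len, reverse=True)

# Entities mentioned together across documents end up in the same cluster,
# surfacing cross-document topics (example data is made up).
edges = [("DeepMind", "AlphaFold"), ("AlphaFold", "protein folding"),
         ("Neo4j", "Cypher")]
clusters = communities(edges)
```

Each resulting cluster is a candidate "community" whose member texts an LLM could summarize into a topic description.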
02:37:18.340 | Cool.
02:37:19.340 | So if you look at the second phase, the search phase, which is basically the retrieval part
02:37:24.340 | of RAG, what we see here is that in a graph RAG retriever, you don't just do a simple
02:37:31.340 | vector lookup to get the results returned, but you do an initial index search.
02:37:37.340 | It could be vector search, full text search, hybrid search, spatial search, other kinds of searches
02:37:42.340 | to find the entry points in your graph.
02:37:44.340 | And then you basically can take, as you can see here, starting from these entry points,
02:37:49.340 | you then follow the relationships up to a certain degree or up to a certain relevancy to fetch
02:37:55.340 | in additional context.
02:37:56.340 | And this context can be coming from user question.
02:37:59.340 | It can be external user context that comes in, for instance, when someone from, let's say,
02:38:04.340 | your finance department is looking at your data, you return different information than someone
02:38:09.340 | from the, let's say, engineering department is looking at your data, right?
02:38:12.340 | So you also take this external context into account for how much and which context you retrieve.
02:38:17.340 | And then you return to the LLM to generate the answer, not just basically text fragments
02:38:22.340 | like you would do in vector search, but you also return this more complete subset
02:38:29.340 | of the contextual graph to the LLM as well.
02:38:33.340 | And modern LLMs are actually more trained on graph processing as well.
02:38:37.340 | So they can actually deal with these additional graph structures, where you have
02:38:43.340 | node-relationship-node patterns that you provide as additional context to the LLM.
02:38:48.340 | and then of course I mentioned that you can enrich it using graph algorithms so that you can do things
02:38:53.340 | like clustering, link prediction, page rank and other things to enrich your data.
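The retrieval pattern described here (index search for entry points, then relationship traversal) can be sketched in miniature; the toy "vector search" is just keyword matching, and the graph, names, and hop limit are invented for illustration.

```python
GRAPH = {  # adjacency list: node -> related nodes
    "chunk-1": ["DeepMind"],
    "DeepMind": ["AlphaFold", "Google"],
    "AlphaFold": ["protein folding"],
    "Google": [],
    "protein folding": [],
}

def entry_points(question, nodes):
    """Stand-in for the initial index search (vector/full-text/hybrid)."""
    return [n for n in nodes if n.lower() in question.lower()]

def expand(entries, hops=2):
    """Follow relationships up to `hops` degrees from the entry points."""
    context, frontier = set(entries), list(entries)
    for _ in range(hops):
        frontier = [nbr for node in frontier for nbr in GRAPH.get(node, [])
                    if nbr not in context]
        context.update(frontier)
    return context

ctx = expand(entry_points("What has DeepMind worked on?", GRAPH), hops=2)
```

The expanded context (entities plus the text connected to them) is what gets handed to the LLM, instead of just the top-k text fragments.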
02:38:58.340 | Cool.
02:38:59.340 | Let's look at some practical examples.
02:39:00.340 | We don't have too much time left.
02:39:02.340 | So one is knowledge graph construction from unstructured sources.
02:39:05.340 | So there's a number of libraries.
02:39:07.340 | You've already heard some today from people that do these kind of things.
02:39:12.340 | So one thing that we built is a tool that allows you to take PDFs, YouTube transcripts,
02:39:20.340 | local documents, web articles, Wikipedia articles, and it extracts your data into a graph.
02:39:27.340 | And let me just switch over to the demo here.
02:39:32.340 | So this is the tool.
02:39:34.340 | So I uploaded information from different Wikipedia pages, YouTube videos, articles, and so on.
02:39:41.340 | And here is, for instance, a Google DeepMind extraction.
02:39:45.340 | So you can use a lot of different LLMs here.
02:39:47.340 | And then you can also, if you want to, in graph enhancement, provide graph schema as well.
02:39:52.340 | So you can, for instance, say a person works for a company and add these patterns to your schema.
02:40:02.340 | And then the LLM is using this information to drive the extraction as well.
02:40:07.340 | And so if you look at the data that has been extracted from DeepMind, it's this one here.
02:40:14.340 | We can actually see from the Wikipedia article two aspects.
02:40:19.340 | One is the document with the chunks, which is this part of the graph, right?
02:40:24.340 | And then the second part is the entities that have been extracted from this article as well.
02:40:29.340 | So you see actually the connected knowledge graph of entities, which are companies, locations,
02:40:33.340 | people, and technologies.
02:40:34.340 | So it followed our, followed our schema to extract this.
02:40:39.340 | And then if I want to run graph rec, you have here a number of different retrievers.
02:40:43.340 | So we have vector retriever, graph and full text, entity retrievers, and others that you can select.
02:40:49.340 | All of this is also an open source project.
02:40:51.340 | So you can just go to GitHub and have a look at this.
02:40:54.340 | And so I just ran this before because internet is not so reliable here.
02:40:57.340 | So what has DeepMind worked on?
02:40:58.340 | And I get a detailed explanation.
02:41:00.340 | And then if I want to, I can here look at details.
02:41:04.340 | So it shows me which sources it used: AlphaFold, Google, Wikipedia, another PDF.
02:41:09.340 | I see which chunks have been used, which is the full text and hybrid search.
02:41:13.340 | But then I also see which entities have been used from the graph.
02:41:15.340 | So I can actually really see from an explainability perspective, these are the entities that have
02:41:20.340 | been retrieved by the graph rec retriever, passed to the LLM, in addition to the text that's
02:41:26.340 | connected to these entities.
02:41:27.340 | So it gets a richer response as such.
02:41:30.340 | And then you can also do evals on that as well.
02:41:36.340 | So while I'm on the screen, let me just show you another thing that we worked on, which is
02:41:40.340 | more like an agentic approach where you basically put these individual retrievers into
02:41:46.340 | a configuration, where you have basically domain-specific retrievers that are running
02:41:53.340 | individual sub-queries.
02:41:54.340 | So for instance, if you look at, let's say, this one, it has the query here.
02:42:00.340 | And basically a tool with inputs and a description.
02:42:03.340 | And then you can have an agentic loop using these tools, basically doing graph RAG with each
02:42:08.340 | individual tool, taking the responses and then doing deeper tool calls.
02:42:13.340 | I'll show you a deeper example in a minute.
02:42:17.340 | So this is basically what I showed you.
02:42:20.340 | This is all available as open source libraries.
02:42:23.340 | You can use it yourself from Python as well.
02:42:26.340 | I also showed NeoConverse, which is able to not just output text, but also charts and other
02:42:32.340 | visualizations, network visualizations as well.
02:42:35.340 | And what's interesting here in the agentic approach, you don't just use vector search to
02:42:40.340 | retrieve your data, but you basically break down user question into individual tasks and
02:42:45.340 | extract parameters and run these individual tools, which then are either run in sequence
02:42:50.340 | or in a loop to return the data, and then you get basically these outputs back.
02:42:55.340 | And then for each of these things, individual tools are called and used here.
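A toy version of this pattern looks like the following: domain-specific retrievers registered as tools, a planner that breaks the question into tasks, and a loop that dispatches each task. The tool names and the keyword-based routing rule are invented; in practice an LLM does the planning.

```python
def retrieve_people(query):    return f"people results for {query!r}"
def retrieve_products(query):  return f"product results for {query!r}"

# Each tool carries a description the planner can use for routing.
TOOLS = {
    "people":   {"fn": retrieve_people,   "description": "questions about people"},
    "products": {"fn": retrieve_products, "description": "questions about products"},
}

def plan(question):
    """Stand-in planner: in practice an LLM extracts tasks and parameters."""
    tasks = []
    if "who" in question.lower():
        tasks.append(("people", question))
    if "product" in question.lower():
        tasks.append(("products", question))
    return tasks

def run(question):
    """Dispatch each planned task to its tool; could also loop on the results."""
    return [TOOLS[name]["fn"](arg) for name, arg in plan(question)]

answers = run("Who works on which product?")
```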
02:43:01.340 | And the last thing that I want to show is the GraphRAG Python package, which is basically
02:43:06.340 | also encapsulating all of the construction and the retrieval into one package.
02:43:11.340 | So you can build a large graph, you can implement the retrieval and create the pipelines here.
02:43:16.340 | And here's an example of where I pass in PDFs plus a graph schema and then basically it runs
02:43:24.340 | the import into Neo4j and then I can, in the Python notebook, visualize the data later on.
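Driving such a pipeline from Python looks roughly like this. The schema below is plain data we can build and validate locally; the commented-out call shows approximately how it would be handed to the `neo4j-graphrag` package, but the class and argument names are approximations of that API (check the package docs before use), and the pipeline itself needs a running Neo4j instance.

```python
# Hypothetical inputs for the construction pipeline.
graph_schema = {
    "entities": ["Person", "Company", "Technology"],
    "relations": ["WORKS_FOR", "DEVELOPS"],
}
pdf_files = ["example.pdf"]  # placeholder file name

def validate_schema(schema):
    """Cheap sanity check before kicking off an expensive extraction run."""
    assert schema["entities"] and schema["relations"]
    return {"entities": sorted(schema["entities"]),
            "relations": sorted(schema["relations"])}

config = validate_schema(graph_schema)

# Approximate usage of the package (not exact signatures):
# from neo4j_graphrag.experimental.pipeline.kg_builder import SimpleKGPipeline
# pipeline = SimpleKGPipeline(llm=..., driver=..., entities=config["entities"],
#                             relations=config["relations"])
# for pdf in pdf_files:
#     pipeline.run_async(file_path=pdf)  # imports results into Neo4j
```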
02:43:30.340 | And with that I leave you with, one second, the takeaway, which is on graphrag.com you find all of these resources,
02:43:40.340 | a lot of the patterns and we'd love to have contributions and love to talk more.
02:43:46.340 | I'm outside at the booth if you have more questions.
02:43:49.340 | Yeah, so that was great and I think you're getting it all from the expert with all the tooling.
02:43:54.340 | Actually, Michael's team builds a lot of the tools like Knowledge Graph Builder.
02:43:57.340 | I'm very excited you all came to the GraphRAG track and hope to chat with you all more.
02:44:02.340 | If you have questions for me and Michael, just meet us in the Neo4j booth across the way.
02:44:06.340 | Thank you.
02:44:07.340 | Thank you.
02:44:08.340 | Thank you.
02:44:09.340 | Thank you, Michael and Steven.
02:44:10.340 | That was fantastic.
02:44:11.340 | My big takeaway was that there is so much to look at.
02:44:13.340 | It's amazing.
02:44:14.340 | Is this one for power?
02:44:15.340 | That's the Wi-Fi.
02:44:16.340 | Wi-Fi.
02:44:17.340 | Okay.
02:44:18.340 | Okay.
02:44:27.340 | So in this next talk, who's going to be taking us through a multi-agent framework for network
02:44:50.340 | analysis, is this right?
02:44:51.340 | Correct.
02:44:52.340 | Fantastic.
02:44:53.340 | Correct.
02:44:54.340 | Thank you.
02:44:55.340 | Thank you.
02:44:56.340 | Thank you.
02:45:26.320 | All right.
02:45:29.920 | One, two, three, four, five.
02:45:32.500 | Five, four, three, two, one.
02:45:36.100 | Microphone check.
02:45:38.140 | One, two, one, two.
02:45:39.460 | All right.
02:45:41.820 | Good afternoon, everyone.
02:45:43.820 | My name is Ola Mabadeje.
02:45:46.020 | I'm a product guy from Cisco.
02:45:48.660 | So my presentation is going to be a little more producty than techy,
02:45:52.780 | but I think you're going to enjoy it.
02:45:56.300 | And so I've been at Cisco working on AI for the last three years,
02:46:01.260 | and I work in this group called OutShift.
02:46:04.140 | So OutShift is Cisco's incubation group.
02:46:06.960 | Our charter is to help Cisco look at emerging technologies
02:46:10.840 | and see how these emerging technologies can help us accelerate
02:46:13.500 | the roadmaps of our traditional business units.
02:46:17.300 | And so by training, I'm an electrical engineer,
02:46:21.760 | dabbled into network engineering, enjoyed it,
02:46:25.360 | and I've been doing that for a while.
02:46:27.080 | But over the last three years, focused on AI.
02:46:29.300 | Our group also focuses on quantum technology,
02:46:32.640 | so quantum networking is something that we're focused on.
02:46:34.940 | And if you want to learn more about what we do with OutShift at Cisco,
02:46:39.760 | you can learn more about that.
02:46:42.360 | So for today, we're going to dive into this real quick.
02:46:45.300 | And like I said, I'm a product guy,
02:46:47.640 | so I usually start with my customers' problems,
02:46:49.620 | trying to understand what are they trying to solve for,
02:46:51.960 | and then from that, walk backwards towards creating a solution for that.
02:46:55.320 | So as part of the process for us,
02:46:57.960 | we usually go through this incubation phase
02:46:59.800 | where we ask customers a lot of questions,
02:47:01.560 | and then we come up with prototypes.
02:47:03.500 | We do A/B testing,
02:47:05.320 | and then we kind of deliver an MVP into a production environment.
02:47:09.100 | And once we get product market fit,
02:47:11.040 | that product graduates into the Cisco's businesses.
02:47:14.260 | So this customer had this issue.
02:47:15.660 | They said, when we do change management,
02:47:17.720 | we have a lot of challenges with failures in production.
02:47:21.280 | How can we reduce that?
02:47:23.220 | Can we use AI to reduce that problem?
02:47:25.260 | So we double-clicked on that problem statement,
02:47:27.500 | and we realized it was a major problem across the industry.
02:47:29.980 | I won't go into the details here, but it's a big problem.
02:47:32.880 | Now, for us to solve the problem,
02:47:35.340 | we need to understand, does AI really have a place here,
02:47:38.100 | or it's just going to be rule-based automation to solve this problem?
02:47:41.400 | And when we looked at the workflow,
02:47:43.000 | we realized that there are specific spots in the workflow
02:47:45.660 | where AI agents can actually help address a problem.
02:47:48.760 | And so we kind of highlighted three, four, and five,
02:47:51.520 | where we believe that AI agents can help increase the value
02:47:55.000 | for customers and reduce the pain points that they were describing.
02:47:58.020 | And so we sat down together with the teams.
02:48:00.880 | We said, let's figure out a solution for this.
02:48:04.540 | And so this solution consists of three big buckets.
02:48:07.320 | The first one is the fact that it has to be a natural language interface
02:48:11.360 | where network operations teams can actually interact with the system.
02:48:14.440 | So that's the first thing.
02:48:16.040 | And not just engineers, but also systems.
02:48:18.760 | So for example, in our case,
02:48:20.020 | we built this system to talk to an ITSM tool such as ServiceNow.
02:48:23.980 | So we actually have agents on the ServiceNow side talking to agents on our side.
02:48:27.880 | The second piece of this is the multi-agent system that sits within this application.
02:48:32.900 | So we have agents that are tasked at doing specific things.
02:48:36.360 | So an agent that is tasked as doing impact assessments, doing testing,
02:48:40.260 | doing reasoning around potential failures that could happen in the network.
02:48:44.980 | And then the third piece of this is where we're going to spend some of the time today,
02:48:48.660 | which is a network knowledge graph.
02:48:50.180 | So we're having the concept of a digital twin in this case.
02:48:53.160 | So what we're trying to do here is to build a twin of the actual production network.
02:48:57.060 | And that twin includes a knowledge graph plus a set of tools to execute testing.
02:49:03.280 | And so we're going to dive into that in a little bit.
02:49:05.960 | But before we go into that, we had this challenge of,
02:49:09.760 | okay, we want to build a representation of the actual network.
02:49:14.680 | How are we going to do this?
02:49:16.280 | Because if you know networking pretty well,
02:49:19.440 | networking is a very complex technology.
02:49:22.700 | You have a variety of vendors in a customized environment,
02:49:26.440 | a variety of devices, firewalls, switches, routers, and so on.
02:49:29.420 | And all of these different devices are spitting out data
02:49:33.260 | in different formats.
02:49:34.020 | So the challenge for us is how can we create a representation of this real-world network
02:49:39.560 | using knowledge graphs in a data schema that can be understood by agents.
02:49:43.860 | And so the goal is for us to create this ingestion pipeline
02:49:47.220 | that can represent the network in such a way
02:49:49.680 | that agents can take the right actions in a meaningful way and predictive way.
02:49:53.760 | And so for us to kind of proceed with that,
02:49:56.660 | we had these three big buckets of things to consider.
02:50:00.200 | So we had to think about what are the data sources going to be.
02:50:03.800 | So again, in networking, there are controller systems,
02:50:06.780 | the devices themselves, agents in the devices,
02:50:09.880 | configuration management systems,
02:50:12.420 | all of these things are all collecting data from the network,
02:50:15.300 | or they all have data about the network.
02:50:16.540 | Now, when they spit out their data,
02:50:18.680 | they're spitting it out in different languages,
02:50:20.760 | YANG, JSON, and so on.
02:50:22.520 | Another set of considerations to have.
02:50:24.820 | And then in terms of how the data is actually coming out,
02:50:27.460 | it could be coming out in terms of streaming telemetry,
02:50:29.620 | it could be configuration files in JSON,
02:50:31.220 | it could be some other form of data.
02:50:33.480 | How can we look at all of these three different considerations
02:50:36.520 | and be able to come up with a set of requirements
02:50:38.620 | that allows us to actually build a system
02:50:40.820 | that addresses the customer's pain point again?
02:50:43.300 | And so the team, from the product side,
02:50:46.680 | we had a set of requirements.
02:50:47.700 | We wanted a system that,
02:50:49.240 | a knowledge graph that can have multimodal flexibility,
02:50:52.720 | that means it can handle key-value pairs,
02:50:56.240 | understand JSON files,
02:50:57.860 | and understand relationships across different entities in the network.
02:51:02.280 | Second thing is performance.
02:51:04.580 | If an engineer is querying a knowledge graph,
02:51:07.680 | we want to have instant access to the node,
02:51:10.560 | information about the node,
02:51:11.860 | no matter where the location of that node is.
02:51:14.120 | That was important for our customers.
02:51:16.000 | The third thing was operational flexibility.
02:51:17.740 | So the schema has to be such that
02:51:19.460 | we can consolidate into one schema framework.
02:51:22.220 | The fourth piece here is where the RAG piece comes into play.
02:51:27.220 | So we've been hearing a little about graph RAG for a little bit today.
02:51:30.500 | We wanted this to be a system
02:51:32.800 | that has ability to have vector indexing in it
02:51:35.020 | so that when you want to do semantic searches,
02:51:36.580 | at some point, you can do that as well.
02:51:38.460 | And then in terms of just ecosystem stability,
02:51:41.100 | we want to make sure that
02:51:43.140 | when we put this in the customer's environment,
02:51:44.620 | there's not going to be a lot of heavy lifting
02:51:47.720 | that's going to be done by the customer
02:51:48.780 | to integrate with their systems.
02:51:50.000 | And again, it has to support multiple vendors.
02:51:52.080 | So these were the requirements from a product side.
02:51:54.180 | And then our engineering teams,
02:51:55.360 | kind of we started to consider
02:51:56.480 | some of the options on the table.
02:51:57.640 | Neo4j, obviously, market leader,
02:52:00.980 | and the various other open source tools.
02:52:02.600 | At the end of the day,
02:52:04.140 | the engineering teams decided
02:52:05.500 | to kind of do some analysis around this.
02:52:07.880 | So I'm showing the table on the right-hand side.
02:52:09.900 | It's not an exhaustive list of things that they considered,
02:52:12.520 | but these were the things that they looked at
02:52:13.920 | that they wanted to see,
02:52:14.760 | okay, what is the right solution
02:52:16.700 | to address the requirements coming from product?
02:52:20.160 | And we kind of all centered around the first two here,
02:52:25.380 | Neo4j and ArangoDB.
02:52:26.420 | But for historical reasons,
02:52:28.500 | the team decided to go with ArangoDB
02:52:30.440 | because we had some use cases
02:52:31.760 | that were in the security space.
02:52:32.980 | There was kind of a recommendation system type of use cases
02:52:37.100 | that we wanted to kind of continue using.
02:52:38.600 | But we are still exploring the use of Neo4j
02:52:42.140 | for some of the use cases
02:52:43.140 | that are coming up as part of this project.
02:52:45.600 | So we settled on ArangoDB for this
02:52:48.960 | and we eventually came up with a solution
02:52:51.200 | that looks like this.
02:52:51.920 | So we have this knowledge graph solution.
02:52:53.340 | This is an overview of it.
02:52:54.460 | On the left-hand side,
02:52:56.520 | we have all of the production environment.
02:52:58.100 | We have the controllers,
02:52:59.720 | Splunk, which is the SIEM system,
02:53:02.020 | traffic telemetry coming in.
02:53:04.080 | All of them are coming into this ingestion service,
02:53:06.060 | which is doing an ETL,
02:53:08.020 | transforming all of this information
02:53:09.500 | into one schema, OpenConfig.
02:53:12.320 | So the OpenConfig schema is a schema
02:53:13.780 | that is designed around networking primarily
02:53:16.320 | and it helps us because there's a lot of documentation about it
02:53:19.980 | on the internet,
02:53:20.660 | so LLMs understand it very well.
02:53:22.560 | So this setup is primarily a database
02:53:28.400 | of networking information
02:53:31.700 | that has the OpenConfig schema
02:53:33.220 | as a primary way for us to communicate with it.
02:53:35.060 | So natural language communication
02:53:37.100 | through an individual engineer
02:53:39.120 | or the agents that are actually interacting
02:53:41.300 | with that system.
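The ETL/normalization step in that ingestion service can be sketched as mapping vendor-specific interface data into an OpenConfig-style structure. The vendor field names and the small subset of the `openconfig-interfaces` tree shown here are illustrative, not the team's actual mapping.

```python
def to_openconfig(vendor_record: dict) -> dict:
    """Normalize one device interface record into an OpenConfig-like shape."""
    return {
        "openconfig-interfaces:interfaces": {
            "interface": [{
                "name": vendor_record["ifName"],
                "config": {
                    "name": vendor_record["ifName"],
                    # Vendor admin status becomes the OpenConfig enabled flag.
                    "enabled": vendor_record["adminStatus"] == "up",
                },
                "state": {"oper-status": vendor_record["operStatus"].upper()},
            }]
        }
    }

# Hypothetical record as a vendor controller might emit it.
raw = {"ifName": "GigabitEthernet0/0", "adminStatus": "up", "operStatus": "up"}
normalized = to_openconfig(raw)
```

Because everything lands in one well-documented schema, both engineers and LLM-driven agents can query the graph without knowing each vendor's native format.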
02:53:42.000 | And so we built this in the form of layers.
02:53:44.660 | So if you're into networking again,
02:53:47.920 | there is a set of entities in the network
02:53:51.140 | that you want to be able to interact with.
02:53:52.680 | And so we have layered this up in this way
02:53:54.840 | such that if there's a tool call
02:53:57.380 | or there's a decision to be made about a test,
02:54:00.140 | for example,
02:54:00.600 | let's say you want to do a test
02:54:01.640 | about configuration drift as an example.
02:54:04.760 | You don't need to go to all of the layers of the graph.
02:54:07.460 | You just go straight down
02:54:08.460 | into the raw configuration file
02:54:09.680 | and be able to do your comparisons there.
02:54:11.500 | If you're trying to do, like,
02:54:13.240 | a test around reachability,
02:54:14.280 | for example,
02:54:14.800 | then you need a couple of layers,
02:54:16.020 | maybe you need raw configuration layers,
02:54:17.560 | data plane layers,
02:54:19.480 | and control plane layers.
02:54:20.260 | So it's structured in a way
02:54:22.300 | that when the agents are making their calls
02:54:24.260 | to this system,
02:54:24.940 | they understand what the request is
02:54:27.740 | from the system
02:54:30.160 | and they're able to actually go to the right layer
02:54:31.800 | to pick up the information
02:54:32.680 | that they need to execute on it.
02:54:35.080 | So this is kind of a high-level view
02:54:36.740 | of what the graph system looks like in layers.
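A toy model of that layered lookup: each test only touches the layers it needs. The layer names, the drift check, and the sample data are invented for this sketch.

```python
# Digital-twin layers, from raw device configs up to control-plane state.
TWIN = {
    "raw_config":    {"router-1": "hostname router-1\nntp server 10.0.0.1"},
    "data_plane":    {"router-1": {"routes": 4200}},
    "control_plane": {"router-1": {"bgp_peers": 3}},
}

def config_drift(device: str, intended: str) -> bool:
    """Drift test: goes straight to the raw-config layer, nothing else."""
    return TWIN["raw_config"][device] != intended

def reachability_inputs(device: str) -> dict:
    """Reachability needs several layers: raw config, data and control plane."""
    return {layer: TWIN[layer][device]
            for layer in ("raw_config", "data_plane", "control_plane")}

drifted = config_drift("router-1", "hostname router-1\nntp server 10.0.0.2")
```

An agent deciding which test to run only fetches the layers that test requires, which keeps queries against the twin cheap and predictable.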
02:54:39.020 | Now, I'm going to kind of switch gears
02:54:43.500 | and go back to the system.
02:54:45.440 | Remember I described a system
02:54:46.500 | that had agents,
02:54:47.880 | a knowledge graph,
02:54:49.160 | a digital twin,
02:54:50.220 | as well as natural language interface.
02:54:51.440 | So let's talk about the agentic layer.
02:54:53.680 | And before I kind of talk about
02:54:54.980 | the specific agents
02:54:56.080 | in this system,
02:54:58.540 | on this application,
02:54:59.200 | we are looking at
02:55:01.040 | how we are going to build a system
02:55:02.480 | that is based on open standards
02:55:04.200 | for all of the internet.
02:55:05.920 | And this is one of the challenges
02:55:07.400 | we have within Cisco.
02:55:08.260 | We are looking at a system,
02:55:09.880 | a set of a collective,
02:55:12.660 | open source collective
02:55:13.700 | that includes all of the partners
02:55:15.000 | we see down here.
02:55:15.760 | So we have OutShift by Cisco,
02:55:17.840 | we have Langchain, Galileo,
02:55:19.540 | we have all of these members
02:55:21.320 | who are supporters of this collective.
02:55:23.840 | And what we are trying to do
02:55:25.300 | is to set up a system
02:55:26.440 | that allows agents
02:55:28.120 | from across the world,
02:55:29.760 | so it's a big vision
02:55:30.880 | that they can talk to each other
02:55:33.140 | without having to do heavy lifting
02:55:34.540 | of reconstructing your agents
02:55:35.980 | every time you want
02:55:36.580 | to integrate them
02:55:37.200 | with another agent.
02:55:38.180 | So it consists of identity,
02:55:39.980 | schema framework
02:55:41.560 | for defining an agent's skills
02:55:42.960 | and capabilities,
02:55:43.700 | the directory
02:55:44.860 | where you actually store
02:55:45.700 | these agents,
02:55:46.320 | and then how you actually
02:55:47.340 | compose the agents,
02:55:49.680 | both at the semantic layer
02:55:50.480 | and the syntactic layer,
02:55:50.480 | and then how do you observe
02:55:51.740 | the agents in process.
02:55:52.720 | All of these are part
02:55:54.000 | of this collective's vision
02:55:55.380 | as a group.
02:55:56.820 | And if you want to learn more
02:55:58.160 | about this,
02:55:58.580 | it's on agntcy.org,
02:55:59.900 | and I also have a slide
02:56:01.780 | here that kind of talks
02:56:02.480 | about there's real code,
02:56:04.320 | actually,
02:56:04.660 | that you can leverage today.
02:56:05.660 | Or if you want to contribute
02:56:06.880 | to the code,
02:56:07.480 | you can actually go there.
02:56:09.120 | There's a GitHub repo here
02:56:10.380 | that you can go to
02:56:11.040 | and you can start to contribute
02:56:12.480 | or use the data.
02:56:14.240 | There's documentation
02:56:16.080 | available as well,
02:56:16.940 | and there are sample applications
02:56:17.860 | that allow you
02:56:18.460 | to actually see
02:56:19.080 | how this works in real life.
02:56:20.260 | And we know that there's MCP,
02:56:23.020 | there's A2A,
02:56:23.860 | all of these protocols
02:56:24.980 | are becoming very popular.
02:56:27.040 | We also integrate
02:56:28.080 | all of these protocols
02:56:28.900 | because the goal, again,
02:56:29.960 | is not to create something
02:56:31.920 | that is bespoke.
02:56:32.700 | We want to make it open
02:56:34.020 | to everyone
02:56:34.460 | to be able to create agents
02:56:35.700 | and be able to make these agents
02:56:36.800 | work in production environments.
02:56:37.900 | So, back to the specific application
02:56:41.200 | we're talking about.
02:56:41.960 | Based on this framework,
02:56:43.840 | we delivered this set of agents.
02:56:46.120 | We built this set of agents
02:56:47.760 | as a group.
02:56:48.420 | So, we have five agents right now
02:56:49.980 | as part of this application.
02:56:51.020 | There's an assistant agent
02:56:52.920 | that's kind of the planner
02:56:53.780 | that kind of orchestrates things
02:56:55.380 | across the globe,
02:56:56.080 | across all of these agents.
02:56:57.180 | And then we have other agents
02:56:59.440 | that are all based
02:57:00.140 | on ReAct reasoning loops.
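As an aside, the ReAct pattern those agents follow is easy to sketch: the model alternates reasoning steps with tool calls, feeding each observation back in until it commits to an answer. This toy loop uses a scripted stand-in for the LLM and a fake knowledge-graph lookup tool, purely for illustration (none of these names come from the actual system):

```python
# Minimal ReAct-style loop: the "model" alternates Thought/Action steps,
# each Action invokes a tool, and the observation is fed back in.
# The scripted llm() below is a stand-in for a real LLM call.
def llm(history):
    if "Observation" not in history:
        return "Action: lookup[firewall rules]"
    return "Final Answer: 2 rules found"

def lookup(query):
    # Stand-in for a knowledge-graph query tool.
    return "2 matching rules"

def react(question, max_steps=5):
    history = f"Question: {question}"
    for _ in range(max_steps):
        step = llm(history)
        if step.startswith("Final Answer:"):
            return step.removeprefix("Final Answer: ").strip()
        # Parse the tool input out of "Action: tool[input]".
        tool_input = step.split("[", 1)[1].rstrip("]")
        history += f"\n{step}\nObservation: {lookup(tool_input)}"
    return None

print(react("How many firewall rules apply?"))  # 2 rules found
```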
02:57:01.120 | There's one particular agent
02:57:02.820 | I want to call out here,
02:57:03.620 | the query agent.
02:57:04.340 | This query agent
02:57:05.100 | is the one that actually interacts
02:57:06.380 | directly with the knowledge graph
02:57:07.680 | on a regular basis.
02:57:08.540 | We have to fine-tune these agents
02:57:11.640 | because we initially started
02:57:15.040 | by attempting to use RAG
02:57:16.980 | to do some querying
02:57:18.740 | of the knowledge graph
02:57:19.460 | but that was not working out well
02:57:20.700 | so we decided that
02:57:21.620 | for immediate results
02:57:23.240 | we're going to fine-tune it
02:57:24.300 | and so we did some fine-tuning
02:57:25.700 | of this agent
02:57:27.840 | with some schema information
02:57:29.260 | as well as example queries.
02:57:30.560 | And so, that helped us
02:57:32.140 | to actually reduce two things.
02:57:33.460 | The number of tokens
02:57:34.240 | we were burning, because
02:57:36.780 | before that,
02:57:36.780 | the AQL queries
02:57:37.780 | were going through
02:57:38.440 | all of the layers
02:57:39.140 | of the knowledge graph
02:57:40.180 | and in a reasoning loop
02:57:42.220 | was consuming lots of tokens
02:57:43.560 | and taking a lot of time
02:57:45.640 | for it to return results.
02:57:46.880 | After fine-tuning
02:57:47.580 | we saw a drastic reduction
02:57:48.800 | in number of tokens consumed
02:57:50.080 | as well as the amount
02:57:51.380 | of time it took
02:57:52.220 | to actually come back
02:57:52.920 | with the results.
02:57:53.400 | So, that kind of helped us there.
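As a sketch of what that fine-tuning data might look like: each training example pairs a natural-language request with the graph schema as context and the target query, so the tuned agent can emit queries directly instead of burning tokens exploring the graph in a reasoning loop. The schema snippet and AQL-style query here are invented for illustration, not the actual Cisco data:

```python
import json

# Hypothetical text-to-query fine-tuning examples. Each record gives the
# model the graph schema plus an example question and the target query.
SCHEMA = "device -[has_interface]-> interface -[in_subnet]-> subnet"

examples = [
    {
        "instruction": "Translate the question into a graph query.",
        "context": f"Schema: {SCHEMA}",
        "question": "Which devices have an interface in subnet 10.0.0.0/24?",
        "query": (
            "FOR d IN devices FOR i IN OUTBOUND d has_interface "
            "FILTER i.subnet == '10.0.0.0/24' RETURN d.name"
        ),
    },
]

# Serialize to JSONL, a common format for fine-tuning datasets.
jsonl = "\n".join(json.dumps(e) for e in examples)
print(jsonl.splitlines()[0][:40])
```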
02:57:55.060 | So, I'm going to kind of pause here.
02:57:57.720 | I'm talking a lot about...
02:57:58.880 | There's a lot of slideware here.
02:58:00.300 | I want to show a quick demo
02:58:01.320 | of what this actually looks like.
02:58:03.080 | So, tying together everything
02:58:04.380 | from the natural language interface
02:58:07.080 | interaction with an ITSM system
02:58:08.880 | to how the agents interact
02:58:10.600 | to how that collects
02:58:11.620 | information from knowledge graph
02:58:12.800 | and delivers results
02:58:14.060 | to the customer.
02:58:16.040 | So, the scenario we have here
02:58:19.080 | is a network engineer
02:58:20.320 | wants to make a change
02:58:21.300 | to a firewall rule.
02:58:22.360 | They have to do that
02:58:23.520 | to accommodate a new server
02:58:24.700 | into the network.
02:58:25.520 | And so, what they need to do
02:58:27.160 | is to, first of all,
02:58:28.000 | start from ITSM,
02:58:29.080 | so to submit a ticket
02:58:30.080 | in the service now.
02:58:34.380 | Now, our system here,
02:58:36.420 | the UI I'm showing right here
02:58:38.620 | is the UI of the actual system
02:58:40.600 | we've built,
02:58:41.040 | the application we've built.
02:58:41.960 | We have ingested information
02:58:43.880 | about the tickets here
02:58:47.160 | in natural language.
02:58:48.060 | And so, the agents here
02:58:50.040 | are able to actually start
02:58:50.900 | to work on this.
02:58:51.640 | So, I'm going to play video here
02:58:52.780 | just to make it more relatable.
02:58:55.920 | So, the first thing
02:58:56.680 | that's happening here
02:58:57.340 | is that these agents,
02:58:58.760 | the first agent
02:59:01.660 | is asking
02:59:01.660 | for the information
02:59:04.660 | to be synthesized
02:59:05.780 | in a summarized way
02:59:06.640 | so that they can understand
02:59:07.580 | what to quickly do.
02:59:09.180 | The next action
02:59:10.640 | that has been asked here
02:59:11.620 | is for you to create
02:59:12.560 | an impact assessment.
02:59:13.340 | So, impact assessment
02:59:14.200 | here just means
02:59:14.900 | that I have to understand
02:59:15.740 | will this change
02:59:17.520 | have any implications
02:59:18.480 | for me beyond
02:59:19.540 | the immediate target area?
02:59:22.240 | And that's going
02:59:23.000 | to be summarized
02:59:23.700 | and we are now going
02:59:24.660 | to ask the agents
02:59:25.840 | that is responsible
02:59:26.660 | for this particular task
02:59:27.920 | to go and attach
02:59:29.180 | this information
02:59:29.820 | into the ITSM ticket.
02:59:31.560 | So, I'm going to say
02:59:32.660 | attach this information
02:59:35.360 | about the impact assessment
02:59:36.400 | into the ITSM ticket.
02:59:38.900 | So, that's been done.
02:59:39.700 | Now, the next step
02:59:40.840 | is to actually
02:59:41.640 | create a test plan.
02:59:42.440 | So, test plan
02:59:43.120 | is one of the biggest problems
02:59:44.300 | that our customers
02:59:45.000 | are facing.
02:59:45.480 | They run a lot of tests
02:59:47.840 | but they miss out
02:59:49.080 | on the right test to run.
02:59:50.000 | So, these agents
02:59:51.320 | are actually able
02:59:51.880 | to reason through
02:59:52.480 | a lot of information
02:59:53.140 | about test plans
02:59:54.160 | across the internet
02:59:55.020 | and based on the intent
02:59:56.540 | that was collected
02:59:57.480 | from the ServiceNow ticket
02:59:58.980 | is going to come up
03:00:00.160 | with a list of tests
03:00:01.220 | that you have to run
03:00:02.100 | to be able to make sure
03:00:02.960 | that this firewall rule change
03:00:04.700 | doesn't make a big impact
03:00:06.080 | or create problems
03:00:06.920 | in production environments.
03:00:07.880 | So, as you can see here,
03:00:09.260 | this agent has gone ahead
03:00:10.240 | and actually listed
03:00:11.020 | all of the test cases
03:00:12.080 | that need to be run
03:00:12.840 | and the expected results
03:00:14.340 | for each of the tests.
03:00:15.160 | So, we are going to ask
03:00:16.380 | this agent
03:00:17.100 | to attach this information
03:00:18.400 | again back
03:00:19.100 | to the ITSM ticket
03:00:20.340 | because that's where
03:00:21.520 | the approval board
03:00:22.460 | needs to see
03:00:23.000 | this information
03:00:23.440 | before they approve
03:00:25.340 | the implementation
03:00:25.820 | of this change
03:00:26.660 | in production environment.
03:00:28.140 | So, we can see here
03:00:29.400 | that that information
03:00:30.160 | has now been attached
03:00:31.060 | back by this agent
03:00:32.160 | to the ITSM ticket.
03:00:33.380 | So, two separate systems
03:00:34.800 | but agents
03:00:35.280 | talking to each other.
03:00:35.940 | Now, the next step
03:00:37.920 | is actually run a test
03:00:38.900 | on all of these test cases.
03:00:40.480 | So, in this case,
03:00:42.400 | the configuration file
03:00:43.540 | that is going to be used
03:00:44.300 | to make the change
03:00:44.920 | in the firewall
03:00:45.420 | is sitting in the GitHub repo
03:00:46.740 | and so, we are going
03:00:47.960 | to do a pull request
03:00:48.860 | of that config file
03:00:49.960 | and going to take
03:00:51.080 | that information.
03:00:51.820 | So, this is the GitHub repo
03:00:53.220 | where we're going
03:00:54.100 | to do a pull request.
03:00:54.940 | We're going to take
03:00:56.120 | the link for that pull request
03:00:57.400 | and paste it in the ticket,
03:00:58.760 | the ITSM ticket
03:00:59.680 | and so that when
03:01:01.060 | the execution agent
03:01:02.880 | starts doing his job,
03:01:03.920 | he's actually going
03:01:04.900 | to pull from that
03:01:05.680 | and use it to run his test.
03:01:06.880 | So, at this moment,
03:01:09.060 | we are going to start
03:01:10.560 | running the test.
03:01:11.060 | We're going to ask
03:01:11.700 | this agent to go ahead
03:01:12.960 | and actually run the test
03:01:15.100 | and execute on this test
03:01:16.160 | and so, I have attached
03:01:19.040 | the change, sorry,
03:01:20.400 | I don't have my glasses,
03:01:21.220 | I've attached my change
03:01:23.300 | candidates to the ticket.
03:01:25.580 | Can you go ahead
03:01:26.460 | and run the test?
03:01:27.100 | So, what is going
03:01:28.160 | to happen here is
03:01:28.880 | if you look on the right
03:01:29.580 | hand side of this screen
03:01:30.520 | here, a series of things
03:01:32.220 | are happening.
03:01:32.600 | The first thing is
03:01:33.420 | that this agent
03:01:34.460 | called the executor agent
03:01:35.580 | goes, looks at the test cases
03:01:37.600 | and then it goes
03:01:38.760 | into the knowledge graph
03:01:39.620 | and it's going to go ahead
03:01:40.980 | and actually do a snapshot
03:01:42.120 | of the most recent view
03:01:44.580 | or most recent information
03:01:45.940 | about the network.
03:01:46.660 | It's now going to take
03:01:48.300 | the pull request
03:01:49.140 | that it pulled from GitHub,
03:01:50.500 | the snapshot it just took
03:01:52.660 | from the knowledge graph,
03:01:53.480 | it's going to compute it
03:01:54.920 | together and then run
03:01:56.440 | all of the individual tests
03:01:57.660 | one at a time.
03:01:58.380 | So, we can see
03:01:59.260 | that it's running the test,
03:02:00.160 | test one, test two,
03:02:01.580 | test three, test four.
03:02:02.480 | So, all of this is happening
03:02:03.320 | in what we call a digital twin.
03:02:05.400 | So, a digital twin, again,
03:02:06.500 | is a combination
03:02:07.720 | of the knowledge graph
03:02:10.160 | and a set of tools
03:02:10.160 | that you can use
03:02:10.580 | to run a test.
03:02:11.280 | So, an example
03:02:12.280 | of a tool here
03:02:12.800 | could be Batfish
03:02:13.480 | or could be RouteNet
03:02:14.940 | or some other tools
03:02:15.880 | that you use
03:02:16.240 | for network engineering purposes.
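A minimal sketch of that digital-twin loop: apply the candidate change to a snapshot of the network state, then run each test case against the result, one at a time. The snapshot, change, and check functions below are invented for illustration; a real system would call a tool like Batfish rather than inspect dicts:

```python
# Hypothetical digital-twin test loop over a network snapshot.
def apply_change(snapshot: dict, change: dict) -> dict:
    # Build the twin: snapshot plus the candidate firewall rule.
    twin = dict(snapshot)
    twin["firewall_rules"] = snapshot["firewall_rules"] + [change]
    return twin

def run_tests(twin: dict, tests: list) -> dict:
    # Each test is a (name, check) pair; check inspects the twin state.
    return {name: check(twin) for name, check in tests}

snapshot = {"firewall_rules": [{"allow": "10.0.0.5", "port": 443}]}
change = {"allow": "10.0.0.9", "port": 8080}

tests = [
    ("new server reachable",
     lambda t: any(r["allow"] == "10.0.0.9" for r in t["firewall_rules"])),
    ("existing rule preserved",
     lambda t: any(r["port"] == 443 for r in t["firewall_rules"])),
]

results = run_tests(apply_change(snapshot, change), tests)
print(results)  # both toy tests pass on this example
```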
03:02:18.120 | So, once all of these tests
03:02:20.060 | are completed,
03:02:23.200 | this agent is now going to
03:02:24.420 | generate a report
03:02:25.260 | about the test results.
03:02:28.120 | So, I'll give it some time
03:02:28.120 | to run through this,
03:02:28.820 | it's still running the tests,
03:02:29.860 | but once it concludes
03:02:31.540 | all of the tests,
03:02:32.260 | it's going to report
03:02:33.220 | what actually the test results are.
03:02:36.340 | So, which tests actually passed,
03:02:37.880 | which ones failed.
03:02:39.100 | for the ones that have failed,
03:02:40.140 | it's going to make
03:02:40.680 | some recommendations
03:02:41.220 | on what you can do
03:02:41.960 | to go in and fix the problem.
03:02:43.040 | I'm going to skip to the front here
03:02:45.580 | to just get this
03:02:46.560 | done quickly because of time.
03:02:49.040 | So, it's attached the results
03:02:53.440 | to the ticket,
03:02:54.200 | and this is the report
03:02:55.280 | that it's spitting out
03:02:56.640 | in terms of,
03:02:57.140 | this is the report
03:02:57.760 | for the tests that were run.
03:02:58.640 | So, this execution agent
03:03:00.140 | actually created a report
03:03:01.880 | about all of the different test cases
03:03:03.620 | that were run by the system.
03:03:05.680 | So, very quick short demo here.
03:03:07.880 | There's a lot of detail
03:03:09.120 | behind the scenes,
03:03:09.740 | but I can answer some questions offline.
03:03:11.680 | One of the things
03:03:13.580 | I want to leave you with,
03:03:14.820 | before I go to the end of this,
03:03:16.040 | is that evaluation
03:03:17.980 | is very critical here
03:03:19.280 | for us to be able to understand
03:03:21.140 | how this delivers value
03:03:22.080 | to customers.
03:03:22.580 | We're looking at
03:03:24.500 | a variety of things here.
03:03:26.240 | So, the agents themselves,
03:03:27.500 | the knowledge graph,
03:03:28.680 | slash digital twin,
03:03:29.740 | and we're looking at
03:03:31.000 | what can we actually measure
03:03:32.680 | quantifiably.
03:03:33.680 | Now, for the knowledge graph,
03:03:34.860 | we're looking at
03:03:35.420 | extrinsic metrics,
03:03:37.700 | particularly not intrinsic ones
03:03:39.560 | because we want to map this
03:03:40.520 | back to the customer's use case.
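The distinction shows up in code too: an intrinsic metric scores the graph itself, while an extrinsic metric scores the downstream task the customer cares about. A toy contrast, with invented numbers:

```python
# Intrinsic metric: a property of the graph itself, e.g. edge density
# (fraction of possible directed edges that actually exist).
def edge_density(num_nodes: int, num_edges: int) -> float:
    possible = num_nodes * (num_nodes - 1)
    return num_edges / possible if possible else 0.0

# Extrinsic metric: how well the system serves the end task, e.g. the
# fraction of customer queries answered correctly (1 = correct).
def task_accuracy(results: list) -> float:
    return sum(results) / len(results)

print(edge_density(10, 18))          # 0.2
print(task_accuracy([1, 1, 0, 1]))   # 0.75
```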
03:03:41.860 | So, this is the summary
03:03:43.400 | of what we see
03:03:44.780 | in terms of evaluation metrics.
03:03:45.960 | We are still learning.
03:03:48.140 | This is, for now,
03:03:49.760 | it's an MVP,
03:03:50.620 | but what we are learning so far
03:03:52.900 | is that those two key building blocks,
03:03:54.560 | the knowledge graph,
03:03:55.780 | and the open framework
03:03:57.760 | for building agents
03:03:58.500 | are very critical for us
03:03:59.440 | to actually build
03:04:00.000 | a scalable system
03:04:01.300 | for our customers.
03:04:02.320 | And so, I'm going to stop.
03:04:04.340 | It's eight seconds to go.
03:04:05.340 | Thank you for listening to me,
03:04:06.860 | and then if you have questions,
03:04:07.860 | I'll be out there.
03:04:08.400 | Thank you so much, Ola.
03:04:23.140 | That was fantastic.
03:04:24.220 | I love getting a deep dive,
03:04:26.180 | and always a perspective
03:04:26.980 | from a product guy.
03:04:28.060 | It's always good to hear.
03:04:29.260 | Keep us set in reality.
03:04:30.660 | Thank you.
03:04:31.600 | So, for the GraphRAG track,
03:04:34.360 | closing out the day
03:04:35.380 | on this track,
03:04:36.260 | it's going to be my friend,
03:04:37.420 | Tom Smoker,
03:04:38.040 | from WhyHow,
03:04:39.140 | who's going to be talking
03:04:40.740 | about legal documents,
03:04:43.060 | and how to turn those
03:04:43.740 | into knowledge graphs, right?
03:04:44.540 | Part of it, yeah.
03:04:45.660 | Awesome.
03:04:46.500 | Quick check on your audio.
03:04:51.060 | Five, four, three, two, one.
03:04:54.920 | Still up?
03:04:59.180 | Oh, there you are.
03:05:00.000 | Oh, yeah.
03:05:00.240 | Beautiful.
03:05:00.500 | Great.
03:05:00.600 | Okay.
03:05:02.720 | Cool.
03:05:18.100 | Thanks, everyone.
03:05:18.840 | When we're ready,
03:05:22.680 | ABK, you just let me know.
03:05:23.700 | Standby.
03:05:24.680 | Change the records.
03:05:26.900 | Good.
03:05:28.420 | Cool.
03:05:29.220 | I can't see a whole lot.
03:05:30.060 | Thank you.
03:05:30.440 | I'll get started.
03:05:31.580 | Okay.
03:05:32.560 | Oh, no.
03:05:32.820 | Standby.
03:05:33.300 | One moment.
03:05:33.860 | We have to change the records.
03:05:35.040 | No worries.
03:05:35.820 | The drive, I'm sorry.
03:05:46.100 | Thank you.
03:05:57.920 | I have bad eyesight
03:05:59.460 | and an Australian accent,
03:06:00.540 | so this is not a great combination,
03:06:02.200 | so I appreciate you working with me.
03:06:03.760 | Thank you.
03:06:04.180 | Hello, everyone.
03:06:06.140 | I am here to talk about GraphRAG,
03:06:08.640 | as we're here for the track,
03:06:09.440 | but I'm talking about
03:06:10.340 | what to do in the legal industry
03:06:12.400 | and what we do in the legal industry,
03:06:13.940 | and what does it look like
03:06:14.700 | to turn documents into graphs
03:06:15.900 | and use those graphs in the age of AI.
03:06:17.560 | I tend to have to qualify
03:06:19.980 | why I'm at places.
03:06:20.920 | There's various reasons
03:06:22.280 | why I could be talking today.
03:06:23.420 | You choose the one that you want to,
03:06:24.900 | but generally,
03:06:25.520 | I've been working in graphs
03:06:26.420 | for about a decade.
03:06:27.240 | I have a good relationship
03:06:28.680 | with the Neo4j team,
03:06:30.260 | and I've been doing graphs
03:06:31.440 | for a long time,
03:06:32.140 | but primarily,
03:06:32.940 | I am the technical founder
03:06:34.140 | of a company called
03:06:34.700 | whyhow.ai,
03:06:35.540 | and we find cases first
03:06:38.680 | before lawyers do
03:06:40.360 | and then give them to lawyers.
03:06:42.400 | Now, how we find these cases
03:06:45.720 | is a process that I'll go through,
03:06:47.280 | but we use a variation of graphs,
03:06:48.660 | multi-agent systems,
03:06:49.980 | signals, etc.,
03:06:50.900 | and I'll detail through today
03:06:52.920 | how we do that
03:06:53.880 | at a high level
03:06:54.480 | and a low level,
03:06:55.080 | and I'm happy to answer questions
03:06:56.220 | at any point.
03:06:58.980 | This is broadly what we do.
03:07:00.060 | We work in law.
03:07:01.220 | This is an example.
03:07:02.340 | We find class action,
03:07:03.560 | mass tort cases
03:07:04.280 | before other people do.
03:07:05.540 | We have agents.
03:07:07.300 | We have graphs.
03:07:08.340 | We store that information.
03:07:09.200 | We scrape the web.
03:07:09.900 | We qualify that
03:07:10.660 | with a proprietary process,
03:07:11.820 | and we deal with lawyers every day
03:07:15.020 | and understand exactly how they think
03:07:16.800 | and build these cases,
03:07:17.800 | and the cases I'm referring to
03:07:19.180 | would be like many people
03:07:20.600 | used a pharmaceutical product.
03:07:21.920 | That product has caused them harm.
03:07:23.840 | Science has proved that harm,
03:07:25.100 | and we can collect those people
03:07:26.380 | and collectively sue
03:07:27.320 | the pharmaceutical company.
03:07:28.700 | So we support the law firms
03:07:29.760 | that do that.
03:07:30.340 | And as I'm talking,
03:07:31.320 | and everyone here
03:08:31.860 | for the GraphRAG track
03:07:32.660 | can start to imagine
03:07:33.420 | that I'm starting to develop
03:07:34.400 | a bit of a schema there.
03:07:35.320 | I'm describing individuals.
03:07:36.660 | I'm describing products.
03:07:37.600 | Those products have ingredients.
03:07:38.640 | Those ingredients have concentrations.
03:07:39.960 | Those concentrations may have
03:07:41.520 | an ID number,
03:07:42.160 | and all of a sudden,
03:07:42.880 | you can start to imagine
03:07:44.160 | there is this large,
03:07:45.860 | networked, schematized bit of data
03:07:48.120 | that has particular points in it
03:07:49.940 | that are very valuable
03:07:50.780 | and very visual
03:07:51.720 | and very useful
03:07:52.420 | to domain experts.
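That individual → product → ingredient → concentration chain can be sketched as a tiny property graph. This toy example (every name and value invented) keeps the graph as an adjacency map keyed by labeled edges and shows the kind of multi-hop lookup that makes the structure useful:

```python
# Toy legal-case schema as an adjacency map (all names invented):
# individual -used-> product -contains-> ingredient -measured_at-> concentration
graph = {
    ("Person:alice", "used"): ["Product:reliefex"],
    ("Product:reliefex", "contains"): ["Ingredient:compound_x"],
    ("Ingredient:compound_x", "measured_at"): ["Concentration:40mg"],
}

def hop(nodes: list, relation: str) -> list:
    # Follow one labeled edge type outward from a set of nodes.
    out = []
    for n in nodes:
        out.extend(graph.get((n, relation), []))
    return out

# Multi-hop query: which concentrations is Alice exposed to?
exposure = hop(hop(hop(["Person:alice"], "used"), "contains"), "measured_at")
print(exposure)  # ['Concentration:40mg']
```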
03:07:53.300 | So I'm going to start
03:07:56.120 | to use some definitions
03:07:57.080 | because knowledge graphs
03:07:58.300 | have been around for a long time,
03:07:59.240 | and ABK would know that
03:08:00.420 | more than I would,
03:08:01.100 | but I started my PhD
03:08:02.640 | and won my master's
03:08:03.800 | in graphs in 2016,
03:08:04.860 | and it was not nearly
03:08:05.720 | as popular as it is now,
03:08:06.700 | and it's fascinating
03:08:07.240 | to see how far it's come,
03:08:08.280 | but I do think it's important
03:08:09.540 | for me to define
03:08:10.160 | how we use them
03:08:10.780 | and how we think about them.
03:08:11.720 | Broadly, to me,
03:08:14.660 | graphs are relations.
03:08:15.500 | That's part of the visual element,
03:08:17.080 | and there's a back-end element
03:08:18.060 | as well,
03:08:18.460 | but it's the benefit
03:08:19.720 | of using graphs
03:08:20.740 | is that I can see
03:08:21.760 | what is connected
03:08:22.680 | to something else.
03:08:23.440 | I can be explicit
03:08:24.320 | about what is connected
03:08:25.160 | to something else,
03:08:26.060 | and I can do
03:08:26.640 | mass analytics
03:08:27.380 | on what is connected
03:08:28.240 | to something else.
03:08:29.000 | All the way from
03:08:29.960 | I can see it down
03:08:30.840 | to I can do
03:08:31.440 | large-scale analytics
03:08:32.740 | on it is the value
03:08:33.760 | of the relations.
03:08:34.540 | And when I use relations,
03:08:35.920 | it's not necessarily
03:08:36.880 | node-to-node.
03:08:37.840 | It can be node-to-node-to-node.
03:08:39.540 | It can be multi-hop.
03:08:40.380 | It can be as varied
03:08:41.620 | and as forked
03:08:42.220 | and as distributed
03:08:42.820 | as you want.
03:08:43.420 | This is why we use graphs
03:08:45.060 | in our process.
03:08:45.880 | Broadly,
03:08:48.560 | throughout the process
03:08:49.300 | of running this company
03:08:49.940 | and previously as an academic,
03:08:50.980 | this is what I think
03:08:51.760 | is easy about graphs.
03:08:52.660 | People look at them
03:08:53.860 | and go,
03:08:54.280 | well, that's fantastic.
03:08:54.920 | I have a great understanding
03:08:55.940 | of what this is.
03:08:56.560 | And someone else says,
03:08:57.620 | me too.
03:08:58.080 | And there isn't necessarily
03:08:59.340 | a consistency
03:09:00.000 | in what those two people
03:09:00.980 | just said.
03:09:01.460 | They may have
03:09:02.220 | a different understanding
03:09:03.020 | of what is represented.
03:09:03.940 | Broadly,
03:09:06.020 | throughout my career,
03:09:06.980 | these are the things
03:09:07.740 | that are difficult
03:09:08.300 | about graphs, right?
03:09:09.460 | And you can say
03:09:11.340 | that they're nodes
03:09:12.060 | connected to edges.
03:09:12.820 | You can say they're distributed.
03:09:13.760 | You can say they're backed up.
03:09:14.860 | There's a variety
03:09:15.800 | of ways in which people
03:09:16.900 | use the data
03:09:17.840 | that they have,
03:09:18.980 | the way they store it
03:09:19.740 | and the way they talk about it.
03:09:20.720 | And now,
03:09:21.700 | as graphs have become
03:09:22.880 | very necessary
03:09:23.700 | and consistent
03:09:24.340 | for things like
03:09:25.000 | graph rag,
03:09:25.520 | for things like
03:09:26.040 | structured data,
03:09:26.720 | et cetera,
03:09:27.200 | more and more people
03:09:28.460 | are coming to this
03:09:29.160 | relatively niche area
03:09:30.480 | previously
03:09:31.120 | that even at the time
03:09:32.620 | wasn't necessarily
03:09:33.240 | agreed upon what it was.
03:09:34.340 | So I do like to define
03:09:36.100 | what it is we're using.
03:09:37.420 | So graphs
03:09:38.080 | and multi-agent systems.
03:09:39.660 | These are the two things
03:09:40.500 | that I want to define
03:09:41.220 | as there's a variety
03:09:42.640 | of ways that people use them.
03:09:45.800 | This is how we use
03:09:46.740 | multi-agent systems, right?
03:09:48.120 | So now,
03:09:50.240 | multi-agent systems
03:09:51.420 | are all the way
03:09:52.800 | from very specifically
03:09:54.440 | define what you're dealing
03:09:55.500 | with and chain those together
03:09:57.080 | and use an LLM
03:09:58.260 | to glue it all together.
03:09:59.140 | Or it is,
03:09:59.860 | in our case,
03:10:00.460 | break down a complicated
03:10:02.300 | white-collar workflow
03:10:03.780 | down into a specific
03:10:05.520 | set of steps
03:10:06.400 | that I can I-O test, right?
03:10:08.240 | And each of those steps
03:10:09.420 | have different requirements,
03:10:10.320 | different frequencies,
03:10:11.120 | different state.
03:10:12.380 | and that state
03:10:13.020 | can be controlled
03:10:13.740 | often, in our case,
03:10:15.160 | by a graph.
03:10:16.000 | This is why we like
03:10:19.560 | to use them.
03:10:20.060 | When we're building
03:10:21.260 | an application
03:10:21.840 | for the legal industry,
03:10:23.180 | I'm not sure
03:10:23.720 | if you guys know this,
03:10:24.400 | but lawyers don't really
03:10:25.140 | like when things
03:10:25.780 | are incorrect, right?
03:10:26.640 | It is basically
03:10:27.900 | the whole industry
03:10:28.640 | is make this very
03:10:29.520 | specifically correct
03:10:30.500 | and proper
03:10:30.980 | and definitely
03:10:31.560 | in the right language.
03:10:32.460 | So when it comes
03:10:33.560 | to building applications,
03:10:34.640 | probabilistic large
03:10:36.180 | language models
03:10:36.820 | don't necessarily work
03:10:37.940 | for that just in isolation.
03:10:39.100 | I need to have
03:10:40.140 | a very specific control
03:10:41.800 | and structure
03:10:42.500 | and schema
03:10:43.160 | for the way
03:10:44.100 | that we build
03:10:44.600 | these systems
03:10:45.220 | and I need to be able
03:10:46.400 | to test and be able
03:10:47.120 | to pinpoint exactly
03:10:47.980 | what is going right
03:10:48.740 | and wrong
03:10:49.120 | at any point in time.
03:10:50.080 | Here's some of the issues
03:10:53.920 | with that, right?
03:10:54.980 | And we've heard about
03:10:55.760 | multi-agent systems a lot.
03:10:57.900 | I'm sure other people
03:10:58.500 | have as well.
03:10:58.960 | Sometimes the part
03:11:01.640 | in the workflow
03:11:02.320 | is much more important
03:11:03.480 | than the other part.
03:11:04.240 | Sometimes there's parts
03:11:05.160 | in the workflow
03:11:05.620 | I don't particularly care about.
03:11:06.960 | There are also agents
03:11:08.780 | in the world.
03:11:09.340 | Agents imply
03:11:10.220 | that these things
03:11:10.720 | are very capable,
03:11:11.580 | but I can write
03:11:12.040 | a bad prompt
03:11:12.620 | very easily
03:11:13.200 | and all of a sudden
03:11:13.800 | I have a bad agent.
03:11:14.600 | So when it comes
03:11:15.940 | to what is the agent
03:11:17.140 | that I trust,
03:11:17.820 | very few.
03:11:19.300 | We spend a lot
03:11:20.300 | of time guard railing
03:11:21.280 | as much as we possibly can.
03:11:22.520 | We spend time making
03:11:23.720 | so that the memory
03:11:24.440 | is not just immediate
03:11:25.320 | but it's episodic.
03:11:26.260 | We spend time capturing
03:11:27.680 | the information state
03:11:28.820 | over time
03:11:29.320 | and then pruning that state.
03:11:30.460 | And again,
03:11:31.360 | to bring it back,
03:11:32.200 | capturing,
03:11:33.040 | expanding,
03:11:34.280 | pruning,
03:11:35.100 | structuring,
03:11:36.200 | and then querying state
03:11:37.720 | for us happens
03:11:38.940 | in a graphical format
03:11:39.900 | because the necessity
03:11:41.380 | of having the structure,
03:11:42.540 | having the extendability,
03:11:43.940 | and then having the ability
03:11:44.960 | to remove that extension
03:11:46.220 | is really important for us.
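That capture/expand/prune cycle over graph-shaped state can be sketched in a few lines. The structure below is illustrative only, not WhyHow's actual implementation: state is a set of (subject, relation, object) triples, with operations to add facts, walk out from a node, and drop stale facts:

```python
# Illustrative agent state kept as a set of (subject, relation, object)
# triples: capture adds facts, expand reads outward from a node,
# prune removes facts that are no longer relevant.
class GraphState:
    def __init__(self):
        self.triples = set()

    def capture(self, subj, rel, obj):
        self.triples.add((subj, rel, obj))

    def expand(self, subj):
        # Return every fact one hop out from a node.
        return {t for t in self.triples if t[0] == subj}

    def prune(self, predicate):
        # Remove triples the predicate marks as stale.
        self.triples = {t for t in self.triples if not predicate(t)}

state = GraphState()
state.capture("case_42", "mentions", "product_a")
state.capture("case_42", "status", "draft")
state.capture("case_42", "status", "filed")

# Prune the stale draft status once the case is filed.
state.prune(lambda t: t == ("case_42", "status", "draft"))
print(sorted(state.triples))
```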
03:11:47.840 | And then finally,
03:11:48.740 | I'm trying not to make this
03:11:50.040 | too in-depth
03:11:51.060 | or too heavy on numbers,
03:11:51.840 | but 95% accuracy
03:11:54.780 | for a single agent
03:11:55.880 | I think is a tall order
03:11:57.980 | at this point.
03:11:58.800 | Maybe people have
03:11:59.900 | entirely accurate agents.
03:12:01.680 | I'm very happy for you.
03:12:03.000 | I don't have that
03:12:03.700 | exactly right now.
03:12:04.640 | I have systems
03:12:05.620 | that I can put in place
03:12:06.540 | like guardrails
03:12:07.260 | and humans in the loop
03:12:08.100 | that can bring these agents
03:12:09.380 | to a point
03:12:09.860 | that it is accurate enough
03:12:10.800 | that people are willing
03:12:11.460 | to use them.
03:12:12.060 | However,
03:12:12.960 | five 95%-accurate agents
03:12:15.240 | chained together sequentially
03:12:16.880 | yield about 77% expected accuracy.
03:12:19.220 | That's not that many agents
03:12:21.160 | in a row.
03:12:21.640 | If you think about a workflow,
03:12:22.860 | that's five steps.
03:12:24.200 | And if I'm basically saying
03:12:26.000 | that if each of those five steps
03:12:28.300 | is 95% accurate,
03:12:28.300 | already quite a hard thing to ask,
03:12:29.820 | especially if there's
03:12:30.440 | an LLM involved.
03:12:31.940 | Now we're at 77% of the time
03:12:33.560 | it gets to the end
03:12:34.160 | of that workflow
03:12:34.620 | in the way that I want.
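The arithmetic behind that 77% figure is just independent per-step probabilities multiplied together:

```python
# Expected end-to-end success rate of a sequential agent pipeline,
# assuming each step succeeds independently with the same probability.
def pipeline_accuracy(step_accuracy: float, num_steps: int) -> float:
    return step_accuracy ** num_steps

# Five 95%-accurate agents chained in sequence:
print(round(pipeline_accuracy(0.95, 5), 3))  # 0.774
```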
03:12:35.260 | That is part of,
03:12:37.280 | probably if I was to summarize
03:12:38.460 | my main problem,
03:12:39.120 | it would be that.
03:12:39.500 | It would be decision making
03:12:40.280 | under uncertainty
03:12:41.000 | throughout the process
03:12:42.160 | of building these systems.
03:12:43.980 | That's the background.
03:12:47.660 | That's how we understand
03:12:49.260 | these systems.
03:12:49.680 | We use multi-agent systems
03:12:50.880 | and we're naturally skeptical.
03:12:51.780 | We use graphs every day
03:12:53.360 | and we have a natural skepticism
03:12:55.240 | of exactly how these things
03:12:56.540 | are stored and structured.
03:12:57.280 | But we use them specifically
03:12:58.580 | and consistently
03:12:59.180 | in the way that we like.
03:13:00.880 | So, I am using the term agent
03:13:02.740 | because everyone's using
03:13:03.620 | the term agent.
03:13:04.180 | We build litigation agents.
03:13:05.640 | Litigation is the process of,
03:13:07.300 | well, I'm going to summarize,
03:13:08.580 | but we work with class action
03:13:10.240 | slash mass tort law.
03:13:11.220 | As I said before,
03:13:12.320 | get everyone together.
03:13:13.740 | They were harmed.
03:13:14.540 | Put that harm all in place
03:13:16.080 | and then sue a pharmaceutical company.
03:13:17.360 | Now, we don't do any
03:13:19.260 | of the litigating as a company
03:13:20.720 | or the suing,
03:13:21.580 | but we do support
03:13:22.420 | the lawyers who do that.
03:13:23.620 | We do that in a few different ways.
03:13:25.300 | Here is one of the ways
03:13:29.800 | that we look at the legal industry,
03:13:31.020 | right?
03:13:31.360 | Without exception,
03:13:32.820 | everything needs to be perfect.
03:13:34.560 | It needs to be accurate.
03:13:35.280 | It needs to be written
03:13:35.820 | in the correct way, right?
03:13:37.020 | There's also,
03:13:39.360 | once you have that correct format,
03:13:41.340 | creative arguments.
03:13:43.540 | The best lawyers
03:13:44.520 | are very, very, very detail-oriented
03:13:46.860 | and then very, very creative
03:13:49.000 | in the way
03:13:49.480 | that they can apply
03:13:50.140 | those details to a case.
03:13:51.500 | For example,
03:13:52.160 | there was an issue
03:13:53.400 | with Netflix
03:13:55.100 | and they were capturing data
03:13:56.940 | from their users
03:13:57.600 | as they do and they should
03:13:58.620 | and I'm a Netflix user
03:13:59.640 | and they capture my data
03:14:00.480 | and I appreciate it
03:14:01.160 | because they give me
03:14:01.640 | the better shows
03:14:02.180 | that I'd like to watch.
03:14:02.980 | However, there is a legal limit
03:14:04.660 | as to how much information
03:14:05.560 | they can capture from me, right?
03:14:06.820 | And you cannot surpass
03:14:08.000 | that legal limit
03:14:08.600 | or you can,
03:14:09.080 | but then you can go
03:14:09.700 | into the process of litigation.
03:14:11.000 | Now, if you surpass that,
03:14:14.200 | there needs to be a precedent
03:14:15.600 | as to why someone could say
03:14:17.820 | you cannot capture
03:14:19.120 | this much information
03:14:20.000 | and the particular precedent
03:14:21.060 | I'm referring to
03:14:21.800 | is many years ago,
03:14:23.420 | Blockbuster was sued
03:14:24.540 | for keeping too many details
03:14:26.720 | about the literal physical DVDs
03:14:28.700 | that people rented.
03:14:29.320 | That is a reasonably creative way
03:14:31.880 | to say,
03:14:32.160 | look, I remember
03:14:33.380 | that Blockbuster happened
03:14:34.380 | and what Netflix is doing
03:14:35.920 | isn't that different.
03:14:36.660 | It may be in a digital format,
03:14:37.800 | it may be at a larger scale,
03:14:38.740 | it may be into an algorithm
03:14:39.660 | instead of someone
03:14:40.360 | who's recommending it.
03:14:41.160 | However,
03:14:41.540 | that is an interesting application
03:14:43.680 | of what I'm doing.
03:14:48.280 | So these problems, then,
03:14:50.760 | which are the necessary accuracy,
03:14:52.140 | and then creativity
03:14:53.340 | on top of that accuracy,
03:14:55.100 | and then all of that information
03:14:56.520 | kept in separate places,
03:14:57.580 | and a lot of that creativity
03:14:58.540 | coming from the latent knowledge
03:14:59.580 | in the expert's head,
03:15:01.760 | start to hit a bit of a wall
03:15:01.760 | when you say,
03:15:02.280 | well, I have these probabilistic agents
03:15:03.880 | that you could argue
03:15:04.840 | aren't that creative, right?
03:15:06.360 | I have these agents
03:15:07.680 | that most of the time
03:15:08.880 | do a pretty good job
03:15:09.820 | and can be creative
03:15:11.220 | in a way that frankly
03:15:12.120 | can be quite frustrating
03:15:13.140 | especially to a lawyer.
03:15:14.260 | So this butts heads
03:15:17.020 | in terms of exactly
03:15:18.220 | how lawyers want to deal
03:15:18.960 | with this information
03:15:19.600 | and again,
03:15:20.000 | I'm painting a very broad brush.
03:15:21.200 | I'm not a lawyer,
03:15:21.880 | my co-founder is.
03:15:22.760 | If anyone is a lawyer
03:15:23.520 | in the audience
03:15:23.980 | who's offended,
03:15:24.360 | I do apologize
03:15:25.000 | but this is broadly
03:15:26.380 | what I've seen
03:15:26.960 | to be accurate.
03:15:29.960 | We help with legal discovery
03:15:31.340 | as well, right?
03:15:32.400 | Like I described before
03:15:33.380 | there could be
03:15:33.760 | an unnamed pharmaceutical company
03:15:35.040 | and a pharmaceutical company
03:15:35.980 | is great
03:15:36.340 | but they happen
03:15:37.440 | to have done some harm, right?
03:15:38.880 | And it is in their best interest
03:15:40.880 | to give all of the information
03:15:42.260 | to the law firm
03:15:43.320 | and describe exactly
03:15:44.500 | well, not exactly
03:15:45.780 | describe in as many ways
03:15:47.140 | as possible
03:15:47.700 | here is 500 gigabytes
03:15:50.180 | of emails that don't matter
03:15:51.820 | go nuts, right?
03:15:53.060 | Figure out exactly
03:15:54.340 | what happened at what point
03:15:55.440 | and bring out the information.
03:15:56.700 | Now, that is a challenge
03:15:58.020 | at the moment.
03:15:58.560 | A lot of the time
03:15:59.640 | it's manually reviewed.
03:16:00.480 | There are shortcuts
03:16:01.400 | and processes by necessity
03:16:02.820 | because a lot of these lawsuits
03:16:04.020 | are on a particular timeline.
03:16:05.400 | It is physically impossible
03:16:07.220 | to read all of the information
03:16:09.020 | that is given
03:16:09.520 | in the discovery
03:16:10.100 | of the processing of a lawsuit.
03:16:11.280 | However,
03:16:12.660 | and this is just
03:16:13.740 | a generic graph I used
03:16:14.880 | because I'm not allowed
03:16:15.540 | to use the ones
03:16:16.020 | that I'm currently working on.
03:16:17.040 | However,
03:16:17.920 | if you can take
03:16:19.860 | all of that information,
03:16:20.660 | you can extract
03:16:21.480 | the information
03:16:22.100 | and structure it
03:16:22.760 | in such a way
03:16:23.360 | that it is consistent,
03:16:24.240 | all of a sudden
03:16:25.220 | that mountain of emails
03:16:26.540 | becomes a lot of information
03:16:28.360 | I can immediately dismiss
03:16:29.440 | and a bunch
03:16:30.000 | of genuinely useful information
03:16:32.820 | that I can look at.
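As a rough illustration of that extraction-and-structuring step, here is a minimal Python sketch. The extractor is deliberately stubbed out (in practice an NER or LLM pipeline would fill that role), and every entity and relation name here is hypothetical, not from the talk:

```python
from collections import defaultdict

def extract_triples(email_text):
    """Stubbed entity/relation extractor. A real pipeline (NER or an
    LLM) would produce (subject, relation, object) triples; here we
    fake it with keyword checks so the sketch is self-contained."""
    triples = []
    if "shipped" in email_text:
        triples.append(("Alice", "APPROVED_SHIPMENT", "Batch-42"))
    if "concentration" in email_text:
        triples.append(("Batch-42", "HAS_ISSUE", "HighConcentration"))
    return triples

def build_graph(emails):
    """Fold a pile of emails into one consistent node/edge structure,
    so repeated mentions collapse into a single node."""
    nodes, edges = set(), defaultdict(set)
    for email in emails:
        for subj, rel, obj in extract_triples(email):
            nodes.update([subj, obj])
            edges[rel].add((subj, obj))
    return nodes, edges

emails = [
    "We shipped the batch on Tuesday.",
    "The concentration readings were out of spec.",
]
nodes, edges = build_graph(emails)
```

Once the mountain of emails is in this shape, "immediately dismiss" becomes a set operation: anything not connected to an issue node can be dropped.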
03:16:33.600 | And not just that,
03:16:34.420 | when it comes to a graph,
03:16:35.800 | I can actually augment
03:16:37.480 | the information
03:16:38.160 | from discovery
03:16:38.840 | and then I can give
03:16:39.720 | that visual
03:16:40.400 | to the expert
03:16:41.440 | who can make
03:16:41.900 | an immediate decision.
03:16:42.840 | I'm going to loop back
03:16:44.360 | to the example
03:16:44.860 | I was describing before,
03:16:46.460 | which is the pharmaceutical example.
03:16:47.720 | So again,
03:16:48.480 | if ingredients
03:16:49.120 | are a certain concentration,
03:16:50.120 | that concentration
03:16:50.740 | is at a problem,
03:16:51.640 | that problem happened
03:16:52.440 | at a certain time,
03:16:53.260 | there is only going
03:16:54.640 | to be a few people
03:16:55.580 | in that graph
03:16:56.660 | of potentially millions
03:16:57.680 | of nodes
03:16:58.060 | that are a problem,
03:16:58.700 | right?
03:16:59.400 | In the same way
03:17:00.380 | that there are only
03:17:00.760 | a few people
03:17:01.280 | in that mountain
03:17:01.940 | of documents
03:17:02.520 | that were a problem.
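That kind of narrowing is, at heart, a reverse traversal over the graph: start from the problem node and walk back to the handful of people connected to it. A toy sketch, with made-up node and relation names:

```python
# Toy evidence graph as (subject, relation, object) triples.
edges = [
    ("Alice", "APPROVED", "Batch-42"),
    ("Bob",   "APPROVED", "Batch-07"),
    ("Carol", "APPROVED", "Batch-42"),
    ("Batch-42", "HAS_ISSUE", "HighConcentration"),
]

def people_linked_to_issue(edges, issue):
    """Walk backwards from a problem node to the few people
    connected to it, out of a potentially huge graph."""
    bad_batches = {s for s, r, o in edges
                   if r == "HAS_ISSUE" and o == issue}
    return {s for s, r, o in edges
            if r == "APPROVED" and o in bad_batches}

suspects = people_linked_to_issue(edges, "HighConcentration")
print(sorted(suspects))  # -> ['Alice', 'Carol']
```

At production scale this would be a graph-database query rather than a list comprehension, but the shape of the hone-in is the same.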
03:17:03.240 | However,
03:17:03.820 | now I've changed
03:17:04.420 | the form factor
03:17:05.160 | such that I can
03:17:05.880 | specifically hone in
03:17:06.960 | on what matters
03:17:07.700 | and not just hone in
03:17:08.940 | in a data-driven way,
03:17:10.180 | I can hone in
03:17:11.020 | in a visual way
03:17:11.820 | in natural language
03:17:12.620 | such that the lawyer
03:17:13.520 | who knows exactly
03:17:14.600 | what that natural language
03:17:16.120 | means or the expert
03:17:17.340 | who knows exactly
03:17:17.940 | what that natural language
03:17:18.600 | means can make
03:17:19.540 | a decision
03:17:20.100 | that's data-driven.
03:17:23.060 | There's also a process
03:17:24.160 | if we can build
03:17:24.960 | this information exactly
03:17:25.980 | and I'm giving
03:17:26.640 | the fundamentals,
03:17:27.260 | this is a graph rag talk,
03:17:28.960 | we want to bring
03:17:29.660 | this graph in.
03:17:30.260 | The graph I just described
03:17:31.520 | is not that large,
03:17:32.320 | the graph I just described
03:17:33.240 | has a consistent schema
03:17:34.260 | and the graph
03:17:34.680 | I just described
03:17:35.280 | can be relatively
03:17:36.780 | easily retrieved.
03:17:37.960 | I'm not going to say
03:17:38.620 | that retrieval
03:17:39.180 | is completely solved,
03:17:40.080 | I am going to say
03:17:41.060 | we have agents
03:17:41.620 | in production right now
03:17:42.500 | that lawyers can
03:17:43.160 | in natural language
03:17:43.900 | query and further
03:17:45.560 | understand the lawsuit
03:17:47.320 | and the individuals
03:17:48.080 | that they're representing.
03:17:52.140 | Now we get to case research.
03:17:53.320 | So that was more discovery,
03:17:54.440 | right, mountain of documents.
03:17:55.580 | Case research would be
03:17:57.500 | a lot of people
03:17:58.420 | used said product
03:17:59.680 | and they're complaining
03:18:00.980 | about it online
03:18:01.700 | and this is a lot of the value
03:18:02.700 | of our company
03:18:03.520 | and what we do.
03:18:04.020 | People can complain
03:18:05.640 | all the time.
03:18:06.820 | They can shout into the void
03:18:07.980 | of a niche subreddit
03:18:08.980 | or they can go on Twitter
03:18:10.080 | or they can be on a forum
03:18:11.240 | that they're used to,
03:18:11.880 | they can be in IRC,
03:18:12.700 | they can be wherever they want,
03:18:13.880 | right,
03:18:14.140 | but they're using similar language
03:18:15.700 | about a specific thing
03:18:16.720 | and so when it comes
03:18:18.200 | to traditional case research,
03:18:19.440 | that information
03:18:20.540 | isn't really discovered.
03:18:21.860 | A lot of the time
03:18:22.540 | it happens
03:18:22.920 | through talking
03:18:23.380 | to another individual,
03:18:24.220 | subscribing to a newsletter,
03:18:25.620 | et cetera.
03:18:26.100 | How do people find
03:18:27.600 | the information?
03:18:28.280 | So this is a graphic
03:18:31.040 | I've taken from our website
03:18:32.040 | which I promise
03:18:32.720 | looks significantly better
03:18:33.660 | than the slides that I make
03:18:34.660 | but I tend to try
03:18:35.740 | and talk to them.
03:18:36.480 | Here is how case research
03:18:39.540 | in our case
03:18:40.640 | for our business works
03:18:42.140 | and that is we start
03:18:43.700 | and scrape the entire web.
03:18:44.860 | Now anyone can scrape
03:18:45.860 | the entire web.
03:18:46.720 | It's doable.
03:18:47.160 | It's a technical challenge
03:18:48.080 | but it's doable
03:18:48.560 | and you can scrape it
03:18:49.320 | at a frequency
03:18:49.820 | in the services, et cetera.
03:18:51.480 | What we do
03:18:51.940 | is scrape the web
03:18:52.560 | and then qualify
03:18:53.120 | the leads of that scraping.
03:18:54.920 | We filter down
03:18:56.060 | all of the information
03:18:56.860 | down to specifically
03:18:57.720 | what the individuals want.
03:18:58.820 | We have schemas
03:18:59.560 | that we work with
03:19:00.120 | particular law firms
03:19:00.920 | and lawyers
03:19:01.320 | and those schemas
03:19:02.660 | get us down
03:19:03.860 | to just the information
03:19:05.040 | that they care about
03:19:05.900 | and look,
03:19:08.140 | maybe there is
03:19:08.860 | but right now
03:19:09.320 | at least for me
03:19:09.800 | there's no such thing
03:19:10.400 | as a perfect case.
03:19:11.380 | There's no such thing
03:19:12.020 | as a perfect lawsuit.
03:19:12.780 | It depends on the lawyer
03:19:14.180 | or the partner
03:19:15.180 | or the firm
03:19:15.620 | who's willing
03:19:15.900 | to take that on.
03:19:17.160 | So it is not
03:19:18.020 | a problem of best.
03:19:19.040 | It's a problem
03:19:19.560 | of specific and personalized
03:19:20.720 | and that is where
03:19:21.620 | things like LLMs
03:19:23.280 | are particularly useful
03:19:24.280 | at the moment.
03:19:24.760 | That's where things
03:19:25.240 | like multi-agent systems
03:19:26.220 | are fantastic.
03:19:26.820 | That's where things
03:19:27.780 | like structured information
03:19:28.600 | and graphs come in:
03:19:28.980 | all of a sudden
03:19:29.580 | a different lawyer
03:19:30.560 | can have a different
03:19:31.100 | multi-agent system
03:19:31.860 | and a different graph
03:19:32.540 | that backs up
03:19:33.060 | their specific way
03:19:33.880 | that they like to work
03:19:34.740 | as opposed to having
03:19:35.820 | a compromise previously
03:19:37.020 | on the way
03:19:37.760 | that everyone else
03:19:38.280 | liked to work
03:19:38.700 | or maybe hear something
03:19:39.460 | if they can.
03:19:40.540 | And from there
03:19:41.260 | once we've honed down
03:19:42.360 | just to the signals
03:19:43.140 | that they care about
03:19:43.800 | the qualified signals
03:19:44.820 | that are specific to them
03:19:45.900 | that signal can then
03:19:47.540 | further generate a report
03:19:49.200 | and that report
03:19:49.780 | can be entirely specific
03:19:50.760 | to the lawyer as well.
03:19:51.660 | So when it comes
03:19:52.600 | to report generation
03:19:53.400 | again multi-agent system
03:19:54.800 | that's backed up
03:19:55.340 | by a schema
03:19:55.820 | and that schema
03:19:56.320 | is consistent and pruned
03:19:57.440 | and that schema
03:19:58.380 | looks like controlled state
03:19:59.800 | with a graph
03:20:00.440 | that can build a report
03:20:01.240 | that the lawyer wants.
03:20:02.020 | And every report
03:20:03.140 | is going to be different
03:20:03.960 | but the structure
03:20:04.540 | is going to be the same
03:20:05.080 | for each lawyer
03:20:05.680 | and each lawyer
03:20:06.160 | has a different process.
03:20:06.980 | What I'm broadly
03:20:08.980 | describing is
03:20:10.120 | mass scraping the web
03:20:12.080 | down to a specific
03:20:13.380 | signal generated
03:20:14.320 | just for the lawyer.
03:20:15.060 | It's an entirely
03:20:15.680 | personalized service
03:20:16.640 | that's been automated.
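The scrape-qualify-report pipeline just described can be sketched end to end in a few lines of Python. A keyword filter stands in for the real ML/schema qualification, and every name, source, and field here is an illustrative assumption:

```python
def scrape(sources):
    """Stand-in for mass web scraping; yields raw posts."""
    for src in sources:
        yield from src

def qualify(posts, keywords):
    """Qualify leads: keep only posts matching the complaint
    keywords a particular lawyer cares about."""
    return [p for p in posts if any(k in p.lower() for k in keywords)]

def generate_report(posts, lawyer):
    """Compose the qualified signals into a report personalized
    to one lawyer's schema."""
    return {"lawyer": lawyer, "signals": posts, "count": len(posts)}

forums = [
    ["my car caught fire on the highway", "great mileage!"],
    ["smoke from the engine again"],
]
report = generate_report(qualify(scrape(forums), ["fire", "smoke"]),
                         lawyer="J. Doe")
```

Swapping the keyword list per lawyer is the toy version of the per-lawyer schemas described above: same pipeline, different filter, different report.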
03:20:17.360 | And that is the process
03:20:19.380 | of what we do
03:20:20.060 | and this is part
03:20:21.120 | of how we are able
03:20:22.160 | to manage
03:20:22.900 | and use state
03:20:23.820 | and graphs
03:20:24.320 | and multi-agent systems
03:20:25.340 | to bring the information
03:20:26.560 | together.
03:20:26.980 | Cool.
03:20:29.060 | I'm going to go through
03:20:29.780 | I think I have one case study
03:20:30.800 | that I want to describe
03:20:32.480 | just conscious of time.
03:20:33.600 | This happens.
03:20:36.480 | It's not great.
03:20:38.000 | No one really wants it to.
03:20:39.760 | There may be situations
03:20:40.600 | in which there's a bunch
03:20:41.400 | of people who bought a car
03:20:42.100 | who really wanted it
03:20:42.720 | to catch fire.
03:20:43.200 | We don't necessarily
03:20:43.720 | deal with them.
03:20:44.280 | What we do find
03:20:45.460 | is that there are people
03:20:46.100 | who are driving their car
03:20:46.940 | and it starts to smoke
03:20:47.660 | and then it catches fire
03:20:48.440 | and that is not the behavior
03:20:49.460 | that they intended to happen.
03:20:50.820 | It was not on the brochure
03:20:52.040 | when they bought it.
03:20:52.540 | It's not what they want.
03:20:53.120 | Those people immediately
03:20:55.340 | go and complain
03:20:55.960 | as they should.
03:20:56.680 | They go to government websites.
03:20:58.620 | They go to carcomplaints.com.
03:20:59.920 | They're on a specific
03:21:00.640 | subreddit or forum.
03:21:01.580 | And once we can start
03:21:03.600 | to track that,
03:21:04.300 | which we can,
03:21:04.920 | and once we can start
03:21:05.600 | to scrape
03:21:06.160 | and then structure
03:21:07.060 | and then schematize
03:21:08.100 | and then analyze,
03:21:08.760 | we can start
03:21:09.420 | to basically build
03:21:10.220 | a density of complaints
03:21:11.220 | for a specific vehicle,
03:21:12.200 | for a specific year,
03:21:13.120 | for a specific problem.
03:21:14.460 | And that density
03:21:15.400 | is a combination
03:21:15.980 | of how many complaints
03:21:17.120 | multiplied by the velocity
03:21:18.500 | of complaints,
03:21:19.100 | so a certain amount
03:21:20.460 | per month over a number
03:21:21.940 | of months.
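One plausible reading of that density metric, sketched in Python. The exact formula is my assumption; the talk only says it is the count of complaints multiplied by their velocity (complaints per month over the observed months):

```python
from datetime import date

def complaint_density(complaint_dates):
    """Density = total complaint count multiplied by velocity,
    where velocity is average complaints per active month.
    This exact formula is an assumption, not the speaker's code."""
    if not complaint_dates:
        return 0.0
    months = {(d.year, d.month) for d in complaint_dates}
    count = len(complaint_dates)
    velocity = count / len(months)  # complaints per active month
    return count * velocity

dates = [date(2024, 1, 5), date(2024, 1, 20), date(2024, 2, 3)]
print(complaint_density(dates))  # 3 complaints over 2 months -> 4.5
```

The point of a compound metric like this is that a burst of complaints in a short window scores much higher than the same number spread thinly, which is exactly the signal an early-detection system wants.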
03:21:22.480 | Well, all of a sudden,
03:21:23.780 | we get to the point
03:21:24.740 | where we're finding
03:21:25.340 | these leads particularly early.
03:21:26.660 | And now,
03:21:27.100 | as we're building models,
03:21:28.080 | we're starting
03:21:28.480 | to find these leads
03:21:29.140 | earlier and earlier,
03:21:29.780 | such that we don't necessarily
03:21:30.840 | need the velocity
03:21:31.500 | straight away.
03:21:32.160 | We can start
03:21:32.980 | to figure out
03:21:33.860 | what are the previous lawsuits
03:21:34.880 | which were all public
03:21:35.680 | and very well documented
03:21:36.780 | and exactly what happened
03:21:38.040 | in that process.
03:21:38.800 | And so,
03:21:39.900 | for a large law firm,
03:21:41.580 | maybe eight or nine months
03:21:42.740 | post people starting
03:21:43.760 | to complain,
03:21:44.240 | they can take
03:21:46.040 | that lawsuit on
03:21:46.680 | if they want to.
03:21:47.260 | For us,
03:21:48.040 | we can find it
03:21:48.700 | within about 15 minutes
03:21:50.380 | and then generally
03:21:51.080 | it takes probably a month
03:21:52.480 | for you to be confident
03:21:53.160 | that this is the signal
03:21:53.880 | that you want.
03:21:54.660 | And so we can find
03:21:55.300 | things significantly earlier.
03:21:56.340 | That process, again,
03:21:57.260 | scraping the web,
03:21:58.040 | filtering down,
03:21:58.680 | producing the specific report.
03:21:59.940 | This is an example
03:22:00.960 | that we did.
03:22:01.460 | And again,
03:22:02.340 | we deal with
03:22:03.280 | what the lawyers want.
03:22:04.300 | So this lawyer,
03:22:04.940 | again,
03:22:05.340 | he made the case
03:22:06.240 | that people's cars
03:22:07.620 | are catching fire.
03:22:08.240 | They don't really want them to.
03:22:09.320 | Those are the cases
03:22:10.260 | that he would like to take on.
03:22:11.280 | It's of a certain amount
03:22:12.380 | of money.
03:22:12.700 | It's of a certain maker model.
03:22:13.780 | It's in a certain jurisdiction,
03:22:14.860 | et cetera.
03:22:15.360 | Those specific filters,
03:22:16.840 | that schema,
03:22:17.360 | can be applied
03:22:17.920 | throughout the entire process.
03:22:19.000 | That's basically the graph.
03:22:20.600 | Each of these lawyers
03:22:21.380 | have a specific graph
03:22:22.440 | that they want.
03:22:22.980 | And not just that,
03:22:24.600 | they can filter
03:22:25.520 | and feedback that information.
03:22:26.960 | So it's not just
03:22:27.700 | a static graph.
03:22:28.340 | I mean,
03:22:28.540 | the benefit of a graph structure,
03:22:30.880 | at least,
03:22:31.540 | well,
03:22:31.760 | one of the benefits
03:22:32.440 | of a graph structure,
03:22:33.160 | I should say,
03:22:33.700 | is that it's an extensible schema
03:22:35.460 | and that I can update
03:22:36.900 | and I can query across
03:22:37.720 | and I can understand
03:22:38.340 | that information.
03:22:38.880 | So while we are dealing
03:22:41.200 | with RAG,
03:22:41.820 | I would say we have less
03:22:43.040 | of a chat RAG interface.
03:22:45.780 | Well,
03:22:46.000 | the lawyers definitely
03:22:46.560 | do appreciate that.
03:22:47.420 | A lot of what we have
03:22:48.560 | when it comes to RAG
03:22:49.400 | or retrieval augmented generation
03:22:50.680 | would be generating
03:22:51.400 | these reports.
03:22:52.440 | Because as much
03:22:53.060 | as a lawyer does want
03:22:53.780 | an answer,
03:22:54.080 | what they also want
03:22:55.020 | is the form factor
03:22:55.680 | they're used to.
03:22:56.300 | And so all of these graphs
03:22:57.540 | are consistently made
03:22:58.380 | and built each day
03:22:59.160 | and then some subgraph
03:23:00.560 | from that broader
03:23:01.860 | monolithic structure
03:23:02.980 | is then brought in
03:23:04.320 | and composed into a report
03:23:05.500 | that a lawyer can action.
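A minimal sketch of that subgraph-to-report step, with invented relations and a made-up report shape, assuming the monolithic graph is stored as triples:

```python
def subgraph(edges, relation, subject=None):
    """Pull a lawyer-specific slice out of the monolithic graph:
    keep one relation, optionally narrowed to one subject node."""
    return [e for e in edges
            if e[1] == relation and (subject is None or e[0] == subject)]

def compose_report(edges):
    """Render a subgraph into the fixed form factor a lawyer expects,
    rather than a free-form chat answer."""
    lines = [f"- {s}: {r.replace('_', ' ').lower()} {o}"
             for s, r, o in edges]
    return "Daily signal report\n" + "\n".join(lines)

graph = [
    ("Model-X", "HAS_COMPLAINT", "EngineFire"),
    ("Model-X", "SOLD_IN", "California"),
    ("Model-Y", "HAS_COMPLAINT", "PaintDefect"),
]
report = compose_report(subgraph(graph, "HAS_COMPLAINT",
                                 subject="Model-X"))
```

Because the subgraph filter and the report template are both per-lawyer, each lawyer gets a different report body out of the same daily graph build, with the same structure every day.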
03:23:09.640 | Kind of what's next,
03:23:10.720 | and I'll talk about
03:23:11.220 | the future a little bit.
03:23:11.840 | I mean,
03:23:12.040 | what I described
03:23:12.640 | is kind of what we're doing
03:23:13.480 | but this is what we're doing.
03:23:15.460 | Find lawsuits early.
03:23:17.060 | Compensate harm
03:23:18.040 | and then people can have
03:23:18.760 | that information
03:23:19.240 | if they want to.
03:23:19.820 | We're able to do this
03:23:21.480 | entirely technically.
03:23:22.240 | We're able to scrape
03:23:23.360 | the web, structure it, etc.
03:23:24.300 | We're able to iteratively
03:23:25.540 | build up a schema
03:23:26.300 | as we want to.
03:23:26.980 | This is not just
03:23:29.980 | a Gen AI problem
03:23:30.960 | and I think this is
03:23:31.640 | an important thing
03:23:32.240 | that I've seen
03:23:32.640 | around this conference
03:23:33.220 | and people may be seeing
03:23:34.120 | is that Gen AI
03:23:35.340 | is not better
03:23:36.140 | than machine learning
03:23:37.120 | and LLMs are not better
03:23:39.160 | than traditional ML systems
03:23:40.740 | but there are situations
03:23:42.320 | in which one is fantastic
03:23:43.740 | and one is not.
03:23:44.400 | If you look at
03:23:45.020 | multi-agent systems
03:23:45.980 | and again,
03:23:46.340 | I was previously
03:23:46.860 | an academic
03:23:47.340 | in multi-agent systems
03:23:48.160 | and no one ever listened to me
03:23:49.120 | so this is a bizarre situation
03:23:50.980 | but when you used to structure
03:23:53.940 | the multi-agent systems together
03:23:55.340 | somewhere along that workflow
03:23:57.060 | you would have to stop
03:23:58.840 | or say this is not doable
03:24:00.060 | because I cannot plug
03:24:01.120 | these two bits
03:24:01.600 | of information together.
03:24:02.520 | It's too probabilistic
03:24:03.500 | or it's too random
03:24:04.180 | or it's too inconsistent
03:24:04.980 | or the way to describe it
03:24:06.300 | is not a binary feature, right?
03:24:07.720 | It is I really just want
03:24:09.100 | to kind of type what I want, right?
03:24:10.480 | Now with LLMs you can,
03:24:12.640 | but it's very much, for us,
03:24:14.620 | not an LLM-filtered system;
03:24:17.380 | it's an ML-filtered system
03:24:18.980 | that LLMs have allowed us
03:24:20.180 | to pipe together
03:24:21.000 | such that you can actually
03:24:21.900 | provide value completely end-to-end
03:24:23.680 | which I think was previously
03:24:25.180 | not doable
03:24:25.720 | and for us, again,
03:24:26.440 | we've been using graphs
03:24:27.120 | for a long time
03:24:27.800 | for us, the ability
03:24:29.180 | to iteratively build that graph
03:24:30.940 | prune that graph
03:24:31.760 | and every single report
03:24:33.020 | gets better
03:24:33.620 | because we're able
03:24:34.160 | to manage the state
03:24:35.000 | is why people like
03:24:36.620 | working with us
03:24:37.320 | because we can consistently
03:24:38.400 | follow and track
03:24:39.120 | exactly what they want
03:24:40.320 | specifically.
03:24:40.880 | Cool.
03:24:43.540 | I think I'm just about at time
03:24:45.040 | kind of got in early
03:24:45.820 | but that's been the talk
03:24:47.260 | specifically around
03:24:48.060 | I'm happy to talk to anyone
03:24:48.940 | about the specifics
03:24:49.800 | graph rag, et cetera
03:24:51.600 | multi-agent systems
03:24:52.560 | but that's how we use the process.
03:24:53.780 | Thank you very much.
03:24:54.920 | I don't want to touch you
03:25:03.980 | if I don't need to.
03:25:18.420 | What's the best way?
03:25:19.860 | Oh, sorry.
03:25:20.320 | Let me come down.
03:25:20.980 | Can I unplug this?
03:25:22.240 | Is that okay?
03:25:22.500 | Thank you.
03:25:22.960 | That's a wrap for our day.
03:25:28.720 | Okay.
03:25:29.280 | Is that ready?
03:25:30.140 | Thank you so much.
03:25:32.800 | Thank you very much.
03:25:34.460 | I'll talk to you soon.
03:26:03.260 | Thank you.
03:26:33.240 | the sessions in this room.
03:26:35.360 | We do have to clear this room out
03:26:37.380 | to set it for tomorrow
03:26:38.500 | so we thank you
03:26:39.720 | and there's plenty of work space
03:26:42.020 | out there.
03:26:43.800 | I promise,
03:26:48.700 | they still have Wi-Fi out there.
03:26:50.060 | It's probably even better.
03:26:50.940 | They might even have coffee
03:26:53.300 | and lemon bars.
03:27:03.220 | thank you.
03:27:15.540 | Thank you.
03:27:16.760 | Thank you.
03:27:18.840 | We're going to have our questions
03:27:20.020 | outside in the track.
03:27:21.300 | Thank you.
03:27:21.880 | He's going to meet you outside.
03:27:23.980 | I promise he's not going to go anywhere.
03:27:25.460 | be glad I don't have music.
03:27:33.360 | What's that?
03:27:35.060 | We usually just turn the music up
03:27:36.580 | but they didn't clear any copyright.
03:27:37.860 | They didn't clear any music.
03:27:38.800 | No music for us.
03:27:39.740 | They didn't clear flash.
03:27:40.940 | Right.
03:27:41.800 | Yeah, yeah.
03:27:44.420 | Can you tell I'm a producer?
03:27:49.260 | Thank you.
03:28:19.240 | Awesome, thank you.
03:28:25.880 | Thank you.
03:42:07.100 | Thank you.
03:42:09.100 | Thank you.
03:42:11.100 | Thank you.
03:42:13.100 | Thank you.
03:42:15.100 | Thank you.
03:42:17.100 | Thank you.
03:42:19.100 | Thank you.
03:42:21.100 | Thank you.
03:42:23.100 | Thank you.
03:42:25.100 | Thank you.
03:42:27.100 | Thank you.
03:42:29.100 | Thank you.
03:42:31.100 | Thank you.
03:42:33.100 | Thank you.
03:42:35.100 | Thank you.
03:42:37.100 | Thank you.
03:42:39.100 | Thank you.
03:42:41.100 | Thank you.
03:42:43.100 | Thank you.
03:42:45.100 | Thank you.
03:42:47.100 | Thank you.
03:42:49.100 | Thank you.
03:42:51.100 | Thank you.
03:42:53.100 | Thank you.
03:42:55.100 | Thank you.
03:42:57.100 | Thank you.
03:42:59.100 | Thank you.
03:43:01.100 | Thank you.
03:43:03.100 | Thank you.
03:43:05.100 | Thank you.
03:43:07.100 | Thank you.
03:43:08.100 | Thank you.
03:43:09.100 | Thank you.
03:43:11.100 | Thank you.
03:43:13.100 | Thank you.
03:43:15.100 | Thank you.
03:43:17.100 | Thank you.
03:43:19.100 | Thank you.
03:43:21.100 | Thank you.
03:43:23.100 | Thank you.
03:43:25.100 | Thank you.
03:43:27.100 | Thank you.
03:43:29.100 | Thank you.
03:43:31.100 | Thank you.
03:43:33.100 | Thank you.
03:43:35.100 | Thank you.
03:43:37.100 | Thank you.
03:43:39.100 | Thank you.
03:43:41.100 | Thank you.
03:43:43.100 | Thank you.
03:43:45.100 | Thank you.
03:43:47.100 | Thank you.
03:43:49.100 | Thank you.
03:43:51.100 | Thank you.
03:43:53.100 | Thank you.
03:43:55.100 | Thank you.
03:43:57.100 | Thank you.
03:43:59.100 | Thank you.
03:44:01.100 | Thank you.
03:44:03.100 | Thank you.
03:44:05.100 | Thank you.
03:44:07.100 | Thank you.
03:44:09.100 | Thank you.
03:44:11.100 | Thank you.
03:44:13.100 | Thank you.
03:44:15.100 | Thank you.
03:44:17.100 | Thank you.
03:44:19.100 | Thank you.
03:44:21.100 | Thank you.
03:44:23.100 | Thank you.
03:44:24.100 | Thank you.
03:44:25.100 | Thank you.
03:44:25.100 | Thank you.
03:44:27.100 | Thank you.
03:44:29.100 | Thank you.
03:44:31.100 | Thank you.
03:44:33.100 | Thank you.
03:44:35.100 | Thank you.
03:44:37.100 | Thank you.
03:44:39.100 | Thank you.
03:44:41.100 | Thank you.
03:44:43.100 | Thank you.
03:44:45.100 | Thank you.
03:44:47.100 | Thank you.
03:44:49.100 | Thank you.
03:44:51.100 | Thank you.
03:44:53.100 | Thank you.
03:44:55.100 | Thank you.
03:44:57.100 | Thank you.
03:44:59.100 | Thank you.
03:45:01.100 | Thank you.
03:45:03.100 | Thank you.
03:45:05.100 | Thank you.
03:45:07.100 | Thank you.
03:45:09.100 | Thank you.
03:45:11.100 | Thank you.
03:45:13.100 | Thank you.
03:45:15.100 | Thank you.
03:45:17.100 | Thank you.
03:45:19.100 | Thank you.
03:45:21.100 | Thank you.
03:45:23.100 | Thank you.
03:45:25.100 | Thank you.
03:45:27.100 | Thank you.
03:45:29.100 | Thank you.
03:45:31.100 | Thank you.
03:45:33.100 | Thank you.
03:45:35.100 | Thank you.
03:45:37.100 | Thank you.
03:45:39.100 | Thank you.
03:45:41.100 | Thank you.
03:45:43.100 | Thank you.
03:45:45.100 | Thank you.
03:45:47.100 | Thank you.
03:45:49.100 | Thank you.
03:45:51.100 | Thank you.
03:45:53.100 | Thank you.
03:45:55.100 | Thank you.
03:45:57.100 | Thank you.
03:45:59.100 | Thank you.
03:46:01.100 | Thank you.
03:46:03.100 | Thank you.
03:46:05.100 | Thank you.
03:46:07.100 | Thank you.
03:46:09.100 | Thank you.
03:46:11.100 | Thank you.
03:46:13.100 | Thank you.
03:46:15.100 | Thank you.
03:46:17.100 | Thank you.
03:46:19.100 | Thank you.
03:46:21.100 | Thank you.
03:46:23.100 | Thank you.
03:46:25.100 | Thank you.
03:46:27.100 | Thank you.
03:46:29.100 | Thank you.
03:46:31.100 | Thank you.
03:46:33.100 | Thank you.
03:46:35.100 | Thank you.
03:46:37.100 | Thank you.
03:46:39.100 | Thank you.
03:46:41.100 | Thank you.
03:46:43.100 | Thank you.
03:46:45.100 | Thank you.
03:46:47.100 | Thank you.
03:46:49.100 | Thank you.
03:46:51.100 | Thank you.
03:46:53.100 | Thank you.
03:46:55.100 | Thank you.
03:46:57.100 | Thank you.
03:46:59.100 | Thank you.
03:47:01.100 | Thank you.
03:47:03.100 | Thank you.
03:47:05.100 | Thank you.
03:47:07.100 | Thank you.
03:47:09.100 | Thank you.
03:47:11.100 | Thank you.
03:47:13.100 | Thank you.
03:47:15.100 | Thank you.
03:47:16.100 | Thank you.
03:47:17.100 | Thank you.
03:47:19.100 | Thank you.
03:47:21.100 | Thank you.
03:47:23.100 | Thank you.
03:47:25.100 | Thank you.
03:47:27.100 | Thank you.
03:47:29.100 | Thank you.
03:47:31.100 | Thank you.
03:47:33.100 | Thank you.
03:47:35.100 | Thank you.
03:47:37.100 | Thank you.
03:47:39.100 | Thank you.
03:47:41.100 | Thank you.
03:47:43.100 | Thank you.
03:47:45.100 | Thank you.
03:47:47.100 | Thank you.
03:47:49.100 | Thank you.
03:47:51.100 | Thank you.
03:47:53.100 | Thank you.
03:47:55.100 | Thank you.
03:47:57.100 | Thank you.
03:47:59.100 | Thank you.
03:48:01.100 | Thank you.
03:48:03.100 | Thank you.
03:48:05.100 | Thank you.
03:48:07.100 | Thank you.
03:48:09.100 | Thank you.
03:48:11.100 | Thank you.
03:48:13.100 | Thank you.
03:48:15.100 | Thank you.
03:48:17.100 | Thank you.
03:48:19.100 | Thank you.
03:48:21.100 | Thank you.
03:48:23.100 | Thank you.
03:48:25.100 | Thank you.
03:48:27.100 | Thank you.
03:48:29.100 | Thank you.
03:48:31.100 | Thank you.
03:48:33.100 | Thank you.
03:48:35.100 | Thank you.
03:48:37.100 | Thank you.
03:48:39.100 | Thank you.
03:48:41.100 | Thank you.
03:48:43.100 | Thank you.
03:48:45.100 | Thank you.
03:48:47.100 | Thank you.
03:48:49.100 | Thank you.
03:48:51.100 | Thank you.
03:48:53.100 | Thank you.
03:48:55.100 | Thank you.
03:48:57.100 | Thank you.
03:48:59.100 | Thank you.
03:49:01.100 | Thank you.
03:49:03.100 | Thank you.
03:49:05.100 | Thank you.
03:49:07.100 | Thank you.
03:49:09.100 | Thank you.
03:49:11.100 | Thank you.
03:49:13.100 | Thank you.
03:49:15.100 | Thank you.
03:49:17.100 | Thank you.
03:49:19.100 | Thank you.
03:49:21.100 | Thank you.
03:49:23.100 | Thank you.
03:49:25.100 | Thank you.
03:49:27.100 | Thank you.
03:49:29.100 | Thank you.
03:49:31.100 | Thank you.
03:49:33.100 | Thank you.
03:49:35.100 | Thank you.
03:49:37.100 | Thank you.
03:49:39.100 | Thank you.
03:49:41.100 | Thank you.
03:49:42.100 | Thank you.
03:49:43.100 | Thank you.
03:49:45.100 | Thank you.
03:49:47.100 | Thank you.
03:49:49.100 | Thank you.
03:49:51.100 | Thank you.
03:49:53.100 | Thank you.
03:49:55.100 | Thank you.
03:49:57.100 | Thank you.
03:49:59.100 | Thank you.
03:50:01.100 | Thank you.
03:50:03.100 | Thank you.
03:50:05.100 | Thank you.
03:50:07.100 | Thank you.
03:50:09.100 | Thank you.
03:50:11.100 | Thank you.
03:50:13.100 | Thank you.
03:50:15.100 | Thank you.
03:50:17.100 | Thank you.
03:50:19.100 | Thank you.
03:50:21.100 | Thank you.
03:50:23.100 | Thank you.
03:50:25.100 | Thank you.
03:50:27.100 | Thank you.
03:50:29.100 | Thank you.
03:50:31.100 | Thank you.
03:50:33.100 | Thank you.
03:50:35.100 | Thank you.
03:50:37.100 | Thank you.
03:50:39.100 | Thank you.
03:50:41.100 | Thank you.
03:50:43.100 | Thank you.
03:50:45.100 | Thank you.
03:50:47.100 | Thank you.
03:50:49.100 | Thank you.
03:50:51.100 | Thank you.
03:50:53.100 | Thank you.
03:50:55.100 | Thank you.
03:50:57.100 | Thank you.
03:50:58.100 | Thank you.
03:51:00.100 | Thank you.
03:51:02.100 | Thank you.
03:51:04.100 | Thank you.
03:51:06.100 | Thank you.
03:51:08.100 | Thank you.
03:51:10.100 | Thank you.
03:51:12.100 | Thank you.
03:51:14.100 | Thank you.
03:51:16.100 | Thank you.
03:51:18.100 | Thank you.
03:51:20.100 | Thank you.
03:51:22.100 | Thank you.
03:51:24.100 | Thank you.
03:51:26.100 | Thank you.
03:51:28.100 | Thank you.
03:51:30.100 | Thank you.
03:51:32.100 | Thank you.
03:51:34.100 | Thank you.
03:51:36.100 | Thank you.
03:51:38.100 | Thank you.
03:51:40.100 | Thank you.
03:51:42.100 | Thank you.
03:51:44.100 | Thank you.
03:51:46.100 | Thank you.
03:51:47.100 | Thank you.
03:51:48.100 | Thank you.
03:51:48.100 | Thank you.
03:51:50.100 | Thank you.
03:51:52.100 | Thank you.
03:51:54.100 | Thank you.
03:51:56.100 | Thank you.
03:51:58.100 | Thank you.
03:52:00.100 | Thank you.
03:52:02.100 | Thank you.
03:52:04.100 | Thank you.
03:52:06.100 | Thank you.
03:52:07.100 | Thank you.
03:52:09.100 | Thank you.
03:52:11.100 | Thank you.
03:52:13.100 | Thank you.
03:52:15.100 | Thank you.
03:52:17.100 | Thank you.
03:52:19.100 | Thank you.
03:52:21.100 | Thank you.
03:52:23.100 | Thank you.
03:52:25.100 | Thank you.
03:52:27.100 | Thank you.
03:52:29.100 | Thank you.
03:52:31.100 | Thank you.
03:52:33.100 | Thank you.
03:52:35.100 | Thank you.
03:52:37.100 | Thank you.
03:52:39.100 | Thank you.
03:52:41.100 | Thank you.
03:52:43.100 | Thank you.
03:52:45.100 | Thank you.
03:52:47.100 | Thank you.
03:52:49.100 | Thank you.
03:52:51.100 | Thank you.
03:52:53.100 | Thank you.
03:52:55.100 | Thank you.
03:52:57.100 | Thank you.
03:52:59.100 | Thank you.
03:53:01.100 | Thank you.
03:53:03.100 | Thank you.
03:53:05.100 | Thank you.
03:53:07.100 | Thank you.
03:53:09.100 | Thank you.
03:53:11.100 | Thank you.
03:53:13.100 | Thank you.
03:53:15.100 | Thank you.
03:53:17.100 | Thank you.
03:53:19.100 | Thank you.
03:53:21.100 | Thank you.
03:53:23.100 | Thank you.
03:53:25.100 | Thank you.
03:53:27.100 | Thank you.
03:53:29.100 | Thank you.
03:53:31.100 | Thank you.
03:53:33.100 | Thank you.
03:53:35.100 | Thank you.
03:53:37.100 | Thank you.
03:53:39.100 | Thank you.
03:53:41.100 | Thank you.
03:53:43.100 | Thank you.
03:53:45.100 | Thank you.
03:53:47.100 | Thank you.
03:53:49.100 | Thank you.
03:53:51.100 | Thank you.
03:53:53.100 | Thank you.
03:53:55.100 | Thank you.
03:53:57.100 | Thank you.
03:53:59.100 | Thank you.
03:54:01.100 | Thank you.
03:54:03.100 | Thank you.
03:54:05.100 | Thank you.
03:54:07.100 | Thank you.
03:54:09.100 | Thank you.
03:54:11.100 | Thank you.
03:54:13.100 | Thank you.
03:54:15.100 | Thank you.
03:54:17.100 | Thank you.
03:54:19.100 | Thank you.
03:54:21.100 | Thank you.
03:54:23.100 | Thank you.
03:54:25.100 | Thank you.
03:54:27.100 | Thank you.
03:54:28.100 | Thank you.
03:54:29.100 | Thank you.
03:54:31.100 | Thank you.
03:54:33.100 | Thank you.
03:54:35.100 | Thank you.
03:54:37.100 | Thank you.
03:54:39.100 | Thank you.
03:54:41.100 | Thank you.
03:54:43.100 | Thank you.
03:54:45.100 | Thank you.
03:54:47.100 | Thank you.
03:54:49.100 | Thank you.
03:54:51.100 | Thank you.
03:54:53.100 | Thank you.
03:54:55.100 | Thank you.
03:54:57.100 | Thank you.
03:54:59.100 | Thank you.
03:55:01.100 | Thank you.
03:55:03.100 | Thank you.
03:55:05.100 | Thank you.
03:55:07.100 | Thank you.
03:55:09.100 | Thank you.
03:55:11.100 | Thank you.
03:55:13.100 | Thank you.
03:55:14.100 | Thank you.
03:55:15.100 | Thank you.
03:55:17.100 | Thank you.
03:55:19.100 | Thank you.
03:55:21.100 | Thank you.
03:55:23.100 | Thank you.
03:55:25.100 | Thank you.
03:55:27.100 | Thank you.
03:55:29.100 | Thank you.
03:55:31.100 | Thank you.
03:55:33.100 | Thank you.
03:55:35.100 | Thank you.
03:55:37.100 | Thank you.
03:55:38.100 | Thank you.
03:55:40.100 | Thank you.
03:55:42.100 | Thank you.
03:55:44.100 | Thank you.
03:55:46.100 | Thank you.
03:55:48.100 | Thank you.
03:55:50.100 | Thank you.
03:55:51.100 | Thank you.
03:55:53.100 | Thank you.
03:55:55.100 | Thank you.
03:55:57.100 | Thank you.
03:55:59.100 | Thank you.
03:56:01.100 | Thank you.
03:56:03.100 | Thank you.
03:56:05.100 | Thank you.
03:56:07.100 | Thank you.
03:56:09.100 | Thank you.
03:56:11.100 | Thank you.
03:56:13.100 | Thank you.
03:56:15.100 | Thank you.
03:56:17.100 | Thank you.
03:56:19.100 | Thank you.
03:56:21.100 | Thank you.
03:56:23.100 | Thank you.
03:56:24.100 | Thank you.
03:56:26.100 | Thank you.
03:56:28.100 | Thank you.
03:56:30.100 | Thank you.
03:56:32.100 | Thank you.
03:56:34.100 | Thank you.
03:56:36.100 | Thank you.
03:56:38.100 | Thank you.
03:56:40.100 | Thank you.
03:56:42.100 | Thank you.
03:56:44.100 | Thank you.
03:56:46.100 | Thank you.
03:56:48.100 | Thank you.
03:56:50.100 | Thank you.
03:56:52.100 | Thank you.
03:56:54.100 | Thank you.
03:56:56.100 | Thank you.
03:56:58.100 | Thank you.
03:57:00.100 | Thank you.
03:57:02.100 | Thank you.
03:57:04.100 | Thank you.
03:57:06.100 | Thank you.
03:57:08.100 | Thank you.
03:57:10.100 | Thank you.
03:57:12.100 | Thank you.
03:57:14.100 | Thank you.
03:57:16.100 | Thank you.
03:57:18.100 | Thank you.
03:57:20.100 | Thank you.
03:57:22.100 | Thank you.
03:57:24.100 | Thank you.
03:57:26.100 | Thank you.
03:57:28.100 | Thank you.
03:57:30.100 | Thank you.
03:57:32.100 | Thank you.
03:57:34.100 | Thank you.
03:57:36.100 | Thank you.
03:57:38.100 | Thank you.
03:57:40.100 | Thank you.
03:57:42.100 | Thank you.
03:57:44.100 | Thank you.
03:57:46.100 | Thank you.
03:57:48.100 | Thank you.
03:57:50.100 | Thank you.
03:57:52.100 | Thank you.
03:57:54.100 | Thank you.
03:57:56.100 | Thank you.
03:57:58.100 | Thank you.
03:58:00.100 | Thank you.
03:58:02.100 | Thank you.
03:58:04.100 | Thank you.
03:58:06.100 | Thank you.
03:58:08.100 | Thank you.
03:58:10.100 | Thank you.
03:58:12.100 | Thank you.
03:58:14.100 | Thank you.
03:58:16.100 | Thank you.
03:58:18.100 | Thank you.
03:58:20.100 | Thank you.
03:58:22.100 | Thank you.
03:58:24.100 | Thank you.
03:58:26.100 | Thank you.
03:58:28.100 | Thank you.
03:58:30.100 | Thank you.
03:58:32.100 | Thank you.
03:58:34.100 | Thank you.
03:58:36.100 | Thank you.
03:58:38.100 | Thank you.
03:58:40.100 | Thank you.
03:58:42.100 | Thank you.
03:58:44.100 | Thank you.
03:58:46.100 | Thank you.
03:58:48.100 | Thank you.
03:58:50.100 | Thank you.
03:58:52.100 | Thank you.
03:58:54.100 | Thank you.
03:58:56.100 | Thank you.
03:58:58.100 | Thank you.
03:59:00.100 | Thank you.
03:59:02.100 | Thank you.
03:59:04.100 | Thank you.
03:59:06.100 | Thank you.
03:59:08.100 | Thank you.
03:59:10.100 | Thank you.
03:59:12.100 | Thank you.
03:59:14.100 | Thank you.
03:59:16.100 | Thank you.
03:59:18.100 | Thank you.
03:59:20.100 | Thank you.
03:59:22.100 | Thank you.
03:59:24.100 | Thank you.
03:59:26.100 | Thank you.
03:59:28.100 | Thank you.
03:59:30.100 | Thank you.
03:59:32.100 | Thank you.
03:59:34.100 | Thank you.
03:59:36.100 | Thank you.
03:59:38.100 | Thank you.
03:59:40.100 | Thank you.
03:59:42.100 | Thank you.
03:59:44.100 | Thank you.
03:59:46.100 | Thank you.
03:59:48.100 | Thank you.
03:59:50.100 | Thank you.
03:59:51.100 | Thank you.
03:59:52.100 | Thank you.
03:59:54.100 | Thank you.
03:59:56.100 | Thank you.
03:59:58.100 | Thank you.
04:00:00.100 | Thank you.
04:00:02.100 | Thank you.
04:00:04.100 | Thank you.
04:00:06.100 | Thank you.
04:00:08.100 | Thank you.
04:00:10.100 | Thank you.
04:00:12.100 | Thank you.
04:00:14.100 | Thank you.
04:00:16.100 | Thank you.
04:00:18.100 | Thank you.
04:00:20.100 | Thank you.
04:00:22.100 | Thank you.
04:00:24.100 | Thank you.
04:00:26.100 | Thank you.
04:00:28.100 | Thank you.
04:00:30.100 | Thank you.
04:00:32.100 | Thank you.
04:00:34.100 | Thank you.
04:00:36.100 | Thank you.
04:00:38.100 | Thank you.
04:00:40.100 | Thank you.
04:00:42.100 | Thank you.
04:00:44.100 | Thank you.
04:00:46.100 | Thank you.
04:00:48.100 | Thank you.
04:00:50.100 | Thank you.
04:00:52.100 | Thank you.
04:00:54.100 | Thank you.
04:00:56.100 | Thank you.
04:00:58.100 | Thank you.
04:01:00.100 | Thank you.
04:01:02.100 | Thank you.
04:01:04.100 | Thank you.
04:01:06.100 | Thank you.
04:01:08.100 | Thank you.
04:01:10.100 | Thank you.
04:01:12.100 | Thank you.
04:01:14.100 | Thank you.
04:01:16.100 | Thank you.
04:01:18.100 | Thank you.
04:01:20.100 | Thank you.
04:01:22.100 | Thank you.
04:01:24.100 | Thank you.
04:01:26.100 | Thank you.
04:01:28.100 | Thank you.
04:01:30.100 | Thank you.
04:01:32.100 | Thank you.
04:01:34.100 | Thank you.
04:01:36.100 | Thank you.
04:01:38.100 | Thank you.
04:01:40.100 | Thank you.
04:01:42.100 | Thank you.
04:01:44.100 | Thank you.
04:01:46.100 | Thank you.
04:01:48.100 | Thank you.
04:01:50.100 | Thank you.
04:01:52.100 | Thank you.
04:01:54.100 | Thank you.
04:01:56.100 | Thank you.
04:01:57.100 | Thank you.
04:01:58.100 | Thank you.
04:02:00.100 | Thank you.
04:02:01.100 | Thank you.
04:02:02.100 | Thank you.
04:02:04.100 | Thank you.
04:02:05.100 | Thank you.
04:02:07.100 | Thank you.
04:02:09.100 | Thank you.
04:02:11.100 | Thank you.
04:02:13.100 | Thank you.
04:02:15.100 | Thank you.
04:02:17.100 | Thank you.
04:02:19.100 | Thank you.
04:02:21.100 | Thank you.
04:02:23.100 | Thank you.
04:02:25.100 | Thank you.
04:02:27.100 | Thank you.
04:02:29.100 | Thank you.
04:02:31.100 | Thank you.
04:02:33.100 | Thank you.
04:02:35.100 | Thank you.
04:02:37.100 | Thank you.
04:02:39.100 | Thank you.
04:02:41.100 | Thank you.
04:02:43.100 | Thank you.
04:02:45.100 | Thank you.
04:02:47.100 | Thank you.
04:02:49.100 | Thank you.
04:02:51.100 | Thank you.
04:02:52.100 | Thank you.
04:02:54.100 | Thank you.
04:02:56.100 | Thank you.
04:02:58.100 | Thank you.
04:03:00.100 | Thank you.
04:03:02.100 | Thank you.
04:03:04.100 | Thank you.
04:03:06.100 | Thank you.
04:03:08.100 | Thank you.
04:03:10.100 | Thank you.
04:03:12.100 | Thank you.
04:03:14.100 | Thank you.
04:03:16.100 | Thank you.
04:03:18.100 | Thank you.
04:03:20.100 | Thank you.
04:03:22.100 | Thank you.
04:03:24.100 | Thank you.
04:03:26.100 | Thank you.
04:03:28.100 | Thank you.
04:03:30.100 | Thank you.
04:03:32.100 | Thank you.
04:03:34.100 | Thank you.
04:03:36.100 | Thank you.
04:03:38.100 | Thank you.
04:03:40.100 | Thank you.
04:03:42.100 | Thank you.
04:03:44.100 | Thank you.
04:03:46.100 | Thank you.
04:03:48.100 | Thank you.
04:03:50.100 | Thank you.
04:03:52.100 | Thank you.
04:03:54.100 | Thank you.
04:03:56.100 | Thank you.
04:03:58.100 | Thank you.
04:04:00.100 | Thank you.
04:04:02.100 | Thank you.
04:04:04.100 | Thank you.
04:04:06.100 | Thank you.
04:04:08.100 | Thank you.
04:04:10.100 | Thank you.
04:04:12.100 | Thank you.
04:04:14.100 | Thank you.
04:04:16.100 | Thank you.
04:04:18.100 | Thank you.
04:04:20.100 | Thank you.
04:04:22.100 | Thank you.
04:04:24.100 | Thank you.
04:04:26.100 | Thank you.
04:04:28.100 | Thank you.
04:04:30.100 | Thank you.
04:04:32.100 | Thank you.
04:04:34.100 | Thank you.
04:04:36.100 | Thank you.
04:04:38.100 | Thank you.
04:04:40.100 | Thank you.
04:04:42.100 | Thank you.
04:04:44.100 | Thank you.
04:04:46.100 | Thank you.
04:04:48.100 | Thank you.
04:04:50.100 | Thank you.
04:04:52.100 | Thank you.
04:04:54.100 | Thank you.
04:04:56.100 | Thank you.
04:04:58.100 | Thank you.
04:05:00.100 | Thank you.
04:05:02.100 | Thank you.
04:05:04.100 | Thank you.
04:05:06.100 | Thank you.
04:05:08.100 | Thank you.
04:05:10.100 | Thank you.
04:05:12.100 | Thank you.
04:05:14.100 | Thank you.
04:05:16.100 | Thank you.
04:05:18.100 | Thank you.
04:05:20.100 | Thank you.
04:05:22.100 | Thank you.
04:05:24.100 | Thank you.
04:05:26.100 | Thank you.
04:05:28.100 | Thank you.
04:05:30.100 | Thank you.
04:05:32.100 | Thank you.
04:05:34.100 | Thank you.
04:05:36.100 | Thank you.
04:05:38.100 | Thank you.
04:05:40.100 | Thank you.
04:05:42.100 | Thank you.
04:05:44.100 | Thank you.
04:05:46.100 | Thank you.
04:05:48.100 | Thank you.
04:05:50.100 | Thank you.
04:05:52.100 | Thank you.
04:05:54.100 | Thank you.
04:05:56.100 | Thank you.
04:05:58.100 | Thank you.
04:06:00.100 | Thank you.
04:06:02.100 | Thank you.
04:06:04.100 | Thank you.
04:06:06.100 | Thank you.
04:06:08.100 | Thank you.
04:06:10.100 | Thank you.
04:06:12.100 | Thank you.
04:06:14.100 | Thank you.
04:06:16.100 | Thank you.
04:06:18.100 | Thank you.
04:06:20.100 | Thank you.
04:06:22.100 | Thank you.
04:06:24.100 | Thank you.
04:06:26.100 | Thank you.
04:06:28.100 | Thank you.
04:06:30.100 | Thank you.
04:06:32.100 | Thank you.
04:06:34.100 | Thank you.
04:06:36.100 | Thank you.
04:06:38.100 | Thank you.
04:06:40.100 | Thank you.
04:06:42.100 | Thank you.
04:06:44.100 | Thank you.
04:06:46.100 | Thank you.
04:06:48.100 | Thank you.
04:06:50.100 | Thank you.
04:06:52.100 | Thank you.
04:06:54.100 | Thank you.
04:06:56.100 | Thank you.
04:06:58.100 | Thank you.
04:07:00.100 | Thank you.
04:07:02.100 | Thank you.
04:07:04.100 | Thank you.
04:07:06.100 | Thank you.
04:07:08.100 | Thank you.
04:07:10.100 | Thank you.
04:07:12.100 | Thank you.
04:07:14.100 | Thank you.
04:07:16.100 | Thank you.
04:07:18.100 | Thank you.
04:07:20.100 | Thank you.
04:07:22.100 | Thank you.
04:07:24.100 | Thank you.
04:07:26.100 | Thank you.
04:07:28.100 | Thank you.
04:07:30.100 | Thank you.
04:07:32.100 | Thank you.
04:07:34.100 | Thank you.
04:07:36.100 | Thank you.
04:07:38.100 | Thank you.
04:07:40.100 | Thank you.
04:07:42.100 | Thank you.
04:07:44.100 | Thank you.
04:07:46.100 | Thank you.
04:07:48.100 | Thank you.
04:07:50.100 | Thank you.
04:07:52.100 | Thank you.
04:07:54.100 | Thank you.
04:07:56.100 | Thank you.
04:07:58.100 | Thank you.
04:08:00.100 | Thank you.