
LangChain Interrupt 2025 State of Agents – Andrew Ng : Harrison Chase


Whisper Transcript

00:00:00.080 | This is a trend that we see more and more as a lot of building blocks are starting to get figured out.
00:00:07.220 | I'm really excited for this next section.
00:00:11.420 | So we'll be doing a fireside chat with Andrew, and Andrew probably doesn't need any introduction to most of us here.
00:00:17.980 | I'm guessing a lot of people here began with some of his classes, whether on Coursera or deeplearning.ai.
00:00:22.620 | But Andrew has been a big part of the LangChain story.
00:00:26.580 | So I met Andrew a little over two years ago at a conference, and we started talking about LangChain.
00:00:33.340 | And he graciously invited us to do a course on LangChain with deeplearning.ai.
00:00:37.600 | I think it must have been the second or third one that they ever did.
00:00:42.000 | And I know a lot of people here probably watched that course or got started with LangChain because of that course.
00:00:46.520 | So Andrew has been a huge part of the LangChain journey.
00:00:50.440 | And I'm super excited to welcome him on stage for a fireside chat.
00:00:54.160 | So let's welcome Andrew in.
00:00:55.500 | Thank you.
00:01:25.480 | You've obviously touched and thought about so many things in this industry, but one of your takes that I cite a lot, and probably other people have talked about,
00:01:45.440 | is your take on talking about the agenticness of an application as opposed to whether or not something is an agent.
00:01:56.440 | And so, you know, as we're here now at an agent conference, maybe we should rename it to an agentic conference, but would you mind clarifying that?
00:02:04.740 | Yeah, I think it was almost a year and a half, two years ago that you said that, and so I'm curious if things have changed in your mind since then.
00:02:11.420 | So I remember that.
00:02:12.420 | Harrison and I spoke at a conference over a year ago, and at that time, I think both of us were trying to convince other people that agents are a thing you should pay attention to.
00:02:22.420 | And that was before, I think it was mid-summer last year, a bunch of marketers got ahold of the agentic term and you started to see that sticker on everything.
00:02:33.420 | So when I heard this question, about a year and a half ago, I saw that a lot of people were arguing: this is an agent, this is not an agent, you know.
00:02:40.420 | My friend says this is a chain of prompts, is that an agent? And I felt that rather than having that argument, we would succeed more as a community if we just say there are degrees to which something is agentic.
00:02:51.420 | So then we just say: if you want to call it an agentic system, a little autonomy or a lot of autonomy is all fine.
00:02:58.420 | No need to spend time arguing whether this is truly an agent. Let's just call all these things agentic systems with different degrees of autonomy.
00:03:06.420 | And I think that actually happened, hopefully freeing a lot of people from spending time arguing whether something is an agent.
00:03:13.420 | That was common for the term agentic when it was young, and I think that has died down.
00:03:19.420 | Where on that spectrum of a little autonomy to a lot of autonomy do you see people building today?
00:03:26.420 | Yeah. So my teams use LangGraph for the hardest problems, the complex workflows and so on.
00:03:33.420 | I'm also seeing tons of business opportunities that, frankly, are fairly linear workflows, or linear with just occasional side branches.
00:03:42.420 | So in a lot of businesses, there are opportunities where you wind up with people looking at a form on a website,
00:03:48.420 | doing a web search, checking something in a database to see if it's a compliance issue, or, you know, something similar.
00:03:54.420 | And it's kind of, well, take something, copy-paste it into a web search, and so on.
00:03:59.420 | So in business processes, there are actually a lot of fairly linear workflows, or linear with a very small number of occasional branches.
00:04:07.420 | Usually you could build these with a fairly simple agentic workflow.
00:04:10.420 | So I see a lot of opportunity, but one challenge I see businesses have is it's still pretty difficult to look at, you know, some work that's being done by humans and figure out how to turn it into an agentic workflow.
00:04:22.420 | So, what is the right granularity?
00:04:25.420 | You know, how should we break down the job into micro tasks?
00:04:28.420 | And then, you know, after you build your initial prototype, if it doesn't work well enough,
00:04:33.420 | which of these steps do you work on to improve performance?
00:04:36.420 | So, I think that whole bag of skills on how to look at a bunch of stuff that people are doing,
00:04:41.420 | break into sequential steps, where are the small number of branches, how do you put in place evals, you know, all that.
00:04:47.420 | that skill set is still rare.
00:04:50.420 | And then, of course, there are the much more complex agents.
00:04:55.420 | I've heard about a bunch of very complex agentic workflows that are very valuable as well.
00:05:00.420 | But in terms of the number of opportunities and the amount of value,
00:05:04.420 | there are a lot of simpler, mostly linear workflows that I think are still being built as well.
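The "mostly linear, with an occasional branch" business workflow described here can be sketched in code. This is a minimal hypothetical illustration, not anything from the talk: every function name, the invoice format, and the compliance threshold are invented stand-ins for what would really be LLM or tool calls.

```python
# Sketch of a mostly linear agentic workflow with one small branch.
# Each step function is a stand-in for an LLM call, web search, or
# database lookup in a real system.

def extract_fields(document: str) -> dict:
    # Stand-in for an LLM extraction step over a form or document.
    vendor, amount = document.split(",")
    return {"vendor": vendor, "amount": float(amount)}

def check_compliance(fields: dict) -> bool:
    # Stand-in for a database or web lookup; the threshold is invented.
    return fields["amount"] < 10_000

def draft_approval(fields: dict) -> str:
    return f"Approved payment of {fields['amount']:.2f} to {fields['vendor']}"

def escalate(fields: dict) -> str:
    return f"Escalated: {fields['vendor']} amount {fields['amount']:.2f} needs human review"

def run_workflow(document: str) -> str:
    fields = extract_fields(document)   # step 1: always runs
    if check_compliance(fields):        # step 2: the occasional branch
        return draft_approval(fields)   # step 3a: common path
    return escalate(fields)             # step 3b: rare path

print(run_workflow("Acme Corp,2500"))
print(run_workflow("Acme Corp,50000"))
```

The point of the sketch is structural: most of the control flow is a fixed sequence, and the agentic part is deciding what happens inside each step, not inventing the sequence itself.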
00:05:08.420 | Let's talk about some of those skills.
00:05:10.420 | Like, so, you've been doing deeplearning.ai,
00:05:12.420 | and I think a lot of the courses are in pursuit of helping people build.
00:05:16.420 | And so, what are some of the skills that you think agent builders all across this sector should master and get started with?
00:05:25.420 | Well, it's a good question.
00:05:26.420 | I wish I could answer that.
00:05:28.420 | I've been thinking a lot about this, actually.
00:05:29.420 | I think one of the challenges is, if you have a business process, you often have people who currently do the job in, whatever, these steps.
00:05:40.420 | How do you put in place, you know, through a LangGraph-type integration,
00:05:46.420 | the connections you need, whether you have APIs or some other way to access the data?
00:05:50.420 | And then, how do you prompt and process the multiple steps,
00:05:55.420 | to be able to build this into a system?
00:05:57.420 | And one thing I see a lot is putting in place the right eval framework,
00:06:02.420 | not only to understand the performance of the overall system,
00:06:07.420 | but to trace the individual steps,
00:06:09.420 | so you can hone in on what's the one step that is broken,
00:06:13.420 | what's the one component that's most fruitful to work on.
00:06:15.420 | I find that a lot of teams wait probably way longer than they should,
00:06:19.420 | just using human evals.
00:06:21.420 | Every time you change something, you sit there and look at a bunch of output yourself, right?
00:06:25.420 | I see most teams are probably slower than is ideal to put in place evals
00:06:29.420 | for systematic development.
00:06:30.420 | But I find that getting good evals in place within one or two days of the project is still really difficult.
00:06:37.420 | The teams are still learning these skills;
00:06:42.420 | they often, you know, go down a blind alley, and it's not until very late,
00:06:45.420 | like after a few months of trying to improve one component,
00:06:48.420 | that someone will say, you know what,
00:06:50.420 | I don't think this could ever be made to work.
00:06:52.420 | So we should just find a different approach instead.
00:06:56.420 | I wish I knew how to teach more efficiently
00:06:59.420 | this almost tactile knowledge.
00:07:02.420 | Often you're there, you know, looking at the output,
00:07:05.420 | looking at the trace, looking at the length of the output,
00:07:08.420 | and just trying to make a decision, right?
00:07:10.420 | It is a lot harder to teach, and that's still very difficult.
00:07:15.420 | And is this kind of tactile knowledge mostly around LLMs and their limitations,
00:07:20.420 | or more around just the product framing of things,
00:07:22.420 | and that skill of taking a job and breaking it down,
00:07:26.420 | that's something that we're still getting accustomed to?
00:07:29.420 | I think it's all of the above, actually.
00:07:31.420 | So I feel like over the last couple of years,
00:07:33.420 | AI tool companies have created an amazing set of AI tools.
00:07:39.420 | This includes tools like, you know, LangGraph,
00:07:42.420 | but also, how do you think about RAG?
00:07:46.420 | How do you think about chatbots?
00:07:48.420 | Many, many different ways of approaching memory.
00:07:51.420 | How do you do evals?
00:07:53.420 | How do you do guardrails?
00:07:54.420 | But I feel like there's this, you know,
00:07:56.420 | wild, sprawling array of really exciting tools.
00:08:00.420 | One picture I often have in my head is,
00:08:03.420 | if all you have are, you know, purple Lego bricks, right,
00:08:07.420 | you can't build that much that's interesting.
00:08:09.420 | But think of these tools as being different kinds of Lego bricks, right?
00:08:13.420 | If the only tool you have is, as it were, just purple Lego bricks, that's limiting;
00:08:16.420 | you also want the red one, the black one, the yellow one, the green one.
00:08:19.420 | And as you get more different colored and shaped Lego bricks,
00:08:23.420 | you can very quickly assemble them into really cool things.
00:08:26.420 | And so I think a lot of these tools,
00:08:27.420 | many of the ones we've been talking about,
00:08:29.420 | are different types of Lego bricks.
00:08:31.420 | And when you're trying to build something, you know,
00:08:33.420 | sometimes you need that right squiggly weird-shaped Lego brick,
00:08:36.420 | and some people can go and futz with it
00:08:38.420 | and just get the job done.
00:08:39.420 | But if you've never built evals of a certain type,
00:08:43.420 | then, you know, you can actually end up spending
00:08:46.420 | a few extra months to do something where someone else
00:08:49.420 | that's done that before could say,
00:08:50.420 | "Oh, I should just pull a few thousand examples
00:08:52.420 | and use an LLM as a judge,"
00:08:54.420 | and just go through that process and get it done much faster.
00:08:57.420 | So, one of the unfortunate things about AI is,
00:09:01.420 | it's not just one tool.
00:09:03.420 | When I'm coding, I just use a whole bunch of different stuff, right?
00:09:07.420 | I'm not a master of all of them,
00:09:08.420 | but I've stocked up on enough tools to get by.
00:09:11.420 | So, yeah, I think having that practice
00:09:15.420 | with different tools
00:09:16.420 | also really helps.
00:09:20.420 | And one other thing: these tools have also changed.
00:09:22.420 | So, for example,
00:09:23.420 | LLMs all of a sudden
00:09:24.420 | now have much longer context.
00:09:26.420 | And a lot of the best practices for RAG
00:09:29.420 | from, you know, a year and a half ago,
00:09:31.420 | or whatever,
00:09:32.420 | are much less relevant today.
00:09:34.420 | And I remember, Harrison was really early
00:09:36.420 | to talk about these things,
00:09:37.420 | and played with the early LangChain RAG frameworks,
00:09:40.420 | chunking strategies and all that.
00:09:42.420 | As the context lengths got longer,
00:09:44.420 | now we just dump a lot of stuff into the context.
00:09:47.420 | It's not that RAG has gone away,
00:09:49.420 | but the hyperparameter tuning has gotten way easier.
00:09:51.420 | There's a huge range of hyperparameters
00:09:53.420 | that work, you know, just fine.
00:09:55.420 | So, as the algorithms keep improving,
00:09:58.420 | the instincts we honed, you know,
00:10:00.420 | two years ago
00:10:01.420 | may no longer be relevant
00:10:03.420 | today.
00:10:03.420 | You mentioned a lot of things
00:10:04.420 | that I want to talk about.
00:10:06.420 | So, okay, what are some of the Lego bricks
00:10:09.420 | that are maybe underrated right now
00:10:11.420 | that you would recommend
00:10:12.420 | that people aren't talking about?
00:10:13.420 | Like evals, I think, you know,
00:10:14.420 | we had three people talk about evals,
00:10:16.420 | and I think that's top of people's mind.
00:10:18.420 | But what are some things
00:10:19.420 | that most people maybe haven't thought of
00:10:21.420 | or haven't heard of yet
00:10:22.420 | that you would recommend people look into?
00:10:24.420 | Good question.
00:10:25.420 | I don't know.
00:10:26.420 | Yeah.
00:10:27.420 | I want to share this.
00:10:29.420 | So, even though people talk about evals,
00:10:31.420 | there are still ways in which people don't do them.
00:10:33.420 | Why don't they do it?
00:10:35.420 | I think it's because people often,
00:10:38.420 | and I saw a post on evals recently as well,
00:10:41.420 | people think of writing evals
00:10:42.420 | as this huge thing you have to do, right?
00:10:45.420 | I think of evals as something
00:10:47.420 | I'm going to throw together really quickly,
00:10:49.420 | you know, in 20 minutes.
00:10:50.420 | And it's not that good,
00:10:51.420 | but it starts to complement
00:10:53.420 | my human eyeball evals.
00:10:55.420 | And so what often happens
00:10:57.420 | is our system has this one problem
00:10:59.420 | that I keep on getting regressions on.
00:11:01.420 | I thought I made it work,
00:11:02.420 | then it breaks.
00:11:03.420 | I fix it,
00:11:04.420 | then it breaks again.
00:11:05.420 | Then I'll write a very simple eval,
00:11:07.420 | maybe with, you know,
00:11:09.420 | five input examples
00:11:10.420 | and some very simple check,
00:11:12.420 | to just test for this one regression,
00:11:14.420 | right?
00:11:15.420 | This one thing, right?
00:11:16.420 | And then I'm not swapping out human evals
00:11:18.420 | for automated evals.
00:11:19.420 | I'm still looking at outputs myself,
00:11:21.420 | but when I change something,
00:11:22.420 | I now have these evals
00:11:23.420 | to just, you know,
00:11:24.420 | automatically check for the one thing
00:11:25.420 | I was worried about.
00:11:27.420 | And then what happens is,
00:11:29.420 | just like the way we write essays, maybe,
00:11:32.420 | once you have some slightly helpful
00:11:34.420 | but clearly very broken,
00:11:37.420 | imperfect eval,
00:11:39.420 | then you start going,
00:11:40.420 | you know what?
00:11:41.420 | I can improve my eval
00:11:42.420 | to make it better.
00:11:43.420 | I can improve it to make it better.
00:11:44.420 | So just as when we build
00:11:46.420 | applications,
00:11:47.420 | we build some, you know,
00:11:48.420 | very quick and dirty thing
00:11:49.420 | that doesn't work
00:11:50.420 | and then we make it better,
00:11:51.420 | for a lot of the evals along the way,
00:11:52.420 | I built evals,
00:11:53.420 | really awful evals
00:11:55.420 | that barely help,
00:11:56.420 | and then when you look at what it does,
00:11:58.420 | you go, you know,
00:11:59.420 | this eval's broken.
00:12:00.420 | I can fix it.
00:12:01.420 | And you improve it over time
00:12:02.420 | to make it better.
00:12:03.420 | So that's one thing.
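The quick-and-dirty regression eval described above, a handful of examples plus a simple check, can be sketched as follows. All questions, canned answers, and the substring check here are invented stand-ins; a real harness would call the actual system and might swap the check for an LLM-as-judge call, as mentioned earlier in the conversation.

```python
# A deliberately quick-and-dirty regression eval: ~five examples that
# target one recurring failure, scored with a simple substring check.

REGRESSION_CASES = [
    # (input, substring the answer must contain)
    ("What year was Python 3 released?", "2008"),
    ("Capital of France?", "Paris"),
    ("2 + 2 = ?", "4"),
    ("Largest planet in the solar system?", "Jupiter"),
    ("Author of 'Hamlet'?", "Shakespeare"),
]

def my_agent(question: str) -> str:
    # Stand-in for the real system under test.
    canned = {
        "What year was Python 3 released?": "Python 3.0 shipped in 2008.",
        "Capital of France?": "The capital of France is Paris.",
        "2 + 2 = ?": "2 + 2 = 4",
        "Largest planet in the solar system?": "Jupiter is the largest.",
        "Author of 'Hamlet'?": "Hamlet was written by Shakespeare.",
    }
    return canned.get(question, "I don't know.")

def run_eval(agent) -> float:
    # Returns the pass rate; run this after every change to catch
    # the one regression this eval was written for.
    passed = sum(expected in agent(q) for q, expected in REGRESSION_CASES)
    return passed / len(REGRESSION_CASES)

print(f"pass rate: {run_eval(my_agent):.0%}")
```

The eval is intentionally imperfect; the value is that it runs automatically on every change, complementing rather than replacing human eyeballing, and can itself be improved once its weaknesses show up.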
00:12:05.420 | Actually, one thing that people haven't talked a lot about, and I think is underrated, is the voice stack.
00:12:13.420 | I'm actually very excited about voice applications, and a lot of my friends are very excited about voice applications.
00:12:17.420 | I see a bunch of large enterprises really excited about voice applications, very large enterprises, very large use cases.
00:12:23.420 | For some reason, while there are some developers in this community doing voice, the amount of developer attention on voice stack applications, there is some, right, it's not zero,
00:12:35.420 | but it feels much smaller than the large-enterprise importance I see, as well as the applications coming along the way.
00:12:42.420 | And not all of this is the real-time voice APIs. It's not all speech-to-speech, native audio models, because I find those models are very hard to control.
00:12:53.420 | Instead, we use more of an agentic voice stack workflow, a pipeline, which we find much more controllable.
00:13:00.420 | At AI Fund we work with a ton of teams on voice stack stuff, some of which hopefully we'll be announcing in the future; we're seeing a lot of very exciting things.
00:13:09.420 | And then, other things I think are underrated. Well, one that maybe is not underrated, but more of us should do:
00:13:16.420 | I think many of you have seen that developers using AI assistance for coding are so much faster than developers that don't.
00:13:26.420 | It's been interesting to see how many companies' CIOs and CTOs still have policies that don't let engineers use AI assistance for coding.
00:13:35.420 | I think maybe sometimes there are good reasons, but I think we have to get past that because, frankly, I don't know, my teams and I would just hate to do coding without AI assistants.
00:13:45.420 | I think for most of us this is something that you have to do.
00:13:49.420 | One thing I think is underrated is the idea that everyone should learn to code.
00:13:53.420 | One fun fact about AI Fund: everyone at AI Fund, including, you know, the person that runs our front desk, our receptionist, and my CFO, and my attorney, the general counsel, everyone at AI Fund actually knows how to code.
00:14:09.420 | It's not that they're all software engineers, they're not, but in their respective job functions, being able to tell a computer what they want it to do
00:14:20.420 | is actually driving more productivity and progress across all of these job functions that are not software engineering.
00:14:28.420 | And that's an exciting result.
00:14:29.420 | Talking about kind of AI coding, what tools are you using for that, personally?
00:14:38.420 | We're working on some things that are not getting announced yet.
00:14:42.420 | Exciting.
00:14:45.420 | Maybe, well, I do use Cursor, Windsurf, that sort of thing.
00:14:55.420 | All right.
00:14:56.420 | Talking about voice:
00:14:58.420 | if people here want to get into voice and they're familiar with building kind of like agents with LLMs, how similar is it?
00:15:05.420 | Are there a lot of ideas that carry over, or is it all new things they have to learn?
00:15:08.420 | Yeah. In terms of applications, there are a lot of applications where I think voice, you know, changes certain interactions quite a bit.
00:15:19.420 | It turns out, from an application perspective,
00:15:24.420 | it turns out that a blank text box is kind of intimidating, right?
00:15:27.420 | For a lot of applications, if you say to people, tell me what you think, here's a blank text box, write a bunch of text for me,
00:15:32.420 | that's actually very intimidating for users.
00:15:34.420 | And part of the reason is, with text, people can use backspace, and so, you know, people have learned to be precise when they respond in text.
00:15:42.420 | Whereas with voice, you know, the words flow, and if you're talking, you can change your mind.
00:15:49.420 | You can say, "Oh, I changed my mind, forget everything."
00:15:52.420 | And I find that for a lot of applications, the user friction of just getting them to use it is lower.
00:16:00.420 | You say, "You know, tell me what you think," and then they respond.
00:16:05.420 | In terms of voice, one of the biggest differences is latency:
00:16:12.420 | if someone says something, you know, you want to respond in sub-one second, right?
00:16:17.420 | Less than 500 milliseconds is great, but I'd say sub-one second.
00:16:21.420 | And a lot of agentic workflows won't close that fast; they run for many seconds.
00:16:28.420 | When we worked on building an avatar, an Andrew avatar, if you want,
00:16:34.420 | our initial version had something like five to nine seconds of latency.
00:16:38.420 | And it's just a bad user experience: you say something, nine seconds of silence, then the avatar responds.
00:16:44.420 | So we wound up building things like a pre-response to hide the latency.
00:16:49.420 | Just as, you know, if you ask me a question, I might go, "Hmm, that's interesting,"
00:16:56.420 | we wanted to basically do that to hide the latency, and it actually seems to work great.
00:17:03.420 | Turns out,
00:17:04.420 | we're building a voice,
00:17:06.420 | customer service chatbot.
00:17:07.420 | it turns out that
00:17:08.420 | when you play background voice,
00:17:09.420 | at a customer contact center,
00:17:10.420 | it's a dense silence,
00:17:12.420 | people are much more sensitive
00:17:13.420 | on that,
00:17:14.420 | of that,
00:17:15.420 | you know,
00:17:15.420 | latency.
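The pre-response trick for hiding latency can be sketched as a tiny streaming pattern: emit a short conversational filler immediately, then the slow answer once it arrives. This is a hypothetical illustration; the filler phrases, timings, and the `slow_answer` stand-in are all invented, and a real voice pipeline would stream audio rather than strings.

```python
import time
from typing import Iterator

# Canned fillers, spoken immediately while the real answer is computed.
FILLERS = ["Hmm, that's interesting.", "Good question, let me think."]

def slow_answer(question: str) -> str:
    # Stand-in for a multi-second agentic pipeline (LLM calls, tools...).
    time.sleep(0.2)  # shortened here; real pipelines can take seconds
    return f"Here's what I found about {question!r}..."

def respond(question: str) -> Iterator[str]:
    # Yield a filler right away so the user hears something in well
    # under a second, then yield the real answer when it is ready.
    yield FILLERS[len(question) % len(FILLERS)]
    yield slow_answer(question)

start = time.time()
chunks = []
for chunk in respond("pricing plans"):
    chunks.append((chunk, time.time() - start))
print(chunks[0])  # the filler arrives almost instantly
print(chunks[1])  # the substantive answer follows later
```

The same structure accommodates the background-noise trick: the first thing sent to the audio channel need not be silence while the pipeline runs.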
00:17:16.420 | So I find that there are a lot of these things that are different than a pure text-based LLM application.
00:17:22.420 | But in applications, with a voice-based modality, there's some reason people become comfortable just talking.
00:17:29.420 | I think it sometimes reduces user friction,
00:17:32.420 | you know, in getting some information out of them.
00:17:35.420 | I think when we talk, we don't feel like we need to deliver perfection as much as when we write,
00:17:41.420 | so it's somehow easier to just start blurting out your ideas and change your mind and go back and forth.
00:17:46.420 | And that lets us get the information from the user that we need to help them.
00:17:50.420 | That's interesting.
00:17:53.420 | Ah, wow.
00:17:54.420 | One of the new things that's out there, you mentioned briefly, is MCP.
00:18:01.420 | How are you seeing that transform how people are building apps?
00:18:04.420 | What types of apps is it enabling, and what's happening in the ecosystem?
00:18:08.420 | Yeah, so I think it's really exciting.
00:18:11.420 | Just this morning, we released with Anthropic a short course on MCP.
00:18:15.420 | I actually saw a lot of stuff, you know, on MCP that I thought was quite confusing,
00:18:23.420 | so when we worked with Anthropic we said, you know, let's build a really good short course on MCP that explains it clearly.
00:18:29.420 | I think MCP is fantastic.
00:18:31.420 | I think it's a very clear signal that, you know, OpenAI adopted it also; I think that speaks to the importance of this.
00:18:39.420 | I think the MCP standard will continue to evolve, right?
00:18:43.420 | For example, well, many of you know what MCP is, right?
00:18:47.420 | It makes it much easier for agents primarily, but frankly, other types of software too, to plug into different types of data.
00:18:53.420 | When I'm using LLMs myself, or when I'm building applications, frankly, for a lot of us, we spend so much time on plumbing, right?
00:19:03.420 | And for those of you from large enterprises as well, the AI models, especially, you know, are pretty darn intelligent; they can do a lot of stuff when given the right context.
00:19:13.420 | And so I find that I and my teams spend a lot of time working on plumbing, on the data integrations, so you can get the context into the model,
00:19:19.420 | to make it do something that often is pretty sensible once it has the right context.
00:19:27.420 | So MCP, I think, is a fantastic way to try to standardize the interface
00:19:31.420 | to tools, API calls, as well as the resources.
00:19:34.420 | It feels like, it feels a little bit like the Wild West still.
00:19:38.420 | A lot of MCP servers you find on the internet do not work, right?
00:19:41.420 | And the authentication systems are kind of, you know, even for the very large companies' MCP servers,
00:19:47.420 | it's not clear how, you know, the authentication tokens work, and so on; there's a lot of that going on.
00:19:53.420 | I think the MCP protocol itself is also early.
00:19:56.420 | Right now, MCP gives a long list of the resources and tools available,
00:20:00.420 | and eventually, I think, we'll need some sort of hierarchical discovery.
00:20:13.420 | Take LangGraph: LangGraph has so many API calls, you just can't have a long list of everything under the sun for an agent to sort through.
00:20:21.420 | And so, I think MCP is a really fantastic first step.
00:20:24.420 | I definitely encourage you to learn about it; it will make your life easier,
00:20:27.420 | probably, if you find a good MCP server implementation that takes the pain out of the data integrations.
00:20:33.420 | And I think what's really important is this idea that when you have, you know, N models or N agents, and N data sources,
00:20:42.420 | it should not be N times N effort to do all the integrations; it should be N plus N.
00:20:47.420 | And I think MCP is a fantastic first step, it will need to evolve, but a fantastic first step toward that type of data integration.
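The N-times-N versus N-plus-N point is just counting adapters, and is easy to make concrete. A small sketch (the function names are invented for illustration): with bespoke pairwise integrations, every (agent, data source) pair needs its own adapter; with a shared protocol like MCP, each agent and each source implements the protocol once.

```python
# Integration-count arithmetic behind the N x N vs. N + N point.

def pairwise_integrations(n_agents: int, n_sources: int) -> int:
    # One custom adapter per (agent, source) pair.
    return n_agents * n_sources

def protocol_integrations(n_agents: int, n_sources: int) -> int:
    # Each agent implements the protocol client once,
    # each source implements the protocol server once.
    return n_agents + n_sources

for n in (3, 10, 50):
    print(f"n={n}: pairwise={pairwise_integrations(n, n)}, "
          f"protocol={protocol_integrations(n, n)}")
```

The gap grows quadratically: at 10 agents and 10 sources it is 100 adapters versus 20 protocol implementations, which is the economic argument for standardizing the interface.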
00:20:55.420 | Another type of protocol, one that seems less far along than MCP, is some of the agent-to-agent stuff.
00:21:01.420 | And I remember when we were at a conference a year or so ago, I think you were talking about multi-agent systems, which this would kind of enable.
00:21:09.420 | How do you see some of the multi-agent, or agent-to-agent, stuff evolving?
00:21:14.420 | I think, you know, the agent-to-agent stuff is still really early.
00:21:21.420 | Most of us, right, we struggle to get our own code to work,
00:21:23.420 | and so making my agent work with someone else's agent feels like it requires two miracles, you know.
00:21:33.420 | I see that when one team is building a multi-agent system, that often works, because we build a bunch of agents that can talk among themselves, and that works.
00:21:42.420 | Right now, at least at this moment in time, maybe I'm off,
00:21:45.420 | the number of examples I've seen of when, you know, one team's agents successfully engage with a totally different team's agents,
00:21:53.420 | I think we're a little bit early to that.
00:21:55.420 | I'm sure we'll get there, but I'm not personally seeing, you know, real success stories, huge success stories, of that yet.
00:22:03.420 | Yeah, I'm not sure. I agree,
00:22:07.420 | I think it's super early.
00:22:09.420 | I think if MCP is early,
00:22:11.420 | the agent-to-agent stuff is just even earlier.
00:22:13.420 | Another thing that's kind of
00:22:16.420 | top of mind right now
00:22:17.420 | is kind of vibe coding and all that,
00:22:20.420 | and you touched on it a little bit earlier,
00:22:22.420 | with how people are using
00:22:23.420 | these AI coding assistants.
00:22:25.420 | How do you think about vibe coding?
00:22:27.420 | Is that a different skill than before,
00:22:29.420 | or what kind of purpose
00:22:30.420 | does that serve in the world?
00:22:32.420 | People are writing code, you know,
00:22:34.420 | barely looking at the code, right.
00:22:36.420 | I think it's a fantastic thing.
00:22:37.420 | I think it's unfortunate
00:22:38.420 | that it's called vibe coding,
00:22:40.420 | because it's misleading a lot of people
00:22:42.420 | into thinking,
00:22:43.420 | just go with the vibes, you know.
00:22:45.420 | And frankly, when I'm coding for a day,
00:22:48.420 | you know, with vibe coding or whatever,
00:22:51.420 | with AI coding assistants,
00:22:52.420 | I'm frankly exhausted by the end of the day.
00:22:54.420 | It's a deeply intellectual exercise.
00:22:56.420 | And so I think the name is unfortunate,
00:22:59.420 | but the phenomenon is great.
00:23:05.420 | Over the last year,
00:23:06.420 | a few people have been advising others
00:23:09.420 | to not learn to code,
00:23:10.420 | on the basis that AI will automate coding.
00:23:13.420 | I think that will be looked back on
00:23:15.420 | as some of the worst career advice ever,
00:23:17.420 | because over the last many decades,
00:23:20.420 | as coding got easier,
00:23:21.420 | more people started to code.
00:23:24.420 | It turns out, you know,
00:23:26.420 | when we went from punch cards to keyboards,
00:23:29.420 | that turned out very well.
00:23:32.420 | It turns out one of the most important skills
00:23:35.420 | in the future,
00:23:36.420 | for developers and non-developers alike,
00:23:37.420 | is the ability to tell a computer
00:23:39.420 | exactly what you want,
00:23:41.420 | so it will do it for you.
00:23:43.420 | And I think understanding at some level
00:23:46.420 | how computers work
00:23:47.420 | lets you prompt, or instruct,
00:23:49.420 | a computer much more precisely,
00:24:12.420 | which is why I still try to advise everyone
00:24:15.420 | to learn one programming language,
00:24:17.420 | Python or something.
00:24:19.420 | And then, I think,
00:24:21.420 | I personally am a much stronger
00:24:23.420 | Python developer than JavaScript developer,
00:24:27.420 | right?
00:24:29.420 | But with AI-assisted coding,
00:24:31.420 | I now write a lot more JavaScript
00:24:33.420 | and TypeScript code than I ever used to.
00:24:36.420 | And even when debugging JavaScript code
00:24:38.420 | that something else is building,
00:24:40.420 | really understanding, you know,
00:24:43.420 | what an error message is,
00:24:44.420 | what does this mean,
00:24:46.420 | that's really helpful.
00:24:50.420 | If you don't like the name,
00:24:53.420 | do you have a better name in mind?
00:24:55.420 | That's a good question.
00:24:56.420 | I should think about that.
00:24:57.420 | We'll get back to you on that.
00:24:59.420 | Good question.
00:25:01.420 | One of the things that you announced recently
00:25:03.420 | is a new fund,
00:25:05.420 | congrats on that.
00:25:06.420 | For people in the audience
00:25:08.420 | who are thinking of starting a startup
00:25:11.420 | or looking into that,
00:25:12.420 | what advice do you have for them?
00:25:15.420 | Thank you. It's been interesting doing
00:25:17.420 | investments in companies that we co-founded.
00:25:20.420 | I think, in terms of
00:25:22.420 | looking back on lessons learned,
00:25:24.420 | I would say the number one predictor
00:25:28.420 | of a startup's success is
00:25:30.420 | speed.
00:25:31.420 | I know it sounds obvious,
00:25:33.420 | but a lot of people have never seen
00:25:36.420 | the speed with which a skilled team
00:25:38.420 | can execute.
00:25:40.420 | If you've never seen it before,
00:25:41.420 | and I know many of you have seen it,
00:25:42.420 | it's just so much faster than, you know,
00:25:47.420 | how slower businesses
00:25:49.420 | do things.
00:25:50.420 | And I think the number two predictor,
00:25:52.420 | also very important, is
00:25:54.420 | technical knowledge.
00:25:55.420 | It turns out, if we look at the skills,
00:25:58.420 | some things like how you market,
00:26:00.420 | how you sell, how you price,
00:26:02.420 | you know, all of that is important,
00:26:04.420 | but that knowledge has become
00:26:05.420 | a little bit more widespread.
00:26:07.420 | The knowledge that's really rare is:
00:26:09.420 | how does this technology actually work,
00:26:10.420 | since it's changing so quickly?
00:26:12.420 | I have deep respect for go-to-market people.
00:26:15.420 | Pricing is hard, you know,
00:26:17.420 | marketing is hard, positioning is hard,
00:26:19.420 | but that knowledge is more diffuse,
00:26:21.420 | and the most rare resource is someone
00:26:23.420 | that really understands
00:26:24.420 | how the technology works.
00:26:27.420 | So when I find technical people
00:26:30.420 | that have good instincts,
00:26:31.420 | that understand do this, don't do that,
00:26:34.420 | they can go twice as fast.
00:26:36.420 | And then I think
00:26:38.420 | a lot of the business stuff, you know,
00:26:40.420 | not to knock it,
00:26:41.420 | it's very important,
00:26:42.420 | but it's usually easier to figure out.
00:26:44.420 | Alright, that's great advice,
00:26:46.420 | thanks for sharing.
00:26:48.420 | We are gonna wrap this up.
00:26:49.420 | We're gonna go to a break now,
00:26:50.420 | but before we do,
00:26:51.420 | please join me in
00:26:52.420 | giving Andrew a big hand.
00:26:54.420 | Thank you.
00:26:55.420 | Our next session will begin in 15 minutes,
00:27:04.420 | during which, of course,
00:27:05.420 | you'll have an opportunity
00:27:06.420 | to hear from our fellow attendees
00:27:12.420 | at NOVA.
00:27:13.420 | If you'd like to meet the LangChain team,
00:27:19.420 | who are also in NOVA,
00:27:20.420 | come say hello,
00:27:22.420 | and we'll be passing out special swag.
00:27:25.420 | Please return to the main stage
00:27:27.420 | in 15 minutes
00:27:28.420 | for our next session.
00:27:40.420 | So, let's go.