Marc Benioff | All-In Summit 2024
Chapters
0:00 Sacks intros Marc Benioff
2:32 Funding Shinya Yamanaka's research
11:38 Marc on building philanthropy into Salesforce
17:15 AI's impact on enterprise software and the cloud
27:56 Salesforce's AI approach
00:00:00.000 |
It is the center of the technology world right now. 00:00:15.040 |
One of the things that really matters to me is having a positive global impact. 00:00:21.360 |
Technology is not good or bad, it's what you do with it that matters. 00:00:25.040 |
In your quest to change the world, don't forget to do something for other people. 00:00:31.600 |
And that was a moment in time when I said, wow, when I start a company, I'm going to 00:00:36.920 |
make sure that philanthropy and giving and generosity and these values are in the culture 00:01:08.640 |
I warned you that I'm not the interviewer in the group, but you chose me, so I'm honored. 00:01:21.880 |
And you're the one that all the women really like. 00:01:28.140 |
Like I'll talk to my friends at dinner, they're like, you know, Sax, what's he like? 00:01:33.740 |
Well, let's just say we're honored to have Marc Benioff here, who's truly a visionary 00:01:40.400 |
And I would say, you know, there's probably a lot of- 00:01:43.400 |
And I thank my mother for writing that video, by the way, as well. 00:01:50.200 |
You know, in the world of business software in particular, we don't have that many people 00:01:54.720 |
who you can describe as visionaries, but you consistently have been one. 00:02:29.080 |
Can I just, before we start, you know, listen. 00:02:33.080 |
So I want to just do something I would not normally do, and this is going to be a little 00:02:36.880 |
bit of a thing, but I just have to do a little riff on this. 00:02:40.540 |
We just heard an extraordinary presentation on an extraordinary man, and there's somebody 00:02:46.120 |
who's amazing that most people don't get to hear of, and we just heard his name quite 00:02:53.160 |
He is based in Kyoto, Japan, but he works half-time at UCSF, and it's amazing what his 00:03:01.960 |
vision for the world is that he thinks, basically, that we're salamanders and we're going to 00:03:06.320 |
be able to regenerate ourselves, and that's amazing. 00:03:10.820 |
And so I've been friends with him maybe for a decade, but I fund his research, and so 00:03:15.880 |
a lot of these things to watch him have these breakthroughs, you heard about the Yamanaka 00:03:19.720 |
factors, the Yamanaka factors, which are basically this idea that Yamanaka had this breakthrough 00:03:26.760 |
in Kyoto, you know, basically he's hanging out there in his lab eating the sushi, the 00:03:32.000 |
whole thing, and then boom, and he goes, "If I take these four things, I can take an ordinary 00:03:39.760 |
skin cell, just any little skin cell, and turn it into a stem cell," which is like the 00:03:46.720 |
heart of human existence, and he did it, and he was able to repeat it and repeat it and 00:03:51.280 |
repeat it, and he won the Nobel Prize for it, pretty cool. 00:03:56.760 |
And then he, and I'm going to get the pronunciation of this wrong, but he then was able to take 00:04:01.680 |
that stem cell, put it into your eye, if you have macular degeneration, and boom, healed 00:04:13.440 |
Then he worked with a buddy of his in the lab next door, and he took the same thing, 00:04:18.440 |
took the stem cells, and he turned it on a cookie sheet, and it looked like it was like 00:04:25.320 |
a plastic thing on the cookie sheet, it was really cool, and then he took out somebody's 00:04:30.680 |
cornea that was all screwed up, cut the material out of the cookie sheet, popped it in the 00:04:36.520 |
eye, and the guy could see, and was like, "Amazing," then he's like, "Listen, this is 00:04:43.120 |
amazing, I bet I can grow a brain," so he took the stem cells, and he started growing 00:04:51.360 |
brains called organoids, and he's like, "Got a cookie sheet of brains," and I'm like, "Really?" 00:04:57.280 |
He's like, "This is amazing, look at all the brains," and then I went and saw him and had 00:05:03.600 |
lunch with him, and I'm like, "What's happening with the brains?" 00:05:12.480 |
I'm like, "Oh, scary," then I said to him, "Now what are you doing?" 00:05:19.080 |
"Oh, I'm growing intestines," I'm like, "Whoa, intestines, is that good?" 00:05:25.000 |
He's like, "Huge idea, I can now grow intestines on the cookie sheet, and taking the stem cells, 00:05:31.600 |
I've got a whole intestine here, and then he can turn it into a lab for all the horrible 00:05:36.880 |
things that people get in their gut, and all these diseases that have never been cured, 00:05:41.400 |
but now you have a real simulated environment," so he's an incredible person. 00:05:48.560 |
I'm going, I'm going with this, you got to stay with me, I'm trying to help bring the 00:05:53.440 |
energy up in here, follow, just hold on, hold on, hold on, hold on, wait, wait, wait, this 00:06:01.160 |
So then I'm like, you heard the story, like at the end they said, "Listen, how do I get 00:06:06.800 |
these regenerative factors going inside myself?" 00:06:09.740 |
So UCSF just published research based on a funding grant that I and others have given them, and 00:06:15.800 |
they had a breakthrough that the regenerative factor inside your own blood is called PF4, 00:06:22.040 |
and the way you get PF4, and I'm not going to get this exactly right, because you know, 00:06:25.200 |
I'm in software, I'm not a doctor, so just follow with me. 00:06:28.120 |
I thought that's what we're going to talk about today. 00:06:29.400 |
You know that, I know, but I got to tell you this, because I got so jacked up watching that. 00:06:33.800 |
One is, it was either that or those crazy shots you have backstage, I don't know. 00:06:39.800 |
Number one is, PF4, you get more regenerative factors in your body, like calorie restriction, 00:06:47.920 |
and if you know David and I, that does not sound very good. 00:06:51.840 |
Two, working out with weights, also not exactly our top thing. 00:07:04.120 |
Parabiosis kind of came out of research covered a decade ago in the New York Times and others, 00:07:09.160 |
which came from Saul Villeda, another person I work with at UCSF, where they took the blood 00:07:14.600 |
of a young mouse and put it into an old mouse, and then the old mouse got young again. 00:07:21.640 |
And that was moving the PF4 into that old mouse. 00:07:27.080 |
And the fourth thing is Klotho therapy, which is a gene therapy that I don't really understand. 00:07:31.120 |
These four things can start to generate more of these things inside your body. 00:07:34.920 |
So then I'm like getting excited, I'm like, God, I have these problems, maybe I can regenerate 00:07:42.240 |
And so I'm talking to my doctor at UCSF, because I'm going through my own serious problem, 00:07:47.520 |
where I'm like, my left leg is like a half an inch shorter than my right leg, and I'm 00:07:51.480 |
running on the treadmill, and I'm always ripping my Achilles, ripping, ripping. 00:07:55.820 |
And all of a sudden, my Achilles looks like it has a donut. 00:07:59.080 |
And in fact, I went to UCSF, and there was like an MRI, you know, well, how many of you 00:08:04.280 |
Raise your hand so you know how horrible it is. 00:08:06.000 |
Anyway, you get in this big machine, they're looking at my Achilles, they come out, they're 00:08:14.320 |
And I'm like, so I kind of took this thought, and I'm like talking to my doctor, I'm like, 00:08:18.960 |
why can't we like, use some of this, figure out what we can do. 00:08:23.560 |
So he's like, all right, come back on Wednesday. 00:08:27.040 |
So I come back on Wednesday, at five o'clock, you know, I'm in Mission Bay at UCSF. 00:08:33.480 |
And I'm like, hey, Anthony, where is everybody? 00:08:36.520 |
I think we're going to talk about it, come into my lab. 00:08:39.540 |
So I come into the lab, they've got like a centrifuge there, all this stuff going on. 00:08:53.560 |
It's going to be very straightforward because we have two things we can do with you, Mark. 00:08:57.520 |
Number one, we can just take your Achilles, and we can bring you into surgery right now, 00:09:01.160 |
we'll just shave off half your Achilles, and then put you in a boot and see where you are 00:09:10.360 |
What we're going to do is we're going to take a scalpel, right here, we're going to cut 00:09:18.580 |
into your Achilles like 20 times and into your ankle. 00:09:21.800 |
I'm going to take your blood, I'm going to spin it, I'm going to try to find the PF4 00:09:25.000 |
in your plasma, I'm going to inject it into your Achilles and into your ankle, slice 00:09:39.120 |
Because it destabilizes the PRP and the plasma and all the PF4 and all that. 00:09:47.800 |
So he did the whole thing, and then boom, I'm like a salamander. 00:09:54.920 |
So that thing that you just heard, that shit is real, and it's pretty awesome. 00:10:09.120 |
So if he can raise $3 billion for his startup, I should probably start, I'm ready to go. 00:10:14.480 |
Do you ever consider that you missed your calling as a scientific researcher? 00:10:23.560 |
So you are one of the first to actually try using the Yamanaka factors on yourself? 00:10:29.400 |
I wouldn't think I'm one of the first, but I think that it's very real and it's going 00:10:33.360 |
to have a huge impact on our lives, and I think that we should be supporting these medical 00:10:39.720 |
I think it's one of the reasons that I've put almost $1 billion into UCSF and philanthropy, 00:10:44.760 |
because I believe in these people who have dedicated their lives to basic science and 00:10:55.440 |
They're so inspiring to me, and I just had lunch with Yamanaka and Saul Villeda and Anthony 00:11:01.360 |
Luke and another incredible researcher, Mark Moasser, at my house, and we're talking about 00:11:06.520 |
the intersection between oncology and regenerative medicine, which is like two completely different 00:11:13.520 |
And it's what inspires me, that we can work with others to give them the entrepreneurial 00:11:20.200 |
push to go do something incredible, and these people are just awesome. 00:11:26.480 |
So let's shift gears and talk about something else. 00:11:31.440 |
I know you're very philanthropic and do it a lot with UCSF, so kudos to you for encouraging 00:11:38.600 |
Let's shift to another thing that's having a huge impact on our lives, which is the cloud 00:11:50.040 |
And how long have you been a public company for at this point? 00:12:01.880 |
Well, actually, speaking of earnings, here, let's see if we have this slide. 00:12:10.200 |
This is your stock chart over 25, I think 25 years, 20 years. 00:12:16.160 |
I guess there's no linear success, exactly, right? 00:12:32.480 |
Actually, this is one of the things I appreciate about the way you do earnings calls, is you 00:12:35.280 |
just put out this really simple tweet, and it shows a progression. 00:12:40.480 |
And if you like looking at numbers the way I do and seeing patterns in them, one of the 00:12:45.720 |
things I noticed a while ago was that if you start at the bottom and work your way to the 00:12:49.440 |
top, that Salesforce is growing by about 20% a year. 00:12:55.320 |
And if you look at it over three years, that's roughly a double. 00:12:59.740 |
So every three years, Salesforce was doubling. 00:13:03.040 |
And that means that over a decade, it's growing 10x. 00:13:06.560 |
And so every decade is basically exponential. 00:13:11.520 |
That was one of the patterns I noticed with Salesforce. 00:13:13.320 |
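[Taken loosely, the pattern Sacks describes is just compound growth. A quick back-of-the-envelope check (illustrative arithmetic only, not Salesforce's actual figures) shows that a flat 20% a year compounds to roughly 1.7x over three years and about 6x over a decade, and that the rate making both rules of thumb exact — doubling every three years and 10x every ten — is closer to 26%:]

```python
# Back-of-the-envelope check of the "double every 3 years, 10x per decade" pattern.

def growth(rate, years):
    """Cumulative revenue multiple after compounding `rate` annually for `years` years."""
    return (1 + rate) ** years

# At exactly 20% a year: ~1.73x over three years, ~6.2x over a decade.
print(round(growth(0.20, 3), 2), round(growth(0.20, 10), 1))

# The rate that doubles every 3 years is 2**(1/3) - 1, about 26% a year,
# and compounding it for ten years lands almost exactly on 10x.
r = 2 ** (1 / 3) - 1
print(round(r * 100, 1), round(growth(r, 10), 2))
```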
Look, I think that the growth, obviously, is incredible, the $38 billion. 00:13:21.560 |
It's more than Coca-Cola did, I think, last quarter. 00:13:26.200 |
But let me just say, probably the best decision we made, and it's not on the slide, which 00:13:30.920 |
is the day we started the company, we put 1% of our equity, 1% of our profit, 1% of 00:13:39.000 |
our product, 1% of all of our employees' time into a 501(c)(3) foundation. 00:13:45.280 |
Now at the time, it was very easy, because we had no employees. 00:13:53.100 |
But that idea, though, really kind of created the foundation of the company, because we 00:13:59.140 |
were able to do now, and I think you know the numbers, right, we're almost 10 million 00:14:05.520 |
We've been able to give away almost a billion in grants. 00:14:08.000 |
We run almost 100,000 nonprofits and NGOs for free on our service. 00:14:12.400 |
And I think it really set the stage that business could be the greatest platform for change 00:14:20.120 |
So is there $2 billion in equity sitting in that 501(c)(3) at this point? 00:14:27.340 |
I think there's about a half a billion in the foundation, and a lot has been already 00:14:31.240 |
And then we give out more every year, and every month, every day, whatever. 00:14:35.600 |
But like on Monday, we'll give another $25 million approximately to the San Francisco 00:14:43.100 |
And that is, you know, we've given them about $150 million. 00:14:46.740 |
I mean, it's obviously, I went to public schools, it was very important to me, but my mother 00:14:53.180 |
was a teacher in the San Francisco public schools. 00:14:56.740 |
But also our employees, you know, we have 75,000 employees, their kids are in the public 00:15:03.100 |
And so it's a key part of our mantra and our culture that we're trying to support public 00:15:12.060 |
I really think that each one of us needs to focus more on the public education system 00:15:19.100 |
It's something I encourage in, not all my employees, but whenever I do a presentation, 00:15:23.140 |
I'm like, you know, my public school is like a block from my house, Presidio Middle School, 00:15:28.300 |
and I just went down there and knocked on the door, and they're like, "Who are you?" 00:15:39.040 |
They need a new playground, they need this, they need that. 00:15:42.660 |
And maybe they just need some support, moral support. 00:15:47.620 |
But it's been a great thing to really anchor the company in those values, and I think it's 00:15:55.340 |
So what did you think when you saw that OpenAI started with a nonprofit, not as 1%, but as 00:16:06.920 |
I mean, 18,000 companies have now followed our 1-1-1 model. 00:16:12.300 |
You can find out about it at pledge1percent.org. 00:16:21.600 |
The cloud model, which you also have been part of that, the subscription model, you've 00:16:26.860 |
also been part of that, and the philanthropic model, and you've been part of that. 00:16:30.600 |
And those ideas that we're doing three models, that continues to be the fuel for the company 00:16:38.520 |
And I think that for a lot of these companies that have followed us, that have gone on to 00:16:42.020 |
scale and have had huge IPOs, and whether it was Slack or whether it was Atlassian or 00:16:47.460 |
whether it was Twilio or whatever, they've had these huge foundations and have had huge 00:16:54.540 |
And business can be the greatest platform for change, and you can do a lot with your 00:17:06.620 |
But we can also do a little more with our business, and we can use it in a positive 00:17:10.540 |
way and try to move the world maybe a little bit more in the right direction. 00:17:15.980 |
So let's talk about the cloud part of that innovation. 00:17:26.580 |
We're at the precipice of the greatest moment in the history of enterprise software and 00:17:35.980 |
I had a moment, I would say, more than a decade ago, which I call my kind of AI freakout moment, 00:17:41.700 |
where I really felt ... I mean, maybe it's ... Obviously, we've all spent ... How many 00:18:01.500 |
But we've all seen the movies, and like Peter Schwartz, who wrote or was a key part of writing 00:18:06.420 |
Minority Report and also WarGames, is our chief futurist at Salesforce. 00:18:13.520 |
And a decade, more than a decade ago, I had this moment where I was like, "Okay. 00:18:19.220 |
And bought a bunch of companies and put together Einstein, and Einstein has done amazing. 00:18:23.860 |
It's doing a trillion, a trillion and a half transactions a week, predictive, 00:18:32.900 |
But now I'm really convinced that we are now really at the moment, right now, where enterprise 00:18:40.420 |
software is going to be completely transformed with artificial intelligence. 00:18:45.180 |
And we're going to see it, and obviously, I'm getting tuned up for Dreamforce, which 00:19:03.540 |
Since you're not going to be there, let me tell you what's going to happen. 00:19:09.260 |
Anyway, number one is we're going to ... We really see a moment right now where we are 00:19:17.620 |
100% focused on one thing and one idea, and I can tell you why that is if you're interested. 00:19:25.100 |
And AgentForce is the most exciting thing I have ever worked on in my career. 00:19:31.540 |
It's the culmination, really, of everything that we've done at Salesforce. 00:19:34.580 |
Because to make AgentForce really deliver, we had to have all of our customer touchpoints 00:19:41.280 |
We have to have an amalgamated data cloud, because we need the data especially to achieve 00:19:51.300 |
It's these three layers that are really going to deliver this next generation capability. 00:19:56.460 |
And I was just with Disney last night, and Disney has AgentForce. 00:20:00.640 |
They have the newest version, which we call Atlas, which is our most accurate, not just 00:20:04.740 |
model, but we have an extremely unusual technique that we'll talk about. 00:20:08.820 |
And Atlas delivers for Disney, for their cast members, which are their employees, through 00:20:14.240 |
extremely complex problems that it's solving for them. 00:20:20.080 |
More than 90% accuracy and almost no hallucinations, and in some cases, 95% accuracy and almost 00:20:27.460 |
And that idea that we can kind of come in to a very difficult and complex and sophisticated 00:20:34.700 |
Now, with Disney, if you go to DisneyStore.com, that's Salesforce. 00:20:39.660 |
If you go to the Disney parks, do you still go to Disneyland? 00:20:46.820 |
It's great, because you get to cut around the lines and all that. 00:20:47.820 |
How many of you have done the Disney guides thing? 00:20:49.820 |
We've got a lot of poor people here, actually. 00:20:54.940 |
Anyway, you should get these Disney guides, because they get you around the lines, and 00:20:59.980 |
you've got to do 30 rides a day, and it's much better than having to wait. 00:21:05.620 |
But anyway, Disney guides run on Salesforce, they have Slack, too. 00:21:13.140 |
We have Disney+, because ServiceNow fell over, and we had to replace that inside the 00:21:21.740 |
We do the Disney cruises, and the Disney real estate, and we have every Disney customer 00:21:28.540 |
So the amalgamated dataset that we have around Disney is awesome. 00:21:32.100 |
So when we can take that Disney dataset, and then we apply Atlas and AgentForce- 00:21:38.460 |
We deliver a level of accuracy that has been incredible. 00:21:42.660 |
And I've got a couple more examples, I can tell you, that are just blowing my mind. 00:21:46.380 |
And I never thought it was really possible, but now it really is. 00:21:57.820 |
Because we're starting to hear this term a lot, but I think a lot of people here may 00:22:02.380 |
not know what that means in the context of AI. 00:22:07.940 |
Is that Agent Smith, or what are we talking about? 00:22:09.540 |
Well, we're at some level, I mean, I think like, I'll give you an example that we're 00:22:14.540 |
working with a large medical company not too far away from here, Kaiser. 00:22:19.180 |
They've got 20 million patients, they have a super complex dataset, they have all of 00:22:22.940 |
the data from Epic; they are the largest Epic customer in the world. 00:22:28.000 |
And more than 90% of all patient inquiries and scheduling requests and schedule my doctor, 00:22:35.100 |
my CT scan, my MRI, my this, my that, are being resolved by AgentForce and Atlas. 00:22:41.540 |
That idea that we can resolve through an autonomous agent a deep and complex customer interaction 00:22:51.540 |
Obviously we have to do a few things to make it really work for our customers. 00:22:54.440 |
Number one is, it's got to be trusted, because our customers, we're running the largest banks, 00:23:00.700 |
sales companies, media companies, CPG companies, blah, blah, blah, blah, blah in the world. 00:23:07.620 |
It can't be some separate team that they're going to spin up. 00:23:10.460 |
It's their existing Salesforce team, it's happening within the Salesforce platform. 00:23:14.780 |
It's got to be open, it has to be able to work with and interoperate with other systems. 00:23:20.260 |
It's going to have to be multimodal, so it's going to have to speak to them and have voice 00:23:24.220 |
and video and do all of those kind of incredible capabilities. 00:23:28.040 |
And one other key thing, because evidently the humans have not gone away. 00:23:33.700 |
The doctors have not gone away from Kaiser, and the cast members have not gone away from 00:23:38.420 |
Disney and on and on, so we're going to have to handshake seamlessly with our apps. 00:23:44.220 |
So even though we have all these apps and we've wired up all these customer touchpoints, 00:23:48.340 |
the agents are autonomously interacting with and building the data and metadata and extending 00:23:55.420 |
So by the end of this month, we'll have more than a thousand customers on our AgentForce 00:24:00.880 |
The efficiency and productivity that we've had with AgentForce is like nothing I 00:24:06.360 |
have ever seen with any of our customers or technology in the history of software. 00:24:11.160 |
But there's a second point, it isn't just about this kind of ease of use. 00:24:15.300 |
It's that they have the ability to do things that are truly astonishing, and that is also 00:24:23.760 |
So they can go out and like on a day like today, like it's a hundred and something degrees 00:24:27.680 |
outside, I don't know if you've been out there, it's pretty hot. 00:24:30.920 |
And Disneyland may not be as full today as it's going to be, and they knew that was going 00:24:35.200 |
to be true two days ago that a heat wave was coming. 00:24:38.240 |
Disney can proactively go out to their consumers and their customers and say, "Hey, come enjoy 00:24:43.440 |
the heat with us all, you know, at Disneyland, and we're going to give you a special promotion 00:24:48.360 |
or price or contest or whatever it is to come to Disneyland." 00:24:52.120 |
So we want to be able to proactively go out and generate revenue, and we also want to 00:24:56.400 |
be able to kind of bring that customer service in. 00:24:59.560 |
I think last night I had dinner in Beverly Hills at the Grill. 00:25:23.760 |
Anyway, you can use OpenTable to make restaurant reservations, and there's 160 million consumers 00:25:36.400 |
They're not in this room, but they're somewhere, and they've got also 60,000 restaurants, and 00:25:42.520 |
they've got a lot of complex issues, you know, in regard, you know, I didn't get my table 00:25:47.280 |
or my food wasn't right, my potato didn't get cooked, whatever it is. 00:25:51.320 |
These things are going to get worked out, but also, all of a sudden, the restaurant's 00:25:54.400 |
like, "Oh, look, we're not as full tonight as we want to be, and we're willing to do, 00:25:58.000 |
let's go out to our customer base and bring them in, but let's do it through a complex 00:26:02.300 |
conversation, you know, an empathic conversation as an agent with our customers." 00:26:09.400 |
Okay, so how long will it be until when you call a customer support center, you're talking 00:26:16.880 |
to an AI that sounds like a human and you can't tell the difference? 00:26:24.320 |
We already have that live, and we will have that scaled, with thousands of customers 00:26:31.000 |
live before the end of this year. 00:26:38.320 |
And we just, I just demoed it, I was just at a conference and spoke a couple miles away 00:26:42.800 |
from here at KPMG, and we showed them that exact situation where, you know, through, 00:26:49.400 |
you know, we used to call, you know, this kind of voice response system, whatever. 00:26:53.880 |
But you would kind of hit a wall pretty quickly with your bot, you know. 00:27:01.800 |
These are like, we're really getting to, like, another level capability, and I think that 00:27:08.640 |
And I think in the example of Disney, you know, Google has some great products. 00:27:11.800 |
I know Sergey was here yesterday, and they've done a great job with AI, as you know. 00:27:16.280 |
But in a head-to-head benchmark of Salesforce's AgentForce against Google's AI, we 2X them 00:27:24.560 |
And the reason why, as we'll explain it next week, you know, it's a couple of things. 00:27:30.220 |
Not only are there next-gen models, but there are also new techniques involving next-generation 00:27:35.400 |
retrieval augmented generation, RAG techniques that no one has seen before, and it's really 00:27:44.440 |
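[As a rough illustration of what retrieval-augmented generation means here — a toy sketch with made-up CRM records and naive word-overlap scoring, not Salesforce's actual technique — the key move is that the model is constrained to answer from records retrieved out of the customer's own data rather than from its parametric memory, which is the general mechanism RAG uses to raise accuracy and cut hallucinations:]

```python
# Toy sketch of retrieval-augmented generation (RAG): retrieve the most
# relevant records, then build a prompt that grounds the model in them.
# Scoring is naive word overlap; production systems use vector search.

def retrieve(query, records, k=2):
    """Return the k records sharing the most words with the query."""
    q = set(query.lower().split())
    scored = sorted(records,
                    key=lambda r: len(q & set(r.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query, records):
    context = "\n".join(retrieve(query, records))
    # Constraining the answer to retrieved context is what pushes
    # hallucination rates down relative to free-form generation.
    return f"Answer using ONLY this context:\n{context}\n\nQuestion: {query}"

crm = [  # hypothetical amalgamated customer records
    "Guest 412 visited Disneyland twice in July",
    "Guest 412 subscribes to Disney+",
    "Guest 977 cancelled a cruise booking",
]
print(build_prompt("What do we know about guest 412 at Disneyland", crm))
```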
Well, they're a good partner, also, a customer, and I love them, but yeah, it's competitive. 00:27:51.080 |
We're all trying to make AI a little more accurate, with fewer hallucinations 00:27:57.180 |
Let me give the audience a little update about something we just heard at OpenAI. 00:28:01.040 |
They just did a day where they brought in a relatively small number of investors and 00:28:06.440 |
kind of gave us all an update on their product roadmap. 00:28:09.240 |
And it sounds kind of similar, because everyone's moving in the same direction. 00:28:14.480 |
Number one was that they said that LLMs would soon be at PhD-level reasoning. 00:28:20.720 |
Right now, it's more like a smart high school or college student in terms of the answers. 00:28:27.280 |
Shortly behind that is agents, like you're talking about, and then third and closely 00:28:31.600 |
related is that agents will have the ability to use tools, and a tool can be a website. 00:28:38.360 |
So if you think about it now, you've got this LLM. 00:28:44.880 |
You can give it an objective, it will break that objective into a list of tasks, and those 00:28:50.360 |
tasks can include using other pieces of software. 00:28:54.880 |
And thanks to things like the audio API that OpenAI just launched, which developers can use, 00:29:04.520 |
the LLM can now basically pretend to be a human, and, you know, it won't be hard to 00:29:09.880 |
find a piece of software to enable a phone call. 00:29:12.600 |
So you can imagine telling a personal assistant agent that, and it could be, you know, it 00:29:19.200 |
could be OpenTable, that, "Hey, book me a dinner reservation at the grill." 00:29:24.900 |
And it could place a phone call on your behalf and actually talk to the grill. 00:29:27.760 |
It could also go on OpenTable and just use OpenTable and book it, but if for some reason 00:29:33.040 |
that didn't work, it could literally place a phone call on your behalf, and the person 00:29:36.360 |
picking up at OpenTable wouldn't even know that your agent actually isn't a human, it's 00:29:42.960 |
But here's where I think it gets really crazy, is when the phone gets picked up on the other 00:29:48.000 |
end, that could be an AI too pretending to be a human. 00:29:51.500 |
So you could have two AIs pretend to be humans talking to each other and resolving tasks 00:29:57.120 |
And I literally, I think that's where it's headed. 00:29:59.880 |
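[The loop Sacks is describing — objective in, task list out, tool calls with a fallback — can be sketched in a few lines. Everything below is a hypothetical stand-in: the tool names (book_via_opentable, place_phone_call) are invented, and the plan is hard-coded where a real agent would have an LLM generate it:]

```python
# Minimal sketch of the objective -> task list -> tool-use loop described above.
# The planner is a stub standing in for an LLM; tool names are hypothetical.

def book_via_opentable(restaurant):
    # Pretend the website booking fails, forcing the fallback path.
    return None

def place_phone_call(restaurant):
    # Fallback tool: the agent "calls" the restaurant on the user's behalf.
    return f"Reservation at {restaurant} confirmed by phone"

TOOLS = {"opentable": book_via_opentable, "phone": place_phone_call}

def run_agent(objective, restaurant):
    """Break the objective into ordered tool calls; stop at the first success."""
    plan = ["opentable", "phone"]  # an LLM would produce this task list
    for tool_name in plan:
        result = TOOLS[tool_name](restaurant)
        if result is not None:
            return result
    return f"Could not complete: {objective}"

print(run_agent("book me a dinner reservation", "the Grill"))
```

The fallback step is the part Sacks highlights: when the structured path (the website) fails, the agent reaches for a more general tool (a phone call) to finish the same objective.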
We're definitely moving in this direction, but there's a cautionary tale here. 00:30:03.640 |
And I think that I'll just tell you the real-world experience with my customers and the problems 00:30:10.160 |
I think in the last few years, what we've kind of heard, and, you know, some of it has 00:30:14.360 |
come from OpenAI, but especially from Microsoft, that we're in this co-pilot world, and these 00:30:21.600 |
The level of accuracy, the spillage of information, the lack of trusted environment, co-pilot 00:30:28.840 |
And that idea that this kind of amount of technology got released and sold into these 00:30:35.360 |
very large customers, telling them that the promise of AI is here, but didn't do it in 00:30:40.800 |
a trusted way, didn't do it with the level of accuracy, didn't do it with the level of 00:30:45.640 |
And one of the things that was interesting, because I was with one of the customers, trying 00:30:49.380 |
to do this exact technique that you're talking about, which is a large telecommunications 00:30:55.760 |
And what this company did is take a model and try to-- 00:31:02.000 |
We're going to just tell you, training a model, retraining a model, building their own model. 00:31:09.460 |
We're going to DIY our AI, and it's going to be awesome. 00:31:12.760 |
Then we're going to write our own agents, and we're going to do this, we're going to 00:31:16.440 |
And I'm sitting there, and I'm going through it, and whatever, and then I finally am like, 00:31:19.360 |
"Now, show me your benchmarks, and show me all these different pieces." 00:31:22.840 |
And you know, for them, it's a bit of a science project. 00:31:24.880 |
And I've seen this now with a number of our customers, that they're kind of DIY-ing their 00:31:30.040 |
And you know, DIY, I think it's fine if you're like Neil Young, and it's homegrown, and it's 00:31:37.960 |
But this is not what you should be doing with your artificial intelligence. 00:31:41.840 |
But what are you guys using as your foundation model? 00:31:45.240 |
Like, what do you guys use for your foundation? 00:31:46.240 |
We have a lot of our own models, our own techniques, our own... 00:31:49.680 |
And then we let you bring in the model that you want, but we are all about achieving your 00:31:55.600 |
Because what I've seen with these kind of approaches, especially the one that you just 00:31:58.920 |
outlined, is that, yeah, you can get maybe 30 or 40% accuracy. 00:32:04.080 |
You know, in this case, this customer is 25%. 00:32:06.880 |
You had somebody on the stage yesterday, I won't tell who it is, he's a common friend 00:32:10.000 |
of both of ours, who tried to take this approach for a large telecommunications company that 00:32:14.960 |
he owns, and he said he was getting about a 25% accuracy with this homegrown model. 00:32:21.440 |
Instead, in our platform, the platform is building the model for you. 00:32:25.440 |
You're not having to train and retrain your own models. 00:32:28.120 |
You're building your own models in our platform, and we're gonna deliver much higher levels 00:32:32.580 |
of accuracy for you, and we're gonna deliver AI. 00:32:38.960 |
This is this next generation of AI, and I think that we'll have to prove that with benchmarks 00:32:46.980 |
Because the promise is amazing, but at a very deep level, customers are gonna need, you 00:32:52.220 |
know, what you and I have done for the last, you know, 20 years of our life, which is build 00:32:55.700 |
professional enterprise software, and deliver it to them in a capability. 00:32:59.300 |
And in regards to an agent running enterprise software, I mean, you just saw, like, that 00:33:03.580 |
was the fundamental business model of Adept, which was David Luan's company, you know, 00:33:09.460 |
and that's, he built GPT-3, then he left OpenAI to start Adept, and this idea to build agents 00:33:16.300 |
I'm sure that all of those things are gonna happen, but again, you have to get to a level 00:33:21.060 |
of accuracy, because everyone in this room, and you and I, we've all had this experience 00:33:26.180 |
where we're on these models, and it's like, this is not really more than hallucinations, 00:33:32.100 |
and that's no good, or as we say here in Los Angeles, no es bueno, when it comes to, okay, 00:33:41.620 |
You know, when you're dealing with healthcare, and you've got a patient, and you're reading 00:33:44.700 |
their medical records, you better be delivering more than 90 or 95% accuracy, 'cause the 50% 00:33:52.860 |
- Well, I can see you're ready for Dreamforce. 00:34:05.780 |
- Are you guys excited for the rise of agents? 00:34:11.280 |
I think that everything we've seen so far with LLMs has been, again, about reasoning 00:34:16.620 |
and generating, but with agents, the AI's gonna be able to take actions, and they're 00:34:22.300 |
gonna know how to use tools, which, until now, has been something only humans could do. 00:34:26.060 |
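The agent loop being described here, an LLM that does more than generate text because it can choose tools and act on their results, can be sketched roughly as below. The model is a scripted stub standing in for a real LLM call, and the tool names, message shape, and loop structure are illustrative assumptions, not any specific product's API.

```python
def lookup_order(order_id):
    # Stand-in "tool": in a real agent this would query a CRM or database.
    return {"order_id": order_id, "status": "shipped"}

TOOLS = {"lookup_order": lookup_order}

def stub_model(messages):
    # Pretend model: requests a tool on the first turn, answers once
    # a tool result appears in the conversation.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool": "lookup_order", "args": {"order_id": "A123"}}
    return {"answer": "Order A123 has shipped."}

def run_agent(question, model=stub_model, max_steps=5):
    messages = [{"role": "user", "content": question}]
    for _ in range(max_steps):
        decision = model(messages)
        if "answer" in decision:           # model is done: return final text
            return decision["answer"]
        tool = TOOLS[decision["tool"]]     # model asked to act: run the tool
        result = tool(**decision["args"])
        messages.append({"role": "tool", "content": str(result)})
    return "Gave up after max_steps."

print(run_agent("Where is order A123?"))  # → Order A123 has shipped.
```

The key difference from plain generation is the branch in the loop: instead of always emitting text, the model can emit an action, and the harness executes it and feeds the result back in.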
- I gotta tell you a really good story, because you're, like, inspiring me around, you know, 00:34:30.740 |
Steve Jobs had a huge impact on my life, and I worked at Apple in 1984 when I was in high 00:34:37.900 |
school, and coming into college, and I was an assembly language programmer, and I wrote 00:34:42.580 |
the first native assembly language program on the Macintosh, on the 68000 assembler, and sitting 00:34:49.960 |
there in the cubes, and Steve was running it, whatever it was, and thank God, you know, 00:34:55.300 |
I had this relationship, and it influenced me so much in my life, and then he called me 00:34:59.660 |
a number of times, and after I started Salesforce, gave me really key advice. 00:35:03.580 |
Anyway, it was 2010, and he calls me, and he's like, come down here, I need to talk 00:35:09.100 |
I'm like, shit, what the hell, what did I do this time? 00:35:12.180 |
So I go down there to his office, and I always bring a few Salesforce employees with me, 00:35:16.380 |
and I've got some great folks with me, and we're sitting there, and he's like, I'm gonna 00:35:20.220 |
show you this, and I'm like, alright, let's go, and he brings out the iPad, and he's got 00:35:25.740 |
two of them, he's got the big one and the small one, and he's like, yeah, Mark, here 00:35:30.100 |
it is, but I don't like the small one, I'm only gonna have one size, you know that, and 00:35:34.540 |
I'm like, yes, sir, and he's like, listen, you know, I've been working on this concept 00:35:39.700 |
for a long time, and you know, in 2007, I introduced iPhone, and I said, thank you for 00:35:46.900 |
sending me one, I love it, it's great, he's like, but do you know why now we're doing 00:35:52.100 |
I'm like, no, because I know you had that, too, in 2007, and he's like, oh, yeah, but you know what, 00:35:56.060 |
the real situation here is that Apple, I'm like, what is it, Steve, he's like, we only 00:36:00.460 |
have one A-team here, one A-team, so we're only focused on one thing at a time, and then 00:36:05.980 |
he lays out, like, five or six products on his coffee table, and he goes, and we will 00:36:10.580 |
never have more products than can fit on my coffee table, and I'm like, well, that's really 00:36:15.420 |
awesome, and he's like, I've been focused on 2007 and the iPhone, and now I'm gonna 00:36:19.700 |
zero in, and I'm only gonna do iPad, one focus at a time, remember that, Mark, that's the 00:36:26.900 |
way you need to run Salesforce, and I'm like, okay, is that why you brought me down here? 00:36:37.460 |
And that's how I feel right now about AgentForce, this is all I am doing, just try to take our 00:36:42.260 |
company, you know, we have a great company, $38 billion in revenue, 75,000 employees, hundreds 00:36:47.940 |
of thousands of customers, and one focus, AgentForce, this is because of what you're 00:36:53.660 |
saying, this is the moment, this is the greatest opportunity in the history of enterprise software, 00:37:00.020 |
and it must be executed with absolute acuity and excellence, and that is what I think we 00:37:07.580 |
You know, so I agree with you, I mean, I think the agents are gonna be huge, and Elon said 00:37:15.300 |
something kind of similar the other day, when we got him talking about Optimus, 00:37:21.300 |
I just heard about the farm animals, I didn't know about the, what was, was there another 00:37:25.620 |
He said, well, he was talking about Optimus, and, oh, you, the thing 00:37:33.140 |
These jokes are all, each one is kind of dying very fast, it's sad. 00:37:35.380 |
It took me a second to realize that you were talking about his cock, but now I got it, 00:37:45.340 |
It's great how you bring this humor into the all-in, yeah. 00:37:53.260 |
So what Elon mentioned that really stuck with me is he said that humanoid robots, the creation 00:37:59.260 |
of these humanoid robots is the biggest economic opportunity in the history of the world. 00:38:07.700 |
He is, but, well, it's kind of like you saying that agents are the biggest opportunity in the history of enterprise software. 00:38:17.700 |
It strikes me that there's something similar here, which is... 00:38:21.460 |
Well, I'm saying there's an analogy here between... 00:38:28.580 |
Here, the point is this, is that where we're going with AI is it's gonna be able to take 00:38:35.220 |
real actions, and in the case of Optimus, it's in the physical world, and it's gonna 00:38:41.260 |
In the enterprise, it's basically the brain for these agents. 00:38:49.420 |
I wouldn't say they're competing, and so my point... 00:38:56.380 |
I think you're right about this opportunity, and what I'm saying is I think it's analogous 00:39:02.980 |
I think there's no question, and I think that for our customers, they're gonna augment their 00:39:12.180 |
We're gonna take some customers and just turn them into margin machines, and I think that 00:39:16.900 |
the opportunity in the enterprise is unbelievable. 00:39:19.540 |
He's also directly addressing the consumer market, which I think is very exciting. 00:39:23.860 |
Obviously, he's an expert in that area, and yeah, we're about to move into this new world 00:39:28.460 |
of AI, of droids, of all these things, and it's a bunch of waves of... 00:39:34.900 |
Look, technology is getting lower cost and easier to use. 00:39:38.860 |
It's a continuum, and we're all riding that continuum. 00:39:42.640 |
This is extremely important, but also what's very important is, especially as we move into 00:39:46.980 |
this, we all have to think about what are the values that are gonna guide this technology? 00:39:55.660 |
That was the one place where I got the hands to go up, right? 00:39:58.940 |
So we know how it can go really wrong, right? 00:40:13.420 |
What are the values as we guide into the next level of the future? 00:40:17.500 |
Because those core values that we need to manifest and really focus on, that is, I think, 00:40:25.040 |
It's gotta be figured out, and that is why we're very lucky that you are one of the great 00:40:29.780 |
visionaries of our industry, because you're not just a great entrepreneur and CEO, but