Marc Benioff | All-In Summit 2024


Chapters

0:00 Sacks intros Marc Benioff
2:32 Funding Shinya Yamanaka's research
11:38 Marc on building philanthropy into Salesforce
17:15 AI's impact on enterprise software and the cloud
27:56 Salesforce's AI approach

Whisper Transcript

00:00:00.000 | It is the center of the technology world right now.
00:00:04.340 | It's not what Marc did, it's when he did it.
00:00:09.160 | And the king of the cloud is Salesforce.
00:00:12.760 | Please welcome Marc Benioff.
00:00:15.040 | One of the things that really matters to me is having a positive global impact.
00:00:21.360 | Technology is not good or bad, it's what you do with it that matters.
00:00:25.040 | In your quest to change the world, don't forget to do something for other people.
00:00:31.600 | And that was a moment in time when I said, wow, when I start a company, I'm going to
00:00:36.920 | make sure that philanthropy and giving and generosity and these values are in the culture
00:00:42.140 | of the company from day one.
00:00:44.360 | You want to sit here?
00:00:48.800 | Or you want the couch?
00:00:52.120 | You want to sit on the couch?
00:00:54.200 | Where do you want to sit?
00:00:55.200 | I'll give you the couch, I'll sit here.
00:00:59.160 | You deserve the couch.
00:01:00.440 | You deserve it.
00:01:01.440 | The big couch.
00:01:03.440 | A little too close.
00:01:06.440 | It's nice to see you.
00:01:08.640 | I warned you that I'm not the interviewer in the group, but you chose me, so I'm honored.
00:01:16.880 | But you're the nice one.
00:01:17.880 | Oh, OK.
00:01:18.880 | Thank you.
00:01:19.880 | Am I right?
00:01:20.880 | Is he the nice one?
00:01:21.880 | And you're the one that all the women really like.
00:01:28.140 | Like I'll talk to my friends at dinner, they're like, you know, Sax, what's he like?
00:01:32.740 | He's amazing.
00:01:33.740 | Well, let's just say we're honored to have Marc Benioff here, who's truly a visionary
00:01:38.280 | in the world of software.
00:01:40.400 | And I would say, you know, there's probably a lot of-
00:01:43.400 | And I thank my mother for writing that video, by the way, as well.
00:01:46.480 | Mom, thank you for writing that for me.
00:01:50.200 | You know, in the world of business software in particular, we don't have that many people
00:01:54.720 | who you can describe as visionaries, but you consistently have been one.
00:01:58.160 | You really-
00:01:59.160 | It's true.
00:02:01.280 | You got the, I think, the whole-
00:02:02.640 | We're sitting now on the edge of the couch.
00:02:05.280 | Here we go.
00:02:06.280 | Maybe we'll end up on the floor.
00:02:07.280 | I don't know.
00:02:08.280 | What's going to happen?
00:02:09.280 | I'm trying to keep it engaging.
00:02:10.280 | Moving around a lot.
00:02:11.280 | Oh, OK.
00:02:12.280 | All right.
00:02:13.280 | Is this how it's going to be the whole time?
00:02:20.080 | Way worse.
00:02:21.080 | Way worse?
00:02:23.080 | All right.
00:02:24.080 | Let me finish this little intro here.
00:02:25.640 | I forgot where I was.
00:02:29.080 | Can I just, before we start, you know, listen.
00:02:33.080 | So I want to just do something I would not normally do, and this is going to be a little
00:02:36.880 | bit of a thing, but I just have to do a little riff on this.
00:02:40.540 | We just heard an extraordinary presentation on an extraordinary man, and there's somebody
00:02:46.120 | who's amazing that most people don't get to hear of, and we just heard his name quite
00:02:49.420 | a few times.
00:02:50.420 | His name is Shinya Yamanaka, Yamanaka-san.
00:02:53.160 | He is based in Kyoto, Japan, but he works half-time at UCSF, and it's amazing what his
00:03:01.960 | vision for the world is that he thinks, basically, that we're salamanders and we're going to
00:03:06.320 | be able to regenerate ourselves, and that's amazing.
00:03:10.820 | And so I've been friends with him maybe for a decade, but I fund his research, and so
00:03:15.880 | a lot of these things to watch him have these breakthroughs, you heard about the Yamanaka
00:03:19.720 | factors, the Yamanaka factors, which are basically this idea that Yamanaka had this breakthrough
00:03:26.760 | in Kyoto, you know, basically he's hanging out there in his lab eating the sushi, the
00:03:32.000 | whole thing, and then boom, and he goes, "If I take these four things, I can take an ordinary
00:03:39.760 | skin cell, just any little skin cell, and turn it into a stem cell," which is like the
00:03:46.720 | heart of human existence, and he did it, and he was able to repeat it and repeat it and
00:03:51.280 | repeat it, and he won the Nobel Prize for it, pretty cool.
00:03:56.760 | And then he, and I'm going to get the pronunciation of this wrong, but he then was able to take
00:04:01.680 | that stem cell, put it into your eye, if you have macular degeneration, and boom, healed
00:04:10.840 | the eye, because the eye regenerated.
00:04:13.440 | Then he worked with a buddy of his in the lab next door, and he took the same thing,
00:04:18.440 | took the stem cells, and he grew it on a cookie sheet, and it looked like it was like
00:04:25.320 | a plastic thing on the cookie sheet, it was really cool, and then he took out somebody's
00:04:30.680 | cornea that was all screwed up, cut the material out of the cookie sheet, popped it in the
00:04:36.520 | eye, and the guy could see, and was like, "Amazing," then he's like, "Listen, this is
00:04:43.120 | amazing, I bet I can grow a brain," so he took the stem cells, and he started growing
00:04:51.360 | brains called organoids, and he's like, "Got a cookie sheet of brains," and I'm like, "Really?"
00:04:57.280 | He's like, "This is amazing, look at all the brains," and then I went and saw him and had
00:05:03.600 | lunch with him, and I'm like, "What's happening with the brains?"
00:05:06.040 | He's like, "I stopped the brains."
00:05:08.600 | I'm like, "Why did you stop the brains?"
00:05:10.480 | "I think they can feel the pain."
00:05:12.480 | I'm like, "Oh, scary," then I said to him, "Now what are you doing?"
00:05:19.080 | "Oh, I'm growing intestines," I'm like, "Whoa, intestines, is that good?"
00:05:25.000 | He's like, "Huge idea. Taking the stem cells, I can now grow intestines on the cookie sheet.
00:05:31.600 | I've got a whole intestine here." And then he can turn it into a lab for all the horrible
00:05:36.880 | things that people get in their gut, and all these diseases that have never been cured,
00:05:41.400 | but now you have a real simulated environment. So he's an incredible person.
00:05:45.800 | Anyway.
00:05:46.800 | Where do you want to go with that?
00:05:48.560 | I'm going, I'm going with this, you got to stay with me, I'm trying to help bring the
00:05:53.440 | energy up in here, follow, just hold on, hold on, hold on, hold on, wait, wait, wait, this
00:06:00.160 | is going to get good.
00:06:01.160 | So then I'm like, you heard the story, like at the end they said, "Listen, how do I get
00:06:06.800 | these regenerative factors going inside myself?"
00:06:09.740 | So UCSF just published research based on a funding grant that I and others have given them, and
00:06:15.800 | they had a breakthrough that the regenerative factor inside your own blood is called PF4,
00:06:22.040 | and the way you get PF4, and I'm not going to get this exactly right, because you know,
00:06:25.200 | I'm in software, I'm not a doctor, so just follow with me.
00:06:28.120 | I thought that's what we're going to talk about today.
00:06:29.400 | You know that, I know, but I got to tell you this, because I got so jacked watching that.
00:06:33.800 | One is, it was either that or those crazy shots you have backstage, I don't know.
00:06:38.800 | Oh, okay.
00:06:39.800 | Number one is calorie restriction, that's how you get more PF4, more regenerative factors in your body,
00:06:47.920 | and if you know David and me, that does not sound very good.
00:06:51.840 | Two, working out with weights, also not exactly our top thing.
00:06:58.440 | Parabiosis, do you know what that is?
00:07:04.120 | Parabiosis kind of came out of research published a decade ago in the New York Times and others,
00:07:09.160 | which came from Saul Villeda, another person I work with at UCSF, where they took the blood
00:07:14.600 | of a young mouse and put it into an old mouse, and then the old mouse got young again.
00:07:21.640 | And that was moving the PF4 into that old mouse.
00:07:26.080 | So that's it.
00:07:27.080 | And the fourth thing is Klotho therapy, which is a gene therapy that I don't really understand.
00:07:31.120 | These four things can start to generate more of these things inside your body.
00:07:34.920 | So then I'm like getting excited, I'm like, God, I have these problems, maybe I can regenerate
00:07:39.240 | different parts of myself, whatever.
00:07:42.240 | And so I'm talking to my doctor at UCSF, because I'm going through my own serious problem,
00:07:47.520 | where I'm like, my left leg is like a half an inch shorter than my right leg, and I'm
00:07:51.480 | running on the treadmill, and I'm always ripping my Achilles, ripping, ripping.
00:07:55.820 | And all of a sudden, my Achilles looks like it has a donut.
00:07:59.080 | And in fact, I went to UCSF, and there was like an MRI, you know, well, how many of you
00:08:03.280 | have had an MRI?
00:08:04.280 | Raise your hand, so you know how horrible it is.
00:08:06.000 | Anyway, you get in this big machine, they're looking at my Achilles, they come out, they're
00:08:09.880 | all like this.
00:08:10.880 | Oh, sorry about this.
00:08:13.320 | Really horrible.
00:08:14.320 | And I'm like, so I kind of took this thought, and I'm like talking to my doctor, I'm like,
00:08:18.960 | why can't we like, use some of this, figure out what we can do.
00:08:23.560 | So he's like, all right, come back on Wednesday.
00:08:27.040 | So I come back on Wednesday, at five o'clock, you know, I'm in Mission Bay at UCSF.
00:08:33.480 | And I'm like, hey, Anthony, where is everybody?
00:08:36.520 | I think we're going to talk about it, come into my lab.
00:08:39.540 | So I come into the lab, they've got like a centrifuge there, all this stuff going on.
00:08:44.960 | I'm like, well, this is interesting.
00:08:46.240 | He's like, did you work out today?
00:08:47.640 | Yes, I work out.
00:08:48.640 | Are you following the PF4?
00:08:49.800 | I'm doing it.
00:08:51.480 | Okay.
00:08:52.560 | This is what we're going to do.
00:08:53.560 | It's going to be very straightforward because we have two things we can do with you, Marc.
00:08:57.520 | Number one, we can just take your Achilles, and we can bring you into surgery right now,
00:09:01.160 | we'll just shave off half your Achilles, and then put you in a boot and see where you are
00:09:05.120 | in six months.
00:09:06.120 | I go, doesn't sound great.
00:09:08.840 | Second idea.
00:09:10.360 | What we're going to do is we're going to take a scalpel, right here, we're going to cut
00:09:18.580 | into your Achilles like 20 times and into your ankle.
00:09:21.800 | I'm going to take your blood, I'm going to spin it, I'm going to try to find the PF4
00:09:25.000 | in your plasma, I'm going to inject it into your Achilles and into your ankle, slice
00:09:30.240 | into it with my scalpel.
00:09:32.040 | I go, sounds great.
00:09:33.880 | There's one problem.
00:09:34.880 | I go, what's that?
00:09:35.880 | We don't use anesthesia to do that.
00:09:39.120 | Because it destabilizes the PRP and the plasma and all the PF4 and all that.
00:09:43.640 | I'm like, let's rock.
00:09:46.800 | Let's rock.
00:09:47.800 | So he did the whole thing, and then boom, I'm like a salamander.
00:09:52.840 | They grew me a new Achilles right in place.
00:09:54.920 | So that thing that you just heard, that shit is real, and it's pretty awesome.
00:10:03.600 | I have a question for you.
00:10:09.120 | So if he can raise $3 billion for his startup, I should probably start, I'm ready to go.
00:10:13.480 | Got the pitch.
00:10:14.480 | Do you ever consider that you missed your calling as a scientific researcher?
00:10:19.560 | Definitely not.
00:10:20.560 | Definitely not.
00:10:21.560 | You're happy with the choices you made.
00:10:22.560 | Well, that's an incredible story.
00:10:23.560 | So you are one of the first to actually try using the Yamanaka factors on yourself?
00:10:29.400 | I wouldn't think I'm one of the first, but I think that it's very real and it's going
00:10:33.360 | to have a huge impact on our lives, and I think that we should be supporting these medical
00:10:38.720 | researchers.
00:10:39.720 | I think it's one of the reasons that I've put almost $1 billion into UCSF and philanthropy,
00:10:44.760 | because I believe in these people who have dedicated their lives to basic science and
00:10:54.440 | doing this work, and meeting them.
00:10:55.440 | They're so inspiring to me, and I just had lunch with Yamanaka and Saul Villeda and Anthony
00:11:01.360 | Luke and another incredible researcher, Mark Moiser, at my house, and we're talking about
00:11:06.520 | the intersection between oncology and regenerative medicine, which is like two completely different
00:11:11.240 | worlds that don't talk to each other.
00:11:13.520 | And it's what inspires me, that we can work with others to give them the entrepreneurial
00:11:20.200 | push to go do something incredible, and these people are just awesome.
00:11:23.400 | Each one is amazing.
00:11:24.400 | That is incredible.
00:11:26.480 | So let's shift gears and talk about something else.
00:11:28.440 | Nice coincidence with the oncology.
00:11:29.440 | Yeah, I know.
00:11:30.440 | It's a great story.
00:11:31.440 | I know you're very philanthropic and do it a lot with UCSF, so kudos to you for encouraging
00:11:36.680 | that type of research.
00:11:38.600 | Let's shift to another thing that's having a huge impact on our lives, which is the cloud
00:11:41.880 | and software, where you were a pioneer.
00:11:44.240 | You started Salesforce back in 1999.
00:11:46.360 | 25 years ago.
00:11:47.920 | 25 years ago.
00:11:50.040 | And how long have you been a public company for at this point?
00:11:53.320 | 2004.
00:11:54.320 | So 20 years.
00:11:56.200 | 20 years.
00:11:57.720 | And one of the things I noticed-
00:11:58.840 | I'd rather count 80 earnings calls.
00:12:00.880 | Yeah.
00:12:01.880 | Well, actually, speaking of earnings, here, let's see if we have this slide.
00:12:05.120 | Do we have earnings?
00:12:06.120 | Well, first-
00:12:07.120 | Oh, yeah.
00:12:08.120 | Oh, boy.
00:12:09.120 | What slide is that?
00:12:10.200 | This is your stock chart over 25, I think 25 years, 20 years.
00:12:14.840 | You're almost at your all-time high.
00:12:16.160 | I guess there's no linear success, exactly, right?
00:12:18.400 | Yeah.
00:12:19.400 | That's a really good point.
00:12:20.400 | Yeah, we had a-
00:12:21.400 | You just keep going.
00:12:22.400 | We basically had a bubble.
00:12:23.400 | We had a bubble in late 2021.
00:12:24.400 | You keep going.
00:12:25.400 | We had a huge correction in '22.
00:12:26.400 | There was?
00:12:27.400 | I need to make a note of that.
00:12:28.400 | But you're basically back to where you were.
00:12:30.480 | This is one of your tweets.
00:12:31.480 | It is.
00:12:32.480 | Actually, this is one of the things I appreciate about the way you do earnings calls, is you
00:12:35.280 | just put out this really simple tweet, and it shows a progression.
00:12:40.480 | And if you like looking at numbers the way I do and seeing patterns in them, one of the
00:12:45.720 | things I noticed a while ago was that if you start at the bottom and work your way to the
00:12:49.440 | top, that Salesforce is growing by about 20% a year.
00:12:55.320 | And if you look at it over three years, that's roughly a double.
00:12:59.740 | So every three years, Salesforce was doubling.
00:13:03.040 | And that means that over a decade, it's growing 10x.
00:13:06.560 | And so every decade is basically exponential.
00:13:09.040 | If you can stick with it long enough.
00:13:11.520 | That was one of the patterns I noticed with Salesforce.
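As a quick sanity check of that arithmetic (an illustrative calculation, not from the talk): "double every three years" and "10x per decade" correspond to a compound annual growth rate of roughly 26%, a bit above the "about 20%" figure.

```python
# Compound-growth check of the "double every three years, 10x per decade" pattern.
# Illustrative arithmetic only; the rates are the rough figures from the talk.

def multiple(annual_rate: float, years: int) -> float:
    """Total growth multiple from compounding an annual rate over a span of years."""
    return (1 + annual_rate) ** years

for rate in (0.20, 0.26):
    print(f"{rate:.0%}/yr -> {multiple(rate, 3):.2f}x in 3 years, "
          f"{multiple(rate, 10):.1f}x in 10 years")

# At 20%/yr: 1.73x in 3 years and 6.2x in 10 years.
# At 26%/yr: 2.00x in 3 years and 10.1x in 10 years, which is where the
# "double every three years" and "10x per decade" rules of thumb line up.
```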
00:13:13.320 | Look, I think that the growth, obviously, is incredible, the $38 billion.
00:13:18.920 | And obviously, the cash flow is incredible.
00:13:21.560 | It's more than Coca-Cola did, I think, last quarter.
00:13:24.180 | But the margin is incredible.
00:13:26.200 | But let me just say, probably the best decision we made, and it's not on the slide, which
00:13:30.920 | is the day we started the company, we put 1% of our equity, 1% of our profit, 1% of
00:13:39.000 | our product, 1% of all of our employees' time into a 501(c)(3) foundation.
00:13:45.280 | Now at the time, it was very easy, because we had no employees.
00:13:47.800 | We had no equity.
00:13:48.920 | We had no profit.
00:13:50.960 | So it wasn't very complicated.
00:13:53.100 | But that idea, though, really kind of created the foundation of the company, because of what
00:13:59.140 | we've been able to do now, and I think you know the numbers, right, we're at almost 10 million
00:14:03.920 | hours of volunteerism.
00:14:05.520 | We've been able to give away almost a billion in grants.
00:14:08.000 | We run almost 100,000 nonprofits and NGOs for free on our service.
00:14:12.400 | And I think it really set the stage that business could be the greatest platform for change
00:14:16.480 | when it came to Salesforce.
00:14:18.200 | It gave it that philanthropic platform.
00:14:20.120 | So is there $2 billion in equity sitting in that 501(c)(3) at this point?
00:14:25.340 | A lot.
00:14:26.340 | Well, there's more.
00:14:27.340 | I think there's about a half a billion in the foundation, and a lot has been already
00:14:30.200 | given out.
00:14:31.240 | And then we give out more every year, and every month, every day, whatever.
00:14:35.600 | But like on Monday, we'll give another $25 million approximately to the San Francisco
00:14:40.920 | and Oakland public schools.
00:14:43.100 | And that is, you know, we've given them about $150 million.
00:14:46.740 | I mean, it's obviously, I went to public schools, it was very important to me, but my mother
00:14:53.180 | was a teacher in the San Francisco public schools.
00:14:56.740 | But also our employees, you know, we have 75,000 employees, their kids are in the public
00:15:02.100 | schools.
00:15:03.100 | And so it's a key part of our mantra and our culture that we're trying to support public
00:15:09.540 | education, adopt a public school.
00:15:12.060 | I really think that each one of us needs to focus more on the public education system
00:15:17.660 | in the United States.
00:15:19.100 | It's something I encourage in, not all my employees, but whenever I do a presentation,
00:15:23.140 | I'm like, you know, my public school is like a block from my house, Presidio Middle School,
00:15:28.300 | and I just went down there and knocked on the door, and they're like, "Who are you?"
00:15:33.860 | And I'm like, "How can I help you?
00:15:37.760 | And what can I do to support you?"
00:15:39.040 | They need a new playground, they need this, they need that.
00:15:42.660 | And maybe they just need some support, moral support.
00:15:47.620 | But it's been a great thing to really anchor the company in those values, and I think it's
00:15:52.980 | an important thing for every company.
00:15:55.340 | So what did you think when you saw that OpenAI started with a nonprofit, not as 1%, but as
00:16:00.740 | 100%, but then it became a for-profit?
00:16:03.500 | What did you think of that innovation?
00:16:05.460 | Confusing.
00:16:06.920 | I mean, 18,000 companies have now followed our 1-1-1 model.
00:16:12.300 | You can find out about it at pledge1percent.org.
00:16:15.200 | That other model, I don't really understand.
00:16:17.520 | I think we've proven our model.
00:16:18.920 | This is important.
00:16:19.920 | You know, we came out with three models.
00:16:21.600 | The cloud model, which you also have been part of that, the subscription model, you've
00:16:26.860 | also been part of that, and the philanthropic model, and you've been part of that.
00:16:30.600 | And those ideas that we're doing three models, that continues to be the fuel for the company
00:16:36.980 | and extremely important.
00:16:38.520 | And I think that for a lot of these companies that have followed us, that have gone on to
00:16:42.020 | scale and have had huge IPOs, and whether it was Slack or whether it was Atlassian or
00:16:47.460 | whether it was Twilio or whatever, they've had these huge foundations and have had huge
00:16:53.540 | impact.
00:16:54.540 | And business can be the greatest platform for change, and you can do a lot with your
00:16:58.100 | business.
00:16:59.100 | So we're all building great products.
00:17:01.340 | Okay.
00:17:02.340 | That's great.
00:17:03.420 | And we're selling them.
00:17:05.420 | That's great too.
00:17:06.620 | But we can also do a little more with our business, and we can use it in a positive
00:17:10.540 | way and try to move the world maybe a little bit more in the right direction.
00:17:14.980 | Okay.
00:17:15.980 | So let's talk about the cloud part of that innovation.
00:17:19.380 | Where do you think we're at right now?
00:17:20.620 | I mean, is it all AI all the time?
00:17:24.180 | How are you thinking about it?
00:17:26.580 | We're at the precipice of the greatest moment in the history of enterprise software and
00:17:31.620 | of cloud computing.
00:17:33.180 | There's no question.
00:17:35.980 | I had a moment, I would say, more than a decade ago, which I call my kind of AI freakout moment,
00:17:41.700 | where I really felt ... I mean, maybe it's ... Obviously, we've all spent ... How many
00:17:45.900 | of you watched Minority Report?
00:17:47.620 | All right.
00:17:48.620 | We saw that movie.
00:17:50.500 | And what about War Games?
00:17:51.500 | War Games?
00:17:52.500 | Anybody remember that?
00:17:53.500 | Okay.
00:17:55.500 | Yeah.
00:17:56.500 | Yeah.
00:17:57.500 | We all saw these movies.
00:17:58.500 | Terminator?
00:17:59.500 | Okay.
00:18:00.500 | That one's a little scary.
00:18:01.500 | But we've all seen the movies, and Peter Schwartz, who was a key part of writing
00:18:06.420 | Minority Report and also War Games, is our chief futurist at Salesforce.
00:18:13.520 | And a decade, more than a decade ago, I had this moment where I was like, "Okay.
00:18:16.780 | This is really happening.
00:18:17.780 | Here we go."
00:18:19.220 | And bought a bunch of companies and put together Einstein, and Einstein has done amazing.
00:18:23.860 | It's doing a trillion, a trillion and a half transactions a week, predictive,
00:18:28.500 | generative.
00:18:29.500 | I really thought, "Okay.
00:18:30.500 | This is what's going to be the moment."
00:18:32.900 | But now I'm really convinced that we are now really at the moment, right now, where enterprise
00:18:40.420 | software is going to be completely transformed with artificial intelligence.
00:18:45.180 | And we're going to see it, and obviously, I'm getting tuned up for Dreamforce, which
00:18:49.540 | is going to be Tuesday of next week.
00:18:51.220 | How many of you are coming to Dreamforce?
00:18:53.540 | Not enough.
00:18:54.540 | Anyway.
00:18:56.540 | These aren't my people.
00:18:57.540 | I'm leaving now.
00:18:58.540 | Well, it's good.
00:18:59.540 | Well, look.
00:19:00.540 | But no.
00:19:01.540 | Let me just tell you.
00:19:02.540 | This is an opportunity.
00:19:03.540 | Since you're not going to be there, let me tell you what's going to happen.
00:19:07.340 | Thanks for being part of my team.
00:19:09.260 | Anyway, number one is we're going to ... We really see a moment right now where we are
00:19:17.620 | 100% focused on one thing and one idea, and I can tell you why that is if you're interested.
00:19:23.420 | But it's AgentForce.
00:19:25.100 | And AgentForce is the most exciting thing I have ever worked on in my career.
00:19:31.540 | It's the culmination, really, of everything that we've done at Salesforce.
00:19:34.580 | Because to make AgentForce really deliver, we had to have all of our customer touchpoints
00:19:39.420 | wired up, which we do.
00:19:41.280 | We have to have an amalgamated data cloud, because we need the data especially to achieve
00:19:45.660 | the AI accuracy, and the metadata as well.
00:19:49.820 | And we have to have the agents.
00:19:51.300 | It's these three layers that are really going to deliver this next generation capability.
00:19:56.460 | And I was just with Disney last night, and Disney has AgentForce.
00:20:00.640 | They have the newest version, which we call Atlas, which is our most accurate, not just
00:20:04.740 | model, but we have an extremely unusual technique that we'll talk about.
00:20:08.820 | And Atlas delivers for Disney, for their cast members, which are their employees, solving
00:20:14.240 | extremely complex problems for them.
00:20:20.080 | More than 90% accuracy and almost no hallucinations, and in some cases, 95% accuracy and almost
00:20:25.260 | no hallucinations.
00:20:27.460 | And that idea that we can kind of come in to a very difficult and complex and sophisticated
00:20:33.700 | dataset.
00:20:34.700 | Now, with Disney, if you go to DisneyStore.com, that's Salesforce.
00:20:39.660 | If you go to the Disney parks, do you still go to Disneyland?
00:20:41.820 | Sometimes, yeah.
00:20:42.820 | Okay.
00:20:43.820 | You ever get a Disney guide?
00:20:44.820 | Sometimes.
00:20:45.820 | Yeah.
00:20:46.820 | It's great, because you get to cut around the lines and all that.
00:20:47.820 | How many of you have done the Disney guides thing?
00:20:49.820 | We've got a lot of poor people here, actually.
00:20:54.940 | Anyway, you should get these Disney guides, because they get you around the lines, and
00:20:59.980 | you've got to do 30 rides a day, and it's much better than having to wait.
00:21:04.620 | Okay.
00:21:05.620 | But anyway, Disney guides run on Salesforce, they have Slack, too.
00:21:11.420 | We do DisneyStore.com.
00:21:13.140 | We have Disney+, because ServiceNow fell over, and we had to replace that inside the
00:21:20.020 | Disney+ call center.
00:21:21.740 | We do the Disney cruises, and the Disney real estate, and we have every Disney customer
00:21:27.180 | touchpoint all wired up.
00:21:28.540 | So the amalgamated dataset that we have around Disney is awesome.
00:21:32.100 | So when we can take that Disney dataset, and then we apply Atlas and AgentForce-
00:21:36.460 | Okay.
00:21:37.460 | So how do you define-
00:21:38.460 | How do you deliver a level of accuracy that has been incredible?
00:21:42.660 | And I've got a couple more examples, I can tell you, that are just blowing my mind.
00:21:46.380 | And I never thought it was really possible, but now it really is.
00:21:50.820 | Go ahead.
00:21:52.820 | Well, I just wanted to find-
00:21:53.820 | Do you want me to ask you a question, also?
00:21:54.820 | No, no.
00:21:56.820 | What do you mean by agent?
00:21:57.820 | Because we're starting to hear this term a lot, but I think a lot of people here may
00:22:02.380 | not know what that means in the context of AI.
00:22:04.940 | Yeah.
00:22:05.940 | Did you see the movie The Matrix?
00:22:06.940 | Yes, I did.
00:22:07.940 | Is that Agent Smith, or what are we talking about?
00:22:09.540 | Well, we're at some level, I mean, I think like, I'll give you an example that we're
00:22:14.540 | working with a large medical company not too far away from here, Kaiser.
00:22:19.180 | They've got 20 million patients, they have a super complex dataset, they have all of
00:22:22.940 | the data from Epic, they have the largest Epic customer in the world.
00:22:28.000 | And more than 90% of all patient inquiries and scheduling requests and schedule my doctor,
00:22:35.100 | my CT scan, my MRI, my this, my that, are being resolved by AgentForce and Atlas.
00:22:41.540 | That idea that we can resolve, through an autonomous agent, a deep and complex customer interaction
00:22:49.740 | is a breakthrough thought.
00:22:51.540 | Obviously we have to do a few things to make it really work for our customers.
00:22:54.440 | Number one is, it's got to be trusted, because our customers, we're running the largest banks,
00:23:00.700 | sales companies, media companies, CPG companies, blah, blah, blah, blah, blah in the world.
00:23:05.740 | Number two is, it's got to be easy for them.
00:23:07.620 | It can't be some separate team that they're going to spin up.
00:23:10.460 | It's their existing Salesforce team, it's happening within the Salesforce platform.
00:23:14.780 | It's got to be open, it has to be able to work with and interoperate with other systems.
00:23:20.260 | It's going to have to be multimodal, so it's going to have to speak to them and have voice
00:23:24.220 | and video and do all of those kind of incredible capabilities.
00:23:28.040 | And one other key thing, because evidently the humans have not gone away.
00:23:33.700 | The doctors have not gone away from Kaiser, and the cast members have not gone away from
00:23:38.420 | Disney and on and on, so we're going to have to handshake seamlessly with our apps.
00:23:44.220 | So even though we have all these apps and we've wired up all these customer touchpoints,
00:23:48.340 | the agents are autonomously interacting with and building the data and metadata and extending them.
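A bare-bones sketch of that human-handshake idea, for readers who want it concretely: the agent resolves what it can and escalates to a person when its confidence drops. Every name here (agent.respond, human_queue.escalate, the 0.9 floor) is a hypothetical placeholder, not AgentForce's actual API.

```python
# Sketch of an agent-to-human handoff: autonomous resolution when confident,
# seamless escalation with full context when not. All names are illustrative.

CONFIDENCE_FLOOR = 0.9  # below this threshold, a person takes over

def handle(inquiry, agent, human_queue):
    """Let the agent answer; hand off to a human rep when it isn't sure."""
    draft = agent.respond(inquiry)            # agent proposes a resolution
    if draft.confidence >= CONFIDENCE_FLOOR:
        return draft.reply                    # resolved autonomously
    # The handshake: the human picks up with the agent's context, not a cold start.
    return human_queue.escalate(inquiry, context=draft)
```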
00:23:55.420 | So by the end of this month, we'll have more than a thousand customers on our AgentForce
00:23:59.380 | platform.
00:24:00.880 | The efficiency and productivity that we've had with AgentForce is like nothing I
00:24:06.360 | have ever seen with any of our customers or technology in the history of software.
00:24:11.160 | But there's a second point, it isn't just about this kind of ease of use.
00:24:15.300 | It's that they have the ability to do things that are truly astonishing, and that is also
00:24:22.080 | generate revenue.
00:24:23.760 | So they can go out and like on a day like today, like it's a hundred and something degrees
00:24:27.680 | outside, I don't know if you've been out there, it's pretty hot.
00:24:30.920 | And Disneyland may not be as full today as it's going to be, and they knew that was going
00:24:35.200 | to be true two days ago that a heat wave was coming.
00:24:38.240 | Disney can proactively go out to their consumers and their customers and say, "Hey, come enjoy
00:24:43.440 | the heat with us all, you know, at Disneyland, and we're going to give you a special promotion
00:24:48.360 | or price or contest or whatever it is to come to Disneyland."
00:24:52.120 | So we want to be able to proactively go out and generate revenue, and we also want to
00:24:56.400 | be able to kind of bring that customer service in.
00:24:59.560 | I think last night I had dinner at Beverly Hills at the Grill.
00:25:02.720 | Have you been there?
00:25:03.720 | Great, right?
00:25:04.720 | Cream spinach?
00:25:05.720 | I don't know.
00:25:06.720 | I did something different.
00:25:07.720 | What do you want?
00:25:08.720 | What do you prefer?
00:25:09.720 | Well, I like potatoes.
00:25:10.720 | Potatoes, okay.
00:25:11.720 | You know, any kind of potato.
00:25:14.760 | Any kind of potato.
00:25:15.760 | Baked potato.
00:25:16.760 | Steak fries.
00:25:17.760 | So I'm on OpenTable.
00:25:18.760 | Right?
00:25:19.760 | Anybody here use OpenTable?
00:25:20.760 | Nice.
00:25:21.760 | It's a nice group.
00:25:22.760 | It's a very weird group.
00:25:23.760 | Anyway, you can use OpenTable to make restaurant reservations, and there's 160 million consumers
00:25:35.400 | on OpenTable.
00:25:36.400 | They're not in this room, but they're somewhere, and they've got also 60,000 restaurants, and
00:25:42.520 | they've got a lot of complex issues, you know, in regard, you know, I didn't get my table
00:25:47.280 | or my food wasn't right, my potato didn't get cooked, whatever it is.
00:25:51.320 | These things are going to get worked out, but also, all of a sudden, the restaurant's
00:25:54.400 | like, "Oh, look, we're not as full tonight as we want to be, and we're willing to do,
00:25:58.000 | let's go out to our customer base and bring them in, but let's do it through a complex
00:26:02.300 | conversation, you know, an empathic conversation as an agent with our customers."
00:26:07.440 | I think it's going to be a rocket ship.
00:26:09.400 | Okay, so how long will it be until when you call a customer support center, you're talking
00:26:16.880 | to an AI that sounds like a human and you can't tell the difference?
00:26:20.920 | Are we there yet?
00:26:21.920 | We're there yet.
00:26:22.920 | We are already at that point.
00:26:24.320 | We already have that live, and we will have that scaled, with thousands of customers
00:26:31.000 | live, before the end of this year.
00:26:38.320 | And we just, I just demoed it, I was just at a conference and spoke a couple miles away
00:26:42.800 | from here at KPMG, and we showed them that exact situation, where, you know, we used to
00:26:49.400 | call this kind of thing a voice response system, whatever.
00:26:53.880 | But you would kind of hit a wall pretty quickly with your bot, you know.
00:26:58.380 | But these aren't bots.
00:26:59.920 | These are not the bots you're looking for.
00:27:01.800 | These are like, we're really getting to, like, another level capability, and I think that
00:27:07.400 | it's pretty impressive.
00:27:08.640 | And I think in the example of Disney, you know, Google has some great products.
00:27:11.800 | I know Sergey was here yesterday, and they've done a great job with AI, as you know.
00:27:16.280 | But in a head-to-head benchmark of Salesforce's AgentForce against Google's AI, we 2X them
00:27:23.080 | on accuracy.
00:27:24.560 | And the reason why, as we'll explain it next week, you know, it's a couple of things.
00:27:30.220 | Not only are there next-gen models, but there are also new techniques involving next-generation
00:27:35.400 | retrieval augmented generation, RAG techniques that no one has seen before, and it's really
00:27:40.200 | incredible what's possible.
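For readers unfamiliar with the term: retrieval-augmented generation grounds the model's answer in passages retrieved from the customer's own data, which is a standard way to push accuracy up and hallucinations down. Salesforce has not published its technique, so the sketch below is only a generic, minimal RAG loop; every name in it (embed, index.search, llm_complete) is a hypothetical placeholder.

```python
# A minimal retrieval-augmented generation (RAG) loop. Generic illustration only,
# not Salesforce's pipeline; all callables here are assumed to exist.

def answer_with_rag(question: str, index, embed, llm_complete, k: int = 5) -> str:
    """Ground the model's answer in retrieved passages to reduce hallucinations."""
    query_vec = embed(question)                  # embed the user's question
    passages = index.search(query_vec, top_k=k)  # nearest-neighbor retrieval
    context = "\n\n".join(p.text for p in passages)
    prompt = (
        "Answer using ONLY the context below. If the context is insufficient, "
        "say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    return llm_complete(prompt)
```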
00:27:41.560 | So you're kicking Google's ass.
00:27:43.440 | I'm cool with that.
00:27:44.440 | Well, they're a good partner, also, a customer, and I love them, but yeah, it's competitive.
00:27:48.880 | It's competitive.
00:27:49.880 | Let me keep building on this.
00:27:51.080 | We're all trying to make AI a little more accurate, with a few less hallucinations
00:27:56.180 | along the way.
00:27:57.180 | Let me give the audience a little update about something we just heard at OpenAI.
00:28:01.040 | They just did a day where they brought in a relatively small number of investors and
00:28:06.440 | kind of gave us all an update on their product roadmap.
00:28:09.240 | And it sounds kind of similar, because everyone's moving in the same direction.
00:28:12.920 | So there are three big takeaways.
00:28:14.480 | Number one was that they said that LLMs would soon be at PhD-level reasoning.
00:28:20.720 | Right now, it's more like a smart high school or college student in terms of the answers.
00:28:25.240 | We're going to be at the next level.
00:28:27.280 | Shortly behind that is agents, like you're talking about, and then third and closely
00:28:31.600 | related is that agents will have the ability to use tools, and a tool can be a website.
00:28:38.360 | So if you think about it now, you've got this LLM.
00:28:40.900 | It's really smart.
00:28:41.900 | It's got, you know, it's like a PhD.
00:28:44.880 | You can give it an objective, it will break that objective into a list of tasks, and those
00:28:50.360 | tasks can include using other pieces of software.
00:28:54.880 | And thanks to things like OpenAI just launched the audio API, which developers can use.
00:29:01.080 | It's in private beta.
00:29:02.080 | We have some companies using it.
00:29:04.520 | The LLM can now basically pretend to be a human, and, you know, it won't be hard to
00:29:09.880 | find a piece of software to enable a phone call.
00:29:12.600 | So you can imagine telling a personal assistant agent that, and it could be, you know, it
00:29:19.200 | could be OpenTable, that, "Hey, book me a dinner reservation at the grill."
00:29:24.900 | And it could place a phone call on your behalf and actually talk to the grill.
00:29:27.760 | It could also go on OpenTable and just use OpenTable and book it, but if for some reason
00:29:33.040 | that didn't work, it could literally place a phone call on your behalf, and the person
00:29:36.360 | picking up at OpenTable wouldn't even know that your agent actually isn't a human, it's
00:29:41.960 | an AI.
00:29:42.960 | But here's where I think it gets really crazy, is when the phone gets picked up on the other
00:29:48.000 | end, that could be an AI too pretending to be a human.
00:29:51.500 | So you could have two AIs pretend to be humans talking to each other and resolving tasks
00:29:56.120 | on your behalf.
00:29:57.120 | And I literally, I think that's where it's headed.
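A bare-bones sketch of the agent pattern being described: an LLM turns an objective into a task list, then executes each task with a tool (a website API, a phone call). The planning prompt and tool names below are illustrative assumptions, not any vendor's actual interface.

```python
import json

# Minimal plan-then-act agent loop. The prompt format and tool names are
# hypothetical; a real system would add validation, retries, and guardrails.

def run_agent(objective: str, llm, tools: dict):
    """llm(prompt) -> str; tools maps a tool name to a callable."""
    # Plan: ask the model to break the objective into tool-using steps.
    plan = json.loads(llm(
        "Break this objective into a JSON list of steps, each of the form "
        '{"tool": "<name>", "input": "<string>"}. '
        f"Available tools: {list(tools)}. Objective: {objective}"
    ))
    # Act: execute each step with its chosen tool and collect the results.
    return [tools[step["tool"]](step["input"]) for step in plan]

# Hypothetical usage:
#   tools = {"opentable_book": book_table, "place_phone_call": call_number}
#   run_agent("Book me a dinner reservation at the Grill", llm, tools)
```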
00:29:59.880 | We're definitely moving in this direction, but there's a cautionary tale here.
00:30:03.640 | And I think that I'll just tell you the real-world experience with my customers and the problems
00:30:08.440 | that I'm trying to solve for them.
00:30:10.160 | I think in the last few years, what we've kind of heard, and, you know, some of it has
00:30:14.360 | come from OpenAI, but especially from Microsoft, that we're in this co-pilot world, and these
00:30:19.160 | co-pilots have universally failed.
00:30:21.600 | The level of accuracy, the spillage of information, the lack of trusted environment, co-pilot
00:30:26.640 | has been a complete disaster.
00:30:28.840 | And that idea that this kind of amount of technology got released and sold into these
00:30:35.360 | very large customers, telling them that the promise of AI is here, but didn't do it in
00:30:40.800 | a trusted way, didn't do it with the level of accuracy, didn't do it with the level of
00:30:44.040 | security needed.
00:30:45.640 | And one of the things that was interesting, because I was with one of the customers, trying
00:30:49.380 | to do this exact technique that you're talking about, which is a large telecommunications
00:30:53.400 | company in Seattle.
00:30:55.760 | And what this company did is take a model and try to--
00:30:59.000 | One that we've heard of?
00:31:00.000 | Nope.
00:31:01.000 | Okay.
00:31:02.000 | I'm just going to tell you: they were training a model, retraining a model, building their own model.
00:31:05.920 | "Marc, we have to have our own models.
00:31:07.800 | We're going to DIY it.
00:31:09.460 | We're going to DIY our AI, and it's going to be awesome.
00:31:12.760 | Then we're going to write our own agents, and we're going to do this, we're going to
00:31:14.960 | do that, the other thing."
00:31:16.440 | And I'm sitting there, and I'm going through it, and whatever, and then I finally am like,
00:31:19.360 | "Now, show me your benchmarks, and show me all these different pieces."
00:31:22.840 | And you know, for them, it's a bit of a science project.
00:31:24.880 | And I've seen this now with a number of our customers, that they're kind of DIY-ing their AI.
00:31:30.040 | And you know, DIY, I think it's fine if you're like Neil Young, and it's homegrown, and it's
00:31:35.960 | Canada, and it's Ontario.
00:31:37.960 | But this is not what you should be doing with your artificial intelligence.
00:31:41.840 | But what are you guys using as your foundation model?
00:31:44.240 | Is it Llama 2?
00:31:45.240 | Like, what do you guys use for your foundation?
00:31:46.240 | We have a lot of our own models, our own techniques, our own...
00:31:49.680 | And then we let you bring in the model that you want, but we are all about achieving your
00:31:54.120 | accuracy.
00:31:55.600 | Because what I've seen with these kind of approaches, especially the one that you just
00:31:58.920 | outlined, is that, yeah, you can get maybe 30 or 40% accuracy.
00:32:04.080 | You know, in this case, this customer is at 25%.
00:32:06.880 | You had somebody on the stage yesterday, I won't tell who it is, he's a common friend
00:32:10.000 | of both of ours, who tried to take this approach for a large telecommunications company that
00:32:14.960 | he owns, and he said he was getting about a 25% accuracy with this homegrown model.
00:32:19.380 | And I'm like, "Why are you doing that?"
00:32:21.440 | Instead, in our platform, the platform is building the model for you.
00:32:25.440 | You're not having to train and retrain your own models.
00:32:28.120 | You're building your own models in our platform, and we're gonna deliver much higher levels
00:32:32.580 | of accuracy for you, and we're gonna deliver AI.
00:32:36.320 | This is the AI that you want.
00:32:38.960 | This is this next generation of AI, and I think that we'll have to prove that with benchmarks
00:32:44.360 | and with bake-offs, and to show customers.
00:32:46.980 | Because the promise is amazing, but at a very deep level, customers are gonna need, you
00:32:52.220 | know, what you and I have done for the last, you know, 20 years of our life, which is build
00:32:55.700 | professional enterprise software, and deliver it to them in a capability.
00:32:59.300 | And in regards to an agent running enterprise software, I mean, you just saw, like, that
00:33:03.580 | was the fundamental business model of Adept, which was David Luan's company, you know,
00:33:09.460 | and that's, he built GPT-3, then he left OpenAI to start Adept, and this idea to build agents
00:33:14.380 | that are gonna drive apps.
00:33:16.300 | I'm sure that all of those things are gonna happen, but again, you have to get to a level
00:33:21.060 | of accuracy, because everyone in this room, and you and I, we've all had this experience
00:33:26.180 | where we're on these models, and it's like, this is not much more than hallucinations,
00:33:32.100 | and that's no good, or as we say here in Los Angeles, no es bueno, when it comes to, okay,
00:33:38.660 | Kaiser, and you're dealing with healthcare.
00:33:41.620 | You know, when you're dealing with healthcare, and you've got a patient, and you're reading
00:33:44.700 | their medical records, you better be delivering more than 90 or 95% accuracy, 'cause the 50%
00:33:50.660 | accuracy thing is no good.
00:33:52.860 | - Well, I can see you're ready for Dreamforce.
00:33:55.300 | - I'm trying to get, find it.
00:33:56.900 | I'm testing material out here a little bit.
00:33:59.060 | - What do you guys think?
00:34:00.060 | - So, I'm like...
00:34:05.780 | - Are you guys excited for the rise of agents?
00:34:08.540 | Yeah?
00:34:10.060 | It's gonna be a really big deal.
00:34:11.280 | I think that everything we've seen so far with LLMs has been, again, about reasoning
00:34:16.620 | and generating, but with agents, the AI's gonna be able to take actions, and they're
00:34:22.300 | gonna know how to use tools, which, until now, it's been something only humans can do.
00:34:26.060 | - I gotta tell you a really good story, because you're, like, inspiring me around, you know,
00:34:30.740 | Steve Jobs had a huge impact on my life, and I worked at Apple in 1984 when I was in high
00:34:37.900 | school, and coming into college, and I was an assembly language programmer, and I wrote
00:34:42.580 | the first native assembly language program on the Macintosh, on the 68000 assembler, and sitting
00:34:49.960 | there in the cubes, and Steve was running it, whatever it was, and thank God, you know,
00:34:55.300 | I had this relationship, and it influenced me so much in my life, and then called me
00:34:59.660 | on a series of times, and after I started Salesforce, gave me really key advice.
00:35:03.580 | Anyway, it was 2010, and he calls me, and I was like, come down here, I need to talk
00:35:08.100 | to you.
00:35:09.100 | I'm like, shit, what the hell, what did I do this time?
00:35:12.180 | So I go down there to his office, and I always bring a few Salesforce employees with me,
00:35:16.380 | and I've got some great folks with me, and we're sitting there, and he's like, I'm gonna
00:35:20.220 | show you this, and I'm like, alright, let's go, and he brings out the iPad, and he's got
00:35:25.740 | two of them, he's got the big one and the small one, and he's like, yeah, Marc, here
00:35:30.100 | it is, but I don't like the small one, I'm only gonna have one size, you know that, and
00:35:34.540 | I'm like, yes, sir, and he's like, listen, you know, I've been working on this concept
00:35:39.700 | for a long time, and you know, in 2007, I introduced iPhone, and I said, thank you for
00:35:46.900 | sending me one, I love it, it's great, he's like, but do you know why now we're doing
00:35:51.100 | the iPad?
00:35:52.100 | I'm like, no, because I know you had that, too, in 2007, oh, yeah, but you know what,
00:35:56.060 | the real situation here is that Apple, I'm like, what is it, Steve, he's like, we only
00:36:00.460 | have one A-team here, one A-team, so we're only focused on one thing at a time, and then
00:36:05.980 | he lays out, like, five or six products on his coffee table, and he goes, and we will
00:36:10.580 | never have more products than can fit on my coffee table, and I'm like, well, that's really
00:36:15.420 | awesome, and he's like, I've been focused on 2007 and the iPhone, and now I'm gonna
00:36:19.700 | zero in, and I'm only gonna do iPad, one focus at a time, remember that, Marc, that's the
00:36:26.900 | way you need to run Salesforce, and I'm like, okay, is that why you brought me down here?
00:36:32.420 | Yes, you may go.
00:36:37.460 | And that's how I feel right now about AgentForce, this is all I am doing, just try to take our
00:36:42.260 | company, you know, we have a great company, 38 billion in revenue, 75,000 employees, hundreds
00:36:47.940 | of thousands of customers, and one focus, AgentForce, this is because of what you're
00:36:53.660 | saying, this is the moment, this is the greatest opportunity in the history of enterprise software,
00:37:00.020 | and it must be executed with absolute acuity and excellence, and that is what I think we
00:37:05.460 | all need to do.
00:37:07.580 | You know, so I agree with you, I mean, I think the agents are gonna be huge, and Elon said
00:37:15.300 | something kind of similar to the other day, he said, we got him talking about Optimus,
00:37:20.300 | you know, his robot.
00:37:21.300 | I just heard about the farm animals, I didn't know about the, what was, was there another
00:37:24.620 | part of the presentation?
00:37:25.620 | He said, well, he was talking about, he was talking about Optimus, and, oh, you, the thing
00:37:32.140 | about...
00:37:33.140 | These jokes are all, each one is kind of dying very fast, it's sad.
00:37:35.380 | It took me a second to realize that you were talking about his cock, but now I got it,
00:37:42.860 | okay.
00:37:43.860 | So, he was referring to...
00:37:45.340 | It's great how you bring this humor into the all-in, yeah.
00:37:49.680 | It's very subtle sometimes, yeah.
00:37:53.260 | So what Elon mentioned that really stuck with me is he said that humanoid robots, the creation
00:37:59.260 | of these humanoid robots are the biggest economic opportunity in the history of the world.
00:38:05.180 | The average person's...
00:38:06.180 | Is he making some of them by any chance?
00:38:07.700 | He is, but, well, it's kind of like you saying that agents are the biggest opportunity in
00:38:12.820 | the history of enterprise software.
00:38:13.820 | Thank you.
00:38:14.820 | Let me write that down.
00:38:15.820 | Thank you for letting me know that.
00:38:17.700 | It strikes me that there's something similar here, which is...
00:38:19.460 | Elon is a good salesman.
00:38:20.460 | Is that your point?
00:38:21.460 | Well, I'm saying there's an analogy here between...
00:38:24.580 | He didn't give you the regenerative pitch.
00:38:26.580 | Is that my...
00:38:27.580 | Well, no.
00:38:28.580 | Here, the point is this, is that where we're going with AI is it's gonna be able to take
00:38:35.220 | real actions, and in the case of Optimus, it's in the physical world, and it's gonna
00:38:38.740 | be the brain for these humanoid robots.
00:38:41.260 | In the enterprise, it's basically the brain for these agents.
00:38:45.060 | I think these things are actually pretty...
00:38:47.900 | They're on parallel tracks.
00:38:49.420 | I wouldn't say they're competing, and so my point...
00:38:52.020 | These are the droids you're looking for.
00:38:54.380 | So I think...
00:38:55.380 | Anyway, I think that...
00:38:56.380 | I think you're right about this opportunity, and what I'm saying is I think it's analogous
00:39:00.100 | to what Elon is seeing with robotics.
00:39:02.980 | I think there's no question, and I think that for our customers, they're gonna augment their
00:39:07.060 | employees.
00:39:08.060 | They're gonna make things lower cost.
00:39:09.060 | They're gonna increase their revenues.
00:39:10.340 | They're gonna increase their margins.
00:39:12.180 | We're gonna take some customers and just turn them into margin machines, and I think that
00:39:16.900 | the opportunity in the enterprise is unbelievable.
00:39:19.540 | He's also directly addressing the consumer market, which I think is very exciting.
00:39:23.860 | Obviously, he's an expert in that area, and yeah, we're about to move into this new world
00:39:28.460 | of AI, of droids, of all these things, and it's a bunch of waves of...
00:39:34.900 | Look, technology is getting lower cost and easier to use.
00:39:38.860 | It's a continuum, and we're all riding that continuum.
00:39:42.640 | This is extremely important, but also what's very important is, especially as we move into
00:39:46.980 | this, we all have to think about what are the values that are gonna guide this technology?
00:39:52.460 | Because each of us have seen the movies.
00:39:54.660 | We all watch the movies.
00:39:55.660 | That was the one place where I got the hands to go up, right?
00:39:58.940 | So we know how it can go really wrong, right?
00:40:01.920 | Everybody saw that part of the movie.
00:40:03.820 | So what are the values?
00:40:06.140 | What's gonna be really important to us?
00:40:07.780 | Will it be trust?
00:40:08.780 | Will it be customer success?
00:40:09.780 | Is it innovation?
00:40:10.780 | Is it quality?
00:40:11.780 | Is it sustainability?
00:40:13.420 | What are the values as we guide into the next level of the future?
00:40:17.500 | Because those core values that we need to manifest and really focus on, that is, I think,
00:40:22.860 | still out there as a major discussion item.
00:40:25.040 | It's gotta be figured out, and that is why we're very lucky that you are one of the great
00:40:29.780 | visionaries of our industry, because you're not just a great entrepreneur and CEO, but
00:40:34.500 | you're a great human being.
00:40:35.500 | So thank you, Marc.
00:40:36.500 | Thank you.
00:40:36.500 | (audience cheering and applauding)
00:40:38.260 | - Thank you.
00:40:39.260 | (audience applauds)