back to index

Nikesh Arora | All-In Summit 2024


Chapters

0:0 The Besties welcome Nikesh Arora
2:56 How to build great businesses, lessons from his time at Google
6:30 Lessons from working with Masa at SoftBank, risk tolerance
11:51 Scaling Palo Alto Networks, handicapping cybersecurity, M&A strategy
20:2 State of cyber threats
24:23 Age of AI: How agents will change consumer apps, how AI is impacting cybersecurity

Whisper Transcript | Transcript Only Page

00:00:00.000 | Our next speaker is actually fortunate enough to have had seen his brand name turn into a verb.
00:00:06.480 | One of the probably most prolific executives in this current generation is Nikesh Arora.
00:00:12.320 | The big daddy of the cyber security space. These guys are really at the forefront of the industry.
00:00:16.240 | There are very few people who consistently, time and time again,
00:00:19.760 | find the way to just persevere, be relevant. Nikesh is one of those people.
00:00:24.720 | This is a man who has tremendous insight into technology. He helped turn Google into the
00:00:28.960 | dominant player in search. This is the most innovative industry in the world. We're constantly
00:00:33.120 | paranoid from an innovation perspective. I've got to be on my toes because once we figure out how
00:00:37.200 | they did it last time, they're trying a new way to do it next time. This is the country where your
00:00:42.000 | dreams come true and if you go around the world and you ask young people where do they want to go,
00:00:46.000 | they'll still want to come to America. I think this is one of the most successful democracies
00:00:50.320 | in the world. This is where capitalism thrives. All right. Ladies and gentlemen, Nikesh Arora.
00:00:55.840 | My guy. Appreciate you. Thanks for coming. David? David? How are you? What's up, Ruskin?
00:01:06.720 | Let me just do this intro properly. Look at all the phones go up. Wow.
00:01:17.360 | You joined Google in 2004, although there was a nice prolific buildup to that career,
00:01:22.240 | but you joined in 2004. You left in 2014. You started in ad sales and you left as the SVP
00:01:30.240 | and chief business officer of Google. Revenue went from three billion. I checked this actually just
00:01:35.280 | to make sure because it's staggering, to 66 billion when you left. Then you, because we're
00:01:40.880 | going to talk about that, and then you got seduced to go work with Masayoshi-san at SoftBank where
00:01:45.920 | you were vice chairman and president. Yeah. That must've been interesting. Jason has a look at
00:01:50.480 | Jason. He's looking at the top. So many good questions. But then you left. Yes. And look,
00:02:00.160 | I've known you for a long time. We've been very good friends for a long time. I was
00:02:03.120 | surprised because I got, you know, you called and you're like, Hey, I'm going to be seat chairman
00:02:07.680 | and CEO of Palo Alto Networks. And I had known what it was, but I didn't really understand.
00:02:14.160 | And then meanwhile, in the last, what's it been? Seven years? Six and a half. Six and a half years.
00:02:18.480 | Market cap is up by 5X. You took a $20 billion company. It's 110 billion as it stands. I think
00:02:24.000 | you've tripled revenue. So this is clearly no longer luck. So now you're in the skill camp.
00:02:32.800 | Oh, good. I'm in founder mode? No, founder mode is cocaine. I was wondering when that was going to
00:02:37.920 | start. No, he has to fly to Europe. You cannot bring founder mode. There are no waffles on the
00:02:46.480 | plane. You can get founder mode in Europe though. Let's just start and just, um, actually let's just
00:02:54.080 | start there. Okay. Okay. Um, you've seen a lot of different executives. You've played a lot of
00:02:58.560 | different roles. You've seen founders. You've advised a lot of founders. Tell us what, what,
00:03:03.120 | what takes, what, what does it take to be successful? And you can use these labels or
00:03:08.400 | not founder mode, manager mode, whatever it is, but what does it take to figure things out
00:03:12.480 | consistently? Look, um, I think you already put that out there, didn't you? Did you say that,
00:03:18.480 | uh, if you think about building great businesses and the center of great businesses, great products,
00:03:24.800 | if you don't have a great product, you're not going to build a great business for the long
00:03:27.840 | term. And this is something, and I know Sergei's here. I learned that with Larry and Sergei at
00:03:31.520 | Google, that they were obsessed about product on a constant basis. So when I came to my job,
00:03:36.080 | I said, the first thing I'm going to focus on is build a great product. But I think it's slightly
00:03:40.320 | different in consumer enterprise. In consumer, you build a great product, you find the flywheel,
00:03:44.640 | you can try and figure out how the flywheel continues to work. In enterprise, eventually
00:03:48.240 | you take a great product and you've got to figure out how to get it out to all the amazing customers
00:03:51.760 | out there. So I think it requires a tremendous amount of focus, tremendous amount of, um,
00:03:58.160 | sort of detail inspection. But I think at the same time, you've got to find a way of taking
00:04:03.920 | lots of amazing people, getting them on the same train and getting them to execute at scale.
00:04:09.680 | It's impossible for one human being to do that at scale. So you have to have a lot of people doing
00:04:13.760 | it amazingly well. That's the trick. Tell us about that first, that first story or that version of
00:04:19.120 | that story inside of Google, because you were there for a long time and a lot of good things
00:04:22.080 | happened. What was that like? What did you learn? Look, Google has one of the best flywheels there
00:04:27.600 | is in the consumer space, right? So we were blessed that we were working with a product that
00:04:32.640 | nobody had ever seen. Everybody wanted to use. It's funny, like every one of us worries about
00:04:36.880 | customer support. You didn't need it. It was an amazing, simple product, easy to use, free. And
00:04:41.600 | our job was to go monetize advertising. So part of that was how do you scale that around the world
00:04:47.360 | in every country, where there's a single product with a single use case, where you have to see how
00:04:52.480 | you can attract lots of advertisers. And that requires building a system. How you get thousands
00:04:56.400 | of people around the world to build a system and execute. So you build a system, you build a
00:05:01.680 | programmatic system, you look at stuff, you inspect, and you hire really amazing people who
00:05:05.520 | go out and do their best that they can. And when you're doing that, and the thing is growing so
00:05:09.360 | fast, what is the, what has to happen for you to go from running Europe, I think is how you started,
00:05:14.400 | to being the head of business there? What does that take? Well, you know, it was such an amazing
00:05:20.640 | juggernaut that you had to figure out how to differentiate. And what is interesting is when
00:05:24.160 | I joined, Google was 24% of global revenue. When I moved to the US, it was 49%.
00:05:31.440 | You mean Europe?
00:05:33.520 | Yes. And it was one of the few tech companies in the world whose European revenue was higher than
00:05:38.080 | the US revenue for a brief period of time. So I think somebody noticed.
00:05:42.080 | And what happens, you get the call and you're like, we need you to move to America?
00:05:46.000 | Yeah, I was on a trip to Russia trying to open an office there, which had to be shut down at
00:05:52.640 | some point in time for a bit. And I got a call from Eric Schmidt and he said, your boss is retiring,
00:05:58.720 | we'd like you to come here and do what he does. And so what then motivates you to leave a job
00:06:07.600 | like that? Because you're kind of then at the top of the pinnacle, you see everything,
00:06:10.800 | you're meeting everybody.
00:06:11.840 | I guess I wanted more, I wanted to do more, get involved sort of the overall business,
00:06:19.680 | wanted to do some product work as well. I didn't have product jobs at Google,
00:06:22.880 | I was seen as a sales guy. At Palo Alto, all I do is product for the first six years of my life.
00:06:27.840 | So to be able to go out and do that differently, but I had to take a brief sojourn to
00:06:32.160 | my Japanese trip, lots of good sushi and lots of-
00:06:35.120 | It was a great vacation, let's talk about it.
00:06:36.880 | Oh, come on, it wasn't a vacation.
00:06:38.400 | Well, a great search. But I mean, this was the largest venture fund ever created,
00:06:44.480 | a hundred billion dollars. And you have this mercurial, brilliant individual, Masayoshi-san,
00:06:51.840 | and he starts placing bets in a way that we've never seen.
00:06:55.280 | What was the genius in that? And what was the Achilles heel?
00:07:00.320 | Look, Masa is one of those people whose risk appetite grew as he grew older.
00:07:07.920 | And you mentioned that because that is a very unique thing, it usually goes the other way.
00:07:14.240 | I have two young kids, and every time, and I'm constantly trying to de-risk them,
00:07:17.600 | saying, "Hey, be careful when you cross the road, be careful when you do this."
00:07:20.960 | You get married, people tell you, "Be careful, buy a house, go settle down."
00:07:24.160 | So we're constantly de-risking our lives as we get older, on a constant basis.
00:07:28.400 | All of us do it, we don't realize we do it, right?
00:07:31.200 | Masa's the opposite. The older he gets, like, "Come on, let's go all in."
00:07:36.080 | He's like you guys, right? He wants to go all in.
00:07:39.200 | So he's like, "Nikesh, I have a great idea. We'll put a billion, we'll borrow 19 billion."
00:07:44.320 | I'm like, "How does that work again? All you got is a billion."
00:07:46.720 | "Yeah, I have one great idea, a billion in, 19 billion."
00:07:49.520 | That's what he did. That's how he built SoftBank Japan.
00:07:52.000 | I think he was the richest man in the world for 88 days in the last internet boom.
00:07:55.600 | Then he was left with a billion dollars.
00:07:57.760 | - Unbelievable.
00:07:59.040 | - And he built from scratch again, so yes.
00:08:01.040 | - So there were a series of incredible bets.
00:08:04.160 | - Yes.
00:08:04.480 | - Maybe walk us through some of those bets, because there was Nvidia, there was Arm.
00:08:08.880 | - All those happened after I left, but that's if I can...
00:08:10.880 | - And then you have...
00:08:11.520 | - Did Arm happen after you left?
00:08:12.560 | - That was where we kind of, you know...
00:08:15.120 | - Unpack it.
00:08:16.160 | - Sorry?
00:08:16.800 | - Unpack that. Unpack this.
00:08:20.080 | - Well, Masa likes a trillion.
00:08:26.720 | - He likes the number one trillion.
00:08:28.960 | - Yes.
00:08:29.200 | - Okay. It's a good number.
00:08:31.520 | - It's better than a billion. Beats a billion.
00:08:34.080 | - It is greater than a billion.
00:08:35.360 | - Yeah, it's greater than a billion, last I checked, yes.
00:08:38.560 | And when I met him the first time after we'd done a deal at Google, we met when he was...
00:08:44.240 | He came to see Larry, Sergey, and Eric and said, "I'd like to do a search deal with you.
00:08:51.600 | I have Yahoo Japan."
00:08:52.560 | I tried to explain to him there's something called...
00:08:55.520 | You can't have two search engines both powered by Google in Japan.
00:08:58.800 | And to his credit, he said, "You can, as long as the advertising systems are different."
00:09:01.520 | So if you look, in Japan today, Yahoo Japan is powered by Google,
00:09:05.120 | and Google's powered by Google, but the advertising systems are different,
00:09:07.760 | hence it's non-competitive.
00:09:09.280 | So he got that done.
00:09:10.560 | And then he says to me, "Show me this plan."
00:09:13.360 | He was going to buy a lot of companies in telecom and get to $100 billion in EBITDA,
00:09:17.040 | which at 10 times EBITDA would be a trillion dollars.
00:09:19.760 | - Okay.
00:09:21.700 | - So then that kind of fizzled out. He lost interest after a while.
00:09:24.720 | And then his next idea was to raise, I think it was 1, 2, 3, 4, 100, 200, 300, and 400.
00:09:31.120 | - Billion dollars.
00:09:32.560 | - Yes, for the Vision Fund, so that'd make a trillion.
00:09:34.720 | (audience laughing)
00:09:36.560 | - Hold on a second, we're doing that, yeah, that's a trillion dollars.
00:09:39.680 | - Yeah, 1, 2, 3, 4, yeah.
00:09:40.720 | - Yeah.
00:09:41.220 | - You know, it's interesting.
00:09:41.920 | - And then, like, he had this whole portfolio of companies,
00:09:43.760 | and there's a bunch of Japanese analysts who'd sit in the office,
00:09:47.120 | MBAs from University of Tokyo.
00:09:49.040 | And eventually, they took all the business that we had
00:09:51.280 | and forecast their fee cash flow at the end, where the DCF was a trillion.
00:09:55.520 | - So he was good at setting goals.
00:09:58.880 | (audience laughing)
00:10:02.880 | So he thought Arm was going to be a trillion-dollar company.
00:10:04.880 | - Got it.
00:10:06.320 | - We were--
00:10:06.720 | - Sorry, let me ask a question.
00:10:08.080 | Do you think it was the mistake, not the mistake,
00:10:11.120 | is it about being financially oriented as opposed to product or impact oriented?
00:10:17.200 | Is there an orientation thing there where if money is the goal,
00:10:20.160 | it becomes a lot harder to achieve versus--
00:10:21.920 | - Well, look, money is a way to keep track.
00:10:24.080 | - Yeah.
00:10:24.320 | - It's not the goal.
00:10:25.520 | That was the way he kept track.
00:10:26.880 | I get it.
00:10:27.680 | But, you know, he was not financially oriented as much as he went by his gut.
00:10:31.200 | It's like many founders, when he believed in it, he was all in.
00:10:36.000 | He totally believed in it.
00:10:37.760 | And sometimes to a fault.
00:10:39.600 | And you saw that.
00:10:40.560 | One thing I did learn from, which is very fascinating, is, you know,
00:10:43.760 | like a person who's like used to wanting to get things right,
00:10:47.520 | I'd make an investment with him.
00:10:49.120 | And then it'd be one investment, I'd say, "Oh, shit, that's not going right.
00:10:51.120 | "Let me go and talk to the company, help them, help them fix it.
00:10:53.600 | "We can get them up and running."
00:10:54.560 | He calls me aside one day.
00:10:55.440 | He says, "Nikesh, you're spending too much time with the mistake."
00:10:59.360 | He said, "If you go spend that time
00:11:02.880 | "with a company that's growing it three times,
00:11:04.640 | "they can grow it six times.
00:11:06.000 | "We'll make our money up six times with that company
00:11:07.840 | "instead of you trying to fix that from half back to one."
00:11:09.920 | So it's kind of interesting, you know?
00:11:12.080 | That's an incredible lesson, actually.
00:11:13.440 | That's the hardest thing to do.
00:11:14.800 | Cutting edge, sunken cost.
00:11:16.720 | I mean, there's a million ways to say it.
00:11:18.480 | But you have to let your winners ride.
00:11:20.640 | You got to focus on the winners.
00:11:21.680 | But go all in on the winners.
00:11:22.960 | Go all in on the winners.
00:11:23.680 | It's hard to do.
00:11:24.480 | It's hard to say, "Oh my God, I made a mistake."
00:11:26.560 | You go and say, "I can fix it.
00:11:28.000 | "I'm good.
00:11:28.640 | "I'm going to salvage it."
00:11:29.840 | Yeah.
00:11:30.320 | That's an ego problem.
00:11:31.280 | That's a hubris problem.
00:11:32.320 | I don't want to have a mistake on my record.
00:11:33.760 | Or a good person wants to help the founder,
00:11:36.880 | you know, realize their vision.
00:11:39.600 | And the cutthroat nature of this with the power law
00:11:43.840 | is such that 6X-ing something that was at 3X
00:11:47.040 | is much, much more likely than getting a zero to a one.
00:11:51.360 | Do you apply that principle as an operator?
00:11:54.080 | And if so, like how at Palo Alto Network?
00:11:56.960 | Look, in the last six and a half years,
00:11:59.920 | I've got 19 companies.
00:12:01.840 | Right.
00:12:02.340 | You can show the slides.
00:12:04.640 | So we've got some great slides.
00:12:05.360 | We've got some slides that just...
00:12:07.380 | Here's your stock.
00:12:08.400 | Good job.
00:12:08.900 | Thank you.
00:12:10.020 | It's not bad.
00:12:12.080 | All right, go back.
00:12:12.800 | Let's see.
00:12:13.300 | If you go back, it's really...
00:12:15.120 | Go back a second.
00:12:15.840 | It's really bad.
00:12:16.340 | What's the market cap now?
00:12:18.960 | $110 billion.
00:12:19.760 | $110 billion.
00:12:20.560 | Yeah, that's where they...
00:12:21.120 | And when you started, it was at...
00:12:23.040 | $20 billion.
00:12:23.840 | $20 billion.
00:12:24.560 | What you're seeing here is the economic principle
00:12:28.240 | of founder mode.
00:12:30.640 | Okay.
00:12:31.140 | And the revenue and the operating income.
00:12:35.040 | Yum, yum.
00:12:36.640 | Which...
00:12:37.140 | How do you look at something like this
00:12:39.040 | when you were first approached for the job?
00:12:41.120 | How do you underwrite the job?
00:12:42.240 | Like, what are you looking at?
00:12:44.320 | And you said you've spent the last six and a half years
00:12:46.720 | in product.
00:12:47.220 | Did you see a product that just...
00:12:49.600 | Was it missing something?
00:12:50.480 | That's the thing I want to hear about it.
00:12:51.440 | Like, did you go all in on one or two big things?
00:12:53.360 | Yeah, was it broken?
00:12:53.860 | That was always the Steve Jobs model,
00:12:55.360 | was pick the winner and go all in on it,
00:12:56.800 | get rid of all the other stuff.
00:12:58.240 | Does that principle apply here or...
00:12:59.840 | No, this is like a different principle.
00:13:01.600 | Look, it's a $180 billion industry on an annual basis.
00:13:05.360 | The largest market share is one and a half percent,
00:13:07.280 | which is us.
00:13:08.420 | And it's a subsector of technology
00:13:10.480 | with the most amount of fragmentation.
00:13:11.680 | And look around, you know,
00:13:12.960 | Benioff, I think is going to be here,
00:13:14.160 | builds a platform for Salesforce.
00:13:16.560 | You have ServiceNow, you have Workday.
00:13:18.080 | There's no cybersecurity platform.
00:13:19.520 | You sit there and say,
00:13:20.480 | "This is a phenomenal opportunity."
00:13:22.660 | Two, it's a company that's fully public,
00:13:24.160 | so I don't have to deal with voting controls
00:13:26.240 | and founders who I have to deal with,
00:13:27.440 | which have different motivations.
00:13:29.440 | It's a evergreen sector.
00:13:33.600 | More we get connected,
00:13:35.520 | the more people want to hack.
00:13:36.640 | The more you're going to connect it,
00:13:37.680 | the more data is there for people to take away.
00:13:39.200 | So you're not going to have a demand problem.
00:13:40.880 | - Yeah, sadly.
00:13:42.320 | - So if you can go into a sector
00:13:43.920 | where there's no demand problem,
00:13:45.440 | you can look at it and say,
00:13:46.240 | "What did everybody get wrong?"
00:13:47.200 | Say, "Well, everybody sort of lived in their swim lane."
00:13:49.040 | So we were in our swim lane.
00:13:50.800 | We did one thing.
00:13:52.080 | There are five swim lanes in cybersecurity.
00:13:53.760 | In six years, we looked forward and said,
00:13:56.560 | "Where is the world going to?
00:13:57.680 | "It's going to the cloud.
00:13:58.720 | "There's a bunch of AI."
00:14:00.000 | That was our sort of plastics moment, cloud and AI.
00:14:02.560 | So we said, "Let's not go reinvent the past."
00:14:04.880 | So one of the things I also learned
00:14:07.040 | during my time at Google and NASA is like,
00:14:10.160 | a lot of people get hung up
00:14:11.360 | in trying to make the stuff work,
00:14:13.280 | assuming everything around you is going to stay the same.
00:14:15.680 | So saying, "No, we're just going to focus
00:14:17.600 | "and assume that 50% of the world is in the public cloud.
00:14:20.080 | "What's security going to look like then?
00:14:21.840 | "Assume latency is low.
00:14:23.440 | "You can process in the cloud.
00:14:24.880 | "Data storage is cheap.
00:14:25.840 | "What's going to change?"
00:14:26.640 | So we built for that.
00:14:29.040 | We were 19 companies.
00:14:30.320 | We went and looked at how everybody does M&A and who failed.
00:14:34.160 | - Oh, what did you learn from that?
00:14:36.080 | - Well, we learned that things traded a price for a reason.
00:14:40.320 | So very often people say, "You know what?
00:14:42.400 | "Ah, number one is a billion dollars.
00:14:44.320 | "Number four is $200 million.
00:14:45.840 | "I can take $200 million and clean it up and fix it."
00:14:48.160 | It's at $200 million for a reason.
00:14:50.720 | The billion is a billion for a reason.
00:14:51.920 | They'll still be around.
00:14:52.960 | - Yes.
00:14:54.160 | - So why don't we buy the guy who's worth a billion dollars?
00:14:57.600 | We'll be number one.
00:14:58.400 | We'll be leading the market.
00:15:00.320 | We have brute force to go to market.
00:15:02.400 | We'll go use that.
00:15:03.760 | And we're probably going to slow them down a little bit
00:15:05.360 | because they're a larger company.
00:15:06.320 | So we'll compensate for slowdown with go-to-market
00:15:10.320 | that we bring to them and we'll let them lose.
00:15:12.320 | So we're the only company where,
00:15:13.680 | when we acquired companies,
00:15:14.800 | there's a funny story.
00:15:16.240 | I got a guy who says, "Oh, great.
00:15:17.680 | "We're buying a company in cloud security.
00:15:19.120 | "I'm the senior vice president of blockchain, cloud, and AI."
00:15:22.960 | I'm like, "Great.
00:15:23.600 | "Welcome to your new boss."
00:15:26.320 | He's like, "What do you mean?"
00:15:27.840 | I said, "That's the guy you're going to work for."
00:15:29.760 | He's like, "We just bought his company."
00:15:30.960 | I said, "Yeah.
00:15:31.680 | "He kicked your ass with low resources
00:15:34.080 | "out there in the market.
00:15:35.040 | "You're going to learn something from him."
00:15:36.240 | - There you go.
00:15:36.720 | - "Welcome to your new boss."
00:15:37.680 | - Well, you know,
00:15:39.440 | there is an analogy.
00:15:43.120 | You share a passion for basketball as well.
00:15:45.680 | I see you all the time at the Warriors game.
00:15:47.440 | And I don't think this is a jump to say
00:15:50.560 | watching that team play and how they manage talent
00:15:53.120 | and play as a team definitely informed
00:15:55.920 | how you play the game, yeah?
00:15:57.200 | - That's true.
00:15:58.720 | Not lately, but yes, in the past.
00:16:00.240 | - Yeah.
00:16:00.640 | - But look, we've done that 19 times.
00:16:03.920 | We had seven out of 10.
00:16:05.280 | We've gotten right.
00:16:06.000 | And we still possibly have the most number of founders
00:16:10.000 | who still work for Palo Alto.
00:16:11.120 | - Yeah, actually, can you explain that?
00:16:12.320 | So when you buy a company, isn't the typical motivation,
00:16:14.720 | wait till I cliff, it's a year.
00:16:16.480 | And then most people just vamoose, they're gone.
00:16:18.400 | - No, no.
00:16:18.800 | So here's how it works.
00:16:19.600 | If you come to Palo Alto, we'll take your equity away first.
00:16:22.720 | We'll say, I'm gonna give you back one and a half times
00:16:25.360 | your equity if you stay with me for three years.
00:16:26.880 | - Oh, wow.
00:16:28.120 | - To the founder.
00:16:28.800 | - Yes.
00:16:29.280 | - Okay, wait a second.
00:16:30.640 | So the founder owns, let's say it's $100 million,
00:16:33.440 | a billion dollar company.
00:16:34.320 | They own 20%, they got 200 million.
00:16:35.760 | You say, hey, stay again.
00:16:36.800 | I'll give you 300 million, right?
00:16:39.040 | - It's gonna work for me for three years.
00:16:41.040 | - Pretty good deal.
00:16:42.080 | - Because when you buy a company,
00:16:43.520 | you're buying a half a product and a full vision.
00:16:47.520 | - Ah, right.
00:16:48.960 | - I lose the vision part of it, I get half a product.
00:16:52.080 | - Right.
00:16:52.480 | How much of the success is predicated on the engine
00:16:57.120 | at Palo Alto Networks to drive sales,
00:16:58.880 | to sell into the enterprise?
00:17:00.080 | How much do you come in and then the founder feels
00:17:02.480 | like there's an interference model now that's like,
00:17:04.480 | how are you getting in my way?
00:17:05.520 | And how do you manage that balance?
00:17:07.680 | - So what happens is like, look, the customers
00:17:09.920 | and security want the best product.
00:17:11.280 | That's why everybody lives in their swim lanes.
00:17:12.880 | We said, we gotta be in multiple swim lanes.
00:17:14.560 | To be in multiple swim lanes is multiple people saying,
00:17:16.880 | I've got great products.
00:17:18.080 | In enterprise, there's this bizarre thing
00:17:19.680 | called magic quadrants, which Gartner has.
00:17:21.680 | And your badge of honor is you're in the top right,
00:17:25.120 | which is a leader's quadrant in Gartner.
00:17:27.520 | When I joined Palo Alto, we were in two.
00:17:30.080 | We're in 24 right now, in the top right.
00:17:32.240 | So now when you go to customers saying,
00:17:34.000 | hey, I got some great products.
00:17:35.040 | Ah, well, you're gonna sell me some good and some bad.
00:17:37.200 | Like, you know, you pick.
00:17:38.480 | There's all 24 are in the top right,
00:17:40.320 | and they all work together better.
00:17:41.840 | So for that, we need the founders building the product
00:17:43.840 | and staying there.
00:17:44.880 | Yeah, they do feel a little, sometimes they feel
00:17:46.560 | like they're being directed.
00:17:47.840 | But there's also another rule.
00:17:49.360 | I had a wonderful conversation with a founder,
00:17:51.520 | which was my first acquisition.
00:17:52.960 | And I think we didn't set the bid right.
00:17:55.760 | So we had to fix it in future deals.
00:17:57.600 | Paid the founder a reasonable amount of money,
00:18:00.000 | probably $850 million to the company.
00:18:02.240 | He had about $150 million.
00:18:03.760 | He was gonna get 200.
00:18:05.200 | Then he comes into my office and saying,
00:18:06.400 | hey, Kesh, I had a problem.
00:18:07.600 | I think we should do it this way.
00:18:10.880 | You're telling us to do it this way.
00:18:12.000 | I said, because this way is the way
00:18:14.400 | it's gonna work for us.
00:18:15.520 | He's like, yeah, but when I came here with my company,
00:18:18.960 | I said, whoa, wait a minute.
00:18:20.000 | I said, have you ever sold a house?
00:18:24.000 | He's like, yeah.
00:18:25.360 | I said, who decides what, and you get to stay in it.
00:18:28.880 | Who decides what color the walls are gonna be painted?
00:18:30.960 | (audience laughing)
00:18:31.840 | - The new owner.
00:18:32.560 | - It's the new owner, of course.
00:18:33.600 | (audience laughing)
00:18:34.560 | I said, thank you.
00:18:35.280 | (audience laughing)
00:18:37.360 | So he stayed there.
00:18:38.240 | He really liked us, and he stayed for three years.
00:18:40.240 | He made 2 1/2 times that money he got.
00:18:42.160 | But from then on, we changed the game.
00:18:45.360 | When we acquire a company,
00:18:46.320 | we sit the founder down and say, okay,
00:18:47.600 | the lawyers will do their thing.
00:18:49.840 | You're gonna sit on my head of product
00:18:51.520 | and design a product strategy we both agree on.
00:18:53.520 | - Yeah.
00:18:55.120 | - So we don't buy a company
00:18:56.240 | until we have a joint agreed product strategy
00:18:58.000 | with the founder.
00:18:58.720 | - Now, in Cisco, Oracle, Salesforce,
00:19:01.840 | they've all kind of had this M&A playbook
00:19:04.080 | that they claim is part of their engine of success.
00:19:07.360 | How differentiated is it for you?
00:19:09.360 | What's kind of the biggest contrast
00:19:11.680 | for your playbook versus those engines?
00:19:14.800 | - The biggest contrast for us is we like to buy a product
00:19:18.800 | which we can integrate and sell to customers.
00:19:20.720 | We have a go-to-market engine.
00:19:22.000 | We like to keep it the way it is.
00:19:23.200 | I think if I'm gonna buy a company
00:19:27.040 | at eight to 10 times revenue,
00:19:28.480 | I'm just overpaying for customers and sales.
00:19:32.080 | I have all the customers already.
00:19:33.280 | Why would I pay eight to 10 times revenue
00:19:36.480 | to buy a customer I already have it
00:19:37.920 | on a different product?
00:19:38.720 | So I'd rather buy the product,
00:19:41.600 | use my go-to-market capabilities,
00:19:43.200 | go sell them to the customer base,
00:19:44.960 | unless I can take the two companies,
00:19:47.200 | merge them, and I can make it worth 16 times.
00:19:50.000 | If you as an investor can buy both our companies
00:19:53.120 | and enjoy yourself,
00:19:54.000 | why would I have to pay a premium
00:19:55.200 | to buy it at eight times revenue?
00:19:56.400 | So we're very clear.
00:19:57.600 | We don't wanna buy customer bases.
00:19:59.120 | We wanna buy products which we can integrate
00:20:01.120 | and sell them to our customer base.
00:20:02.320 | - Let's talk a little bit about the threats
00:20:04.000 | that are out there in the modern world,
00:20:05.600 | how they're evolving with artificial intelligence.
00:20:08.080 | Obviously, it can be used on both sides
00:20:09.840 | of this competition to see who can protect
00:20:12.880 | information and then who can steal it.
00:20:14.320 | Who are the actors?
00:20:16.160 | What's the motivation today?
00:20:19.040 | And how are they coordinating?
00:20:20.160 | 'Cause it feels like there is now this new
00:20:22.720 | allegiance between American hackers,
00:20:27.440 | very young, anonymous, working with,
00:20:30.400 | to do the social engineering,
00:20:31.840 | working with some brute force tools
00:20:34.160 | out of China, Russia, other places.
00:20:36.000 | Who's orchestrating these very large
00:20:42.720 | hacks that occurred at the casinos recently?
00:20:45.200 | And then we can get into,
00:20:48.240 | how should our government, if at all,
00:20:50.480 | be thinking about stopping these
00:20:53.440 | and partnering with corporate America
00:20:55.680 | to neutralize these threats?
00:20:57.840 | Because some of them are involving
00:20:59.360 | the governments of these countries.
00:21:00.720 | - Yeah, so I think, look,
00:21:02.000 | if you trace the history of cyber hacking,
00:21:04.480 | you had these big hacks,
00:21:05.840 | which used to take 30 or 50 days to figure out.
00:21:08.080 | People were doing them as a hobby.
00:21:10.480 | You'd think of your notion of a hacker
00:21:12.000 | was some kid sitting in their parents' basement
00:21:14.400 | who didn't get out of there
00:21:15.680 | on his little PC trying to hack this
00:21:17.840 | and trying to get all the data out of there.
00:21:19.440 | And then suddenly people discovered,
00:21:20.720 | "Wait, I can get better than this," right?
00:21:22.400 | Because, and it was usually the proof.
00:21:24.560 | It was a badge of honor.
00:21:25.280 | "Oh, I hacked into this database
00:21:26.640 | "or I hacked in there.
00:21:27.360 | "I got in there.
00:21:28.080 | "You guys aren't strong enough."
00:21:29.040 | And as the world got more connected,
00:21:30.880 | what happened was people said,
00:21:31.920 | "Wait, why am I wasting my time
00:21:33.840 | "hacking one user, one company at a time?
00:21:36.000 | "Let me go after a piece of supply chain.
00:21:38.080 | "If I hack the Exchange server,
00:21:39.440 | "everybody who uses the Exchange server is fair game.
00:21:42.000 | "If I hack an agent or not, an antivirus,
00:21:44.640 | "I can get into everybody's computer.
00:21:46.320 | "If I hack a large email provider,
00:21:48.560 | "I can have access to every dissident's email,"
00:21:50.960 | which is when nation-states got involved.
00:21:52.960 | Nation-states said, "Wait a minute.
00:21:54.240 | "If I got one data, why bother hacking one person?
00:21:57.200 | "Let me go hack the back end and get in."
00:21:59.200 | - Let me hack Gmail.
00:22:00.320 | - Yes, more effective, right.
00:22:01.840 | So when that began to happen,
00:22:05.280 | nation-states started getting involved.
00:22:06.560 | They're like, "That's interesting.
00:22:07.200 | "If I can do that,
00:22:08.720 | "I can destabilize nations.
00:22:10.160 | "I can get data about other people that I want."
00:22:13.040 | So that became a bit of a nation-state activity.
00:22:15.680 | Now, cybersecurity, offense is way easier than defense.
00:22:18.560 | For defense, you've got to write 100% of the time.
00:22:21.280 | Office, you're gonna find one door.
00:22:22.560 | So put that aside.
00:22:26.080 | Then what happened on top of that is that
00:22:27.920 | nation-states started cultivating these entrepreneurs
00:22:31.120 | in the hacking world, saying,
00:22:31.920 | "Listen, that's how you keep your skills up to date.
00:22:34.080 | "If you go after stuff, we'll look the other way
00:22:36.400 | "while you're going and doing it
00:22:37.440 | "because if we need you, we'd have found a way."
00:22:39.920 | Then we discovered this notion,
00:22:42.960 | "Wait, there's tremendous economic value now in hacking."
00:22:45.360 | So we became ransomware.
00:22:47.120 | People wanna say, and then there's like a magic number.
00:22:50.240 | They ask for $30 million or less
00:22:51.520 | because that's director's liability, insurance.
00:22:53.600 | - Wait, sorry, sorry, sorry.
00:22:55.520 | When you get hacked for ransomware,
00:22:57.440 | 30 million is what?
00:22:58.400 | - It's what companies can give you
00:23:00.320 | where it's covered by insurance.
00:23:01.600 | - Oh, it's covered by insurance.
00:23:02.960 | - Beyond that--
00:23:03.440 | - Which is purely working backwards from that policy.
00:23:05.360 | (audience laughing)
00:23:07.440 | - So there's about $2 billion that's been paid
00:23:10.000 | in the last 12 months in ransomware.
00:23:12.000 | - Wow. - Wow.
00:23:12.640 | - If you think, here's the anatomy of a hack, right?
00:23:15.840 | Somebody says, "I found a SolarWinds server, it's hackable."
00:23:18.480 | So some set of guys go quickly and plant themselves
00:23:21.200 | at 1,800 SolarWinds servers, which are exposed to the internet.
00:23:24.240 | Then there's a separate industry sub-segment.
00:23:26.800 | They sell it to say, "Listen, I'm only in the seeding business.
00:23:30.000 | "You can go run ransomware as a service,
00:23:33.120 | "negotiations with customers."
00:23:35.360 | - Wow. - That's their go-to market.
00:23:36.800 | - Yes, it's their go-to market,
00:23:37.920 | amplification to system denigration, yes.
00:23:39.840 | Then there's a third set of people who are payment clearing.
00:23:42.880 | People say, "I'll collect the money.
00:23:44.160 | "I know how to process $300 a bit more."
00:23:45.840 | - Oh my gosh. - Oh, it's so sophisticated.
00:23:47.280 | - That's just so sophisticated.
00:23:48.480 | - And where are they geographically?
00:23:49.760 | Like, is it all over?
00:23:50.880 | - Everywhere. - Everywhere.
00:23:51.760 | - Everywhere where extradition treaties are light.
00:23:53.840 | And-- - Are there specific
00:23:57.520 | foreign nationals that move to jurisdictions to do this?
00:23:59.920 | Is this like-- - There's a lot of people
00:24:03.760 | in the world out there who do this,
00:24:05.280 | and they're hard to find. - Amazing.
00:24:06.560 | - And remember, think about the enforcement.
00:24:08.640 | Like, where are you gonna go?
00:24:10.400 | Go to your local police station?
00:24:11.680 | - You're freaking me out.
00:24:12.480 | (audience laughs)
00:24:14.080 | - Somebody takes a million dollars away
00:24:15.440 | from your bank account.
00:24:16.080 | Where are you gonna go, local police?
00:24:16.960 | The guy says, "Actually, sir,
00:24:18.000 | "this looks like somebody in Greece."
00:24:19.360 | - Yeah. - Do you know our--
00:24:21.280 | - Panic attack. - Yeah.
00:24:23.360 | - It's gonna be okay. - Let me shift the conversation.
00:24:25.680 | I wanna talk about AI for a second
00:24:27.280 | 'cause you mentioned it as well.
00:24:28.240 | But I wanna first ask it to you more
00:24:31.200 | as just a smart observer of the market.
00:24:33.440 | You're in the market.
00:24:34.160 | You've invested in a bunch of companies as well.
00:24:35.840 | What's the state of AI?
00:24:38.320 | Take it however you want.
00:24:40.880 | - Yes, I mean, look, what's interesting is
00:24:42.800 | I think a lot of people are chasing LLMs.
00:24:45.760 | I think there's a very well-established expectation out there.
00:24:49.840 | These LLMs will get smarter and smarter.
00:24:51.520 | Inference will come.
00:24:52.880 | Latency will go down.
00:24:54.560 | Cost to deploy, cost to train will all come down.
00:24:58.480 | So the good news is we seem to have established
00:25:00.880 | a nicely competitive space out there
00:25:03.280 | between all these people that you can expect
00:25:06.000 | some sort of economic rationality to prevail.
00:25:08.880 | And a lot of people are sort of investing
00:25:10.640 | a lot of dollars to get it there.
00:25:11.920 | And thanks to Mark Zuckerberg,
00:25:13.520 | throwing out open source models,
00:25:14.640 | he keeps them honest and fair everywhere around there.
00:25:17.120 | So we'll all get access to these models.
00:25:19.120 | But for the most part,
00:25:20.720 | as you get into the application of these models,
00:25:22.480 | I think the world changes in consumer enterprise.
00:25:24.880 | In consumer, you gotta figure out how these models
00:25:28.240 | are gonna translate into consumer services
00:25:30.400 | and make them better.
00:25:31.280 | And you can see that the question is,
00:25:32.800 | do we get a whole new Google that's formed
00:25:35.840 | or a whole new Facebook that's created?
00:25:38.160 | Or do the existing players move fast enough
00:25:40.800 | to embody sort of to embed AI in there
00:25:43.920 | and our hooks, those services having to us are so strong
00:25:46.880 | that we don't shift our usage.
00:25:49.520 | So let's park that for a second.
00:25:50.720 | We'll go back there in a minute.
00:25:52.240 | On the enterprise side,
00:25:53.200 | it's not useful unless you can train on my data.
00:25:56.800 | And you'll discover 90% of companies have bad data.
00:26:00.720 | 90% of companies, like how do you solve this problem?
00:26:03.840 | I don't know.
00:26:04.240 | I don't know how I fixed the last firewall that broke down.
00:26:07.520 | If I don't have that data
00:26:08.480 | and I don't have 10 good instances,
00:26:10.560 | how do I make it work in the 11th instance?
00:26:12.560 | So we're all busy refactoring our data,
00:26:15.040 | figuring out how to collect good data
00:26:16.160 | on the enterprise side, which is gonna happen.
00:26:18.640 | It's all the easy stuff.
00:26:19.840 | And Sebastian will tell him,
00:26:20.800 | "Claude and I've got it figured out.
00:26:21.920 | I'm gonna answer questions."
00:26:22.720 | Those are easy questions.
00:26:23.680 | How much balance do I owe you?
00:26:24.880 | When do I owe you?
00:26:25.440 | Can I pay you tomorrow?
00:26:26.480 | No, you have to pay me today.
00:26:27.840 | Even an AI bot can answer that question.
00:26:29.520 | Right.
00:26:30.000 | But it's very hard to say, "My firewall broke down.
00:26:31.840 | I don't know what happened.
00:26:32.800 | How do I fix it?"
00:26:34.000 | Because I'd need a lot of data.
00:26:34.880 | So I think on the enterprise side,
00:26:36.240 | a lot of companies would have to do a lot of work
00:26:38.000 | to get their data sorted.
00:26:39.520 | And that's in process.
00:26:40.480 | What we've done is it's stimulated all of us
00:26:42.240 | to go out and get that figured out.
00:26:44.160 | On the consumer side, it's gonna be very interesting.
00:26:46.160 | I think we can all imagine a future which says,
00:26:48.000 | "Hey, my favorite phone or favorite hardware device
00:26:51.200 | or favorite interface,
00:26:52.560 | go book me a ticket to Geneva,
00:26:55.440 | which is where I'm going after this,
00:26:56.880 | and book me a restaurant and a hotel room."
00:27:00.080 | Now you just, in your brain, say, "Wait, wait, wait, wait.
00:27:03.200 | I just did booking.com, I did OpenTable,
00:27:06.160 | and I did hotels.com."
00:27:07.280 | Now we're gonna see this happen.
00:27:09.200 | Who's gonna control the user interface
00:27:12.800 | and whose agent is gonna talk to who?
00:27:14.320 | Right.
00:27:16.000 | Try telling any of the existing app guys that,
00:27:18.640 | "Listen, suppressor UI,
00:27:20.400 | I'm just gonna send you an API call.
00:27:21.840 | Send it back to me.
00:27:22.560 | I'll control the data about the consumer."
00:27:24.400 | Yeah.
00:27:24.720 | Let's see how far that lasts.
00:27:26.720 | I mean, that's, yeah,
00:27:28.240 | you're just handing over your business to them, yeah.
00:27:30.560 | Well, then what's gonna happen?
00:27:31.600 | One or two things happen.
00:27:32.560 | That always happens, right?
00:27:33.600 | These people become the legacy players,
00:27:36.720 | and you'll have new companies that are formed,
00:27:38.480 | which are agent-based only.
00:27:39.760 | It's like, you know, half your fortune is better than none.
00:27:44.000 | Yeah.
00:27:44.320 | So if I started a company tomorrow and said,
00:27:45.840 | "Listen, I only have an agent that does airline bookings.
00:27:48.320 | Just pay me 20%, I'm good.
00:27:50.080 | I don't need a brand."
00:27:50.880 | What happens then?
00:27:53.920 | So I think there's gonna be much more upheaval
00:27:56.640 | in the consumer space than any one of us realizes.
00:27:59.440 | I think 5 million apps will be redesigned in the next 10 years.
00:28:02.160 | They'll all become agents.
00:28:03.840 | Right.
00:28:05.200 | The ones that wanna survive will do that first?
00:28:06.880 | But it's very hard.
00:28:08.800 | Yeah.
00:28:09.440 | Your margin's my opportunity, I guess,
00:28:11.360 | is the way we say it in the industry.
00:28:12.800 | Yes, yes.
00:28:13.520 | But it's very hard.
00:28:14.800 | Can you try going to any of these large branded apps
00:28:16.880 | that sit on everybody's phone and say, "Listen."
00:28:19.600 | Shut it off.
00:28:20.400 | Yeah.
00:28:20.720 | Become a service provider of data.
00:28:23.520 | Just talk to Siri, just talk to whatever.
00:28:25.440 | Whatever it is.
00:28:26.240 | Yeah.
00:28:26.800 | And we'll get it done for you.
00:28:27.920 | AI and attacks and sophistication,
00:28:31.920 | of the attacks that occur,
00:28:33.200 | how often does human factors,
00:28:36.320 | phishing, tricking people come into play these days?
00:28:40.480 | And how much of that is going to be exacerbated
00:28:44.080 | by AI deepfakes, et cetera?
00:28:46.160 | Oh, the attacks are most simple.
00:28:47.920 | Most simple.
00:28:49.840 | Explain.
00:28:51.280 | You know, we had a whole pen test company
00:28:54.000 | and they do this thing.
00:28:58.800 | They said, "Listen, we're gonna penetrate your defenses.
00:29:01.280 | "It's not possible, it's just impossible.
00:29:03.440 | "We're gonna figure it out."
00:29:04.960 | The guy goes in the morning,
00:29:06.880 | eight o'clock at the parking lot,
00:29:08.240 | drops a bunch of USB sticks with little tape on the sticks,
00:29:11.760 | my home videos.
00:29:12.800 | Oh my God.
00:29:14.420 | He drops about 25 of them,
00:29:17.120 | six of them log in with a USB stick
00:29:18.640 | on the computer in the office.
00:29:19.760 | They're in.
00:29:20.960 | Oh my God.
00:29:21.920 | Fuck.
00:29:22.640 | Oh my God.
00:29:23.520 | So great.
00:29:25.120 | I don't know if you need to go like get a,
00:29:27.760 | you know, battering ram and break your door down.
00:29:29.920 | This is...
00:29:30.420 | It's human behavior.
00:29:32.080 | Human behavior.
00:29:32.720 | And they possibly said something more colorful
00:29:35.280 | than my home videos on that,
00:29:36.400 | but I'm just gonna keep it PG here, right?
00:29:39.600 | So you can decide at what point in time
00:29:41.200 | your curiosity gets the better.
00:29:41.600 | My nudes with a Z.
00:29:42.800 | That got a hundred percent.
00:29:46.400 | Yeah, so it's like, these are not hard attacks.
00:29:48.960 | Like, you know, there's, we had a,
00:29:51.200 | we sent an email out to everyone saying,
00:29:53.200 | "National pet day.
00:29:54.640 | "Please take a picture of your fluffy pet at home
00:29:57.120 | "and upload it with this website.
00:29:58.640 | "And the person who does it
00:29:59.840 | "will give $10,000 to the SPCA."
00:30:02.640 | Oh my God, you've seen the beautiful fuzzy pictures
00:30:05.280 | uploaded to this hacking site
00:30:07.120 | where we had all your details.
00:30:08.560 | All the IP addresses, everything.
00:30:10.880 | Yes, everything.
00:30:11.680 | Oh no, no, you have to actually add to your username.
00:30:13.440 | Oh God, password.
00:30:14.640 | And there's the things like,
00:30:15.840 | is your pet so wonderful
00:30:17.200 | that you use their name as a password?
00:30:18.640 | Yeah.
00:30:19.140 | Let's-
00:30:20.400 | What's your pet's name?
00:30:21.440 | Love it.
00:30:22.580 | Let's-
00:30:23.540 | Let's flip it, it's so great.
00:30:25.520 | This is not hard.
00:30:26.400 | You don't need like, you know,
00:30:27.520 | huge cybersecurity sensors to block this stuff.
00:30:29.920 | Okay, wait, flip it around for a second.
00:30:32.000 | There's something going on.
00:30:32.880 | I don't know if you read, you probably did.
00:30:34.080 | There's an explosion of deepfake porn
00:30:39.680 | in South Korea going on right now.
00:30:41.920 | That's not my area of specialization.
00:30:43.360 | No, no, no.
00:30:43.860 | You guys might know more about that.
00:30:48.160 | Heard from a friend.
00:30:48.960 | I don't have time for that kind of stuff.
00:30:50.400 | No, no, no, no, no, I meant more.
00:30:51.760 | He read on Twitter.
00:30:52.720 | And Jamaat searched for himself and there wasn't any.
00:30:54.960 | No, but so there's all this fake content
00:30:56.400 | that's going to emerge.
00:30:57.280 | There's going to be all this-
00:30:58.000 | Is it fake?
00:30:58.560 | Somebody's fake is somebody's reality.
00:31:00.240 | Yeah, no.
00:31:00.960 | But my point is more different.
00:31:02.720 | How do you, if you're asked by a customer,
00:31:04.560 | tell me if that's real or not?
00:31:06.080 | How do you figure out tomorrow if something is real?
00:31:08.720 | Well, look, there's a huge conversation going on
00:31:10.880 | that there needs to be some form of regulation
00:31:13.760 | that insists that watermarking needs to happen.
00:31:15.920 | If you generate a video using any AI tool,
00:31:18.800 | it has to say created by AI.
00:31:20.240 | If it doesn't, it's very hard to tell the difference.
00:31:23.600 | Yeah.
00:31:24.320 | And as I was saying to somebody else the other day,
00:31:25.840 | it's like most likely if it seems too perfect,
00:31:28.160 | it was probably created by AI.
00:31:29.440 | How is that different?
00:31:30.320 | Let me just ask philosophical.
00:31:31.680 | How is that different than airbrushing in Photoshop
00:31:34.400 | and making the person look completely different?
00:31:36.240 | We don't have any of those disclosures today.
00:31:39.120 | I always feel like there's a spectrum.
00:31:40.240 | Well, it's scale.
00:31:40.880 | I guess scale would be the issue, right?
00:31:42.720 | And fidelity?
00:31:43.360 | I don't know.
00:31:43.920 | Sorry?
00:31:44.560 | Scale and fidelity.
00:31:45.600 | Like the number of people who can do what you're saying
00:31:49.280 | is 0.1% of the population or 1%.
00:31:51.600 | Now it's 100.
00:31:52.320 | I think if it's with the intent to deceive.
00:31:54.720 | I see.
00:31:55.220 | Anyway.
00:31:57.060 | And then what about on the other side?
00:31:58.560 | In terms of defense, have you started to make AI shields that--
00:32:04.080 | Yes, yes.
00:32:04.800 | So look, the two biggest risks today in AI
00:32:08.640 | is that I think about 20% to 30% of most companies
00:32:11.760 | have employees on the younger side who are using AI apps
00:32:14.480 | to try and get their job done faster and easier.
00:32:17.600 | Write me a marketing blog.
00:32:18.720 | Try and figure something out.
00:32:19.760 | Write me a script for this or take this data.
00:32:22.400 | Analyze it for me.
00:32:23.600 | The risk there is that you're sending proprietary data up
00:32:26.160 | into a model for a company.
00:32:28.320 | Here's a napkin drawing of a chip design
00:32:31.120 | I just made for inferencing.
00:32:33.040 | Turn this into a real CAD drawing for me.
00:32:34.800 | They could do it, but except that LLM was brought down
00:32:38.560 | from hugging face, and it's going back to North Korea.
00:32:41.940 | So there's that risk, that your employees
00:32:43.920 | are being targeted with AI apps in your company
00:32:45.840 | who are uploading proprietary data in a happy way,
00:32:48.480 | very nicely for you.
00:32:50.320 | Same guy who picked up the USB stick.
00:32:52.400 | So there, you have a product that watches all these apps
00:32:55.920 | and makes sure that what you're using, what you're uploading
00:32:57.680 | is not being sent to dangerous apps.
00:32:59.520 | Or if you don't want your employees to send it,
00:33:01.600 | it will block you.
00:33:02.640 | The other one, which is kind of interesting,
00:33:04.160 | is that I think almost every company is experimenting
00:33:06.880 | with deploying LLMs internally because they all want
00:33:09.520 | their favorite proprietary chat interface.
00:33:12.800 | And there, you need to be careful because you can,
00:33:17.040 | what you used to do with SQL injection,
00:33:18.560 | you can do it with prompt injection,
00:33:19.760 | you can bombard models, you can bias them,
00:33:22.320 | you can do a whole bunch of stuff.
00:33:23.680 | So you have what we call an AI firewall that'll protect you.
00:33:26.640 | I want to be sensitive at the time because I know
00:33:27.840 | you have to fly to Geneva.
00:33:28.800 | Thank you very much for coming.
00:33:29.920 | Ladies and gentlemen, Nikesh Arora.
00:33:31.440 | Thank you for having me.
00:33:32.320 | (applause)
00:33:34.560 | (applause)