
GPT-4o launches, Glue demo, Ohalo breakthrough, Druck's Argentina bet, did Google kill Perplexity?


Chapters

0:00 Bestie Intros: Recapping Phil Hellmuth's birthday weekend
7:38 OpenAI launches GPT-4o: better, faster, cheaper
29:40 Sacks demos Glue: How AI unlocked his Slack killer
40:12 Friedberg walks through his major breakthrough at Ohalo
61:35 Stanley Druckenmiller bets on Argentina and Javier Milei: strategy, roadmap for the US
73:54 Jason's bet on Athena, how AI will change company building
82:21 Google launches AI summaries in search

Whisper Transcript

00:00:00.000 | All right, everybody, welcome to your favorite podcast in the
00:00:03.700 | world's number one podcast, the all in podcast. It's Episode
00:00:06.840 | 1790. Oh, wait, that's just how it feels. Welcome to Episode
00:00:11.400 | 179. With me today, of course, is your Sultan of science. I
00:00:16.840 | don't know if that's a movie background, or it's just his
00:00:19.800 | favorite vegetables. What's going on there? What's the crop?
00:00:22.040 | That's AI generated. AI generated crop. Okay, I'm trying
00:00:25.720 | AI backgrounds. I'm going to try it out for a while with different
00:00:27.840 | crops. Your fans are going to be crushed that you're not doing
00:00:30.240 | deep movie polls. With us, of course, man about town DC. New
00:00:36.320 | products being launched. David Sacks, the Rain Man. Yeah. How
00:00:40.160 | you doing, buddy? Good. Good. Yeah. Good week. What's going
00:00:43.240 | on? Yeah, definitely. Chamath Palihapitiya, the chairman dictator.
00:00:50.120 | As the chairman dictator, I would like to take this
00:00:53.800 | opportunity to wish my child a happy birthday. I absolutely
00:01:02.600 | love you.
00:01:03.240 | Well, now the rest of us look like Yeah, great. I've never
00:01:07.440 | done that before.
00:01:08.200 | Sacks in your desk. In your desk is a piece of paper with your
00:01:12.400 | children's names and their birthdays. You want to pull it
00:01:14.520 | out and see? There are birthdays every year, and I've never done
00:01:22.200 | that, Rain Man.
00:01:22.840 | No, no, no. But I'm saying it rarely lands on the same day.
00:01:37.760 | Today is the day. Today's the day. Today's the day. Okay. So,
00:01:41.800 | congratulations, child. Oh, congratulations. Yeah. How old
00:01:46.200 | are they? No gender, name, or any other specifications, folks. We
00:01:49.120 | can't we can't tip anybody off. No pronouns. No
00:01:51.800 | pronouns. Yes. So how are they experiencing their birthday?
00:01:57.360 | This child has experienced a wonderful life and this child
00:02:00.680 | is an incredible person for whom I have tremendous admiration and
00:02:05.760 | love and compassion and hope for the future.
00:02:08.000 | All right. And did you order them some chicken fingers?
00:02:10.200 | I cannot comment on who this person is.
00:02:13.080 | Are you talking of course about Phil Hellmuth?
00:02:19.960 | Can we please talk about last weekend's festivities? What a
00:02:23.360 | disaster he is. Oh my god. You guys just see you guys know. So
00:02:28.360 | we missed you last weekend. We missed you. So much fun. Come
00:02:31.080 | on. We missed you on Saturday night. Saturday night was really
00:02:33.000 | fun. Hmm. I had such a lovely time coming home to be totally
00:02:37.240 | honest with you. We had a cabana set up on Saturday played
00:02:40.680 | blackjack. I missed you guys too. I had a FOMO. I saw the
00:02:43.240 | videos. It was so fun. Well, you don't have to have too much FOMO
00:02:46.160 | because Phil sent the entire group chat to poker news.com.
00:02:51.040 | They didn't run it twice. The flop.org poker dash update. Oh
00:02:58.000 | my god. Yeah. It was like five stories and he leaked every
00:03:01.040 | single person who's there and the jets and the jet numbers.
00:03:04.560 | He's like, look, here's me and Elon. Elon came by for my dinner.
00:03:07.720 | No, no, no. It was worse than that. No, it's worse than that. He
00:03:09.840 | said, I got to hang out with our guy Elon for 10 minutes and 14
00:03:14.160 | seconds. He intercepted him at the valet. Wait, what? 10
00:03:25.920 | minutes and 14 seconds. He had the exact time down to the
00:03:28.640 | second. Oh my god. Well, listen, I want to wish Phil Hellmuth a
00:03:31.040 | happy birthday because I didn't miss his 60th party. Yeah, it's
00:03:34.880 | coming up. Actually, his birthday is not so good. It
00:03:36.720 | wasn't actually his birthday. It was Bill Gurley's. So we just
00:03:39.120 | hijacked Bill Gurley's birthday. I also got to enjoy for my first
00:03:42.880 | time ever the experience of Baccarat, which I've decided is
00:03:46.320 | the most degen game on earth. It's literally the most, you just
00:03:50.960 | flip a coin. It's flipping coins. You make betting
00:03:55.280 | decisions. All you do in Baccarat is you say bank or
00:03:58.720 | player, and then you freak yourself out about how you flip
00:04:01.200 | the cards. And the smartest people I know on earth are all
00:04:03.600 | sitting around this table at two or three in the morning saying,
00:04:06.560 | turn this corner this way. No, no, no, no, no. Turn it this way.
00:04:09.760 | Turn it this way. There's two dots and they're debating the
00:04:12.400 | right way to flip a card over. No, the Baccarat Sweat is the
00:04:17.200 | most incredible performative act in the casino. It's the
00:04:20.640 | weirdest thing. Yeah, you're right. Everyone's got their own
00:04:23.120 | little technique about how they bend the card. It's all
00:04:26.160 | destroyed by the end of the deck. They get thrown out. I go
00:04:29.360 | lengthwise. I go like this and I try to see. Oh, like you're
00:04:32.400 | curling your mustache like an evil villain? It's the evil
00:04:35.200 | villain. And then you call out, oh my god, no side. Or if you
00:04:38.560 | see a spot, two across. And then you get to decide whether
00:04:43.680 | the bank turns over their cards and when they turn you lose it,
00:04:46.800 | then you lose a small house and then you're like, yeah, you're
00:04:49.520 | convincing yourself that you have all this control and ways
00:04:52.080 | to change the outcome. You're literally flipping a card.
00:04:54.640 | It's even worse than that. You're basically sitting down
00:05:00.000 | at the casino's table and then they tell you whether you've
00:05:02.240 | won or lost. And in order to convince yourself that that's
00:05:05.760 | not what's going on, you have to play with the card. But
00:05:09.120 | really, they just tell you, you either win or lose.
00:05:11.920 | And I'm watching the smartest guys we know staring at the
00:05:15.520 | window at the little machine that tells you whether bank or
00:05:18.080 | player won and they're studying it, doing an analysis at one
00:05:24.000 | point. It's gotta go bank. Hellmuth's like, I'm calling it
00:05:27.040 | now. Bank, bank, player, player, player. And all the guys are
00:05:29.600 | like, let's do it. And then everyone's got heads, heads,
00:05:31.200 | tails. So Hellmuth asked us to play in the high stakes
00:05:35.600 | poker game on PokerGO. So it was me, Hellmuth, Stanley, Sammy,
00:05:40.400 | Housenbold, and then Jen Tilly, and Nik Airball and Robo. So most
00:05:45.440 | of the guys from the house game plus Jen Tilly and Nik Airball.
00:05:48.880 | Jennifer Tilly is amazing. What a great human.
00:05:51.200 | Listen to this. Well, listen to this hand. Literally, the
00:05:55.200 | second hand of the actual poker game, Jen Tilly is in the
00:06:05.200 | big blind. No, sorry. She's under the gun. She raises
00:06:08.400 | Housenbold three-bets. It comes all the way around to me
00:06:11.920 | on the button. I look and I have pocket kings. Oh, I ship the
00:06:17.440 | whole cheeseburger comes back to Tilly. She ships house ships.
00:06:21.920 | Listen to these hands. Jen Tilly has aces. Jeff Housenbold
00:06:26.240 | has kings. I have kings. Oh, my god. I've never seen a
00:06:30.240 | cooler hand like this in my life. And in the
00:06:34.720 | second hand of the game. Anyways. Wow. Don't worry, guys.
00:06:38.160 | It's back and I want to tell you they tripled up. She triples
00:06:43.360 | up and then into lockdown. The first time I ever played with
00:06:46.240 | her. And I stacked her right. Anyways, I don't want to reveal
00:06:49.840 | the game. But it was it was wonderful. This one. I show up
00:06:52.800 | at a mutual friend of ours game. And there's like beautiful
00:06:56.160 | Porsche or something in the driveway is a really notable
00:06:58.320 | car. And then I noticed the license plate says DJen. But it's
00:07:02.800 | spelled with a J. And I'm like, Oh, degenerate. What a great
00:07:05.520 | license plate. I wonder whose that is. I go, it's Jennifer
00:07:07.520 | Tilly. She is so cool. She's very charming. Great, very
00:07:12.880 | charming. Great actress. Great. She was in. That's what it was.
00:07:17.440 | Yeah. You don't have to ask me twice. Yeah, exactly. Exactly.
00:07:21.200 | What a great gangster film. Yeah. With Gina Gershon. I mean,
00:07:25.840 | Gina Gershon. And that's the one. Oh, my God. That film. That
00:07:29.040 | film. Oh, my God. Well, let's not get canceled here. Okay.
00:07:31.440 | Yeah. And it is quite a film. All right. Speaking of action.
00:07:37.040 | Big week, the AI industrial complex is dominating our
00:07:42.480 | docket here. Apologies to Biden, Ukraine and Nikki Haley. But we
00:07:45.600 | got to go AI right now. OpenAI launched GPT-4o on
00:07:51.360 | Monday, three days after Sam Altman came on All-In. As a
00:07:56.480 | programming note, and we'll go to Freeberg about this, we
00:07:59.600 | probably made a bit of a strategic or tactical error in
00:08:03.360 | not postponing his appearance. In fairness,
00:08:06.640 | Freeberg, Sam did tell us. Originally, he was coming on to
00:08:10.000 | talk about those things. But then it got pushed back. Anything
00:08:12.480 | you want to add to that as a programming note? Because people
00:08:15.520 | are wondering what happened. I've been talking with Sam for a
00:08:20.000 | while a year about coming on the show. And every time I see him,
00:08:23.360 | we're like, Hey, you should come on the show. He's like, I want
00:08:24.960 | to come on the show. Okay, let's find a date. We never got a date
00:08:27.680 | that worked. I saw him in March. And he said, Hey, I want to come
00:08:30.720 | on the show. I said, Okay, well, come on, let me know when works.
00:08:33.760 | And a couple of weeks later, he's like, what about this date
00:08:35.920 | in May? And I'm like, yeah, that's, that's fine. We can make
00:08:38.960 | that work. He's like, well, I've got a big announcement we're
00:08:41.840 | going to be doing. And I was like, perfect. Come on the show
00:08:44.160 | that that sounds great. And then the night before, he asked me,
00:08:49.440 | he told me he texted me like, hey, we're actually not going to
00:08:52.080 | have this announcement happen tomorrow. It's going to be
00:08:54.800 | delayed. He didn't tell me how long and I'm like, well, is it
00:08:56.640 | ChatGPT? Is it GPT-5? He's like, no, it's not GPT-5. And I was
00:09:00.240 | like, okay, well, you know, come on the show anyway, because he
00:09:02.800 | didn't tell me when he's doing the announcement or when it's
00:09:04.480 | being pushed to so it didn't seem like that big a deal. And I
00:09:07.040 | thought we were just going to be able to have a good chat anyway.
00:09:09.040 | So it's really unfortunate. I think the fact that the
00:09:11.200 | announcement happened two days after and he had to stay quiet
00:09:13.520 | about it during our interview. But that's the story. I think in
00:09:17.040 | the future, if someone says they've got a big announcement
00:09:18.960 | to do, we should probably push them back. If they don't,
00:09:23.280 | well, I don't think we're gonna be doing a lot of these
00:09:25.200 | interviews. Anyway, I think people clearly don't love them.
00:09:27.760 | And it's better for us to just kind of hang out and talk.
00:09:29.760 | I think I think if we had just gotten Sam on the day after the
00:09:35.680 | launch of GPT-4 Omni, as opposed to, what is it, three days
00:09:39.840 | before? Yeah, you could have talked much more freely about
00:09:42.560 | it. Yeah, it was supposed to happen same day. So it's
00:09:45.840 | unfortunate. That's all it worked out this way. A little
00:09:47.680 | trick is to say you can tell us under embargo. But my
00:09:51.040 | understanding is they were still doing the videos over the
00:09:54.560 | weekend. So I think those videos and stuff, they were still
00:09:57.440 | figuring them out. And so yeah, lesson learned. In terms of the
00:10:00.640 | interviews on the show. Just to recap for people, we've done a
00:10:04.400 | dozen, half of them have been presidential candidates.
00:10:07.280 | Sometimes they break out, sometimes they don't. But we
00:10:10.080 | follow our interest and our passion here on the pod. It's
00:10:13.920 | got to be interesting for us too. So we think this person is
00:10:16.480 | going to be interesting. We do it. And yeah, we understand you
00:10:18.480 | miss a news subject. But yeah, it is what it is.
00:10:21.520 | And to your point, a lot of the people that come on, and
00:10:25.680 | increasingly, a lot of people asked to come on because they
00:10:28.160 | know we're not journalists. And so for all of those folks that
00:10:31.520 | expect us to be journalists, that's not what we are. We're
00:10:36.640 | for entrepreneurs, we're for business people, we're for
00:10:39.200 | friends, we're for technologists, we're for curious people, we're for
00:10:42.080 | poker players. But we're not for journalists. And so we're
00:10:46.320 | going to ask whatever we feel like asking. Sometimes those
00:10:49.840 | things will touch a chord, because it's what you wanted to
00:10:52.000 | have asked. And sometimes we won't go to a place, whether we
00:10:56.800 | didn't have time to or whether we forgot or whether we chose
00:10:59.360 | not to. And I think it's important to have that
00:11:01.520 | disclaimer, like we have day jobs. And this is what we do to
00:11:06.880 | coalesce a bunch of information in the way that we're thinking
00:11:09.200 | about the world. So we are not journalists. So I think what
00:11:13.920 | that means is that if the guest doesn't want to talk about
00:11:17.120 | something, we're not going to start peppering him with gotcha
00:11:20.480 | questions and things like that. I appeared at a conference a
00:11:23.840 | couple of days ago, to promote glue, which we'll get to. And
00:11:27.760 | the first half of the conversation was like a normal
00:11:30.240 | conversation about what we were launching. And then the second
00:11:32.400 | half was basically the reporter peppering me with fastball
00:11:35.440 | questions, which is fine. I knew what I was signing up for. It's
00:11:38.400 | a totally different style. It's a totally different style than
00:11:40.960 | coming on the pod and having a normal conversation. But it's
00:11:43.440 | not really our job to make somebody open up if they don't
00:11:47.120 | want to talk.
00:11:47.600 | What was the spiciest question, Sax? What was the fastball?
00:11:50.480 | Anything come close to your head?
00:11:51.680 | No, I mean, it's not worth really getting into. You can
00:11:55.280 | watch it if I was just curious, like, look, I kind of like
00:11:58.560 | sometimes when reporters pitch me fastballs, because, yeah, you
00:12:01.760 | can strike out or you can hit it out of the park. Yeah. And
00:12:03.680 | they do that.
00:12:04.240 | That's an important part here. I think, you know, as a former
00:12:07.840 | editor in chief journalist myself, I sometimes like to ask,
00:12:11.680 | I would say a challenging question in a respectful way. I
00:12:14.720 | did that, for example, with Vivek, you know, just clarifying his
00:12:18.080 | thoughts on trans and gay rights, wasn't disrespectful,
00:12:22.560 | was thoughtful. Would you consider it spicy or hardcore? I
00:12:26.080 | don't think it was hardcore. He likes to talk about us.
00:12:28.400 | No, but that's because you asked it from a position of
00:12:30.480 | curiosity. You weren't trying to catch the guy.
00:12:32.800 | No, see the difference. I'm actually interested in his
00:12:34.720 | opinion. This is my point. That's why it comes out
00:12:37.120 | differently. And that's why I think people enjoy these
00:12:39.120 | conversations. And sometimes we don't get to the other kind of
00:12:42.560 | answer, because I'm not interested in trying to gotcha
00:12:45.360 | somebody that's working hard.
00:12:46.880 | I always have the same conditions when I do interviews,
00:12:49.120 | which is I don't clear questions. And I don't let
00:12:50.960 | people edit it. But you know, everybody's got a different view
00:12:53.520 | on how to do interviews and feel a difference. If you like it,
00:12:56.960 | you like it. If you like Lex Fridman's version, or Tim
00:13:00.160 | Ferriss's version, or you prefer, you know, Fox or CNN, go watch
00:13:04.240 | those interviews there, you can have a whole range of different
00:13:07.040 | interviews and interview styles available to you in the media
00:13:09.600 | landscape. We are but one. Sam Altman mentioned on the pod last
00:13:12.480 | week that the next big model might not be called GPT-5. So on
00:13:17.360 | Monday, they launched GPT-4o. The O stands for Omni. It's
00:13:22.560 | everything you love about tech. It's faster, it's cheaper, it's
00:13:25.520 | better. But from my perspective, the real show was the massive
00:13:29.840 | amount of progress they made on the UI/UX. The O stands for
00:13:34.160 | Omni, as in omnivore. It takes in audio, text, images, even your
00:13:38.000 | desktop, and video from your camera to inform what it's
00:13:41.520 | doing. You can consider it like 360-degree AI. Producer Nick
00:13:46.000 | will show a couple of videos while I describe them here
00:13:48.400 | before we go to the besties for the reaction to the
00:13:50.320 | announcement. First, they made great progress in solving the
00:13:54.320 | CB problem we mentioned last week. That's where like when you
00:13:57.040 | use Siri or any of these tools, you say, you know, "Hey, ChatGPT,
00:14:00.320 | what's two plus two over and you have to wait and then if you
00:14:03.040 | talk over each other, it breaks." They now have that
00:14:06.720 | working much smoother. They did an example of counting where
00:14:09.760 | they said speed up, slow down. They did a translator that
00:14:12.320 | worked really well. I would like you to function as a
00:14:15.120 | translator. I have a friend here who only speaks Italian and I
00:14:18.240 | only speak English. And every time you hear English, I want
00:14:21.120 | you to translate it to Italian. And if you hear Italian, I want
00:14:23.840 | you to translate it back to English. Is that good?
00:14:25.600 | Perfetto. Mike, io mi chiedo se le balene potessero parlare, cosa ci direbbero?
00:14:36.960 | Mike, she wonders if whales could talk, what would they tell us?
00:14:42.160 | They might ask, "How do we solve linear equations?"
00:14:46.000 | Potrebbero chiederci come risolviamo le equazioni lineari?
00:14:52.800 | Sicuramente sì.
00:14:56.480 | Certainly yes.
00:14:57.200 | Great. Looks like it works.
00:14:59.200 | I think Duolingo stock took a hit during that. Most impressive,
00:15:03.200 | to me at least, and also I had Sundeep Madra on my other
00:15:07.600 | podcast and we talked about it, was their desktop and iOS app.
00:15:11.520 | What this app does is fascinating. It watches your
00:15:14.560 | desktop on your Macintosh or on your iPad or iPhone and it does
00:15:21.600 | things like adaptive learning. Here's a clip of Sal Khan from
00:15:24.400 | Khan Academy and his son basically using a drawing app to
00:15:29.280 | do some arithmetic or geometry and it does adaptive learning
00:15:34.000 | and basically makes a personal coach because the app is
00:15:36.960 | watching his son try to solve a problem.
00:15:39.040 | I'm here with my son and I'd love you to tutor him on this
00:15:44.160 | math problem, but don't give him the answer. You can ask
00:15:47.040 | questions and nudge him in the right direction, but I really
00:15:49.200 | want to make sure he understands it himself. He's here in the
00:15:52.720 | room, so you can talk to him directly.
00:15:54.720 | Can you first identify which sides of the triangle are the
00:15:59.760 | opposite, adjacent, and hypotenuse relative to angle
00:16:04.480 | alpha?
00:16:04.980 | All right, so I'm pretty sure this is the angle alpha right
00:16:09.040 | here, right?
00:16:09.540 | Correct. Now, looking at the triangle, which side do you
00:16:15.360 | think is the hypotenuse?
00:16:17.360 | Um, I'm not totally sure. I think it might be this one, but
00:16:25.840 | I really am not sure. This side AC?
00:16:28.000 | You're close. Actually, side AC is called the adjacent side to
00:16:34.080 | the angle alpha. The hypotenuse is the longest side of a right
00:16:38.720 | triangle and is directly opposite the right angle. Can
00:16:42.640 | you find which one is the hypotenuse?
00:16:44.880 | Oh, OK, I see. So I think the hypotenuse is this really long
00:16:50.320 | side from A to B. Would that be correct?
00:16:52.560 | Exactly. Well done.
00:16:55.120 | It can also participate in Zoom calls, explain charts, all that
00:16:58.800 | great stuff. And so it's going to be your guide on the side.
00:17:01.760 | It's going to be a present, you know, personality while you're
00:17:05.360 | using your apps. It's really impressive, I have to say. So I
00:17:08.640 | guess let's start, Freeberg, with your takeaways on all of
00:17:12.720 | these innovations that we saw.
00:17:14.480 | I think it's become quite apparent that there's an
00:17:18.800 | evolution underway in model architecture. We've, and I think
00:17:24.160 | you may remember, we talked about this briefly with Sam last
00:17:26.720 | week, but we're moving away from these very big, bulky models
00:17:31.120 | that are released every couple of months or quarters and cost a
00:17:35.760 | lot of money to rebuild every time they get re-released
00:17:39.440 | towards a system of models. So this multimodal system basically
00:17:45.040 | leverages several models at once that work together or that are
00:17:49.520 | linked together to respond to the inputs and to provide some
00:17:54.240 | generative output, and that those individual models
00:17:57.120 | themselves can be continuously tuned and/or continuously
00:18:00.960 | updated. So rather than have, you know, hey, there's this big
00:18:04.240 | new release that just happened, this new model just got trained,
00:18:06.880 | cost $10 million to train it, it's been pushed, these models
00:18:09.840 | can be upgraded with tuning, with upgrade features, and then
00:18:14.000 | linked together with other new smaller models that are perhaps
00:18:16.320 | specialized for specific tasks like doing mathematics or
00:18:19.680 | rendering an image or rendering a movie. And so I think what
00:18:22.720 | we're going to see is soon more of an obfuscation of the
00:18:27.840 | individual models, and more of this general service type
00:18:32.400 | approach, where the updates are happening in a more continuous
00:18:35.760 | fashion. I think this is the first step of OpenAI taking
00:18:38.880 | that architectural approach with GPT-4o. And what's behind the
00:18:44.080 | curtains, we don't know. We don't know how many models are
00:18:46.000 | there. We don't know how frequently they're being
00:18:47.680 | changed, whether they're being changed through actually
00:18:49.520 | upgrading the parameters, or whether they're being fine-
00:18:52.160 | tuned. And so this seems to be pretty obvious.
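One way to picture the "system of models" idea Friedberg describes here is a thin dispatcher sitting in front of several smaller, independently swappable models. This is only an illustrative sketch; the model names, the classify() heuristic, and the wiring are all hypothetical, not anything OpenAI has disclosed.

```python
# Toy sketch of a "system of models" rather than one monolithic model.
# Every model name and the classify() heuristic below are made up; the point
# is only that each specialist can be re-tuned or replaced independently.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Specialist:
    name: str                   # e.g. "math-small-v3" (hypothetical)
    task: str                   # the kind of request it handles
    run: Callable[[str], str]   # the underlying model call

def classify(request: str) -> str:
    """Stand-in intent classifier; in a real system this could itself be a small model."""
    lowered = request.lower()
    if any(tok in lowered for tok in ("solve", "equation", "+", "=")):
        return "math"
    if "draw" in lowered or "image" in lowered:
        return "image"
    return "general"

def route(request: str, specialists: dict[str, Specialist]) -> str:
    # Pick the specialist for this request; fall back to the general model.
    model = specialists.get(classify(request), specialists["general"])
    return model.run(request)

# Dummy wiring: each lambda stands in for a real model call.
specialists = {
    "general": Specialist("general-llm", "general", lambda r: f"[general] {r}"),
    "math":    Specialist("math-llm",    "math",    lambda r: f"[math] {r}"),
    "image":   Specialist("image-gen",   "image",   lambda r: f"[image] {r}"),
}
print(route("solve 2x + 3 = 7", specialists))   # routed to the math specialist
```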
00:18:55.200 | If you look at this link, one of the criticisms that initially came
00:18:59.040 | out when they released GPT-4o was that there was some
00:19:04.320 | performance degradation. And Stanford actually runs this
00:19:08.320 | massive multitask language understanding assessment. And
00:19:13.040 | they publish it, I think daily, or pretty frequently on how all
00:19:16.240 | the models perform. And you can see the scorecard here, that
00:19:19.120 | GPT-4o actually outperforms GPT-4. And so this goes
00:19:23.360 | counter to some of the narrative that in order to get
00:19:25.200 | some of the performance improvements and speed
00:19:26.800 | improvements they got in 4o, that they actually made the
00:19:29.840 | model worse. And it seems actually the opposite is true,
00:19:32.000 | that the model's gotten slightly better. It still
00:19:33.520 | underperforms Claude 3 Opus, which you can see here
00:19:37.280 | ranks top of these charts, but there's lots of different
00:19:38.880 | charts; all the companies publish their own charts, they all
00:19:41.040 | claim that they're better than everyone else. But I like
00:19:43.120 | Stanford because it's independent.
00:19:44.320 | Chamath, any thoughts after seeing it and in combination
00:19:47.440 | with our interview? Do you think ChatGPT is running away
00:19:51.040 | with the consumer experience? Or do you think this is like
00:19:55.120 | neck and neck with some of the other players?
00:19:57.120 | Not to tell tales out of school, but somebody that we
00:20:00.000 | all know in our group chat, posted something about the fact
00:20:03.040 | that the consumer growth had stalled. I don't know how they
00:20:07.040 | knew that, that they maybe they got some data or maybe
00:20:10.400 | they're an investor. You guys know what I'm talking about.
00:20:13.120 | And, and they said that they're trying to reinvigorate
00:20:15.600 | growth in the consumer app at OpenAI. I mean,
00:20:19.440 | Any insights as to why it might be plateauing in your
00:20:21.920 | perspective?
00:20:22.420 | I wrote this in my annual letter. But there are these
00:20:28.080 | phases of growth. And when you look at like social networks
00:20:32.400 | as a perfect example, Friendster was magical when it
00:20:35.120 | was first created. Right. And then you had MySpace that
00:20:38.480 | just ran circles around them, because Friendster didn't
00:20:41.040 | really invest the money and the quality that it took to,
00:20:45.440 | create a moat. And then MySpace really wasn't able to
00:20:49.680 | compete. So we were, you know, Facebook, we were the eighth
00:20:52.160 | or ninth when we showed up on the scene, and we ran circles
00:20:54.480 | around everybody. I think what it means is that there are
00:20:58.720 | these phases of product development, which exist in many
00:21:02.960 | markets, this market, I think, is going through the same
00:21:05.360 | thing. And right now, we're in the first what I would call
00:21:07.760 | primordial ooze phase, which is everybody's kind of like
00:21:11.280 | running around like a chicken with their heads cut off.
00:21:13.520 | There's all these core basic capabilities that are still so
00:21:17.200 | magical when you see them. But we all know that five and 10
00:21:20.320 | years from now, these things will be table stakes, right.
00:21:22.960 | And what Freeberg just showed is a table of many companies
00:21:28.000 | and many trillions of market cap, all effectively running to
00:21:31.600 | the same destination. So I think where we are is probably
00:21:35.200 | within two years of where the basic building blocks are
00:21:39.040 | standardized. And then I think the real businesses get built.
00:21:42.720 | So I will maintain my perspective here, which is the
00:21:46.560 | quote unquote, Facebook of AI has yet to be created.
00:21:49.440 | Okay. And here it is, ChatGPT web visits, as you can see,
00:21:54.160 | have plateaued. This data is from Similarweb. I would agree with
00:21:58.480 | you, Chamath, it seems like the use cases, and the lookie-loos
00:22:02.560 | who were just trying the software, because they heard
00:22:05.440 | about it. They've gone away. And then we have to find actual
00:22:08.880 | use cases. Sacks, I'm wondering. But our friend, Jason, just to
00:22:12.480 | kind of complete that said something about the premium
00:22:14.400 | conversion, right? That's what he said. I don't know how he
00:22:16.160 | knows. Paid, paid. So to be clear, paid versus free. And
00:22:20.240 | then what Sam said on the podcast last week was, it seems
00:22:24.080 | like whenever they come out with something new, the old stuff
00:22:26.800 | becomes free. In my talk with Sonny this week, he mentioned
00:22:30.240 | that these new models are so much more efficient that you
00:22:34.080 | can actually throw the old model in the garbage because
00:22:37.680 | it's so inefficient. And these are now becoming about 90%
00:22:42.240 | cheaper every year, which means every two years, these things
00:22:44.320 | are gonna be 99% cheaper and better.
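The compounding behind that claim is straightforward: a 90% annual price drop means each year costs one tenth of the year before, so two years stack to one hundredth of the starting price, which is the 99% figure.

$$
c_n = c_0 \times (0.1)^n \qquad\Rightarrow\qquad c_2 = (0.1)^2\,c_0 = 0.01\,c_0 \;\;(\text{99\% cheaper than } c_0).
$$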
00:22:49.280 | Yep. And it might be amazing. So, Sacks, on a strategic level, is OpenAI going to make all this
00:22:54.720 | free, or close to free, and maybe just charge for a multiplayer
00:22:57.920 | version? That seems to be where it's heading. You don't have to
00:23:01.120 | log in to use 3.5. You don't have to log in to use Google
00:23:05.520 | search. No, you do have to log in still on Google's services. But I
00:23:08.160 | think these are going to just be free. So on a product basis,
00:23:10.640 | what are your thoughts? And then maybe could talk about free to
00:23:13.200 | pay? Do you think everybody in the world is going to pay 20 30
00:23:16.160 | 40 bucks 500 a year 200 a year to have one of these? Or are
00:23:19.520 | they just going to all be free? Well, I think you're assuming
00:23:22.400 | there that the long-term business model of OpenAI is
00:23:25.280 | B2C subscriptions, and I think that's probably the least
00:23:29.680 | attractive business model they have available to them. It's
00:23:32.560 | sort of the first one and the most obvious one because they
00:23:35.280 | put out ChatGPT, and then it's pretty easy just to roll out a
00:23:38.080 | premium version. But in my experience, B2C subscriptions,
00:23:41.600 | it's just not a very attractive business model, because
00:23:44.480 | consumers just aren't willing to pay a lot, and they have high
00:23:46.880 | churn rates, and there's no possibility of expansion, really.
00:23:50.240 | So I suspect they're going to move in more of a b2b direction
00:23:53.760 | over time, because that's where the real money is. And probably
00:23:57.200 | the way they do that is by monetizing all the apps that are
00:24:01.200 | built on top of it. And I think that in that sense, GPT-4o is a
00:24:06.880 | really important innovation. By the way, the O stands for
00:24:11.840 | Omni, which I think stands for omnichannel. I think you may
00:24:15.040 | have said Omnivore, which is kind of funny.
00:24:17.120 | Yes, it's Omni. Yeah, which means all the different media
00:24:19.920 | types are currently coming in, right? Like, that's
00:24:23.520 | the difference. It's not like you just give it an image or
00:24:25.360 | give it a video. It's absorbing all those at the same time in
00:24:28.480 | parallel, I believe.
00:24:29.600 | That's right. So there's three big innovations with this model,
00:24:31.920 | right? So one is Omni channel, which means text, audio, video
00:24:35.760 | and images. Second, it's more conversational, like it
00:24:40.240 | understands the tone of people talking and understand sort of
00:24:44.480 | sentiment in a way it didn't before. And then the third
00:24:47.360 | thing, which is really important is that it's just much faster
00:24:50.320 | and more performant than the previous version, GPT-4 Turbo.
00:24:53.840 | In the speed test, they say it's twice as fast, we've played with
00:24:56.400 | it at glue, we can talk about that in a minute. And it feels
00:24:59.360 | 10 times as fast, it is much faster. But it's the combination
00:25:02.720 | of all three of these things that really makes some magical
00:25:06.080 | experiences possible. Because when you increase the speed of
00:25:10.000 | processing, you can now actually have conversations within a much
00:25:13.360 | more natural way. Before, the models were just too
00:25:17.120 | slow. So there'd be a long delay after every prompt. Yeah. So now
00:25:22.560 | like you showed, it can do things like you point the camera
00:25:25.440 | at a blackboard or something with math equations on it. And
00:25:28.880 | it can walk you through how to solve that problem. Or two
00:25:32.800 | people can be talking and it does real time translation. You
00:25:36.480 | know, there's that old saying that every Star Trek technology
00:25:38.880 | eventually becomes true. They've just basically invented
00:25:41.360 | the whole natural language real time. Yes. Universal
00:25:44.480 | translator. Yeah. So anyway, so those are some interesting use
00:25:47.920 | cases. But I just think they're going to be able to unleash a
00:25:51.040 | whole lot of new applications. And if they're metering the
00:25:55.520 | usage of the models and providing the best dev tools, I
00:25:58.880 | think there is a business model there.
00:26:00.320 | This thing is moving so fast. They're in like Leonardo
00:26:02.960 | DiCaprio mode, every two years, they throw the old model away.
00:26:07.040 | Okay, let's keep going. Thank you, Sacks. But I mean,
00:26:10.480 | did you write that ahead of time?
00:26:13.680 | One point on that is, there are a whole bunch of startups out
00:26:22.560 | there that we're creating virtual customer support agents.
00:26:26.800 | And they've been spending the last couple of years working on
00:26:30.240 | trying to make those agents more conversational, quicker, more
00:26:34.880 | responsive. I think their product roadmaps just became
00:26:38.080 | obsolete. Now, that's not to say there isn't more work for them
00:26:41.360 | to do in workflow in terms of integrating the AI with customer
00:26:46.240 | support tools and doing that last mile of customizing the
00:26:50.960 | model for the vertical specific problems of customer support.
00:26:54.640 | But my guess is that hundreds of millions of dollars of R&D just
00:27:00.240 | went out the window. And probably this is the best time
00:27:03.520 | to be creating a customer support agent company. If you're
00:27:05.600 | doing it two years ago, five years ago, your work has just
00:27:08.720 | like been obsoleted. Well, I mean, that is the thing with this pace, like,
00:27:11.840 | you know, you used to have to throw away client server stuff
00:27:14.400 | or, you know, whatever, you had a web based thing, you get an
00:27:17.360 | app out, you throw away some of the old code. But this is like
00:27:19.920 | every 18 months, your work has been replaced.
00:27:22.720 | If you're an app developer, the key thing to understand is where
00:27:26.320 | does model innovation end and your innovation begin? Because
00:27:30.640 | if you get that wrong, you'll end up doing a bunch of stuff
00:27:33.600 | that the model will just obsolete in a few months.
00:27:36.000 | I think you're totally right. I think that's such a really
00:27:38.320 | important observation. That's why I think the incentive for
00:27:40.960 | these folks is going to be to push this stuff into the open
00:27:43.360 | source. Because if you solve a problem, that's
00:27:47.200 | operationally necessary for your business, but it isn't the
00:27:50.400 | core part of your business. What incentive do you have to really
00:27:54.560 | keep investing in this for the next five and 10 years to
00:27:56.880 | improve it. You're much better off, like Klarna, for example,
00:27:59.440 | right, we talked about the amazing improvement and savings
00:28:03.360 | that Klarna had by improving customer support. Release it in
00:28:07.120 | the open source, guys, let the rest of the community take it
00:28:09.840 | over so that it's available to everybody else. Otherwise,
00:28:12.880 | you're going to be stuck supporting it. And then if and
00:28:15.520 | when you ever wanted to switch out a model, you know, GPT-4
00:28:20.000 | or 4o to Claude to Llama, it's going to be near impossible,
00:28:24.160 | and it's going to be costly. So I also think sacks the
00:28:27.360 | incentive to just push towards open source in this market, if
00:28:32.160 | you will, is so much more meaningful than any other
00:28:34.480 | market. Yeah, I mean, listen, you were there when I think you
00:28:38.000 | were there at Facebook when they did the open compute project,
00:28:41.120 | and they just were like, sorry, guys, talk about talk about
00:28:44.080 | torching an entire market. Explain what it is. So there was
00:28:47.680 | this moment where when you were trying to build data centers,
00:28:51.760 | you'd have these, like, 1U rack-mounted kind of like
00:28:54.880 | machines that you use. And what Facebook observed was there was
00:28:58.560 | only a handful of companies that provided it. And so it was
00:29:01.200 | unnecessarily expensive. And so Facebook just designed their
00:29:04.720 | own and then release the specs online just kind of said, here
00:29:07.920 | it is. And they went to these Taiwanese manufacturers and
00:29:10.800 | other folks and said, please make these for your cost plus a
00:29:14.000 | few bucks. And it was revolutionary in that market,
00:29:17.760 | because it allowed this open platform to sort of embrace this
00:29:22.000 | very critical element that everybody needs. And I think
00:29:25.120 | there's going to be a lot of these examples inside of AI,
00:29:29.600 | because the costs are so extreme, so much more than just
00:29:33.040 | building a data center for a traditional web app, that the
00:29:35.840 | incentives to do it are just so so meaningful. Yeah, and I just
00:29:39.600 | showed it on the screen. Sacks, you've actually been dancing
00:29:42.240 | along this line. Last night, I was using your new slack killer
00:29:46.160 | or coexist or I'm not sure it feels like a slack killer to me
00:29:48.560 | because I'm moving my company to it. Over the weekend, we're
00:29:51.280 | moving to Glue. And while I was using it, I think I may
00:29:55.920 | need to wet my beak on this one. We want you to wet your beak.
00:29:59.600 | It feels like a 100-bagger to me. I'm in. Killer, killer. Yes,
00:30:06.240 | because you got can you do that again in Christopher Walken
00:30:09.040 | voice, please. I get to wet my beak. It's like 100 x slide in
00:30:14.480 | the 500. Wow. Tell me about product decisions. Where does
00:30:21.520 | the AI end? And your product begin? Yeah, well, it's a good
00:30:26.960 | point. I mean, I think where the AI ends, we want to use the
00:30:30.480 | most powerful AI models possible. And we wanted to
00:30:33.760 | focus on enterprise chat. So you could think of us as for sure
00:30:37.760 | a Slack killer, Slack competitor. Let's say that Slack wasn't built
00:30:41.200 | for the AI era; Glue is AI native. What does that mean? No
00:30:45.440 | channels. You know, I showed this to Chamath, the first thing
00:30:47.600 | he said is you had me at no channels, right? People are so
00:30:50.480 | sick of channels, you have to keep up with all these hundreds
00:30:52.480 | and hundreds of channels. And the real problem with channels
00:30:54.480 | is there's one thread in a channel that you want to see. In
00:30:58.160 | order to see it, you have to join the whole channel. And now
00:31:00.320 | you're getting all this noise. People just want the threads. So
00:31:03.600 | if you look at what's the chat model inside of ChatGPT, it's
00:31:06.720 | just threads, right? You create a topic-based thread in
00:31:10.880 | ChatGPT, the AI comes up with a name for it, puts it in the sidebar.
00:31:15.680 | And then if you want to talk about something else, you create
00:31:17.520 | a new chat. That's exactly the way that glue works. It's just
00:31:20.240 | multiplayer. You just put the groups and individuals you want
00:31:23.520 | on the thread. Let me just show you real quick. Here's my glue
00:31:27.040 | here. And you can see that in the sidebar, I've got all the
00:31:29.120 | threads that I've been involved in. And like I said, you can
00:31:31.920 | address them to multiple people or groups. And then you've got
00:31:34.720 | the chat here. Now, we've also fully integrated AI. And so Nick
00:31:39.920 | who's our producer, just in this thread said, at Glue AI, what
00:31:43.600 | countries does Sacks talk about most in episodes? Episodes is a
00:31:47.280 | group we created to be the repository of all of the
00:31:49.760 | transcripts of our episodes. And so Glue did a search and it
00:31:55.280 | said David Sacks frequently discusses Ukraine the
00:31:57.920 | most. Yeah, really. Then Nick said, be more specific
00:32:02.080 | about Sacks's stance on the Ukraine-Russia war. Oh, boy. You're
00:32:05.840 | gonna basically overload the server. Well, here it said,
00:32:09.360 | David Sacks has articulated a nuanced and critical perspective
00:32:11.600 | on the Ukraine-Russia war across various episodes of the
00:32:13.600 | all in pod. Here's some key points encapsulating his stance.
00:32:16.480 | And it like nailed it. It's talked about prevention through
00:32:20.240 | diplomacy, opposition to NATO expansion, humanitarian
00:32:22.880 | concerns, skepticism, military intervention, peace deal
00:32:26.160 | proposal. You know, I'll copy and paste this onto Twitter x
00:32:30.480 | later today. But the point is, it like nailed it across all
00:32:34.080 | these different episodes. And then this is a feature of glue.
00:32:36.720 | It provided sources. So it cites where it got all the
00:32:40.640 | information from. So imagine, you know, we're doing this for
00:32:44.320 | the all in pod. But you could imagine that instead of it being
00:32:47.200 | transcripts of a podcast, it could be your work documents,
00:32:50.800 | you now have in your main chat, the ability just to ask, hey,
00:32:54.800 | at Glue AI, remind me where we left that project or tell me
00:32:59.840 | who the expert is on this subject matter, or who's
00:33:02.560 | contributed the most to this project. I've actually figured
00:33:05.120 | out using Glue AI who's contributed the most deal flow
00:33:08.000 | at Craft. It's pretty amazing. Now let me show you like,
00:33:10.640 | I'm not gonna say here. I think it's Brian. Okay, Brian's my
00:33:16.560 | guy. Those two guys, Daniel and Brian. I'll get Sacks on. Okay, you
00:33:24.960 | know, Daniel, I'll hire Brian, reconstitute Craft. We'll call it
00:33:30.400 | Raft.
00:33:30.960 | Sacks, I just looked at the GPT-4 Omni server. And when
00:33:40.880 | you did that query, it actually rolled its eyes. Keep going.
00:33:43.920 | Okay. So we talked about how GPT-4o understands
00:33:50.000 | sentiment in conversations. And I think this is a really good
00:33:52.480 | example. So Evan, who's my co-founder at Glue, was just testing
00:33:55.840 | this for us. And he asked, What is each besties personality
00:33:59.680 | type based on conversations and episodes. And first of all, it
00:34:03.360 | figured out who the besties are, like we didn't program that it
00:34:06.000 | just figured it out on its own, which is pretty impressive. And
00:34:09.760 | here's what it said about each bestie's personality. So it says
00:34:12.560 | Jason Calacanis: role, the moderator and entertainer,
00:34:15.840 | personality traits, charismatic, humorous, lively and the glue of
00:34:19.440 | the group. Same phrase, often praised for his ability to make
00:34:22.480 | the podcast entertaining, and keep the conversation flowing.
00:34:25.840 | Jason's energy and ability to bring humor and discussions are
00:34:27.920 | key elements that define his role in the podcast comments.
00:34:30.000 | He's acknowledged for his role in bringing the group together.
00:34:32.080 | Chamath Palihapitiya the visionary and hustler, personality
00:34:35.760 | traits, insightful, driven, persistent, Chamath's relentless
00:34:38.720 | drive and ability to hustle are admired by his peers, making him
00:34:41.120 | a core part of the podcast intellectual discussions. I'll
00:34:43.200 | skip me, but it says Freeberg: the pragmatic scientist and realist,
00:34:46.880 | pragmatic, methodical and a bit reserved. Oh, Freeberg's often
00:34:50.320 | the voice of reason, bringing a scientific and realistic
00:34:52.400 | perspective to the discussion, he focuses on analysis, logical
00:34:56.080 | reasoning. And then it cites where it got this from. And it
00:34:59.520 | says here, overall, the dynamic between the four besties creates
00:35:01.760 | a well rounded and engaging podcast with each member bringing
00:35:04.800 | their unique strengths and personality traits to the table.
00:35:07.360 | I think that's pretty incredible.
00:35:08.720 | How woke is this? Have you put any rails on, or is it
00:35:12.640 | just pure GPT-4o combined with the data? Yeah,
00:35:18.080 | yeah. So what we're doing here is we're wrapping GPT-4o
00:35:21.200 | with Glue features that we've implemented to get the
00:35:26.160 | most out of the conversation. There are things we have to do to
00:35:28.560 | scope the prompt. And then we're using a retrieval-
00:35:33.200 | augmented generation service called Ragie, which does RAG as
00:35:37.600 | a service that basically slurps in our transcripts and makes
00:35:41.280 | them accessible to the AI. So that's basically the stack that
00:35:45.440 | we're using. But as the models get better and better, Glue just gets better and better.
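In other words, the Glue AI answer flow Sacks is describing is a retrieve-then-generate loop: pull the transcript passages most relevant to the question, hand them to GPT-4o as context, and ask it to cite them. A minimal sketch of that kind of RAG loop, assuming the OpenAI Python client and a hypothetical retrieve_chunks() helper standing in for the retrieval service (this is not Glue's actual code):

```python
# Sketch only: retrieve-then-generate over a transcript corpus.
# Assumes the OpenAI Python client; retrieve_chunks() is a hypothetical
# placeholder for whatever retrieval/RAG service indexes the transcripts.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def retrieve_chunks(question: str, k: int = 5) -> list[str]:
    """Placeholder: return the k transcript passages most relevant to the question."""
    raise NotImplementedError("swap in your retrieval backend here")

def answer_with_sources(question: str) -> str:
    chunks = retrieve_chunks(question)
    # Number the excerpts so the model can cite them like [1], [2], ...
    context = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(chunks))
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Answer using only the numbered excerpts below, "
                        "and cite them like [1].\n\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

# e.g. answer_with_sources("What countries does Sacks talk about most in episodes?")
```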
00:35:47.600 | Can I just make a comment
00:35:50.320 | on this? It's just so clean. J-Cal was the key for me in
00:35:54.960 | abandoning Slack. He told me two or three years ago, he called
00:36:00.560 | me and he said, and he can tell you the exact channels, to
00:36:04.400 | eliminate some channels that were random. There were like two
00:36:08.320 | or three channels, like the random channel, that your Slack
00:36:10.720 | instance wasn't allowed to have. And I was like, this is genius.
00:36:14.240 | And I went in and I was like, all of our companies should just
00:36:16.480 | eliminate these channels. And we could only get like 20% or 30%
00:36:21.040 | compliance. But it really started to turn me off slack
00:36:23.520 | because I would get caught in these threads that were just so
00:36:26.640 | totally useless. And I thought, why aren't people working? And
00:36:31.040 | this is really great, because you cannot blather on about
00:36:33.840 | nonsense in Glue, which I find really useful. Well, this is
00:36:36.720 | what happens with Slack, and we use it at 8090, just so you know,
00:36:40.000 | so we got into it early. When you get into Slack too
00:36:43.280 | much, people start to think Slack is the job. And replying
00:36:46.720 | to slacks and having conversations is the job when
00:36:48.720 | there's actually a job to be done. There's a job to be done.
00:36:51.200 | Yeah. And so it's important. And what I liked about this
00:36:53.840 | implementation facts was it's like the ability to make a feed
00:36:58.160 | or a data source inside of your communication platform. So the
00:37:03.040 | fact that you imported all of the episodes and the transcripts
00:37:06.560 | is great. But what I want is like our HubSpot or our cell
00:37:10.320 | CRM. I want our Zen desk, I want our LinkedIn jobs and our
00:37:15.840 | LinkedIn job applications. I want our notion I want a coda to
00:37:19.040 | each have the ability and when I was using it last night, what
00:37:22.320 | you do is you use the at symbol to evoke and to summon in a way
00:37:27.680 | it's like summoning Beetlejuice. So you summon your AI, but then
00:37:31.280 | you tell it what data set you want to go after. So you say,
00:37:35.120 | you know, at AI, let's talk about, I don't know, how do you
00:37:40.560 | manage your deal flow at Craft? Do you use software like CRM
00:37:43.920 | software to manage deals? Brian, we just do it, you know, we do
00:37:47.600 | it all in Glue. So it's ready right there. But you're right.
00:37:49.840 | So the first thing that Glue AI has access to is all of your
00:37:53.520 | chat history, which is amazing. Because, you know,
00:37:57.040 | we can look at all your attachments. And we've got, I
00:38:00.000 | think, six integrations at launch, there'll be more. So
00:38:02.400 | yeah, like all of your enterprise data will be there.
00:38:04.480 | In the short term, you're right, you have to summon the
00:38:06.320 | repository by app mentioning because the AI needs a little
00:38:09.040 | bit of help of where to look. But in the future, it's going to
00:38:12.160 | figure it out on its own. So it's gonna become more and more
00:38:14.800 | seamless, but it'll insert itself. So we have a discussion
00:38:17.600 | about sales. And then you might have a sales bot that says, Hey,
00:38:20.560 | by the way, nobody's called this client in three months.
00:38:23.680 | Well, that's where I want to go with it is I call that
00:38:25.680 | promptless, which is I want the AI just to chime in when it
00:38:29.520 | determines that it has relevant information and can help the
00:38:32.560 | team, even if it hasn't been summoned yet. But we need some
00:38:36.080 | model improvement for that, frankly, I mean, we'll be able
00:38:38.080 | to get there by GPT five. But that's totally where this is
00:38:40.960 | headed. I'll show you just one more fun example. If I could,
00:38:44.320 | let me just show you this. So I asked it to write a letter to
00:38:48.800 | Lena Khan, to be a guest at the all in summit. And I told it
00:38:53.440 | mentioned positive things we've said about Lena Khan in episodes
00:38:58.560 | of the all in pod. And so it wrote this letter, Dear Chair
00:39:02.800 | Khan, we hope this message finds you well on behalf of the host
00:39:05.680 | the all in pod, we're excited to send an invitation for you to
00:39:07.760 | speak at the upcoming all in summit. And then it says, in our
00:39:11.760 | conversations, we have frequently highlighted your
00:39:13.520 | impressive credentials, and the impactful work you've
00:39:15.920 | undertaken. For example, in episode 36, we acknowledged your
00:39:19.600 | trailblazing role. And so the letter was able to quote
00:39:23.680 | episodes of the all in pod, just without anyone having to go do
00:39:27.520 | that research and figure out like what would be the best
00:39:29.520 | because I told it only say positive things don't say
00:39:32.000 | anything negative. Right? And then it said warm regards. And
00:39:35.120 | it said who the four besties were, again, we never told it
00:39:37.520 | who the besties are. We just said, write us a letter. So
00:39:41.200 | it's pretty incredible. Now, this is an example of the all in
00:39:44.480 | pod or think about any work context, where the AI has access
00:39:48.400 | to your previous work documents. It's pretty amazing what it can
00:39:52.480 | do. Well, I mean, it is kind of in the name, like this is Glue,
00:39:56.320 | it puts you together. And Slack is where you slack off. Makes total
00:39:59.120 | sense. The brands give you a little bit of a tip. We should
00:40:02.000 | have seen it coming with slack. Totally.
00:40:04.320 | We have a breaking news story. It's a great story. I got
00:40:13.040 | breaking news coming in. Friedberg, your life's work.
00:40:17.120 | Saks did his product review. Now it's your turn, Friedberg.
00:40:21.760 | We got breaking news coming in. I did promise you that when
00:40:24.960 | Ohalo decides to come out of stealth, and explains what we've
00:40:29.520 | done and what we're doing. I would do it here on the all in
00:40:32.960 | pod first, before the all in exclusive. So basically, by the
00:40:39.280 | time this pod airs, we're going to be announcing what Ohalo has
00:40:45.360 | been developing for the past five years and has had an
00:40:48.080 | incredible breakthrough in, which is basically a new
00:40:50.640 | technology in agriculture. And we call it boosted breeding. I'm
00:40:55.120 | going to take a couple minutes just to talk through what we
00:40:57.760 | discovered, or invented at Ohalo and why it's important. And the
00:41:03.920 | kind of significant implications for it. But basically, five
00:41:07.360 | years ago, we had this theory that we could change how plants
00:41:12.960 | reproduce. And in doing so, we would be able to allow plants to
00:41:18.720 | pass 100% of their genes to their offspring rather than just
00:41:22.800 | half their genes to their offspring. And if we could do
00:41:25.760 | that, then all the genes from the mother and all the genes
00:41:28.320 | from the father would combine in the offspring, rather than just
00:41:31.760 | half the genes from the mother and half the genes from the
00:41:33.840 | father. And this would radically transform crop yield, and
00:41:38.080 | improve the health and the size of the plants, which could have
00:41:41.760 | a huge impact on agriculture, because yield the size of the
00:41:45.440 | plants ultimately drives productivity per acre, revenue
00:41:48.960 | for farmers, cost of food, calorie production,
00:41:51.360 | sustainability, etc. So this image just shows generally how
00:41:54.560 | reproduction works. You've got two parents, you get a random
00:41:59.040 | selection of half of the DNA from the mother, and a random
00:42:02.960 | selection of half the DNA from the father. So you never know
00:42:05.120 | which half you're going to get from the mother, or which half
00:42:07.200 | you're going to get from the father. That's why when people
00:42:09.680 | have kids, every kid looks different. And then those two
00:42:12.400 | halves come together and they form the offspring. So every
00:42:15.120 | time a new child is born, every time a plant has offspring, you
00:42:19.200 | end up with different genetics. And this is the problem with
00:42:22.960 | plant breeding. Let's say that you have a bunch of genes in one
00:42:26.400 | plant that are disease resistant, a bunch of genes and
00:42:28.560 | the other plant that are drought resistant, and you want to try
00:42:31.040 | and get them together. Today, the way we do that in
00:42:33.840 | agriculture is we spend decades trying to do plant breeding
00:42:37.520 | where we try and run all these different crosses, find the ones
00:42:40.400 | that have the good genes, find the other ones that have the
00:42:42.000 | good genes and try and keep combining them. And it can take
00:42:44.400 | forever and it may never happen that you can get all the good
00:42:47.200 | genes together in one plant to make it both disease resistant
00:42:50.960 | and drought resistant. So what we did is we came up with this
00:42:55.280 | theory that we could actually change the genetics of the
00:42:57.920 | parent plants, we would apply some proteins to the plants, and
00:43:02.000 | those proteins would switch off the reproductive circuits that
00:43:06.640 | cause the plants to split its genes. And as a result, the
00:43:10.560 | parent plants give 100% of their DNA to their offspring. So the
00:43:15.040 | offspring have double the DNA of either parent, you get all the
00:43:18.480 | genes from the mother all the genes from the father. And
00:43:21.440 | finally, after years of toiling away and trying to get this
00:43:24.800 | thing to work and all these experiments and all these
00:43:26.800 | approaches, we finally got it to work. And we started
00:43:30.400 | collecting data on it. And the data is ridiculous. Like the
00:43:33.920 | yield on some of these plants goes up by 50 to 100% or more.
00:43:37.920 | Just to give you a sense, like in the corn seed industry,
00:43:41.280 | breeders that are breeding corner spending $3 billion a
00:43:44.960 | year on breeding, and they're getting maybe one and a half
00:43:47.520 | percent yield gain per year. With our system, we are seeing
00:43:50.880 | 50 to 100% jump in the size of these plants. It's pretty
00:43:53.600 | incredible. Here's an example. This is a little weed
00:43:56.320 | that you do experiments with in agriculture, called Arabidopsis.
00:43:59.920 | So it's really easy to work with. And you can see that what
00:44:01.760 | we have on the top are those two parents A and B. And then we
00:44:05.600 | applied our boosted technology to them, and combined them. And
00:44:08.880 | we ended up with that offspring called boosted AB, you can see
00:44:11.120 | that that plant on the right is much bigger, it's got bigger
00:44:13.040 | leaves, it's healthier looking, etc.
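A rough back-of-envelope on the breeding numbers quoted above: at roughly 1.5% yield gain per year from conventional corn breeding, a one-time 50 to 100% jump is equivalent to decades of compounded progress. Here is a minimal Python sketch, assuming simple geometric compounding; it is an illustration using only the figures mentioned in the conversation, not Ohalo's own analysis.

```python
# Back-of-envelope: how many years of ~1.5%/year conventional breeding gains
# (the corn figure quoted above) it would take to compound up to a one-time
# 50% or 100% yield jump. Purely illustrative arithmetic.
import math

annual_gain = 0.015  # ~1.5% yield gain per year from conventional breeding

for jump in (0.50, 1.00):  # the 50-100% boosted-yield jumps described above
    years = math.log(1 + jump) / math.log(1 + annual_gain)
    print(f"A one-time {jump:.0%} jump is roughly {years:.0f} years of {annual_gain:.1%}/yr gains")
```

At 1.5% per year, that works out to roughly 27 years to match a 50% jump and roughly 47 years to match a 100% jump.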
00:44:14.640 | For your question, does that mean that the boosted one has
00:44:18.000 | twice the number of chromosomes as a and b?
00:44:20.560 | Exactly right.
00:44:22.320 | So is that like a new species, then?
00:44:23.920 | Yeah, so it's hard to survive with twice the number of
00:44:28.080 | chromosomes.
00:44:28.800 | Yeah, it's, it's called polyploidy. So we actually see
00:44:31.280 | this happen from time to time in nature. For example, humans
00:44:34.960 | have two sets of chromosomes, right? So does corn, so do many
00:44:38.560 | other species. Somewhere along the evolutionary history, wheat
00:44:43.520 | doubled, and then doubled again, and you end up actually in wheat
00:44:47.440 | having six sets of chromosomes. Wheat is what's called a
00:44:51.280 | hexaploid. Potatoes are a tetraploid, they have four sets
00:44:54.160 | of chromosomes. And strawberries are an octaploid, they have
00:44:57.280 | eight. And some plants have as many as 24 sets of chromosomes.
00:45:00.640 | So certain plant species have this really weird thing that
00:45:03.520 | might happen from time to time in evolution where they double
00:45:05.920 | their, their DNA naturally. And so what we've effectively done
00:45:09.440 | is just kind of applied a protein to make it happen and
00:45:13.120 | bring the correct two plants together when we make it happen.
00:45:16.240 | And so this could only happen for a plant, right? This can
00:45:18.720 | never happen with an animal.
00:45:20.080 | It wouldn't, it wouldn't work in animals. It works in plants.
00:45:22.560 | Okay. And one way you can think about plant genetics is all the
00:45:26.240 | genes are sort of like tools in a toolbox. The more tools you
00:45:29.920 | give the plant, the more it has available to it to
00:45:33.280 | survive in any given second to deal with drought or hot weather
00:45:37.280 | or cold weather, etc. And so every given second, the more
00:45:40.960 | tools or the more genes the plant has that are beneficial,
00:45:43.680 | the more likely it is to keep growing and keep growing. And
00:45:45.680 | that plays out over the lifetime of the plant with bigger,
00:45:48.560 | bigger leaves and, you know, growing taller. But more
00:45:51.440 | importantly, if you look at the bottom, the seeds get bigger.
00:45:53.760 | And in most crops, what we're harvesting is the seed. That's
00:45:56.560 | true in, you know, corn and many other crops. And so seeing
00:45:59.920 | over a 40% increase in seed in this little weed was a really
00:46:03.040 | big deal. But then we did it in potato. And potato was a crazy
00:46:06.560 | result. Potato is the third largest source of calories on
00:46:08.960 | earth. And so we took two potatoes that you see here in
00:46:12.320 | the middle, AB and CD, we applied our boosted technology
00:46:15.840 | to each of them and put them together, and you end up
00:46:18.560 | with this potato ABCD. That's the boosted potato. And as you
00:46:21.920 | can see, these were all planted on the same date. And the
00:46:24.560 | boosted potato is much bigger than all the other potatoes
00:46:28.000 | here, including a market variety that we show on the far right.
00:46:30.720 | That's what's typically grown in the field. Now here's what's
00:46:33.280 | most important when you look under the ground and you
00:46:34.880 | harvest the potatoes. You can see that the AB potato only
00:46:38.480 | had 33 grams, CD had nine grams. So each parent had 33 and nine
00:46:43.840 | grams of potato. But the boosted offspring had 682 grams of
00:46:48.240 | potato, the yield gain was insane. And so you can see this
00:46:51.920 | being obviously hugely beneficial for humanity. You
00:46:56.720 | know, potatoes being the third largest source of calories,
00:46:58.880 | Indian potato farmers are growing one acre of potato in
00:47:03.040 | India, they eat potato two meals a day. In Africa, potato is a
00:47:07.360 | food staple. So around the world, we've had a really tough
00:47:10.080 | time breeding potatoes and improving the yield. With our
00:47:12.720 | system, we've seen incredible yield gains in potato almost
00:47:15.280 | overnight. And the other potatoes, those are normal size
00:47:19.200 | potatoes that you see there. Those are like, you know, table
00:47:22.160 | potatoes. Basically, that looks like a russet potato right
00:47:24.160 | there. That's like a normal size.
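The potato numbers above make the same point in miniature: 33 grams plus 9 grams from the two parents versus 682 grams from the boosted offspring. A quick sketch of that arithmetic, illustrative only and using just the figures quoted in the conversation:

```python
# Quick arithmetic on the potato yields quoted above; illustrative only.
parent_ab_grams = 33   # harvest from parent AB
parent_cd_grams = 9    # harvest from parent CD
boosted_grams = 682    # harvest from the boosted ABCD offspring

combined_parents = parent_ab_grams + parent_cd_grams
print(f"Boosted vs both parents combined: {boosted_grams / combined_parents:.1f}x")
print(f"Boosted vs the better parent:     {boosted_grams / parent_ab_grams:.1f}x")
```

That is roughly a 16x gain over the two parents combined, and about 21x over the better parent.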
00:47:25.520 | It started as like a little creamer potato, basically, and
00:47:30.160 | you blew it up into a russet potato. Yeah, so the genetics on
00:47:34.240 | AB, you can see they're like little purple, tiny little
00:47:37.600 | purple potatoes, the genetics on CD are like these little white,
00:47:40.400 | you know, tiny little ball potatoes. But when you put those
00:47:43.360 | two together with boosted, and you combine all the DNA from a
00:47:46.240 | B and all the DNA from CD, you get this crazy, high yielding
00:47:49.280 | potato, ABCD, which by the way, is higher yielding than the
00:47:52.640 | market variety that's usually grown in the field on the far
00:47:54.880 | right. So why not just grow russet potatoes, then? We are.
00:47:58.720 | And so we're working on doing this with russet. We're working
00:48:00.800 | on doing this with every major potato line. Sorry, the
00:48:04.480 | improvement you'll see is actually yield. So it's not the
00:48:06.640 | size of the potato, it's the number of potatoes that are
00:48:08.320 | being made. And so you'll see more per acre, or something like that?
00:48:12.080 | Exactly. You know, like the projects in the 60s and 70s,
00:48:15.120 | you can tell Friedberg's onto something here. You got David
00:48:19.120 | Sacks to pay attention during it. Yeah, there's gonna be a
00:48:23.120 | decacorn and Sacks is awake. So actually, like, how do I wet
00:48:25.920 | my beak? Because I was interrogating the potato lines. I've never...
00:48:29.120 | What's going on? I think genetics is interesting. But so
00:48:32.160 | have you tried these potatoes? They taste different? Oh, no,
00:48:34.880 | they're awesome. Yeah, they're they're potatoes. And we do a
00:48:37.760 | lot of analysis. Any horns yet or anything like that? No. I
00:48:42.800 | mean, again, one of the other advantages of the system that
00:48:45.760 | we've developed, let me go back here. And I just want to take
00:48:48.560 | two seconds on this. One of the other things this unlocks is
00:48:52.240 | creating actual seed that you can put in the ground in crops
00:48:56.880 | that you can't do that in today. So potatoes, the third
00:48:59.120 | largest source of calories. But the way we grow potatoes, you
00:49:02.000 | guys remember the movie, The Martian, you chop up potatoes,
00:49:03.920 | and you put them back in the ground. Because the seed that
00:49:06.960 | comes out of a potato, which grows on the top in the flower,
00:49:10.240 | every one of those seeds is genetically different. Because
00:49:12.480 | of what I just showed on this chart, right, you get half the
00:49:14.960 | DNA from the mother, half the DNA from the father. So every seed has
00:49:17.520 | different genetics. So there's no potato seed industry today.
00:49:20.800 | And potato is like $100 billion market. With our system, not
00:49:24.960 | only can we make potatoes higher yielding and make them disease
00:49:27.920 | resistant. What we also make is perfect seed. So farmers can now
00:49:32.560 | plant seed in the ground, which saves them about 20% of revenue
00:49:35.760 | takes out all the disease risk, and makes things much more
00:49:38.480 | affordable and easier to manage for farmers. So it creates
00:49:41.120 | entirely new seed industries. So we're going to be applying
00:49:43.680 | this boosted technology that we've discovered across nearly
00:49:46.880 | every major crop worldwide. It'll both increase yield, but
00:49:51.040 | it will also have a massive impact on the ability to
00:49:54.960 | actually deliver seed and help farmers and make food prices
00:49:58.480 | lower. Is it more sustainability? No, it's
00:50:02.400 | actually cheaper. So higher yield, lower cost, do you need
00:50:05.120 | more water, less water, less land, less energy, do you need
00:50:09.680 | more fertilizer? fertilizer usually scales with biomass, but
00:50:14.800 | these sorts of systems should be more efficient. So fertilizer
00:50:17.840 | use per pound produced should go down significantly. As we get
00:50:22.320 | to commercial trials with all this stuff. And we're doing this
00:50:25.200 | across many crops. So there's a lot of work to do in terms of
00:50:27.760 | like, how do you scale up production in the
00:50:29.840 | field? Tell us about the patents, and how important
00:50:35.360 | a role patents play in this, because isn't it like, like one
00:50:38.240 | of Monsanto's big things like they just go and sue everybody
00:50:40.880 | into the ground or whatever, like, I'm gonna answer you one
00:50:43.040 | second, I'm just gonna switch my headset, it just died. Wow, we
00:50:45.680 | went from Sacks's bots to Friedberg's crops. I'm glad we're
00:50:51.120 | doing him second, because all of a sudden, like group chat
00:50:53.840 | doesn't seem very important. Yeah. Wow. He just, he just
00:50:57.520 | solved the whole Ukraine crisis here. You'd be able to grow
00:51:01.040 | wheat in the desert. And in the process, he solved the world food
00:51:04.800 | problem. Yeah, Sacks. What did you do for the last
00:51:07.040 | six months? Yeah, we made our enterprise chat a little better. We
00:51:09.360 | added AI to enterprise chat. We cleaned up your Slack. So
00:51:13.840 | yeah, when you invest... We've invested a ton of money. This
00:51:16.400 | was stealth for five years, we put a ton of money into this
00:51:19.200 | business. So when you invest, like, I mean, north of 50? Yeah,
00:51:25.920 | 50 million, five years, and you don't have a product in market
00:51:28.960 | yet. Wow, that's something. We actually have some product.
00:51:30.880 | Yeah. So I haven't talked about the way we've been making money
00:51:32.960 | in some of the business we've been doing. Okay, let me just
00:51:35.120 | make sure this is like clear. So that last photo you showed
00:51:39.040 | with the different types of potatoes, you had created the
00:51:43.280 | super huge ones. But you're saying that the the yield
00:51:46.880 | benefit here is just you create a much bigger, hardier plant
00:51:49.280 | that's capable of producing many more potatoes. The size of
00:51:52.720 | potatoes doesn't change. You can control for that when you
00:51:55.440 | breed. So the selection of what plants you put together in the
00:51:58.080 | boosted system allows you to decide you want small, medium,
00:52:00.880 | large, that's all part of the design of which plants do you
00:52:04.080 | want to combine? Okay, because your goal is not to turn like a
00:52:06.400 | russet potato into like a watermelon or something like
00:52:08.720 | that. No, the goal is to make more russet potato per acre, so
00:52:12.080 | that we use less water, we use less land, farmers can make more
00:52:15.120 | money, people pay less for food. That's the goal. And so it's
00:52:18.640 | all about yield. It's not about changing the characteristics.
00:52:21.520 | There are some crops where you want to change the
00:52:23.200 | characteristics, like you might want to make bigger corn kernels
00:52:26.480 | and bigger cobs on the corn, which is another thing that
00:52:28.640 | we've done. And that's actually been published in our patent.
00:52:32.160 | And the reason, by the way, I'm talking about all this is some
00:52:34.480 | of our patents started to get published last week. And so when
00:52:37.600 | that came out, the word started to get out. And that's why we
00:52:39.600 | decided to get public with what we've done, because it's now
00:52:42.240 | coming out in the open. You mentioned something briefly
00:52:44.720 | there about where different crops can be planted. You know,
00:52:50.800 | we had these big talks about wheat and corn, they're only
00:52:53.840 | available in very specific parts, you know, north of the
00:52:57.440 | equator, not in the jungles, and obviously not polar or desert
00:53:01.120 | extremes. So if you're successful, what would this do,
00:53:04.400 | on a global basis, for where these crops are grown? Because
00:53:10.480 | that's our whole discussion about Ukraine. Totally, the wheat belt
00:53:14.000 | of Europe, the cradle of wheat. It's a great question. I'm so
00:53:18.160 | glad you asked it because that's one of the key drivers for the
00:53:20.560 | business is that we can now make crops adapted to all sorts of
00:53:24.640 | new environments that you otherwise can't grow food. Today,
00:53:27.680 | there's close to somewhere between 800 million and a
00:53:29.760 | billion people that are malnourished, that means they are
00:53:31.600 | living on less than 1200 calories a day for more than a
00:53:34.800 | year. But on average, we're producing 3500 calories per
00:53:39.200 | person worldwide in our ag systems. The problem is we just
00:53:42.320 | can't grow crops where we need them. And so by being able to do
00:53:46.240 | this sort of system where we can take crops that are very drought
00:53:49.120 | resistant, or can grow in sandy soil or very hot weather, and
00:53:53.040 | adapt cooler climate crops to those regions, but through the
00:53:55.840 | system, we can actually move significantly where things are
00:53:59.200 | grown, and improve food access in regions of...
00:54:03.440 | Friedberg when you look at a potato, how do you figure out
00:54:05.760 | what part of their DNA is the drought resistant part? Yeah.
00:54:10.400 | And then how do you make sure that that's turned on? So even
00:54:13.200 | if you inherit that chromosome, is there some potential
00:54:16.240 | interaction with the... Generally, if we can... So these are what are
00:54:18.880 | called markers, genetic markers. And so there are known
00:54:22.080 | markers associated with known phenotypes, a phenotype is a
00:54:25.520 | physical trait of a plant. And so we know lots of markers for
00:54:29.280 | every crop that we grow markers for disease resistance, drought
00:54:32.560 | resistance, markers for big plants, short plants, etc. And so
00:54:37.520 | what we do is we look at the genetics of different plants
00:54:40.400 | that we might want to combine into the boosted system. And we
00:54:42.720 | say these ones have these markers, these ones have these
00:54:44.480 | markers, let's put them together. And then that that'll
00:54:47.520 | drive the results. One of the other interesting things we're
00:54:50.160 | seeing, which I didn't get too much into in the slides. It's
00:54:54.800 | not just about combining traits. But it turns out, when you add
00:54:59.360 | more genes together, biology figures out a way to create gene
00:55:03.920 | networks. These are all these genes that interact with each
00:55:06.320 | other in ways that are not super well understood. But it makes
00:55:10.560 | the organism healthier and bigger and live longer. This is
00:55:14.160 | like why mutts are healthier and live
00:55:16.400 | longer than purebred dogs, because they have more genetic
00:55:19.360 | diversity. So there's a lot of work now in what's called
00:55:23.280 | quantitative genomics, where you actually look at the
00:55:25.520 | statistics across all the genes, you use a model, and the model
00:55:29.360 | predicts which two crosses you want to make out of hundreds of
00:55:33.360 | thousands or millions of potential crosses. The AI predicts,
00:55:36.880 | here are the two best ones to cross, because you'll get this
00:55:39.680 | growth or this healthiness.
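The cross-selection step Friedberg describes (known markers for known phenotypes, then a model that picks the best pair out of a huge space of possible crosses) can be sketched very roughly in Python. This is a toy stand-in, not Ohalo's model: the plant lines, markers, and the score (just counting the beneficial markers a pair covers between them) are all hypothetical, whereas a real quantitative-genomics model would predict offspring performance statistically.

```python
# Toy sketch of marker-based cross selection; plant lines and markers are made up.
from itertools import combinations

# Hypothetical beneficial markers carried by each candidate parent line.
parents = {
    "line_A": {"drought_tolerance", "big_seed"},
    "line_B": {"disease_resistance", "heat_tolerance"},
    "line_C": {"drought_tolerance", "disease_resistance"},
    "line_D": {"big_seed"},
}

def pair_score(a: str, b: str) -> int:
    # In boosted breeding both parents pass on all of their DNA, so the
    # offspring carries the union of the two parents' markers.
    return len(parents[a] | parents[b])

best_pair = max(combinations(parents, 2), key=lambda pair: pair_score(*pair))
print("Best cross:", best_pair, "covering", pair_score(*best_pair), "markers")
```

A real system would score hundreds of thousands or millions of candidate pairs this way, with a statistical model of the offspring rather than a simple marker count.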
00:55:42.240 | So how do you want to make money, Friedberg? Are you going to sell the seeds?
00:55:44.960 | Are you going to become the direct farmer? Are you going to
00:55:47.680 | become food as a service? Like, how do you make the most money
00:55:51.840 | from this? We're not going to farm. Farmers are our customers.
00:55:55.440 | And so there are different ways to partner with people in the
00:55:58.640 | industry who already have seed businesses or already have
00:56:01.920 | genetics and help them improve the quality of their business.
00:56:04.800 | And then there's other industries like in potato, where
00:56:08.000 | we're building our own business of making potato seed, for
00:56:10.400 | example. So every crop and every region is actually quite
00:56:13.840 | different. So it becomes a pretty complicated business to
00:56:16.320 | scale. We're in the earlier days, we're already revenue
00:56:19.840 | generating, I would like a sweeter blueberry. No comment.
00:56:24.160 | No comment. Yeah, I get tilted by the quality of the Driscoll
00:56:27.840 | blueberries. Let me tell you something about the Driscoll
00:56:30.160 | blueberries. Also the Driscoll's... I've had only one batch of
00:56:33.760 | a Driscoll strawberry that was just off the charts. And every
00:56:37.280 | 19,847 other batches I bought have been... Yeah, now you want the
00:56:42.080 | European small ones or the Japanese ones from Hokkaido
00:56:45.920 | because they're rich and sweet. And they're not these like
00:56:48.720 | monstrosities of giant flavorless strawberries. What's that about?
00:56:52.400 | Could you do a seedless mango? Yes, no cut it.
00:56:56.800 | Oh my god, how great would that be?
00:56:59.680 | Work for a mango is like the worst ratio.
00:57:04.240 | Yeah. Well, somehow we made it about us.
00:57:07.520 | Yeah, no, no, look, I think that's it is all about you guys.
00:57:10.080 | Tell us about the blueberries. Sorry. Well, no, every year.
00:57:12.640 | Driscoll's puts out a special labeled package called sweetest
00:57:17.520 | batch. And they just had the sweetest batch of strawberry and
00:57:21.280 | blueberries. I don't know if they're still in the stores, but
00:57:22.880 | they only last for like a week or two. And that's the best
00:57:25.360 | genetics only grown on a small number of acres. Really
00:57:28.880 | incredible. I'm going as soon as this is done to see if they have
00:57:32.080 | it. So I got it a few weeks ago. It's quite delicious. Anyway,
00:57:35.360 | we know, let's just say we know the berry market very well. My
00:57:38.240 | co-founder and CTO, Judd Ward, whose brilliant idea
00:57:42.240 | boosted breeding was many years ago, who I met because they had
00:57:45.680 | a New Yorker article on Judd, I cold called him and said, Hey,
00:57:48.560 | will you come in and give us a tech talk, we started talking
00:57:51.360 | and Judd came up with this idea for boosted breeding. And so we
00:57:53.920 | started the business with Judd and Judd ran molecular breeding
00:57:57.040 | at Driscoll's. So we have a lot of Driscoll's people that work at
00:57:59.520 | Ohalo, we know the market really well.
00:58:01.040 | Can you go back to the patent stuff? Like, are you gonna sue
00:58:04.320 | people? So we spent 50 million bucks, you know, plus,
00:58:08.640 | on this business to date. So we have filed for IP protections
00:58:12.960 | so that people can't just rip us off. But I would say I think
00:58:16.160 | that the real advantage for the business arises from what we
00:58:20.000 | call trade secrets, which is not just about taking patents and
00:58:23.440 | going out and suing people. That's not a great business. The
00:58:26.080 | business is how do you build a moat? And then how do you extend
00:58:28.720 | that moat? The great thing about plant breeding and genetics is
00:58:32.080 | that once you make an amazing variety, the next year, the
00:58:35.040 | variety gets better. And the next year, the variety gets
00:58:37.120 | better. And so it's hard for anyone to catch up. That's why
00:58:39.760 | seed companies generally get monopolies in the markets,
00:58:43.120 | because farmers will keep buying that seed every year, provided
00:58:46.560 | it delivers the best genetics. And so our business model is
00:58:49.840 | really predicated on how do we build advantages and moats and
00:58:52.240 | then keep extending them rather than try to leverage IP. So I'm
00:58:55.840 | a big fan of like building business model advantages.
00:58:58.160 | This is going to be incredible, Sacks. If you think about, you
00:59:00.880 | know, geopolitically, what's going on in Somalia, Sudan,
00:59:03.840 | Yemen, Afghanistan, those places have 10s of millions of people,
00:59:08.720 | I think hundreds of millions collectively, who are at risk
00:59:11.280 | for starvation, if you could actually make crops that could
00:59:13.200 | be farmed there, Friedberg, you would change humanity. And then
00:59:17.120 | all these people buying up farmland in America, that could
00:59:20.800 | devalue that farmland, if that wasn't as limited of a resource
00:59:24.720 | you have Friedberg, like,
00:59:25.760 | no, I think. So first of all, like farmland in America is
00:59:28.560 | mostly family owned, that's 60% rented, actually. So a lot of
00:59:32.960 | families own it, and then they rent it out because they stopped
00:59:35.200 | farming it. But the great thing that we've seen in agriculture
00:59:40.000 | historically is that the more calories we produce, the more
00:59:43.920 | food we produce, the more there seems to be a market. It's like
00:59:46.320 | any other economic,
00:59:47.520 | what about wheat and rice?
00:59:49.120 | Yeah. So those are calorie sources one and two. And there's
00:59:54.080 | certainly opportunity for us to apply our boosted systems there.
00:59:57.920 | The big breakthrough with potato is we can make potato seed using
01:00:00.720 | our boosted system in addition to making better potatoes.
01:00:03.120 | McDonald's is the largest buyer of potatoes. Yeah.
01:00:05.840 | So in the US 60% of the potatoes go to French fries and potato
01:00:09.520 | chips. McDonald's buys most of the fries. PepsiCo under Frito
01:00:13.360 | Lay buys most of the potato chip potatoes. 40% are table potatoes.
01:00:17.200 | In India, 95% of the potatoes are table potatoes, they're eaten
01:00:22.240 | at home. And the Indian potato markets three to four times as
01:00:25.360 | big as the US potato market. In Brazil, it's 90% table potato. So
01:00:30.080 | all around the world potatoes different. The US is, you know,
01:00:33.200 | unusually large consumers of French fries and potato chips.
01:00:36.560 | I speak on behalf of Jay Cowan, I said, we will gladly invest a
01:00:41.760 | million at a 10 cap in both of your businesses.
01:00:44.400 | Absolutely. Yes, we will break our way into this.
01:00:47.440 | Jay Cowan and I will do the deal. We'll wire the money. We'll wire
01:00:50.480 | the money a little million to each of you guys at a 10 cap.
01:00:52.880 | Thank you.
01:00:53.760 | Absolutely. You're in. It may not be a 10 cap, though. But yes.
01:00:56.880 | Breaking news, Chamath and Jay Cowan have secured the bag.
01:01:00.560 | It's breaking news. Chamath and Jay Cowan have secured the bag
01:01:04.000 | from the besties actually doing work.
01:01:06.080 | Yeah. Well, I appreciate you guys letting me talk about it today.
01:01:09.200 | I'm excited to share it.
01:01:10.400 | The both of you. I love it.
01:01:11.360 | It's been, yeah, building stuff is hard. There's always risk.
01:01:15.680 | It's a lot of work and a lot of setbacks. But man, when you get
01:01:19.840 | stuff working, it's great.
01:01:21.360 | We're each doing the things we do best.
01:01:23.600 | Friedberg is solving the world's hunger problem. And I'm making,
01:01:27.200 | I'm cleaning up your slack, making your enterprise chat a
01:01:32.080 | little better.
01:01:32.880 | All progress counts. All right. Stanley Druckenmiller has got a
01:01:38.800 | new boyfriend. Druckenmiller's got a boyfriend and his name is
01:01:43.040 | Javier. And they've eloped to Argentina. Druckenmiller
01:01:47.200 | professed his love, Tom Cruise on Oprah's couch style, in a CNBC
01:01:51.680 | interview this week, the only free market quote leader in the
01:01:55.440 | world right now, bizarrely is in Argentina of all places. He cut
01:01:59.200 | social security at 35%. If he came to office, they've gone
01:02:02.400 | from a primary deficit of like four or 5% to a 3% surplus.
01:02:06.080 | They've taken a massive hit in GDP, basically a depression for
01:02:09.520 | a quarter. And his approval rating has not gone down.
01:02:12.640 | Druckenmiller has explained how he invested in Argentina after
01:02:17.520 | seeing Milei's speech at Davos, which we covered. Here's a 30
01:02:22.080 | second clip, play the clip, Nick.
01:02:23.680 | By the way, do you want to hear how I invest in Argentina? It's
01:02:26.320 | a funny story. I wasn't at Davos, but I saw the speech in
01:02:30.400 | Davos and it was about one o'clock in the afternoon in my
01:02:33.440 | office. I dialed up perplexity and I said, give me the five
01:02:37.280 | most liquid ADRs in Argentina. It gave me enough of a
01:02:42.000 | description that I followed the old Soros rule, invest and then
01:02:46.160 | investigate. I bought all of them. We did some work on them.
01:02:49.520 | I increased my positions. So far, it's been great, but we'll
01:02:53.840 | Yeah, that's quite interesting. Quick note, you hear
01:02:57.280 | Druckenmiller mention ADRs. For those of you who don't know, and
01:03:00.800 | I was one of them, they stand for American Depositary Receipts,
01:03:04.240 | basically a global stock offered on a US exchange to simplify
01:03:07.360 | things for investors. Yeah, I mean, he didn't sign a prenup
01:03:13.360 | here. He just went all in and he bought the stock, Chamath, and
01:03:16.160 | then he's going to figure it out later. Tell us your thoughts on
01:03:19.120 | this love affair, this bromance.
01:03:21.760 | There's a great clip of Milei. He goes on this talk show in
01:03:24.720 | Argentina and the talk show host, she's just so excited and
01:03:29.520 | greets him and then they start making out. Have you guys seen
01:03:33.440 | this?
01:03:33.680 | What?
01:03:34.160 | Full on French kissing each other. It's hilarious.
01:03:40.640 | Yeah, I mean, Soros has been very famous for this invest and
01:03:43.760 | investigate thing. It's like a smart strategy for very, very
01:03:49.200 | liquid public market investors that have the curiosity that he
01:03:53.200 | does. I mean, I don't have much of a reaction to that. I think
01:03:55.360 | that the thing with Argentina that's worth taking away is when
01:03:59.840 | you've spent decades casting about and misallocating capital
01:04:05.200 | and running your economy into the ground, the formula for
01:04:08.800 | fixing it is exactly the same. You cut entitlements, and you
01:04:15.040 | reinvigorate the economy. And so the thing we need to take away
01:04:19.360 | is if we don't get our act together, that's probably what we're
01:04:21.200 | gonna have to do.
01:04:22.080 | Sacks, the influence of Milei on American politics. Will there
01:04:27.600 | be any? It seems like he has paralleled what Elon did at
01:04:32.720 | Twitter, and Zuck did at Facebook. Do you think that
01:04:37.200 | this experiment he's doing down there of just cutting staff,
01:04:40.880 | cutting departments will ever make its way into American
01:04:45.520 | politics?
01:04:46.080 | Probably not. I mean, not until we're forced to. But what Milei
01:04:50.800 | did, he comes in, and they've got a huge budget deficit, and
01:04:53.680 | they've got runaway inflation, and they're debasing their
01:04:55.920 | currency. And just practically overnight, he just slashes
01:04:59.280 | government spending to the point where he has a government
01:05:01.200 | surplus. And then as soon as he gets credibility with the
01:05:04.560 | markets, that allows him to reduce interest rates, inflation
01:05:07.200 | goes away, and people start investing in the country.
01:05:10.000 | Magic.
01:05:10.500 | It's magic.
01:05:11.600 | So there is a path.
01:05:12.720 | It's obvious. Listen, I mean, you can't run deficits forever.
01:05:18.640 | You can't accumulate debt forever. It's just like a
01:05:20.720 | household. If your spending exceeds your income, eventually
01:05:26.320 | you got to pay it back or you go broke. And the only reason we
01:05:29.440 | haven't gone broke or experienced hyperinflation is
01:05:31.600 | because we're the world's reserve currency. So there's
01:05:33.840 | just a lot of room for debasement. And there's not a
01:05:37.600 | ready alternative yet. I mean, everyone's trying to figure out
01:05:39.600 | what the alternative will be. So we've been able to accumulate
01:05:42.720 | more and more debt, but it's reaching a point where it's
01:05:45.200 | unsustainable. And what we've already seen is that the Fed's
01:05:48.080 | had to jack up interest rates from very low, practically
01:05:51.520 | nothing, to 5.5%. And that has a real cost on people's
01:05:55.760 | well-being. Because now, your cost of getting a mortgage goes
01:06:00.320 | way up. I mean, mortgage rates are over, what, 7.5% now?
01:06:03.600 | Yeah, 6%, 7%, depending on your net worth and your credit
01:06:07.760 | rating.
01:06:08.480 | Right. And so it's much harder to get a mortgage now. It's
01:06:11.840 | harder to make a car payment if you need to borrow to buy a car.
01:06:14.640 | And if you have personal debt, the interest rate is going to
01:06:17.040 | be higher. The inflation rate actually doesn't take into
01:06:19.520 | account any of those things. Remember, Larry Summers did that
01:06:22.480 | study where he said the real inflation rate would be 18% or
01:06:26.080 | would have peaked at 18% if you include a cost of borrowing.
01:06:29.920 | That's why people don't feel as well off as the unemployment
01:06:33.520 | rate would normally suggest. So people are hit really hard when
01:06:38.320 | interest rates go up in terms of big purchases they need to
01:06:42.400 | make with debt. And then, of course, it's really bad for the
01:06:45.360 | investment environment because when interest rates are really
01:06:49.200 | high, that creates a higher hurdle rate and people don't
01:06:52.960 | want to invest in risk assets. And so eventually, the pace of
01:06:57.200 | innovation will go down. And Druckenmiller made this point in
01:07:00.240 | his next set of comments. He said that treasury is still
01:07:03.520 | acting like we're in a depression. It's interesting
01:07:06.320 | because I've studied the depression. You had a private
01:07:08.000 | sector crippled with debt, basically with no new ideas. So
01:07:12.000 | interventionist policies were called for and were effective.
01:07:14.880 | He said the private sector could not be more different today
01:07:17.360 | than it was in the Great Depression. The balance sheets
01:07:19.760 | are fine. They're healthy. And have you ever seen more
01:07:22.320 | innovation ideas that the private sector could take
01:07:24.640 | advantage of, like blockchain, like AI? He says all the
01:07:28.000 | government needs to do is get out of the way and let them
01:07:29.760 | innovate. Instead, they spend and spend and spend. And my new
01:07:32.880 | fear now is that spending and the resulting interest rates on
01:07:36.720 | the debt that's been created are going to crowd out some of
01:07:40.560 | the innovation that otherwise would have taken place. I
01:07:42.640 | completely endorse Druckenmiller's view of
01:07:44.160 | Bidenomics. And actually, I mean, this is what I said way back in
01:07:47.760 | 2021.
01:07:49.600 | Victory lap. Here we go. David Sacks victory lap. We need a
01:07:53.920 | little graphic for that.
01:07:55.040 | Druckenmiller used the word Bidenomics, and said, I give these
01:07:57.520 | guys an F because they're still printing money and spending
01:08:00.800 | money like we're in a depression, even though we're in
01:08:02.880 | a rip-roaring economy. And when they started doing this back in
01:08:06.000 | 2021, I tweeted, "Bidenomics equals pumping trillions of
01:08:09.600 | dollars of stimulus into a rip-roaring economy." I'm not
01:08:11.440 | going to pretend like I know what's going to happen next, but
01:08:13.440 | we've never tried this before. What happened next was a lot of
01:08:15.760 | inflation and that jacked up interest rates. According to
01:08:18.720 | even Keynesian economics, the reason why you have deficit
01:08:22.080 | spending is because you're in a recession or depression. And so
01:08:24.640 | you use the government to stimulate and balance things
01:08:27.600 | out. You don't do deficit spending when the economy is
01:08:31.040 | already doing well. So this spending, there's no reason for
01:08:34.720 | It's like showing up to like a party that's going crazy and
01:08:37.440 | being like, putting gasoline on the fire.
01:08:39.760 | Yeah, I mean, more importantly, it should limit the approval or
01:08:44.560 | action of certain programs that you might otherwise want to do
01:08:49.040 | in a normal environment. But in an inflationary environment, you
01:08:53.200 | don't have the flexibility to do that. Student loan forgiveness
01:08:56.880 | is a really good example. Is now the time?
01:08:59.280 | Of course not.
01:09:00.240 | To do student loan forgiveness, or do we wait for inflation to
01:09:03.440 | temper a bit? Is now the time? You know, so there's just a lot
01:09:06.800 | of these examples that actually the opposite should be true.
01:09:09.040 | Yeah, but none of all of those things get you votes.
01:09:12.720 | Before we move on from this, look, what we have coming out of
01:09:14.800 | Washington here is a contradictory and therefore
01:09:17.120 | self-defeating policy. You've got the Fed jacking up rates to
01:09:20.480 | control inflation, you move across town, and you've got
01:09:23.440 | Capitol Hill on the White House, spending like there's no
01:09:26.000 | tomorrow, which is inflationary, right? Why would you do both
01:09:28.640 | those things? Choose what your policy is going to be like
01:09:30.800 | driving with your foot on the brake and the gas at the same
01:09:33.280 | time. It's not a great idea for the car.
01:09:35.200 | Let me just make one comment, J Cal, before we move on about
01:09:37.440 | the Druckenmiller investment statement, of course, I just
01:09:40.320 | wanted to say, like, I think what it highlights about
01:09:42.720 | Druckenmiller, and call it a rift in investing philosophy or
01:09:47.040 | skill, is the difference between precision and accuracy.
01:09:50.240 | What I mean by that is precision really references that you do a
01:09:54.560 | lot of detailed analysis to try and make sure you understand
01:09:58.320 | every specific thing that is going right or could go wrong.
01:10:01.840 | But the problem, and so that means you, for example, might do
01:10:04.880 | a ton of diligence on a company and make sure you understand
01:10:07.360 | every dollar, every point of margin, all the specifics of the
01:10:10.960 | maturation of that business and where they are in their cycle.
01:10:14.160 | But you could be very precise, but be very inaccurate. For
01:10:18.400 | example, if you miss an entire trend, someone could invest in
01:10:21.840 | Macy's back when Amazon was taking off and have done a lot
01:10:25.280 | of precise analysis on Macy's margin structure and performance
01:10:29.120 | and said, this is a great business. But they missed the
01:10:31.680 | bigger trend, which is that e commerce was going to sweep away
01:10:34.800 | Macy's and consumers were simply... That's not possible in the
01:10:38.560 | analysis that they were doing. To be honest, Friedberg, nobody
01:10:41.520 | can make that stupid of a trade to say Macy's versus Amazon over
01:10:45.280 | the next 10 years.
01:10:46.480 | Oh, yeah. And so like, and Jake, I want to show that.
01:10:49.840 | Do not poke the tiger. Let's not get into it. Other podcasters,
01:10:55.120 | the worst spread trade in history.
01:10:56.960 | Yeah, let me just finish the statement. But the other one is
01:10:59.920 | being accurate and accurate means you get the right bet the
01:11:03.520 | right sentiment, the right trend. The problem with being
01:11:06.880 | accurate, you could have said, in the year 2000, hey, the
01:11:10.240 | internet is going to take off. And you could have put a bunch
01:11:13.280 | of money in but the problem was you were right. You just had to
01:11:16.960 | have the necessary patience. And so accuracy generally yields
01:11:22.240 | better returns, but it requires more patience because you can't
01:11:25.840 | necessarily time how long it will take for you to be right.
01:11:29.520 | So a guy like Druckenmiller is making an accurate bet he bets
01:11:33.600 | correctly on the trend on where things are headed. He doesn't
01:11:36.800 | necessarily need to be precise. But he has the capital and his
01:11:40.320 | capital structure that allows him to be patient to make sure
01:11:42.640 | that he eventually gets the return.
01:11:44.080 | And to build on your thoughts, having watched this movie a
01:11:46.320 | couple of times, and you know, I overthought the Twitter
01:11:49.200 | investment as but one example, I had the opportunity to invest
01:11:52.240 | in Twitter when it was like a single digit millions company.
01:11:55.280 | And I just thought, you know what, this thing is only like
01:11:59.040 | the headline. And I told them, like, it's the headline. It's
01:12:02.160 | not like the entire blog post. It'll be a cacophony of idiots, this
01:12:04.960 | thing is going to be chaos. And I was right, but I was wrong,
01:12:07.440 | right? Great bet, but wrong analysis, right. And so you can
01:12:12.080 | add precision to other aspects, like when you sell your shares,
01:12:15.840 | or when you double down, but you have to get the trend right,
01:12:18.640 | which is Evan Williams, great entrepreneur, Jack, great
01:12:20.720 | entrepreneur, Twitter taking off like a weed, just make the bet.
01:12:24.000 | Right. And the problem is you knew too much about journalism,
01:12:27.920 | you knew too much about the space they were trying to
01:12:29.840 | disrupt. And that can be a mistake. Correct. We did.
01:12:33.040 | PayPal, none of us knew anything about payment. So that's one of
01:12:35.120 | the reasons we were successful. All the payments experts told us
01:12:37.600 | it couldn't be done. Right? So that happens a lot.
01:12:40.480 | I didn't even know what a Facebook was when I
01:12:44.000 | joined Facebook. It's an American college phenomenon. No,
01:12:46.480 | serious. You don't have that in Canada.
01:12:48.480 | But you knew Zuck, and you saw some growth charts, and you saw
01:12:51.680 | some precision in his ability to build product. And that's the
01:12:55.200 | way to go.
01:12:55.520 | The great thing about network effect businesses is there's a
01:12:59.200 | trend line that sustains because it builds if it's an appropriate
01:13:02.240 | network effect. So you can be accurate about buying into the
01:13:05.760 | right network effect business. You don't need to use all of
01:13:08.960 | this diligence to be perfectly sound around the maturation of
01:13:14.480 | the revenue and the margin structure and all that stuff as
01:13:16.800 | long as the trend line is right. And you're willing to be
01:13:18.720 | patient to hold your investment. I think Druckenmiller's point
01:13:21.520 | is incredible. He took a look, he very quickly made a macro
01:13:24.400 | assessment. From a macro perspective, what Milei is
01:13:27.120 | doing is significantly different than what we're seeing in any
01:13:30.160 | other emerging market, let alone mature market with respect to
01:13:33.280 | fiscal austerity and appropriateness in this sort of
01:13:36.320 | global inflationary environment. And he said, you
01:13:39.040 | know what, I don't see any other leader doing this. This is a no
01:13:41.840 | brainer bet. Let me make the bet. And as long as he's willing
01:13:45.120 | to hold this thing for long enough, eventually, the markets
01:13:47.440 | will get there and call it a spread trade against anything.
01:13:51.120 | He'll be proven right.
01:13:52.240 | Well, speaking of bets, J Cal, you
01:13:56.480 | told me this week that you just made your largest investment
01:13:58.960 | ever. Tell us about that.
01:14:00.240 | Yeah, so I've gotten very lucky now, because a lot of my
01:14:03.360 | founders from the first couple of cohorts of investing I did
01:14:08.000 | when I was a Sequoia scout have come back and created second
01:14:10.480 | and third companies. And so, you know, that happened with TK
01:14:13.920 | Uber and then Cloud Kitchens; that happened with Rahul from
01:14:17.040 | Rapportive and then Superhuman. And then it happened recently, just
01:14:21.280 | in the past year, my friend Jonathan, who's the co founder
01:14:23.920 | of thumbtack, asked me to come to dinner, he said, Hey, you
01:14:26.720 | know, you were the first investor in thumbtack, will you
01:14:29.040 | be the first investor in our next company, Athena? And I said,
01:14:31.280 | Sure, what do you do? And he explained it to me. And we put a
01:14:33.760 | seven figure bet in which is rare for us as a seed fund,
01:14:36.800 | right? Normally, our bet sizes are 100k to 50. You know, it's a
01:14:40.400 | $50 million fund.
01:14:41.360 | Why did you do it?
01:14:42.080 | Yeah, it's very simple. It's the fastest growing company I've
01:14:45.120 | ever seen. And I'm including Uber in that it has been growing
01:14:49.680 | at, you know, a rate that I'll just say is faster than Uber and
01:14:54.800 | Robinhood went when we were investing, in the tens of millions
01:14:57.280 | of dollars. It's a very simple concept. When thumbtack was
01:15:02.080 | building their marketplace, they used researchers in places like
01:15:07.040 | Manila, etc, in the Philippines, knowledge workers, and what they
01:15:10.000 | realized was, the point 1% of those knowledge workers were as
01:15:14.640 | good or better than say, Americans at doing certain jobs.
01:15:17.920 | And so they've created this virtual EA service, you can go
01:15:21.680 | see it at Athena, wow.com. And we now have two of them inside
01:15:26.640 | of our company. It turns out Americans don't want to do the
01:15:29.200 | operations role. So it's kind of like AWS, you just give them
01:15:32.720 | $36,000 a year, they give you essentially an operations or an
01:15:36.320 | EA. And they have ones that are kind of chief-of-staff-ish. And
01:15:41.120 | this company is growing like a weed. So I am working with them
01:15:45.360 | on the product design as well. So imagine having, you know, two
01:15:49.360 | or three of these incredibly hardworking people who are
01:15:53.040 | trained with MBA class level curriculum, they spend months
01:15:59.760 | training these people up, they pay them two or three times what
01:16:02.880 | they would make at any other company. And then they pair them
01:16:05.280 | with executives here. And it's kind of been an underground
01:16:08.000 | secret in Silicon Valley, because it's only by invitation
01:16:11.920 | right now, because they can only train so many people. But if
01:16:14.720 | you've tried to hire an executive assistant, I don't
01:16:17.040 | know if anybody's tried to do that recently, you hooked me up.
01:16:19.760 | So I will be guinea pigging this service. Yes. Soon, and I
01:16:23.280 | have two of them. And so it is just the greatest that you can
01:16:27.120 | have an operations person powered by AI tools as well.
01:16:30.560 | Yeah, so that's the kind of secret sauce here is they're
01:16:34.000 | training them, and they watch you work, and then they will
01:16:37.760 | learn how you do your job. And then how quickly you can
01:16:40.480 | delegate and get stuff off your plate is the name of the game.
01:16:43.280 | So we have an investment team with researchers and analysts
01:16:45.760 | in it, we have a due diligence team. And then you have like
01:16:48.400 | executive functions in our fund. They have now started
01:16:52.320 | shadowing, you know, you know, highly paid Americans in an
01:16:56.960 | investment firm ours, and then train them up. And now our due
01:17:00.720 | diligence, our first level screening, you know, and our
01:17:03.840 | tracking of companies is being done by these assistants, for
01:17:07.440 | what I'll say is a third to a fourth of the price I was paying
01:17:10.400 | previously. So what that does in an organization is, we're just
01:17:13.840 | delegating away and then moving our investment team to doing
01:17:17.200 | in person meetings, and doing higher level stuff. And so
01:17:20.800 | you're 8090. So at 8090, we have this funny thing where we've
01:17:25.040 | made it a verb, whenever you see somebody doing high quality work
01:17:29.120 | at a quarter to a 10th of the cost, we say, Oh, you just
01:17:32.080 | 8090'd it. Correct. So you're, you're 8090'ing the investment
01:17:35.680 | team, I'm 8090'ing the investment team. And you know what, it was
01:17:38.000 | scary as hell for them, because they're like, Am I gonna lose my
01:17:39.760 | job? It's like, No, you now get to, instead of doing a check-in
01:17:42.960 | call once a month, you can do a check-in call every other week
01:17:46.000 | or every week, or instead of doing 15 first-round interviews
01:17:49.840 | a week, you can do 25. Because you have this assistant, by the
01:17:53.040 | way, doing all the repetitive work,
01:17:55.280 | the way that companies will work in five and 10 years, I don't
01:17:58.400 | think guys, any of us are going to recognize what it's going to
01:18:00.800 | look like. So this is where I go. I mean, like watching Sacks's
01:18:04.560 | demo earlier, how much progress, and how seamlessly that product
01:18:09.120 | works with the features it has enabled by the underlying
01:18:12.400 | models. You just get to thinking how all of these vertical
01:18:16.480 | software applications become completely personalized and
01:18:21.920 | quickly rebuilt around AI. You know, it's, it's so obvious,
01:18:26.560 | imagine how long it would have taken John to write a letter to
01:18:29.520 | Lina Khan. Like, if we said, John, invite Lina Khan, but be
01:18:32.560 | sure to reference all the nice things we said about her on
01:18:35.120 | episodes of the pod. It'd be 10 hours of work. You gotta go find
01:18:38.320 | the episodes. Yeah, listen to him to figure out what the best
01:18:41.040 | quotes are. And you got it done in five seconds. It's
01:18:43.840 | incredible. Totally. And this is building that same sort of
01:18:46.880 | capability into a very specific vertical application that's
01:18:50.080 | specific to some business function. And you can probably
01:18:53.920 | spend a couple minutes or an hour building that function. And
01:18:56.800 | then it saves you hours a day in perpetuity. Yeah, you know, I
01:19:00.640 | think I think that's why these tools companies or the tools
01:19:03.280 | products that Google, Microsoft, Amazon and a few others are
01:19:09.600 | building are actually incredible businesses, because
01:19:12.480 | so many enterprises and so many vertical application builders
01:19:16.080 | are going to be able to leverage them to go write their entire
01:19:18.640 | business functions. Myself and my co-founders at 8090, we
01:19:22.240 | get this stream of emails of companies that are like, or
01:19:26.000 | people that are like, we have this product idea, or we have
01:19:29.120 | this small product. One of the emails I got, this is crazy, was
01:19:33.680 | from a guy that's like, Oh, we've 8090'd Photoshop. So like,
01:19:36.720 | we have like a much, much cheaper version of Photoshop.
01:19:38.800 | And the guy was doing like a few million bucks of ARR and
01:19:42.080 | growing really nicely. But then it turned out that somebody saw
01:19:45.520 | that and then 8090'd it. So then there's that thing. And so
01:19:51.200 | to your point, Friedberg, none of these big companies stand a
01:19:54.800 | chance. Yeah, it's everything. Not because
01:20:00.080 | the products aren't good. But like, J Cal is going to go
01:20:02.960 | off and experiment with this, Sacks is going to go off and build
01:20:05.200 | a product, you know, as every time that you're at a boundary
01:20:07.920 | condition, we're all going to explore, well, maybe we could do
01:20:11.040 | this with AI, maybe we shouldn't hire a person, not because we're
01:20:14.960 | trying to be mean about it, but because it's the normal, natural
01:20:18.320 | thing to do. And the opex of companies is just going to go
01:20:21.920 | down, which means the size of companies are going to shrink,
01:20:24.320 | which means the amount of money you need is going to go down.
01:20:26.400 | And that's just going to create the ability for these companies
01:20:29.760 | to sell those products cheaper. So it's, it's a
01:20:33.600 | massive deflationary tail. We had the same thing happen with
01:20:36.640 | compute. And now it's happening inside of organizations. I wrote
01:20:40.400 | a blog post about this on my Substack called ADD. This is the
01:20:43.120 | framework I came up with. I told my entire team, look at what you
01:20:46.800 | got done every week. And I want you to ask three questions. How
01:20:49.280 | can I automate this? How can I deprecate this? How can I
01:20:53.280 | delegate it? And you know, the automate part is AI and what
01:20:56.480 | you're doing, David, the delegate part is Athena wow.com.
01:21:00.080 | And then the deprecate is, hey, just be thoughtful, what are you
01:21:03.280 | doing that you don't need to do? And that's 8090'ing something.
01:21:06.400 | Like, there are things inside these products that you don't
01:21:08.880 | actually need. What's the core functionality of the product,
01:21:12.320 | you know, make it as affordable as possible. And then what's
01:21:15.520 | going to happen for people who think this is bad for society,
01:21:18.000 | you've got it completely wrong. We're going to have more people
01:21:21.200 | be able to create more products and solve more problems. The
01:21:25.280 | unemployment rate is going to stay very low. We're just going
01:21:27.920 | to have more companies. So the idea like, there was somebody
01:21:31.600 | who was working on very small, like, software. I want to get
01:21:34.800 | pitched on very niche ideas: I want to create something where
01:21:37.440 | people can find people to play pickleball with, right, like a
01:21:40.080 | pickleball marketplace. Now, that wouldn't typically
01:21:43.520 | work, because you would need $5 million a year to build that
01:21:46.480 | product. But if you can build it for $500,000 a year, well, now
01:21:49.840 | you've only got to clear that number to be profitable. So a
01:21:52.240 | lot more smaller businesses, a lot more independence, all
01:21:55.280 | these little niche ideas will be able to be built. And a VC who
01:21:58.720 | says, I'm not giving you $5 million to build that app will
01:22:00.960 | be like, but I will give you 500k. And that's what I'm seeing
01:22:04.560 | on the ground in startups, the same startups that had a request
01:22:08.000 | of $3 million in funding five years ago are now requesting
01:22:11.280 | 500 to a million. It's deflationary all the way down to
01:22:16.240 | you guys. Incredible. Did you see the Google thing? Did you
01:22:18.720 | guys see the Google? Oh, yeah, the Gemini stuff. ChatGPT Omni
01:22:22.400 | launched at the same time, or perhaps strategically right
01:22:26.320 | before, Google dropped its latest AI announcements at I/O. The
01:22:31.840 | biggest announcement is that they are going to change search.
01:22:35.600 | This is the piece of the puzzle on the kingdom that they have
01:22:38.880 | been very concerned with, and they're going for it. The new
01:22:41.680 | product and they have like 20 different products, you can see
01:22:45.200 | them at labs.google, where they put all their different
01:22:47.280 | products. But this is the most important one, they call it AI
01:22:50.320 | overviews. Basically, it's Perplexity. For most users, by
01:22:53.600 | the end of the year, they're going to have this. Here's how
01:22:55.680 | it works. And you can see it on your screen. If you're watching
01:22:57.680 | us, go to YouTube. Here, they gave an example: how do you
01:23:00.800 | clean a fabric sofa, this normally would have given you
01:23:03.040 | 10 blue links here, it gives you a step by step guide with
01:23:06.160 | citations and links. So they're preempting, you know, the issue
01:23:10.320 | of people getting upset. And as I predicted, they're going to
01:23:14.240 | have targeted ads, here's the things you need in order to
01:23:18.000 | clean your couch. You can only use this if you're using your
01:23:21.120 | Gmail account. If you use like a domain name on Google Docs, it
01:23:24.640 | won't work there. So go to labs.google. But they're doing
01:23:28.240 | citations. And I think that we're going to see a major
01:23:32.320 | lawsuit here, those people who are in those boxes are going to
01:23:34.960 | look at the answer here and realize maybe they don't get the
01:23:36.960 | click through. And that this answer was built on that. And
01:23:39.680 | now we're gonna have to have a new framework, there's going to
01:23:41.440 | need to be, Sacks, a new company that clears this content so
01:23:45.600 | that Google can do answers like this.
01:23:48.560 | The workflow stuff in Gmail also kicked ass. The demo that they
01:23:52.560 | showed was, you get a bunch of receipts. And the person giving
01:23:56.960 | the demo, she said something to the effect of, well, wouldn't it be
01:23:59.600 | great if, like, you know, the AI assistant were able to find
01:24:02.400 | all the receipts, and then aggregate them, and put them in
01:24:05.680 | a folder, and then also actually generate an expense report, or
01:24:09.600 | like a spreadsheet, on the fly? It's crazy. Yeah, I got to
01:24:14.400 | say, I think that it's fine to change your mind. And so it's
01:24:18.160 | good to do that. Oh, and I think that Chamath, in a rare moment
01:24:22.720 | of reflection, might do a... are we gonna have a re-underwriting?
01:24:26.400 | Is this a re-underwriting? I change my mind all the time. I
01:24:29.040 | just, I mean, you know, because I'm
01:24:30.960 | Ladies and gentlemen, breaking news, Chamath is re-underwriting his
01:24:34.160 | Google trade.
01:24:34.720 | Sorry, I didn't mean to blow your ears out. I think the Google thing is
01:24:40.000 | pretty special between last week's announcement of Isomorphic
01:24:44.080 | Labs, which, let's be honest, that's a, that's just a multi
01:24:48.640 | hundred billion dollar company. So you're saying there might be
01:24:51.600 | many... Think about it this way, right? Multi-billion dollar
01:24:54.400 | opportunities sitting there dormant inside of Google that AI
01:24:56.960 | unlocks.
01:24:57.600 | Look at a company like Royalty Pharma. So if Royalty Pharma,
01:25:00.800 | with a pretty, it's a phenomenal business run by a phenomenal
01:25:04.160 | entrepreneur, Pablo Legorreta. But what is that business? That's
01:25:06.880 | buying two and 3% royalties of drugs that work. And you can see
01:25:11.920 | how much value that those guys have created, which is
01:25:14.160 | essentially 90% EBITDA margin business. It's outrageous,
01:25:18.640 | because they're in the business of analyzing, and then buying
01:25:21.280 | small slivers. I think something like isomorphic ends up being
01:25:25.200 | small slivers. I think something like Isomorphic ends up being
01:25:28.480 | magnitude, or two orders of magnitude higher revenue. So if
01:25:31.760 | you if you fold that back into a Google, if you think about what
01:25:36.160 | they're doing now on the search side, these guys may be really
01:25:39.280 | kicking some ass here. So I think that the, the reports of
01:25:44.400 | their death were premature and exaggerated.
01:25:46.800 | Absolutely. And the report of their death, Freeberg, was based
01:25:49.360 | upon people not needing to click on the ads. But as I said on
01:25:52.880 | this very podcast, my belief is that this is going to result in
01:25:57.120 | more searches and more knowledge engagement. Because once you
01:26:00.400 | get how to cook your steak and get the right temperature, right
01:26:04.720 | for medium rare, it's going to anticipate your next three
01:26:07.760 | questions better. So now it's going to say, Hey, what wine pairing would you
01:26:10.720 | want with that steak? Hey, do you need steak knives? And it's
01:26:13.200 | just going to read your mind that you need steak knives. And
01:26:15.440 | Chamath likes to buy steak knives, but maybe you like to
01:26:17.680 | buy mock meats, whatever it is, it's going to drive more research
01:26:22.000 | and more clicks. So while the monetization per search may go
01:26:25.040 | down, we might see many, many more searches. What do you think
01:26:28.480 | Freeberg? You worked there. And when we look at the future
01:26:33.040 | of the company and the stock price, Nick, we'll pull it up.
01:26:35.760 | Man, if you had held your stock? Yeah, I don't know. Did you
01:26:40.160 | hold? I bought some
01:26:43.920 | original stock.
01:26:44.800 | Oh, no, I sold all my stock back when I started climate
01:26:49.200 | because I was a startup entrepreneur and needed to live.
01:26:52.240 | So which, you know, I did the math on it, it was pretty it'd
01:26:58.880 | be worth. It'd be worth a lot. It would be worth billions or
01:27:03.280 | tens of billions? No, no. Would it have been a billion?
01:27:06.160 | No, no. Okay. You know, I was not like a super I was not a
01:27:09.600 | senior exec or anything. I think what you said is probably
01:27:13.600 | true. So that's accretive. I think the other thing that's
01:27:17.600 | probably true is a big measure at Google on the search page in
01:27:22.560 | terms of search engine performance was the bounce back
01:27:25.360 | rate, meaning someone does a search, they go off to another
01:27:28.240 | site, and then they come back because they didn't get the
01:27:29.920 | answer they wanted. And then the one box launched, which shows a
01:27:33.840 | short answer on the top, which basically keeps people from
01:27:37.200 | having a bad search experience because they get the result
01:27:39.600 | right away. So a key metric is they're going to start to
01:27:42.320 | discover which vertical searches, meaning like cooking
01:27:48.000 | recipes, that kind of stuff, like researching for travel, there's
01:27:51.040 | lots and lots of these different types of searches that will
01:27:53.920 | trigger a snippet or a one box that's powered by Gemini, that
01:27:57.680 | will provide the user a better experience than them jumping
01:28:00.560 | off to a third party page to get that same content. And then
01:28:03.680 | they'll be able to monetize that content that they otherwise
01:28:06.800 | were not participating in the monetization of. So I think the
01:28:10.400 | real victim in all this is that long tail of content on the
01:28:13.680 | internet, that probably gets cannibalized by the snippet one
01:28:16.880 | box experience within the search function. And then I do think
01:28:20.080 | that the revenue per search query in some of those
01:28:23.120 | categories actually has the potential to go up, not down. Explain,
01:28:26.640 | give me an example. You keep people on the page, you
01:28:29.040 | get more search volume. So you get more searches
01:28:33.600 | because of the examples you gave. And then when people do
01:28:36.400 | stay, you now have the ability to better monetize that
01:28:39.520 | particular search query, because you otherwise would have lost
01:28:42.400 | it to the third party content page. So for example, selling
01:28:45.520 | the steak knives is, you know, kind of a good
01:28:47.360 | example, or booking the travel directly, and so on. So by
01:28:52.000 | keeping more of the experience integrated, they can monetize
01:28:54.880 | the search per query higher. And they're going to have more
01:28:59.360 | queries. And then they're going to have the quality of the
01:29:01.840 | queries go up. So I think it's all in, there's a case to be
01:29:05.440 | made. I haven't done a spreadsheet analysis on this,
01:29:07.520 | but I guarantee you, going back to our earlier point about
01:29:10.720 | precision versus accuracy, my guess is there's a lot of hedge
01:29:13.760 | fund type folks doing a lot of this precision type analysis,
01:29:17.280 | trying to break apart search queries by vertical and try to
01:29:20.960 | figure out what the net effect will be of having better AI
01:29:23.520 | driven one box and snippets. And my guess is that's why
01:29:26.160 | there's a lot of buying activity happening in the stock right
01:29:28.240 | now. And I think they're probably all missing, to Chamath's
01:29:30.800 | point, a lot of these call options like Isomorphic Labs. I
01:29:36.000 | can tell you, Meta and Amazon do not
01:29:39.360 | have an Isomorphic Labs and a Waymo sitting inside their
01:29:41.760 | business that suddenly pops to a couple hundred billion of
01:29:44.240 | market cap. And Google does have a few of those. So other
01:29:47.520 | bets actually pay off, maybe. Look, I mean,
01:29:50.720 | there's Calico, no one talks about Calico. I don't know
01:29:52.560 | what's going on with life extension there. Yeah, let me get Sacks involved
01:29:55.040 | in the discussion. Sacks, when we show that example, it's obvious.
01:29:58.720 | Google is telling you where they got these citations from
01:30:02.320 | and how they built their how to clean your couch, how to make
01:30:04.560 | your steak. They were in a very delicate balance with
01:30:07.840 | content creators over the past two decades, which is, hey,
01:30:11.040 | we're going to use a little bit of your content, but we're
01:30:13.760 | going to send you traffic. This is going to take away the need
01:30:17.600 | to send traffic to these places. They're going to benefit
01:30:20.160 | from it. To me, this is the mother of all class action
01:30:23.280 | lawsuits, because they're putting it right up there. Hey,
01:30:25.920 | we're using your content to make this answer. Here's the
01:30:29.040 | citations. We didn't get your permission to do this, but we're
01:30:31.280 | doing it anyway. What do you think is the resolution here?
01:30:34.720 | Does all this content go away because there's no model? Does
01:30:38.160 | Google try to make peace with the content creators and cut
01:30:40.640 | them in or license their data? What's going to happen to
01:30:43.600 | content creation when somebody like Google is just going to
01:30:46.880 | take Wirecutter or these other sources that are not behind a
01:30:50.000 | paywall and just give you the goddamn answer?
01:30:52.160 | Well, look, this is the same conversation we've had two or
01:30:55.280 | three times where we're going to need the courts to figure out
01:30:57.840 | what fair use is. And depending on what they come up with, it
01:31:00.960 | may be the case that Google has to cut them in by
01:31:05.120 | doing licensing deals. We don't know the answer to that yet. By
01:31:08.000 | the way, I do know a founder who is already skating to where
01:31:10.960 | the puck is going and creating a rights marketplace so that
01:31:15.200 | content owners can license their AI rights to whoever wants to
01:31:18.640 | use them. I think that could be very interesting. I had a call
01:31:21.440 | with him yesterday and you and I will be on that cap table
01:31:24.080 | together once again. So I don't want to say who it is because
01:31:27.280 | we can't let him announce his own round, but I'm only
01:31:29.840 | participating in the seed round. Look, stepping back here.
01:31:32.320 | It's interesting. If you go back to the very beginning of
01:31:35.280 | Google, the OG Google search bar had two buttons on it, right?
01:31:38.880 | Search and I'm Feeling Lucky. I'm Feeling Lucky was just, tell me the
01:31:42.560 | answer. Just take me to the best result. And no one ever did
01:31:45.840 | that, because it kind of sucked. Then they started inching
01:31:49.120 | towards it with the one box, but you didn't get the one
01:31:51.440 | box very often. It's very clear now that Gemini powered one box
01:31:55.680 | is the future of Google search. People just want the answer. I
01:31:59.360 | think that this feature is going to eat the rest of Google
01:32:02.320 | search. Now, it's a little bit unclear what the financial
01:32:07.440 | impact of that will be. I think like you guys are saying,
01:32:09.520 | there'll be probably more searches because search gets
01:32:11.840 | more useful. There's fewer blue links to click on, but maybe
01:32:15.360 | they'll get, you know, compensated through those like
01:32:17.920 | relevant ads. Hard to say. You're probably right that Google
01:32:21.360 | ultimately benefits here. But let's not pretend this was a
01:32:25.440 | deliberate strategy on their part. They got dragged kicking
01:32:28.160 | and screaming into this by the innovation of Perplexity and other
01:32:31.360 | companies. Yep, they had no idea, they got caught completely
01:32:34.800 | flat footed. And they've now, I guess, caught up by copying
01:32:38.400 | Perplexity. And Sacks, what about Perplexity? I think they're kind
01:32:41.920 | of screwed now unless they get offered an acquisition deal. But
01:32:45.680 | Perplexity came up with the idea of having citations in
01:32:51.200 | a comprehensive search result. Yeah, which was
01:32:54.320 | something: a search result with citations and related questions.
01:32:57.600 | And right, they did extremely well. And quite frankly, all
01:33:00.560 | Google had to do was copy them. Now they've done that. And I
01:33:03.680 | think it does look like a killer.
01:33:04.800 | And by the way, this was all something that I saw 15 years
01:33:08.080 | ago, when I did Mahalo, which was my human powered search
01:33:10.640 | engine, and which I had copied, or been inspired by, Naver
01:33:14.160 | and Daum in Korea. They were the first ones to do this, you
01:33:17.120 | know, it came up because there were only three or four markets
01:33:20.080 | where Google couldn't displace the number one, Korea, Russia,
01:33:24.400 | Japan. Russia had a Russian search engine. God, I can't
01:33:28.480 | remember the name now. Japan had Yahoo Japan, which Masayoshi-san had
01:33:33.120 | carved out and was never part of it. And they were loyal to
01:33:35.600 | that and very nationalistic. Koreans are very innovative
01:33:38.640 | folks at Daum. And Naver just made search that was so
01:33:43.920 | amazing. You do a search and be like, here's music, here's
01:33:47.120 | images. Here's answers. Here's q&a. It was awesome. But you
01:33:51.600 | know, it just shows you like, you need to have a lot of
01:33:53.760 | wherewithal and timing is everything as an entrepreneur.
01:33:55.760 | My timing was 10 years too early, with the wrong technology. I
01:33:58.320 | used humans, not AI, because AI didn't work 15 years ago.
01:34:00.960 | One thing I would say about big companies like a Google or
01:34:05.360 | Microsoft is that the power of your monopoly determines how
01:34:09.040 | many mistakes you get to make. So think about Microsoft
01:34:13.520 | completely missed iPhone, remember, and they like they
01:34:16.400 | screwed up the whole smartphone, mobile phone era,
01:34:18.720 | and it didn't matter. Didn't matter. Satya comes in, blows
01:34:21.920 | this thing up to a $3 trillion public company, same thing
01:34:24.640 | here with Google, they completely screwed up AI, they
01:34:27.360 | invented the transformer, completely missed LLMs, then
01:34:30.640 | they had that fiasco, where, you know, they have George
01:34:33.760 | Washington... doesn't matter. They can make
01:34:36.960 | 10 mistakes, but their monopoly is so strong, that they can
01:34:39.920 | finally get it right by copying the innovator. And they're
01:34:42.880 | probably gonna become a $5 billion company now, sorry, $5
01:34:45.360 | trillion company. Reminds me, you know, of the greatest product
01:34:48.720 | creation company in history. I think we all know who that was.
01:34:53.200 | And take a look down memory lane. Here are the 20 biggest
01:34:56.800 | failed Apple products of all time. The Apple Lisa, Macintosh
01:35:01.120 | portable, we all remember the Newton, which was their PDA, the
01:35:05.440 | 20th anniversary Macintosh, super sexy. People don't
01:35:08.800 | remember they had their own video game.
01:35:10.640 | I was at a conference a couple years ago, that Jeff Bezos
01:35:15.040 | spoke at, I think he's given this talk in a couple other
01:35:18.480 | places, you could probably find it on the internet. But he
01:35:20.800 | talks about Amazon's legacy of failure, and how they had the
01:35:24.800 | fire phone and the fire this and the fire that and he's like,
01:35:28.400 | our job is to take big swings and fail, we have to make these blunders.
01:35:31.760 | But what makes us successful is that we learn from the failures
01:35:34.640 | and, you know, we make the right next decision.
01:35:37.040 | Yeah, but if you're a startup, and you make big failures,
01:35:40.800 | usually you're out of business, one and done. Yeah.
01:35:43.280 | But this is how you stay competitive. If you're a big
01:35:47.600 | founder led tech company, the only way you're going to have a
01:35:50.800 | shot at staying relevant is to take big shots that you're
01:35:54.800 | going to fail at. I just don't want you to do things that
01:35:59.680 | you're gonna fail at, right? Remember, one of
01:36:01.760 | the huge differences between startups and big companies is
01:36:04.400 | that big companies can afford to have a portfolio of products,
01:36:06.880 | they have a portfolio of bets, some of them will work and that
01:36:09.440 | keeps the company going. A startup really has to go all in on their
01:36:12.160 | best idea. Totally. I always tell founders just go all in on
01:36:15.360 | your best idea. They're always asking me for permission to
01:36:17.840 | pivot. And I always tell them, go for the best idea. Don't
01:36:22.160 | hedge. Don't try to do five things at once. Just go all
01:36:25.280 | in on your best idea. Yeah. And if it doesn't work, you reboot
01:36:28.960 | and start at a new table. You're gonna go all in, so to
01:36:32.640 | speak. Another amazing episode is in the can. The boys are in a
01:36:36.800 | good mood. You got a great episode. No guests this week.
01:36:39.760 | Just all bestie, all the time. And very important: the march to
01:36:45.760 | a million continues, halfway there. You got us there, fans. We
01:36:49.440 | hit 500,000 subbies on YouTube, which means y'all earned a live
01:36:54.800 | Q&A with your besties, coming at you in the next couple of
01:36:58.000 | weeks. We're going to do it live on YouTube. So if you're not one
01:37:01.360 | of the first 500 get in there now so you get the alert. We're
01:37:03.680 | going to take your questions live. It's going to be dangerous.
01:37:06.960 | Any questions? No questions. Who knows what could happen on a
01:37:11.280 | live show. And by the way, I just want to let you know that
01:37:13.520 | Phil Hellmuth breaking news Phil Hellmuth and Draymond Green
01:37:17.040 | just resigned from OpenAI. We didn't get into that. But the
01:37:19.440 | OpenAI resignations continue. Phil Hellmuth has tweeted he's
01:37:22.880 | no longer with OpenAI.
01:37:25.120 | You guys like my baby cashmere pink sweater?
01:37:29.120 | That's pretty great. We're gonna get summer Chamath soon.
01:37:32.240 | Are the buttons coming down? Are you gonna go linen? Will the linen
01:37:35.760 | Chamath show up? The unbuttoning is about to happen, the next
01:37:38.560 | great unbuttoning. This is how you know. It's kind of like
01:37:41.760 | Groundhog Day. You know that summer's here when you lose
01:37:45.360 | your buttons. It's almost Memorial Day, and after Memorial Day, the
01:37:49.600 | buttons can come down. Yeah, we're gonna go three buttons
01:37:52.720 | down. I'll still be wearing my black tee. Saks will still be
01:37:56.560 | blue blazer, blue shirt, red tie. And Freeberg in fields of
01:38:02.160 | gold. Look at Freeberg in fields of gold taking us out. Sting,
01:38:05.440 | Fields of Gold, coming at you, two-for Tuesday. See you all next
01:38:08.800 | time on the All-In Pod. For the Sultan of Science, the Rain Man David
01:38:13.360 | Sacks, and the Chairman Dictator, I am your Z100 Morning Zoo DJ.
01:38:17.280 | We'll see you next time. Love you boys.
01:38:19.120 | Rain Man David Saks.
01:38:29.520 | And instead we open sourced it to the fans and they've just gone
01:38:33.540 | crazy with it. Love you. I'm the queen of quinoa.
01:38:38.740 | Besties are gone.
01:38:46.420 | My dog taking a notice in your driveway.
01:38:50.180 | We should all just get a room and just have one big huge
01:38:56.240 | orgy because they're all just useless. It's like this like
01:38:58.480 | sexual tension that they just need to release somehow.
01:39:02.820 | We need to get merch.
01:39:10.420 | And now the plugs the all in summit is taking place in Los
01:39:23.380 | Angeles on September 8 through the 10th. You can apply for a
01:39:26.660 | ticket at summit.allinpodcast.co. Scholarships will be
01:39:31.900 | coming soon. You can actually see the video of this podcast on
01:39:35.380 | YouTube, youtube.com slash at all in or just search all in
01:39:39.460 | podcast and hit the alert bell and you'll get updates when we
01:39:43.420 | post and we're going to do a party in Vegas my understanding
01:39:47.540 | when we hit a million subscribers so look for that as
01:39:49.960 | well. You can follow us on X, x.com slash the all in pod.
01:39:54.340 | TikTok is all underscore in underscore tok, Instagram, the
01:39:58.800 | all in pod. And on LinkedIn, just search for the all in
01:40:02.060 | podcast. You can follow Chamath at x.com slash Chamath. And you
01:40:06.100 | can sign up for a Substack at chamath.substack.com, I do.
01:40:09.620 | Freeberg can be followed at x.com slash Freeberg, and
01:40:12.500 | Ohalo is hiring. Click on the careers page at
01:40:15.220 | ohalogenetics.com. And you can follow Sacks at x.com slash David Sacks.
01:40:20.180 | Sacks recently spoke at the American Moment conference and
01:40:23.100 | people are going crazy for it. It's in his tweet on his X
01:40:26.180 | profile. I'm Jason Calacanis. I am x.com slash Jason, and if you
01:40:30.940 | want to see pictures of my Bulldogs and the food I'm
01:40:33.140 | eating, go to instagram.com slash Jason, in the first name
01:40:36.780 | club. You can listen to my other podcast, This Week in Startups.
01:40:40.260 | Just search for it on YouTube or your favorite podcast player. We
01:40:42.780 | are hiring a researcher. Apply to be a researcher doing primary
01:40:47.100 | research and working with me and producer Nick, working in data
01:40:50.060 | and science, and being able to do great research, finance, etc.
01:40:54.300 | Allinpodcast.co slash research. It's a full time job working
01:40:57.420 | with us, the besties. And I'm really excited about my investment in
01:41:01.460 | Athena. Go to Athena Wow, you know, wow.com, and get yourself a
01:41:06.860 | bit of a discount from your boy J Cal, you know, wow.com. We'll
01:41:11.140 | see you all next time on the all in podcast.