
Big Fed rate cuts, AI killing call centers, $50B govt boondoggle, VC's rough years, Trump/Kamala


Chapters

0:00 Bestie intros + All-In Summit recap
6:50 Fed cuts 50 bps: Economic tailwind, scary signal, or both?
17:35 AI is coming for call centers; how agent training works
33:41 US government wasting $50B for rural internet and EV charging stations
47:10 Reflecting on some rough years in VC: is the model broken?
67:18 Reacting to the first Trump/Kamala debate, and what factors will make each candidate win or lose the race


00:00:00.000 | All right, everybody, welcome back to the all in podcast. The
00:00:05.100 | channel has been active. We're in the afterglow. We're in the
00:00:08.020 | all in summit afterglow. It's so glowing that Friedberg couldn't
00:00:13.340 | make it. He has been riding a high. Nick told me that in the
00:00:17.100 | last week, we've only put out half the clips
00:00:20.980 | and they've already gotten 20 million views. Oh, my lord. You
00:00:24.960 | know, it will be around 50 million, I think, when all the
00:00:28.700 | clips are released and you let it bake for a couple of months.
00:00:31.900 | That is an astoundingly large amount of reach. Yeah. And
00:00:36.260 | that's just YouTube. We're not doing it on the podcast feed
00:00:38.580 | right now. YouTube and X. Well, hopefully when we get it on the
00:00:40.780 | podcast feed, we get another 50 million. But Friedberg's in his
00:00:45.320 | afterglow, couldn't make it, but he's very busy right now. Look
00:00:48.980 | how happy he is; the summit went well. Is that marijuana? I think
00:00:53.940 | he's making potatoes. I think that's his farm. But I mean, the
00:00:57.340 | smile is incredible. It's marijuana. It's his version of
00:01:02.500 | founder mode. In fact, he's in the bog; that's his founder mode.
00:01:07.660 | He is, he's in the afterglow. And he won't be with
00:01:30.700 | us this week. But he organized such a great
00:01:34.380 | conference. Don't you think, J-Cal? He did great. I mean, he
00:01:38.020 | really took charge of that. And did just an amazing job. I'd
00:01:42.380 | like to give him his flowers. Absolutely. It is like at least
00:01:46.020 | a trillion times better than the first and at least 50% better
00:01:49.500 | than the second. I mean, that's how it should go. You know, when
00:01:52.380 | you create something in the world, Chamath, what you want to
00:01:54.860 | do is you want to hand it off to professional
00:01:57.620 | management to then scale it, right? Not everybody can do the
00:02:00.980 | creative act of actually forming something, you need to have
00:02:04.540 | these operators to go and then execute your vision. And I just
00:02:08.900 | want to give Friedberg his flowers for executing
00:02:10.820 | incredibly well. We all play a role, Chamath. Sacks launched a
00:02:15.020 | tequila company. I want to say thanks to Friedberg. He did all
00:02:19.540 | of these great speakers. Big thank you to our CEO, John, who
00:02:23.620 | put together all the operations. Nick did incredible. Nick did
00:02:27.980 | an incredible, incredible job with those opening graphics. They
00:02:31.580 | went viral. Zach helped with the graphics. You had young
00:02:34.980 | Spielberg chipping in. You had Laura, who did an amazing job with
00:02:37.820 | stage management. And of course, you know, I focused on the
00:02:40.740 | moderation. I got a lot of great things. So everybody plays a
00:02:43.260 | role. You got Sacks with the tequila, Friedberg, Laura, Zach,
00:02:49.060 | Spielberg, Nick, John, everybody brought something to the table.
00:02:53.100 | So congratulations to everybody. You scale through
00:02:57.900 | people. That's it. Scale through people. That's it. Did anybody
00:03:02.180 | catch the joke? Wait, come on. Everybody contributed. You
00:03:07.460 | understand? Sacks, new tequila company; John, operations;
00:03:12.420 | Friedberg, content; me being the world's greatest
00:03:14.940 | moderator up there. What's your contribution? Oh, yeah, Chamath
00:03:18.540 | showed up. Chamath looked great. I showed up. That's it, just
00:03:24.060 | showed up and looked great. I brought my two votes. And I
00:03:27.740 | brought my vision. Absolutely. I would also say fan favorite:
00:03:32.060 | what you really did that was amazing was you took a lot of
00:03:34.620 | selfies. I was very proud of both of you, with the fan
00:03:38.380 | service. Fans were very pleased that you guys took so many
00:03:41.900 | selfies. You know, we got a lot of feedback coming in. So it
00:03:47.740 | was pretty, pretty great feedback. Do you think that you
00:03:52.420 | did better as moderator because you finally let go of just the
00:03:56.860 | conference organization? What do you think? Yeah, I think that
00:04:01.940 | you're able to focus on your unique value add instead of
00:04:05.100 | immersing yourself in a bunch of details that could be handled by
00:04:09.340 | the team. I agree. It was absolutely a process to get you to
00:04:14.140 | let go. Well, you know... so it's a fair point. People
00:04:18.700 | did say my moderation was dialed in. And I appreciate
00:04:21.540 | that positive feedback from everybody. And yeah, there is
00:04:24.380 | something to having people you trust with the content, you
00:04:28.660 | know, moderation was excellent. This time it was better than
00:04:31.020 | before. Because I think that you're actually exceptional as a
00:04:34.820 | moderator. And I think you're mostly average as a conference
00:04:38.100 | producer. But I do think as a moderator, you're excellent. I
00:04:44.420 | mean, like some of the most memorable moments were you
00:04:48.780 | basically drawing out contrasting opinions, and the
00:04:54.220 | way that the people engaged with them was so healthy and good.
00:04:57.420 | That was the I think the recurring theme. So I give you
00:05:01.140 | an enormous amount of credit. I think you did an exceptional job.
00:05:03.620 | But I also think it's because you were able to focus on what
00:05:05.820 | you're... yes, I do agree with that. I was talking to Jade
00:05:09.260 | about it, and she said, and Nick also pointed out, you were really
00:05:11.740 | dialed in, J-Cal. What's up? And I said, I'm not worrying
00:05:14.500 | about the party and the vendors and the front desk and the
00:05:17.100 | sponsors. And it is actually that you're able to focus. Did you
00:05:21.260 | have some favorite moments yourself there? Sacks, any
00:05:23.340 | favorite moments for your panels or things maybe that exceeded
00:05:26.140 | expectation for you?
00:05:27.220 | Well, I thought the Mearsheimer/Jeffrey Sachs panel was
00:05:31.820 | great. I thought it would be which is why I helped organize
00:05:35.300 | it. But I was just glad that the audience so many people in the
00:05:38.020 | audience reacted and said that was the surprise hit of the
00:05:40.120 | conference. I would say that was my favorite of the event. One
00:05:42.620 | of the best panels I've ever been part of. It's the most
00:05:45.220 | viewed. It's like slightly above Elon's one. Really? Oh, just
00:05:48.660 | behind; Elon's slightly ahead. But yeah, it's still
00:05:51.620 | growing. It's still finding an audience. Well, I think that
00:05:55.120 | if you look at the one from last year, Graham
00:05:58.020 | Allison, where he got a standing ovation, the thing is, there are
00:06:00.380 | these village elders who are at a point in their life
00:06:05.540 | where they're willing to just be a truth teller. But oftentimes
00:06:10.220 | they're de-platformed. And we have the ability to actually bring
00:06:14.500 | some of the smartest of them on and give them a voice. And it's
00:06:18.860 | incredible how much they resonate because what they say
00:06:21.220 | is so logical and sensible. So that's a really important thing
00:06:26.820 | that we have now at our disposal. And I think that
00:06:29.580 | people really appreciate it, you know, so we're like, I think
00:06:32.540 | we're doing a really important job in doing that. And now the
00:06:36.060 | question is, what village elders do we get next year to keep, you
00:06:39.540 | know, being truth tellers?
00:06:40.700 | Well, give us your thoughts. You know, there's an All-In Twitter
00:06:44.700 | handle, and he's @chamath, @DavidSacks, and I'm @Jason, and
00:06:48.140 | Friedberg's @friedberg. Just tell us who you think would be great.
00:06:50.540 | But Sacks, I know you're super excited and want to give Biden
00:06:53.260 | his flowers. The Fed just cut rates 50 bps and the stock
00:06:57.420 | market is tearing it up right now. On Wednesday, Fed cut
00:07:02.580 | interest rates by half a percentage point, taking them
00:07:06.340 | down off of a 23-year high. We've been talking about this,
00:07:10.380 | God, for two years here on this podcast. First rate cut since
00:07:13.220 | March of 2020, which is about when we started this podcast.
00:07:17.860 | Jay Powell basically said the Fed thinks inflation is coming
00:07:20.940 | down to around 2% nicely. And they don't want the job market
00:07:26.180 | to soften any further than it already has. He also mentioned
00:07:29.820 | immigration has helped soften the market, the labor market as
00:07:34.700 | well, obviously, with all those new people looking for jobs. So
00:07:37.540 | In the last two months, July and August, CPI has been at a two
00:07:40.380 | handle. We talked about that: 2.9% in July, 2.5% in August. Here's
00:07:45.260 | the CPI over the last decade. Obviously, the massive boom in
00:07:50.900 | inflation that you see there from 2021 to 2023. Many obviously
00:07:57.540 | think we're gonna have more rate cuts, probably 25 every meeting
00:08:00.820 | for a little bit. And the Dow's already at an all-time high,
00:08:06.900 | surged 300 points on the news. Here's some interesting
00:08:13.300 | data about the 50 basis point kickoff cuts. So this is where
00:08:17.420 | it gets interesting, Chamath. The Fed only started publicizing their
00:08:21.020 | interest rate changes in 1994. Since '94, the Fed has initiated a
00:08:24.620 | cutting cycle six times. Here's the chart. Take a good look at
00:08:28.380 | that. '95, '98, 2019, they started with 25 bps. '01 and
00:08:34.700 | '07, the dot-com bust and the great financial crisis, they started with a 50
00:08:38.940 | bp cut. And obviously, there was an emergency 50 bp cut in March
00:08:44.060 | of 2020, when COVID hit. '01, '07, 2020: very severe situations.
00:08:49.980 | And what happened in the markets is what I want to
00:08:53.700 | discuss with both of you today. In 2001, the market fell 31% in
00:08:58.940 | the two years after that rate cut. In 2007, the market fell 26% after two
00:09:04.380 | years. And in 2020, despite all the fears, the market ripped 44% over
00:09:10.140 | two years. What's the more likely scenario, Chamath? Is
00:09:13.460 | this similar to the dot-com and great financial crisis eras, or similar to
00:09:17.740 | 2020?
00:09:18.580 | Well, I think 2020, you have to put in a big asterisk because
00:09:23.020 | the question is, what would have happened had there not been
00:09:25.500 | COVID? And had there not been an entire global shutdown. So if
00:09:30.340 | you go back to that chart, you could probably just extrapolate
00:09:33.820 | and cut out that part that's flat. Because the part that's
00:09:39.500 | flat from 2020 to 2022, was largely artificially created.
00:09:43.740 | Because on top of that, we injected so much money into the
00:09:47.020 | economy, the reality is, we probably would have raised at
00:09:50.300 | some rate of change that you could have predicted from 2016.
00:09:54.820 | So what do you take away from that? I think that you
00:09:57.820 | have to, like, realize we're at a point in the economy where you
00:10:04.060 | cut rates, because there's tension. And there's tension
00:10:08.780 | between employment and unemployment, there's tension
00:10:12.340 | between earnings growth and contraction. And so it's a
00:10:17.780 | stimulatory move. So if you look through that stimulatory move,
00:10:22.260 | why is the Fed doing this? And why will they cut probably all
00:10:26.740 | the way down to two or 3% by the end of 26? It's because we now
00:10:30.780 | need to stimulate the economy. So the reason why markets tend
00:10:36.100 | to fall once the rate cut cycle starts, is because the next
00:10:41.140 | couple of quarters sort of demonstrate what I think the Fed
00:10:44.980 | is expecting, which is that there's pressure in the economy.
00:10:48.900 | We have not seen that flow through in earnings or in how
00:10:54.180 | companies describe markets on the field, by and large, except
00:10:58.220 | for a few. So I think this part of the cycle now will be about
00:11:01.980 | all of these companies telling us whether there's nothing to see
00:11:05.180 | here, or whether there is actual real pressure. And if there is
00:11:08.220 | real pressure, it'll probably look like the several times
00:11:11.180 | before where you're just going to have to contract the value of
00:11:15.180 | financial assets, because they're just not worth as much
00:11:17.620 | when they're earning less.
00:11:18.660 | Okay, Sacks, any thoughts here? Just balls and shrugs?
00:11:23.300 | I think a lot of people are commenting on the fact that the
00:11:27.140 | only other two times we've had a 50 basis point rate cut in
00:11:31.260 | modern history, it has been just before a recession. So I think
00:11:35.660 | this happened in 2001 and 2007, right before the recession, and
00:11:41.660 | the Fed had to do a dramatic rate cut because they could see
00:11:44.100 | in the data that things were weakening. So a lot of people
00:11:46.220 | are asking the question, well, is that what's going on here?
00:11:48.820 | Now, Powell's comments, though, are indicating that the economy is
00:11:54.060 | in good shape. He said the economy is in very good shape,
00:11:56.140 | basically indicating that they had tamed inflation, and
00:12:00.340 | that they would look to cut another 50 basis points this
00:12:03.300 | year. So Powell's rhetoric is in a way at odds with the magnitude
00:12:12.060 | of this cut. You know, so why didn't they just cut 25
00:12:14.580 | basis points? I think people are trying to figure that out,
00:12:17.380 | reading the tea leaves. Why do 50, when they
00:12:21.180 | could just do 25 a month? Sure. Yeah, if the economy is hot,
00:12:27.060 | why wouldn't you tiptoe into rate cuts? And just do 25? Now,
00:12:31.300 | that's the key thing. If you look at the dot plot, and if
00:12:34.500 | you look at where the smart financial actors are betting
00:12:37.100 | where rates end... it's hard to sort of look at any point
00:12:41.420 | in time, 50 now, 25 later, what does it all mean? It's very hard
00:12:45.660 | to know. But what is much clearer is, where do we think
00:12:48.820 | terminal rates will be even in the next 18 months, and it is
00:12:52.820 | dramatically lower than where they are now. And I think that
00:12:55.700 | supports, Sacks, that argument that you just made, which is, if
00:12:59.620 | you're going to basically cut this aggressively over the next
00:13:03.580 | year to year and a half, by the estimates of very smart
00:13:07.100 | financial actors whose job it is to spend every day observing the
00:13:10.460 | Fed, then they must see something because otherwise, as
00:13:14.100 | you said, you could take a much more gradual approach. And so I
00:13:17.940 | think that the smart financial actors are guessing recession
00:13:23.140 | or guessing contraction. I think what they're also guessing is
00:13:27.620 | similar to nonfarm payrolls, we're going to go through a
00:13:31.060 | couple of difficult GDP revisions, probably downward. And
00:13:35.540 | I think that will have an impact on people's sense of how the
00:13:39.860 | economy is doing, even more than what their sense is today, which
00:13:42.940 | is already teetering; it's at best okay. And I think all of
00:13:47.700 | that has to play itself out. So it's going to be a very
00:13:49.940 | complicated and dynamic fall in that respect.
00:13:52.700 | Yeah. And I think so much of this has to do with
00:13:56.380 | unemployment. We had that period where so many jobs were
00:14:00.900 | available. Remember, we talked about it here 11 12 million jobs
00:14:03.700 | available at the peak. We can debate the numbers, of course,
00:14:06.580 | but we all saw it where you just couldn't hire talent in America,
00:14:09.860 | there was so few people available to take positions. And
00:14:13.940 | man has that changed. And you get to see it on the ground in
00:14:16.820 | early stage startups, where this whole narrative, I don't know if
00:14:21.540 | you start in your board meetings, but hey, we can't find
00:14:24.180 | a person, hey, we're looking, hey, that search is still going,
00:14:26.820 | we're still looking for a director of sales, we're still
00:14:28.580 | looking for salespeople, we're still looking for developers,
00:14:30.580 | we're still looking for operations people. Now, it's the
00:14:33.100 | opposite. It's like, I just I'm hiring producers here in Austin,
00:14:38.140 | because I'm building it my in person studio. We had like, I
00:14:41.820 | don't know, a dozen viable candidates for this position.
00:14:44.780 | And I had a hard time picking between, you know, the top
00:14:48.660 | three. Now, that's distinctly different than my experience for
00:14:52.060 | the last five to 10 years, where you were like, how do
00:14:56.140 | we fill this role? So I think that employment has been broken.
00:14:59.300 | And that's the thing that has me concerned, because with all
00:15:01.860 | these people who came in through the southern border, and then
00:15:04.740 | you have people outsourcing to other countries, I wonder if
00:15:08.500 | Americans are going to lose so many of these mid paying jobs.
00:15:12.260 | And this will dovetail into our next story about Amazon making
00:15:14.900 | cuts. I'm very worried about the hollowing out of the upper
00:15:18.500 | middle class, that elite group of $150,000 jobs that then
00:15:23.220 | employ nannies and spend money in the economy. I wonder, I
00:15:26.940 | don't know if you're seeing that in your company, sacks.
00:15:28.980 | I'm not worried about the hollowing out of that class.
00:15:32.540 | Staying for that. But I mean, just in terms of the labor
00:15:38.180 | market, what do you see, you know, in companies right now,
00:15:40.980 | you know, the hiring, the talent pool, etc.?
00:15:45.020 | Well, I mean, in tech, things are pretty good. I mean,
00:15:48.260 | they're not as absurdly frothy as they were during the bubble
00:15:51.620 | of 2020 and 2021. But things are good. You have this huge AI
00:15:56.340 | tailwind now. And there's just a ton of investment going into AI.
00:15:59.660 | There's a little bit of a tale of two cities going on. If
00:16:02.460 | you're in AI, things are really bubbly. And if you're outside AI,
00:16:05.740 | things have returned to much more normal levels in terms of
00:16:10.580 | valuation and company operations, all that kind of
00:16:14.420 | stuff. Just to go back to the state of the economy for a
00:16:17.940 | second. The reason why a lot of people were predicting a
00:16:21.300 | recession, including me for a while is that the yield curve
00:16:25.620 | inverting has been an almost perfect gauge of whether a
00:16:29.900 | recession is coming. It's when basically the Fed raises short
00:16:34.380 | term interest rates above long term interest rates. Normally,
00:16:38.060 | long rates are the ones that should be higher because
00:16:40.780 | investors demand a higher rate of return to tie up their money
00:16:43.380 | for longer. So something's really off and kind of broken
00:16:47.180 | when short rates go above long rates, the yield curve inverts.
00:16:50.020 | And it's always been the prelude to a recession. But the
00:16:53.140 | recession doesn't come when the yield curve inverts, it usually
00:16:56.900 | comes when the yield curve de-inverts. And the reason for that
00:17:00.140 | is because the Fed now sees weakness and dramatically cuts
00:17:04.700 | the short rates. So in other words, it jacks up the short
00:17:07.220 | rates to control inflation. That works, it trickles through
00:17:10.460 | the economy, the economy cools down. And then the Fed says, Oh,
00:17:14.180 | shit, maybe we've overcorrected, they slam on the
00:17:16.580 | brakes, and then they cut rates to basically make up for the
00:17:18.700 | effect in the economy. So the yield curve has finally de-
00:17:22.100 | inverted. And the question is just, do we now get that
00:17:25.340 | recession? Or did the Fed manage this to a soft landing? I don't
00:17:28.700 | think we know I'm not. I'm not like calling a recession. But
00:17:31.940 | this is the thing that people are concerned about. Yeah.
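
The mechanism Sacks describes is simple enough to sketch in a few lines of Python: the curve inverts when short rates rise above long rates, and de-inverts when cuts pull short rates back down. All the yields below are made-up illustrative numbers, not real market data.

```python
# Hedged sketch of the inversion/de-inversion signal described above.
# All yields are illustrative placeholders, not actual market data.

def spread(two_year: float, ten_year: float) -> float:
    """10y-minus-2y spread in percentage points; negative = inverted curve."""
    return ten_year - two_year

# (label, hypothetical 2-year yield %, hypothetical 10-year yield %)
history = [
    ("before hikes", 1.5, 2.5),      # normal upward slope
    ("peak tightening", 5.0, 4.0),   # short above long: inverted
    ("after Fed cuts", 3.6, 3.7),    # cuts pull short rates back down
]

was_inverted = False
for label, two_yr, ten_yr in history:
    s = spread(two_yr, ten_yr)
    if was_inverted and s >= 0:
        # Historically, this de-inversion step is when recessions have arrived.
        print(f"{label}: curve de-inverted (spread {s:+.2f})")
    elif s < 0:
        print(f"{label}: curve inverted (spread {s:+.2f})")
    else:
        print(f"{label}: normal slope (spread {s:+.2f})")
    was_inverted = s < 0
```
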
00:17:34.740 | Well, Sacks, we were talking about AI in the group chat, right?
00:17:37.580 | Yeah, I think it's now becoming really clear that call centers
00:17:40.820 | are going to be the first really big disruption caused by AI.
00:17:44.820 | Yeah, I mean, all the level one customer support is going to get
00:17:48.140 | replaced by AI. I mean, LLMs plus voice, because, you know,
00:17:53.740 | OpenAI just released their audio API. You saw that at the
00:17:58.500 | All-In Summit: we released a Mearsheimer AI. Yeah, where we
00:18:03.700 | trained it on all of his work. And you can go to
00:18:06.180 | Mearsheimer.ai and ask it questions. And it will tell you the answers in
00:18:09.220 | his voice, because we cloned his voice using Resemble AI. Anyway,
00:18:13.660 | so AI can do voice now. And it can be trained extremely well on
00:18:19.820 | large data sets to give you answers to questions, which is
00:18:22.940 | pretty much what customer support is. So I think it's now
00:18:26.460 | becoming clear that I think within the next two to three
00:18:29.460 | years, you're going to see a massive disruption in that. I
00:18:31.620 | agree with that massively. And I think there's another under-
00:18:34.820 | reported story, which is people don't like to call and talk to a
00:18:40.020 | customer service agent like an actual human, if they can avoid
00:18:43.580 | it, they would much rather go on YouTube and say, How do I fix
00:18:46.380 | this? Or, you know, ask ChatGPT, how do I fix this? It's
00:18:49.740 | like, I don't want to waste another person's time, just give
00:18:52.300 | me the answer as quick as possible. And AI will give you
00:18:55.180 | the answer quicker. YouTube will give you the answer quicker.
00:18:57.900 | I've had so many times where I have people who work for me,
00:19:01.100 | who were like, I don't know how to do that. And I literally would
00:19:03.740 | walk up to the computer and load YouTube and type in, how do I
00:19:08.180 | blank, and there's a video there. Watch it on 2x speed, you can
00:19:11.300 | do it. That's what's, you know, gonna also kill this. Like, I
00:19:14.860 | don't want to talk to a human, just change my flight. You
00:19:19.020 | know? But you talk about... yeah, I mean, you talk about
00:19:21.580 | disruption, call centers are a very big part of the economy in
00:19:24.980 | certain geographies, Denver, Salt Lake, I mean, parts of
00:19:28.820 | Florida. Yeah, exactly. It's a really big deal. If like half
00:19:33.020 | the cost gets ripped out of those call centers,
00:19:35.300 | where would you move those people? If you had your
00:19:38.040 | choice, could they move to sales?
00:19:39.820 | Well, I think sales will be the one that's disrupted after
00:19:42.500 | customer support. But I don't know, I think it's gonna be very
00:19:46.320 | disruptive. One of the reasons I think this is, you know, in the
00:19:48.900 | early days of LLMs, people were saying that legal services would
00:19:53.360 | be disrupted. And you saw some very highly valued startups
00:19:58.300 | rocketing up based on that. I think the problem with that is
00:20:02.300 | the error rate. So when you think about AI applications, you
00:20:08.020 | have to think about what is the tolerable error rate that the
00:20:11.940 | industry will allow? Because we know that AIs get things wrong,
00:20:15.380 | they can hallucinate. And you're never gonna be able to make it
00:20:17.920 | perfect. I mean, you can improve the quality, but it's still
00:20:20.180 | gonna have some errors. And when you're dealing with like legal
00:20:23.220 | services, for example, you just can't have mistakes. This is not
00:20:26.820 | tolerated. However, customer support is different. Customer
00:20:29.780 | support is already organized into levels: level one, level two,
00:20:33.540 | level three, based on difficulty. And there's already
00:20:36.800 | in a sense, a mechanism for failover. It's like, if the level one
00:20:39.740 | customer support person can't answer the question, they kick
00:20:43.460 | it up to level two. So there's a place for LLMs to start in
00:20:49.260 | customer support, which is replacing all the level one and
00:20:52.300 | then working their way up the chain to level two, as they get
00:20:55.820 | better and better. And so what I'm saying is that the level of
00:21:00.500 | accuracy now, especially with the new PhD-level reasoning
00:21:03.620 | models, is good enough. Yeah, we don't need to wait for some
00:21:07.220 | perfect LLM. And I think this is why this is going to be
00:21:10.820 | a big, big disruption. They're gonna have their jobs
00:21:15.420 | disrupted, or at least transformed.
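
To make the failover idea concrete, here's a minimal sketch of the routing Sacks describes, with an LLM sitting at level one and humans above it. The `answer_question` helper and the 0.8 confidence threshold are hypothetical stand-ins, not any particular vendor's API.

```python
# Hedged sketch of tiered customer support with an LLM at level one.
# answer_question() is a hypothetical stand-in for a chat-completion call;
# the 0.8 confidence threshold is an arbitrary illustrative choice.
from dataclasses import dataclass

@dataclass
class Answer:
    text: str
    confidence: float  # self-reported or classifier-scored, 0.0 to 1.0

def answer_question(question: str, kb: dict[str, str]) -> Answer:
    """Toy retrieval over the docs and past tickets the model was trained on."""
    for topic, doc in kb.items():
        if topic in question.lower():
            return Answer(doc, confidence=0.95)
    return Answer("I'm not sure.", confidence=0.2)

def handle_ticket(question: str, kb: dict[str, str]) -> str:
    answer = answer_question(question, kb)  # level one: the LLM goes first
    if answer.confidence >= 0.8:
        return f"[L1 bot] {answer.text}"
    # Failover: the same mechanism support orgs already use between human tiers.
    return "[escalated to level two human agent]"

kb = {"refund": "Refunds post within 5-7 business days.",
      "password": "Use the reset link on the login page."}
print(handle_ticket("How do I get a refund?", kb))         # bot answers
print(handle_ticket("My device makes a weird noise", kb))  # escalates
```

The design point is that the escalation path already exists in support organizations, so the LLM can take over level one without the error rate ever being customer-facing in the way it would be for, say, legal work.
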
00:21:17.100 | Well, it could be the end of that entire career as well. Chamath, if
00:21:19.860 | you were to look at this four-by-four, sort of quadrant chart
00:21:23.160 | that Sacks is describing, which is the cost of an error, you know,
00:21:28.140 | and the actual complexity of the job, perhaps, or the cost of the
00:21:34.300 | job, how do you look at this? I know you're working on
00:21:38.620 | software that kind of does this with your startup as well.
00:21:42.420 | I mean, I'll preview one use case from 8090, which is pretty
00:21:52.140 | stunning. You know, we work with a very large,
00:21:58.220 | highly regulated public company. And they have a very
00:22:07.140 | complicated set of people and processes, because of the field
00:22:12.580 | in which they're in. And David, your point is exactly right. It
00:22:17.820 | took us a fairly long time. But we're at a point now where
00:22:24.660 | we've been running AI-powered software versus the old legacy
00:22:31.320 | deterministic solution. And we've been running it at 100%
00:22:34.980 | accuracy now for about 10 days. So this is still very new. And
00:22:41.260 | it's an incredible thing, because to your point, our first
00:22:44.660 | version was, like, in the mid-80s, then we were in the mid-
00:22:48.460 | 90s, then we were, you know, 97, 98%. But there were still
00:22:52.380 | errors. And it just took a lot of engineering to figure out how
00:22:55.980 | to get to 100. But now it's at 100. And it's been consistently
00:22:58.660 | at 100. And so we're all kind of scratching our heads, because
00:23:01.900 | now the next step is, well, what do we do? To your point, what do
00:23:06.380 | we do? So we're figuring that out right now. But the art
00:23:12.900 | of the possible is that I think well crafted AI software is as
00:23:18.820 | good as deterministic software in the sense that the error
00:23:22.140 | rates will be equivalent in production, and at the level of
00:23:27.700 | a very highly regulated public company. And I think that's the
00:23:31.500 | gold standard, because in those sectors, those companies have
00:23:34.820 | zero tolerance. It's not a toy. It's not even, you know, level
00:23:39.500 | one customer support. It's system of record type work.
00:23:43.620 | Yeah. But it shows what's possible. And to your point,
00:23:47.220 | Sacks, we're doing that today with even the best
00:23:50.580 | models. Imagine how good those underlying models will
00:23:53.860 | get in a year from now. Yeah, right. And it'll be able to take on
00:23:57.340 | more and more work. It's very stunning. Actually, it's
00:23:59.980 | really
00:24:00.420 | Have you guys worked with the o1-preview yet? I just literally
00:24:04.260 | have been using this new reasoning engine that OpenAI
00:24:08.060 | released, and it is extraordinary. And it's kind of
00:24:11.300 | thinking about the next three or four prompts you would do. And I
00:24:14.180 | literally just got this while we're on the show. I've hit
00:24:17.740 | the limit for my paid account, because this thing is
00:24:20.100 | so intense on compute, I guess.
00:24:23.180 | Well, the thing with o1 is that I think it's starting to add
00:24:26.620 | reasoning, but the way that you do reasoning is sort of this
00:24:29.780 | idea that you have this chain of thought. And I think that that's
00:24:33.420 | a very powerful but early concept. And as we refine those
00:24:39.140 | ways in which these models get to better answers, the wonderful
00:24:43.860 | thing is that OpenAI will preview o1. And then
00:24:49.220 | they'll have the actual o1 production build, probably in
00:24:51.980 | the next couple of months, which will be probably pretty
00:24:53.740 | spectacular. But then you'll see something from Claude, you'll
00:24:56.860 | see something from Llama. And the real art, I think, and this
00:25:01.860 | is where I do think it's a little bit of alchemy still,
00:25:04.300 | which I think is good, because it keeps humans involved. All of
00:25:07.220 | us involved. Yes. Is how do you stitch all of those things
00:25:11.340 | together to get to a 0% error rate? What Sacks said, you
00:25:15.380 | know, how do you minimize the blast radius? And how do you
00:25:17.340 | make sure these things are super high quality? Right? Well, and
00:25:21.540 | people don't realize, it's still a very hard technical problem. Go ahead,
00:25:24.340 | Sacks. And then I'll show you what I mean.
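
One way to read the "stitching" Chamath is gesturing at: run the same query through several models and only accept an answer a majority agrees on, keeping a human in the loop for disagreements. A minimal majority-vote sketch follows; the three toy models are hypothetical stand-ins for, say, OpenAI, Claude, and Llama endpoints.

```python
# Hedged sketch of cross-checking several models to shrink the error rate.
# The "models" here are toy callables standing in for real API clients.
from collections import Counter

def majority_answer(question: str, models) -> str:
    votes = Counter(model(question) for model in models)
    answer, count = votes.most_common(1)[0]
    if count > len(models) // 2:  # strict majority required
        return answer
    # Minimize the blast radius: unresolved disagreement goes to a human.
    return "DISAGREEMENT: route to human review"

model_a = lambda q: "reset the router"
model_b = lambda q: "reset the router"
model_c = lambda q: "replace the modem"

print(majority_answer("Internet is down, what do I do?",
                      [model_a, model_b, model_c]))
# -> "reset the router" (2 of 3 agree); a three-way split would escalate
```
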
00:25:25.700 | so yeah, one of the reasons why I'm bullish on this customer
00:25:28.300 | support use case is because there's a very large data set
00:25:31.140 | to train on, you've got all of the product documentation that
00:25:34.500 | company's ever created, you've got all of the previous email
00:25:38.380 | support, you know, and calls. Yeah, the calls have been
00:25:42.100 | recorded. So you can now train the AI on that. So there's a
00:25:45.180 | very large body of data to train the AI model on and it's not
00:25:49.580 | necessarily the most proprietary. It's not like
00:25:52.580 | dealing with people's medical records, or even confidential
00:25:56.620 | legal documents, something like that. So the data is readily
00:25:59.580 | available. And then the foundation models are getting
00:26:01.460 | really good. I think there's a big question here about value
00:26:04.940 | capture, which is there's a number of startups now that are
00:26:08.300 | becoming very highly valued that are chasing this disruption,
00:26:12.100 | this sort of customer support agent, disruption. And they're
00:26:16.260 | getting into very high valuations, even unicorn
00:26:19.620 | valuations already. And the question is, well, wait, if
00:26:23.220 | the foundation models are advancing at such a radical pace,
00:26:26.220 | like a year from now, why couldn't a developer, just
00:26:30.020 | a startup like you guys, take next year's model, train
00:26:33.620 | it, and then commoditize the...
00:26:35.380 | You're making such a good point with this. So when we were trying to
00:26:40.620 | figure out like what applications we would build and
00:26:43.740 | like which sectors of the economy we would go after, I was
00:26:47.140 | like, guys, we got to go after the hardest, most regulated
00:26:50.940 | places. Because those are the things and places and people
00:26:54.860 | that have absolutely zero tolerance for error, and where
00:26:58.340 | you're going to need to do some amount of customization
00:27:01.460 | and specialization to actually solve these problems. And Sacks,
00:27:06.260 | to your point, I said you cannot, we cannot
00:27:08.940 | touch customer service, we cannot touch it, because it's
00:27:11.660 | going to get commoditized and run over by these foundational
00:27:16.300 | models within a year. Right? You'll be able to deploy these.
00:27:20.700 | It's just too easy, you'll be able to do it on a local
00:27:23.140 | computer. I mean, you'll just download the entire database of
00:27:25.860 | every call on a MacBook with an m3 just run and build on that,
00:27:29.860 | that the other thing that's now possible, and you saw this with
00:27:32.660 | Karna because Karna put out this like cryptic tweet slash press
00:27:36.940 | release, where I think maybe it was in their earnings, Nick,
00:27:39.140 | maybe you can find this, but they're like, we've deprecated
00:27:41.660 | Salesforce and worked it. That was strange. How, how can a
00:27:46.180 | company that big deprecate those two systems of record? How is
00:27:50.420 | that even? It's how is it? I think it means they're writing
00:27:53.260 | their own, right? Well, I'll tell I'll tell you how it's
00:27:55.220 | possible. And so this is like this next crazy thing that's
00:27:58.020 | been happening. We've been doing a version of this to go after
00:28:01.300 | some other sources of software. We haven't had the balls, to be
00:28:05.180 | honest, to go ever since or workday. But here's how they do
00:28:09.260 | it. They write these agents. And these agents can spawn other
00:28:13.580 | agents, right. So it's very classic kind of machine that
00:28:16.340 | builds a machine. And you start to observe the inputs and
00:28:19.740 | outputs of a system, right? I'm hyper simplifying, but I'm just
00:28:23.260 | it'll make the point. And over time, what the agents start to
00:28:27.060 | do is by observing the inputs and the outputs, they start to
00:28:29.580 | guess on what the intervening code is. And the code paths must
00:28:32.980 | be in the middle to generate the outputs based on these inputs.
00:28:35.940 | And so over time, what happens is you develop a digital twin.
00:28:40.380 | And then you run that against that counterfactual against
00:28:45.460 | workday or Salesforce. And then at some point, you're like, it's
00:28:48.860 | the same. And you just turn it off. And you're saving yourself
00:28:53.340 | 10s or 100s of millions of dollars. So that's, it's a
00:28:56.900 | version of what Karna did, it takes an enormous amount of
00:29:00.820 | technical strength to do it. It also takes tremendous, I think,
00:29:05.540 | executive courage and leadership, because I think
00:29:08.100 | that's a very difficult decision to embark on. But if you're an
00:29:11.340 | engineer, that must be an unbelievably exciting technical
00:29:15.620 | challenge to be a part of but but that's the basic premise of
00:29:18.780 | what they were able to do. Hopefully, they share more and
00:29:22.740 | maybe they even open source what they did, because I think it
00:29:25.100 | would just be an amazing thing for all of us to look at.
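
Restated as code, the "digital twin" loop is: log the legacy system's input/output pairs, run a candidate replacement in shadow mode against the same inputs, and only cut over after sustained 100% agreement. Everything below is a hypothetical sketch; `legacy_system` stands in for the black-box Salesforce/Workday behavior being observed.

```python
# Hedged sketch of the observe/shadow-test/cut-over loop described above.

def legacy_system(record: dict) -> str:
    # Stand-in for the black-box system of record being replaced.
    return "approved" if record["amount"] < 1000 else "manual_review"

def candidate_twin(record: dict) -> str:
    # The agent-built replacement, inferred from observed behavior.
    return "approved" if record["amount"] < 1000 else "manual_review"

# Step 1: observe real traffic and log (input, output) pairs.
observed = [({"amount": a}, legacy_system({"amount": a}))
            for a in (50, 500, 999, 1000, 5000)]

# Step 2: run the twin in shadow mode against the counterfactual log.
mismatches = [(inp, expected, candidate_twin(inp))
              for inp, expected in observed
              if candidate_twin(inp) != expected]

# Step 3: cut over only after sustained 100% agreement.
if not mismatches:
    print(f"Twin matched all {len(observed)} observed cases; safe to cut over.")
else:
    print(f"{len(mismatches)} mismatches; keep observing and refining.")
```
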
00:29:29.740 | Yeah, I mean, to restate it: watch people use a piece of
00:29:33.700 | software, and then based on what they do, you could write the code.
00:29:37.700 | Which, you could take a video of a video game today, like Angry
00:29:42.300 | Birds, and somebody did this: you give the Angry Birds iPad, you
00:29:46.260 | know, game from 15 years ago, to AI, and it's going to back into the
00:29:50.140 | code, just by watching it. So why not just watch people use
00:29:54.380 | Salesforce or Workday? And those are very expensive products,
00:29:57.540 | thousands of dollars per user, right?
00:29:59.380 | I want to get Sacks's point of view, like the thing in
00:30:01.780 | enterprise software that we were always told is you cannot touch
00:30:04.860 | these systems of record. Don't ever start a systems of record
00:30:07.900 | company, don't try to touch these systems of record
00:30:10.420 | companies, don't, you know, try to disrupt them. It's an
00:30:13.180 | impossible task. But then the question is, if you have these
00:30:17.500 | things, why do you necessarily need a system of record in the
00:30:22.020 | way that you needed to before, when you were writing all this
00:30:24.540 | clunky deterministic code?
00:30:26.340 | Well, I saw the Klarna story where they said they were going
00:30:29.860 | to rip out Salesforce and Workday because they were able
00:30:32.980 | to write their own bespoke code using AI. I mean, I have to say
00:30:36.580 | I'm a little bit skeptical of that story for a couple of
00:30:39.140 | reasons. One is, if that's their goal, why wouldn't they have
00:30:42.980 | open source this these products they created, you might as well
00:30:45.820 | open sourced these products they created? You might as well
00:30:48.380 | trying to sell this product that they've internally created.
00:30:51.940 | They're just trying to rip out the cost. So why not let the
00:30:54.300 | whole ecosystem see it? The other thing is, if it's so easy
00:30:58.980 | to do, why hasn't the market already been flooded with new
00:31:02.540 | startups that are effectively able to reverse engineer them?
00:31:05.700 | I don't think you're right. I don't think it's easy to do,
00:31:07.820 | because I don't think there's a generalization here that's
00:31:10.860 | productizable. Do you know what I mean? Like, I do think that
00:31:13.620 | these are very custom specific things. So maybe there's like
00:31:17.740 | some scaffolding, but I don't think that that scaffolding has
00:31:20.540 | a ton of economic value. I think it's really good open source
00:31:23.260 | stuff. Yeah, I think it's what you build on top of it. And so
00:31:26.860 | that hasn't been figured out yet, for sure.
00:31:29.180 | Yeah, look, I think that if you're only using a few use
00:31:33.620 | cases of these big, complicated software packages, then yeah,
00:31:37.620 | it's probably easier than ever to deprecate them, you know,
00:31:41.420 | eliminate them from your stack and just have your own internal
00:31:43.780 | engineers build specifically what you need in a more tightly
00:31:47.020 | integrated way. I think that is possible.
00:31:49.740 | Nick, show this tweet to these guys.
00:31:51.620 | Here's the tweet. This is a crazy one. Yeah.
00:31:57.540 | So look at, but look at the code. Look at the actual
00:32:01.260 | product itself for a second.
00:32:02.900 | Yeah, but the product's garbage. I mean, look how ridiculous this
00:32:05.660 | is. But that was 600... sorry, it was a billion dollars. They
00:32:10.180 | paid Oracle 600 million to build our course management portal.
00:32:16.900 | It's built on top of Oracle's PeopleSoft suite, which they
00:32:20.060 | refused to customize without an extra 400 million, to hit 1
00:32:23.020 | billion. New Yorkers got the image below and pay 5 million
00:32:25.900 | plus a year for hosting.
00:32:27.660 | Look, this is egregious government waste. I mean, that
00:32:31.460 | site looks pathetic. I mean, honestly, this looks like
00:32:35.780 | it could have been done with a SharePoint site, and you pay some
00:32:39.140 | consultant to stand it up for 1% of the cost. And there
00:32:44.020 | are better, more modern platforms than that. So this is
00:32:47.420 | just incredibly wasteful and inefficient government spending.
00:32:51.860 | But they were going for retro. They wanted to harken back to
00:32:57.060 | the 90s.
00:32:59.620 | The reason I wanted to show this to you is I think that these kinds of
00:33:02.380 | things will not be possible in the future. I just don't see how
00:33:05.900 | one could spend $1 billion, if one tried to, to enable that
00:33:11.580 | feature.
00:33:12.140 | Right, but that's 600 million that was wasted on that
00:33:17.180 | crappy portal. That shouldn't have happened even without AI.
00:33:21.220 | Right? Because there were much better ways; you
00:33:24.940 | could buy a much better product for 1% of the cost, or 0.1%
00:33:28.660 | of the cost.
00:33:29.580 | There must be some regulatory capture going on here, where
00:33:32.380 | somebody's got a racket. No, it's like a 10-year rent. That's a
00:33:36.500 | 10-year relationship with somebody in Albany, you
00:33:39.820 | know. It's wasteful. It's waste, fraud, and abuse. It's
00:33:43.580 | the same thing that's happening with rural internet. Did you see
00:33:47.220 | that? Paradoxically, it is our next story. So let's go for it.
00:33:51.780 | In related news of our government burning our money:
00:33:56.980 | rural broadband and EV charging, 42
00:34:01.700 | billion and 7.5 billion, almost $50 billion combined. Let's just
00:34:06.500 | go over these two programs real quickly here. Both were part of
00:34:10.100 | the $1.2 trillion infrastructure bill in 2021. 42 billion carved
00:34:15.580 | out to provide high-speed internet to people living on
00:34:19.540 | farms and in rural locations, 7.5 billion carved out to build
00:34:23.500 | 500,000 EV chargers over 10 years. It's been 1,000 days since the
00:34:28.180 | bill was passed. So let's check on the progress: zero people have
00:34:31.220 | been connected, according to FCC Commissioner Brendan Carr, and
00:34:36.060 | eight, 1-2-3-4-5-6-7-8, EV chargers have been built as of May, according
00:34:42.860 | to Autoweek magazine. What's even crazier, private industry
00:34:46.860 | already solved these problems. United Airlines just announced
00:34:50.260 | they're putting Starlink on 1000 of their planes, and they're
00:34:54.620 | going to offer it for free. And Starlink now has 2500 planes
00:34:58.900 | under contract with a bunch of other airlines. And in the
00:35:04.100 | second half of 2023 alone, the private sector built over 1,000
00:35:08.620 | charging stations in the US. These are two problems that have
00:35:12.380 | already been solved, Sacks. Why are we burning $50 billion in the
00:35:19.740 | future on things that have already been solved? We've
00:35:25.060 | solved for this. You, I, own electric cars. I have the
00:35:28.220 | answer. You know the answer. Say the answer, Jason. Corruption.
00:35:31.580 | No, come on, Jason. Incompetence. Really? Graft.
00:35:36.220 | Keep going. I mean, you tell me. Corruption, graft, buying votes
00:35:43.780 | from your constituents.
00:35:45.020 | They haven't delivered any of it.
00:35:48.540 | Incompetence? Yes.
00:35:49.660 | Well, there's a couple things going on here. So one is
00:35:54.020 | typical government waste, fraud, and abuse, where they're
00:35:58.620 | allocated 42 billion for rural internet and haven't hooked anyone
00:36:01.780 | up. And we could spend a fraction of that giving people
00:36:05.420 | Starlink, and allowing the private sector to do its job.
00:36:09.180 | And why even pay for it sacks? Why are we paying for it? If
00:36:12.140 | it's available? That's the baseline, but it's worse than
00:36:16.180 | that. Because on top of the waste, fraud and abuse, and the
00:36:19.100 | fact that the government is grossly incompetent and
00:36:22.100 | inefficient. You also have naked political retaliation going on
00:36:25.620 | here. That's the answer. Yeah, exactly. And Brendan Carr, who's
00:36:29.460 | an FCC commissioner pointed this out. He said that in 2023, the
00:36:34.660 | FCC canceled or revoked an $885 million contract with the
00:36:39.820 | company by claiming Starlink is not capable of providing high
00:36:42.740 | speed internet. Then the year later, that Yeah, of course,
00:36:47.140 | that was a lie. And then a year later, the FCC is now claiming
00:36:50.620 | that Starlink provides so much high speed internet that the
00:36:53.420 | word monopoly should be tossed out. Yeah. So look, this is just
00:36:58.020 | which is it's pure naked retaliation. The the Biden
00:37:01.940 | Harris administration doesn't want to admit that Elon has the
00:37:05.820 | best solution for rural internet, just like they
00:37:08.780 | couldn't admit he made the best electric cars. Remember when
00:37:11.300 | they did that Evie summit, and they didn't invite him. That was
00:37:13.780 | just nakedly political, because he's not right. So look, I mean,
00:37:19.300 | the Biden Harris administration, it look, it's blue no matter
00:37:22.540 | who, and Elon has drifted from being sort of independent and
00:37:28.420 | not a line. He was blue. He was blue. I mean, now he's for
00:37:32.860 | Hillary and Obama. He said he's no longer team blue. And so
00:37:36.100 | they're punishing him for this. Yeah. But it's costing
00:37:38.780 | taxpayers a huge amount of money. I think this is one of
00:37:41.420 | the worst decisions by the current administration. And if
00:37:44.460 | Trump gets in there, he should reverse it on day one.
00:37:46.900 | Well, we need to investigate, I mean, I think, how we got to the
00:37:50.220 | point of wasting $50 billion. That requires an investigation,
00:37:55.940 | I think. Chamath, your thoughts?
00:37:56.940 | One comment is, and this is so sad, but I'm so desensitized by
00:38:01.740 | the amount of waste that I don't know whether 50 billion is a lot
00:38:05.540 | or a little anymore when it comes to the United States
00:38:07.780 | government. Isn't that sad? Like, because now everything I
00:38:11.260 | hear is normalized to hundreds of billions and trillions, but 50
00:38:15.020 | billion is an enormous amount of money, right?
00:38:18.460 | Well, that's a good point. I remember, you know, back in the
00:38:21.820 | day, 60 Minutes used to do the segments on waste, fraud and
00:38:27.180 | abuse at the Pentagon, different parts of the government $42
00:38:29.940 | billion just spent on something that really taxpayers could have
00:38:33.820 | for free, or without the government getting involved. And
00:38:36.700 | you know, 42 billion that was lining someone's pocket when the
00:38:39.260 | service doesn't even work, that would have been a scandal. And
00:38:41.860 | the media would have covered it. But the media doesn't even cover
00:38:44.540 | it these days. And again, it's because the media has become so
00:38:47.500 | tribal, that it's better dead than red, and blue no matter
00:38:51.580 | who. And so because the media would have to admit that Elon's
00:38:55.460 | already solved this problem, they just can't go there, they
00:38:57.780 | won't even cover this. And so we have no accountability, there's
00:39:01.260 | no accountability on the government.
00:39:03.140 | If I had to just take a step back and just generalize going
00:39:06.260 | forward, do we want to live in the kind of administrative state
00:39:12.940 | where they will pick people that they dislike, based on totally
00:39:21.300 | random criteria, a tweet, a meme, a post, and then all of a
00:39:27.420 | sudden punish a bunch of the rest of us because of that.
00:39:31.020 | They're punishing all of America, because they collect
00:39:34.540 | our taxes to waste on it. And then they punish the people that
00:39:38.460 | they actually say they're going to uplift by not delivering what
00:39:41.420 | they promised. And if you take Elon out of it for a second, the
00:39:46.060 | problem was when we crossed the chasm and did it to the
00:39:49.740 | first guy, him. But the reality is there's only one of him and
00:39:53.500 | then there's a lot of the rest of us. And what will happen is
00:39:55.980 | people would just get added to this list of folks that certain
00:40:02.540 | nameless faceless people in the administrative state dislike.
00:40:06.700 | And what happens is the country slows down, and the country
00:40:10.020 | wastes money and the country pilfers it away. And that has to
00:40:13.380 | stop. And so what really bothers me about these things is, A, I
00:40:18.020 | don't know how to un-desensitize myself to the fact that, all of a
00:40:21.180 | sudden now, because of just all of this sloppy waste, I didn't
00:40:25.300 | react as much as I should have to just $50 billion being flushed
00:40:29.900 | down the toilet on these two projects. And then, to Jason,
00:40:33.380 | your point, it is a solved problem that you can give
00:40:37.540 | incredibly cheaply. And the fact that it's not left to private
00:40:42.940 | enterprise to solve this, and instead, it's just brazen
00:40:46.220 | partisanship combined with retaliation combined with
00:40:49.220 | incompetence,
00:40:49.980 | and buying votes by giving this money to other vendors who are
00:40:54.540 | giving them donations.
00:40:56.060 | And just to give the Democrats their due: what happens if then
00:40:59.700 | Trump does the same thing for a solution that you support and you
00:41:02.540 | need and you think should be everywhere? The point is, we
00:41:05.540 | don't want any of this stuff under any administration,
00:41:09.180 | apparently. And the minute that one administration breaks
00:41:12.780 | the seal and makes it acceptable, it becomes part of
00:41:17.180 | the water table. And that's the real problem. We broke the seal
00:41:21.460 | on this crazy, multi-trillion-dollar spending, and it
00:41:24.820 | has just never stopped since then.
00:41:26.980 | And you know, the incentives really matter. If you look at a
00:41:31.180 | private company, if you were at Klarna, to our previous
00:41:34.220 | story, and you go to the boss and say, I know how to get rid
00:41:37.100 | of this wasteful spending we're doing here, we can get
00:41:40.180 | rid of all tier-one calls with AI and save that money, you get
00:41:43.380 | a promotion. If you're in the government, you can't. If
00:41:47.900 | you're a politician and you cut this program, your constituents
00:41:51.380 | get upset, you don't have that stuff being built in your
00:41:54.420 | district. There's a perverse incentive there to buy
00:41:57.340 | the votes, which is why these folks are constantly trying to
00:42:00.380 | buy votes. And the good news is, I really
00:42:04.820 | applaud the people that have the courage to show this stuff on X,
00:42:09.300 | to tweet this, Brendan Carr, so that the rest of us know about it, and the
00:42:13.140 | person that talked about the NYC thing. But then the next step
00:42:17.060 | has to happen, which is that we all need to decide that this
00:42:20.020 | stuff needs to stop. Otherwise, it's going to bankrupt our
00:42:22.300 | country. And we have to celebrate it. That's the key. If
00:42:25.420 | we can celebrate people saving money again, like Milei is
00:42:29.220 | getting a lot of credit. And that's up to us, leadership in
00:42:32.700 | podcasting or the media, or influential people with
00:42:35.820 | followings. If you point out, hey, this is a waste, go save
00:42:38.980 | this money. And somebody does save the money. Well, why don't
00:42:41.180 | we start celebrating people saving the money and doing the
00:42:43.580 | right thing here? Because this is our children's future.
00:42:46.740 | Is it true that Kamala was the broadband czar that was
00:42:50.380 | responsible for this thing? I mean, who knows? I just don't know,
00:42:53.820 | because I saw that a bunch of senators wrote a letter
00:42:57.420 | to her, and they claim that she was the broadband czar. But I
00:43:01.260 | don't know if that's true or not true. And whether she was, I
00:43:04.460 | mean, we just remember she was the AI czar. I mean, the
00:43:07.220 | administration did put her nominally in charge of various
00:43:11.180 | technology initiatives. Here's an idea: save money, get
00:43:15.580 | the best solution at the lowest price, and then reevaluate that
00:43:18.900 | as you go. And I'll sort of point out, and this is a subtle
00:43:22.540 | point, but Elon also open-sourced his patents for the
00:43:27.060 | Superchargers and let anybody use them. And he opened up the
00:43:30.620 | superchargers to other vehicles, which he didn't have to do. And
00:43:35.020 | when they gave him a loan, back in the Solyndra days in the
00:43:38.700 | Fisker days, remember, they gave these incentives in the form of
00:43:41.300 | loans, he's the only guy who paid it back, everybody else
00:43:44.020 | failed. So now you're punishing the guy who actually built the
00:43:47.380 | infrastructure for both of these projects. So the reward for
00:43:51.140 | actually doing the right thing, which Starlink did, SpaceX did,
00:43:54.660 | and Tesla did, is to be punished. And then you're giving a leg up
00:43:58.460 | to somebody else who's building these trucks. Who's more
00:44:00.500 | qualified to build these chargers at scale, or a satellite network
00:44:04.300 | at scale, than the person who's already done it? He's already
00:44:06.900 | done it.
00:44:07.420 | I do worry that there's a growing version of the Elon
00:44:11.820 | derangement syndrome. That's also kind of like festering.
00:44:14.940 | Yeah, for sure. Which just stops people from thinking
00:44:19.140 | rationally. Of course. I mean, we're talking about laying fiber
00:44:23.580 | lines, cable modems to people who are hundreds of miles into
00:44:27.980 | the countryside. That makes no sense when you can just put a
00:44:32.300 | satellite dish up today. What are we even talking about? I
00:44:36.020 | mean,
00:44:36.380 | government has never been particularly efficient. But
00:44:39.540 | there was a period of time where people would at least care about
00:44:43.260 | wanting to make it more efficient. And it would be a
00:44:46.140 | scandal if there was political corruption to try and bias the
00:44:49.380 | result in a way that actually deprived the intended recipients
00:44:54.260 | of the program from getting the services they're supposed to get
00:44:56.580 | and cost the government way more money than it needed to. We're
00:44:59.620 | so far beyond being that country anymore, where we actually
00:45:04.860 | debate the best policy. Now it's just like we're warring
00:45:09.620 | political tribes. And the objective of the party is to
00:45:14.220 | punish its political opponents to engage in retaliation, and to
00:45:18.980 | basically loot the public coffers as much as possible on
00:45:21.860 | behalf of their constituents. And that's what's basically
00:45:24.380 | happening. You know, it's completely dysfunctional.
00:45:26.900 | Well, let's use this podcast. If you see government waste, tell us.
00:45:29.980 | And no one cares because the media doesn't really shine a
00:45:32.820 | light on it, because they're completely tribalized
00:45:35.300 | as well.
00:45:35.700 | I agree with everything you're saying except the last part. I
00:45:37.900 | don't think it's on behalf of their constituents. I don't
00:45:39.860 | think any of us see any benefit from any of this spend.
00:45:43.300 | No, no, I meant their donors, the donor constituents. Yes,
00:45:47.300 | not the citizens of the country. Yeah.
00:45:49.620 | But who's winning in this? It's not like this 42 billion is
00:45:54.140 | lining the pockets of, I don't know, name a firm. What do you
00:45:57.380 | think?
00:45:57.580 | The companies that are gonna lay that fiber are gonna get that
00:46:00.900 | money.
00:46:01.340 | And then... it's been three or four
00:46:05.180 | years, they haven't done a single thing.
00:46:07.420 | I mean, I still think they'll cash the checks. Yeah, it seems
00:46:11.260 | like we're at the stage of just pure incompetence and
00:46:13.180 | retaliation. We're not even at the stage of actually then
00:46:15.340 | giving it to anybody else. I mean, so they're
00:46:19.620 | giving the money away, and they're not getting the
00:46:22.140 | political benefit from it.
00:46:23.500 | They're so incompetent they can't get out of their own way. But
00:46:26.140 | somebody is getting that, call it 50 billion, that we don't need
00:46:29.900 | to spend. And the way that money is awarded is going to be
00:46:33.580 | political. You have to think that they're going to turn
00:46:35.740 | around and get big political contributions.
00:46:37.860 | Of course. Well, I think I think that I think the good news is
00:46:40.220 | that the more of these things we shine a light on, the harder
00:46:43.380 | it'll be to hide when these grants are actually given or
00:46:48.260 | what the execution is. And let's start a running list.
00:46:52.700 | To your point, Sacks, maybe, like, you know, we need a
00:46:55.260 | revival of the 60 Minutes-style waste, fraud, and abuse segment on
00:46:59.060 | this program. We'll do it at the end of the show, every time
00:47:01.060 | we'll have a running list at allin.com of just every one of
00:47:05.220 | these scandals, and we'll feature it. So leak it to us
00:47:07.700 | first, send it to us; my DMs are open. All right, listen, early
00:47:11.540 | stage investing has always been hard. There was a tweet storm
00:47:14.140 | this week that Y Combinator might be having a hard time
00:47:16.940 | replicating their early success. We'll discuss it now. A thread
00:47:20.620 | this week from X user Molson Hart caught a couple people's
00:47:23.780 | eyes. He made the case that it's been a rough decade for YC based
00:47:27.260 | on the accelerator's top companies page. YC lists its top
00:47:31.300 | companies by 2023 revenue there. And you'll notice there's not a
00:47:35.580 | lot of companies from the recent cohorts: of the 50 companies
00:47:38.180 | featured, only three are from the classes after 2020, most of them
00:47:41.700 | being from the early 2010s. Obviously, that's because
00:47:45.560 | they've been around longer, but it sparked a big discussion that
00:47:49.420 | there were so many winners from the 2009 to 2016 era. And that
00:47:53.780 | maybe the class size at YC has expanded a whole bunch. And
00:47:58.580 | maybe that's part of the problem. But there's a bigger
00:48:00.700 | problem in VC that we've talked about here. Here's a chart from
00:48:04.420 | Carta that just shows the percentage of VC funds that have
00:48:07.740 | made a distribution since 2017. Over 40% of 2018 vintage funds
00:48:13.100 | have not made a single distribution yet. And it's
00:48:16.860 | getting to the point you're five, six or seven where you
00:48:19.460 | probably should have had some distributions occur. Obviously, a
00:48:22.980 | lot of this has to do with maybe M&A and those early wins being
00:48:26.460 | taken off the table. We've talked about that a whole bunch.
00:48:30.180 | But here's the chart that gets really interesting. An
00:48:34.940 | explosion in fund managers occurred, as we all know, and
00:48:38.040 | this chart from PitchBook shows first-time
00:48:41.980 | VC managers that raised a second VC fund as a share of all first-
00:48:45.580 | time VC managers. And it's now down from above 50% to below,
00:48:51.420 | gosh, 15%. So what are your thoughts here?
00:48:56.980 | My gosh, venture is a really, really tough business. Every
00:49:02.460 | year, for the last six or seven years, I have
00:49:07.820 | published my returns, which most VCs don't want to do. I do it
00:49:14.060 | because I go back and I look at it. And I think having public
00:49:18.580 | accountability actually drives some good decisions. They
00:49:22.620 | may seem suboptimal in the moment, but in the long
00:49:27.100 | run they turn out to be good decisions. And the biggest one
00:49:30.340 | has been generating liquidity. So Nick, you can throw up this
00:49:35.380 | thing. But I'm sure there are funds in each of these vintages
00:49:38.620 | that have done way better than me. So I'm not saying anything;
00:49:41.220 | you know, it is what it is. But what I want to point out is, if
00:49:45.660 | I go and look inside of these funds and tell you how hard it
00:49:48.260 | has been to generate this DPI, it's like dragging an
00:49:53.860 | entire sack of potatoes over the finish line. It's like dragging
00:50:01.260 | a truck of dead bodies over the finish line. It's super, super
00:50:06.340 | hard. And the things that we have fought are two. One is that
00:50:12.580 | the gestation of companies has totally blown out. We used to be
00:50:16.980 | in a world where by year five, six or seven, you could return
00:50:20.020 | money. You just can't do that anymore, unless you get
00:50:22.700 | extraordinarily lucky, which, by the way, I got when Sacks was
00:50:25.900 | running Yammer. It was an enormous win for all of us. But
00:50:31.340 | that is just exceptionally rare. And that was M&A in year, what,
00:50:34.980 | five or six, Sacks? There are so few
00:50:37.620 | entrepreneurs capable of that; he's one of maybe five or ten. So
00:50:41.500 | other than that, I've never really had a company that has
00:50:46.100 | generated liquidity in year five, six, or seven. They've
00:50:49.060 | always generated, if they generated at all, in years 11, 12,
00:50:54.060 | and 13. And so the problem with that is that at some point, you
00:50:59.140 | have these paper marks that say you're winning, and things are
00:51:02.380 | working. But there's no path to liquidity. So then what I did
00:51:07.740 | was I stepped into the secondary markets, and I would
00:51:11.100 | sell. And it would really upset certain founders. But I was very
00:51:17.900 | clear that when I was running outside capital, and I was
00:51:21.940 | running outside capital on behalf of
00:51:24.380 | organizations that I believed in, the Broad Foundation, the Mayo
00:51:27.820 | Clinic, Memorial Sloan Kettering, my job was to get
00:51:31.060 | them money back. You know, these were their pension funds. These
00:51:34.380 | were the things that they used to build facilities, to fund cancer
00:51:37.020 | research. I didn't have the, you know,
00:51:40.980 | ability to just sit on my hands and say, Oh, you know what, year
00:51:43.860 | 15, don't worry. So it's just meant to say that the
00:51:49.140 | tactics of generating liquidity in venture are very
00:51:53.220 | misunderstood and very underappreciated. And even then, you
00:51:59.100 | sell some things that are just absolute winners that, had you
00:52:01.660 | waited another five or six years, would have returned another, you
00:52:05.220 | know, one or two turns. But that's not the job. The job is
00:52:09.380 | not to maximize absolutely every single win; the job is to return
00:52:13.300 | capital in a reasonable time period, so that your investors
00:52:17.900 | don't run out of money to give. Yeah, it's so it's a tough game,
00:52:21.660 | man. It is really, really, really tough. Yeah. And,
00:52:24.420 | sorry, by the way, I feel this now because, you
00:52:27.940 | know, the last five or six years have been entirely my own
00:52:30.180 | capital. And my gosh, it's hard. Yeah, managing liquidity is
00:52:34.620 | impossible. It's impossible, especially when you can't rely
00:52:37.020 | on anybody else. So
00:52:38.460 | well, thank God for the secondary markets even emerging
00:52:41.780 | because at the same time that the secondary markets emerged
00:52:44.820 | and people were willing to buy venture assets, you know, going
00:52:48.660 | into their second decade, I would have been in real trouble
00:52:51.300 | without them, without reasonably liquid secondaries, myself included. I mean,
00:52:54.620 | my numbers would be a quarter of what they are.
00:52:57.580 | Yeah. And I took advantage of almost every time I had one of
00:53:00.940 | those opportunities to sell some shares, pare some positions. And
00:53:04.380 | that's how we got our DPI as well. Because, let's face it,
00:53:07.540 | Lina Khan and the anti-tech sentiment have led to these large
00:53:12.260 | companies not buying startups. Instead, they compete with
00:53:16.380 | them; they just say, we'll build it in-house, because you're not
00:53:18.940 | letting us buy it. And it's broken the entire ecosystem. Now
00:53:22.820 | that's broken, and the IPO process is broken. I tried to
00:53:30.380 | flip that on its head with SPACs. You know, some worked,
00:53:34.100 | some didn't; many didn't. In the end, many of mine didn't work
00:53:37.260 | out, though there was a period where it looked like it
00:53:39.260 | was working. But these are all attempts at changing the
00:53:43.340 | liquidity cycle of these companies. Because the way that
00:53:47.740 | things stand today, we are not in a sustainable industry. It is
00:53:52.180 | if you raise funds and think about fee generation. But it is
00:53:56.020 | not if you think about returning money to founders, to LPs, getting
00:54:00.140 | employees compensated for many years of, you know, toil that
00:54:03.300 | they put in. It's a very tough game right now.
00:54:05.860 | Well, Sacks, right now we're seeing people do things like
00:54:08.700 | selling, you know, their early SpaceX or their early Stripe,
00:54:12.820 | whatever it is, to other VCs to later stage funds, a lot of ways
00:54:18.060 | to try to secure DPI. What are your thoughts on the state of
00:54:21.460 | venture today, given all this data that we're looking at
00:54:24.340 | today?
00:54:24.740 | Well, two points. So first, I agree with Chamath that the
00:54:28.740 | amount of time it takes to generate an outcome for, I'd say
00:54:32.780 | most startups is longer than the 10 year period of these funds.
00:54:37.020 | And these funds can be extended up to 12 years usually, but then
00:54:40.020 | what do you do after that? It just takes a lot longer than
00:54:43.100 | that, in a lot of cases, to generate a meaningful outcome. I just had
00:54:46.140 | two companies that I invested in in my second fund, so in
00:54:50.020 | 2019 and 2020, four years ago and five years ago, just
00:54:54.940 | get marked up. And it was a big markup; the companies are doing
00:54:58.220 | well, I call them late bloomers, it took four to five years for
00:55:02.220 | them to accomplish what they wanted to in terms of like
00:55:04.620 | building out the tech. I mean, I invested at like the earliest
00:55:06.860 | stage. So that's how long it took. And now they just did
00:55:09.860 | growth rounds, and they're kind of off to the races. But, you
00:55:12.900 | know, it could easily be 10 years from here to get to a
00:55:16.300 | liquidity event. So you're talking about more like 15 year
00:55:18.820 | funds. So I agree with that point. The second thing, though,
00:55:22.220 | is that the big thing that's happened in our industry is we
00:55:25.660 | had a bubble in 2020, and especially 2021. And we just had
00:55:31.660 | a ton of capital come into the industry because the Fed and the
00:55:35.260 | federal government airdropped $10 trillion of
00:55:38.740 | liquidity onto the economy in reaction to COVID. And not all
00:55:43.500 | that money went into VC; it went into a lot of places. But the
00:55:46.180 | VC industry was flooded with cash. And you see this in the
00:55:49.820 | deployments. I mean, in those bubble years, there was something
00:55:53.140 | like 200 billion a year of capital deployment when normally
00:55:56.140 | it's 60 to 100 billion. So twice the amount of money is
00:56:00.300 | going into the industry and being deployed, and rounds are
00:56:03.340 | now twice as big and valuations are twice as big. That has a
00:56:06.820 | huge effect on returns. So for example, the
00:56:11.100 | average venture fund is like a 2x return. But if the entry
00:56:14.900 | prices were artificially doubled, then there goes your return
00:56:19.100 | right there; your 2x becomes a 1x. So I think we're
00:56:22.140 | just in the hangover of this massive liquidity bubble that
00:56:26.500 | didn't originate in the venture capital industry. It came from,
00:56:29.340 | frankly, the federal government. But we're just downstream of
00:56:33.060 | that. Now what I would say is I do think we're at the tail end
00:56:36.180 | of working that out. And the good news is that we now have
00:56:40.420 | maybe the most exciting tech wave ever, which is AI,
00:56:43.980 | definitely the most exciting tech wave since the internet
00:56:46.260 | came along in the mid to late 90s. So the hope is we're
00:56:50.780 | finally going to have like really exciting things to
00:56:52.700 | invest in again. But yeah, look, I think we're at the tail
00:56:56.900 | end of the last cycle and the beginning of a new cycle.
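A quick back-of-the-envelope sketch of that arithmetic: the 2x baseline is Sacks's figure from above, everything else is hypothetical, and it assumes the eventual exit values stay fixed while only the entry price paid changes.

```python
# Sketch (not from the show): fund multiples scale inversely with
# entry price when the eventual exit values are held fixed.

BASELINE_MULTIPLE = 2.0  # "the average venture fund is like a 2x return"

def fund_multiple(entry_price_inflation: float) -> float:
    """Gross fund multiple if every entry price is inflated by this factor."""
    return BASELINE_MULTIPLE / entry_price_inflation

print(fund_multiple(1.0))  # normal vintage            -> 2.0
print(fund_multiple(2.0))  # bubble vintage, 2x prices -> 1.0
```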
00:56:59.900 | And vintage distortion is so real, you know, it's very hard
00:57:03.900 | to understand how each of these vintages with your late bloomers
00:57:07.380 | or overpriced things, companies getting a $100 million round
00:57:11.660 | at a billion-dollar valuation before they have
00:57:13.980 | product-market fit. And those distortions were just so
00:57:18.140 | pronounced the last five to 10 years that we're now sorting
00:57:21.300 | them out, like a house of mirrors where you don't know
00:57:24.340 | who's tall, who's fat, who's skinny, what the reality is
00:57:26.620 | here. And the other big thing is this peanut butter effect that
00:57:29.540 | you know, I tweeted about today. You know, during peak ZIRP, you
00:57:33.660 | had all these exceptional team members, you know, the number
00:57:37.500 | two, three, four, five person at a company that was doing great, they would
00:57:41.500 | leave to start their own company. So the talent got
00:57:43.700 | spread, then you had so many of these founders rushing into the
00:57:47.300 | same vertical. So you'd have 20 startups, because there was too
00:57:50.540 | much capital pursuing the same opportunity. When you pursue the same
00:57:53.420 | opportunity, what happens to earnings? They get spread. Then
00:57:57.380 | what happens to customers, they get spread across 20 different
00:58:01.060 | products competing for the same customer. And then what happens
00:58:04.420 | with, you know, ownership stakes for us as GPs and LPs? The
00:58:09.340 | ownership stakes, because the valuations went up so much,
00:58:12.380 | got spread like peanut butter. Instead of a Series A getting
00:58:15.860 | you 20% of a company, it got you 10. Instead of a seed check getting you
00:58:19.060 | 5%, it got you one. There's no DPI possible.
00:58:22.380 | You nailed it. And Sacks nailed it. And the thing to
00:58:25.100 | remember is both of those two things now work together to
00:58:28.180 | erode the return stream for the general partner, but really most
00:58:32.380 | importantly, for the limited partner. So I do think that we
00:58:36.620 | are in a situation where the average returns are going to
00:58:39.980 | decay by 50 to 100%, because of what Sacks said and because of
00:58:43.580 | what you said. On top of that, I don't think we know what the
00:58:48.060 | actual cap structure needs to be for a successful AI company. Is
00:58:52.980 | it 20 people that do the work of 2,000 now, because they have
00:58:56.180 | all of these agents and systems that work on their behalf? If
00:58:59.420 | that's true, giving that company hundreds of millions of dollars
00:59:02.980 | is actually the opposite of what you want to do. You want to give
00:59:05.980 | that company 10 or 15 million and then let them cook. And so we
00:59:11.020 | have a right-sizing of capital problem that needs to
00:59:13.300 | happen. The data would tell you though, that the industry
00:59:16.260 | understands that. So the fact that we've gone from 50% of
00:59:19.340 | people being able to raise a fund to 12% means that a lot of
00:59:23.220 | people will get washed out of the industry, less capital being
00:59:26.820 | raised, which probably is foreshadowing the fact that
00:59:30.460 | these companies will need a lot less capital. But you know, that
00:59:33.780 | has a lot of implications as it ripples through our economy.
00:59:36.460 | I think it's very good for the early stage. I think, you
00:59:39.380 | know, you guys are very good there. You've talked about how
00:59:41.620 | it's good for you. It's very complicated, I think, for the
00:59:44.420 | expansion- and growth-stage capital. And then I think
00:59:47.460 | there's going to be another turn on what happens in
00:59:50.980 | the IPO markets, because you can't have so many companies
00:59:55.060 | waiting with very, very few ways of accessing public market
01:00:01.500 | capital and exposure. I just think that is
01:00:04.500 | fundamentally broken, and we're going to have to reinvent it. We
01:00:07.780 | tried once with SPACs; we're going to have to go back to the
01:00:10.140 | drawing board and try again: listings, secondary markets that
01:00:13.460 | are more fluid. I don't know what it is. But we need to do
01:00:15.780 | something because the status quo doesn't work.
01:00:17.460 | There are so many good points that we're hitting
01:00:20.540 | here. I'll just say the other thing, to build on your
01:00:24.340 | point about, hey, these take less capital, you have to look
01:00:27.980 | at what your ownership is after you've been diluted
01:00:31.260 | by 50% as a seed or Series A investor; you're going to be
01:00:34.820 | down to half. So if you own 10%, you own five. If you own seven,
01:00:38.580 | like YC, or we do in a company, you're going to own three,
01:00:41.540 | you're really going to have to model out: at the valuation
01:00:44.940 | you're looking at, what does it pencil out to for an outcome?
01:00:48.020 | And when I did this with our investments, I saw a leak in my
01:00:51.020 | game, which was, hey, I'm putting 100k into a $25 million
01:00:54.060 | round or a $50 million round as a follow on investment, you
01:00:57.900 | know, to support the founder. Okay, what does that do for my
01:01:00.940 | LPs? Well, that 100k would need to hit some extraordinary
01:01:04.820 | outcome, $5, $10, $20, $40 billion, in order for us to return the fund.
01:01:10.100 | So now my team understands: hey, take that 125k, that 250k, that
01:01:14.540 | 500k, and do more accelerator companies with it,
01:01:18.260 | because those could return the fund. And that's the fund
01:01:22.340 | math people stopped doing. I think all these fund managers who are
01:01:25.860 | getting wiped out never penciled out: what does this
01:01:29.620 | company I'm giving $1 million to need to hit in order for me to
01:01:32.980 | return my fund? And now they're finding out, and,
01:01:36.740 | let me say, you know, everybody's course
01:01:41.900 | correcting.
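A rough sketch of the fund math being described here. The $100k check, the $25 million round, and the 50% dilution rule of thumb come from the conversation; the $50 million fund size and the $2 million accelerator-stage valuation are hypothetical numbers filled in for illustration.

```python
# Sketch of the "return the fund" math described above.

def required_exit(check: float, round_valuation: float,
                  fund_size: float, dilution: float = 0.5) -> float:
    """Exit valuation at which this single check pays back the whole fund.

    check / round_valuation is the ownership bought; later rounds
    dilute it (the "diluted by 50%" rule of thumb above).
    """
    ownership_at_exit = (check / round_valuation) * (1 - dilution)
    return fund_size / ownership_at_exit

# $100k follow-on into a $25M round, out of a hypothetical $50M fund:
print(required_exit(100e3, 25e6, 50e6) / 1e9)  # -> 25.0, i.e. a $25B exit

# $500k into an accelerator company at a hypothetical $2M valuation:
print(required_exit(500e3, 2e6, 50e6) / 1e9)   # -> 0.4, i.e. a $400M exit
```

On these assumed numbers, the small follow-on check needs a $25 billion exit to return the fund, while the same dollars at accelerator-stage pricing need roughly a $400 million exit, which is the leak in the game being described.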
01:01:44.980 | I mean, basically the capital deployment's gone back to
01:01:49.100 | where it was in 2019, let's call it. So again, we had this
01:01:52.340 | bubble. The foam started building in 2020, but you had
01:01:54.740 | COVID; people didn't know what to think. So there was some
01:01:54.740 | restraint, I guess. And then 2021, it just went wild.
01:02:01.500 | That was nuts, man. Well, I mean, the question is whether the
01:02:05.180 | middle vintages, '20 and '21, are just going to be garbanzo beans.
01:02:07.580 | Well, you know, that's such an interesting point. If you could
01:02:09.660 | return capital, you're going to look like a hero.
01:02:09.660 | Also, Chamath, I remember, I don't know if it was Michael
01:02:18.700 | Moritz or Doug Leone, but I was talking to Sequoia about the time
01:02:22.060 | dispersion of your fund, like over what period of time are you
01:02:22.060 | deploying a fund? And man, people started deploying funds
01:02:25.620 | in 18 months, because they can raise the next fund so quick. So
01:02:28.740 | like, screw it, I'm going to deploy this fund in 18 months,
01:02:35.580 | 24 months. And LPs were saying to me, like, over what period are
01:02:35.580 | you going to deploy this? And I said, Well, you know, I was
01:02:40.580 | taught by Fred Wilson and others that 36 months, 48 months
01:02:44.900 | would be a good window to deploy capital, because, you know, it
01:02:45.900 | smooths things out.
01:02:45.900 | I think you're seeing the dirty little secret of the venture
01:02:49.380 | business, which is at some point, people get to a fork in
01:02:51.620 | the road. If they hyper-optimize for returns, and I'll put
01:02:56.180 | Benchmark, I'll put Fred Wilson and USV, I'll put Sequoia's
01:02:59.700 | early-stage fund in that bucket, they have to introduce time diversity, they
01:03:04.340 | keep the funds small, and they look to hit grand slams. But
01:03:09.940 | there are many other people, and I would say most of the set
01:03:12.860 | outside of that, who take the road more traveled, which is where
01:03:18.500 | you optimize for size, which then becomes a fee game. And so
01:03:21.980 | you optimize for velocity, get the funds out as quick as
01:03:24.700 | possible, raise a new fund. They have no intention of generating
01:03:27.500 | returns, because they have no ability to. When you have
01:03:30.660 | absolutely no time diversity in this business in a pool of
01:03:33.380 | capital, you're giving away one of your best edges. David just
01:03:36.620 | talked about it: as a smart practitioner, he was able to
01:03:39.380 | nurture these companies until all of a sudden they start to win.
01:03:41.860 | If you've all of a sudden flushed all your money in fund
01:03:44.500 | one, then you go to fund two, fund three. By the time something in
01:03:47.700 | fund one hits, what are you going to do? You're going to
01:03:49.940 | cross the funds, or you're going to justify taking money from the
01:03:53.820 | left hand to pay the right hand, or you're just going to let your
01:03:56.660 | ownership wane because you frittered all the money away.
01:03:59.580 | These are all the problems that most of these folks have
01:04:02.660 | encumbered themselves with; it's very difficult to get out of,
01:04:05.660 | and it's going to take a while. Look, in fairness to them, they probably,
01:04:08.900 | you know, got theirs while the getting was good. So they'll make
01:04:11.340 | a ton of money in fees, but they will not be able to raise funds.
01:04:15.140 | And those fees are not clawed back, folks, for those of you
01:04:17.420 | playing at home.
01:04:18.500 | Just by the way, I feel better about those late bloomers in my
01:04:22.180 | portfolio, because I know the marks are real, because if
01:04:24.660 | they're getting marked up now, then it's very, very solid,
01:04:28.260 | compared to frankly, some of those marks that we got in the
01:04:32.020 | bubble year, like 2021. I call them Tiger marks, whether it was
01:04:35.700 | Tiger or not. Those were less real, quite frankly, and a lot of
01:04:39.500 | those companies are retrenching and have issues. So a mark now
01:04:43.500 | just means something different than a mark then. But
01:04:46.300 | look, I want to, you know, just so we're not, like, totally
01:04:49.100 | beating up on VC: you remember that in this bubble
01:04:53.420 | period of September 2021, everybody thought that this
01:04:58.540 | party would just continue forever. And this is a good
01:05:01.460 | example from the Wall Street Journal, where I was talking
01:05:03.700 | about how university endowments were minting billions in a golden
01:05:06.460 | era of venture capital. So the bubble wasn't just in VC; it was in
01:05:10.820 | the public markets, too, because we had zero, right, like
01:05:13.500 | interest rates were zero, liquidity was just flowing. And
01:05:16.980 | so it was very easy for companies to get liquid. They'd
01:05:21.020 | IPO, and the valuations were stratospheric. So the
01:05:24.660 | distributions to LPs were massive in 2021. And then that
01:05:29.340 | led to, again, more funds being able to raise bigger funds.
01:05:32.940 | Everyone was just kind of paying it forward and thought the party
01:05:35.260 | would just keep going. So this is what happens in a bubble:
01:05:39.780 | everybody thinks that it's just gonna keep going. That is
01:05:43.220 | why it's so important as a fund manager or an entrepreneur for
01:05:47.460 | you to get great advice from people who've been at this for a
01:05:50.500 | long time and focus on the process. You cannot control all
01:05:54.460 | these outcomes; you cannot control all these meta events.
01:05:57.260 | What you can control is your relationship with your
01:06:00.300 | customers, building a team, making great bets, supporting
01:06:05.300 | late bloomers. The critical part of all this is the
01:06:08.220 | process, and you can make your process better. And so with my
01:06:11.620 | team internally, I'm constantly talking to them about our
01:06:14.820 | selection of companies, how we help companies get pulled
01:06:18.140 | through and get downstream funding. Literally, our
01:06:21.660 | big effort this year is how do we introduce our companies to
01:06:24.620 | the top VC firms. And we've been working on that as an internal
01:06:29.580 | project, right, of just getting our great breakout companies to
01:06:34.100 | the best investors to increase our pull-through. It is a
01:06:38.100 | process and you have to trust and focus on the process.
01:06:41.860 | Yeah. Well, ironically, I mean, just to end on sort of a
01:06:46.980 | positive note: if these interest rate cuts are real, like we
01:06:50.900 | just got 50, and if we get another 50 this year, if inflation is
01:06:54.180 | really tamed (it's never gonna go to zero, but if rates
01:06:57.860 | go down substantially), and we have this new AI disruption,
01:07:03.340 | this new AI tailwind, we could be back in another golden era.
01:07:07.700 | It's not going to be a bubble, but it could be another golden
01:07:10.220 | era. So we'll see.
01:07:11.700 | Start companies! From your lips to God's ears.
01:07:14.500 | Love you guys. I gotta go. Love you.
01:07:18.620 | All right, Chamath had to go do work. Apparently he's starting this
01:07:21.980 | new concept, Sacks, where Chamath actually goes to work at
01:07:25.900 | a company. We never got to talk about the debate because we were
01:07:30.300 | busy doing the summit and we took the week off from a new
01:07:32.500 | episode. People wanted to hear your take. What did you think of
01:07:36.260 | Kamala and Trump, in the one and only debate we're going to get,
01:07:39.940 | apparently? Any thoughts?
01:07:42.740 | I think that Kamala Harris performed better than expected.
01:07:48.020 | She did that, I think, mostly through having canned answers to
01:07:54.060 | topics. And she was able to kind of memorize those answers and
01:07:58.540 | say them and she was never knocked out of her preparation.
01:08:02.260 | She was well prepared. Yeah,
01:08:04.540 | I think she was well prepared. However, we now know that these
01:08:07.300 | were canned answers, because in subsequent press interviews, she
01:08:10.380 | gives the exact same thing. It's like a jukebox where you just
01:08:13.020 | push the button, right? It's the same answer. Exactly. So she's
01:08:15.980 | memorized a certain number of talking points. And that's
01:08:20.300 | all she's going to give you, no matter what the question is. And
01:08:23.300 | it's become a meme now: you saw that
01:08:27.020 | moment when she was asked about inflation, there's a pause
01:08:29.620 | while she's figuring out which greatest hit she's going to
01:08:31.740 | play. And then, you know, she, I guess, pushes B26 in her head,
01:08:36.660 | and then it begins: 'So, I was born in the middle class.'
01:08:40.500 | And it's working, apparently, right? It seems like it's
01:08:43.780 | helping her. Yeah.
01:08:45.060 | I think what you saw is that she got a bounce out of the debate.
01:08:48.020 | But now, sort of like a lot of these bounces, there's been a
01:08:52.820 | kind of effervescence to it. And then it kind of settles down
01:08:55.660 | back to the recurring pattern. And so I think the election is
01:09:00.140 | extremely close. Oh, yeah. I mean, every day, it's
01:09:03.180 | like a poll going one way or the other. And this is the closest
01:09:06.340 | of our lifetime, maybe. Or that I can remember. I mean, it's
01:09:09.820 | nuts how this thing has flipped over and over again. What did
01:09:12.540 | you think of Trump's performance? Were you
01:09:14.380 | disappointed? There were some rumors, people were a little
01:09:17.460 | upset that he doesn't prep as much as he should. What's
01:09:20.500 | your advice there? You know,
01:09:22.740 | well, look, I mean, I think that he was in a very difficult
01:09:26.420 | situation. You basically had a three on one situation where he
01:09:30.220 | was up against not just Kamala Harris, but the two debate
01:09:32.620 | moderators. It turns out that Lindsay Davis is Kamala sorority
01:09:36.580 | sister. David Muir was fact checking him constantly. And
01:09:43.180 | some of those fact checks weren't even correct. For
01:09:45.740 | example, we now know that the Springfield City manager has
01:09:50.020 | acknowledged complaints about pets being eaten.
01:09:52.620 | Oh, here we go. I thought we were gonna get through it.
01:09:55.900 | It goes as far back as March. There are videos of him
01:10:00.260 | talking about the complaints. Now, you can say that you
01:10:06.220 | don't believe those stories or whatever. But those reports were
01:10:09.460 | real. But David Muir fact checked in real time saying that
01:10:14.180 | Trump was wrong. And there was, like, this effort to kind of
01:10:18.500 | gaslight and make him sound crazy during the debate, when there
01:10:21.740 | are in fact sources for what he was saying.
01:10:23.460 | And it might have thrown him off a little bit, I noticed.
01:10:25.700 | I agree. Going into it in the future, I think they need to
01:10:30.860 | negotiate this. You know how they're negotiating the
01:10:34.660 | microphones on or off, audience on or off? I think they should
01:10:38.540 | negotiate: are we fact-checking in real time, or are we not fact-
01:10:42.060 | checking, and who's doing that?
01:10:43.100 | And they only fact check one candidate, for example, when
01:10:46.020 | Kamala Harris repeated numerous hoaxes, like the very fine
01:10:48.900 | people hoax, the bloodbath hoax, the suckers and losers hoax. I
01:10:53.700 | mean, these are things that were already addressed in the last
01:10:56.740 | debate. And, you know, even left wing sites like Snopes have
01:10:59.980 | said the whole 'very fine people' thing is a hoax.
01:11:02.420 | Yes, for people who don't know, there have been selective
01:11:05.460 | edits. And I mean, there have been selective edits forever. But
01:11:07.620 | that one is particularly egregious. It's really
01:11:09.620 | egregious. The bloodbath one is really egregious, too,
01:11:11.900 | because he was talking about an economic bloodbath. They just make it
01:11:15.780 | into a January 6 extension, which it's not.
01:11:19.260 | Right. So she was able to say these things and never got fact
01:11:22.620 | checked once, which meant she never got knocked out of
01:11:24.820 | her preparation.
01:11:25.700 | And let's also be honest, like Trump is hyperbolic. So if you
01:11:29.500 | are going to say, you know, oh, we're going to fact check Trump,
01:11:33.260 | like there's a lot of material there. He's just a
01:11:36.620 | hyperbolic guy. That's kind of his shtick, right? I mean,
01:11:39.300 | but here's the thing: in the wake of that debate,
01:11:42.340 | look, I think a lot of people scoring the debate on like
01:11:45.660 | technical debaters' points would award her the win for that
01:11:49.780 | night. Clearly, yeah. I don't deny that.
01:11:52.780 | However, what I think has been surprising is that in the wake
01:11:56.740 | of the debate, you're seeing her support sort of return more to
01:12:01.940 | its previous level. And so what I'm saying is the effect of
01:12:05.380 | that is wearing off. And I think one of the reasons why it's
01:12:07.580 | wearing off is because Trump still has the killer issues in
01:12:11.220 | this election. He's got the border, and he's got inflation
01:12:14.860 | and the economy. And Harris may have done well again on debaters
01:12:19.380 | points. But what substantive answer did she give in that
01:12:23.020 | debate, except to say, I'm not Joe Biden, which is, I guess,
01:12:27.420 | true. However, what you're basically saying is you won't
01:12:30.060 | defend your own administration's record. You are the incumbent,
01:12:33.340 | you're not the change candidate. And you're saying that people
01:12:36.740 | should vote for you because you're not Joe Biden. Well,
01:12:38.460 | what is it about Joe Biden's record, what is it about Joe
01:12:42.420 | Biden's policies, that you don't agree with? I mean, after all,
01:12:45.380 | you cast the tie-breaking vote for the Inflation Reduction Act;
01:12:50.020 | you cast it for the $2 trillion American Rescue Plan that set
01:12:52.900 | off the inflation. So the debate moderators never asked Harris,
01:12:56.900 | well, what is it about you that is different than Joe Biden on a
01:13:00.460 | policy level, other than the fact that
01:13:02.220 | Apparently she's pro-gun; I thought that was, like, a great
01:13:04.220 | moment for her. Objectively, and I've said this
01:13:08.540 | forever here on this show, putting our feelings aside about
01:13:11.780 | the candidates: I think whoever comes across as the most normal
01:13:15.340 | or the most moderate is going to win. And I think she's done a
01:13:19.060 | great job of, like, persuading, convincing those moderates that
01:13:23.860 | she's not crazy and he is. What are your thoughts on that?
01:13:27.140 | Because people have watched this very podcast, and they've said
01:13:29.620 | to me, my god, that's the Trump I want to vote for, that Trump 2.0,
01:13:33.340 | the All-In Trump. And then people are like, ah, he's going
01:13:36.180 | back to the insult-comic Trump, and I don't want the chaos. What
01:13:39.460 | are your thoughts on moderates, specifically in the swing states,
01:13:43.300 | and this sort of strategy? Let's talk about the
01:13:46.460 | Teamsters. So Biden, when he was still in the race, was plus eight
01:13:50.940 | among the Teamsters rank and file. And now that Harris is the
01:13:56.100 | candidate, Trump is up something like plus 26 with the Teamsters.
01:14:00.580 | Why is that? I mean, isn't she pro-union as well? He
01:14:03.740 | was Union Joe; it was like in the name, so I understand
01:14:07.000 | why they liked him. Is it something about her policies, or
01:14:11.220 | is it her personality? Look, within the Democratic Party,
01:14:15.660 | I think it's partly personality,
01:14:18.460 | but I also think it's policies and cultural issues. So
01:14:21.340 | within the Democratic Party, there's always been two tracks,
01:14:24.340 | there's the beer track, and there's the wine track. And so
01:14:28.260 | you know, Bill Clinton was a classic beer-track guy, right?
01:14:31.460 | Same with Obama, right? And I think Joe Biden was
01:14:35.260 | beer track. Then there's the wine track, which is
01:14:37.820 | the part of the party that cares about these
01:14:40.860 | boutique cultural issues, starting with DEI and equity and
01:14:45.980 | trans and things like that.
01:14:47.460 | Limousine liberals is what they used to be called. But I like
01:14:50.100 | yours: wine liberals, or the woke wine track.
01:14:52.860 | Basically, the entire California Democratic Party is very wine
01:14:56.940 | track. I mean, Gavin is very wine track. Kamala Harris is
01:14:59.620 | very much wine track, right? You can understand why it doesn't appeal to a blue-collar
01:15:02.900 | worker; they want more of that
01:15:06.180 | lunch-pail traditional Democrat. But that Democratic Party
01:15:10.140 | doesn't really exist anymore. I mean, the Democratic Party has
01:15:13.100 | evolved to be the party of the professional class, whereas the
01:15:16.300 | Republicans are more the party of the working class. And you're
01:15:20.140 | now starting to see it. I think Biden was the Democrats' last
01:15:23.420 | vestige of this working-class party. He really worked at being
01:15:26.900 | appealing to those voters, you know, the whole Scranton Joe
01:15:29.260 | image. Yeah, yeah, exactly. Whereas Kamala, when you get her
01:15:34.340 | talking in an unguarded moment, and it's not a canned answer,
01:15:38.100 | she's going to talk about diversity, equity and inclusion.
01:15:40.900 | And that's not what your typical Teamster wants to hear.
01:15:43.500 | Let me ask you a challenging question, since
01:15:45.500 | I like to challenge you a bit: if Trump loses,
01:15:48.340 | what do you think will be the cause of the loss? If he loses,
01:15:55.660 | like, strategically, when we look back on the last six months,
01:15:58.820 | what do you think he would change? What would cause it?
01:16:02.020 | Well, look, I mean, the great asset that Kamala Harris
01:16:06.260 | has is not her likability. It's not her track record. It's not
01:16:11.060 | her policies. It's the fact that she's got the media behind her.
01:16:14.700 | And if you look at, for example, ABC News, 100% of their
01:16:19.940 | coverage of Harris is positive, whereas something
01:16:23.460 | like 93% of their coverage of Trump is negative. And you saw
01:16:28.140 | this: before Harris replaced Biden as the nominee, she had
01:16:32.180 | very low favorability ratings, and then the media basically
01:16:36.060 | reinvented her as this transformative candidate. So
01:16:38.460 | look, when you've got the media willing to operate as de facto
01:16:42.660 | members of your campaign, that's tremendously powerful. If we had
01:16:46.380 | a fair media, this election wouldn't be close. So that is
01:16:50.420 | the advantage the Democrats had. Now look, should Trump have done
01:16:54.940 | the debate with ABC News? No, I think he should have chosen
01:16:58.100 | fairer moderators. I mean, to their credit, I think CNN played
01:17:01.820 | the Biden-Trump debate pretty fair and down the middle. But
01:17:05.500 | ABC, I mean, it was predictable. Like I said, one
01:17:09.540 | of the hosts was her sorority sister; they're friends. So you
01:17:13.300 | know, I think that if Trump loses, you could say that his
01:17:16.940 | willingness to walk into the lion's den, take on all comers
01:17:20.460 | do every interview, you could say maybe that wasn't as
01:17:23.340 | strategic as what she did. But at the end of the day, I think
01:17:25.860 | that voters will appreciate that both Trump and JD are willing to
01:17:30.900 | do basically every podcast, every interview, they're not
01:17:33.900 | afraid to answer questions. And when they do answer questions,
01:17:37.660 | you can see them thinking, and they don't give you the same
01:17:39.980 | canned answer they've given 10 times before, including at the
01:17:43.540 | debate. So yeah, I mean, that's my take. What's yours, J-Cal?
01:17:46.980 | On which aspect? Be more specific. Give me a
01:17:50.020 | specific question. Okay: if she ends up
01:17:53.740 | winning, what do you think the reason will be?
01:17:55.500 | Yeah, that's a good question. If she ends up winning, I
01:18:01.300 | think it
01:18:06.500 | will be that moderates in those swing states and women believe
01:18:11.700 | that it's too much chaos, and that Trump will be too much.
01:18:15.860 | They want calmer; the same reason Biden won, right? Like,
01:18:19.500 | that there's this like concept that the adults are in the room,
01:18:22.260 | and it will be calm, and it won't be chaotic. And I think
01:18:24.900 | people just still see Trump as a bit chaotic. And I think that's
01:18:29.020 | the big fear. And I think they've played the abortion card
01:18:32.580 | and the right to choose really well, even though Trump said
01:18:35.380 | here, I'm not going to sign an abortion ban, I'm pro-IVF. I
01:18:38.540 | think they have that really great win of saying, hey, you
01:18:41.820 | bragged about overturning Roe v. Wade; it probably wasn't smart to
01:18:44.940 | brag about that. And they have that clip that they can keep
01:18:47.820 | reinforcing. So if he does lose, and I don't know that he's going
01:18:50.820 | to lose, I think there's a lot of people who are going to go in
01:18:56.500 | there and vote for him. But not say it to pollsters and not say
01:19:01.340 | it to their family and friends, because they're embarrassed.
01:19:03.820 | Because of the pressure against orange Hitler, you know, this
01:19:07.580 | whole rhetoric that he's going to, you know, overturn
01:19:13.500 | democracy. So I think there's a pretty good chance that he's
01:19:16.380 | going to win, actually. I mean, look, it's a close
01:19:19.940 | race, right? Yeah, and the statistics in a close race favor
01:19:23.620 | him. Yeah, look, I mean, maybe we're asking the wrong question
01:19:27.260 | here, which is why would he lose? I mean, I think maybe the
01:19:28.980 | real question is, why is he favored to win? Because I think
01:19:31.140 | the polls, including Nate Silver's, still show him favored to
01:19:34.060 | win. And I think that when you look at what the big issues are
01:19:37.620 | in this campaign, and what has people agitated and upset, why
01:19:42.020 | something like 65% think the country's on the wrong track,
01:19:44.780 | it has to do with the economy, it has to do with inflation, it has
01:19:47.380 | to do with the border. I think that on the cultural issues, the trans
01:19:50.940 | stuff drives parents crazy, they don't want the government
01:19:53.060 | telling them what to do with their kids. So it's hard to
01:19:56.140 | think of a killer issue, other than maybe abortion, that
01:19:59.900 | Harris has on her side. It feels like all the issues cut Trump's
01:20:04.220 | way. But again, the thing that Trump doesn't have, and there's
01:20:06.900 | no way for him to fix this, is the media is just so in the tank
01:20:11.940 | for Harris. Now you raise a good point. Look, could Trump be
01:20:15.700 | more disciplined? Yeah, absolutely. However, you know, I
01:20:19.900 | think that what amplifies that is the fact that the media is
01:20:23.180 | quick to jump on every little thing he says and distort it,
01:20:26.460 | and he sets himself up for it, you know, like, part of what
01:20:29.180 | makes him activate the base is that erratic behavior, his
01:20:34.100 | shtick, you know, the comedy, and then I do believe that it
01:20:37.300 | gets weaponized by the press, because it's just so easy
01:20:40.460 | for them. I agree with you that Trump could be more disciplined.
01:20:43.100 | However, I don't think it's as bad as what you're saying.
01:20:46.420 | Because if it were, there'd be no need to make up these obvious
01:20:48.740 | hoaxes. There'd be no need to, you know, lie about the very
01:20:52.340 | fine people or what he said about a bloodbath. So if he
01:20:56.300 | was really saying that many outrageous things, why would you
01:20:59.020 | need to keep inventing things that he didn't say?
01:21:01.660 | And if you're just stacking them? Yeah, the answer to that
01:21:04.580 | question is they just throw everything they've got. Yeah,
01:21:06.660 | they throw everything at him. But look at Kamala's
01:21:09.340 | interviews. I mean, she hasn't given very many. But I mean, her
01:21:12.780 | answers are just... I mean, just watch them. I'm not going to
01:21:15.060 | characterize them; just watch her. Megyn Kelly actually said it: she
01:21:18.620 | thinks she's stupid and not bright. I mean,
01:21:21.620 | she's not the most dynamic speaker, that's for sure. And
01:21:26.260 | she doesn't seem to be able to have a dynamic debate with
01:21:31.700 | intelligent people who are experts in their field, let's
01:21:35.620 | say. You know, she can't hold her own in the way you can see
01:21:38.860 | JD can, right, and Trump can. So here we go. And just on
01:21:44.780 | the second assassination attempt, I don't know if you
01:21:46.740 | even want to go there. But I mean, gosh, I'm so glad he's okay.
01:21:51.540 | Yes, he got shot at again. This is scary stuff, folks.
01:21:54.780 | This rhetoric's got to come down. I keep saying it. Nobody
01:21:58.060 | wants to listen to me. But man, be...
01:22:01.660 | Well, let's look at the rhetoric that Ryan Routh was literally
01:22:05.060 | quoting on his Twitter, saying that Trump is basically
01:22:08.700 | an existential threat to democracy. He was quoting what
01:22:12.420 | Joe Biden and Kamala Harris and the mainstream media have been
01:22:15.300 | saying chapter and verse. So I think that, you know, if you
01:22:18.780 | want to ascribe motivation there, where did Routh get these
01:22:24.860 | ideas? They've been endlessly amplified by the mainstream
01:22:27.900 | media. And it's not like a one-off comment. It's been the
01:22:30.340 | central narrative for the last several years that somehow
01:22:33.300 | Trump represents this existential threat to democracy.
01:22:36.300 | And one way or another, that threat must be eliminated. And I
01:22:39.260 | think Ryan Routh simply took literally what the mainstream
01:22:43.300 | media has been saying. One percent of your followers: this is what I tell
01:22:46.460 | everybody, high-profile people you and I both know. One percent of the
01:22:51.020 | people in your following, and we all have large followings here,
01:22:54.980 | and there are certainly people who have extremely large
01:22:57.620 | followings, are mentally ill. And when I say mentally ill, I
01:23:00.900 | mean severely mentally ill. And even if it's not 1% of your following,
01:23:05.460 | if it's 0.1%, this could be thousands of people; 0.1% of a million
01:23:08.820 | followers is a thousand people. And this is what happened to John
01:23:12.020 | Lennon and other famous people who've been killed tragically: those mentally ill people interpret
01:23:15.900 | things in a very different way. And when you say, you know, a
01:23:20.300 | phrase that has triggers in it, threat to democracy, fight like
01:23:23.940 | hell, whatever it is, they interpret it differently. And so
01:23:27.220 | just please,
01:23:27.900 | when you call the guy Hitler for years, and again, you create
01:23:31.780 | millions or billions of impressions around that. And
01:23:35.500 | it's not like a one-off statement, but something
01:23:37.860 | that's drummed into the public over and over again, it seems to
01:23:41.300 | me you're asking for trouble, stay safe, please tone down the
01:23:44.900 | rhetoric, everybody. And we will see you next time on the
01:23:48.340 | All-In Podcast.
01:23:49.460 | Let your winners ride.
01:23:52.780 | Rain Man, David Sacks.
01:23:55.300 | We open sourced it to the fans and they've just gone crazy with
01:24:02.220 | it. Love you, besties.
01:24:03.420 | Queen of Quinoa.
01:24:04.060 | Besties are gone.
01:24:12.300 | That's my dog taking a notice in your driveway, Sacks.
01:24:16.260 | We should all just get a room and just have one big huge orgy
01:24:24.660 | because they're all just useless. It's like this
01:24:26.500 | sexual tension that they just need to release somehow.
01:24:29.940 | Wet your beak. We need to get merch.
01:24:37.300 | I'm going all in.
01:24:37.900 | Oh, man.