
E152: Real estate chaos, WeWork bankruptcy, Biden regulates AI, Ukraine's “Cronkite Moment” & more


Chapters

0:00 Bestie intros, All-In CEO talk, and Chamath's research product
10:49 Ukraine has its “Cronkite Moment”
16:56 Market outlook: are we in for a Q4 rally?
22:05 CRE chaos: SF's fire sales, understanding the second-order effects from CRE debtholders
41:38 WeWork will reportedly file for bankruptcy as soon as next week, Sacks makes the case for a turnaround
46:42 Biden's Executive Order on AI: end game, regulatory capture, confusion over clarity as a strategy
68:07 Silicon Valley's shift right


00:00:00.000 | Hey, everybody, welcome to another episode of the All-In
00:00:03.000 | pod. The notorious threesome is here doing the show. So today,
00:00:07.680 | threesome
00:00:08.640 | throuple. It's a throuple. It's a throuple. We have a throuple.
00:00:11.200 | I think it's more like a cuddle puddle.
00:00:13.680 | Okay, the dictator Chamath Palihapitiya, the Rain Man David Sacks,
00:00:18.640 | Sultan of Science, Dave Friedberg. Happy to be here
00:00:22.320 | with you guys today. We are absent J-Cal, taking the week
00:00:27.360 | off. Any other housekeeping?
00:00:28.800 | Let's do it. Let's jump into it.
00:00:30.440 | Well, you announced that we're gonna hire a CEO, right?
00:00:32.640 | Oh, let's talk about that. How's it going, Freeberg, in our CEO
00:00:36.680 | search.
00:00:37.280 | Okay, I got to admit, I've had a few conversations, but I got to
00:00:40.960 | get through the list. There are 240 applicants. So Wow, and
00:00:46.600 | there's some really, like great people in there. So we need to
00:00:50.280 | figure out how we're going to manage this. But I think we're
00:00:53.920 | pretty excited.
00:00:54.480 | I can't believe so many people want to work for us. That's
00:00:57.240 | insane.
00:00:57.880 | There's some really great folks. I think, Chamath, you posted
00:01:00.840 | it and people were pretty positive, right? Generally
00:01:02.920 | positive, no?
00:01:03.520 | Yeah, it was really positive. Yeah. I mean, I think that
00:01:06.640 | people were very excited about the fact that we were going to
00:01:11.000 | try to kind of like professionalize this a little bit
00:01:13.600 | more. So let's just talk about that. We did the All-In Summit
00:01:17.160 | in September, we thought it went really well, folks really
00:01:20.960 | enjoyed it. The survey data, the folks that attended was really
00:01:24.160 | positive. So we want to do that and do more of that live in
00:01:27.720 | person stuff. We all have full time jobs and other things to do
00:01:33.720 | so we need someone to come and run this as a business and
00:01:37.040 | organize it and carry it forward a little bit for us. And so
00:01:40.760 | we're really excited to hopefully get someone that can
00:01:43.880 | scale this. No ads ever. No subscription fees, nothing's
00:01:47.800 | going to change about the pod.
00:01:48.800 | So it's a media and events business and it might become a
00:01:52.440 | CPG business as well. Meaning, is it CPG, as in consumer packaged
00:01:57.040 | goods? That's right. Thanks.
00:01:58.680 | That is the correct use of the acronym, David. Congratulations.
00:02:04.400 | Your first product would be what tequila.
00:02:06.760 | I would do like a tequila or like a hard seltzer or something
00:02:11.560 | like that. That seems to be a big growth category. Or we could
00:02:15.040 | figure out something very orthogonal. I mean, Mr. Beast
00:02:17.440 | did candy bars. And we found out at the all in summit that he's
00:02:20.800 | doing something like 250 million a year in revenue growing very
00:02:23.960 | fast.
00:02:24.520 | Yeah, we could do Sacks nuts.
00:02:26.080 | The Sacks nuts would be too big to eat. You wouldn't be able to
00:02:30.080 | fit them in your mouth.
00:02:30.840 | They'd just be old and salty.
00:02:52.440 | Well, I think the other thing that's worth figuring out is
00:02:54.760 | this idea of like, what does the community really look
00:02:57.760 | like? I think that's the biggest. Yeah, that's the biggest
00:03:00.480 | upside for the CEO is like, is it physical places like Zero
00:03:05.480 | Bond? Or is it virtual, sort of like, community oriented things?
00:03:10.840 | Is it some combination of the two? I'm really interested in
00:03:14.640 | trying to figure that out. And what answer they come up with
00:03:17.400 | then the other thing that I would really like to experiment
00:03:19.720 | with quite honestly, is if we could do a little bit of a tour,
00:03:22.880 | I just think it would be super fun to kind of go to like five
00:03:26.760 | or 10 cities and just kind of like crash those places and have
00:03:31.320 | like a big get together and talk to people and do a Q&A. I think
00:03:35.440 | the thing that is really fun is just the interactive Q&A, just
00:03:38.280 | because it allows everybody to get involved and then to be able
00:03:40.880 | to do dinners with each other. And like, there's just a really
00:03:43.560 | amazing community. So that person should be able to figure
00:03:45.720 | this out. I don't know the answer to it. But I'm hoping
00:03:48.360 | they figure that out.
00:03:49.760 | We should give a shout out to the guy who submitted the
00:03:52.560 | interview video. Did you see that? It's like a four minute video.
00:03:57.080 | Sean submitted his application by video on X.
00:04:02.560 | I think I was on his podcast, Sean's. I was on his
00:04:06.240 | podcast like a year and a half ago, I think.
00:04:08.320 | can we pull it up?
00:04:09.160 | Welcome, welcome to my five minute job application to become
00:04:13.520 | CEO of the all in podcast. This is the American dream, baby.
00:04:18.840 | That's great. Well, he posted a tweet the day before saying if
00:04:21.600 | it gets 1000 likes, he'll submit an application instantly got 1000
00:04:24.480 | likes, and then he put that video together. So great job,
00:04:27.480 | Sean. I will be calling you later for your second round
00:04:31.080 | interview. I think you're well qualified after that submission.
00:04:35.560 | But I do remember like at the all in summit, I've never been
00:04:38.520 | to a conference where I'd say like 95% of the audience was
00:04:42.800 | actually in the theater for the entire content session. Like
00:04:47.960 | they really sat through it, it was really engaging. So I think
00:04:50.280 | that's the sort of product that I'd love to bring. Bring out
00:04:55.360 | more, which is some of these important conversations with
00:04:57.800 | people that can provide a perspective that folks might not
00:05:01.800 | be getting elsewhere. Chamath, you also mentioned on Twitter
00:05:05.240 | that you're doing more content. You called it "learn
00:05:09.000 | with me." What is that?
00:05:09.920 | Yeah, I mean, well, I've been doing this for a while, but I
00:05:12.240 | just wanted to wrap it up and kind of make it into a clean
00:05:16.280 | kind of productized service. So, you know, I have this newsletter
00:05:20.040 | that I've been using weekly to just kind of share everything
00:05:23.800 | that I read. But then I thought, Okay, why don't I actually like
00:05:28.240 | really invest just a little bit of time every week to curate
00:05:31.440 | that even a little bit more, and then write little summaries of
00:05:35.840 | probably the three most important news topics of the
00:05:38.360 | week. And I just kind of give an insight into my process. So when
00:05:42.280 | I read all of this stuff, usually two to three times a
00:05:45.280 | month, but the average is around twice a month, I end up writing
00:05:48.160 | an essay. And it's like one or two pages really quick. And I
00:05:51.040 | just keep them for myself. And it's like, right now I'm just I
00:05:53.760 | just finished one. What is the Federal Reserve? And the reason
00:05:56.920 | I did that was because Druckenmiller was on TV, and he
00:05:59.440 | was talking about all this stuff. And I just wanted to
00:06:02.120 | remind myself about why the Federal Reserve exists the way
00:06:06.160 | it does. But basically, what happens is from these quick
00:06:08.840 | essays, and from what I read, I typically start every month, a
00:06:13.080 | deep dive where my team and I, we probably allocate at least a
00:06:18.680 | month to a month and a half where we figure out something in
00:06:22.720 | really excruciating detail, and we generate a slide deck and a
00:06:26.800 | write up. And then we take that and we go and we ask a bunch of
00:06:29.720 | experts what they think. And then we only keep it within
00:06:32.200 | ourselves. But then I thought, you know what, why don't we just
00:06:33.960 | start sharing it. So the reason I'm doing this is really to keep
00:06:37.600 | myself accountable to keep learning. Because I feel like in
00:06:41.800 | the last few years, the thing that I lost most when things
00:06:46.400 | were going crazy was just the incentive to learn. And I
00:06:49.720 | thought, if there are other people around me that are also
00:06:53.160 | relying on this, I'll be more accountable. And obviously, to
00:06:56.840 | the extent that this thing can actually be self funding, then I
00:07:00.840 | can take some of that revenue reinvested in more research
00:07:03.720 | associates, maybe publish more content, but it's kind of like
00:07:06.800 | all the information I wish I had so that I could make better
00:07:09.160 | decisions. I'm just going to kind of take what I have share
00:07:11.960 | with everybody, and kind of go from there. So we'll see.
00:07:15.560 | So you're charging for the kind of higher end part of it?
00:07:18.600 | At the high end of it, it's kind of like, I would say most
00:07:21.520 | of it is free. The Twitter Spaces obviously is not, because
00:07:25.400 | the point of these X Spaces is we just don't want a
00:07:28.360 | bunch of bots and trolls. But there's a bunch of people that
00:07:32.680 | signed up to be a subscriber. So we'll do Q&As monthly for
00:07:35.280 | subscribers. And then once a month, I'll publish a deck. And
00:07:39.440 | like I said, if you want the full one, then you can be part
00:07:42.400 | of like a paying community. Otherwise, you'll get some
00:07:44.560 | portion of it for free.
00:07:45.520 | And then you'll have cash to reinvest in doing more content
00:07:48.840 | like right now I have a couple of research assistants that
00:07:51.440 | really helped me and we have a pool of experts that we use but
00:07:54.520 | if this thing picks up some steam and we can have enough
00:07:57.640 | revenue to hire three or four more RAs maybe at some point
00:08:00.880 | instead of monthly we can go biweekly. I don't know. But the
00:08:04.800 | idea is just to focus on sort of what I think is interesting. So
00:08:09.680 | that I can have a general view of things so that I can make
00:08:12.840 | better decisions. And like I said, to the extent people are
00:08:15.400 | interested now they can follow along at any level
00:08:17.720 | there you have it so people can go to your Twitter to see it
00:08:19.440 | right? Yeah. Okay.
00:08:20.800 | By the way, the other thing is it's kind of cool to be in this
00:08:23.080 | like content creator economy. You know, that's the other thing
00:08:25.800 | is like, I don't know about you guys, but I feel like I'm on the
00:08:30.080 | sidelines looking in at trends now whereas before I was in the
00:08:33.000 | middle of them like web 2.0 social, I was right in the heart
00:08:37.000 | of it. And I was like, Okay, this is amazing. And then as an
00:08:40.480 | investor in SaaS and deep tech, I was kind of in the middle of
00:08:44.720 | those as well. That felt great. But this last push in AI, I'm
00:08:48.840 | like, wow, I'm really on the sidelines looking in or this
00:08:53.000 | content creator stuff. So I think it's, it's a good forcing
00:08:56.520 | function for me to kind of learn as well
00:08:59.000 | to be active in the content creator economy basically to
00:09:02.040 | understand it.
00:09:02.760 | Yeah, I don't know how else I'll ever learn how to because
00:09:05.240 | like, I could read all the stuff I want.
00:09:07.040 | Just so you know, you do a podcast that a lot of people
00:09:08.840 | listen to.
00:09:09.360 | Yeah, but I think that that's like a, it's the lowest order
00:09:13.880 | bit, quite honestly. You know, when you see like what Jimmy
00:09:17.200 | Donaldson does, or even like Kim Kardashian, what they do as
00:09:22.760 | content creators, it's just an intense process that then
00:09:26.560 | creates all this upside. So even if what I want to have a more
00:09:29.600 | informed view of where all in needs to go, I want to be a
00:09:34.440 | little bit more on the ground and actually in the engine room
00:09:37.000 | doing it for myself.
00:09:37.920 | Yeah, since we started doing the pod, Sacks has been creating a
00:09:40.520 | ton of content too. I mean, Sacks, you wrote a piece this week,
00:09:43.360 | didn't you? On Ukraine, or like on the Time article on
00:09:48.280 | Ukraine, didn't you? Yeah, yeah. Like you weren't doing a lot of
00:09:51.640 | content before we started doing the pod, right?
00:09:53.680 | No, he was writing a lot of blog posts on SaaS.
00:09:57.000 | Those are excellent. He had like an occasional blog post, but
00:09:59.560 | he's like super active now, particularly on Twitter now.
00:10:02.600 | Like, has that changed a lot?
00:10:04.520 | Well, what basically happened is I would express political views
00:10:09.360 | on the pod, you know, on a weekly basis. Yeah, I wasn't
00:10:12.880 | really tweeting much about politics, but then it all kind
00:10:15.720 | of bled together. I mean, once you start talking politics on a
00:10:19.400 | pod, people were clipping it, they were putting it on Twitter,
00:10:23.080 | and then I have to respond to it, maybe correct
00:10:27.040 | misinterpretations or respond to attacks or whatever. So all the
00:10:31.160 | politics and business stuff just kind of bled together on Twitter,
00:10:33.520 | it'd be nice if I just had two accounts and people could just
00:10:35.640 | follow me for what they wanted to follow. My sub stack is just
00:10:38.520 | business. And then I'll occasionally write an op ed on
00:10:43.040 | the political stuff. So I've written a few on free speech,
00:10:46.520 | and then I've written a few on Ukraine. And so I wrote a piece
00:10:51.120 | for Responsible Statecraft this week on Ukraine, which was based
00:10:54.200 | on a Time magazine story that came out and then I did a
00:10:57.120 | version of it on X, kind of a long post, which I called the
00:11:03.360 | Ukraine war's Cronkite moment. The Cronkite
00:11:07.800 | moment in the Vietnam War was when Walter Cronkite, who was
00:11:11.120 | the top newscaster, America's most trusted man in the 1960s.
00:11:15.880 | He went on a fact finding mission for himself to Vietnam,
00:11:20.000 | and he came back and he said the war is basically unwinnable. And
00:11:23.360 | that's when the public sentiment really turned against the
00:11:25.640 | Vietnam War, it still took us five years to get out of it. But
00:11:30.440 | what Cronkite said, there's a great quote from Cronkite, which
00:11:33.680 | I've included at the end, where Cronkite says it is increasingly
00:11:37.200 | clear to this reporter that the only rational way out will be to
00:11:39.880 | negotiate, not as victors, but as honorable people who lived up
00:11:43.320 | to their pledge to defend democracy, and did the best they
00:11:45.960 | could. So that's basically what happened in 1968. What happened
00:11:49.840 | this week is that Time magazine wrote a cover story on
00:11:52.720 | Zelensky, where the main sources for the story were all of
00:11:56.040 | Zelensky's aides and advisors. And what they told the Time
00:12:00.040 | reporter was effectively this war is unwinnable. And moreover,
00:12:04.520 | Zelensky is delusional. That was a word they used: delusional in
00:12:09.880 | not confronting the battlefield realities that the Ukrainian
00:12:13.720 | army had been decimated, and delusional in the sense that he
00:12:17.760 | refuses to engage in peace negotiations with the Russians.
00:12:21.540 | He's utterly unmovable on this idea that Ukraine's gonna
00:12:24.900 | somehow prevail. And the troops in the field are basically
00:12:29.520 | saying that they cannot follow the orders they're being given.
00:12:31.880 | They literally can't advance as they're being ordered to
00:12:34.960 | advance. And the officers in the field, the frontline commanders
00:12:38.520 | are rejecting the orders, because they've been decimated
00:12:41.640 | so badly, and they consider the order to be impossible. So I
00:12:44.640 | consider this to be the Cronkite moment of the war, when
00:12:47.120 | Zelensky's own inner circle is saying that the war is lost. And
00:12:51.980 | the only question is, when are we going to realize that? And
00:12:55.100 | start negotiating? Are we going to continue until the very end
00:13:00.620 | here with Zelensky in this psychological bunker that he's
00:13:06.180 | created? Or are we going to recognize reality the way that
00:13:10.280 | Cronkite urged the nation to in 1968?
00:13:13.340 | Do you think these aides to Zelensky proactively reached out
00:13:18.440 | to the journalists to try and get this story published? How
00:13:21.840 | does a story like this come about that provides this kind of
00:13:25.200 | revealing insight on the supposed inner chamber of
00:13:28.880 | Zelensky's operations that can shed so much light on the
00:13:32.920 | challenges with him and with the operation?
00:13:36.480 | I think it's a really interesting question. And I
00:13:38.240 | think the starting point for understanding that is that this
00:13:40.720 | writer, Simon Shuster of Time magazine, wrote the in-depth
00:13:44.540 | profile of Zelensky for Time's Person of the Year at the end of
00:13:48.620 | 2022. So this reporter cannot be accused of unfavorable
00:13:54.300 | coverage.
00:13:54.860 | This is the person that profiled Zelensky as Person of the Year.
00:13:57.580 | Yes, yes, exactly. Wow. So through that he probably had
00:14:02.320 | contacts in Zelensky's inner circle from doing his
00:14:05.440 | preparatory work for that article.
00:14:07.700 | Absolutely. Yeah, he's maintained those relationships
00:14:10.600 | and has a back channel now to be able to get this sort of
00:14:13.400 | insight.
00:14:13.820 | Well, he was physically there. I mean, I think he was physically
00:14:16.940 | there to write the Person of the Year profile and he was
00:14:18.880 | physically,
00:14:19.320 | this isn't some like weirdly planted thing. This is like a legit
00:14:22.340 | source.
00:14:22.880 | Well, that's my whole point is that this reporter is an entire
00:14:27.280 | magazine, obviously is mainstream media. And this
00:14:29.600 | writer has given favorable coverage to Zelensky in the
00:14:32.280 | past. And I believe that's what led to him having this kind of
00:14:36.240 | access to Zelensky's inner circle. Okay, you have to take
00:14:39.060 | all that into account.
00:14:40.220 | The key question, Sacks, is does this change anything? So does
00:14:42.980 | this article change the sentiment of political leaders
00:14:48.300 | in the US? Or is nothing really different that we still need to
00:14:51.900 | defeat Putin continue to feed the war machine continue to
00:14:54.980 | support the effort against Putin? Or does this start to beg
00:14:59.280 | the question? Hey, maybe we need to look at this tactically and
00:15:01.980 | make a change? Is anything actually going to change, or is there
00:15:04.500 | reason to think that?
00:15:05.760 | in a certain sense, this article isn't saying anything that's
00:15:09.780 | new. I mean, the article basically says the same things
00:15:12.600 | that I've been saying, the same kinds of things
00:15:15.120 | that Professor Mearsheimer has been saying Professor Jeffrey
00:15:17.320 | Sachs, that all the critics of this war have been saying these
00:15:20.720 | things for a number of months, which is the Russians are
00:15:23.160 | winning. There is no feasible way for the Ukrainians to evict
00:15:27.240 | the Russians from their territory. We are out of
00:15:29.880 | artillery and other types of ammunition to give them
00:15:33.060 | effectively, the war is unwinnable, we should negotiate.
00:15:35.480 | So a lot of people have been saying that for a long time. But
00:15:37.740 | what's different now is that if you believe this publication and
00:15:41.820 | this writer, which I do because of his access, it's
00:15:44.720 | Zelensky's own inner circle that are now saying those same things. So
00:15:49.080 | in other words, the so called Putin talking points that we're
00:15:51.600 | always accused of are coming from inside the house. Yeah. And
00:15:55.920 | to me, that's what makes this a Cronkite moment. Now, will
00:15:58.560 | policymakers in Washington change their tune or change
00:16:01.680 | their perspective on this? Probably not. The uniparty of
00:16:05.080 | both Republicans and Democrats still seems to be a majority who
00:16:08.360 | want to fund another 61 billion for Ukraine. So is this
00:16:12.640 | penetrating the blob? I don't think so. I mean, but again,
00:16:15.720 | think about Walter Cronkite in 1968, the country basically came
00:16:19.220 | around to his point of view after that, but it still took
00:16:21.840 | five years to get out of the Vietnam War. So I think that
00:16:27.540 | this week was a watershed in terms of the way that public
00:16:31.480 | perception is going to evolve over the next few months. But it
00:16:34.720 | seems like the policymakers in Washington are the last ones to
00:16:38.000 | get the memo.
00:16:38.640 | Well, let's see what happens. I mean, I've shared my point of
00:16:41.280 | view for a couple years now. I think the US is in a, call it,
00:16:44.680 | subconscious conflict escalatory state that would
00:16:50.800 | continue to drive us into these sorts of conflicts. It's great for markets.
00:16:54.600 | Conflict seems to be great for markets. Conflict is great for
00:16:58.720 | markets, you're saying? No, no, no, no. The fact that it seems
00:17:02.760 | like there's a terminal outcome, and all we're debating is
00:17:07.880 | whether Zelensky sees it, is actually very reassuring for the
00:17:12.920 | market. Oh, you're saying markets are up because the markets are
00:17:15.680 | recognizing that there may be an end to the Ukraine conflict?
00:17:18.360 | Well, I do think I do. I do think in part, I don't think it
00:17:20.800 | explains why it's up necessarily today. I think that's more
00:17:23.220 | because I think I mentioned this before, but the end of the
00:17:27.400 | fiscal year for mutual funds was October 31. And so there was a
00:17:30.680 | lot of selling going into October 31, just to capitalize
00:17:34.040 | tax losses and tax loss harvest. But that leaves them with a lot
00:17:38.880 | of cash. And so starting November 1, which is day one of
00:17:42.560 | the new fiscal year, mutual funds are not allowed to really
00:17:45.480 | own cash, that's not why you pay them. And so they're going to be
00:17:49.120 | active buyers. So that's why in the short term, the market is
00:17:51.320 | up. But if you think about the overhang of risk that we've had,
00:17:55.760 | right, you had one forever war. And now the beginning of what
00:18:00.120 | potentially could be another forever war, that's definitely a
00:18:03.440 | dampener on market demand, right, and sentiment
00:18:07.240 | and investor confidence. But if a story like this is true, it
00:18:11.520 | actually takes one of those big risks off the table, which is
00:18:15.320 | this idea that now, it's like every other kind of a thing,
00:18:18.840 | which is the reason we spend money there is not to win
00:18:22.120 | something that's winnable. It's just graft. It's just typical
00:18:25.600 | corruption. And the article quotes Ukrainian
00:18:29.280 | officials saying that people are stealing like there's no
00:18:32.280 | tomorrow. Right? Right. So that's something the US, unfortunately,
00:18:37.400 | enables in many places, that's not new. And so to the extent
00:18:41.840 | that that's now more of the status quo, that's reassuring
00:18:44.480 | for investors, because we're not talking about a war, we're just
00:18:47.400 | talking about bleeding more money, which I'm not saying is
00:18:50.520 | right, but it's morally more acceptable than the alternative.
00:18:55.240 | So there you have it. It's generally good for markets. Now,
00:18:58.120 | that leaves, I think, Israel-Gaza as a risk. And I think
00:19:05.680 | people and I think the market still view that as a potential
00:19:09.240 | war. And the longer that goes on. I think that there's a very
00:19:16.440 | good chance that we de risk that as well as again, not a war, but
00:19:20.280 | part of that cycle between Israel and Palestine, which is
00:19:24.880 | conflict, timeout, conflict, timeout, conflict, timeout. And
00:19:28.600 | so if what we think is now this is just a version of conflict,
00:19:31.480 | timeout, and the market de risks that then it's actually pretty
00:19:36.320 | positive for equities for startups. Because now the Fed
00:19:42.560 | has a reason to actually say, Okay, the economy is cooled off,
00:19:46.520 | inflation is calm. It looks like the markets are stable. Let's
00:19:51.160 | cut rates, right? Let's, let's reintroduce some demand into the
00:19:55.200 | market. It's generally a good thing. And if there's no reason
00:20:00.280 | to be risk off, people will use that as a reason to be risk on.
00:20:03.880 | Yeah, well, Treasury yields, the 30 year cracked, yeah, it's
00:20:10.000 | down. Correct. 40 basis points in two days. Yeah. So you think this is
00:20:16.480 | because the forecast or the outlook for Q4 or Q1 next year,
00:20:21.280 | it's starting to go down quite a bit, that people are starting to
00:20:24.760 | see more recessionary indicators. So you think that's
00:20:27.520 | like peak 4.9% growth rate, and then inflation seems to be tamed
00:20:31.720 | quite a bit too.
00:20:32.520 | I think to be honest, the look through on Q1 looks also pretty
00:20:35.960 | good. And Q4 looks healthy as well. So for example, Shopify,
00:20:39.080 | their read through on GDP was actually pretty healthy. Harley
00:20:44.560 | was on I think it was CNBC I saw this morning basically saying he
00:20:48.480 | thinks consumer spending is still strong, and it's still
00:20:50.600 | there. So I think in terms of just like economic strength, I
00:20:55.200 | think it's pretty decent, actually, I think what changed
00:20:58.320 | for the market was the Fed's rhetoric. And I think that they
00:21:03.000 | were holding on to this option that they were going to show up
00:21:06.240 | out of nowhere with another 25 or 50 basis point increase. And I
00:21:09.880 | think that that's fundamentally now off the table. And I think
00:21:12.760 | that that's really heartening for the market. That's good for
00:21:15.760 | markets. Stocks are ripping. Affirm up 20% today,
00:21:19.760 | ripping, ripping, ripping. Opendoor, ripping. DoorDash,
00:21:23.480 | ripping. All these companies are up 20 and 25 and 30%. It's
00:21:27.320 | crazy. Yeah. So are we calling bottom, Sacks?
00:21:30.680 | Well, maybe I mean, I don't know, it's so hard to predict
00:21:35.400 | the markets. But if you believe that there's more upside to
00:21:40.880 | rates than downside, meaning that the odds of a rate
00:21:44.440 | decrease are much greater than the odds of a rate increase from
00:21:47.560 | here, then there is upside to valuations, particularly for
00:21:50.840 | growth stocks, also for distressed real estate, because
00:21:53.960 | all these things get more valuable when rates are lower.
00:21:56.680 | So if you believe that we're going to be in a cycle of
00:22:00.320 | rate decreases, and that whole thing has played its way out,
00:22:03.560 | then everything's going to rally.
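A minimal sketch of the rate math here, in Python, assuming a simple flat-rate discounted cash flow model (the two hypothetical businesses and all numbers are illustrative, not from the episode). It shows why long-duration assets like growth stocks gain disproportionately when the discount rate falls:

def present_value(cash_flows, rate):
    # Discount a list of annual cash flows (years 1..n) at a flat rate.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical profiles: a mature business with flat cash flows vs. a
# growth business whose cash flows arrive mostly in later years.
mature = [100] * 10
growth = [10 * 1.35 ** t for t in range(10)]  # 35% annual growth

for rate in (0.05, 0.03):  # e.g. rates falling from 5% to 3%
    print(f"rate={rate:.0%}  mature PV={present_value(mature, rate):7.1f}  "
          f"growth PV={present_value(growth, rate):7.1f}")

The back-loaded growth stream's present value rises by a larger percentage when the rate drops, which is the "upside to valuations, particularly for growth stocks" point; the same duration logic applies to distressed real estate.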
00:22:04.760 | Well, let's talk about real estate, because I think that's
00:22:08.280 | one of the biggest overhangs right now that hasn't fully been
00:22:11.720 | accounted for. There's, you know, $3 trillion of commercial
00:22:17.960 | real estate loans sitting out on commercial banks balance sheets
00:22:22.520 | in the US today, 3 trillion, it's an insane number. And we
00:22:26.760 | were all talking over text the other day, as you guys know,
00:22:29.560 | about this building that's being shopped and got some attention
00:22:32.440 | on Twitter. I think it's 115 Sansome, you guys know this
00:22:36.160 | building. I've lived in San Francisco for, how long have
00:22:39.440 | I lived there 20 some odd years now. So I know this building.
00:22:43.600 | And it's 125,000 square foot building that's going to likely
00:22:48.040 | be fully vacant, in call it two to three years. So the guy puts
00:22:52.080 | it up for sale, the equity owner, he puts the building up
00:22:54.960 | for sale. And he's basically like running an auction on it.
00:22:58.600 | And the brokers are all saying, Hey, it's currently at 199 bucks
00:23:03.000 | a square foot, which translates to $25 million, it'll probably
00:23:06.600 | get done at 300 bucks a square foot, which is about $38 million
00:23:09.760 | for the building. The problem is, if you look at the debt on
00:23:13.600 | the building, there's 53 million of debt, B of A owns that note.
00:23:17.600 | So Bank of America in 2016, issued a $54 million note on
00:23:22.000 | this building. If this deal closes at $39 million, the
00:23:27.280 | equity owner who's running this auction to sell the building
00:23:29.880 | makes $0. All that happens is that money pays down the debt,
00:23:33.560 | and the debt only gets 70 cents on the dollar, and B of A has to
00:23:37.200 | take a write down on the rest. So there's a rumor that in this
00:23:41.720 | particular case, what's happening is the equity owner of
00:23:44.440 | the building is putting it up for auction to show the bank,
00:23:48.520 | hey, this thing's only worth 70 cents on the dollar for you on
00:23:52.080 | your debt, you should restructure my debt. So
00:23:54.800 | apparently, there have been a lot of building owners that have
00:23:57.080 | been going to their banks and saying, Hey, I want to
00:23:58.920 | restructure my note, I want to have longer payment terms, I
00:24:01.920 | want to keep the rates lower for longer, I want to have some of
00:24:04.600 | the debt forgiven anything they can do to get the cost of the
00:24:07.840 | debt down. Because the overhang of the debt on all these
00:24:10.320 | buildings is so significant, these buildings cannot make
00:24:13.080 | money and the owners will never make money. So they're like, why
00:24:15.120 | am I even wasting my time. So they have one option, which is
00:24:18.040 | to hand the keys back to the bank and say, here you go, you
00:24:20.040 | auction it and sell it. Good luck. And the other option is to
00:24:22.680 | go to the bank and say, hey, restructure my debt, let's
00:24:25.480 | figure out a deal. Here's the problem. The $3 trillion of debt
00:24:29.920 | that we just mentioned, that's sitting on all the banks'
00:24:31.440 | balance sheets is all being held at par, they're not discounting
00:24:34.960 | it at all. And they're not marking it as being impaired in
00:24:36.760 | any way. So as soon as they have to start writing this stuff
00:24:39.480 | down, the bank's balance sheets all get impaired. And so there's
00:24:43.160 | a real risk in the market that I don't think has been fully
00:24:45.280 | accounted for that we're starting to see the cracks. And
00:24:47.760 | to Chamath's point, this might accelerate either Fed action, or
00:24:52.640 | as I've mentioned in the past, I think the federal government
00:24:55.840 | steps in and starts to issue programs to support commercial
00:24:58.720 | real estate developers and owners. Sacks, I know you've
00:25:00.920 | been following this quite a bit, you know, maybe you could share
00:25:03.380 | your point of view. And if I'm off on this, in terms of how
00:25:05.880 | significant of a problem this is, you know, how much of the
00:25:08.860 | 3 trillion of debt really is impaired? And by how much should
00:25:11.520 | it be impaired, do we think?
00:25:12.520 | Massively. I think it's a huge problem. If you talk to
00:25:16.760 | anybody, any of these commercial real estate developers or
00:25:20.440 | sponsors, they'll tell you many of them are just kind of hanging
00:25:22.920 | on by their fingernails. I mean, it's really ugly. And you know
00:25:27.160 | how distressed things are when you actually look at some of
00:25:29.360 | these sales, that same account on Twitter that you posted,
00:25:32.360 | he's got like a tweet storm here showing about half a dozen of
00:25:35.840 | these sales that have happened in the last few months. And
00:25:39.800 | buildings are now selling for two to $300 a square foot in San
00:25:44.000 | Francisco. These are buildings that could have sold for 1000
00:25:47.560 | bucks a foot, you know, a couple of years ago, and now
00:25:50.720 | they're selling for two to 300. The replacement cost if you're
00:25:53.720 | trying to recreate these buildings from scratch, given
00:25:56.080 | how expensive it is to build in San Francisco, and how
00:25:58.840 | complicated the entitlements processes, probably be 1200 plus
00:26:03.200 | a foot. So these buildings are selling for 10 to 20% of their
00:26:08.400 | replacement cost. These are fire sales.
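Quick arithmetic in Python on the fire-sale claim (my calculation on the figures just quoted; on these numbers the ratio actually lands between 15% and 25%):

replacement_cost = 1200        # $/sqft, the rough rebuild estimate quoted
for sale_price in (200, 300):  # $/sqft range quoted for recent SF sales
    ratio = sale_price / replacement_cost
    print(f"${sale_price}/sqft sale = {ratio:.0%} of replacement cost")

This prints 17% and 25%, so "10 to 20% of replacement cost" is roughly right only at the low end of the quoted sale prices.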
00:26:11.760 | If you look at the loans in the San Francisco commercial
00:26:14.960 | real estate market, do you think that the loan value is probably
00:26:17.320 | impaired by 40%? Because that's roughly what it would say on
00:26:20.040 | that 115 Sansome building: B of A would probably have to take a
00:26:23.200 | 30 to 40% markdown on their debt. Do you think that's the right
00:26:26.160 | way to think about the numbers, 30 to 40%?
00:26:27.520 | You could figure it out this way. A typical commercial real
00:26:30.360 | estate deal is about one third equity and two thirds debt. So if
00:26:33.320 | somebody bought a building at, call it, $1,000 a foot, they
00:26:37.680 | would have had to have put in, call it, $333 of equity and $666 of
00:26:42.440 | debt. Yeah, now if that building is only worth, I don't know,
00:26:46.560 | $250 a foot, there's $750 of loss per foot, right? And
00:26:53.040 | only $333 of that, roughly half of it, is equity. So the debt is
00:26:58.680 | taking a huge haircut there, too. Yeah, that's about 40, 50%.
00:27:02.040 | Right? Yeah. And no one wants to recognize that loss. The equity
00:27:05.240 | holders certainly don't because they get wiped out. And even the
00:27:07.760 | banks are reluctant to recognize that loss because who knows what
00:27:12.120 | that could trigger when their balance sheets are so impaired.
00:27:14.840 | So what everyone's trying to do is avoid recognizing it.
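A minimal sketch in Python of the loss waterfall the two of them just walked through (my arithmetic; the 115 Sansome equity figure is a placeholder, since once the equity is wiped out the note's recovery depends only on the sale price):

def waterfall(value_now, equity, debt):
    # Equity absorbs losses first; whatever loss remains hits the note.
    loss = (equity + debt) - value_now
    equity_loss = min(loss, equity)
    debt_loss = max(loss - equity, 0.0)
    recovery = (debt - debt_loss) / debt  # cents on the dollar for the bank
    return equity_loss, debt_loss, recovery

# 115 Sansome as discussed: ~$38M sale against a ~$53M note.
_, writedown, rec = waterfall(value_now=38e6, equity=25e6, debt=53e6)
print(f"115 Sansome: note recovers ~{rec:.0%}, bank writes off ${writedown / 1e6:.0f}M")

# Sacks's stylized deal: $1,000/ft purchase, one-third equity, now worth ~$250/ft.
eq_loss, debt_loss, rec = waterfall(value_now=250, equity=333, debt=667)
print(f"Per foot: equity loses ${eq_loss:.0f} (wiped out), "
      f"debt loses ${debt_loss:.0f}, note recovers ~{rec:.0%}")

On these inputs the 115 Sansome note recovers about 72 cents on the dollar, matching the "70 cents" above, while the stylized deal's note recovers only about 37 cents, a haircut even deeper than the 40-50% Sacks ballparks.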
00:27:18.880 | Where are the people going to come from, Sacks? This is my big problem with
00:27:21.920 | this argument. I think that numerically, you're right that
00:27:25.680 | there is this bottoming or there is just like a lot of value to
00:27:30.280 | be had, right? The problem is, and I'll get and I'll give you
00:27:33.000 | an example of a company that I own. So it's not even a company
00:27:36.840 | that I'm just a minority investor in. We used to be in
00:27:40.720 | San Francisco. And we went virtual. Now, all the salaries
00:27:45.680 | are pegged to San Francisco salaries. But now everybody's
00:27:48.240 | sort of like everywhere throughout the country, we don't
00:27:50.720 | have an office space. And I'm really struggling trying to
00:27:54.280 | figure out how to actually bring everybody together. And my
00:27:57.480 | choices were I thought, Okay, well, let me find what it would
00:28:00.680 | cost to actually run this place in San Francisco. And my, my
00:28:05.360 | opex would have gone up 15, 20% just for the building. So then
00:28:09.360 | I'm like, Okay, well, where else can I go? And it's like, well, I
00:28:12.520 | could go to a place like Las Vegas, and I could put the
00:28:15.720 | people there. It's an hour, so it's close by, we can fly there
00:28:20.600 | whenever we need to. It's got no tax. So maybe that's an
00:28:24.880 | advantage for the employees. And so I'm in this constant hamster
00:28:29.200 | wheel as an owner of a business. I can't get people into the
00:28:32.280 | office. If I do bring them into the office, it's super expensive
00:28:35.400 | and it bloats my opex. So I'm just trying to figure out how
00:28:39.280 | does that get resolved for people like me so that we want
00:28:41.560 | to go back to San Francisco and rent those buildings that are
00:28:44.560 | so cheap, those buildings are gonna have to get written down
00:28:46.920 | for. So I set up my company in 2006. I was paying $22 a square
00:28:51.760 | foot for my first office in '06. And then we ended up
00:28:55.000 | getting a really nice office. And I think we ended up paying
00:28:56.960 | like 35 to 40 bucks a square foot by the time we moved to a
00:29:00.280 | super nice office around 2010. But around that time, which was
00:29:05.800 | also when the GFC happened, the global financial crisis, in '08
00:29:08.480 | to '09. And that's when rates dropped to zero. And that's also
00:29:12.400 | when all startups and tech companies said, Hey, let's start
00:29:15.080 | setting up in San Francisco, because it's mostly young people
00:29:18.440 | that are engineers now. They mostly want to live in the city.
00:29:21.040 | They mostly want to be single and have a good life here
00:29:23.080 | rather than go live in suburban Silicon Valley. And so real
00:29:26.360 | estate shot up because rates were zero. All the tech
00:29:29.320 | companies are setting up. I mean, Sacks, you had Yammer in San
00:29:32.320 | Francisco, didn't you? Or were you in the peninsula? Yeah. And that
00:29:35.200 | was like the place to have your company was in the city. And so
00:29:38.240 | I would argue like if you go back to the '08, '09 era, that's
00:29:41.440 | really when San Francisco started to take off. And it was
00:29:44.680 | almost like a bit of an outlier of a bubble that started all the
00:29:48.280 | way back then. And then, you know, the ZIRP environment kept
00:29:50.960 | things moving up and up and up over the years that followed. I
00:29:53.200 | do think that this like this market will likely normalize
00:29:57.160 | back to pre-2010, pre-'09. And that's when rents were like
00:30:02.240 | in the 25 to 35 bucks a square foot range, you know.
00:30:06.640 | I will move back to that building and I'll move
00:30:09.440 | my company back there when it gets repriced. Yes, I do think
00:30:12.400 | so. Are you sure? I don't see how the market
00:30:15.680 | stays where it is. Yeah.
00:30:16.640 | Chamath's argument is basically on the demand side, basically
00:30:19.520 | saying, Where's all this demand going to come from?
00:30:21.200 | I think you could give it to me for zero. And I wouldn't bring
00:30:23.480 | my team back there. I'm just asking like, again, the math all
00:30:26.280 | makes sense on a spreadsheet. But the big issue is, as an
00:30:29.760 | owner of a company, why would I ever bring my team back to that
00:30:32.800 | place? And I'm struggling even at like,
00:30:35.360 | Oh, you mean because it's a nasty city, because it's all
00:30:36.960 | messed up? The city's
00:30:37.640 | lost. First of all, if you're asking me to
00:30:40.000 | blow up my opex 10, 20%, the answer is absolutely not. Will I
00:30:43.880 | do that? Then if you're saying well, come on, how about a zero?
00:30:47.160 | Let's just go to zero, right? It's right. Take the building.
00:30:49.480 | Yeah, I still wouldn't do it because then these people have
00:30:52.080 | to move from all around the country come back here and all
00:30:54.440 | of a sudden just live in a slimy dungeon.
00:30:57.120 | Yeah, how does that get solved? So that building should be zero.
00:31:02.440 | Saks you own buildings in San Francisco. What's your answer as
00:31:05.520 | a landlord?
00:31:06.040 | Yeah, I only own buildings in Jackson Square, which is like
00:31:08.880 | the one decent part remaining of San Francisco, which is actually
00:31:12.160 | where people want to be. And we've seen pretty good leasing
00:31:13.960 | activity there because, as people get driven out of Market
00:31:17.680 | Street and out of SoMa, they actually substitute to
00:31:21.000 | Jackson Square. Look, I think Chamath makes a good point,
00:31:23.680 | which is the demand picture for San Francisco is really unclear.
00:31:26.640 | There's something like, call it 30% vacancy. So there's an
00:31:30.160 | enormous number of these zombie buildings. And it's really hard
00:31:34.240 | to understand where the demand is going to come back to fill
00:31:37.640 | them. Now, the counter argument to that is, yeah, but it's so
00:31:42.000 | cheap. I mean, these buildings are so cheap. They're basically
00:31:45.160 | selling for almost land value.
00:31:47.280 | That's Chamath's point. It's like, it goes to zero. Yeah.
00:31:49.720 | So if there's a recovery, then it's very asymmetric, right? I
00:31:54.040 | mean, you could get three or four x upside in your money very,
00:31:56.840 | very quickly. Rates come down over the next couple of years,
00:31:59.760 | you could refi your equity out. So that's the counter argument:
00:32:04.320 | Yeah, we're not exactly sure where the demand comes from.
00:32:06.280 | But if you are willing to have a long term outlook, like
00:32:09.440 | five or 10 years, then there's a pretty good chance that will
00:32:12.960 | come back somehow, I think one of the things that has to happen
00:32:16.320 | in order to foster that demand is for there to be some
00:32:19.440 | fundamental changes in the politics of the city. And the
00:32:23.160 | question is, how does that happen? And I think one of the
00:32:25.400 | ways it could happen is the city starts facing huge budget
00:32:29.040 | shortfalls, which are coming is there's going to be a huge
00:32:32.200 | budget shortfall over the next year and next couple of years,
00:32:35.640 | because there's no activity. I mean, a third of the economic
00:32:37.800 | activity of the city is just dried up.
00:32:39.720 | And what about the rest of the country? So tell me like, if
00:32:42.480 | this is what's happening in San Francisco, what's happening in
00:32:44.720 | Boston, Dallas, New York City?
00:32:46.480 | Well, I don't think those areas are as impaired. They don't have
00:32:49.480 | the vacancy rates that San Francisco does. I mean, if
00:32:52.840 | you look at the major markets, San Francisco is the one at 30%.
00:32:54.760 | So just coming back to my prior question, if you were to assume
00:32:57.520 | San Francisco commercial real estate debt is impaired on the
00:33:00.320 | order of 30%. What do you think the debt load for all
00:33:05.160 | commercial real estate around the country is impaired? Is it
00:33:07.360 | 10%, 15%, 20%?
00:33:09.640 | I don't know. I mean, I think that you probably for San
00:33:14.480 | Francisco, I'd say probably half the debt should be written off.
00:33:18.000 | I don't know what it is for the rest of the country. There's
00:33:20.120 | already half a dozen of these buildings that have traded in
00:33:22.600 | the range that we're talking about. And there'll be about 20
00:33:24.760 | more that are coming to market that will trade in the next
00:33:27.440 | year, it's going to be fire sale after fire sale. And you have to
00:33:31.600 | clear out all of this bad equity. And then you've got to
00:33:35.600 | clear out all this bad debt and reset itself. So Chamath has
00:33:40.320 | asked the, you know, $64,000 question here, which is, when
00:33:45.000 | will the demand come back? And what will it consist of? And
00:33:48.800 | that is the leap of faith here that you have to make. Just to
00:33:51.800 | finish the point I was making, the city, the politicians are
00:33:54.840 | going to be under extraordinary stress, because there's not
00:33:58.360 | gonna be any budget. So what are they going to do? Are they going
00:34:00.120 | to lay off half the city employees reform, pensions and
00:34:04.120 | salaries? Or are they going to start to realize that all of
00:34:07.560 | their crazy transfer taxes, all of their crazy regulations and
00:34:10.800 | entitlements needs to be stripped away to start
00:34:13.320 | fostering real activity in the city? Are they going to start
00:34:15.600 | investing in police again? Are they going to start cleaning up
00:34:18.120 | the city? Are they going to act as a partner to business? I
00:34:21.600 | mean, that's where the incentives are going to be is
00:34:23.640 | for them to start acting like a partner again. And again, that's
00:34:27.240 | a huge leap of faith, given how crazy left wing the politics of
00:34:30.560 | San Francisco have been. But they may not have a choice, because it's
00:34:34.680 | going to come down to do we fire half the government employees?
00:34:37.800 | Or do we start working with business instead of driving them
00:34:40.800 | out of the city?
00:34:41.480 | I don't want to get too caught up in San Francisco politics and
00:34:44.280 | San Francisco as a region. I think we've hashed that
00:34:47.280 | one out. But the bigger question is, if there's this impairment
00:34:50.400 | on $3 trillion of commercial real estate debt across the
00:34:53.440 | country, who eats the loss? So who owns all the equity in these
00:34:57.800 | buildings? And where does that ultimately flow through to what
00:35:01.040 | are the second order effects? And who owns the debt? And where
00:35:03.720 | does that all flow through to we know the debt sits on the
00:35:05.720 | commercial banks? Do the bank stocks get beat up? The market's
00:35:09.600 | been cognizant of this folks have been shorting and selling
00:35:12.760 | regional bank stocks. And then Bill Gross, the well known bond
00:35:16.680 | trader who used to run PIMCO, came out this week and said, Hey,
00:35:19.680 | I've been tracking all these regional bank stocks. They're
00:35:22.360 | down so much. It's insane. They're trading at a huge
00:35:25.480 | discount to book. I'm going to start buying and then this
00:35:28.000 | morning, he put out a tweet saying I'm buying these stocks.
00:35:30.160 | And he listed the names of the bank stocks. He's like the
00:35:32.200 | bottoms been hit. Do you buy that? I mean, is this already
00:35:34.920 | been priced into the market, that this debt is going to be
00:35:37.480 | written off at some point and all these balance sheets are
00:35:39.360 | impaired? Because the banks are trading at a big discount to
00:35:42.120 | book value. So
00:35:42.800 | well, book value is a term that you need to put in quotes. So my
00:35:46.600 | question would be what are the rules around the real mark to
00:35:49.280 | market? Because I think that when we talked about the banking
00:35:51.600 | crisis, the biggest problem was these guys were playing fast and
00:35:55.160 | loose with valuations. And so are these things actually valued
00:35:58.840 | to market? And do they have to?
00:36:01.400 | They're not yet? No, that's the point. That's why they're trying
00:36:03.680 | to avoid these transactions. These book values are
00:36:06.600 | illegitimate. Yeah, and that's why they're trading at a
00:36:09.400 | discount. But Bill Gross is now saying they're trading at such a
00:36:11.760 | discount to whatever the accounting is that they're like
00:36:14.360 | now well below the fair value.
00:36:16.600 | Yeah, that's the argument for San Francisco real estate. I'm
00:36:18.960 | sorry that at least for these fire sales that are happening.
00:36:21.840 | Yeah. Well, Sacks, let me ask you one more question. So I
00:36:23.920 | mentioned the Biden program, Biden announced this program,
00:36:26.640 | which is effectively $45 billion in federal money to convert
00:36:31.800 | commercial buildings into condos and residential buildings. He
00:36:35.200 | basically relies on a 1998 transportation and
00:36:39.240 | infrastructure bill that provides authority to the
00:36:41.880 | federal government to issue low interest loans for specific
00:36:45.440 | infrastructure projects. If you're a developer, is it
00:36:48.760 | realistic that this money is going to unlock potential for
00:36:52.480 | increasing affordable housing density in urban centers by
00:36:55.840 | converting office buildings into residential buildings?
00:36:59.040 | People talk a lot about these office to residential
00:37:02.080 | conversions. The problem is, it's easy to say and much harder
00:37:05.000 | to do most commercial office buildings don't meet the
00:37:09.040 | structural requirements or architecturally, they're not
00:37:11.480 | really suitable. If you think about most commercial office
00:37:15.320 | buildings, they've got really big floor plates. And if you
00:37:19.960 | think about like restructuring them into apartments, there's
00:37:22.120 | not proper window coverage, there's not proper utilities. So
00:37:25.880 | it's a lot harder than it sounds. And I'd say most office
00:37:29.920 | buildings don't meet a lot of the requirements for this. But I
00:37:34.200 | think what's good about this executive order, even though I
00:37:37.440 | don't really believe this is a great use of 45 billion is that
00:37:40.040 | it's sending a signal to all of these cities, all these local
00:37:44.320 | governments that we need to get with the program here. In other
00:37:47.320 | words, you've got, you know, a blue executive sending a message
00:37:51.680 | to all these blue cities and states, that commercial real
00:37:54.880 | estate is in distress, and you need to loosen up your
00:37:57.680 | entitlements, you need to loosen up your regulations. It's
00:38:00.560 | sending a message to the city of San Francisco and all these
00:38:03.720 | other cities that you guys need to start doing positive things.
00:38:08.320 | And this is my point about will San Francisco start acting like
00:38:13.720 | a partner for real estate development, which is how you
00:38:16.880 | make real estate work? Or will they continue to drive all of
00:38:20.520 | these crazy regulations. So I mostly see this Biden program as
00:38:24.080 | symbolic, I'd say, but the question is whether the
00:38:27.280 | symbolism will actually drive better behavior by these blue
00:38:30.360 | cities.
00:38:30.880 | I personally think they're just trying to find more ways to pump
00:38:33.400 | money into supporting commercial real estate markets because of
00:38:36.400 | the issues we just highlighted. And I think this is the first of
00:38:38.640 | what will likely be several programs to support framed as
00:38:42.640 | things like affordable housing, but really designed to support
00:38:45.840 | the economic loss impairment. That's going to be inevitable at
00:38:49.200 | some point, right.
00:38:49.960 | But you know, you can't convert an office building
00:38:52.720 | to a residential building without rezoning. Yeah, typically,
00:38:55.760 | totally. Yeah. And this is where you need action by the city
00:38:58.560 | governments. They've got to rezone, they've got to reduce
00:39:01.320 | the regulations, they've got to eliminate these transfer taxes
00:39:04.600 | have to make these projects. Yeah, exactly. They have to make
00:39:07.000 | these projects economically viable.
00:39:09.760 | I think that we'll be able to look back, and San Francisco will
00:39:13.040 | be an incredibly measurable experiment on the return on
00:39:21.840 | invested
00:39:22.320 | capital, the return on invested capital of progressive
00:39:27.360 | left ideology. And it will be a great case study. Now, if it
00:39:33.760 | works, it will prove that all of these ideas that were kind of
00:39:39.080 | talked about in kind of like high intellectual circles have
00:39:44.880 | some value. And if it fails, which it looks like it's
00:39:47.520 | failing, it'll never be tried again for a very long time.
00:39:50.160 | Either way, you're going to have a very clear answer. And the
00:39:52.880 | good news is we're probably another five or 10
00:39:56.520 | years away from that bottom.
00:39:58.040 | Well, actually, guys, let me ask a quick question. So there was an
00:40:00.960 | article here saying that the city controller's office for San
00:40:05.800 | Francisco has released its projected budget shortfalls for
00:40:10.600 | the coming years. It's almost half a billion for 2024-25,
00:40:15.880 | reaching 1.3 billion in 2027-28. So what do they do about this? I
00:40:21.720 | mean, they don't have the money, they'll borrow money.
00:40:23.920 | And you can look at all of these other municipalities around the
00:40:27.600 | country as examples, but counties and municipalities and
00:40:30.400 | cities that are in much worse fiscal shape than San Francisco
00:40:34.800 | were able to stay incompetent for a lot longer. And so that delta
00:40:40.040 | T of incompetence tends to be about five to 10 years, I would
00:40:43.120 | say the midpoint is eight. So if we're starting now, you'll
00:40:47.200 | probably see some rationality by 2032, 2033. That's how much
00:40:54.120 | borrowing, David? I think you can tap the muni market.
00:40:58.600 | Because you know, again, this is sort of the double edged
00:41:01.560 | sword, right? Like when you're a triple tax advantage borrower,
00:41:05.000 | you can paint the case for being in San Francisco, which is part
00:41:08.120 | of California, etc, etc, etc. And your alternative is to own
00:41:12.360 | something in a state that's not nearly as economically vibrant,
00:41:15.880 | it's a pretty easy case to make to be able to borrow the money,
00:41:18.280 | I think. And so the problem is that it'll take, like I said,
00:41:21.080 | another, probably eight years, 10 years, before the
00:41:25.680 | spigot gets turned off, and they have to make some harsh changes.
00:41:28.240 | So they'll keep running this experiment for at least, I think
00:41:31.160 | if you want to be conservative for at least a decade, another
00:41:33.280 | decade,
00:41:33.600 | So they'll just pile on the debt until
00:41:36.000 | absolutely, I don't want to keep harping on San Francisco,
00:41:39.880 | let's move on to WeWork. Because WeWork feeds into this
00:41:43.800 | real estate piece. I don't know if you guys saw this article
00:41:45.920 | that the Wall Street Journal reported that WeWork is
00:41:48.560 | going to file for Chapter 11 as early as next week. They have a
00:41:52.320 | significant debt burden owed to SoftBank's Vision Fund as part of
00:41:56.200 | their go-public. They've signed leases on office buildings in
00:41:59.840 | San Francisco and elsewhere around the country $10 billion
00:42:02.480 | in total lease obligations due starting in the second half of
00:42:06.040 | 2023, through the end of 27. And after that, an additional $15
00:42:10.560 | billion starting in 2028. And as of June, WeWork had 777
00:42:15.880 | locations, including 229 in the US in major cities, the business
00:42:22.120 | made about call it 840 million bucks in revenue last quarter.
00:42:26.320 | So that works out to call it a three and a half billion dollar
00:42:29.680 | revenue run rate with you know, $10 billion of lease obligations
00:42:33.480 | starting next year. So the business is just drowning in
00:42:35.920 | these lease obligations and to restructure 777 lease
00:42:39.560 | obligations in this environment that we're talking about, while
00:42:42.880 | doing what you're saying, lowering rents to attract
00:42:46.360 | employers to show up and actually rent space from them
00:42:49.160 | is obviously causing the business model to distort even
00:42:52.320 | worse than it has been historically. They burned $8
00:42:54.600 | billion of free cash flow since Q4 of 2019. $8 billion of
00:43:01.080 | free cash.
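Rough arithmetic in Python on why those lease numbers are fatal (my calculation on the figures just quoted, spreading the near-term obligations evenly as a simplifying assumption):

quarterly_revenue = 840e6         # ~$840M last quarter
run_rate = quarterly_revenue * 4  # ~$3.4B annualized

near_term_leases = 10e9           # due from the second half of 2023 through 2027
near_term_years = 4.5             # roughly four and a half years

avg_annual_rent = near_term_leases / near_term_years
print(f"Revenue run rate:         ${run_rate / 1e9:.2f}B per year")
print(f"Avg near-term rent due:   ${avg_annual_rent / 1e9:.2f}B per year")
print(f"Rent as share of revenue: {avg_annual_rent / run_rate:.0%}")

That works out to roughly $2.2B of rent against $3.4B of revenue, about 66 cents of every revenue dollar committed to leases before salaries, build-outs, or anything else, and the $15 billion due from 2028 isn't even counted yet.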
00:43:01.960 | There's no question that WeWork has been a capital
00:43:04.400 | destruction machine. That being said, I actually think that some
00:43:08.680 | private equity player is going to buy this out of bankruptcy
00:43:11.320 | and make a fortune. I agree with that, too. Because out of those
00:43:15.320 | 777 locations, a lot of them are good locations and have good
00:43:22.040 | tenancy, they probably generate good revenue. The problem is
00:43:25.520 | that the leases are just bad. And bankruptcy gives you the
00:43:28.240 | power to break those leases or renegotiate them. So a really
00:43:32.080 | smart private equity player would come in here and say,
00:43:34.520 | Okay, we're going to take these locations, we're going to put
00:43:36.600 | them into three buckets. Bucket number one is, we're gonna get
00:43:39.160 | rid of them, because they're bad locations, we don't want them,
00:43:41.720 | there's just no occupancy. Bucket number two is, we're gonna
00:43:44.760 | go to the landlord and say, sorry, like, this lease doesn't
00:43:48.680 | work for us, we will be willing to work for you as an operator
00:43:53.000 | of the space. And you'll pay us a fee and a rev share on
00:43:56.560 | whatever it makes, but we can't pay you a guaranteed rent. So
00:44:00.560 | that's bucket number two. And then bucket three, which will be
00:44:03.080 | the best locations, they'll go back to the landlord and say,
00:44:05.600 | here's 60 cents on the dollar, we're willing to pay you this
00:44:08.360 | in rent. That's it. If that's not good enough for you, we're
00:44:11.520 | breaking the lease. And those landlords are gonna have
00:44:14.080 | to accept it, because who are they gonna get that's any better? And they
00:44:17.480 | don't even have the TIs (tenant improvements); they don't have the capital to put
00:44:20.420 | some new tenant in there. And even if they could put someone
00:44:23.000 | in there, it's not gonna be at a rent that's much higher than 60
00:44:27.040 | cents on the dollar, because WeWork made a lot of these
00:44:29.680 | top-of-market leases. So I think the landlord will take the bird in
00:44:33.560 | the hand. So think about it, some private equity player goes
00:44:36.800 | in there renegotiates all these leases, sheds the bad ones. And
00:44:40.580 | all of a sudden, the business is going to make a lot of money.
00:44:42.600 | And the reason why it's going to make money is because WeWork
00:44:46.400 | poured all of this capital into renovating these spaces and
00:44:51.480 | making them really nice. I mean, if you go into a WeWork, they
00:44:54.720 | are really nice spaces. And the reason for that is because WeWork
00:44:57.760 | spent billions of TI dollars making these things
00:45:02.480 | incredibly nice. Was that a wise investment? No, it was a
00:45:05.420 | terrible investment. But that money has been spent and already
00:45:08.800 | lost; that capital has been wiped out. So whoever buys this
00:45:12.440 | thing out of bankruptcy now is gonna be the beneficiary of all
00:45:15.280 | of those absurd TI dollars that were spent. Now, Adam Neumann...
00:45:20.240 | Oh my god, you just convinced me. Let's go buy it. Let's put
00:45:23.640 | in a bid. Has it filed? Has it filed? The brilliance of Adam
00:45:27.800 | Neumann was somehow convincing, wow, all these investors to put
00:45:32.040 | in billions of dollars of TI money on the theory that it was
00:45:36.000 | a software business. This was never a software business. It's
00:45:39.040 | a real estate development company. It was Regus. It was
00:45:42.920 | really just Regus. SoftBank put in a total of $16.9 billion
00:45:48.560 | in equity and debt. Oh my gosh. Incredible. On the theory it was
00:45:54.400 | software. That's mostly through the Vision Fund, which
00:45:57.280 | is, as you guys know, 45% Saudi money. It was Regus with much better
00:46:00.880 | design, but that design came at a huge expense, which was, again,
00:46:05.540 | this overinvestment in TI dollars, combined with the fact
00:46:09.520 | that they had no discipline around signing leases, and they
00:46:12.560 | have so many top-of-market lease deals. Yeah. But again, that's
00:46:16.400 | all fixable for someone who comes in. Old saying in real
00:46:19.040 | estate: it's the third owner who makes all the money. Like
00:46:22.720 | the first owner who does all the ground up, they lose a fortune
00:46:25.600 | and they get blown out. Then the second guy comes in and thinks
00:46:29.000 | that they're gonna make it work. But they can't make it work
00:46:32.080 | either. So then they get blown out. And it's finally the third
00:46:34.960 | person who comes in and makes all the money that's gonna
00:46:36.840 | happen here too.
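Sacks's three-bucket plan is concrete enough to sketch as a decision procedure. A hypothetical Python version; the 60-cents-on-the-dollar figure comes from his comments above, while the occupancy cutoff and lease fields are invented for illustration:

```python
# Hypothetical sketch of the three-bucket lease triage described above.
from dataclasses import dataclass

@dataclass
class Lease:
    location: str
    occupancy: float       # 0.0 to 1.0; the cutoff below is made up
    prime_location: bool   # the "best locations" per the discussion

def triage(lease: Lease) -> str:
    if lease.occupancy < 0.3:
        # Bucket 1: no occupancy -> shed it in bankruptcy.
        return "reject the lease"
    if not lease.prime_location:
        # Bucket 2: no guaranteed rent; operate for a fee plus rev share.
        return "convert to management agreement (fee + rev share)"
    # Bucket 3: best locations -> 60 cents on the dollar, or break it.
    return "renegotiate to 60% of rent, else break the lease"

for l in [Lease("empty tower", 0.1, False),
          Lease("mid-tier floor", 0.6, False),
          Lease("flagship", 0.9, True)]:
    print(l.location, "->", triage(l))
```

The landlords' alternative, as he notes, is re-tenanting without TI capital at rents no better than that 60%.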
00:46:37.560 | Oh my god, that's just fantastic. This is a story of
00:46:41.360 | the ages. Let's transition from one bubble to the most recent
00:46:45.160 | bubble. So, AI regulation. Given the frenzy and frothiness and
00:46:48.640 | fear-mongering on AI destroying the world, which I would
00:46:53.440 | personally argue is largely fear porn, the Biden
00:46:58.000 | administration took it upon themselves to try and be leaders
00:47:00.520 | in regulating AI and published an executive order on October 30.
00:47:04.640 | This long-anticipated executive order covers a very broad range
00:47:09.840 | of matters. The 111-page document covers very specific
00:47:14.080 | actions in very detailed terms. It uses technical terms of art.
00:47:17.480 | And I think it creates as much confusion as it does provide
00:47:21.040 | clarity. It's an executive order, so it's not legislation.
00:47:24.440 | And there's much in here that some would argue needs to be
00:47:28.200 | legislated. As an executive order, it can be overturned very
00:47:32.000 | quickly and easily by the next administration. It largely
00:47:35.720 | demands voluntary action from technology companies to submit
00:47:39.680 | their models, infrastructure, and tools for review, for proof of
00:47:45.520 | safety. Don't forget the equity and inclusion. There's an equity and
00:47:49.280 | inclusion component, which is that your models have to account
00:47:51.720 | for diversity, equity, and inclusion. Diversity, equity,
00:47:53.760 | inclusion. Here's a phrase from the executive order: the term
00:47:57.080 | "dual-use foundation model" means an AI model that is
00:47:59.560 | trained on broad data, generally uses self-supervision, contains
00:48:03.840 | at least tens of billions of parameters, is applicable across
00:48:06.480 | a wide range of contexts, and exhibits high levels of
00:48:10.200 | performance at tasks that pose a serious risk to security,
00:48:13.200 | national economic security, public safety, or any
00:48:15.720 | combination of those matters. None of which ultimately
00:48:18.160 | describes endpoints; none of which describes outcomes or
00:48:22.240 | applications. To me, this is the biggest problem I see with the
00:48:26.240 | executive order and with the approach to AI regulation
00:48:28.760 | broadly, particularly with what's being developed and has
00:48:31.480 | been leaked in terms of what's being developed out of the EU,
00:48:34.560 | which is that many of these government agencies, government
00:48:37.600 | actors are trying to define and regulate systems and methods,
00:48:43.640 | rather than regulate outcomes and applications. For example,
00:48:47.600 | if you were to say you cannot commit fraud, and falsely
00:48:51.280 | impersonate someone using software, you can rightly say
00:48:55.840 | that it follows the rule of law. And we should be able to
00:48:58.920 | adjudicate that cleanly in courts. Instead, to have
00:49:02.680 | government agencies and have what this executive order
00:49:05.280 | describes as a chief AI officer in every federal agency,
00:49:08.560 | responsible for regulating the systems and methods of all the
00:49:12.440 | actors and all the private companies that are building
00:49:15.560 | software, to say, this is the scale that your software can get
00:49:20.000 | to, you can have a certain number of parameters, and if you're
00:49:22.720 | bigger than this number of parameters, we have to come in
00:49:24.600 | and check your software, creates an outlandish standard, and one
00:49:29.000 | that makes absolutely no sense, particularly in the context of
00:49:31.760 | the pace of AI's progression. Remember, the paper that was
00:49:36.240 | foundational to transformer model development, which is what
00:49:39.040 | a lot of people call AI today, which developed these LLMs and
00:49:41.800 | so on, came out in 2017 and wasn't really widely adopted
00:49:45.200 | until 2018 as a standard. We're five years into this. So the way
00:49:49.920 | that we're making models, the types of models we're building,
00:49:52.160 | the scale of the models, the number of parameters being used:
00:49:54.640 | the pace at which these things are changing is staggering. We
00:49:58.720 | are only in the first inning. And so to come out and say, here
00:50:01.680 | are the standards by which we want to now regulate you. This
00:50:04.400 | is the size that the model can be, these are the types of models.
00:50:07.000 | It's going to look like medieval literature in three years, none
00:50:10.800 | of this stuff's even going to apply anymore. I'm just really
00:50:13.200 | of the point of view, as you guys know, that the market needs
00:50:16.120 | to be allowed to develop. If we don't allow our market to
00:50:19.960 | develop in the United States, then India, and China, and Singapore,
00:50:24.200 | and other markets will get well ahead of us in terms of their
00:50:27.080 | model development and their capabilities as a nation and
00:50:30.440 | their capabilities as an industry. And the more our
00:50:33.360 | government actors step in and try and tell us what systems and
00:50:36.560 | methods we are allowed to use to build stuff, the more at risk we
00:50:39.960 | are of falling behind. That's my general statement on the matter.
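Friedberg's objection is easiest to see if you try to turn the EO definition he quoted into an actual test. A hypothetical encoding; the criteria mirror the language read above, and the parameter floor is an assumption, since "tens of billions" is never pinned down:

```python
# Hypothetical encoding of the "dual-use foundation model" definition
# quoted above. Every criterion describes how the model was built,
# not what it is used for -- which is the objection being made.

def is_dual_use_foundation_model(trained_on_broad_data: bool,
                                 uses_self_supervision: bool,
                                 parameter_count: int,
                                 broadly_applicable: bool,
                                 high_perf_on_risky_tasks: bool) -> bool:
    return (trained_on_broad_data
            and uses_self_supervision
            and parameter_count >= 10_000_000_000  # "tens of billions": even the floor is ambiguous
            and broadly_applicable
            and high_perf_on_risky_tasks)

# A 70B-parameter general model trips the definition regardless of
# whether it is ever pointed at anything harmful:
print(is_dual_use_foundation_model(True, True, 70_000_000_000, True, True))
```

Nothing in the predicate references an outcome, a victim, or an application, which is exactly the systems-and-methods problem he's raising.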
00:50:42.920 | I'll pass the mic. I mean, Chamath, you were advocating for
00:50:47.120 | AI regs a couple months ago, I think, right? I mean, where does
00:50:50.000 | this fall with respect to what you were advocating for and what
00:50:52.360 | you were concerned about?
00:50:53.240 | Well, I had a very specific lens that I viewed this stuff through
00:50:59.240 | and they didn't really address it. I think that this was a
00:51:02.840 | little bit of a kitchen-sink EO. And I think what probably
00:51:08.320 | happened was, look, I mean, I think the thing is
00:51:12.400 | like, look, when this process first started, it
00:51:16.560 | started in the Senate and we started with the majority leader
00:51:19.560 | Chuck Schumer. And I met with Chuck and then I met with Chuck's
00:51:22.520 | team. And then it morphed into this much bigger thing. And I
00:51:26.480 | think it morphed into a lot of people trying to do the right
00:51:31.920 | thing, meeting with a lot of very important and very famous
00:51:35.400 | people. And I think somewhere along the way, it just became
00:51:39.720 | this convoluted and confused document, because I do agree
00:51:45.320 | with you that it's not super coherent. There's a lot of
00:51:47.680 | arbitrary requirements, you know, like there's like, I think
00:51:51.040 | there's a requirement here that at a certain number of
00:51:54.080 | parameters, you have to, like, self-report yourself to the
00:51:56.840 | government, which is like, what does that even mean? Why? Why is
00:52:01.880 | that even important? So it's just a lot of random stuff. So
00:52:05.520 | it just seems like anybody who had the ear of the people
00:52:08.560 | writing this had a chance to write something in. So it's a
00:52:14.000 | little confusing. It's not going to do the job. And I think that
00:52:17.800 | you're right, two or three years, we're going to look back
00:52:19.880 | and this is going to look medieval.
00:52:21.640 | I'll just point out another point, just one of many
00:52:24.600 | that I could highlight in this document; I read most of it.
00:52:26.920 | They say, you know, there's got to be legislation on
00:52:30.720 | watermarking AI content. Think about that for a second. What is AI
00:52:35.000 | content? We've been using Adobe Photoshop for 30 years, to
00:52:39.720 | change photographs and change documents. We've been using
00:52:42.840 | various tools for doing audio generation and music. I mean,
00:52:47.080 | there's no music that doesn't run through a digital processor
00:52:49.520 | of some sort. There's no video. There's no movies that don't
00:52:53.920 | have some degree of CGI elements or some degree of post
00:52:57.320 | production, digital rendering integrated into the video
00:53:00.960 | itself. What makes something AI? Is it the fact that 100% of the
00:53:05.680 | pixels are generated by software? What if it's only 98%?
00:53:09.240 | Is Pixar AI, because all of Pixar is made on computers? Is
00:53:13.840 | the Auto-Tuned hip-hop artist AI, because his real voice isn't
00:53:17.560 | what's coming through on the audio track? So, what is AI? In this
00:53:22.320 | definition, the fact that any piece of content needs to be watermarked
00:53:25.800 | if it's AI, I think, is one of the most outlandish infringements
00:53:29.960 | on First Amendment rights I've seen, and it doesn't really
00:53:33.520 | understand, in any sense, how software is generally used
00:53:37.720 | in the world today, which is frankly everywhere. And to say
00:53:41.640 | that now there is a certain definition of a certain type of
00:53:45.040 | software that we're going to arbitrarily call AI, we want to
00:53:48.520 | watermark this stuff, and we want to get a stamp on that
00:53:51.560 | content so the government can track it and audit it, I think,
00:53:55.440 | is absurd. Sacks, over to you.
00:53:57.800 | Yeah, okay, three quick points here. Number one: AI
00:54:01.080 | has been convicted of a pre-crime. This EO
00:54:06.400 | describes this litany of horribles, this parade
00:54:10.520 | of horribles, that's going to happen unless, you know, the
00:54:14.200 | wise people in the central government, like Joe Biden and
00:54:16.840 | Kamala Harris, guide its development. That's what the
00:54:19.480 | EO says it can do: guide the development of this technology.
00:54:22.080 | Because if we don't, it's going to result in all these horrible
00:54:24.480 | things happening. Steven Sinofsky, who was a key
00:54:27.800 | executive of Microsoft, I think going back to the '80s,
00:54:30.600 | wrote a really interesting blog post about this. He was
00:54:34.120 | there at the beginning of the PC revolution, with the dawn of the
00:54:37.760 | microprocessor in the '70s and '80s. Yeah. And he pulled a bunch
00:54:41.340 | of books off his bookshelf from that time period, which were
00:54:44.340 | sort of the sci-fi books forecasting doom, infringement on
00:54:48.240 | privacy. One was called The Assault on Privacy, another was
00:54:51.420 | called Electronic Nightmare, another one was called The Rise
00:54:54.440 | of the Computer State, and so on. They were all predicting
00:54:57.900 | all these horrible things that would come from the birth of the
00:55:01.900 | computer. And it would put all these people out of work and so
00:55:04.740 | on. Totally. But nobody allowed these fears to guide the
00:55:11.060 | development of the industry. There was no executive order at
00:55:13.740 | that time, saying that we're going to guide the development
00:55:16.820 | of the microprocessor to avoid all these harms that
00:55:20.060 | haven't occurred yet. And instead, we're taking a
00:55:23.420 | different approach here. And he makes the point that if the
00:55:26.420 | industry had been guided in that way, then it never would have
00:55:30.260 | achieved the potential that it ultimately achieved. And
00:55:32.640 | furthermore, he makes the point that a company like IBM would
00:55:35.380 | have been more than happy to work with the regulators to say,
00:55:38.220 | yeah, let's define a bunch of these rules to make sure that it
00:55:41.340 | doesn't go in this dark direction that everyone thinks
00:55:43.140 | it's going to go in. And we would have gotten very extreme
00:55:45.420 | regulatory capture.
00:55:46.420 | Can I pre-cog one of the crazy lines in this thing? Here are
00:55:50.460 | the pre-cogs. Here's one: when a foreign person transacts
00:55:55.540 | with an infrastructure-as-a-service provider, so Azure,
00:55:59.300 | Google Cloud, Amazon, to train a large AI model
00:56:03.600 | with potential capabilities that could be used
00:56:06.240 | in malicious cyber-enabled activity, they propose
00:56:09.060 | regulation that requires that IaaS provider to submit a
00:56:12.060 | report. So if a foreigner is using your API, then,
00:56:23.280 | AWS, file a TPS report? What is going on? I have a clarifying
00:56:30.460 | question. What happens if you're a foreigner that's on an H-1B at
00:56:33.600 | Facebook or, you know, Microsoft? You're technically a
00:56:38.340 | foreigner, right? And then, Chamath, meanwhile, remember,
00:56:40.780 | SpaceX is now being sued by the DOJ because they haven't employed
00:56:43.580 | enough foreigners. So it's like they've outlawed both sides of it. You're
00:56:52.940 | in non-compliance just by existing.
00:56:58.980 | By death. If you employ foreigners in the creation of
00:57:05.220 | these very sophisticated technologies, you're creating a
00:57:07.220 | national security threat. If you exclude them, you're violating
00:57:10.560 | their civil rights. And these are regulations are promulgated
00:57:13.480 | by two different parts of the government. So the government is
00:57:16.360 | basically getting to a point where no matter which side of
00:57:20.160 | the regulation you choose, you're going to be in
00:57:22.280 | non-compliance somewhere. And this is just a recipe for the
00:57:24.660 | government to basically take action against anybody who they
00:57:27.400 | don't like politically. Yeah. Now, on this pre-crime point, you
00:57:31.600 | know, one of the crazier stories that came out, and I think this
00:57:34.560 | was in Variety, is that Biden grew more concerned about AI
00:57:38.280 | after screening the most recent Mission: Impossible movie, in which
00:57:41.800 | the villain is a sentient AI that seizes control of the
00:57:46.040 | world's intelligence apparatus. Is that true? That's not true.
00:57:49.640 | Well, that's what the story said.
00:57:51.120 | Didn't a press reporter ask the press secretary this, and
00:57:54.520 | this was the response? Who said this?
00:57:56.920 | The chief of staff, I guess, said it. Reed. Reed who? Yeah, wow.
00:58:01.320 | Unbelievable. The crazy thing is that I think the reason they're
00:58:03.560 | putting out this story is they don't want to admit how the EO
00:58:06.200 | really happened, which is a bunch of lobbyists came in. And
00:58:09.680 | like you said, a lot of powerful people from big tech, they're
00:58:12.280 | all clamoring for regulatory capture, they're all clamoring
00:58:15.600 | to define the regulations so that they can benefit themselves
00:58:18.800 | and keep out new entrants. Because one of the big targets
00:58:21.680 | here is going to be open source software. So if you are, for
00:58:25.120 | example, open AI, which is no longer open as closed source,
00:58:29.120 | the number one thing you want to do is pull up the ladder before
00:58:33.040 | open source software can get a lot of momentum. And a big part
00:58:37.440 | of the regulation here does apply to open source software. I
00:58:40.320 | think one of the most problematic things about this,
00:58:42.920 | just to move on to the next point, is the way that it's not
00:58:46.880 | just this 110-page EO that's being promulgated. The
00:58:50.880 | directive is for all these other parts of the federal government
00:58:54.320 | to start creating their own regulations. So it says here
00:58:57.160 | that the National Institute of Standards and Technology will
00:58:59.840 | develop standards for red team testing of these models. It
00:59:03.320 | directs the Department of Commerce to develop standards
00:59:06.480 | for, like you said, Freeberg, labeling AI-generated content. It
00:59:09.280 | explicitly directs the FTC to use existing legislation to
00:59:13.320 | go and regulate it. It requires the FCC to think about spectrum
00:59:19.360 | licensing rules. Through this lens, you're absolutely right,
00:59:22.320 | it activates every agency of the government to create more
00:59:25.760 | bureaucracy,
00:59:26.520 | More importantly, it creates authority for agencies
00:59:29.640 | of the government to go into private servers and run audits.
00:59:32.920 | You're allowing the government for the first time ever to have
00:59:35.600 | access to private information, private computing
00:59:37.960 | infrastructure, rather than do what the job of the government
00:59:41.120 | should be, which is to regulate the outputs, to regulate the
00:59:44.000 | outcomes, to regulate illegal and illicit activities, rather than
00:59:49.200 | regulate systems and methods that, as Sacks points out, are
00:59:52.680 | pre-convicted of a crime. There is no crime until a crime is
00:59:55.800 | committed. And so this idea that the government can come in and
00:59:59.000 | regulate and keep a crime from being committed by auditing,
01:00:03.040 | managing all the systems and methods that software companies
01:00:06.360 | are using puts the United States at an extraordinary and unfair
01:00:09.600 | disadvantage on a global stage, by the way. And by the way,
01:00:12.560 | Llama 2 is out there. Hardware is out there. All the tooling is
01:00:16.160 | out there for others to start to get well ahead.
01:00:18.440 | Good news about the immigration thing: they actually, on
01:00:21.760 | the other side, do streamline the immigration process for people
01:00:25.600 | with AI or other critical-technology skills. So on the one
01:00:29.440 | hand, we should be able to hire them. But then on the back end,
01:00:31.960 | we'll have to file a TPS report about what they do.
01:00:34.200 | Let me just finish my point about different agencies being
01:00:38.560 | directed now to promulgate regulations. And by the way,
01:00:40.840 | there's some new committee or council that's being created of
01:00:43.280 | 29 different parts of the federal government that are now
01:00:48.680 | going to coordinate on this AI stuff. So it's going to be a
01:00:51.520 | Brussels-style bureaucracy. What's going to happen is that
01:00:55.520 | with all of these different bodies issuing new regulations,
01:00:58.920 | it's going to get more and more burdensome on technology
01:01:02.520 | companies until the point where they cry out for some sort of
01:01:06.480 | rationalization of this regime, they're going to say, Listen, we
01:01:10.080 | can't keep up with the FCC and the Department of Commerce and this
01:01:13.200 | NIST standards board, just give us one agency to deal with.
01:01:17.560 | And so the industry itself is eventually going to cry uncle and
01:01:21.240 | say, like, please just give us one. And already, you're hearing
01:01:25.320 | people like Sam Altman and so forth calling for the equivalent of an
01:01:29.320 | Atomic Energy Commission for AI. This is how we're going to end
01:01:33.040 | up with a federal software commission, just like we have a
01:01:35.760 | an FCC to run big communications, and we have an FDA to run big
01:01:41.520 | pharma, we're going to end up with a federal software
01:01:43.400 | commission to run software, big software. And the thing to
01:01:47.200 | realize about AI is that functionally, there's no way to
01:01:51.040 | really say what the difference is between AI and software in
01:01:55.200 | general. Every single technology company in Silicon Valley has
01:01:58.560 | AI on its roadmap. It's a new computing paradigm, everybody is
01:02:02.160 | incorporating it, everyone is using some elements of AI. So
01:02:06.560 | this wave of AI is just becoming software. So we're going to take
01:02:12.440 | the one part of our economy that has been free market and
01:02:17.080 | relatively unregulated, which is software, it's like the last
01:02:20.160 | bastion of true entrepreneurial capitalism, and we're going to
01:02:24.120 | create a new federal agency to manage it. That's where we're
01:02:29.600 | all headed here. And we're going to end up as an industry in the
01:02:34.560 | same place that pharma has ended up, or telecom has ended up,
01:02:39.920 | which is you go for permission to Washington.
01:02:42.920 | No. Nick, can you just put this up? So I did, you know,
01:02:47.080 | write a blog post sort of asking for this AI regulation. And
01:02:51.000 | I do think like the model of the FDA is probably the best one
01:02:55.520 | where in certain use cases, I do think that you can create a
01:03:00.320 | sandbox where these things can be tested, and I think can be
01:03:04.200 | evaluated. The problem is that this legislation or this
01:03:08.200 | executive order doesn't really do that. So the way that I wrote
01:03:12.200 | this was, well, what are the arguments against what I'm
01:03:15.000 | proposing? And unfortunately, I have to say every single one of
01:03:18.320 | those arguments is not valid. So I think that we should have gone
01:03:24.440 | in the direction of like a simplifying assumption, which is
01:03:26.960 | that most of this stuff is just software. And there are going to
01:03:30.720 | be some very specific kinds of businesses that before you take
01:03:33.920 | it to market, you should do something that's equivalent to
01:03:38.080 | sticking it in a sandbox, letting people see it. And I
01:03:40.600 | think it's easy to define what those are actually. And the fact
01:03:43.400 | that we didn't do that, and we have this overly generalized
01:03:47.080 | approach is going to create more chaos and people will not know
01:03:50.280 | whether they're in violation or they're in support; they're
01:03:52.600 | going to be sort of pre-guilty, as you said, Sacks, of a pre-
01:03:55.480 | crime. It's just gonna be chaos. So I think
01:04:00.360 | this is a smorgasbord for politicians, because everyone's
01:04:03.760 | gonna be lobbying them to try and shape the regulations the
01:04:06.400 | way they want. I think it's going to be a smorgasbord for
01:04:09.680 | lawyers. Yeah, I mean, everyone's gonna hire lawyers
01:04:13.240 | and lobbyists and policy people to try and shape these
01:04:16.200 | regulations.
01:04:16.840 | Good time to bet on India. Thank God for Mark Zuckerberg and the
01:04:20.480 | open-sourcing of the Llama 2 model.
01:04:23.920 | It's not entirely open source, right? I mean, I think the
01:04:26.040 | biggest thing with Llama 2 that I find problematic is that it
01:04:28.680 | has these arbitrary thresholds on the number of
01:04:31.560 | users one can have. Yes, yes. Before you have to go back to
01:04:34.040 | Facebook. So yeah, it's 700 million, I think, right? Yeah.
01:04:37.400 | Yeah. Before I give all the credit to Facebook, I'd rather
01:04:39.520 | say that I think there are a lot of open source alternatives,
01:04:43.160 | including Mistral, that I think are much better. Yeah. And free
01:04:47.000 | and open and in the clear where you can have unencumbered
01:04:49.280 | growth, not dependent on anybody else.
01:04:51.760 | Yeah, yeah. But I think, Freeberg, your larger point
01:04:56.840 | about us versus India versus China: I think this could have a
01:05:00.800 | major impact on our global competitiveness, of course,
01:05:03.720 | because as you look around at what's happening in the American
01:05:06.840 | system, there are so many things going wrong, right? Our fiscal
01:05:09.680 | situation is untenable. We're mired in massive debt. We have
01:05:13.560 | tremendous internal division in the country. Many of our cities
01:05:17.760 | are rife with crime, you just look at the number of people
01:05:20.040 | living on the streets. It's not good. I mean, you know, there's
01:05:23.160 | a lot of indicators that America's in decline, we're
01:05:25.200 | involved in way too many foreign wars. So there's a lot of bad
01:05:29.480 | things happening. However, the one bright spot that America has
01:05:33.040 | had going for it now, I think for decades is that we've always
01:05:35.640 | been the leader in new technology development. I mean,
01:05:39.080 | we're the ones who pioneered the internet. We're the place where
01:05:42.840 | mobile took off, we're the place where the cloud took off, and
01:05:45.400 | everyone else has been playing catch up with that. And I think
01:05:47.920 | we are in the lead in terms of AI development. I mean, other
01:05:50.920 | countries are doing it too. But I think that we are in the lead.
01:05:54.040 | And it's not because of the involvement of government. It's
01:05:56.080 | because we have this vibrant private sector that's willing to
01:06:00.280 | invest risk capital, where you know that nine out of 10 checks
01:06:05.240 | that you write are going to zero, but that one check, that one
01:06:07.800 | out of 10, hopefully, could be a huge outcome. So we've had this
01:06:11.280 | incredible ecosystem of software development that is free and
01:06:15.160 | unregulated that is clustered around Silicon Valley, but it
01:06:18.520 | radiates out to many other places. And the only reason that
01:06:22.600 | ecosystem has thrived and survived is the reason that Bill
01:06:26.120 | Gurley stated at the All-In Summit, which is that it was 2,200 miles away
01:06:31.240 | from Washington, and Washington has generally kept its mitts
01:06:35.560 | off. And now we are headed to a place where not just AI, but
01:06:40.680 | basically all software companies are headed for regulation. And
01:06:44.640 | like I said, right now, it's 29 different agencies promulgating
01:06:48.480 | the regulations. But where this is going to end up is that the
01:06:52.160 | industry is eventually going to beg for its own agency just to
01:06:55.960 | rationalize all the spaghetti of different regulations, we are
01:06:59.280 | eventually going to beg for our federal agency overlord, just to
01:07:03.840 | rationalize this whole mess. And sadly, I think that's where we're
01:07:06.600 | going to end up and a lot of what is special about our
01:07:09.840 | industry, which is that two guys in a garage really can get
01:07:12.480 | started in a completely permissionless way. I think
01:07:15.680 | that's going to be jeopardized.
01:07:16.760 | If that's the endgame. That's genius.
01:07:18.600 | I think that is the endgame.
01:07:19.800 | I mean, do you think that they wrote this thing in a way just
01:07:22.360 | to create enough confusion where we all just cry uncle and say,
01:07:25.680 | just regulate us, give us one? Yeah.
01:07:27.720 | look, I don't know if it's that conscious. I think what it is,
01:07:30.120 | is there's a lot of politicians in Washington, and Biden and
01:07:33.560 | Harris definitely fall into this category, who just think that
01:07:36.960 | central planning or the central guiding of the economy or
01:07:40.200 | aspects of the economy to avoid certain problems is something
01:07:44.440 | they can do; they think that they are wise enough to do
01:07:46.680 | this. Or maybe they're cynical, and they just know this is a
01:07:49.480 | pathway to campaign contributions. But either way,
01:07:52.440 | they kind of believe in this approach. And they threw the
01:07:56.880 | kitchen sink at this: 110 pages. Every agency is now gonna be
01:08:01.560 | writing thousands of pages. And I think where it's going to end up
01:08:04.080 | is with a new agency to manage software.
01:08:07.280 | So I think, Sacks, the point you're making is one that
01:08:12.160 | some folks in Silicon Valley have felt for a while, which
01:08:16.080 | is that a lot of the freedom and opportunity that the
01:08:20.480 | United States offers pioneers and entrepreneurs has been
01:08:25.600 | realized and borne out in Silicon Valley, and that's feeling like it's
01:08:28.720 | been stymied in a number of ways. But the events in Israel a
01:08:34.880 | couple weeks ago, in my opinion, seem to have shocked a lot of
01:08:40.720 | people that are traditionally very liberal Democrats into
01:08:45.960 | being red pilled. I have had a number of calls with individuals
01:08:49.840 | over the last few weeks that have historically been very
01:08:53.040 | diehard blue liberal Democrats, who have spent their time
01:08:57.920 | with NGOs, with nonprofits, with various efforts to promote
01:09:03.360 | liberal causes. And these are folks who may or may not be
01:09:06.600 | Jewish, but who have been so shocked by what's happened in
01:09:12.680 | Israel, and how many liberal organizations have created this
01:09:18.320 | narrative around being pro-Palestinian, in a way that is
01:09:22.240 | being deemed, and is coming across as, highly, and in many
01:09:25.840 | cases is very much, anti-Semitic. And so I've had folks say, I'm
01:09:30.360 | giving up everything else I'm doing. And all I'm going to do
01:09:33.640 | is change the causes I'm working on now. And I have just been red
01:09:38.200 | pilled. Do you guys... And Chamath, you shared your story on our
01:09:42.880 | show a couple of weeks ago; the clip of you saying it went
01:09:45.920 | viral, on how you kind of recognized that maybe Trump was a
01:09:49.080 | good president, in the context of what we're now seeing
01:09:52.000 | with the Biden administration. You seem to be, like, the
01:09:53.840 | case study that
01:09:58.600 | I'm now hearing more of from Silicon Valley folks. Yeah,
01:10:01.040 | about this shift.
01:10:01.960 | I think that over the last three or four years, my political
01:10:09.200 | positions really haven't changed that much. But I think what's
01:10:13.440 | happened is that the political party that I've
01:10:17.240 | historically been affiliated with has just kept lurching
01:10:20.520 | further and further to the left. And so it leaves me looking for
01:10:27.520 | a new home, I think. And I think that there are a lot of people
01:10:33.880 | that thought they were signing up for a set of beliefs around
01:10:37.320 | free speech around being pro reproductive rights around being
01:10:42.960 | supportive of LGBTQ issues, but also supportive of a rational
01:10:49.200 | approach to the border and reasonable fiscal discipline.
01:10:54.200 | These all seem to just be totally out the window. So I
01:10:59.200 | understand, because I've actually talked to a lot of my friends,
01:11:01.880 | and my friends, some of them, have been huge donors to both
01:11:07.240 | elite colleges and the Democratic Party. And they've
01:11:12.320 | paused all of it. And it's definitely given me pause. And
01:11:17.160 | I'm just trying to figure out where to go from here. Because I
01:11:20.840 | think the reality is that this is a really crazy situation. So I
01:11:24.360 | think that there's just a lot of people that are in the center
01:11:27.440 | that don't really belong anymore, the way that the
01:11:29.760 | Democrats represent themselves. So there's clearly something
01:11:33.080 | going on that we just need to get to the bottom of here,
01:11:35.200 | because I think it's not probably the Democratic Party
01:11:38.120 | that a lot of people thought that they belong to.
01:11:39.760 | Do you have other friends and folks that you know, that have
01:11:42.120 | traditionally been Democrat donors in Silicon Valley,
01:11:44.600 | Chamath, that you see are now maybe considering shifting where
01:11:49.080 | they're giving money, to Republican Party candidates and
01:11:53.760 | causes?
01:11:54.240 | I think it's easy to say that amongst my friends, it's in the
01:11:57.680 | billions of dollars that's been paused. That's insane. That's a
01:12:02.640 | lot of money.
01:12:03.160 | Actually, you used to be part of a niche in Silicon Valley,
01:12:07.200 | the Silicon Valley Republican niche. Is your niche growing?
01:12:11.120 | I think so. I actually made a list of I think eight issues
01:12:15.680 | that have driven the shift. And when you say there's been a
01:12:18.560 | shift, right, I wouldn't say that there's been a shift all
01:12:21.960 | the way to the right. But I think there's been a shift from
01:12:24.040 | the left to the center.
01:12:24.920 | By the way, that statement is consistent with what I've heard
01:12:27.320 | personally from a lot of folks, which is I consider myself a
01:12:30.560 | centrist, now that I see that some of these liberal causes are
01:12:35.000 | so extreme from my point of view.
01:12:37.240 | right. So let me rattle off what I think the key drivers are
01:12:40.320 | Number one, reckless fiscal and monetary policies coming out of
01:12:43.400 | Washington, as Druckenmiller says, we've been spending like
01:12:45.840 | drunken sailors. Number two, the dismal state of San Francisco
01:12:49.520 | because of crime, drugs, homelessness, and so on. And we
01:12:52.560 | all know that's an extreme one party government. Number three,
01:12:55.960 | the ruinous COVID policies, they botched the science, they harmed
01:12:59.560 | the educational development of kids. And they also harmed work
01:13:02.560 | culture by making employees feel permanently entitled to work
01:13:05.400 | from home. So that's number three. Number four, this long-running
01:13:09.000 | kind of regulatory capture in both Washington and Sacramento that
01:13:12.840 | we just talked about with AI; people are waking up to that.
01:13:15.640 | Number five, the betrayal of true liberal values, like free
01:13:19.600 | speech and open inquiry. Number six, the way that tech
01:13:22.600 | visionaries like Elon have been targeted for harassment by left
01:13:26.640 | wing politicians. Remember, Biden said we need to look into
01:13:29.160 | this guy. Well, Lorena Gonzalez was even less subtle. She said,
01:13:33.000 | simply, "[expletive] Elon Musk." Number seven, the growing awareness that
01:13:36.360 | wokeness has gone way off the rails. We saw this even before
01:13:40.320 | the Israel Hamas war. But now I would say number eight is this
01:13:43.920 | war, it's really thrown gasoline on the fire by showing that
01:13:49.120 | wokeness leads to the simplistic breaking up of the world into
01:13:52.480 | oppressor and oppressed categories. And what we're
01:13:56.240 | seeing is that there's a lot of people in the woke left who are
01:13:59.440 | basically cheering for a terrorist organization. And
01:14:02.600 | they're able to rationalize atrocities because of this
01:14:05.680 | simplistic woke dichotomy. That being said, I do think it's
01:14:09.200 | possible to support the desire for a Palestinian state and a
01:14:13.120 | two state solution without falling into that bucket. I want
01:14:15.840 | to be clear about that. But I think that there's been way too
01:14:18.560 | many people on the woke left, who have blithely disregarded
01:14:22.360 | the atrocities and were basically cheering for Hamas
01:14:26.560 | from day one on this thing. And I think that has woken up a lot
01:14:30.040 | of liberals, both Jews and non Jews who saw Jews as allies on
01:14:34.640 | the left. And when they see Jews being attacked this way, that
01:14:38.760 | has led them to really, I think, question their priors on
01:14:41.280 | this.
01:14:41.640 | What did you think of the Kamala Harris announcement
01:14:43.640 | yesterday that they're launching a program to combat Islamophobia
01:14:47.880 | in the US at a time when my understanding is many of their
01:14:52.560 | Jewish donors are up in arms about their lack of action on
01:14:57.840 | anti-Semitism? I mean, it just...
01:14:59.920 | It just seems tone-deaf. I mean, look, I have not seen, in this
01:15:03.080 | particular case, I don't think I've really seen outpourings of
01:15:06.800 | Islamophobia. To be clear, I'm against Islamophobia. I'm
01:15:10.440 | against hatred towards any particular group, including
01:15:13.880 | Muslims or Jews. To me, you know, any outpouring of hate towards a
01:15:17.880 | particular group is bad. But yeah, this seems like a problem
01:15:22.040 | that we really haven't seen. So it seems a little bit tone-deaf.
01:15:25.520 | And look, this administration has been kind of stumbling
01:15:27.600 | around on this whole issue. It seems like they don't really
01:15:29.960 | know what to do.
01:15:30.560 | I don't know how to quantify it. I just think it's an anecdote.
01:15:33.200 | Like I'm saying, I don't know how to
01:15:36.280 | quantify it, but I think that the anecdote is a lot of folks...
01:15:40.320 | You know what? What do you think? What do you feel?
01:15:43.320 | What do I feel? Yeah. I mean, I'm not a party guy. Taking a
01:15:50.520 | step back, this is gonna get a little esoteric. But
01:15:53.320 | I think humans organize into social systems. One social
01:15:56.920 | system is a family, one is a company. And I think that the
01:16:00.000 | government is a social system. The point of a social system is
01:16:03.800 | to get influence, to achieve the things you want to achieve, by
01:16:07.680 | aggregating together, by creating a tribe, by creating a system.
01:16:10.520 | And that's what a government is. It lets a lot of people pool
01:16:13.280 | resources to get things done as a group that you wouldn't
01:16:15.800 | otherwise be able to get done individually. Same with a
01:16:18.040 | company, same with a family to support each other, etc. And I
01:16:21.800 | think that the problem with democratically elected
01:16:25.280 | governments, or all governments for that matter, frankly, is like
01:16:28.160 | any other system. They and I've said this in the past, they have
01:16:31.160 | a natural incentive to grow. I met with a government person the
01:16:35.400 | other day, a well-known government person. And I was struck by
01:16:40.440 | the tone. I've been struck by the tone of multiple people I've
01:16:44.840 | met from the government: here's all the next things I'm
01:16:47.720 | going to get done, and here's how we're going to grow and do
01:16:49.440 | more and be bigger. Just like if you're running a company, that's
01:16:53.000 | your incentive. That's your model of thinking. And like with
01:16:56.240 | your family, you want to grow the balance sheet for your
01:16:58.000 | family, you want to have another home you want to take care of,
01:17:00.240 | you want to build security, etc. So governments as an
01:17:03.480 | organizing system are no different. And so I think that
01:17:08.840 | in a democracy, over time, there is, as Sacks points out, a group
01:17:17.200 | of folks who get elected, and, as a result, a philosophy that
01:17:21.840 | endures in that system that is all about overcoming the powers
01:17:25.960 | that be: that we all view there to be powers that are keeping us
01:17:29.960 | from doing the things we want to do. And I think that this is the
01:17:33.200 | ultimate manifestation of what happens is that you push the
01:17:36.920 | government to be as big as possible, with no end. And you
01:17:41.120 | push the government to destroy any system of power that gets in
01:17:44.880 | the way of those who are able to vote these folks in. And now a
01:17:48.000 | lot of folks who thought that they were supporting liberal
01:17:50.080 | causes to help those in need, are realizing that you're
01:17:53.200 | actually repressing minorities in the process that there are
01:17:56.960 | minorities that are viewed to be powerful, like Jews, that there
01:18:00.800 | are minorities that are viewed to be unfairly advantaged, like
01:18:04.840 | those in the tech industry, that there are minorities who are
01:18:07.800 | viewed to be unfairly advantaged, like billionaires. And
01:18:11.040 | I know that this sounds insane, to use those terms, but
01:18:14.800 | that's the objective is to destroy any system of power, any
01:18:18.800 | individual of power that has any form of leverage over us. It's
01:18:22.240 | also why I think we've generally been techno-pessimistic in the
01:18:25.840 | West since the late 1960s, early '70s: because technology has
01:18:29.680 | been viewed as this point of leverage for creating a power
01:18:32.120 | system for a minority of the people. So I generally view this
01:18:36.560 | as part of a longer arc and a natural kind of conclusion to
01:18:40.360 | this stuff. And you know, I'm not trying to be super
01:18:42.520 | pessimistic about where things are going, because I do see that
01:18:45.400 | the a lot of folks are now waking up to this reality. And
01:18:48.360 | they're like, so my anecdote is, I've had a lot of folks tell me
01:18:51.520 | in the last couple weeks, we have to be thoughtful about how
01:18:54.320 | we elect and how we govern, so that we create a more balanced system.
01:18:57.760 | So what's the end game in your framework? My
01:19:02.120 | end game is socialism. I think that's where things go,
01:19:04.600 | ultimately, long term, I think that there's a recapturing and
01:19:07.480 | redistribution of power and value. I think that there's some
01:19:10.520 | degradation of the system that erodes to looking like something
01:19:13.480 | like France. It's not like a nasty ending; it's like a slow
01:19:17.240 | whimpering, like you end up looking like France or
01:19:18.840 | something. And so that's my end game in that framework. And I
01:19:22.160 | believe strongly, and this is why we talked about this with Dalio,
01:19:24.520 | that there are political actions, decisions, voting
01:19:27.960 | mechanisms that hopefully we can put in place to stall
01:19:31.440 | out and to stop that from happening. Because I do think
01:19:34.600 | that human progress is critical for the majority
01:19:38.040 | of people that are disadvantaged. So we need to
01:19:40.320 | enable it.
01:19:40.760 | Okay, so I'm guessing you're on the "I don't want to be France"
01:19:45.120 | side.
01:19:45.760 | 100%. That's right. Okay, so, what is the "I want to be the
01:19:49.520 | Wild West, I want to be the early 19th century" position?
01:19:52.920 | What do you think that we should do in America right now? I think
01:19:56.240 | we should unwind a lot of laws. And I think we should have as a
01:19:59.320 | mandate, the objective is, how do we get more of our economy
01:20:03.440 | and more of our people in this country to be supported by and
01:20:09.600 | progress through private industry rather than public
01:20:12.240 | support. The federal government, and the role that the federal
01:20:14.680 | government has in funding industry and hiring individuals
01:20:18.920 | and pumping dollars into markets, I think has gotten so far beyond
01:20:23.960 | the pale that we, as we know, are being sloshed back and
01:20:28.280 | forth based on the decisions being made by the Federal
01:20:30.600 | Reserve. That should have never been the case. The free market
01:20:33.760 | should have always been allowed to operate. Progress should have
01:20:36.440 | always been allowed. The government should have stopped
01:20:37.920 | people from being hurt, should have stopped negative outcomes. But the
01:20:41.080 | idea that the government should come in and start to run things
01:20:43.520 | and employ people and get to the scale that it's gotten to:
01:20:46.560 | this is the largest organization in human history, the US
01:20:50.680 | federal government. It has got the highest dollar spend, the
01:20:53.280 | highest budget, of any organization in human history,
01:20:56.040 | even accounting for the Roman Empire. There should be
01:20:58.200 | accountability standards for every law, which is how do I
01:21:00.120 | know that this thing did what it was supposed to do? And how do
01:21:01.960 | I measure if it didn't? And by the way, if it doesn't, and the
01:21:04.560 | dollars aren't returned to me at a multiple, we shut that
01:21:07.320 | law down, we shut that department down, we move on.
01:21:09.800 | there should be an accountability standard for
01:21:11.520 | every dollar that's spent. But we don't describe any outcomes.
01:21:14.920 | We don't say here's the standard of measure, and here's the
01:21:17.400 | accountability. And if it doesn't work, and if
01:21:20.640 | this dollar isn't returned to us, we don't spend it anymore.
01:21:23.400 | And we don't keep doing this thing over and over again, and
01:21:25.480 | then layer something else on top and layer something else on top.
01:21:28.200 | And then you've got 10 layers of nothing accountable to anything.
01:21:30.840 | And the thing costs 10 times as much to do half as much. I am
01:21:35.800 | optimistic still, I'm always an optimist.
01:21:38.280 | Oh, make sure you put in that caveat. Good. Yeah.
01:21:42.600 | What's going right in your view? I feel like we just killed the
01:21:45.640 | last good thing.
01:21:46.640 | Yeah, what is going right? And, you know, the AI thing is what
01:21:49.640 | really pisses me off, man. I mean, this whole idea that we're
01:21:51.880 | gonna come in and tell people how to run a business.
01:21:53.640 | I hear you, it just drives me nuts. But do you think AI is
01:21:56.640 | going well right now? We have nothing to really show for it yet.
01:22:01.080 | So it's hard to say that it is. It's cute, and it's cool, but
01:22:03.880 | there's nothing really...
01:22:04.960 | I don't get the term. When I started my company, I had people
01:22:08.600 | that I called math people and statisticians. Then it was
01:22:10.880 | called data science, then it was called big data, then it was
01:22:13.880 | called ML. Now it's called AI. At the end of the day, it's
01:22:18.000 | software based algorithms that use three things, data, software
01:22:21.760 | and statistics. And the statistical tools and the
01:22:24.440 | approach to the software algorithm development have
01:22:26.520 | changed with the transformer model that was described in that
01:22:29.480 | paper. But it's the same thing we've been doing since the
01:22:32.000 | 1960s, which is creating packages of software that make
01:22:35.080 | predictions. And that do things for us. And all that's happened
01:22:38.320 | in the last two years, is we've seen a piece of software that
01:22:41.360 | can communicate well with us, based on these LLMs. And we're
01:22:44.640 | like, oh, intelligence is here. You know, the first time a
01:22:47.160 | computer in the 1960s said, "Hello, world," everyone was like,
01:22:50.240 | oh my God, we have AI now, AI is sentient. It just said, "Hello,
01:22:53.560 | world."
01:22:54.120 | Well, the same thing happened when Deep Blue, which was a
01:22:57.600 | big IBM mainframe, beat Kasparov at chess. People were like, oh,
01:23:01.560 | my God, AI is just about here. Because at one time, being great
01:23:06.120 | at chess was considered to be only a human capability. And if
01:23:10.600 | a computer could ever get that good, the theory was that it
01:23:13.880 | would have to be an AI. But we actually found out that
01:23:17.200 | computers could do chess, but that didn't make them an AI, it
01:23:20.280 | just meant that they could acquire that one narrow
01:23:22.560 | capability. I think kind of what you're saying is that computers
01:23:25.960 | now thanks to language models, have some ability to interface
01:23:29.560 | with us using language and to understand language. By itself,
01:23:33.120 | that is not artificial intelligence in the way that
01:23:35.600 | it is portrayed in science fiction movies.
01:23:37.720 | There's two key things that are happening. One is a new
01:23:40.800 | modality in human-computer interaction through chat, which
01:23:43.400 | is incredible, and opens up lots of new software applications and
01:23:46.680 | markets, which I think can be transformative and worth
01:23:48.880 | trillions of dollars, no fucking doubt. And then the other one is
01:23:51.640 | in generative tooling where you can make art and you can make
01:23:54.680 | video and audio and all this other sort of stuff. So you're
01:23:57.440 | creating digital outputs that are not necessarily text, but
01:24:00.760 | other forms that also start to represent the expectation that
01:24:04.240 | you might otherwise have. And so that is really powerful and
01:24:07.320 | opens up all these new models for media, content, exploring
01:24:10.680 | new business models, etc. So both of those are really
01:24:13.360 | important new software capabilities that have emerged.
01:24:16.280 | But there's this notion, I think, where people confuse AI as being
01:24:19.760 | this, like, we've now got humans in software that can do things.
01:24:22.800 | It's almost like back in the '60s, when the computer said,
01:24:25.280 | Hello, world. And they said the same thing.
01:24:27.160 | I think that what people are wowed by right now is that a
01:24:32.080 | computer's ability to statistically guess has gotten
01:24:35.680 | several orders of magnitude better than it ever was before.
01:24:39.040 | And so these guesses seem human and lifelike. But you'll find
01:24:43.000 | the edge cases where this stuff fails. And we'll be asking
01:24:45.360 | ourselves yet again in a few years what the next leap is.
01:24:49.920 | But the thing that is different is that there is a level of
01:24:55.640 | investment (and Nick, I tweeted this, you can just
01:24:59.280 | show the math here), multiplied
01:25:02.640 | by a level of improvement in the underlying hardware, multiplied
01:25:06.680 | by a level of improvement in the underlying models, that's
01:25:10.040 | creating a 400x improvement from the baseline every year.
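Chamath doesn't break out the individual factors on air; the only number from the episode is the 400x product, so the decomposition below is purely illustrative of how multiplicative gains compound:

```python
# Illustrative decomposition of the "400x per year" claim above.
# The three factor values are invented; only the product is quoted.
investment_growth = 10.0  # hypothetical: 10x more capital deployed
hardware_gain = 5.0       # hypothetical: 5x better price/performance
model_gain = 8.0          # hypothetical: 8x better model efficiency

yearly_multiple = investment_growth * hardware_gain * model_gain
print(yearly_multiple)       # 400.0
print(yearly_multiple ** 3)  # 64,000,000.0 -- three compounded years
```

Whatever the true factor split, it's the compounding that makes the "efficient frontier" he goes on to describe arrive so fast.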
01:25:16.400 | And so the thing that I think is different is that we are taking
01:25:21.680 | this statistical guessing, if you want to just call it that.
01:25:25.880 | And we're getting exponentially good at it. So there is an
01:25:30.240 | efficient frontier, at which point you're like, Oh, my gosh,
01:25:32.720 | this is as good as certain humans, you know, maybe not the
01:25:36.520 | smartest of the smart humans, but certain humans. And I think
01:25:39.200 | that that's what's really, really crazy and intense right
01:25:42.200 | now. So I can understand why people are like, we need to get
01:25:45.280 | a handle on this. But it may kill the golden goose. Although
01:25:51.560 | I would just say there is no golden goose yet. There's just
01:25:54.480 | a lot of really interesting stuff that people can say is
01:25:57.400 | novel, and it's inspiring to see. But by no means has there
01:26:02.600 | been just a proven use case that is 10x in productivity. Yeah,
01:26:06.720 | that hasn't happened yet.
01:26:07.680 | That's the reason to wait: this whole thing just seems very
01:26:10.920 | premature. Let the technology develop a little bit. Let the
01:26:15.000 | problems become a little bit clearer, more manifest. Let the
01:26:17.280 | cake bake. Let the problems emerge
01:26:21.720 | in a way where, instead of guessing about them, we can
01:26:24.840 | actually see them, and also allow there to be time for
01:26:28.680 | solutions to be developed by the industry on its own. And we
01:26:34.720 | don't need Kamala Harris's help or guidance to do it.
01:26:36.800 | Okay, there you have it, folks. Thank you for joining us for
01:26:42.200 | another episode of the All-In pod. We hope that you walk away
01:26:46.200 | with the utmost optimism and enthusiasm for the world around
01:26:49.680 | you. There's awe and wonder outside; go take a walk and enjoy it.
01:26:53.000 | There's so much more than what is on Twitter. I leave everyone
01:26:56.760 | with blessings and love. Gentlemen, anything else?
01:27:01.560 | Final words, David, you better fucking be there.
01:27:04.920 | Oh, for poker. Yeah, I will. I will. I'm coming. Last week,
01:27:09.240 | Sacks RSVPs to poker, then texts. He's supposed to be at
01:27:12.800 | dinner at seven o'clock, texts at 7:30, "I'm on my way." 11 o'clock
01:27:17.000 | rolls around, no Sacks. No idea what happened to you. You
01:27:20.720 | disappear into like the Bermuda Triangle of the Bay Area. I
01:27:24.320 | don't know where you go. But you just disappear. He gets in an
01:27:28.040 | Uber, has it park at the airport, and he just plays chess.
01:27:31.560 | Alright, boys. Bye bye bye bye bye.
01:27:37.560 | Rain Man, David
01:27:43.160 | Sacks. We open sourced it to the fans and they've just gone
01:27:49.520 | crazy with it.
01:27:49.920 | Love you, Westy.
01:27:50.920 | I'm the queen of quinoa.
01:27:51.920 | Besties are gone.
01:28:00.640 | That's my dog taking a notice in your driveway.
01:28:04.160 | We should all just get a room and just have one big huge orgy,
01:28:12.520 | because it's all, like, this sexual tension that they
01:28:15.480 | just need to release.
01:28:18.640 | Wet your beak.
01:28:21.440 | We need to get merch.
01:28:23.840 | Besties are back.
01:28:25.460 | (upbeat music)
01:28:28.040 | (upbeat music)
01:28:30.620 | ♪ I'm going all in ♪