
E165: Vision Pro: use or lose? Meta vs Snap, SaaS recovery, AI investing, rolling real estate crisis


Chapters

0:00 Bestie intros!
2:08 Apple Vision Pro breakdown
20:46 Meta vs Snap: god-king CEO, dependent on ad revenue, drastically different performance
32:53 Positive signals indicating a big SaaS bounceback
41:06 VCs are split into three camps on how to approach AI investing
72:16 Rolling real estate crisis continues

Whisper Transcript

00:00:00.000 | All right, Freeberg is back. Welcome back to the All-In Podcast, episode 160-something, your favorite
00:00:05.280 | podcast in the world, yada, yada, yada. With me again, the chairman dictator, Chamath
00:00:09.680 | Palihapitiya. The Rain Man, yeah, definitely, David Sacks is here and back from his time in the
00:00:17.120 | metaverse. We found him somewhere out in space in the solar system. In his Apple goggles,
00:00:23.520 | your favorite, Sultan of science. David freeberg is back from the metaverse.
00:00:28.960 | I missed you guys. Welcome home. Thanks for having me.
00:00:32.240 | What did you discover when you went to Uranus in Google Glass? Sorry,
00:00:37.440 | I actually used the Apple Vision Pro. I ordered them. I ordered them and I walked by the
00:00:42.640 | Apple Store and I was going to go in and try them. And there were so many lunatics in there. I was
00:00:47.200 | like, yeah, I'm not doing it. But I ordered them. Did you actually use them? I ordered one
00:00:51.760 | online to be delivered and it was like delayed by a month. So I went down to the Apple Store
00:00:56.320 | and picked one up. Okay, and my kids cannot stop using it. Really? I went down to the Apple Store,
00:01:02.160 | but I got cleaned out by the thieves who stole everything. So
00:01:05.200 | let your winners ride. Rain Man, David Sacks.
00:01:24.640 | That was crazy. That was crazy. We'll put the video in here. To the idiots who are robbing Apple
00:01:29.120 | Stores: all the devices get bricked when you steal them. And they all have GPS in them.
00:01:33.360 | Have you tried it, Chamath? No, I was too busy working out, making love, and winning.
00:01:37.840 | Oh, okay. So you were you were making sweet love. You were watching your portfolio go up.
00:01:44.240 | And you were just generally winning. Got it. Got it. Yeah. Yeah. So freeberg the rest of us were
00:01:49.040 | being men in the world accomplishing stuff. But but do tell us about your time in the metaverse.
00:01:54.560 | Do those goggles come with a lifetime prescription of SSRIs?
00:01:58.800 | You guys sound like one of these, like, tech journalists that are actually anti-tech people.
00:02:03.840 | You guys are acting like journalists talking about a computing platform. I remember when the iPad
00:02:09.840 | came out and everyone poo-pooed the iPad. I thought it was stupid. I tried to use it. I couldn't get
00:02:13.440 | any value out of it. And in 2010 or 2011, when it came out, we started using it with our
00:02:20.240 | sales team selling to farmers. And we gave every sales guy an iPad and they went out in the field
00:02:25.280 | with 3g. And they were able to close sales in the field meeting with farmers which had never been
00:02:30.320 | done before. Usually you have to get a farmer to come into an office. How many iPads? Selling what product?
00:02:33.600 | Oh, so we had like... no, they were selling Climate.com software. We had dozens of these
00:02:39.760 | sales guys. We gave them out to our sales agents as well. The independent agents, they started
00:02:43.440 | using them. And it was like a real game changer in how sales was done in agriculture. And I had
00:02:48.320 | never even contemplated that when I first used the iPad. So let's let's get to brass tacks here.
00:02:52.640 | What is the killer app? What do you think in the next five years people are going to be doing with
00:02:56.880 | this thing on a daily basis? Is there a daily use case? I'll say a couple things. One is like,
00:03:01.440 | I feel the same way I did about the iPad, which is I don't know what it is today. But I can tell
00:03:05.120 | that there's something there. And I'll give you an example of something I thought about. First of
00:03:08.480 | all, the AR is game changing. Okay, if you've used like the meta, yeah, the Oculus Quest,
00:03:14.480 | it like makes me super dizzy, makes my head hurt, makes my eyes hurt, like you're super
00:03:18.400 | disoriented. What Apple solved is that you're like still in reality. But then you get to interact
00:03:24.480 | with these three dimensional kind of objects in reality. And it's like really well done.
00:03:28.560 | It's definitely v1. And there's going to be incredible changes in the next couple generations.
00:03:33.040 | But it gets rid of all that dizziness, disconnected kind of stuff that happens
00:03:36.800 | with the the full VR experience, which I thought was really incredible. Then last week, and I'm
00:03:42.000 | sorry, I missed the show. We have a facility with my company in North Carolina, we have this giant
00:03:46.480 | greenhouse facility, and I was doing meetings with farmers and stuff. I go to the greenhouse facility.
00:03:50.880 | And there's so much work that the greenhouse techs and lab techs are doing, where they're using an
00:03:56.800 | iPhone and a barcode scanner and a printer. And they're holding all these pieces of equipment,
00:04:01.600 | scanning the QR codes on flowers, taking the pollen out, putting it in the next flower,
00:04:06.400 | training each other how to do it. And I was like, I put this Apple Vision Pro on. And I was like,
00:04:11.120 | man, all the apps and all the tools that we had all these different pieces for that was taking
00:04:15.680 | people tons of time, image collection, data collection, could all just be done, streamlined
00:04:21.600 | while you're working. You could have a task report. Yeah, yeah, you have a task list.
00:04:25.440 | Cameras are taking images in the middle of it, QR codes are automatically scanned, data is being ingested.
00:04:31.920 | The task list is kind of, you know, giving folks next steps. They could listen to music while
00:04:36.080 | they're working. And I realized for that job, and I met with all the team out there and spend time
00:04:40.400 | with them. And I actually did the work that they do to get a better sense for the workflow. And I
00:04:44.800 | was like, man, literally, every aspect of this job will be massively improved and productivity will
00:04:48.800 | go up by 10x with these goggles. Will it happen in the next couple of weeks or months? I don't
00:04:52.720 | know. But my engineering team is looking into it. Can we take it? Can we use some software?
00:04:56.800 | Can we build some software? And can we put this on folks to give them a better work experience
00:05:01.440 | to increase our productivity to do automated data capture. So I don't know exactly where it goes.
00:05:05.840 | But I could start to see how this can become a more ubiquitous part of a workforce setting and
00:05:10.080 | not just be a video game and movie tool for consumers. So I'm I'm reasonably optimistic
00:05:14.800 | about where this goes. It's definitely v1. I feel like it's the iPad days where no one's really
00:05:18.960 | sure where the applications are. But yeah, enterprise applications. Unbelievable makes
00:05:23.360 | total sense. And also training, training, right? Assembly line workforce, warehouse workers,
00:05:28.240 | where you're getting real time kind of task updates, data is being ingested,
00:05:32.720 | all in real time. And, and by the way, the other thing I'll say is training is incredible. There's
00:05:36.640 | spatial video recording on it. So it looks like you're living through the experience that someone
00:05:41.040 | else had. So you can train someone how to do a difficult task. And rather than have a human go
00:05:45.840 | spend hours training a workforce, the workforce can be trained by the goggles in a way that you
00:05:50.320 | cannot do with a two-dimensional video today. So I don't know, I'm pretty optimistic. Very strange
00:05:55.200 | days, right? I don't know. You're a fan? I remember Strange Days. Totally. What's going to
00:05:59.920 | happen first here? Are humans going to become more like robots by putting these on and doing this
00:06:05.440 | factory work? Or is it Elon with Optimus, and Humane I think is the other one, there's a
00:06:10.000 | couple of other people building general-use robots, Figure. Yeah. Which one wins the day?
00:06:15.600 | Is it going to be humans having eyes and, you know, data collection like robots, or robots having
00:06:22.480 | appendages like humans? Well, let me let me put two ideas together and see what you think of this
00:06:29.120 | argument. If you think about the generation of human beings that have as close to any other
00:06:37.680 | generation before it lived in a totally immersive world, I would say the best representation of that
00:06:43.840 | are current teenagers and 20 year old people, and maybe at the upper edge, the early 30s people.
00:06:50.960 | And why is that you know, they've lived inside of social media their entire lives,
00:06:56.080 | they've lived inside of immersive video games their entire lives. But the question is,
00:06:59.920 | are they better off and happier as far as we know, from an evolutionary perspective,
00:07:05.040 | and I would tell you that the answer is a is a huge gaping No. So if you believe that the
00:07:14.320 | rise in depression, the rise in suicide, the dependency on drugs, the dependency on SSRIs,
00:07:22.560 | the sexual promiscuity, the lack of marriage, the lack of kids, if all of those things are in some
00:07:29.120 | ways, a correlated byproduct, let's not say it's causal, right? Let's just say it's a correlated
00:07:35.280 | byproduct of this entire immersive, almost exclusionary detached world that these folks
00:07:42.240 | have grown up in. Taking that to the limit. I'm just going to put out there may not be the solution
00:07:48.640 | to our problems. And so I guess the more directed answer to your question is, I would hope that the
00:07:54.480 | latter wins, so that we take these goggles off and actually learn how to talk to each other and
00:07:58.800 | look each other in the eyes. Get married and have children because I think that's actually better
00:08:03.680 | for the world. And I would probably say that it's almost better for the world than a 10xing
00:08:09.520 | of productivity. Interesting. And then you see the correlation to cancer and disease that is
00:08:15.360 | disproportionately higher amongst these young people. So I think at some point we have to ask
00:08:20.080 | ourselves, what is structurally happening in the lives of the 16, you know, 15 to 31 year olds,
00:08:27.680 | that is just so poor in terms of outcomes. And if you look at some of the environmental variables
00:08:34.320 | that they live in, and then take some of those and take them to the limit, I think that there's
00:08:38.640 | a reasonable argument that their lives get worse before that gets better. Yeah, I mean, the amount
00:08:43.120 | of time you spend on social media is correlated with depression, social media, I'm just saying,
00:08:47.760 | just this immersive, like, detach from the world and live through a microphone and glasses taken
00:08:54.240 | to the limit. I'm not sure is the solution to these kids feeling detached, lonely, isolated.
00:09:00.480 | Yeah, I mean, it correlates all of these things that we're seeing in this younger generation
00:09:05.760 | correlates with the introduction. So could it be a good productivity device? Yes, of course. I hope
00:09:09.600 | it's a good productivity device. Yes. But if we try to make it the panacea for anything and
00:09:14.000 | everything, I think we're going to, we're going to compound the systemic issues that these young
00:09:20.080 | people have. And I suspect, on the margin, if you were going to bet, all of these things that we see
00:09:25.840 | in these young people today will get worse, as a byproduct of technology, not necessarily get
00:09:30.320 | better. So if you can take a different path, like Optimus, or the Figure AI robots, where that
00:09:37.280 | work is done, at least we have a different problem, probably maybe even more existential
00:09:42.560 | abundance, but a different problem, which is now how do you find purpose, but maybe you can find
00:09:46.800 | purpose through connection and the types of things that humans have been bred over billions of years
00:09:51.680 | to actually optimize for, okay? Sacks, I remember when you were starting Craft, you fired up like
00:10:00.800 | a group for VR, and you got pretty heavy into you made a couple of small bets. I remember that I
00:10:06.080 | don't think any of it worked out really, you could tell me if I'm wrong here, but you got in a little
00:10:10.000 | bit earlier there, maybe you talk about the business case for this. And has that changed?
00:10:13.760 | Because you believed I believed a lot of folks thought, hey, maybe this is the time when Zuck
00:10:19.600 | really start, you know, had bought Oculus, and they started putting out some good product,
00:10:23.120 | seemed like it was a false start. Is this the actual starting pistol? And is this the start of
00:10:28.000 | the VR AR adoption race? I don't think we're quite there yet. Okay. We've been talking about
00:10:35.840 | VR being a thing for over a decade. Yeah, no more like 30. Remember the Nintendo VR stuff?
00:10:42.880 | It's like always on the verge of happening. I think that the big complaint about the Apple
00:10:46.720 | device is has a lot of capability, but it's still a pretty huge device to wear on your forehead.
00:10:51.200 | This is not really going to be comfortable enough to be something that people want to use all the
00:10:56.960 | time. I mean, there's also a question of use cases, but they're getting there with the use cases.
00:11:04.480 | In any event, I do think that Apple Vision Pro is, like I said last week, it's a useful prototype
00:11:11.920 | or proof of concept, and it will get better. So I'm glad they did it. Because I think you need
00:11:18.080 | to start somewhere and then just keep iterating. But eventually for this to, I think, really take
00:11:23.280 | off, you need to shrink the form factor, miniaturize the technology, just every version
00:11:30.240 | of it, make it simpler, lighter, easier to use. Yeah. I mean, eventually it'll feel like sunglasses.
00:11:36.960 | And so that is, I guess, if they become like regular glasses, I think we all agree it becomes
00:11:41.360 | a nice computer. I feel like it's pretty damn comfortable. I don't know if you guys haven't
00:11:45.600 | really used it, but that's what I've heard. That's a surprise. It's totally unlike
00:11:49.680 | any other headset I've ever worn. They did an incredible job designing it. Does it feel like ski
00:11:56.160 | goggles? It doesn't feel heavy. It doesn't put pressure on you compared to ski goggles.
00:12:01.680 | If you were wearing ski goggles, it's less constricting than ski goggles. It's more
00:12:05.040 | comfortable. It like floats on you a little bit. They did a great job with this cushioning device
00:12:09.280 | they built. And the band you put on it feels very natural. It's Apple design, right? It's like a
00:12:14.000 | really well designed product. That's unlike anything else you've ever tried. I've always
00:12:18.000 | felt like when Apple comes into the race, that's the starter's pistol. And I think this is it
00:12:22.000 | because I've heard the same thing from everybody. You have to try it. It feels like different than
00:12:27.680 | Oculus and some of those versions that came out previously. And they have the app ecosystem. And
00:12:34.000 | I would not discount that when you know the ability to monetize the app ecosystem and have
00:12:38.560 | all the people who are already building the Calm app, the Uber app, whatever, Notion, you know,
00:12:43.840 | all the stuff that people use and love Spotify, YouTube, and then ported over here, Fortnite,
00:12:49.280 | whatever. I think that's going to be the magic. And the statistics are not lying here. I mean,
00:12:54.480 | this is unbelievable. They've sold already 200,000 units, which doesn't seem like a lot.
00:12:59.760 | But for v1, that is a lot. And they're going to sell a half million this year. It's gonna be close
00:13:04.080 | to like... That's not that many. Well, it's a couple of billion. Meta sells more. They do. Yeah. But,
00:13:10.080 | you know, this is $4,000. This isn't $500. So to sell that many of a $4,000 device is incredible
00:13:17.120 | for a proof of concept. It's not like a regular Apple product that is a mass market device that
00:13:24.400 | 10s or 100s of millions of people are going to buy, but it puts them on a path. Yeah,
00:13:28.240 | to where they can iterate and keep making it better. See, I think, and this is I guess,
00:13:32.080 | what I'd ask free bird, do you compare this to buying a MacBook Pro, buying an iPhone,
00:13:37.920 | or buying the Oculus, you know, whatever their, you know, $500 unit is, because everybody I see
00:13:43.600 | talking about online is comparing it to the purchase of a laptop, because of the desktop,
00:13:48.480 | and you can kind of do your coding or surf the web and do all that. Where do you put this? Is
00:13:52.560 | it buying a TV? Is it buying a laptop? Is it buying a smartphone? What would you say?
00:13:55.840 | You have to have a keyboard to be really productive on it. If you're going to use it for writing
00:14:00.160 | purposes or coding purposes, so it doesn't really work with just the headset, but you could do that.
00:14:05.520 | Yeah, it's definitely like buying a new computing device. But people felt the same way about the
00:14:10.320 | iPad. I get go back to 2010, when the iPad came out. And everyone was like, who is it for? It's
00:14:16.320 | a whole new computer. Who's it for? You already have a phone, you already have a computer? Why
00:14:20.160 | do you need an iPad? And then they sell 10s of millions a quarter now. Yeah, so I really I as I
00:14:25.200 | do the math on this, I was just kind of doing some back of the envelope stuff. I think they're going
00:14:28.000 | to sell $100 billion of Apple Vision pros, not this version, but this version plus the next version,
00:14:33.920 | probably over the next. I would guess for them to get to 100 billion in sales, it'll take them
00:14:40.960 | less than five years. I think they're gonna run the table on everybody. I think they're gonna own
00:14:44.800 | the entire thing. Everyone's underestimating this as a new computing platform. And once these
00:14:49.360 | applications, particularly in the enterprise setting, start to kick in, and I will say that
00:14:53.200 | the movie watching experience is way better than watching on a TV in your living room.
00:14:57.280 | My kids cannot stop asking me to use the goggles to watch instead of an iPad or TV.
00:15:01.280 | Because you see 3d like all Pixar movies are natively 3d. And so you got the Disney plus app
00:15:07.840 | on there, you watch a Pixar movie, and you're watching in 3d, the kids are blown away. So I
00:15:12.720 | think we're all going to be surprised by how this... Disney is all in on it. Remember when our parents
00:15:16.720 | told us not to sit too close to the TV? Now we're just strapping the thing to our face. Yeah, I had
00:15:23.440 | the most Silicon Valley moment ever. I go to buy a cup of coffee. I was going for a little walk. I
00:15:28.880 | see Blue Bottle, I'm like, oh, you know, I'll get myself a mocha. You know, I lost a little bit
00:15:31.520 | of weight. I'm going to treat myself. $9 for a mocha. Number one, that in the city tilted me.
00:15:37.440 | Now $9, from $8, and then I gave a dollar tip, and then I felt cheap giving a dollar. $8.99 for a
00:15:44.960 | carton of Clover milk, all organic. I mean, you can make infinite lattes at home. Anyway, where did
00:15:51.600 | you go for your $9 mocha? I was I'm in Palo Alto right now because we lost the bottle. Yeah, I
00:15:58.240 | posted this. I'm like $9 What am I doing? You know, I just I felt like buying a chocolate bar
00:16:03.040 | and the stain your dirty lips left on the cup. Oh my god. Look at you know what you're a little
00:16:08.800 | obsessed with my lips. Take it easy there. So anyway, then there's a kid in the place wearing
00:16:15.280 | the goggles with the keyboard. No, stop pounding. He's getting work done. This kid was doing work.
00:16:21.280 | And I tell you the truth, he was putting in the hours. No one can look at your
00:16:25.760 | laptop. No one. That's what I love about it: you work without anyone seeing what you're doing.
00:16:30.640 | This kid had four desktops up. This guy was probably on Pornhub, Spotify, writing code.
00:16:36.720 | How many words did this person say to another human being while you were there?
00:16:39.760 | No zero. And you know what, when they're on a laptop, they're the same. What's the difference?
00:16:43.600 | He's coding. Nobody batted an eye. And I think this is gonna... they're gonna run the table on this.
00:16:48.240 | I think it's a hundred billion in sales. In 100 years? Yeah, I take the over. Yeah, I take the over.
00:16:53.680 | What do you got, the over or the under? Because even if they keep it at three grand, they got to sell
00:16:57.920 | 30 million units to get to 100 billion. They're gonna make a lot of money on the App Store,
00:17:02.000 | too. By the way, you guys are right that it's going to be successful in terms of revenue.
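(A quick back-of-the-envelope sketch of the unit math behind the "$100 billion" and "30 million units" figures above; the price points are just the numbers tossed around in the conversation, not Apple guidance.)

```python
# Units Apple would need to sell to reach $100B in cumulative Vision Pro
# revenue at a few assumed price points (hardware only, ignoring App Store revenue).
TARGET_REVENUE = 100e9  # the $100 billion figure floated above

for price in (3_500, 3_000, 2_000):  # launch price plus two hypothetical cuts
    units = TARGET_REVENUE / price
    print(f"${price:,}/unit -> {units / 1e6:.0f}M units")

# $3,500/unit -> 29M units
# $3,000/unit -> 33M units   (roughly the "30 million units" mentioned above)
# $2,000/unit -> 50M units
```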
00:17:05.840 | What I'm asking is a more societal question is, do you guys actually think it's better?
00:17:10.160 | No, I don't want my kids in this all day now. And I can see this becoming super
00:17:13.040 | addicting. Hey, Freeberg, can I buy three for your kids? Just have them walk around with them?
00:17:16.960 | No, I have a no house rule as well. But wait a minute. Hold on. What about productivity? Freeberg?
00:17:21.440 | My kids aren't trying to be productive. They're using it to burn time. It's called childhood.
00:17:26.160 | You're not supposed to have a productive childhood. It's supposed to be not productive.
00:17:29.120 | You guys understand that at some point, you guys will be the only six kids whose parents
00:17:35.040 | haven't given them this stupid thing to put on their face. Now this is going to be time restricted.
00:17:39.680 | I have a no iPad, no phone. No, like, I let them use the headset. But it's so good for them. So
00:17:45.120 | good. No, no, it burns their brains away. Burns their brain. Oh, I mean, I totally
00:17:50.880 | agree with you. Social interaction. The loss of our ability to communicate is human. It's critical
00:17:56.000 | and it's a fail point. I do think that there are applications where these things create great
00:17:59.600 | unlocks. I think this is an enterprise device. Can you imagine giving the sales team on the
00:18:03.920 | farms to go there, they can take off their sweaty headset when the sun is shining and then give it
00:18:08.720 | to the farmer to put on and then he can put it on and feel the sweat and the headband will be wet.
00:18:14.000 | It doesn't by the way, it's a very personal device. In order to log in, you know, it does like a
00:18:20.000 | eye scan. Or you have to have like a lock in like login like you do with your phone, but then you
00:18:24.480 | got to reset the eye because it automatically sets the eye position. So when you put on someone else's
00:18:29.120 | headset, you got to reset the eye position, it's a whole thing. So it's not a transferable device. It's a
00:18:33.360 | very personal computing, you know, kind of thing. So I don't think it's going to be the same as like
00:18:37.600 | an iPad or a phone. It's a very different kind of thing. I don't know what it's going to look like.
00:18:40.560 | Yeah, I don't know. I say next week, we do the show inside of these, or at least me and you,
00:18:44.160 | Freeberg, will be in there. There's an app, there's an avatar thing. So what it does,
00:18:51.600 | it scans your face while you're talking, so you can see each other as the avatar. Yeah,
00:18:56.320 | let's do it. It'd be hilarious. I had a moment this week in parenting. I had a moment this week
00:19:01.280 | where I told one of my children that when I send a text message, I expect an immediate response.
00:19:07.760 | Otherwise, I am going to cancel that child's phone and take it away. And then separately,
00:19:13.840 | when they respond, it has to be in structured, well thought out, perfectly formatted English.
00:19:20.080 | And then then third, I said, every single email I see from you interacting with your teachers or
00:19:25.600 | anybody else that's there to help you needs to be incredibly well written and formatted. And if I see
00:19:31.360 | garbage English, I'm going to take your phone away.
00:19:34.080 | Okay, so you don't want them on their phones, but they have to respond right away.
00:19:38.720 | Well, they have very strict rules about they can use they're there for literally,
00:19:41.840 | all they can do is communicate like they can use iMessage. But it is shocking to me that
00:19:47.840 | despite the lack of games that they have, or whatever, how poor they are in being able to
00:19:54.320 | communicate, and what little access to devices they have, have already made them orders of
00:20:02.720 | magnitude less able to communicate than frankly, I was able to when I was their age. And so I can
00:20:07.840 | just imagine what happens when you become even more ensconced in something that you can cocoon
00:20:12.640 | yourself with and not have to interact with the rest. I don't disagree with you. I don't disagree
00:20:15.920 | with you. Not to say that it's not going to be a revenue generator. But I think that you could just
00:20:20.640 | as easily, frankly, instead of impacting Apple's revenues, you can probably go long the makers of
00:20:26.960 | SSRIs. Here comes a spread trade, but Bumble and Tinder and you'll get to the same place economically.
00:20:36.080 | All right. All right, here we go. We got a lot on the what a great leap forward for humanity. I
00:20:40.880 | can't wait. So I just see this as a laptop replacement. Okay. I wanted to talk a little
00:20:47.360 | bit about what apparently is going to be the spread trade of the last year. Meta has continued
00:20:53.040 | their unbelievable run and Snap dropped by 30%. Here's a chart for y'all of Snap versus Meta. You
00:21:01.040 | can take a quick look at it here. And just for context, both companies did great during COVID
00:21:06.720 | and ZIRP, hit all-time highs in 2021. But they both got crushed due to the ad spend pullback,
00:21:11.520 | obviously, but then Meta started to get less focused on their headsets and more focused on AI,
00:21:16.880 | started doing their reduction in headcount, 22% year over year, from 86,000 to 67,000 as of last
00:21:25.200 | quarter for meta. And their quarterly profits have increased to an all time high of $14 billion.
00:21:32.000 | That's profits folks in q4 for meta all time high for the stock price $470 a share $1.2 trillion
00:21:40.320 | market cap. Snap is down 60% from its closing price on its IPO day in 2017. Let me just jump to
00:21:47.440 | Chamath before I get into more charts and everything. You pointed out, Chamath, and maybe
00:21:51.200 | you could explain to the audience, just how ridiculous the voting rights were, and the
00:21:57.760 | massive dependence that the Snap team and the executives had on stock-based comp. Two issues
00:22:05.360 | for you, Chamath. Well, I mean, I think I said it before, I think that case studies have been
00:22:11.360 | written about how tilted the governance is in snap. I think the point is that they basically
00:22:17.600 | have infinite to zero voting power over common shareholders. So there's no real feedback loop.
00:22:24.320 | And I think that that has probably adversely affected the types of people that traffic in
00:22:31.280 | their stock. Now, look, activists and short sellers sometimes have a very bad reputation.
00:22:39.840 | But if you steal man their side of it, what they are there to do is to shine a light
00:22:45.840 | on inefficiency and in the short seller case, sometimes in propriety, but it should all lead
00:22:52.320 | to companies being better run. Right? I think meta had this example where they had a really
00:22:58.720 | big hiccup. And everybody including us sort of pointed out the levels of spend that they were
00:23:06.000 | making really didn't make any sense. I think we had a chart that compared the level of spend of
00:23:10.080 | meta second only to like the spaceship program, right? Just like bonkers, an enormous amount of
00:23:16.800 | money. And look, Mark got the message. He heard it loud and clear. I think he got fed up with
00:23:23.600 | whatever was going on there and he fixed it and it's in the numbers. Now I don't know snap because
00:23:30.000 | to be honest with you, I've never taken more than one second to look at that company. And the reason
00:23:36.000 | is, there's just zero ability for me to have any useful say. So I've never honestly looked at its
00:23:41.600 | performance. I've never studied a single characteristic. I've never trended it. And
00:23:46.880 | I think the point is that I am probably where a lot of other reasonably smart folks who could give
00:23:53.200 | a reasoned opinion on how to make it better land. And part of the reason is because there is no
00:24:00.240 | feedback loop that matters. Yeah. And when you know that, why would you waste your time, at least
00:24:05.600 | when there are other options, right? There are other options, and Meta was another one, you know,
00:24:09.200 | you can write a letter, it gets picked up on CNBC and Bloomberg and whatever. And all of a sudden,
00:24:15.840 | they kind of pay attention. And I think and you look at Disney, Nelson Peltz goes and gets Ike
00:24:20.800 | Perlmutter shares, buy some more takes a larger position. Yeah, we'll see whether that fixes
00:24:25.760 | itself. The point is that in all of these other cases, people are investing the time because
00:24:31.040 | they think that there's even a small shred of a chance that the company listens. But if you
00:24:36.880 | literally have no say, you couldn't even do a proxy, you couldn't vote the shares,
00:24:42.160 | why would you bother? And I think that's more of an example where maybe there is... so I don't even
00:24:47.600 | know why Snap did poorly. And again, I'm not going to really take the time because it's like,
00:24:52.560 | why bother taking the time? Sacks, should they unwind this, like, no-voting common shares,
00:24:58.080 | super-voting shares nonsense? And should this go away as a concept in the stock market?
00:25:03.840 | Well, I mean, Facebook, or meta has a pretty similar concept. I mean, I guess Zuckerberg has
00:25:10.800 | 60% voting control, whereas Evan Spiegel is 99%. So snap is more egregious. The difference is that
00:25:18.640 | Zuckerberg is listening, and Spiegel is not. The reason why snap is doing poorly is not because
00:25:26.320 | its revenue has deteriorated. So I looked it up, or rather, I put this past ChatGPT for their key metrics. So
00:25:34.080 | assuming GPT is not hallucinating, if you compare 2021 to 2023, their total revenue went up from
00:25:41.840 | 4.1 to 4.5 billion. And gross profit went from call it 2.4 to 2.5 billion. So not a huge increase,
00:25:52.320 | but revenue and gross profit were slightly up. But if you look at operating expenses,
00:25:57.120 | they went from 3 billion to 4 billion a year. And that is why their operating income or operating
00:26:03.920 | loss went from a $700 million loss to $1.4 billion loss in two years. So that's the source of the
00:26:11.840 | problem is that they increase their operating expense by a billion dollars a year from 2021
00:26:19.600 | to 2023. Yeah, pretty simple. They seem like they're the last ones to get the memo. Yeah,
00:26:24.560 | they were the last ones to get the memo and just finish the point. So you saw that
00:26:27.920 | a few days ahead of this quarterly announcement, where their stock got crushed, they put out a
00:26:34.320 | press release saying they're going to cut their headcount 10%. It's too little too late. Yeah,
00:26:39.600 | they knew, right? They knew they had a problem. So they released the press release saying,
00:26:44.240 | "Oh, we're going to cut." Well, you should have done what Zuckerberg did. Zuckerberg did a 20%
00:26:50.080 | cut last year, he got serious, he got lean and fit. And instead, these guys held out, did nothing.
00:26:57.840 | Then when they know that the market's going to crush them, they put out this lame announcement
00:27:02.000 | 10%. No, not 10%. Really, if you just want to get back to where you were two years ago,
00:27:08.880 | in terms of operating expense, you need a 25% reduction. Yeah, yeah. But it's more than that.
00:27:14.320 | If you look at the numbers, let's see, operating cash flow was 165 million for Snap for the quarter.
00:27:19.680 | So their operations generated 165 million of profit. But for the entire year, because they
00:27:25.440 | lost money in the quarters prior, they generated free cash flow of only $35 million. So the business,
00:27:32.240 | net, produced $35 million of incremental cash. You know how stock-based comp accounting works:
00:27:38.160 | the charge happens when it vests. So this is what employees are vesting. During the year of 2023,
00:27:44.320 | employees vested $1.3 billion of stock-based comp. So that means new shares or options were issued,
00:27:50.560 | and on an accounting basis, the options are valued using Black-Scholes, and the shares are
00:27:54.480 | valued based on the share price. So they issued 1.3 billion of stock-based comp. So they generated
00:27:59.120 | 35 million of free cash, and they used $1.3 billion to compensate employees beyond their
00:28:05.040 | salaries. So that means that they paid employees 40 times the free cash flow that was generated
00:28:17.680 | of this company. So the enterprise value of the company is $15 billion. 10% of that was issued to
00:28:23.760 | employees to compensate them. Now let me give you the story of another city, meta. And by the way,
00:28:28.720 | snaps share count, because they issued all the stock, the number of shares outstanding increased
00:28:33.760 | by 4% during the year. During the year metas number of shares outstanding decreased by half a percent
00:28:40.560 | because they use cash to go and buy back stock so they were able to reduce the shares outstanding.
00:28:45.200 | Now, as you guys talked about, Meta cut employee count by 22%. And Snap cut employee headcount by
00:28:50.960 | 3% during the year. But here's the crazy difference in performance. The stock based comp expense
00:28:56.800 | for meta during that year was about $14 billion that vested that year, that company generated
00:29:03.760 | 71 billion of operating cash flow. So while while snap gave employees 40 times the free cash flow,
00:29:11.520 | Meta gave employees, you know, about 20% of the free cash flow. And then
00:29:17.200 | Meta went around and they used some of that extra cash to buy back $20 billion of stock. So they
00:29:21.760 | bought back more shares than what the employees were issued that year. So it shows such
00:29:26.800 | a difference in looking out for shareholders. So if I'm an investor, and by the way, meta is
00:29:32.400 | trading at like 25 times free cash flow, which is not a crazy multiple, given all the new businesses
00:29:36.640 | that they have in Llama 2 and the progression to cloud and other things that they might do.
00:29:40.800 | If I'm looking at those two businesses as a shareholder, you got this guy that controls the
00:29:44.720 | whole stock. He's giving employees a billion three of shares a year, when he's only making
00:29:49.680 | $30 million of free cash flow a year. And then the other guy is issuing $14 billion of shares,
00:29:55.440 | buying them all back. And he's making 70 billion of free cash flow a year. I don't know, it's very
00:29:59.600 | hard to decide which one you go. Spiegel brought it up in an interview, I saw him in a lot of the
00:30:05.120 | layoffs were top heavy. So he got rid of a lot of the top people who had these huge comp packages.
00:30:11.120 | And then what I'm hearing from a lot of executives is cutting these highly stock comp executives,
00:30:18.480 | who are, you know, also have big cash comp, cutting them, putting lieutenants in charge,
00:30:23.040 | and then moving more jobs to other locations where people don't expect stock based comp,
00:30:28.960 | you know, if you're in India, or you're in South America, whatever, you know,
00:30:32.960 | stock based comp is not like the obsession it is here. So as everybody optimizes these businesses,
00:30:38.960 | I mean, Facebook, even 5000 employees. So they announced roughly 500 job cuts out of what 5500
00:30:47.360 | employees. That's crazy. I mean, should that company be operating with 2000 employees?
00:30:54.560 | It's good question. I cut the number of Twitter employees from 8000 to 1500.
00:30:59.040 | When you look at the number of apps that they're running, and the number of products that they're
00:31:01.840 | running compared to meta, right meta has far more apps, far more infrastructure,
00:31:05.760 | meta is serving 3.2 billion daily active users snap is about 400 million. So meta is eight x
00:31:14.000 | the users with many more applications and much more infrastructure. So I think it's another
00:31:20.720 | great kind of ratio to look at the performance of these two. You're exactly right. Exactly.
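(For anyone following the numbers in this exchange, here is a rough sketch that reproduces the ratios quoted above; every input is the approximate figure stated in the conversation rather than a number pulled from filings, so small differences from the quoted round numbers are just rounding.)

```python
# Rough reconstruction of the Snap vs. Meta arithmetic discussed above.
# All inputs are the approximate figures quoted in the conversation.

# Snap (per the discussion)
snap_gross_profit_2021, snap_opex_2021 = 2.4e9, 3.0e9
snap_gross_profit_2023, snap_opex_2023 = 2.5e9, 4.0e9
snap_fcf_2023 = 35e6            # free cash flow generated during 2023
snap_sbc_2023 = 1.3e9           # stock-based comp vested during 2023
snap_enterprise_value = 15e9

# Meta (per the discussion)
meta_op_cash_flow_2023 = 71e9
meta_sbc_2023 = 14e9
meta_buybacks_2023 = 20e9

print(snap_gross_profit_2021 - snap_opex_2021)   # ~-$0.6B operating loss in 2021 (quoted as ~$0.7B)
print(snap_gross_profit_2023 - snap_opex_2023)   # ~-$1.5B operating loss in 2023 (quoted as ~$1.4B)
print(1 - snap_opex_2021 / snap_opex_2023)       # 0.25 -> the "25% reduction" needed to get opex back

print(snap_sbc_2023 / snap_fcf_2023)             # ~37x  -> the "40 times the free cash flow"
print(snap_sbc_2023 / snap_enterprise_value)     # ~0.09 -> the "10% of the enterprise value"
print(meta_sbc_2023 / meta_op_cash_flow_2023)    # ~0.20 -> Meta's "about 20% of the free cash flow"
print(meta_buybacks_2023 - meta_sbc_2023)        # $6B more bought back than was issued to employees
```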
00:31:25.600 | Yeah. The other advantage that meta has is because they're so profitable, they have the resources to
00:31:31.680 | go big in AI big time, which is very expensive. So yeah, so they are the leader,
00:31:36.400 | you get all this option value at Meta, which you don't get at Snap, there's all this infrastructure
00:31:40.640 | that they can leverage much like Amazon did with AWS into things like cloud AI tools for third
00:31:46.800 | party developers, third party applications. And then obviously, the you know, meta is the biggest
00:31:53.040 | advertising platform next to Google in the world now. And there's much more that they can start to
00:31:57.840 | do to extend further into the... They did get an awesome save. Remember, Apple screwed them and
00:32:04.640 | was like, you can't track devices now. And like that just took a massive hit in the ad network.
00:32:09.920 | And it was all those headwinds. They're like, Okay, we're just going to use AI to optimize
00:32:14.000 | ads. And supposedly the AI optimization of ads, I was talking to somebody on the inside.
00:32:17.600 | They said, like, yeah, we got it all back. We gained it back. We've got massive AI advertising
00:32:23.920 | optimization going on. So totally, yeah, that's great that Tim Cook, you know, kicked us in the
00:32:28.480 | nuts, but we don't care. By the way, that's a great point. J. Cal, it really says a lot about
00:32:33.200 | how meta was able to respond to that change, which a lot of people speculated would destroy the
00:32:38.720 | advertising business. And the fact that they were able to engineer solutions to drive advertising
00:32:43.200 | revenue up to $40 billion. It's just mind blowing. It's a really kind of impressive outcome for the
00:32:49.440 | team. And I think it speaks a lot to the quality of the engineers there. Yeah, I think it's a great
00:32:53.040 | point. Yeah,
00:32:53.840 | Sacks, you tweeted that you're seeing a little SaaS bounceback all of a sudden, that's interesting.
00:32:58.720 | I am seeing something similar last year, last two years, you had a ton of people cutting
00:33:03.200 | their SaaS spend, maybe reducing the number of SaaS vendors they had, consolidating vendors,
00:33:09.680 | you tweeted, many public and private software companies are experiencing accelerating growth
00:33:13.280 | after six to seven quarters of deceleration. The SaaS recession appears to be over, according to
00:33:19.200 | the SaaS master David Sacks. You want to unpack this for us? What do you say?
00:33:24.240 | Well, it's still pretty early, because not everyone's reported. But if you looked at
00:33:28.400 | the big tech cloud performance in q4, you could see that there's a bounce back in here. This is
00:33:35.120 | net new ARR added for AWS, Azure and Google Cloud. So you see here in q4, that there's a huge
00:33:44.400 | increase in net new ARR for the big cloud computing platforms. And then I think another bellwether is
00:33:51.280 | Atlassian. So we're still waiting to hear from HubSpot, Salesforce, zoom, Adobe companies like
00:33:56.640 | that they haven't reported yet. But if you look at Atlassian, it makes Jira amongst other products, they're
00:34:01.920 | based in Australia. Yeah, major. Yeah, exactly, a collection of SaaS companies, right? It's a
00:34:06.240 | collection of SaaS products. Yeah, so net new ARR would be the amount of growth in that quarter.
00:34:12.800 | And this is on a year over year basis. So you can kind of see who q4 of 21 was the absolute peak,
00:34:19.760 | and then plummeted. And then it actually went negative for about a year.
00:34:25.760 | That's tough to be in a company with new ARR going negative.
00:34:30.480 | Yeah, yeah. That doesn't mean by the way, the company shrinking, it just means that the amount
00:34:34.800 | of net new ARR, which is the amount of growth is actually smaller than that same quarter a year
00:34:42.240 | before. Yeah. And then in q4, you could see there's some acceleration here, that they're starting to
00:34:48.400 | add more, they added more net new ARR, I guess 33% more in q4 than they did over the previous year.
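(Since "net new ARR" drives the chart being discussed, here is a minimal illustration of how the metric works and how the year-over-year comparison is made; the ARR figures are made up for illustration, not Atlassian's or anyone's actuals.)

```python
# Net new ARR for a quarter is simply the change in ARR over that quarter.
# The chart compares each quarter's net new ARR with the same quarter a year
# earlier. All numbers below are hypothetical.

arr = {                              # end-of-quarter ARR in $M (made up)
    "Q3'22": 1000, "Q4'22": 1060,    # net new ARR in Q4'22 = 60
    "Q3'23": 1150, "Q4'23": 1230,    # net new ARR in Q4'23 = 80
}

net_new_q4_22 = arr["Q4'22"] - arr["Q3'22"]
net_new_q4_23 = arr["Q4'23"] - arr["Q3'23"]

yoy = net_new_q4_23 / net_new_q4_22 - 1
print(f"Net new ARR: {net_new_q4_22} -> {net_new_q4_23} ({yoy:.0%} year over year)")
# Net new ARR: 60 -> 80 (33% year over year), the kind of reacceleration described above
```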
00:34:55.280 | And part of that sacks is because the comps are lower, and they kind of bottomed out. Yeah.
00:35:00.080 | They bottomed out now they're reaccelerating. Yeah, that's great.
00:35:02.080 | So we're starting to see this in some of my board meetings as well, where in 2022,
00:35:09.200 | everybody was missing their numbers and reforecasting down, and then they would miss
00:35:12.400 | the reforecast. Yeah.
00:35:13.920 | So by 2023, the forecasts were very, very conservative. And I would say, now I'm seeing
00:35:20.320 | companies beat the sort of the lower forecast in q4. This wasn't happening earlier in the year.
00:35:28.240 | But finally, I think people are starting to beat their, sort of, their lower forecasts for Q4.
00:35:33.360 | That's the question that I was curious about. What do you actually think is happening? Is it
00:35:37.360 | that we've re-baselined these businesses? So now what would have looked like just a massive
00:35:43.280 | miss over the last two years now looks like a beat because we've just completely reset
00:35:47.920 | expectations. Is it that or is it that the economy is actually expanding and we can count on
00:35:56.560 | some reasonable growth rates? Is it a combo of the two? What do you think it actually is?
00:36:00.320 | Yeah, I mean, it's definitely a new baseline in the sense that if you go back to 2020 or 2021,
00:36:09.200 | we considered good growth to be, you know, two to three x year over year. And now if it's going
00:36:15.440 | from 60 to 80% growth year over year, you're happy. So there's definitely been a lowering of
00:36:20.160 | expectations. That being said, you still see in these numbers, there has been a bottoming out and
00:36:25.920 | we're starting to not grow from this new baseline. So, for example, I think with Atlassian here,
00:36:34.640 | we are seeing an increase in spend basically in growth, right? So the way a recession is typically
00:36:40.480 | defined is two quarters of negative growth, right? We had six to seven quarters of decelerating or
00:36:47.040 | negative growth in SAS in tech and SAS, which is why I called it the SAS depression or be that
00:36:54.160 | yeah, it's actually kind of a depression. You're right. But now we're seeing quarter over quarter
00:36:58.560 | growth. So growth is re accelerating. Growth is higher than it was. So is he going to get to where
00:37:04.160 | it was that probably will take some time, but it feels like the problems in the ecosystem work
00:37:09.600 | themselves out. And now we're back to growth again. Yeah, I can add psychologically, because
00:37:13.840 | I'm on a couple of SAS boards as well. And psychologically, it felt like you tell me if
00:37:18.160 | I'm right, Sacks, and if you saw the same thing. There were two years of calling up customers,
00:37:22.640 | and they were like, we're consolidating vendors. And by the way, we did a RIF.
00:37:26.480 | And so we need 20% less seats. So we're going to have 20% less SaaS companies that we're buying from,
00:37:33.360 | and we're going to have 20% less seats. So you start putting that all together. Man,
00:37:38.080 | everybody was just in psychological triage mode. We cannot spend money. I don't want to lose my
00:37:43.440 | job. So you're if you're a procurement person, you're the CTO, you don't want to lose your job,
00:37:46.960 | you don't want to have more cuts. So you're like, well, I can cut some software costs.
00:37:50.240 | Do I get points for that? And the point you would score the last two years was cutting costs.
00:37:55.040 | But with the market ripping, and you've now got a really, you know, efficient company, you're like, hey,
00:38:01.200 | can we spend a little bit on SaaS to make the remaining employees even more, you know,
00:38:06.560 | productive? Okay, maybe that's a reasonable discussion. And then people are playing ball
00:38:10.160 | in terms of negotiating prices. So that's the other thing I see is like, people are like,
00:38:14.640 | well, we'll take your software, but here's what we want to pay. And then they're coming to the
00:38:21.920 | board and saying, can we do this deal? It would have been a million-dollar deal, but it's $200,000.
00:38:21.920 | Again, take the money, take the money. Let's let's bear hug that customer.
00:38:25.760 | The market is generally an escalator on the way up an elevator on the way down. So the recovery
00:38:30.720 | is going to take a long time. But at least we've bottomed out and we're in recovery as opposed to
00:38:35.680 | continuing declines. Yeah, by the same token, if you're a startup, and you're not seeing improvement
00:38:41.360 | in your Q4 sales, then you no longer have a macro excuse for why you're not doing well.
00:38:48.160 | Interesting. And then freeberg, you added, you know, you're like, I'll make my own software,
00:38:53.200 | you said, you know, some SaaS software is too expensive. I'll put a developer on it. And so
00:38:58.640 | how's that working out for you? Are you still in that mindset of like, yeah, maybe we just build
00:39:02.320 | our own software? Yeah, I mean, I, it's not just us, I think we're seeing a lot of companies
00:39:08.240 | pursuing this path, a couple engineers can rebuild the functionality of core applications,
00:39:13.920 | particularly because I think, if you think about the business model that makes us so great,
00:39:18.640 | is they could value share, rather than charge the cost of an engineer plus some margin,
00:39:25.040 | the great business model, the equity value that comes in software, if you can build something
00:39:30.720 | once that creates $100 of value, you can probably charge your customer 30 $40 for that product,
00:39:36.640 | because it's saving them 60 bucks, 70 bucks, and they'll make that switch to software.
00:39:41.120 | So you know, the ROI driven value share model in SaaS has made it incredibly valuable.
00:39:47.920 | The problem now is that an engineer can be hired to build the replacement. And so it creates price
00:39:54.960 | compression. So the SaaS company can no longer capture that much value, because the savings is
00:40:00.240 | actually less than that. Because the enterprise might say, hey, I'm going to hire someone and
00:40:04.400 | instead of spending 60 grand a year, on your software, I'm going to allocate a quarter of
00:40:08.880 | an engineer's time to build that software, and it's going to replace that, that cost.
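(A minimal sketch of the buy-versus-build break-even being described here; the $60k subscription and the quarter-of-an-engineer figures come from the conversation above, while the fully loaded engineer cost is an assumption for illustration.)

```python
# Buy-vs-build arithmetic: an enterprise keeps buying SaaS only while the
# subscription costs less than the engineering time needed to replace it
# in-house. All inputs are illustrative assumptions.

saas_cost_per_year = 60_000      # the "60 grand a year" subscription
loaded_engineer_cost = 250_000   # assumed fully loaded annual cost of one engineer
engineer_fraction = 0.25         # "a quarter of an engineer's time"

build_cost_per_year = loaded_engineer_cost * engineer_fraction  # $62,500

print(f"Buy:   ${saas_cost_per_year:,}/yr")
print(f"Build: ${build_cost_per_year:,.0f}/yr")
if build_cost_per_year < saas_cost_per_year:
    print("Building in-house is cheaper, so the vendor gets pushed to cut price.")
else:
    print("Buying still edges out building, but the vendor's pricing ceiling is "
          "now the build cost, which is the price compression described above.")
```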
00:40:12.800 | So I think that that's still the case. So while there might be bookings, which are still
00:40:16.320 | driven largely by a search for efficiency gains, a search for more profitability, for more
00:40:22.640 | productivity within an enterprise, there are other options for that enterprise to realize
00:40:27.200 | that productivity gain today. And that's what's going to cause perhaps price compression,
00:40:32.480 | and more competition than has been the case. But I don't think that the adoption of software
00:40:37.600 | is going to slow down. It certainly seems to be reaccelerating, which is more competitive,
00:40:41.840 | right? We're moving into a hyper competitive market, right, especially with AI. It's a mix
00:40:46.400 | of internal software. It's a mix of internal fire. As you guys know, there are very few
00:40:50.720 | traditional non tech enterprises now that don't have a software team that can write code. Now
00:40:55.920 | that so many companies have software teams that write code, they're all going to be asking the
00:40:59.760 | question, should we be buying the software? Or should we be building something internal? Yep,
00:41:03.840 | it's a classic buyer build situation. All right, let's talk a little bit about VCs and how they're
00:41:08.320 | investing in AI. There seems to be three camps shaping up here tomorrow. You know, one group is
00:41:13.040 | like, the incumbents are going to win, you know, Microsoft, Google, Amazon, everybody, they're
00:41:17.840 | going to win the day. So they're going to wait and see. Then there's another group who's sitting
00:41:23.520 | it out because they're like, hey, open source is going to win. Meta's committed to open source and
00:41:29.280 | collaborative platforms. I've been playing with Hugging Face with Sandeep as well, Chamath,
00:41:35.600 | and it's pretty amazing what's happening over there. And then a bunch are obviously placing
00:41:38.800 | bets right now. The valuations are absurd. Founders Fund and Andreessen Horowitz, two notable
00:41:44.240 | firms, are approaching it differently. Founders Fund bought into OpenAI at a $29 billion valuation.
00:41:51.200 | But aside from that investment, they're generally avoiding the deals. On the other hand, Andreessen
00:41:56.080 | is betting heavily: Character.AI, Repl.it, ElevenLabs, Mistral. You're also in Repl.it, Sacks. So
00:42:04.080 | what do you think? Is open source going to win the day? You've been picks and shovels the whole way.
00:42:08.640 | You've been talking about compression, maybe this isn't actually a good market.
00:42:11.440 | What are you thinking as a capital allocator, Chamath?
00:42:14.800 | I think foundational models will have no economic value. I think that they will be
00:42:20.080 | an incredibly powerful part of the substrate. And they will be broadly available and entirely free.
00:42:28.080 | So if you think about that, any closed model, especially a closed model,
00:42:33.120 | that operates on the open internet is not very valuable.
00:42:36.960 | And any open source model that trains on the open internet
00:42:44.000 | will make that so. So in that world, things like Mistral and Llama will essentially decay the market
00:42:52.640 | to zero. So if you if you're looking at any economic value that has been captured up until
00:42:58.080 | today, if it has been captured by having a proprietary closed model, trained on open data,
00:43:05.040 | that economic value will go away. And I think Google and Microsoft and Facebook and
00:43:13.200 | Amazon and all these startups have a deep economic incentive actually to make that so.
00:43:18.720 | So now you can evaluate what that means. So if you get an open model from Hugging Face,
00:43:25.280 | that's just kick ass. Where do you spend money? Well, you're going to have to spend money to
00:43:30.720 | actually train it to fine tune it, maybe to have some pretty zippy inference. And all of that means
00:43:39.520 | that there's a new kind of substrate that has to be built, which is all around the way that the
00:43:44.560 | tokens per second are provisioned to the apps that sit on top of the model. What that means is you
00:43:50.720 | need to go back to 2006 and 2007 and say, okay, when we first created the cloud, who made money,
00:43:56.800 | and fast forward, 18 years later, it's the same people that are still making money.
00:44:02.720 | So the people that made money in 2006 and 2007 were Amazon, principally because of EC2 and S3.
00:44:09.840 | The perfect analogy of EC2 and S3 in 2024 is the token per second provider. Now, there you have
00:44:19.040 | to double click and say, Okay, well, what does a token per second provider need to do to make a lot
00:44:23.520 | of money? And I think the ultimate answer is you need your own proprietary hardware. So who is in
00:44:28.320 | a position to do that? Amazon has announced that they have an inference and training solution.
00:44:33.680 | For training, Cerebras has announced a pretty compelling solution. Google obviously has TPU,
00:44:40.240 | then there's a handful of startups, including one that I helped get off the ground in 2016
00:44:44.320 | that I funded, called Groq. All of those companies are in a position to build a tokens per second
00:44:50.720 | service. Then you have companies like Together AI, which basically just go and take venture money
00:44:56.000 | and wrap Nvidia GPUs. And you can debate what the advantage will be there. One could say, well, it's
00:45:04.880 | not really a huge advantage over time. So my refined thoughts today are sort of what my initial
00:45:12.880 | guess was, when we started talking about AI a year ago, which is the picks and shovels providers can
00:45:19.200 | make a ton of money. And the people that own proprietary data can make a ton of money. But I
00:45:25.280 | think open source models will basically crush the value of models to zero economically, even though
00:45:30.640 | the utility will go to infinity, the economic value will go to zero. Did any of you guys see
00:45:34.960 | Chamath's interview with Jonathan Ross? No, not yet. You put it out, right? To
00:45:39.760 | market? You made it public. You know, I did it just for my subscribers. But Jonathan is the founder
00:45:43.920 | and CEO of Groq, the company that I just mentioned. And the quick version of that story is,
00:45:50.720 | I would pore over the Google earnings results in the mid-2010s. Because I was
00:45:56.480 | pretty actively investing in a bunch of different public equities. And Sundar said in a press
00:46:01.280 | release, he mentioned that they had rolled their own silicon for machine learning, called TPU.
00:46:06.400 | And I was like, what is going on that Google thinks that they can actually roll their own
00:46:13.200 | silicon? What must they know that the rest of us don't know. And so it took me about six or nine
00:46:18.400 | months, but through sunny, I got introduced to Jonathan. And then we were able to get Jonathan
00:46:24.640 | to leave Google. And he started and he Jonathan was the founder of TPU at Google. And then he
00:46:29.680 | started grok, which I was able to lead that funding round in 2016. So eight years ago.
00:46:35.840 | Anyways, I did a spaces with Jonathan talking about the entire AI landscape and AI acceleration
00:46:43.040 | to my subscribers, but it was so good. I gotta say, he was so impressive
00:46:48.720 | that we kind of, like, figured out a way to just play the Space and tape it, and then we published
00:46:57.200 | it to everybody. So it's it's on. It's on my Twitter for anybody that wants to listen. It is
00:47:01.360 | amazing. He is really impressive. I was sitting on the 17 going to Santa Cruz, not moving for
00:47:10.480 | an hour and a half, and I listened to it. So it kept me alive. But I thought it was really
00:47:15.280 | what do you think? He's great. No, he's great. He has some great insights. And I think he's very
00:47:20.400 | compelling in arguing why some of the big cloud providers today that are offering infrastructure
00:47:29.120 | for AI model training and inference are going to be challenged if someone can build full stack
00:47:37.040 | and ASICs and do it successfully. So it was a really good interview. I actually think it's
00:47:41.520 | really worth listening to. But I enjoyed it. Yeah, thanks for putting it out there. I was
00:47:46.480 | like, literally just sitting in the car browsing Twitter. And I saw your thing and I
00:47:50.320 | clicked on it. And I just ended up with it. It's a little hard, actually. When
00:47:54.400 | you do a Space for your subs, you can't actually just flip a switch and then release it to all
00:48:00.720 | of your followers. So we actually had to like, literally play it and then just capture the audio
00:48:07.440 | out and then republish it. But anyways, despite that inconvenience, if anybody's interested in
00:48:12.080 | learning about AI hardware, he is very compelling, and he's very educational.
00:48:16.080 | So Sacks, your thoughts on just how you're approaching investing in AI:
00:48:20.480 | if you're specifically investing in the underpinnings of AI, picks and shovels,
00:48:24.320 | yada yada, or if you're just looking at the application level, and it's,
00:48:28.080 | you know, that kind of approach?
00:48:29.520 | Well, we divided the space into three categories. One is the models themselves,
00:48:35.520 | the foundation models, which can be either open source or closed source.
00:48:39.120 | There's infrastructure. So, like Chamath was saying, it could be model training.
00:48:44.560 | It could be vector databases, tools that developers use to create the AI stack, typically inside their
00:48:52.960 | enterprise. And then the third would be applications, which can be things like copilots,
00:48:57.360 | or it could be a pre-AI app that's using AI to kind of turbocharge its capabilities.
00:49:04.800 | Yeah, most SaaS would be in the application bucket. And so that's principally where we're
00:49:10.000 | focused, although we do look at infrastructure plays and models. However, I do think there is
00:49:15.600 | an argument... I mean, really, with the question of commoditization, will all the model
00:49:21.200 | companies just get totally commoditized? We're really talking about OpenAI, right? Because
00:49:26.000 | they're the leader. So the question is, can they maintain their lead? I do think there is an
00:49:30.560 | argument that OpenAI will stay in the lead and actually do quite well. And I think there's a few
00:49:39.760 | points there. One is that if you're a consumer, you just want to use the best GPT, the way you want to
00:49:45.360 | use Google. It's just like search, right? If Google is a little better, or is perceived to be
00:49:51.520 | a little better, than Bing or the other search engines, you don't just win a plurality of
00:49:56.720 | search traffic, you actually end up winning it all, because consumers just want the very best one. So
00:50:00.880 | most of the tests show that OpenAI is still ahead of the open source models. And I think
00:50:06.640 | even people in the open source movement will tell you that OpenAI is, call it, six months ahead;
00:50:11.920 | they have no doubt that open source will get to where OpenAI is now in six months. Nonetheless,
00:50:18.160 | if OpenAI just maintains a little bit of a lead over open source, then it could compound. Yeah,
00:50:26.240 | it can basically win the vast, vast majority of the, call it, consumer search or consumer GPT market.
00:50:32.800 | So that's point number one. Point number two is now that OpenAI has these hundreds of millions
00:50:39.920 | of consumers using it, that's a pretty attractive audience for developers to want to reach. And
00:50:46.960 | OpenAI has done a really good job creating a platform for developers to create, you know,
00:50:52.320 | what are called custom GPTs. So most developers don't want to go through the hassle of training
00:50:59.760 | a model, fine-tuning a model, doing all of that work that you would have to do in the open source
00:51:03.840 | ecosystem. They just want to point ChatGPT at a repository of data, documents, or information,
00:51:14.320 | have it learn what it needs to learn, fine-tune it in that way, maybe add some
00:51:18.240 | lightweight functionality using OpenAI's platform to create a custom GPT. That's what I think most
00:51:24.160 | developers want: they just want a simple stack to work with. And they're going to prize,
00:51:29.360 | again, simplicity and the power of the developer tools over the theoretical control they get by
00:51:36.880 | rolling their own models, training and fine-tuning their own models in open source.
00:51:40.640 | And so I think what you're seeing now is... I mean, how many custom GPTs have already been created?
00:51:45.120 | It might be tens of thousands. I mean, there's so many. Millions. Yeah, it's so easy to create them.
00:51:50.240 | So you have a classic developer network effect where you've got OpenAI aggregating hundreds
00:51:55.600 | of millions of consumers, because they perceive that ChatGPT is the best, then you've got
00:52:00.400 | developers wanting to reach that audience. So they build custom GPTs on the OpenAI platform
00:52:06.160 | that actually gives ChatGPT more capability. Yeah. And that's something that open source
00:52:11.120 | can easily catch up with? Well, actually... actually, just finish the point. So yeah,
00:52:16.240 | so it is a flywheel where, you know, classic operating system, developer network effect,
00:52:22.400 | where you want to use the operating system that has the most programs written for it.
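To make the "point the model at a repository of documents" idea concrete, here is a minimal sketch of the pattern being described, using the OpenAI Python SDK's chat completions endpoint. Custom GPTs themselves are assembled in ChatGPT's no-code builder; the document snippets, question, and model name below are hypothetical placeholders rather than anything from the episode.

```python
# A minimal sketch of the "point the model at your documents" pattern via the API.
# The documents, question, and model name are hypothetical placeholders.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Stand-ins for "a repository of data or documents" a company already has.
company_docs = [
    "Refunds are processed within 5 business days of approval.",
    "Enterprise customers get a dedicated support channel with a 4-hour SLA.",
]

def answer(question: str) -> str:
    # Stuff the relevant documents into the system prompt instead of
    # training or fine-tuning a model.
    context = "\n".join(f"- {doc}" for doc in company_docs)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; use whichever tier you have access to
        messages=[
            {"role": "system",
             "content": "Answer using only these company documents:\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer("How long do refunds take?"))
```

The point of the sketch is the simplicity being described: no training pipeline, just documents plus a prompt.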
00:52:26.720 | Yeah. And interestingly, Hugging Face has realized this, and Hugging Face released this week
00:52:32.880 | their own version of GPTs, which is really interesting. And you can pick, Sacks, which
00:52:38.080 | open source project you want to use to make it. So unlike GPTs on ChatGPT, where you have to pick
00:52:42.880 | theirs, on the Hugging Face one you could pick, you know, Llama or whichever one you want.
00:52:48.800 | There's an account called Artificial Analysis that you can follow. The thing to keep in mind,
00:52:54.080 | Sacks, is that for any of this to be true, these APIs need to be usable, right? I mean,
00:52:59.040 | I don't know if you remember, but when we were building apps, even as far back as the late 2000s and
00:53:05.200 | early 2010s, one of the things was there was a pretty important paper that was published by
00:53:09.600 | Google about attention span. And it would look at page load times in a cold cache environment,
00:53:15.760 | right? And it basically said you have to be at like 150 milliseconds, right? That's like
00:53:20.720 | best-in-class performance, or faster. And I remember when we read that at Facebook, we went crazy.
00:53:26.800 | So much so that at one point, a small team and I actually launched a stripped-down version
00:53:32.080 | of Facebook to compete with Facebook. Nick, you can probably find this article on TechCrunch.
00:53:36.560 | And we did it without telling anybody; it was called Facebook Zero. Anyways, the point
00:53:39.920 | is, speed matters. Because in the absence of a very snappy response, you could have the
00:53:45.840 | best model in the world, but if it takes 10, 20, 30 seconds to basically initiate and get back data
00:53:52.160 | from a fetch request, it's an impossible thing to do. So I think one of the things that you have to
00:53:57.920 | keep in mind is that there are these two things that need to move at the same time. One is the
00:54:01.440 | quality of the model, but two is the speed and its responsiveness, which is a function of,
00:54:06.480 | again, hardware and your ability to basically serve tokens per second very, very quickly,
00:54:12.320 | so that developers are incentivized to not just play around in a sandbox, but to actually build
00:54:17.440 | production code. And I don't think we've seen that second thing happen, because nobody is delivering
00:54:21.920 | it. And that's the big thing that nobody talks about.
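As a rough sanity check on why tokens per second is the gating metric here, the back-of-the-envelope below shows how serving speed alone turns the same model from unusable into snappy. The response length and throughput figures are illustrative assumptions, not benchmarks of any provider.

```python
# Back-of-the-envelope latency budget for a single answer at different serving speeds.
# The response length and throughput numbers are illustrative assumptions only.
response_tokens = 500  # roughly a few paragraphs of output

for tokens_per_second in (20, 100, 500):
    total_seconds = response_tokens / tokens_per_second
    # Time to first token matters too, but slow decode throughput alone can push
    # an app from "snappy" into the 10-to-30-second range described above.
    print(f"{tokens_per_second:>4} tok/s -> {total_seconds:5.1f} s for the full answer")

# 20 tok/s  -> 25.0 s: the kind of stall that makes a production app DOA.
# 500 tok/s ->  1.0 s: the range where an interactive product starts to feel snappy.
```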
00:54:26.800 | For example, take AWS. If you look at how expensive it is to build an app there, I've tried: even when they give you credits,
00:54:31.440 | the credits they give you aren't sufficient to even pay for half the power.
00:54:35.120 | And then the way that they schedule and the way that they try to orchestrate how you use hardware
00:54:40.480 | makes building production apps, unless you are willing to spend millions and millions of dollars
00:54:45.680 | for a very slow app, unfeasible. And so if you go back to the startup economy raising money here,
00:54:52.400 | the venture investor should start asking the question, well, what is the speed and usability
00:54:59.360 | of these services that I'm funding? And the reason is because you could build the best experience in
00:55:04.480 | the world that runs on localhost. But if all of a sudden you actually try to launch it as an app,
00:55:09.360 | and the thing takes 35 or 40 seconds to generate something, it's DOA. And I don't think enough
00:55:15.040 | people ask those questions or understand that that's true. So this is why I think you have to
00:55:19.920 | sort of be looking at both of these two things at the same time. But this account is interesting,
00:55:25.120 | because it kind of just strips things down to the bare facts. And they start to allow you
00:55:31.840 | as a third party to understand what you can do. Yeah, speed is just such a critical component
00:55:39.120 | of this. And what Google found was, as you know, Friedberg, you were there:
00:55:43.040 | every time they lowered latency by a certain number of milliseconds, the usage went up, right? People
00:55:46.720 | did more searches, which makes sense if you get your results back faster. Yeah, it was a key metric
00:55:51.840 | from day one at Google. Marissa Mayer, who ran all the consumer-facing products at Google during this,
00:55:57.760 | you know, earlier era, she beat it into the team. I mean, if you guys remember, one of the
00:56:03.120 | first, the first kind of early feature of the Google results page was the amount of time it
00:56:08.720 | took to load the results, they'd show you how many milliseconds. Yeah, they show you that. Yeah,
00:56:11.920 | they literally put their North Star metric exposed to the consumer, which must have lit a fire under
00:56:17.360 | the asses of all the developers and server people. Yeah, well, I mean, they were kind of showing off
00:56:21.200 | the quality of the infrastructure and the way they did indexing and everything. But the result
00:56:24.720 | really played out in usage, the faster the results, the more frequently you would use the
00:56:30.240 | search engine, and the more likely you were to come back. And it's amazing how much consumer
00:56:34.320 | behavior drifts based on milliseconds. Like, you only have a few milliseconds here, right? I mean,
00:56:41.600 | if you ever see the movie The Founder, where they explain the McDonald's
00:56:45.040 | process, they learned it too. Guys, look at this. This is really interesting, this analysis.
00:56:49.200 | I mean, Chamath, are you saying that you don't think OpenAI can achieve the necessary levels
00:56:53.280 | of performance? No, I'm saying two things. OpenAI is three different businesses. OpenAI has a
00:56:58.960 | closed model that's trained on the open internet. I think economically, it's going to be very hard
00:57:03.600 | to sustain that unless they start buying any number of apps, so that they can get some fine-
00:57:09.360 | tunes that they control that are proprietary to them. So for example, if OpenAI were to buy
00:57:13.760 | all of Reddit, that would be a really interesting development that would improve the quality of
00:57:19.360 | OpenAI in a unique and differentiated way, relative to where things like Llama and Mistral
00:57:25.120 | will get to at the same time, as well as X's Grok. I think they're all going to converge
00:57:30.080 | to the same quality in the next probably 12 to 18 months. That's point number one.
00:57:35.600 | Your belief there is there's enough data in those pools that everybody reaches parity?
00:57:40.320 | No. Did you guys... okay, Nick, so I published this primer on AI.
00:57:45.360 | There is a slide in there, Nick, that you can pull out, but it just shows you
00:57:50.080 | that there is a convergence in the quality of the results as the number of parameters of the
00:57:56.880 | model gets higher and higher. And what it effectively shows you is that we are already in
00:58:01.040 | the land of diminishing returns when models are trained on the same underlying data. So if you are
00:58:07.920 | using the open internet, Llama, Mistral, OpenAI, they're all getting to the same quality endpoint,
00:58:14.720 | and they will be there within the next six to nine months. So that's business number one on
00:58:18.960 | OpenAI. Business number two is a consumer-facing app called ChatGPT. That has a lot of legs,
00:58:24.800 | because, you know, people develop habits, it'll be very sticky. And I think it'll
00:58:29.440 | get better and better. And then the third business that they're in is selling enterprise services
00:58:34.800 | to the large Fortune 500. In fact, if you look at their OpenAI dev days, what they talk about is they
00:58:40.240 | sell... they've sold already to like 94% of the Fortune 500. What does that mean? I think what
00:58:45.840 | that actually means is they've sold a lot of test environments and sandboxing. But again, in order
00:58:51.040 | to translate that into functional production code that's used by Bank of America, right, or Boeing,
00:58:57.600 | in production, you have to have zippy, zippy-fast SLAs and a level of performance that no cloud
00:59:06.800 | provider has yet delivered. None, nobody. So Nick, if you just go to that, please, the thing. I just
00:59:13.040 | wanted to show you this because it's a really interesting chart. This is not mine. This is
00:59:15.760 | theirs. If you look at quality versus price, Sacks, it starts to show you, like, where do
00:59:20.800 | you want to be? You want to be in the upper left quadrant in their analysis, right? And so the
00:59:29.040 | point is, what you can see is that a ton of different models are getting to the same place.
00:59:33.920 | And so obviously, you'd want to use the model that's the cheapest, or most convenient.
00:59:38.720 | Well, who's going to pay for that? If you and your LPs want to pay for that,
00:59:44.000 | the person that figures out the cheapest way to give you the same answer will
00:59:49.360 | actually end up winning because you will run out of money and they will not.
00:59:52.480 | I don't know. I mean, I think that there's a lot of business problems inside companies where people
00:59:58.400 | just want to very quickly set up their own, again, custom GPT, without having to go through
01:00:04.800 | the time, the cost, the hassle of trying to do model training or fine-tuning. So let's just back
01:00:11.920 | up. Here's the path that OpenAI is on. So step one: get hundreds of millions of consumers using
01:00:18.320 | it and getting them to view OpenAI or ChatGPT as the Google of this area, right? Strong
01:00:25.840 | presumption: this is just the one you go to when you have a question. Step two:
01:00:31.520 | these same people, these same consumers now want to use ChatGPT at work, because there's some
01:00:38.400 | research they want to do. So OpenAI has just rolled out both enterprise licenses and team
01:00:44.560 | workspaces, so you can work collaboratively on the same queries in a team workspace.
01:00:48.960 | Step three is rolling out a very easy-to-use dev platform that allows developers to, again,
01:00:55.840 | create custom GPTs by just pointing OpenAI at repositories. Okay. And so let's say that you're
01:01:03.520 | the customer support team and you want to create a GPT to help customer support answer cases.
01:01:13.600 | You could basically then train ChatGPT on, let's say, every customer support ticket and email
01:01:25.200 | that the company has ever produced. Now, you could wait for the company's IT department
01:01:31.680 | to get its act together and figure out how to train an open source model on the same thing.
01:01:37.760 | But do you really want to wait for that? Or do you just want to get going?
01:01:40.640 | You know, and now OpenAI has given you the enterprise license that you need to
01:01:45.840 | pacify the concerns about security and privacy and all those kinds of things. To some degree,
01:01:52.000 | there's always going to be those super-paranoid Fortune 500 companies that will insist
01:01:57.760 | on owning everything and doing it open source. Let me build on your example. So I run a
01:02:03.280 | small software company during the day called Hustle. And we saw a lot of tickets related to
01:02:11.840 | this specific legislation that exists whenever you're texting or you're doing auto-dialing stuff,
01:02:18.560 | called 10DLC. And so we wanted to eliminate those tickets, right? So I actually went and
01:02:26.000 | I built a GPT, which was called the privacy policy generator, because a lot of these
01:02:31.200 | trouble tickets were because the privacy policies were bad. And we trained it using a handful of
01:02:37.760 | ones that were good and a handful of ones that were bad, with a bunch of rules that I trained it
01:02:41.840 | on. And it's wonderful, except I can't run it in production, because it's not the kind of thing
01:02:48.560 | that is usable in that way right now; it's still very difficult. And so all I'm saying is, I'm
01:02:54.480 | happy to keep spending a few hundred dollars a month, a few thousand bucks a month, whatever it is that I'm
01:02:59.280 | spending, I don't quite exactly know. And I agree with you, it was very easy. I think OpenAI does
01:03:05.120 | an excellent job of getting you off the ground. But what I'm also saying is that when you actually
01:03:11.600 | translate that into a mainline use case, right, where I want to now give it to my support team
01:03:18.560 | and say, this is now a tool you can rely on, it's integrated into your workflow, into your other
01:03:23.360 | tools, it's integrated into how you pipe out data into Salesforce, or what have you, it's just very
01:03:29.600 | hard. And I'm not saying it's not going to get fixed. I'm saying we're just not there yet. And
01:03:34.720 | one of the ways in which it's not there is that there is no place I can go, including OpenAI,
01:03:40.320 | that actually makes it fast enough to be usable in production.
01:03:44.080 | You wrote this on OpenAI's stack? You wrote a custom GPT?
01:03:48.000 | Yeah, built it myself.
01:03:48.880 | Yeah, and you can do them on Hugging Face now. There's gonna be a lot of options.
01:03:53.040 | In terms of integrating into your workflows, I think it's a really interesting point, because I
01:03:56.880 | saw a demo somewhere where now, actually, I think OpenAI announced this, that you can at-mention
01:04:03.680 | a custom GPT.
01:04:05.520 | Yeah, yeah, Sunny showed me that this week on the pod.
01:04:08.320 | Yeah, in ChatGPT, you can now at-mention a custom GPT to kind of invoke it.
01:04:14.000 | Yeah. So how it works is you would say, hey, I'm heading to New York, what flights
01:04:19.200 | can I get, at Expedia, at Kayak, whatever, and then it gives you, you know, the results here,
01:04:24.640 | and you're kind of pulling that up,
01:04:26.640 | Just to the point about where data advantages lie, and that that's ultimately going
01:04:30.720 | to drive value: I've tried to think a lot about this, and I cannot think of a better
01:04:37.680 | data advantage, one that is orders of magnitude better than anything else.
01:04:44.560 | Say it: YouTube. Yeah, it is.
01:04:48.640 | So here's the numbers. I pulled this up. You guys know, like, GPT-3 and 3.5
01:04:53.680 | were trained with a heavy weighting on Common Crawl, which is this open source... Yeah,
01:04:58.000 | we talked about this before, Gil Elbaz runs it... open source crawling of the web. The total amount of
01:05:04.400 | data in Common Crawl, which I think accounted for, and I could be off on this, something like 40 to 60%
01:05:08.640 | of the weighting in GPT-3 or 3.5, I'm probably off on this. So the total amount of data
01:05:14.240 | in that Common Crawl data set is about 10 petabytes. Okay. Based on YouTube's public
01:05:20.960 | statements recently, they're seeing about 500 hours a minute of video uploaded, or 720,000 hours
01:05:28.800 | a day. And if you assume somewhere around, you know, just under 1080p on that video,
01:05:33.920 | we're talking about probably one to two petabytes of data being uploaded to YouTube per day. So if
01:05:42.800 | you assume, like, over time the definition of the videos has gotten better, and the amount of
01:05:47.360 | uploads has gone up, you could probably assume that there's roughly, I'm guessing, probably
01:05:53.200 | somewhere between 2,000 and 3,000 petabytes of data in YouTube, growing by one to two petabytes per
01:06:01.200 | day, which makes YouTube's data repository roughly 300 times larger than Common Crawl.
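The arithmetic behind that estimate, reproduced as a quick sanity check. The bitrate range is an assumption standing in for "just under 1080p", and the totals are the figures stated in the discussion, not official YouTube numbers.

```python
# Sanity check on the YouTube estimate above: ~720,000 hours uploaded per day at a
# just-under-1080p bitrate, compared with the ~10 PB Common Crawl corpus.
# The bitrates and totals are assumptions from the discussion, not YouTube figures.
HOURS_PER_DAY = 720_000
SECONDS_PER_DAY = HOURS_PER_DAY * 3600
PETABYTE = 1e15  # bytes

for mbps in (3, 6):  # roughly the 720p-to-1080p streaming range
    bytes_per_day = SECONDS_PER_DAY * mbps * 1e6 / 8
    print(f"{mbps} Mbps -> {bytes_per_day / PETABYTE:.1f} PB uploaded per day")

common_crawl_pb = 10
youtube_estimate_pb = 3000  # upper end of the 2,000-3,000 PB guess
print(f"YouTube vs Common Crawl: ~{youtube_estimate_pb / common_crawl_pb:.0f}x larger")
```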
01:06:08.400 | That makes it bigger than anything else that anyone else has. And here's the amazing thing about it: it has video,
01:06:13.440 | it has image, it has audio, it has text, it has everything, multimodal, and it's growing. So if you were
01:06:21.040 | to take a bet or build a thesis around this point that the data advantage is going to drive value
01:06:26.480 | creation. If Google gets its act together and leverages the data repository at YouTube,
01:06:31.760 | it is an insurmountable moat that will only continue to extend because the quality of
01:06:36.720 | the YouTube experience and the network effects continue to accumulate for them. So I think it's
01:06:40.880 | the most valuable asset in the world today, based on this thesis, that AI value is going to accrue
01:06:46.800 | to the data. I think you're making such an important point. This is why the counterfactual
01:06:51.040 | is true, and it's actually showing up in the data. And Nick will show you this slide again,
01:06:56.880 | from the AI primer. But that is why we're seeing these diminishing returns, Friedberg, in
01:07:01.360 | all of these third-party benchmarks of these models on the same data sets; it's all using
01:07:04.880 | the same data set. So what we are proving is not that the underlying hardware can't scale,
01:07:10.160 | nor that transformers are only efficient to a point. That's not what all of this convergence
01:07:14.560 | is showing. It's that in the absence of proprietary data, you're just going to get to the
01:07:18.400 | same model quality. And we're seeing a bunch of different models get to a very early finish line,
01:07:24.400 | which, again, if people like Facebook are doing it for free, that's much easier to underwrite,
01:07:30.320 | because you don't have to underwrite it as a differentiator in five years. But if you have
01:07:36.640 | a startup with equity value tied to a model, I think it's much more of a
01:07:43.040 | tenuous place to be in the absence of proprietary data. And everyone in the world has a camera and
01:07:48.960 | a microphone in their pocket, and high speed internet now, from the phone in their pocket,
01:07:54.320 | and more and more people are uploading that content, that data that's being generated.
01:07:59.040 | YouTube's got this free data vacuum. And they're just out in the world. And most of it's getting
01:08:03.520 | uploaded. Well, it is public-facing, though. So it's not just true for text. It's also true for, you
01:08:10.320 | know, all of the image generation. So, like, if you look at it, they can train more than just an LLM on
01:08:14.640 | it, right? They can build all sorts of... Yeah, go ahead. No, no, I was just gonna say, like, the version of
01:08:20.080 | Common Crawl for training these image models also exists. And so to your point, it's like,
01:08:25.040 | we're all operating from the same brittle, very fixed, small quantum of training information.
01:08:31.680 | And so that is why I think, like, Facebook and Google are doing a really important job by
01:08:39.600 | deciding that these models should be free. Right? And then being able to... So then the question is,
01:08:45.920 | doesn't that just accentuate their data advantage? It does. And I think that it allows them to decide
01:08:52.800 | how much to leak out. So for example, if you were using a lot of Google services,
01:08:58.080 | like GFS, BigTable, BigQuery, you know, TensorFlow, the versions that you had access to via GCP
01:09:07.120 | were always one or two generations behind what the Google employees got to use, right?
01:09:11.760 | But it was still so much better than anything we could get anywhere else that you would
01:09:16.960 | still build to those endpoints. And I think there's a similar version of this, where Facebook
01:09:21.200 | and Google probably realized like, look, we'll have version five running internally to optimize
01:09:27.600 | ads and all of this other stuff that makes our business that much better. And we'll expose
01:09:31.600 | version three to the public. But version three is still trained on so much proprietary data that
01:09:36.400 | it's so much better than version 10 of anything else that's just operating on the open internet.
01:09:40.880 | Right. And, you know, to your point, Friedberg, that's the outward-facing stuff. YouTube is a
01:09:46.640 | collection of things people want to share. What Google also has is Google Docs, and Gmail, things
01:09:53.040 | that people say privately, so they have another data resource there that they can tap, you know,
01:09:58.000 | and there'll be regulations and privacy around that. But maybe there's a difference there. But
01:10:02.320 | I honestly can't think of the quantum coming close to YouTube. Not even close.
01:10:07.120 | Well, the thing, to Jason's point, which is really interesting, is, like, you know,
01:10:10.240 | there's a modality in AI called RAG, retrieval-augmented generation, where you can actually just augment with very specific
01:10:16.640 | training on a very specific subset of documents to improve. It's like a hacked version
01:10:21.680 | of a fine-tune. But the beautiful thing about that is, like, if you have a Google Workspace,
01:10:25.600 | my entire company runs on Google Workspace, in fact most of my companies do at this point,
01:10:30.720 | to click a button where, all of a sudden, all of that stuff in all of my G Drives, all of a
01:10:37.280 | sudden, is trainable, so that the N-plus-first employee comes in and has an agent that's tuned on
01:10:44.400 | every deck, every model, every spreadsheet, every document, that's a huge edge. Huge, huge edge, by the way.
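For readers who want to see what the retrieval step in RAG actually does, here is a toy, dependency-free sketch: score a handful of documents against a question and keep the best match to prepend to the model's prompt. The documents are hypothetical, and a real deployment would use an embedding model plus a vector database over the actual corpus (for example, an export of those Drive files), not word-count cosine similarity.

```python
# Toy illustration of the retrieval step in RAG: rank documents by similarity to the
# question, then prepend the winners to the prompt (as in the earlier custom-GPT sketch).
# The documents are hypothetical; real systems use embeddings and a vector database.
import math
import re
from collections import Counter

docs = {
    "q3_board_deck": "Q3 revenue grew 18 percent while churn fell to 2 percent",
    "hiring_plan": "We plan to hire four support engineers in the first half",
    "pricing_model": "Enterprise pricing starts at 50k per year with volume discounts",
}

def vectorize(text: str) -> Counter:
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 1) -> list[str]:
    q = vectorize(question)
    ranked = sorted(docs.items(), key=lambda kv: cosine(q, vectorize(kv[1])), reverse=True)
    return [name for name, _ in ranked[:k]]

print(retrieve("How much did revenue grow in Q3?"))  # -> ['q3_board_deck']
```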
01:10:51.360 | And as a CEO, if you gave me that choice, I don't think anybody underneath that
01:10:57.120 | reports to me has any right to make that decision. But as a CEO, I would click that button instantly.
01:11:01.440 | if I had that right as a CEO. And so, like, that's the CEO pitch. It's like, look, I can just give
01:11:06.240 | you these agents that are, like, the next version of a knowledge base that we've always
01:11:11.440 | wanted inside of a company. Right? Notion has this, you know; basically, you can start
01:11:16.720 | asking your entire Notion instance questions about your Notion, which is incredible. And yeah,
01:11:23.440 | you can just... as a CEO, you can see across everything, Chamath, because as you know, with
01:11:27.840 | Google Docs, if you're in a compliance-based industry, like finance, you can see everything,
01:11:32.960 | every message, every email, every document, and you can search it. The security model and the data
01:11:38.320 | model becomes very complicated. And all of that stuff, like, for example, like, how do you know
01:11:43.440 | that this spreadsheet is actually something you should learn on, and who gets to actually then have
01:11:49.760 | that added to the subset of answers, right? Oh, all of a sudden, like, salaries, yeah, HR
01:11:56.000 | information gets put into the training model, very dangerous. Or subset A of a company is working on
01:12:02.240 | a proprietary chip design, but they actually, like the way that Apple runs highly, highly segregated
01:12:07.520 | teams, where nobody else can know. So there's all kinds of complicated security and data model and
01:12:12.800 | usage questions there. But yeah,
01:12:14.960 | Brave new world. So there's been a lot of discussion on real estate. You shared a video with
01:12:18.800 | us, so why don't you kick it off for us here, Friedberg: what's going on in commercial real estate?
01:12:21.920 | And Sacks, you've got holdings and a lot of thoughts as well. So let's take up the commercial
01:12:25.440 | real estate challenges of the moment. Well, I mean, I think we're teeing off of Barry's
01:12:31.760 | comments at this event last week. You and I met backstage, because I spoke right before him.
01:12:37.200 | And then he gave this talk, which is available on YouTube, where he talked about the state of
01:12:42.000 | the commercial real estate market, and particularly he talked about the office market. Just to take a
01:12:47.600 | step back to talk about the scale of commercial real estate as an asset class in the US, Nick,
01:12:53.280 | if you'll pull up this chart, the total estimated market value of commercial real estate in the US,
01:12:59.600 | across different categories is about $20 trillion, with about $3 trillion being in the office market,
01:13:06.000 | which is specifically what he was talking about. And he was saying that, in the US, we're seeing
01:13:10.080 | people not coming back to work, and all these offices are empty. And we've talked a lot about
01:13:14.320 | these offices being written down. So how significant of a problem is this? So $20 trillion
01:13:18.960 | asset class, obviously, the multifamily market is probably not as bad as office and retail,
01:13:24.000 | which are the most heavily affected, each of which is about $3 trillion apiece.
01:13:29.040 | The rest of these categories seem relatively unscathed in comparison: industrial, hospitality,
01:13:35.680 | healthcare, you know, those real estate sectors are probably pretty strong. Data centers,
01:13:40.160 | obviously, growing like crazy; self-storage, a great market. If you pull up the next image...
01:13:44.720 | So it turns out that of the $20 trillion of market value, there's about $6 trillion of debt. So you
01:13:52.000 | can kind of think about that 20 trillion being 6 trillion owned by the debt holders and 14 trillion
01:13:59.200 | by the equity holders. And the debt is owned roughly 50% by banks and thrifts. And this was
01:14:05.840 | the concern that we've been talking about with higher rates: is the debt on office actually going
01:14:10.400 | to be able to pay, is the debt on retail going to be able to pay, when half of that debt is held by
01:14:14.480 | banks and thrifts that, as we've talked about, have such a close ratio to deposits that you
01:14:21.200 | could actually see many banks become technically insolvent if the debt starts to default?
01:14:26.080 | The point Barry made was, if you look at the office market, which, you know, is marked on
01:14:32.720 | everyone's books as $3 trillion of market value, he thinks it's probably worth closer to 1.8 trillion.
01:14:38.880 | So there's $1.2 trillion of loss in the office category. And if you assume 40% of that 3 trillion
01:14:48.640 | is held as debt, you're talking about $1.2 trillion of office debt. A reduction from $3
01:14:55.040 | trillion to $1.8 trillion means that the equity value has gone down from $1.8 trillion to $600
01:15:03.280 | billion. So equity holders in office real estate have probably lost two-thirds of their
01:15:08.960 | value, two-thirds of their investment. And who owns all of that?
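Here is the arithmetic behind that estimate, spelled out. It uses the figures cited in the discussion: a $3 trillion marked value falling to Sternlicht's roughly $1.8 trillion, with an assumed 40 percent of the marked value financed by debt.

```python
# The office write-down math above, using the figures cited in the discussion:
# a $3T marked value falling to ~$1.8T, with an assumed 40% of the marked value as debt.
marked_value = 3.0      # $ trillions, office as carried on the books
estimated_value = 1.8   # $ trillions, Sternlicht's estimate
debt = 0.40 * marked_value                     # $1.2T of office debt

equity_before = marked_value - debt            # $1.8T
equity_after = max(estimated_value - debt, 0)  # $0.6T, since debt gets paid first

loss_share = (equity_before - equity_after) / equity_before
print(f"Equity: ${equity_before:.1f}T -> ${equity_after:.1f}T ({loss_share:.0%} wiped out)")
# -> Equity: $1.8T -> $0.6T (67% wiped out)
```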
01:15:16.720 | Most of that, 60-plus percent, call it two-thirds, is likely owned by private equity funds and other institutions,
01:15:21.920 | where the end beneficiary is actually pension funds and retirement funds. And so if two thirds
01:15:27.200 | of the value has to be written off in these books, and it hasn't happened yet, what's going to happen
01:15:31.120 | to all these retirement funds? And this is where, going back to my speculation a couple months ago,
01:15:35.920 | it kind of gets revisited. If you're actually talking about a two-thirds write-down on the value in these
01:15:40.080 | funds, most of that being pension funds, you're not going to see governments let that happen.
01:15:45.440 | You're gonna see the government... Yeah. There's gonna be some action at some point.
01:15:50.000 | And it's unlikely the office market is going to suddenly rebound overnight. If this stays the way
01:15:55.680 | it is, who's going to fill that hole for retirees and pensioners, because we're not going to let
01:16:00.960 | that all get written down. Someone is going to step in and say, we've got to do something about
01:16:04.240 | this. And there's going to need to be some sort of structured solution to support retirees and
01:16:09.200 | pensioners, because that's ultimately who ends up holding the bag in this massive write-down. He
01:16:13.840 | didn't go all the way there in his statements; he was talking more about his estimate of $3 trillion
01:16:17.680 | going to $1.8 trillion. And then I tried to connect the dots on what that actually means. And ultimately,
01:16:22.560 | there's going to be some pain felt by retirement funds, that's going to need to be dealt with
01:16:26.720 | somehow. So I don't know if that sits right with you. I mean, I think the big picture
01:16:30.960 | is right. I think you're applying a lot of averages, right? I think in the office market,
01:16:35.680 | in particular, the typical office deal is more like one-third equity and two-thirds debt;
01:16:40.960 | there's a lot more leverage, right? So that's point number one, which makes the situation worse.
01:16:45.840 | Even worse, yeah. So I would say that there's a huge amount of equity that's been written off.
01:16:51.840 | But in addition to that, there's a lot of debt holders who are in trouble, too. And that debt
01:16:59.040 | is held by regional banks. So these commercial loan portfolios are significantly impaired. That's
01:17:05.520 | what we saw with New York Community Bank: their stock cratered when they reported
01:17:11.360 | higher-than-expected losses in their commercial real estate portfolio. So, Friedberg, I think the
01:17:16.960 | point is just that the pain from this is not just going to be on the equity holders, but also on
01:17:24.800 | these banks, which can't afford the losses. It's not evenly distributed. Yeah, yeah, right. Yeah,
01:17:29.920 | right. And we saw this in San Francisco, where some of these buildings have 70% equity ratios,
01:17:35.120 | and, you know, the value puts them in the hole, and equity is wiped out completely, and the
01:17:39.840 | debt holders have to take a hit. And normally, you know, that debt is not really written off
01:17:44.400 | very often. Well, this is why the debt holders don't want to foreclose:
01:17:49.040 | they don't want to get these buildings back, because when they do, they're gonna have to
01:17:52.320 | write down the loan. As long as the loan is still outstanding and they haven't foreclosed,
01:17:56.960 | they can pretend that the value of the building is not impaired.
01:18:00.640 | Kicking the can down the road is the best strategy for them. So it's called pretend and extend. So
01:18:05.440 | what they'll do is they'll work out a deal with the landlord, the equity holder, and the equity
01:18:12.000 | holder will say, listen, I can't pay the interest. So they'll just tack on the interest, basically, as
01:18:16.560 | principal at the end of the loan. And they'll extend out the term of the loan, which would
01:18:20.960 | wipe out the equity at a certain point.
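A stylized illustration of why that eventually wipes out the equity: the unpaid interest gets capitalized into the loan balance, so the debt compounds while the building's value does not. The loan size, rate, and flat value below are made-up numbers chosen only to show the mechanic.

```python
# Stylized "pretend and extend": deferred interest is tacked onto principal, so the
# loan balance compounds while the building value stays flat. All numbers are made up.
building_value = 100.0   # assumed flat in this scenario
loan_balance = 70.0      # started as a 70% loan-to-value mortgage
rate = 0.08              # post-refi rate the owner cannot actually service

for year in range(1, 8):
    loan_balance *= 1 + rate            # unpaid interest capitalized into principal
    equity = building_value - loan_balance
    print(f"Year {year}: loan {loan_balance:6.1f}, implied equity {equity:6.1f}")
    if equity <= 0:
        print("Equity is wiped out; the extension only postponed the loss.")
        break
```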
01:18:26.800 | Yeah. And what it does is it allows the equity holders to stay in control and own the building, right? Because, yeah, the equity holder can't make their debt
01:18:32.160 | payments today, but they're going to postpone those debt payments till the end of the
01:18:37.760 | loan. And again, in the meantime, just kind of hope that the market...
01:18:41.280 | Could the debt at some point, since they have so little equity in these buildings, typically just
01:18:45.840 | exceed the value of the property? And it's like, I'm just working for the bank now.
01:18:50.560 | And, you know, why am I even putting this work in?
01:18:53.200 | Because everyone kind of hopes that the market will recover, the value and their equity will go up,
01:18:57.440 | and they'll be able to make their debt payments again. Yeah. So if you're the equity,
01:19:01.760 | if you're the equity holder, you'd rather hold on and have a chance of your equity being worth
01:19:06.240 | something in a recovery than definitely lose the building. And if you're a regional bank,
01:19:10.720 | you'd rather blend and extend or pretend and extend as opposed to having to realize the loss
01:19:17.200 | right now. Yep. And showing the market that your solvency may not be as good as you thought.
01:19:23.040 | The same thing happened with government bonds. Remember that with SVB and these other banks,
01:19:28.160 | they had these huge held-to-maturity bond portfolios. Yeah. These being mostly just
01:19:34.960 | T-bills that were worth, I don't know, 60 cents on the dollar when interest rates spiked from
01:19:39.600 | zero to 5%. But they didn't have to recognize that loss as long as they weren't planning to sell
01:19:46.960 | them. Right. And then when they had the bank run, they had to sell. Well, yeah, that's right. So
01:19:51.600 | when depositors left, because they needed their money, or because there was a run, or because they
01:19:57.120 | could get higher rates in a money market fund, all of a sudden these banks had to sell their
01:20:02.560 | held-to-maturity portfolios and recognize that loss. And that's when everyone realized,
01:20:06.720 | wait a second, they're not actually solvent. Okay, so Chamath, supply and demand matters in real
01:20:11.120 | estate. We have a tale of two cities here: on one side, in commercial real estate,
01:20:16.880 | no demand for office space, which is in way too much supply. Paradoxically, on the other side,
01:20:24.000 | we have this incredible market for developers, which is, gosh, there's not enough homes,
01:20:28.720 | I think we need 7 million more homes. And the demand is off the charts for homes. Yeah.
01:20:33.680 | Yeah. I mean, I think you're basically right. It's... I keep trying to explain:
01:20:36.560 | residential is not a great market either, because interest rates have spiked up. So
01:20:40.080 | there's not a vacancy problem. Multifamily developers are still able to lease the units,
01:20:45.840 | they're still able to rent. The problem is their financing costs have shot through the roof. So
01:20:52.080 | again, let's say you were a developer who built multifamily in the last few years,
01:20:55.920 | you took out a construction loan, that construction loan might have been at three,
01:20:59.920 | 4%. Now, you want to put long term financing on it. But if you can even find debt right now,
01:21:06.800 | because there's a credit crunch going on, you may have to pay eight, nine, 10%.
01:21:09.680 | Yeah, but at least you can find a renter.
01:21:12.160 | You can find a renter. That's true, but only at a certain price. And let's say you underwrote that
01:21:17.280 | property to I don't know, like a five cap, like a certain yield. Yeah, but now your financing costs
01:21:22.960 | are much higher than you thought. You might be underwater. Yeah, that situation isn't as bad
01:21:29.200 | as what's happening. Why? I think it's worse in some ways. If you're fully rented,
01:21:36.160 | and your building is underwater, because now your debt payments are much higher than you expected,
01:21:41.600 | then there's no business model. Yeah, but are we seeing that? Are we seeing tons of
01:21:45.040 | multifamily go under? Can I make two points? One, I think David is right, which is that...
01:21:50.320 | I don't know this market very well, but just as a bystander, here's what I observe:
01:21:56.000 | it seems that the residential market has a feature, and I don't know whether it's good or
01:22:01.680 | bad, but that feature is that you reprice to market demand every year. So to the extent that supply
01:22:09.520 | and demand is changing, and default rates are up or whatever, that's reflected in rents. And you see
01:22:15.840 | that because rents change very quickly, and most human beings are signing six-month to one-year
01:22:21.200 | leases. So that reset happens very quickly, so it can more dynamically adapt. So to the extent that
01:22:25.920 | a market segment is impaired, you see the impairment quickly. On the office side,
01:22:32.240 | what I see is that there's been a structural behavior change in COVID that has reset in every
01:22:39.120 | other part of the world except for the United States, where there are these, frankly, typically
01:22:45.280 | younger, typically more junior employees that have held many of these companies hostage in their bid to
01:22:51.520 | return back to office. And so we know that there is this vacancy cliff that's going to hit
01:22:57.040 | commercial real estate. We just don't know when, because they're in long-term leases;
01:23:01.920 | they're canceling these leases over long periods of time. So the reset cycle is longer. That's just
01:23:06.240 | my observation as an outsider. I don't know what that means for prices or anything else. But
01:23:11.440 | it just seems that at least the residential market can find a bottoming sooner, because you can reset
01:23:16.080 | prices every year. But commercial just seems like a melting ice cube. Sacks, is that a correct
01:23:21.440 | assessment? Commercial has both a demand problem and a financing problem. Multifamily
01:23:28.320 | just has a financing problem. But it's important to understand we're talking about office,
01:23:31.920 | office and retail, and then there's the other categories, like industrial. China,
01:23:37.120 | China has 50 million homes ahead of demand, 50 million of additional supply that can house 150
01:23:44.640 | million people. So as acute as our issues are, the China issue might be much, much... Yeah, seismic.
01:23:50.880 | Can I just give you an example on the multifamily side? Okay, let's say that you buy a building.
01:23:55.360 | Let's say you bought a building in 2021, the absolute peak of the market, and you could get
01:24:00.720 | debt at, say, 4%. Okay, and you penciled out, let's call it, a 6% yield, so that with the debt you're
01:24:08.480 | getting, so let's say you did two-thirds debt at 4%, you could now lever up that 6% yield to 10%.
01:24:15.920 | Okay, that's sort of the math, right? Now, all of a sudden... and to get there, you'd have to
01:24:21.920 | do some value-added work on the property, you have to spruce it up. Okay. Now, it's a few years later,
01:24:27.600 | and your short-term financing is running out, and you need to refi, and you've done your value-
01:24:33.280 | added work. But here's the problem: the overall valuations in the market have come way down.
01:24:38.320 | So before, the bank was willing to give you two-thirds loan-to-value. Now the value's come way down,
01:24:45.680 | and you may not even be able to get two-thirds loan-to-value. So you're gonna have to do
01:24:48.640 | what's called an equity-in refinancing: you're gonna have to produce more equity, you're gonna
01:24:53.760 | have to pony up more money. So instead of taking equity out, like when the deal goes well, you're
01:24:57.920 | gonna have to put equity in, and you may not have that equity if you're the developer. The other thing is
01:25:02.320 | that your financing costs now might be 10%. So now you've got negative leverage: you're generating
01:25:08.400 | a 6% yield, but you're borrowing at 10% to generate that 6% yield. So the debt no longer
01:25:14.560 | makes sense. Again, you're not positively leveraged, you're negatively leveraged.
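The leverage math in that example, worked through on a hypothetical dollar of property value: two-thirds debt at 4 percent turns a 6 percent unlevered yield into roughly 10 percent on equity, and the same structure refinanced at 10 percent debt goes negative. The percentages are the ones used in the example; the per-dollar framing is just for illustration.

```python
# Leverage math from the example above, applied to a hypothetical $1 of property value:
# two-thirds debt against a property yielding 6% unlevered.
property_value = 1.0
unlevered_yield = 0.06   # net operating income / property value
debt_share = 2 / 3

def equity_return(debt_rate: float) -> float:
    noi = unlevered_yield * property_value
    interest = debt_rate * debt_share * property_value
    equity = (1 - debt_share) * property_value
    return (noi - interest) / equity

print(f"Equity yield with 4% debt:  {equity_return(0.04):.1%}")   # ~10%
print(f"Equity yield with 10% debt: {equity_return(0.10):.1%}")   # ~-2%, negative leverage
```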
01:25:18.240 | So you're not going to want to take out that debt. And if you do take out that debt,
01:25:23.040 | the building is underwater; it's not gonna be generating net operating income, it's gonna be generating
01:25:28.240 | losses. So that's why even categories like multifamily, where you don't have a vacancy
01:25:36.320 | problem, there's strong demand. Yeah, those properties still don't make sense. If you had
01:25:42.080 | long term debt on your multifamily, if you were able to lock in that 4% loan for 10 years,
01:25:47.440 | you're fine. But for all the people who are refinancing now who are coming up this year,
01:25:52.480 | last year, next year, they're in deep trouble. And that's why there's a rolling crisis in real
01:25:58.320 | estate: it's because the debt rolls over time. It's not like everybody hits the wall and has to
01:26:03.680 | refinance at the same time. Well, thank God, right? I mean, this would be cataclysmic if it was.
01:26:08.320 | Can you imagine if Silicon Valley and San Francisco had to say, here's actually the reality,
01:26:13.760 | does anybody want to actually pay for this office, all in the same year? Right? That would be insane.
01:26:19.920 | But the crisis is growing as the leases roll. And as those old rents that were higher than market
01:26:26.000 | roll off, now you have to take on new leases, and if you can even get them, it's gonna be at much
01:26:31.760 | lower rents. And as the old loans that were at a much lower interest rate roll, you have to get financing,
01:26:37.440 | even if you get it at a much higher interest rate. That's when all of a sudden these buildings go
01:26:41.520 | from being basically solvent to insolvent. Yeah, I mean, Janet Yellen's just gonna bail
01:26:47.520 | these folks out. I mean, maybe you won't bail out the banks themselves, but you'll bail out the
01:26:50.720 | creditors, obviously, the people holding the bag; they'll get bailed out. Yeah, everybody agrees.
01:26:56.160 | Janet Yellen?
01:26:56.960 | Yellen.
01:26:58.560 | Our Treasury Secretary.
01:26:59.680 | I don't know if she's gonna be the one to do it. I think there's gonna be congressional action on
01:27:03.760 | this stuff.
01:27:05.120 | Yeah, I mean, they tend to lead on it, so... All right, for the Sultan of Science, David Friedberg,
01:27:11.840 | and David Sacks, and Chamath Palihapitiya, the chairman dictator, I am the world's greatest
01:27:17.040 | moderator. We'll see you next time on the All-In Pod. Bye-bye.
01:27:19.840 | Bye bye.
01:27:20.880 | [Music]
01:27:22.960 | [Music]
01:27:25.120 | [Music]
01:27:27.840 | [Music]
01:27:30.000 | [Music]
01:27:32.960 | [Music]
01:27:34.960 | [Music]
01:27:40.960 | [Music]
01:27:46.960 | [Music]
01:27:55.920 | [Music]
01:27:57.920 | [Music]
01:28:03.920 | [Music]
01:28:09.920 | [Music]
01:28:17.920 | [music]