Back to Index

E111: Microsoft to invest $10B in OpenAI, generative AI hype, America's over-classification problem


Chapters

0:00 Bestie intro!
0:43 Reacting to Slate's article on All-In
11:18 SF business owner caught spraying homeless person on camera
29:22 Microsoft to invest $10B into OpenAI with unique terms, generative AI VC hype cycle
69:57 Biden's documents, America's over-classification problem
87:16 Best cabinet positions/ambassadorships

Transcript

Is anybody else seeing a half a second lag with J. Cal? Like a second lag? Test, test, test. One, two, one, two. From like the way his mouth moves. Well, that always happens. Oh, God. Here it comes. His mouth never stops moving. Relax, Sax. Relax, Sax. Are we going?

Are we recording? Are you ready to go? Don't lose this. Don't lose this. A plus material. Are we going to do hot sacks? All right, let's go. This is Chappelle at the Punchline. Let's go. Let's go. I'm ready to go. Let your winners ride. Rain Man David Sacks. I'm going all in.

And it's said we open sourced it to the fans, and they've just gone crazy with it. Love you guys. The Queen of Quinoa. I'm going all in. All right, everybody. Welcome to episode 111 of the World's Greatest Podcasts, according to Slate, the podcast that shall not be mentioned by the press, apparently.

No, what do you mean? They just did a profile on us. Well, they did. This is the conundrum. It's so much of a phenomenon that we're the number one business and the number one tech podcast in the world, hands down, that the press has a hard time giving us any oxygen, because they want to hate us.

They want to cover it. You're saying they take the ideas, but not the-- They don't want to cite it. They don't want to cite it. They don't want to cite it. But anyway, shout out to Slate. Yeah, what I thought was interesting was the guy pointed out that we don't want to subject ourselves to independent journalists asking us independent questions.

Therefore, we go direct. And that's kind of the thing nowadays. When everyone says they want to go direct, it's because they don't want to be subject to independent journalists. Well, one might ask themselves why subjects don't want to go direct. Yeah, exactly. You mean don't want to go to journalists.

Yeah, because there's a specific reason why principals, the subject of stories, do not want to have the press interpret what they're saying is because they don't feel they're getting a fair shake. They feel like the world should be interesting. The challenge is that then we avoid independent scrutiny of our points of view and our decisions.

They're constantly writing hit pieces about us. The question is, when we want to present our side of it, do we need to go through their filter or not? Why would you go through their filter when it's always going to be a hit piece? Well, and also-- They have a class hatred of basically of technology entrepreneurs and investors.

Just use sex. By the way, I think-- No, I don't think-- Just use sex. You're right, J-Kal. They don't hate you because you genuflect to their political biases. You see, if-- Hold on. If you do what SPF did, which is basically agree with all of their biases, then yes, they'll treat you better.

That's the deal. That's how it works. And when you say they, you're referring to specific large media outlets, right, Saks? They all think the same way. He's not referring to Fox and Tucker. OK, you can name one. I'll trade you. I'll tell you what. I'll trade you Fox for MSNBC and CNN and The New York Times, The Washington Post, and The Atlantic Magazine, and on and on and on.

You get a lot of mileage out of being able to name Fox. The fact of the matter is-- Megyn Kelly? That's a podcaster. She's independent now. That's true. She is independent. You can name one, I mean, literally one outlet that is not part of this mainstream media. And they all think the same way.

There are very small differences in the way they think. It's all about clicks. It's all about clicks at this point. And it's all about advocacy journalism. Not just about clicks. And advocacy. It's that combination. What you're calling advocacy is bias and activism. It's activism. That's what I'm talking about, activism journalism, yes.

I think Draymond also highlights a really important point, which is he started his podcast. It's become one of the most popular forms of sports media. And he can speak directly without the filtering and classification that's done by the journalist. And it seems to be a really powerful trend. The audience really wants to hear direct and they want to hear unfiltered, raw points of view.

And maybe there's still a role for, I think, the journalism separate from that, which is to then scrutinize and analyze and question and-- It's not journalism. It's just activism. They're just activists. There are also journalists out there, Sax. Actually, well, it depends what the topic is and what the outlet is.

But actually, I would argue that most of these journalists are doing what they're doing for the same reason that we're doing what we're doing, which is they want to have some kind of influence because they don't get paid very much. But the way they have influence is to push a specific political agenda.

I mean, they're activists. They're basically party activists. It has become advocacy journalism. Yes, that's the term I coined for it. It's advocacy journalism. You guys see this brouhaha where Matt Yglesias wrote this article about the Fed and about the debt ceiling? And through this whole multi-hundred word, thousand word tome, he didn't understand the difference between a percentage point and a basis point and then he didn't calculate the interest correctly?

Yeah, I did see that. Wow, so wait a second. You're saying the Fed's raising 25%? Yeah, that's a huge difference between-- My mortgage is going up 25%? Between a principal and an outside analyst, right? Like a principal has a better grasp typically of the topics and the material. But the argument from a journalist-- But he's considered, within the journalist circle, he's considered the conventional wisdom.

I get it. But the argument from a journalist is that by having that direct access, that person is also biased. Because they're an agent, because they're a player on the field, they do have a point of view and they do have a direction they want to take things. So it is a fair commentary that journalists can theoretically play a role, which is they're an off-field analyst and don't necessarily bring bias.

I would argue they're less educated and more biased than we are. That may or may not be true, what the two of you guys are debating, which is a very subjective take. But the thing that is categorical and you can't deny is that there is zero checks and balances when something as simple as the basis point, percentage point difference isn't caught in proofreading, isn't caught by any editor, isn't caught by the people that help them review this.

And so what that says is all kinds of trash must get through because there's no way for the average person on Twitter to police all of this nonsensical content. This one was easy because it was so numerically illiterate that it just stood out. But can you imagine the number of unforced errors journalists make today in their search for clicks that don't get caught out, that may actually tip somebody to think A versus B?

That's, I think, the thing that's kind of undeniable. Right. Yeah. You only need to-- Wasn't the Rolling Stone article? There's a very simple test for this. If you read the journalists writing about a topic you are an expert on, whatever the topic happens to be, you start to understand, OK, well, on that story I'm reading, that they understand about 10% or 20% or 30% of what's going on.

But then when you read stories that you're not involved in, you know, you read a story about Hollywood or, I don't know, pick an industry or a region you're not super aware of, you're like, OK, well, that must be 100% correct. And the truth is, journalists have access to 5 to 20-- Yeah, there's a name for that.

There is a name for it, yeah. It's called the "Gell-Mann Amnesia effect." You just plagiarized Michael Crichton, who came up with that. Yeah, so you-- Yeah. But no, he's exactly right. But I think it's worse than that. It's because now the mistakes aren't being driven just by sloppiness or laziness or just a lack of expertise.

I think it's being driven by an agenda. So just to give you an example on the Slate thing, the Slate article actually wasn't bad. It kind of made us seem, you know, cool. The subheadline was, "A close listen to All-In, the infuriating, fascinating safe space for Silicon Valley's money men." OK.

But the headline changed. So I don't know if you guys noticed this. The headline now is, "Elon Musk's Inner Circle is telling us exactly what it thinks." First of all, like, they're trying to-- It's Elon for clicks. Yeah, it's Elon for-- so they're trying way too hard to, like, describe us in terms of Elon, which, you know, is maybe two episodes out of 110.

But before Inner Circle, the word they used was cronies. And then somebody edited it, because I saw cronies in, like, one of those tweet, you know, summaries. You know, where, like, it does a capsule or whatever? Yeah, yeah, yeah. And those get frozen in time. So, you know, they were trying to bash us even harder.

And then somebody took another look at it and toned it down. Well, here's what happened. I'll tell you what happens in the editorial process. Whoever writes the article, the article gets submitted. Maybe it gets edited, proofread, whatever. Maybe it doesn't even in some publications. They don't have the time for it, because they're in a race.

Then they pick-- there's somebody who's really good at social media. They pick six or seven headlines. They A/B test them. And they even have software for this, where they will run a test. Sometimes they'll do a paid test. They put $5 in ads on social media. Whichever one performs the best, that's the one they go with.
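A minimal sketch of the kind of headline test described above; the headline variants, impression counts, and click numbers are invented for illustration.

```python
# Hypothetical sketch of picking a headline by click-through rate (CTR).
# Each value is (impressions, clicks) from a small paid test; numbers are made up.
candidates = {
    "Elon Musk's Inner Circle Is Telling Us Exactly What It Thinks": (5000, 210),
    "A Close Listen to the Safe Space for Silicon Valley's Money Men": (5000, 140),
    "Inside the All-In Podcast": (5000, 95),
}

def ctr(impressions: int, clicks: int) -> float:
    """Click-through rate for one headline variant."""
    return clicks / impressions

# Run each variant against a small paid audience, then ship the winner.
best = max(candidates, key=lambda h: ctr(*candidates[h]))
print(best)
```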

So it's even more cynical. And because people who read the headlines, sometimes they don't read the story, right? Obviously, most people just see the headline. They interpret that as a story. That's why I told you, when they did that New Republic piece on you with that horrific monstrosity of an illustration, don't worry about it.

People just read the headline. They know you're important. Nobody reads the story anyway. But it wasn't a bad article, actually. It was well-written, actually. I was in shock. I was like, who is this writer that actually took the time to write some prose that was actually decent? Yeah, he had listened to a lot of episodes, clearly.

That was a really good moment, actually. That was great advice, because you gave it to him. And you gave it to me, because both of us had these things. And Jason said the same thing. Just look at the picture. And if you're OK with the picture, just move on.

And I thought, this can't be true. And it turned out to mostly be true. Yeah, but my picture was terrible. Yeah, but it's close to reality. So I mean, you've got to-- It was really-- Oh, ha, geez. I mean, the person who did the worst there was Peter Thiel.

Poor Peter. Yeah, but that just shows how ridiculously biased it is, right? My picture wasn't so bad. My picture wasn't so bad. Hugh Grant. Elon, let's pull that up one more time here. Elon looks like Hugh Grant. I just kind of-- Yeah, he does. Not bad. Kind of looks like Hugh Grant in "Notting Hill." I knew that article was going to be fine when the first item they presented as evidence of me doing something wrong was basically helping to oust Chesa Boudin, which was something that was supported by like 70% of San Francisco, which is a 90% Democratic city.

So not exactly evidence of some out-of-control right-wing movement. Look at the headline. "The Quiet Political Rise of David Sacks, Silicon Valley's Prophet of Urban Doom." I'm just letting you know, people don't get past the sixth word and the image. 99% of people are like, oh my god, congrats on the New Republic article.

It could have literally been Lorem-- what do they call them? Lorem Ipsums? You know, like it could have just been filler words from their second graph down and nobody would know. Yeah. But now apparently if you notice that San Francisco streets look like The Walking Dead, then apparently you're a prophet of urban doom.

I mean, these people are so out of touch. I mean, they can't even acknowledge what people can see with their own eyes. That's the bias that's gotten crazy. And I don't know if you guys saw this really horrible, dystopian video of an art gallery owner who's been dealing with owning a storefront in San Francisco, which is challenging, and having to clean up feces and trash and whatever every day.

And I guess the guy snapped, and he's hosing down a homeless person who refuses to leave the front of his store. Oh, I saw that. I saw that. The humanity in this is just insane. Like, really, like you're hosing a human being down-- It's terrible. --who is obviously not living a great life and is in dire-- I can feel for both of them, J-Kal.

I can feel for both of them. I agree that it's not good to hose a human being down. On the other hand, think about the sense of frustration that store owner has, because he's watching his business go in the toilet, because he's got homeless people living in front of him.

So they're both, like, being mistreated. The homeless person's being mistreated. Say more about the store owner. Say more. The homeless person's being mistreated. The store owner's being mistreated by the city of San Francisco. Yeah. Say more about the store owner. That person's not in a privileged position. That person probably-- the store owner.

He's probably fighting to stay in business. I'm just saying-- I'm not saying that's right, but-- No, no. I'm just-- I'm laying the rope. No. I'm just-- --I'm just saying-- --justified in any way. What you're trying to do is, oh, my god, look at this homeless person being horribly oppressed.

No. That store owner is a victim, too. Yeah, there's no doubt. It's horrible to run a business in there. I mean, what is that person supposed to do? No, wait. This is symbolic of the breaking down of basic society. Like, both of these people are obviously like-- it's just a horrible moment to even witness.

It's like, ugh. It's like something-- Jason, do you have equal empathy for the store owner and the homeless person, or no? Under no circumstances should you hose a person down in the face who is homeless. Like, it's just horrific to watch. It's just inhumane. This is a human being.

Now, but as a person who owns a store, yeah, my dad grew up in the local business. If people were abusing the store, and you're trying to make a living, and you've got to clean up whatever, excrement every day, which is horrific. Yes. And this thing is dystopian. In that moment, look, in that moment, the empathy is not equal.

I think you have more empathy, obviously, for the person on the receiving end of that hose. But in general, our society has tons of empathy for homeless people. We spend billions of dollars trying to solve that problem. You never hear a thing about the store owners who are going out of business.

So on a societal level, not in that moment, but in general, the lack of empathy is for these middle class store owners, who may not even be middle class, working class, who are struggling to stay afloat. And you look at something like, what is it, like a quarter or a third of the storefronts in San Francisco are now vacant?

I'm just shocked this person is running-- the shocking thing is, like, this person is running an art gallery storefront in San Francisco. Like, why would you even bother? Why would you bother to have a storefront in San Francisco? I mean, everybody's left. It's just-- What do you mean, why do you bother?

If you've opened a store, what are you supposed to do, start to code all of a sudden? Well, no. I mean, you would shut it down at some point and find an exit. And do what? Just like all businesses have. Yeah, but a store has large fixed costs, right?

So what are they supposed to do? Maybe an investment he made 10 years ago. Exactly. At some point, you have to shut down your store in San Francisco the second you can get out of the lease. The solution to everything, J. Cal, isn't go to coding school online and then, you know, end up working for Google.

Oh, I didn't say it was. Moving to another city is a possibility. So true. A lot of folks in Silicon Valley, I think, in this weirdly fucked up way, do believe the solution to everything is learn to code. Or become an Uber driver. Or become a masseuse. Get a gig job.

Get a gig job. The guy spent years building his retail business. I mean, the thing is-- And then a homeless person camps in front. And he calls the police. The police don't come and move the homeless person. The homeless person stays there. He asks nicely to move. Customers are uncomfortable going in the store as a result.

Yeah, I stopped going to certain stores in my neighborhood because of homeless tents being literally fixated in front of the store. And I'd go to the store down the road to get my groceries or whatever. I mean, it's not a kind of uncommon situation for a lot of these small business owners.

They don't own the real estate. They're paying rent. They've got high labor costs. Everything's inflating. Generally, city population's declining. It's a brutal situation all around. I think if everybody learns to code or drives an Uber, the problem is that in the absence of things like local stores and small businesses, you hollow out communities.

You have these random detached places where you kind of live. And then you sit in your house, which becomes a prison while you order food from an app every day. I don't think that is the society that people want. So I don't know. I kind of want small businesses to exist.

And I think that the homeless person should be taken care of. But the small business person should have the best chance of trying to be successful because it's hard enough as it is. The mortality rate of the small business owner is already 90%. It's impossible in San Francisco, let's just be honest.

So stop genuflecting, J. Cal. I'm not genuflecting. Listen-- Yeah, you are because here's-- Wait, how am I genuflecting? I'm saying the guy-- I'm just shocked that the guy even has a storefront. I would have left a long time ago. You're showing a tweet that's a moment in time. And you're not showing the 10 steps that led up to it.

Oh, 1,000 steps. The five times he called the police to do something about it. I framed it as dystopian from the get. I'm not genuflecting. I'm not genuflecting. It's addiction. It's addiction. You say you know this, and it's mental illness. Shellenberger's done the work.

It's like he said, 99% of the people he talks to, it's either mental illness or addiction. But we keep using this word homeless to describe the problem. But the issue here is not the lack of housing, although that's a separate problem in California. But it's basically the lack of treatment.

Totally. So we should be calling them treatmentless. And mandates around this, because-- And enforcement. You cannot have-- you can't have a super drug be available for a nominal price and give people a bunch of money to come here and take it and not enforce it. You have to draw the line at fentanyl.

I'm sorry. Fentanyl is a super drug. There are a few alternatives: mandated rehab, mandated mental health treatment, or jail. And if you're not breaking the law, you don't have mental illness, and you don't have drug addiction, then you provide housing services. Those are the four paths of outcome here, the paths to success. And if all four of those paths were both mandated and available in abundance, this could be a tractable problem.

Unfortunately, the mandate-- I mean, you guys remember that Kevin Bacon movie, where Kevin Bacon was locked up in a mental institution, but he wasn't like-- he wasn't mentally ill? It's a famous story. Footloose? What's that? Footloose. No. No, it's a famous story. You guys-- someone's probably going to call me an idiot for messing this whole thing up.

But I think there's a story where mandated mental health services, like locking people up to take care of them when they have mental health issues like this, became kind of inhumane. And a lot of the institutions were shut down, and a lot of the laws were overturned. And there are many of these cases that happened, where they came across as torturous to what happened to people that weren't mentally ill.

And so the idea was, let's just abandon the entire project. Like Cuckoo's Nest? That's another good one. Yeah. Well, that's another one, right? And it's unfortunate, but I think that there's some-- we talk a lot about nuance and gray areas, but there's certainly some solution here that isn't black or white.

It's not about not having mandated mental health services, and it's not about locking everyone up that has some slight problem. But there's some solution here that needs to be crafted, where you don't let people suffer, and you don't let people suffer both as the victim on the street, but also the victim in the community.

You're talking about a 5150, I think, like when people are held because they're a danger to themselves or others kind of thing? Right, but J-Cal, let's think about the power of language here. If we referred to these people as untreated persons instead of homeless persons, and the coverage 24/7 in the media was, this is an untreated person, the whole policy prescription would be completely different.

We'd realize there's a shortage of treatment. We'd realize there's a shortage of remedies related to getting people in treatment, as opposed to building housing. But why-- And laws that mandate it, that don't enable it. Because if you don't mandate it, then you enable the free reign and the free living on the street and the open drug markets and all this sort of stuff.

There's a really easy test for this. If it was yourself, and you were addicted, or if it was a loved one, one of your immediate family members, would you want yourself or somebody else to be picked up off the street and held on a 5150 or whatever code involuntarily against their will because they were a danger?

Would you want them to be allowed to remain on the street? Would you want yourself if you were in that dire straits? And the answer, of course, is you would want somebody to intervene. But what's the liberal policy perspective on this, J. Cal? So let me ask you as our diehard liberal on the show-- No, I'm not a diehard liberal.

No. No, he's an independent and only votes for Democrats. Please, get it right. 75% of the time I vote a Democrat, 25% Republican. Please, get it right. Independent votes for Democrats. 25% Republicans. Is it not that your individual liberties are infringed upon if you were to be, quote, "picked up and put away"?

My position on it is if you're not thinking straight and you're high on fentanyl, you're not thinking for yourself. And you could lose the liberty for a small period of time-- 72 hours, a week-- especially if you're a danger to somebody, yourself or other people. And in this case, if you're on fentanyl, if you're on meth, you're a danger to society.

I think if more people had that point of view and had that debate, as Sacks is saying, in a more open way, you could get to some path to resolution on-- Just not in San Francisco. It's not how it happened. So you guys know this. We won't say who it is, but someone in my family has some pretty severe mental health issues.

And the problem is, because they're an adult, you can't get them to get any form of treatment whatsoever. Right, right. You only have the nuclear option. And the nuclear option is you basically take that person to court and try to seize their power of attorney, which is essentially saying that-- Individual liberties are gone.

And by the way, it is so unbelievably restrictive what happens if you lose that power of attorney and somebody else has it over you. It's just a huge burden that the legal system makes extremely difficult. And the problem-- Well, some of the law is a backstop. If the person's committing something illegal, like camping out or doing fentanyl, meth, whatever, you can use the law as the backstop against personal liberty.

You can't use the law. All that person can do is really get arrested. Even that is not a high enough bar to actually get power of attorney over somebody. The other thing that I just wanted you guys to know-- I think you know this, but just a little historical context-- is a lot of this crisis in mental health started because Reagan defunded all the psychiatric hospitals.

He emptied them in California. And that compounded, because for whatever reason, his ideology was that these things should be treated in a different way. And when he got to the presidency, one of the things that he did was he repealed the Mental Health-- I think it's called the Mental Health Systems Act, MHSA, which completely broke down some pretty landmark legislation on mental health.

And I don't think we've ever really recovered. We're now 42 years onward from 1980, or 43 years onward. But just something for you guys to know that that's-- Reagan had a lot of positives, but that's one definitely negative check in my book against his legacy is his stance on mental health in general and what he did to defund mental health.

Well, let me make two points there. So I'm not defending that specific decision. There were a bunch of scandals in the 1970s, and epitomized by the movie One Flew Over the Cuckoo's Nest with Jack Nicholson, about the conditions in these mental health homes. And that did create a groundswell to change laws around that.

But I think this idea that somehow Reagan is to blame when he hasn't been in office for 50 years, as opposed to the politicians who've been in office for the last 20 years, I just think it's letting them off the hook. I mean, Gavin Newsom, 10, 15 years ago when he was mayor of San Francisco, declared that he would end homelessness within 10 years.

He just made another declaration like that as governor. So I just feel like-- I'm not saying it's Reagan's fault. I'm just saying-- Well, I just think it's letting these guys off the hook. --it's an interesting historical moment. I think it's letting the politicians off the hook. Society needs to start thinking about changing priorities.

We didn't have this problem of massive numbers of people living on the streets 10, 15 years ago. It was a much smaller problem. And I think a lot of it has to do with fentanyl. The power of these drugs has increased massively. There's other things going on here. So in any event, I mean, you can question what Reagan did in light of current conditions.

But I think this problem really started in the last 10, 15 years. Like in an order of magnitude bigger way. These are super drugs. Until people realize these are a different class of drugs, and they start treating them as such, it's going to just get worse. There's no path.

Oh, as far as I know, Reagan didn't hand out to these addicts $800 a week to feed their addiction so they could live on the streets of San Francisco. That is the current policy of the city. I hear you. It's a terrible policy. All I just wanted to just provide was just that color that we had a system of funding for the mental health infrastructure, particularly local mental health infrastructure.

And we took that back. And then we never came forward. And all I was saying is I'm just telling you where that decision was made. I think that's part of the solution here is, yeah, we're going to have to basically build up shelters. We're going to have to build up homes.

And to support your point, the problem now, for example, is Gavin Newsom says a lot of these things. And now he's gone from a massive surplus to a $25 billion deficit overnight, which we talked about even a year ago because that was just the law of numbers catching up with the state of California.

And he's not in a position now to do any of this stuff. So this homeless problem may get worse. Well, they did appropriate-- I forget the number. It's like $10 billion or something out of that huge budget they had to solve the problem of homelessness. I would just argue they're not tackling it in the right way.

Because what happened is there's a giant special interest that formed around this problem, which is the building industry, who gets these contracts to build the, quote, "affordable housing" or the-- Homeless industrial complex is insane. The homeless industrial complex. And so they end up building 10 units at a time on Venice Beach, like the most expensive land you could possibly build because you get these contracts from the government.

So there's now a giant special interest in lobby that's formed around this. If you really want to solve the problem, you wouldn't be building housing on Venice Beach. You'd be going to cheap land just outside the city. Totally. And you'd be building scale shelters. I mean, shelters that can house 10,000 people, not 10.

And you'd be having treatment services-- But they also have to get treatment. Yes. You have to get treatment. But with treatment built into them, right? You'd be solving this problem at scale. And that's not what they're doing. By the way, do you guys want to hear this week in Grift?

Sure. We're all in. That's a great example of Grift. I read something today in Bloomberg that was unbelievable. There's about $2 trillion of debt owned by the developing world that has been classified by a nonprofit, the Nature Conservancy in this case, as eligible for what they called nature swaps.

So this is $2 trillion of the umpteen trillions of debt that's about to get defaulted on by countries like Belize, Ecuador, Sri Lanka, Seychelles, you name it. And what happens now are the big bulge bracket, Wall Street banks and the Nature Conservancy, goes to these countries and says, listen, you have a billion dollar tranche of debt that's about to go upside down.

And you're going to be in default with the IMF. We'll let you off the hook. And we will negotiate with those bondholders to give them $0.50 on the dollar. But in return, you have to promise to take some of that savings and protect the rainforest or protect a coral reef or protect some mangrove trees.

All sounds good. Except then what these folks do is they take that repackaged debt. They call it ESG. They mark it back up. And then they sell it to folks like BlackRock who have decided that they must own this in the portfolio. So it literally just goes from one sleeve of BlackRock, which is now marked toxic emerging market debt.

And then it gets into someone's 401(k) as ESG debt. Isn't that unbelievable? So you could virtue signal and buy some ESG to make yourself feel good, yeah. $2 trillion of this paper. Here's all you have to know about ESG is that Exxon is like the number seven top-ranked company according to ESG.

And Tesla is even on the list. Disastrous. How crazy is that? It's a complete scam. Yeah. All of those-- we've said this many times, but each of those letters individually means so much and should be worth a lot to a lot of people. But when you stick them together, it creates this toxic soup where you can just hide the cheese.

Yeah, I mean, governance is important in companies. Of course, the environment is important. Social change is important. I mean, but why are these things grouped together in this? It just perverts the whole thing. It's an industry, J. Cal. It's an industry of consultants. It's a grift. Absolutely. All right, speaking of grifts, Microsoft is going to put $10 billion or something into ChatGPT.

Degenerate AI, as I'm calling it now, is the hottest thing in Silicon Valley. The technology is incredible. I mean, you can question the business model, maybe, but the technology is pretty-- Well, I mean, yeah. So what I'd say is $29 billion for a company that's losing $1 billion in Azure credits a year.

That's one way to look at it. That's also a naive way to look at a lot of other businesses that ended up being worth a lot down the road. I mean-- Sure. You can model out the future of a business like this and create a lot of really compelling big outcomes.

Potentially, yeah. So Microsoft is close to investing $10 billion in OpenAI in a very convoluted transaction that people are trying to understand. It turns out that they might wind up owning 59% of OpenAI, but get 75% of the cash and profits back over time. 49%. 49%, yeah, of OpenAI.

But they would get paid back the $10 billion over some amount of time. And this obviously includes Azure credits and ChatGPT. As everybody knows, this incredible demonstration of what AI can do in terms of text-based creation of content and answering queries has taken the net by storm.
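On the reported terms, the payback math works roughly like the sketch below; the annual profit figure is purely hypothetical, and the deal structure itself is only as reported, not confirmed.

```python
# Hypothetical illustration of the reported structure: Microsoft takes 75% of
# profits until its $10B is recouped, then holds a 49% stake. The profit number
# is invented for illustration only.
investment = 10_000_000_000
annual_profit = 1_000_000_000  # assumed, purely illustrative

recouped, years = 0, 0
while recouped < investment:
    recouped += 0.75 * annual_profit  # 75% profit share during the payback period
    years += 1

print(f"Payback takes ~{years} years at this profit level; "
      f"afterwards Microsoft's share reverts to 49%.")
```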

People are really inspired by it. Sax, do you think that this is a defensible, real technology? Or do you think this is like a crazy hype cycle? Well, it's definitely the next VC hype cycle. Everyone's kind of glomming onto this, because VC really right now needs a savior. Just look at the public markets, everything we're investing in, it's in the toilets.

So we all really want to believe that this is going to be the next wave. And just because something is a VC hype cycle doesn't mean that it's not true. So as I think one of our friends pointed out, mobile turned out to be very real. I think cloud turned out to be, I'd say, very real.

Social was sort of real in the sense that it did lead to a few big winners. On the other hand, web 3 and crypto was a hype cycle and it's turned into a big bust. VR falls into the hype cycle. I think VR is probably a hype cycle so far.

No one can even explain what web 3 is. In terms of AI, I think that if I had to guess, I would say the hype is real in terms of its technological potential. However, I'm not sure about how much potential there is yet for VCs to participate. Because right now, it seems like this is something that's going to be done by really big companies.

So OpenAI is basically a-- it looks like kind of a Microsoft proxy. You've got Google, I'm sure, will develop it through their DeepMind asset. I'm sure Facebook is going to do something huge in AI. So what I don't know is, is this really a platform that startups are going to be able to benefit from?

I will say that some of the companies we've invested in are starting to use these tools. So I guess where I am is I think the technology is actually exciting. I wouldn't go overboard on the valuations. I wouldn't buy into that level of the hype. But you think there could be hundreds of companies built around an API for something like ChatGPT, DALL-E-- Maybe, yeah.

I don't think startups are going to be able to create the AI themselves. But they might be able to benefit from the APIs. Maybe that's the thing that has to be proven out. Freeberg. There's a lot of really fantastic machine learning services available through cloud vendors today. So Azure has been one of these kind of vendors.

And obviously, OpenAI is building tools a little bit further down on the stack. But there's a lot of tooling that can be used for specific vertical applications. Obviously, the acquisition of InstaDeep by BioNTech is a really solid example. And most of the big dollars that are flowing in biotech right now are flowing into machine learning applications where there's some vertical application of machine learning tooling and techniques around some specific problem set.

And the problem set of mimicking human communication and doing generative media is a consumer application set that has a whole bunch of really interesting product opportunities. But let's not kind of be blind to the fact that nearly every other industry and nearly every other vertical is being transformed today.

And there's active progress being made in funding and getting liquidity on companies and progress with actual products being driven by machine learning systems. There's a lot of great examples of this. So the fundamental capabilities of large data sets and then using these kind of learning techniques in software and statistical models to make kind of predictions and drive businesses forward in a way that they're not able to with just human knowledge and human capability alone is really real.

And it's here today. And so I think let's not get caught up in the fact that there's this really interesting consumer market hype cycle going on, and miss that these tools are now being validated and generating real value across many other verticals and segments. Chamath, when you look at this Microsoft OpenAI deal, and you see something that's this convoluted, hard to understand, what does that signal to you as a capital allocator and company builder?

I would put deals into two categories. One is easy and straightforward. And then two is, you know, too cute by half, or the too-hard bucket. This is clearly in that second category, but it doesn't mean that-- - Why is it in that category? - Well, it doesn't mean that it won't work.

In our group chat with the rest of the guys, one person said, there's a lot of complex law when you go from a nonprofit to a for-profit. There's lots of complexity in deal construction. The original investors have certain things that they want to see. There may or may not be, you know, legal issues at play here that you encapsulated well in the last episode.

I think there's a lot of stuff we don't know. So I think it's important to just like, give those folks the benefit of the doubt. But yeah, if you're asking me, it's in the too-hard bucket for me to really take seriously. Now, that being said, it's not like I got shown the deal.

So I can't comment. Here's what I will say. The first part of what Sacks said, I think, is really important for entrepreneurs to internalize, which is: where can we make money? The reality is that, well, let me just make a prediction. I think that Google will open source their models because the most important thing that Google can do is reinforce the value of search.

And the best way to do that is to scorch the earth with these models, which is to make them widely available and as free as possible. That will cause Microsoft to have to catch up. And that will cause Facebook to have to really look in the mirror and decide whether they're going to cap the betting that they've made on AR/VR and reallocate very aggressively to AI.

I mentioned this when I did the Lex Fridman podcast, but that should be what Facebook does. And the reason is, if Facebook and Google and Microsoft have roughly the same capability and the same model, there's an element of machine learning that I think is very important, which is called reinforcement learning.

And specifically it's reinforcement learning from human feedback, right? So these RLHF pipelines, these are the things that will make your stuff unique. So if you're a startup, you can build a reinforcement learning pipeline. How? You build a product that captures a bunch of usage. We talked about this before.

That data set is unique to you as a company. You can feed that into these models, get back better answers, you can make money from it. Facebook has an enormous amount of reinforcement learning inside of Facebook. Every click, every comment, every like, every share. Twitter has that data set.

Google inside of Gmail and search. Microsoft inside of Minecraft and Hotmail. So my point is, David's right. The huge companies, I think, will create the substrates. And I think they'll be forced to scorch the earth and give it away for free. And then on top of that is where you can make money.

And I would just encourage entrepreneurs to think, where is my edge in creating a data set that I can use for reinforcement learning? That I think is interesting. That's kind of saying, I buy the ingredients from the supermarket, but then I can still construct a dish that's unique. And the salt is there, the pepper is there, but how I use that will determine whether you like the thing or not.
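A minimal sketch of the kind of feedback pipeline Chamath describes: log what users ask, what the model answered, and how they rated it, so the accumulated data set becomes the startup's edge. The file name, field names, and functions here are hypothetical, not any particular vendor's API.

```python
# Minimal sketch of an RLHF-style feedback pipeline: a product logs
# (prompt, response, user signal) so the data set itself becomes the edge.
import json, time

FEEDBACK_LOG = "feedback.jsonl"

def log_interaction(prompt: str, response: str, user_rating: int) -> None:
    """Append one usage event; over time this is the proprietary feedback data set."""
    event = {"ts": time.time(), "prompt": prompt,
             "response": response, "rating": user_rating}
    with open(FEEDBACK_LOG, "a") as f:
        f.write(json.dumps(event) + "\n")

def preference_pairs(path: str = FEEDBACK_LOG):
    """Group events by prompt and emit (prompt, better, worse) response pairs,
    the raw material a reward model or fine-tune could be trained on."""
    by_prompt = {}
    with open(path) as f:
        for line in f:
            e = json.loads(line)
            by_prompt.setdefault(e["prompt"], []).append(e)
    for prompt, events in by_prompt.items():
        ranked = sorted(events, key=lambda e: e["rating"], reverse=True)
        if len(ranked) >= 2 and ranked[0]["rating"] > ranked[-1]["rating"]:
            yield prompt, ranked[0]["response"], ranked[-1]["response"]
```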

And I think that that is the way that I think we need to start thinking about it. - Interestingly, as we've all pointed out here, OpenAI was started as a nonprofit. The stated philosophy was this technology is too powerful for any company to own. Therefore, we're going to make it open source.

And then somewhere in the last couple of years, they said, well, you know what, actually, it's too powerful for it to be out there in the public. We need to make this a private company and we need to get $10 billion from Microsoft. That is the disconnect I am trying to understand.

That's the most interesting part of the story, Jason, I think, if you go back to 2014, is when Google bought DeepMind. And immediately everyone started reacting to a company as powerful as Google, having a toolkit and a team as powerful as DeepMind within them. And that sort of power should not sit in anyone's hands.

I heard people that I'm close with, who are close to the organization and the company, comment that they thought the scariest, most threatening, biggest threat to humanity was Google's control of DeepMind. And that was a naive kind of point of view, but it was one that was closely and deeply held by a lot of people.

So Reid Hoffman, Peter Thiel, Elon Musk, a lot of these guys funded the original kind of OpenAI business in 2015. And here's the link. So I'm putting it out here. You guys can pull up the original blog post. - Do all those people who donated get stock in?

- So what happened was it was all in a nonprofit. And then the nonprofit owned stock in a commercial business now. But your point is interesting because at the beginning, the idea was instead of having Google own all of this, we'll make it all available.

And here's the statement from the original blog post in 2015: OpenAI is a nonprofit AI research company. Our goal is to advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return. Since our research is free from financial obligations, we can better focus on a positive human impact.

And they kind of went on and the whole thing about Sam, Greg, Elon, Reid, Jessica, Peter Thiel, AWS, YC are all donating to support OpenAI, including donations and commitments of over a billion dollars, although we expect that to only be a tiny fraction of what we will spend in the next few years, which is a really interesting kind of, if you look back, historical perspective on how this thing all started seven years ago and how quickly it's evolved, as you point out, into the necessity to have a real commercial alignment to drive this thing forward without seeing any of these models open sourced.

And during that same period of time, we've seen Google share AlphaFold and share a number of other predictive models and toolkits and make them publicly available and put them in Google's cloud. And so there's both kind of tooling and models and outputs of those models that Google has open sourced and made freely available.

And meanwhile, OpenAI has kind of diverged into this deeply profit-seeking kind of enterprise model. And when you invest in OpenAI, in the round that they did before, you could generate a financial return capped at 100x, which is still a pretty amazing financial return. You put a billion dollars in, you can make a hundred billion dollars.

That's funding a real commercial endeavor at that point. - Well, and then to- - It is the most striking question about this whole thing, about what's going on in AI. And it's one that Elon's talked about publicly and others have kind of sat on one side or the other, which is that AI offers a glimpse into one of the biggest and most kind of existential threats to humanity.

And the question we're all gonna be tackling and the battle that's gonna be happening politically and regulatory wise, and perhaps even between nations in the years to come, is who owns the AI, who owns the models, what can they do with it? And what are we legally gonna be allowed to do with it?

And this is a really important part of that story, yeah. - To build on what you're saying, I just put in PyTorch, people don't know that's another framework, P-Y-T-O-R-C-H. This was largely built inside of Facebook. And then Facebook said, "Hey, we want to democratize machine learning." And they made, and I think they put a bunch of executives, they may have even funded those executives to go work on this open source project.

So they have a huge stake in this and they went very open source with it. And then TensorFlow, which you have an investment in Chamath, TensorFlow was inside of-- - No, I don't have an investment in TensorFlow. We built-- - No, TensorFlow, the public source came out of Google and then you invested in another company.

- But we were building Silicon for machine learning. That's different. - Right. But it's based on TensorFlow. - No, no, no, no. The founder of this company was the founder of TensorFlow. - Oh, got it, okay. - Oh, sorry, not of TensorFlow, pardon me, of TPU, which was Google's internal Silicon that they built to accelerate TensorFlow.

- Right. - If that makes sense. - And so that's the, I don't mean to be cynical about the whole project or not, it's just the confounding part of this and what is happening here. It reminds me, I don't know if you remember this, but Chamath and René allowed-- - The biggest opportunity here is for Facebook.

I mean, they need to get in this conversation, ASAP. I mean, to think that, like, look, PyTorch was like a pretty seminal piece of technology that a lot of folks in AI and machine learning were using for a long time. TensorFlow before that. And what's so funny about Google and Facebook is they're a little bit kind of like, they're not really making that much progress.

I mean, Facebook released this kind of like rando version of AlphaFold recently. It's not that good. I think these companies really need to get these products in the wild as soon as possible. It cannot be the case that-- - Is that, yeah. - You have to email people and get on some list, I mean, this is Google and Facebook, guys, come on.

Get going. - This is the, I think the big innovation of OpenAI, SACS, to bring you in the conversation. They actually made an interface and let the public play with it to the tune of $3 million a day in cloud credits or costs. Which, you know-- - By the way, just on that, my son was telling me, he's like, "Hey, dad, do you want me to tell you when the best time to use ChatGPT is?" And I'm like, "Huh?" He's like, "Yeah, my friends and I, we've been using it so much, we know now when we can actually get resources." - Oh, wow.

- And it's such an interesting thing where a 13-year-old kid knows when it's so compute-intensive that it's unusable and when to come back and use it. It's incredible. - When's the last time, Sacks, that technology became this mainstream and captured people's imagination this broadly, the way ChatGPT has? - It's been a while.

I don't know, maybe the iPhone or something. Yeah, look, it's powerful. There's no question it's powerful. I mean, I'm of two minds about it because whenever something is the hype cycle, I just reflexively want to be skeptical of it. But on the other hand, we have made a few investments in this area.

And I mean, I think it is powerful and it's going to be an enabler of some really cool things to come. There's no question about it. - I have two pieces of more insider information. One, I have a ChatGPT iOS app on my phone. One of the nice folks at OpenAI included me in the TestFlight.

And it's the simplest interface you've ever seen: basically you type in your question, but it keeps your history. And then you can search your history. So it looks, Sacks, like you're in iMessage, basically. And it has your threads. And so I asked it, "Hey, what are the best restaurants in Yountville?" A town near Napa.

And then I said, "Which one has the best duck?" And it literally like gave me a great answer. And then I thought, "Wait a second, why is this not using a Siri or Alexa-like interface?" And then why isn't it-- oh, here's a video of it. And I gave the video to Nick.

- By the way, Jason, this, what you're doing right now is you're creating a human feedback reinforcement learning pipeline for ChatGPT. So just the fact that you asked that question, and over time, if ChatGPT has access to your GPS information and then knows that you went to restaurant A versus B, it can intuit.

And it may actually prompt you to ask, "Hey, Jason, we noticed you were in the area. Did you go to Bottega? If you did, how would you rate it one through five?" That reinforcement learning now allows the next person that asks, "What are the top five restaurants?" to say, "Well, over a thousand people that have asked this question, here's actually the best answer," versus a generic rank of the open web, which is what the first data set is.
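A toy sketch of the reranking idea just described: blend a generic web ordering with ratings gathered from user feedback. All names and numbers below are invented for illustration.

```python
# Sketch of reranking a generic web ordering with aggregated user ratings
# collected as feedback. Ratings and the weighting are invented.
base_rank = ["The French Laundry", "Bottega", "Bouchon"]  # generic web ordering

# Hypothetical 1-5 ratings gathered from users who were prompted after visiting.
feedback = {
    "Bouchon": [5, 5, 4, 5],
    "Bottega": [4, 3, 4],
    "The French Laundry": [5, 4],
}

def score(restaurant: str) -> float:
    ratings = feedback.get(restaurant, [])
    avg = sum(ratings) / len(ratings) if ratings else 0.0
    prior = len(base_rank) - base_rank.index(restaurant)  # small boost from web rank
    return avg + 0.1 * prior

reranked = sorted(base_rank, key=score, reverse=True)
print(reranked)  # feedback can promote Bouchon above the generic ordering
```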

That's what's so interesting about this. So this is why, if you're a company that already owns the eyeballs, you have to be running to get this stuff out there. - Well, and then this answer, it cited Yelp. Well, this is the first time I've actually seen a ChatGPT citation, and this is, I think, a major legal breakthrough.

It didn't put a link in, but if it's gonna use Yelp's data, I don't know if they have permission from Yelp, but it's quoting Yelp here, it should link to The French Laundry, Bottega, and Bouchon. Bouchon actually has the best duck confit, for the record. And I did have that duck, so I asked this afterwards to see, you know, in a scenario like this.

But it could also, if I was talking to it, I could say, "Hey, which one has availability this afternoon or tomorrow for dinner?" And make the phone call for me, like Google Assistant does, or any number of next tasks. - I was thinking about-- - This was an incredibly powerful display in a 1.0 product.

I was thinking about what you said last week, and I thought back to the music industry in the world of Napster. And what happened was, there were a lot of musicians, I think Metallica being the most famous one, famously suing Napster, because it was like, "Hey, listen, you're allowing people to take my content, which they would otherwise pay for."

"There's economic damage that I can measure." That legal argument was meaningful enough that ultimately Napster was shut down. Now, there were other versions of that, that folks created, including us at Winamp. We created a headless version of that. But if you translate that problem set here, is there a claim that Yelp can make in this example, that they're losing money?

That if you were going through Google, or if you were going through their app, there's the sponsored link revenue and the advertising revenue that they would have got that they wouldn't get from here. Now, that doesn't mean that ChatGPT can't figure that out, but it's those kinds of problems that are gonna be a little thorny in these next few years that have to really get figured out.

This is a very-- - If you were a human reading every review on Yelp about duck, then you could write a blog post in which you say, "Many reviewers on Yelp "say that Bouchon is the best duck." So the question is, is GPT held to that standard? - Yeah, exactly.

- Or is it, or something different? And is linking to it enough? - This is the question that I'm asking. I don't know. It should be, I'll argue it should be, because if you look at the four-part test for fair use, which I had to go through because blogging had the same issue.

We would write a blog post and we would mention Walt Mossberg's review of a product and somebody else's. And then people would say, "Oh, I don't need to read Walt Mossberg in the Wall Street Journal, I don't need the subscription." And we'd say, "Well, we're doing an original work. We're comparing two or three different reviews." You know, a human is comparing two or three different reviews and we're adding something to it.

It's not interfering with Walt Mossberg's ability to get subscribers in the Wall Street Journal. But the effect on the potential market is one of the four tests. And just reading from Stanford's quote on fair use, "Another important fair use factor is whether your use deprives the copyright owner of income or undermines a new or potential market for the copyrighted work.

"Depriving a copyright owner of income "is very likely to trigger a lawsuit." This is true even if you are not competing directly with the original work. And we'll put the link to Stanford here. This is the key issue. And I would not use Yelp. In this example, I would not open the Yelp app.

Yelp would get no commerce and Yelp would lose this. So, ChatGPT and all these services must use citations of where they got the original work. They must link to them and they must get permission. That's where this is all gonna shake out. And I believe that's-- - Using permission?

Well, forget about permission. I mean, you can't get a big enough data set if you have to get permission in advance, right? You have to go out and negotiate. - It's gonna be the large data sets. Quora, Yelp, the App Store reviews, Amazon's reviews. So, there are large corpuses of data that you would need.

Like Craigslist has famously never allowed anybody to scrape Craigslist. The amount of data inside Craigslist, just one example of a data set, would be extraordinary to build ChatGPT on. ChatGPT is not allowed to because, as you brought up robots.txt last week, there's gonna need to be an AI.txt. Are you allowed to use my data set in AI?
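The AI.txt Jason speculates about is not an existing standard; below is a purely hypothetical sketch of what such a robots.txt-style policy file could look like, and how a crawler might honor it.

```python
# Hypothetical sketch only: "ai.txt" is not a real standard. The format and
# field names below are invented to illustrate a robots.txt-style policy file.
SAMPLE_AI_TXT = """
User-agent: *
Allow-training: no
Allow-training-with-attribution: yes
Attribution-link-required: yes
"""

def parse_ai_txt(text: str) -> dict:
    """Parse 'key: value' lines into a policy dict (hypothetical format)."""
    policy = {}
    for line in text.strip().splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            policy[key.strip().lower()] = value.strip().lower()
    return policy

policy = parse_ai_txt(SAMPLE_AI_TXT)
may_train = (policy.get("allow-training") == "yes"
             or policy.get("allow-training-with-attribution") == "yes")
must_link = policy.get("attribution-link-required") == "yes"
print(may_train, must_link)  # a compliant crawler would check both before ingesting
```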

And how will I be compensated for it? I'll allow you to use Craigslist, but you have to link to the original post, and you have to note that. - The other gray area that isn't there today, but may emerge, is when Section 230 gets rewritten. Because if they take the protections away from the Facebooks and the Googles of the world, basically for being an algorithmic publisher, and say an algorithm is equivalent to a publisher.

What it's essentially saying is that an algorithm is kind of like doing the work of a human in a certain context. And I wonder whether that's also an angle here with this algorithm. Today, David, you said the example: I read all these blog posts, I write something.

But if an algorithm does it, maybe can you then say, no, actually there was intent there that's different than if a human were to do it? I don't know. My point is, very complicated issues that are gonna get sorted out. And I think the problem with the hype cycle is that you're gonna have to marry it with an economic model for VCs to really make money.

And right now there's just too much betting on the come. So to the extent you're going to invest, it makes sense that you put money into OpenAI because that's safe. Because the economic model of how you make money for everybody else is so unclear. - No, it's clear actually, I have it for business.

I just signed up for ChatGPT Premium. They had a survey that they shared on their Discord server. And I filled out the survey and they did a price discovery survey, Freeberg. What's the least you would pay, the most you would pay, what would be too cheap of a price for ChatGPT Pro, and what would be too high of a price?

I put in like 50 bucks a month would be what I would pay. But I was just thinking, imagine ChatGPT allowed you, Freeberg, to have a Slack channel called research. And you could go in there or anytime you're in Slack, you do slash chat or slash ChatGPT.

And you say, slash ChatGPT, tell me what venues are available in-- which we actually did; I did this for venues for All-In, in a way. I can say, what are the venues that seat over 3,000 people in Vegas? And it just gave us the answer.

Okay, well, that was the job of the local event planner. They had that list. Now you can pull that list from a bunch of different sources. I mean, what would you pay for that? A lot. - Well, I think one of the big things that's happening is all the old business models don't make sense anymore.

In a world where the software is no longer just doing what it's done for the last 60 years, which is what is historically defined as information retrieval. So you have this kind of hierarchical storage of data that you have some index against, and then you go and you search and you pull data out, and then you present that data back to the customer or the user of the software.

And that's effectively been how all kind of data has been utilized in all systems for the past 60 years in computing. Largely, what we've really done is kind of built an evolution of application layers or software tools to interface with the fetching of that data, the retrieval of that data, and the display of that data.

But what these systems are now doing, what AI type systems or machine learning systems now do, is the synthesis of that data and the representation of some synthesis of that data to you, the user, in a way that doesn't necessarily look anything like the original data that was used to make that synthesis.

And that's where business models like a Yelp, for example, or like a web crawler that crawls the web and then presents webpage directories to you, those sorts of models no longer make sense in a world where the signal-to-noise is now greater, where the software can present to you a synthesis of that data and basically resolve what your objective is, rather than leaving you to do your own consumption and interpretation of that data, which is how you've historically used these systems.

And I think that's where there's, going back to the question of the hype cycle, I don't think it's about being a hype cycle. I think it's about the investment opportunity against fundamentally rewriting all compute tools. Because if all compute tools ultimately can use this capability in their interface and in their modeling, then it very much changes everything.

And one of the advantages that I think businesses are going to latch onto, which we talked about historically, is novelty in their data, in being able to build new systems and new models that aren't generally available. - Example? - In biotech and pharma, for example, having screening results from very expensive experiments and running lots of experiments and having a lot of data against those experiments gives a company an advantage in being able to do things like drug discovery.

We're going to talk about that in a minute, versus everyone using publicly known screening libraries or publicly available protein modeling libraries, and then screening against those. And then everyone's got the same candidates and the same targets and the same clinical objectives that they're going to try and resolve from that output.

So I think novelty in data is one way that advantage arises. But really, that's just kind of, where's there an edge? But fundamentally, every business model can and will need to be rewritten that's dependent on the historical, on the legacy of information retrieval as the core of what computing is used to do.

- Sax, on my other podcast, I was having a discussion with Molly about the legal profession. What impact would there be on the legal profession if ChatGPT took every court case, every argument, every document, all of those legal cases, plus the filing of a lawsuit, the defending of a lawsuit, public defenders, prosecutors, what could you figure out from that data?

Like, just to think of recent history, you look at Chesa Boudin, you could literally take every case, every argument he did, put it through it, and compare, you know, versus an outcome in another state, and you could figure out what's actually going on. With this technology, what impact would there be on the legal field?

You are a non-practicing attorney, you have a legal degree. - I never practiced, other than one summer at a law firm. But no, I think-- - Did you pass the bar? - What's that? I did pass the bar, yes. - First try. - Yes, of course. - First try, of course, yeah, come on.

- You gotta be kind of dumb to fail the bar exam. - Look, it's like a rolling in science, yes, I went to Stanford, dude. I'm rated 1800. - It took two days of studying, come on. - I may not have passed the bar, but I know a little shit, enough to know that you can't legally-- - I would be curious in terms of a very common question that an associate at a law firm would get asked would be something like, you know, summarize the legal precedents in favor of X, right?

And you could imagine GPT doing that like instantly. Now, I think that the question about that, I think there's two questions. One is, can you prompt GPT in the right way to get the answer you want? And I think, you know, Chamath, you shared a really interesting video showing that people are developing some skills around knowing how to ask GPT questions in the right way.

Is that prompt engineering? Why? 'Cause GPT's a command line interface. So if you ask GPT a simple question about what's the best restaurant in, you know, Napa, it knows how to answer that. But there are much more complicated questions that you kind of need to know how to prompt it in the right way.

So it's not clear to me that a command line interface is the best way of doing that. I could imagine apps developing that create more of like a GUI. So we're an investor, for example, in Copy AI, which is doing this for copywriters and marketers, helping them write blog posts and emails.
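
To make the "GUI on top of a prompt" idea concrete, here is a minimal sketch of the kind of template layer a tool in this space might use. Copy AI's actual implementation isn't described on the show, so the field names and template below are invented purely for illustration.

    # Hypothetical prompt template sitting behind a GUI form (illustrative only).
    # The user fills in dropdowns and text boxes; the app assembles the prompt.
    BLOG_POST_PROMPT = (
        "Write a {tone} blog post of about {length} words for {audience} "
        "about {topic}. Include {num_points} key takeaways as bullet points."
    )

    def build_prompt(form_fields: dict) -> str:
        # form_fields comes straight from the GUI controls.
        return BLOG_POST_PROMPT.format(**form_fields)

    prompt = build_prompt({
        "tone": "conversational",
        "length": 600,
        "audience": "early-stage founders",
        "topic": "pricing a ChatGPT-style subscription",
        "num_points": 3,
    })
    print(prompt)  # this string is what actually gets sent to the model

The point of that design is that the user never has to write a prompt at all: the form constrains the inputs, and the app owns the prompt-engineering details.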

And so, you know, imagine putting that like, you know, GUI on top of ChatGPT. They've already been kind of doing this. So I think that's part of it. I think the other part of it is on the answer side, you know, how accurate is it? Because in some professions, having 90 or 95 or 99% accuracy is okay.

But in other professions, you need six nines accuracy, meaning 99.9999% accuracy. Okay, so I think for a lawyer going into court, you know, you probably need, I don't know, it depends on the question. - Yeah, a parking ticket versus a murder trial, those are two very different things. - Yeah, exactly.

So is 99% accuracy good enough? Is 95% accuracy good enough? I would say probably for a court case, 95% is probably not good enough. I'm not sure GPT is at even 95% yet. But could it be helpful? Like could the associates start with ChatGPT, get an answer and then validate it?

Probably, yeah. - If you had a bunch of associates bang on some law model for a year, again, that's that reinforcement learning we just talked about. I think you'd get precision and recall off the charts and it would be perfect. By the way, just a cute thing. I don't know if you guys got this email.

It came about an hour ago from Reid Hoffman. And Reid said to me, "Hey Chamath, I created Fireside Chatbots, a special podcast miniseries where I will be having a set of conversations with ChatGPT." So you can go to YouTube by the way and see Reid having them, and he's a very smart guy, so this should be kind of cool.

And by the way, ChatGPT will have an AI-generated voice powered by the text-to-speech platform, play.ht. Go to YouTube if you want to see Reid have a conversation with ChatGPT. - Well, I mean, Chamath, we have a conversation with the two Davids every week. What's the difference?

We know how this is gonna turn out. - Hey, but actually, so synthesizing Chamath's point about reinforcement learning with something you said, Jay Cal, in our chat, which I actually thought was pretty smart. - Well, that's a first. - Yeah, so I'm gonna give you credit here 'cause I don't think you've said it on this episode, which is you said that these OpenAI capabilities are eventually gonna become commoditized or certainly much more widely available.

I don't know if that means that they'll be totally commoditized or there'll be four players, but there'll be multiple players that offer them. And you said the real advantage will come from applications that are able to get ahold of proprietary data sets and then use those proprietary data sets to generate insights.

And then layering on what Chamath said about reinforcement learning, if you can be the first out there in a given vertical with a proprietary data set, and then you get the advantage, the moat of reinforcement learning, that would be the way to create, I think, a sustainable business. - Just to build on what you said, this week is the J.P.

Morgan Conference. Freeberg mentioned it last week. I had dinner on Wednesday with this really interesting company based in Zurich. And what they have is basically a library of ligands, right? And so these ligands are used as a substrate to deliver all kinds of molecules inside the body. And what's interesting is that they have a portfolio of like a thousand of these, but really what they have is all the nuclear medicine data about whether it works.

So, you know, they target glioblastoma. And so all of a sudden they can say, "Well, this ligand can actually cross the blood-brain barrier and get to the brain." They have an entire data set of that, and a whole bunch of nuclear imagery around that. They have something for soft-cell carcinoma.

So then they have that data set. So to your point, that's really valuable because that's real work that Google or Microsoft or OpenAI won't do, right? And if you have that and you bring it to the problem, you can probably make money. You know, there's a business there to be built.

- Just building on this conversation, I just realized like a great prompt engineer is gonna become a title and an actual skill, the ability to interface with these AIs. - Here you go, coding school 2.0. - And that prompt engineer, well, no, a prompt engineer, somebody who is very good at talking to these, you know, instances and maximizing the result for them and refining the results for them, just like a detective who asks great questions, that person is gonna be 10 or 20 times more valuable.

They could be the proverbial 10X engineer of the future in a company. And as we talk about austerity and doing more with less, and the 80% fewer people running Twitter now, or Amazon laying off 18,000 people, Salesforce laying off 8,000, Facebook laying off 10,000 and probably another 10,000, what catalytic effect could this have?

We could be sitting here in three or four or five years and instead of running a company like Twitter with 80% less people, maybe you could run it with 98% less people. - Look, I think directionally, it's the right statement. I mean, you know, I've made the statement a number of times that I think we move from this idea of creator economy to narrator economy, where historically it was kind of labor economy, where humans use their physical labor to do things.

Then we were knowledge workers, we used our brains to make things. And then ultimately we kind of, I think, resolve to this narrator economy where the way that you kind of can state intention and better manipulate the tools to drive your intentional outcome, the more successful you're gonna be.

And you can kind of think about this as being the artist of the past. What made Da Vinci so good was he was technically incredible at reproducing photographic-like imagery using paint. And there's these really great kind of museum exhibits on how he did it using these really interesting kind of split-mirror systems.

And then the artist of the 20th century was the best user of Adobe Photoshop. And that person is not necessarily the best painter. And the artist of the 22nd century isn't gonna look like the Photoshop expert. And it's not gonna look like the painter, it's gonna look like something entirely different.

It could be who's got the most creative imagination in driving the software to drive new outcomes. And I think that the same analogy can be used across every market and every industry. However, one thing to note, J-Cal, it's not about austerity, 'cause the Luddite argument is that when you have new tools and you get more leverage from those tools, you have less work for people to do and therefore everyone suffers.

The reality is new work emerges and new opportunities emerge. And we level up as a species. And when we level up, we all kind of fill the gaps and expand our productivity and our capability set. - I thought what J-Cal was saying was more that Google being smaller didn't mean that the pie wouldn't grow.

It's just that that individual company is run differently, but there will be hundreds more companies, or thousands more, millions more. - Yeah, that's sort of, I have an actual punch up for you. Instead of narrator, it's the conductor economy. You're conducting a symphony. - Ooh, a punch up.

- Punch up there. But I do think there's gonna be somebody who's sitting there like, remember Tom Cruise in Minority Report as a detective, moving stuff around with an interface, you know, with the gloves and everything. This is kind of that manifested. You could, even if you're not an attorney, you can say, hey, I wanna sue this company for copyright infringement, give me my best arguments.

And then on the other side say, hey, I wanna know what the next three features I should put into my product is. Can you examine who are my top 20 competitors? And then who have they hired in the last six months? And what are those people talking about on Twitter?

You can have this conductor, you know, who becomes really good at that. - Well, the leveling up-- - Massive amounts of tasks. - Yeah, the leveling up that happens in the book Ender's Game, I think is a good example of this where the guy goes through the entire kind of ground up and then ultimately he's commanding armies of spaceships in space and his orchestration of all of these armies is actually the skillset that wins the war.

Yeah. - You predicted that there would be like all these people that create these next gen forms of content. But I think this Reid Hoffman thing could be pretty cool. Like what if he wins a Grammy for his, you know, computer created podcast miniseries? That's really cool. - The thing I'm really excited about, when's the first AI novel gonna get published by a major publisher?

I think it happens this year. When's the first AI symphony gonna get performed by a major symphony orchestra? And when's the first AI generated screenplay get turned into an AI generated 3D movie that we all watch? And then the more exciting one I think is when do we all get to make our own AI video game where we instruct the video game platform what world we wanna live in?

I don't think that's happening for the next three or four years, but when it does, I think everyone's got these new immersive environments that they can live in. - I have a question. When, you know, when you think-- - When I say live in, I mean video game wise.

Yeah, sorry, go ahead. - When you have these computer systems, just to use a question of game theory for a second, these models are iterating rapidly. These are all mathematical models. So inherent in this is, let's just say, the perfect answer, right? Like if you had perfect precision and recall, and multiple models get there at a system-wide level, everybody sort of gets to the game theory optimal.

They're all at Nash equilibrium, right? All these systems working at the same time. Then the real question would then be what the hell do you do then? 'Cause if you keep getting the same answer, if everybody then knows how to ask the exact right question and you start to go through these iterations where you're like, maybe there is a dystopian hellscape where there are no jobs.

Maybe that's the Elon world, which is you can, you can recursively find a logical argument where there is no job that's possible, right? And now I'm not saying that that path is the likely path, but I'm saying it is important to keep in mind that that path of outcomes is still very important to keep in the back of our mind as we figure these things out.

- Well, Freeberg, you were asking before about this, like, will more work be created? Of course, artistic pursuits, podcasting is a job now, being an influencer is a job, yada, yada, new things emerge in the world. But here in the United States in 1970, I'm looking at FRED, the St.

Louis Fed data: in 1970, 26.4% of the country was working in a factory, working in manufacturing. You want to guess what that is in 2012? - Sorry, what percentage? - It was 26% in 1970. And by 2015, when they discontinued this series, it was about 10%.

So it's possible we could just see, you know, the concept of office work, the concept of knowledge work is going to follow, pretty inevitably, the path of manufacturing. That seems like a pretty logical theory or no? - I think we should move on, but yes. - Okay, so how would we like to ruin the show now?

Should we talk about Biden and the documents and ruin the show with political talk? Or should we talk about? Since it's been such a great episode so far, what do we want to talk about next? I'll give you a couple of choices. - I know what you don't want to talk about.

(laughing) - Some of these network guys are talking about productivity. - Give it to him, give it to him. - Wait, we all know, we all know J. Cal. - Apple cash, time-max, hourglasses? - Hold on a second, we all know J. Cal, that according to you, when a president is in possession of classified documents in his home, that apparently have been taken in an unauthorized manner, basically stolen, he should have his home raided by the FBI.

- Almost, close, close. So anyway, Biden, as of the taping of this, has now said there's a third batch of classified documents. This group, I guess there was one at an office, one at a library, now this third group is in his garage with his Corvette, certainly not looking good.

- Well, they say in his defense, they say the garage was locked, meaning that you could use a garage door opener to open and close it. It was locked when it was closed.

The FBI came, checked it out, said we'd like you to lock those up, they locked 'em up. So a little safer than being in a garage with a Corvette. - But functionally the same, functionally the same. - The only difference here would be what, Sachs, when you look at these two cases?

- Well, that in one case, Merrick Garland has appointed an independent counsel to investigate Trump, and there's no such special counsel or investigator appointed to investigate Biden. I mean, these things are functionally the same. - Didn't he put somebody on it, though? Wait, did he put somebody on it?

- No, I don't think they've appointed a special counsel yet. - No, they did. As of an hour ago, a special counsel was appointed. - Okay. - Did that just happen? - Yeah, one hour ago. Robert Hur is his name. - Okay, I guess there are real questions to look into here.

The documents apparently were moved twice. Why were they moved? Who ordered that? What was a classified document doing in Biden's personal library? What do the documents pertain to? Do they touch on the Biden family's business dealings in Ukraine and China? So there are real things to look into here.

But let me just take a step back. Now that the last three presidential candidates have been ensnared in these classified document problems, remember, it's Biden now, and then Trump, and Hillary Clinton before Trump. I think it's time to step back and ask, are we over-classifying documents? I mean, are we fetishizing these documents?

Are they all really that sensitive? It seems to me that we have an over-classification problem, meaning that ever since FOIA was passed, the Freedom of Information Act, the government can avoid accountability and prying eyes by simply labeling any document as classified. So over-classification was a logical response by the permanent government to the Freedom of Information Act.

And now it's gotten to the point where just about everything handed to a president or vice president is classified. So I think I can understand why they're all making this mistake. And I think a compounding problem is that we never declassify anything. There's still all these records from the Kennedy assassination that have never been declassified.

- And they're supposed to have declassified these. The CIA keeps filibustering on the release of the JFK assassination documents, and they've been told they have to stop, and they have to release them, and then they keep redacting stuff, which is making it... - They're not releasing them. - I hate to be a conspiracy theorist here, but what are they trying to cover up?

I mean, this is a long time ago. - That's the only way to interpret it. But even for more mundane documents, there are very few documents that need to be classified after even, say, five years. You could argue that we should be automatically declassifying them after five years, unless they go through a process to get reclassified.

I mean, I'd say, like, you guys in business, I know it's not government, in business, how many of the documents that you deal with are still sensitive, are trade secrets, five years later? - Certainly 20 years later, they're not, right? Like, in almost all cases. - No, but I wouldn't even say, like, five years.

I mean, the only documents-- - The Coca-Cola formula. - The only documents in business that I think I deal with that you could call sensitive are the ones that pertain to the company's future plans, right, because you wouldn't want a competitor-- - Cap table. - To get those. - Yeah, cap table.

- There's a handful of things. - Legal issues, yeah. - Even cap table is not that sensitive, because by the time you go public, it legally has to be public. - Yeah, so at some point-- - It's on Carta, like, there's 100 people who have that. I mean, it's-- - Exactly.

- So, like, in business, I think our experience has been there's very few documents that stay sensitive, that need to remain secret. Now, look, if Biden or Trump or whoever, they're reviewing the schematics to the Javelin missile system or to, you know, how we make our nuclear bombs or something, obviously that needs to stay secret forever, but I don't believe our politicians are reviewing those kinds of documents.

- Well, I mean, we both-- - I don't really understand what it is that they're reviewing-- - Why are they keeping them-- - That needs to be classified five years later. - Well, and why are they keeping them was the issue we discussed previously. We actually agreed on that.

I think they're just keeping mementos. - I think there's a simple explanation for why they're keeping them, Jason, which is that everything is more classified, and there's a zillion documents, and if you look, like, both Biden and Trump, these documents were mixed in with a bunch of personal effects and mementos.

My point is, if you work in government and handle documents, they're all classified. So, I mean-- - And if the National Archives asks for them back, or you find them, you should just give them back. I mean, that's gonna wind up being the rub here: Trump didn't give them back, and Biden did.

So that's the only difference here. - Well, no, no, no, hold on. The FBI went to Trump's basement, they looked around, they said, "Put a lock on this." They seemed to be okay with it initially. Then maybe they changed their minds, I don't know. I'm not defending Trump, but-- - Well, it's pretty clear that he wouldn't give them back in.

That was the issue. - The point I'm making is that now that Biden, Trump, and Hillary Clinton have all been ensnared in this, is it time to rethink the fact that we're over-classifying so many documents? I mean, just think about the incentives that we're creating for our politicians. Okay, just think about the incentives.

Number one, never use email. Remember Hillary Clinton and the whole email server? You gotta be nuts to use email. Number two, never touch a document. Never touch a document. Never let anyone hand you a document. - Flush them down the toilet. - Never let anyone hand you a document.

I mean, if you're a politician, an elected official, the only time you should ever be handling anything is go into a clean room, make an appointment, go in there, read something, don't take notes, don't bring a camera, and then leave. I mean, this is no way to run a government.

- It's crazy. - Who does this benefit? Hold on, who does this benefit? It doesn't benefit our elected officials. It makes it almost impossible for them to act like normal people. - That's why. - It benefits the insiders, the permanent government. - You're missing the most important part about this, Sax.

This was, if you wanna go into conspiracy theories, this was a setup. Biden planted the documents so that we could create the false equivalency and start up Biden versus Trump 2024. This ensures that now Trump has something to fight with Biden about, and this is gonna help Trump. - 'Cause they're both tainted, equally tainted from the same source.

- Yes, yes. - They are equally tainted now, but I-- - It puts Trump in the news cycle. - No, I think it's the opposite. I think Merrick Garland now is gonna have to drop the prosecution against Trump for the stolen documents, or at least that part of what they're investigating him for.

They might still investigate him over January 6th or something, but they can't investigate Trump over documents now. - And Georgia. Those two seem more sticky, yeah. I agree with that, actually. I think it's gonna be hard to do. - But my point is, just think about, look, both sides are engaged in hyper partisanship.

The way right now that the conservatives on the right, they're attacking Biden now for the same thing that the left was attacking Trump for. My point is, just take a step back, and again, think about the incentives we're creating about how to run our government. You can't use email, and you can't touch documents.

- And everything's an investigation the second you get into office. - And by the way, don't ever go into politics if you're a business person, because they'll investigate every deal you ever did prior to getting into politics. I mean, just think about the incentives we're creating. - So what are you gonna do when you try to get your treasury position?

What's gonna happen? - You gotta be nuts. You gotta be nuts to go into government. - So you're not gonna take a position in DeSantis' cabinet? - My point is that the Washington insiders, by which I mean the permanent Washington establishment, i.e. the deep state, they're creating a system in which they're running things, and the elected officials barely can operate like normal functioning humans there.

- Interesting. - That's what's going on. - I heard a great rumor. This is total gossip mongering. - Oh, here we go. - That one of Ken Griffin's best out is to get DeSantis elected so that he can become treasury secretary. I mean, Ken Griffin would get that if he wanted it.

And then he would be able to divest all of Citadel tax-free. So he would mark to market, like, $30 billion, which is a genius way to go out. Now, then it occurred to me, oh my God, that is me and Sax's path too. (laughing) With a lot less money, but the same path.

- Why would it be tax-free? - When you get appointed to those senior posts, you're allowed to either stick it in a blind trust, or you can sell with no capital gains. - What? What? - Well, because they want you to divest. - They want you unconflicted. - Yes, anything that presents a conflict, they want you to divest.

And so the argument is, if you're forced to divest it to enter a government, you shouldn't be forcibly taxed. - Wait, if I become mayor of San Francisco or Austin, I can go to federal government. - No, no, no, wait, wait, wait. - It's like having a-- - Secretary of Transportation, J.

Cal, you can do that. - Oh, I'm qualified for that. I'd take the bus, I got an electric bike. - To answer Freeberg's point, I think Citadel Securities, there's a lot of folks that would buy that because that's just a securities trading business. And then Citadel, the hedge fund, would probably go to something like a big bulge bracket bank or Blackstone, probably Blackstone in fact, because now Blackstone can plug it into a trillion dollar asset machine.

I think there would be buyers out the door. - This is an incredible grift. Now I know why Sachs-- - It's not a grift at all, but it's an incredible-- - Oh, come on, man, a cabinet position for no cap gains? - Well, that's not a grift, those are the laws.

They force you to sell everything. - It so feels grifty to me. - And then you do public service. - I think you're misusing the word to continue to genuflect to the left wing. - No, I'm not genuflecting. I think you're being a little defensive 'cause you see this as a bad-- - Is it genuflect?

- That or you're dumb. - Big bulge. (laughing) - I'm not stupid, man, you don't even know. - I know grift when I see it. You take a cabinet position-- - Sax, would you take a cabinet position? - You don't pay cap gains? - Would you be secretary of the treasury?

- Where does that exist? - Yeah, Sachs, if you were asked to serve-- - Look, any normal person who wants to serve in government, you can't use email and you can't touch a document, and every deal you've ever done gets investigated. Why not? - That's a yes. - Why would you wanna do it?

I mean, all of a sudden, you get to divest tax free. - Methink thou doth protesteth too much, David Sachs. - The fact that you two know this rule, and Freebird and I don't. - No, I know it. It's like a well-known-- - I'm the only person who doesn't know this.

Rich people know it. - Jake, I looked up grift. It means to engage-- - Everyone knows this. - It means that to engage in a petty or small-scale swindle. I don't think selling a $31 billion entity-- - Yeah, no, it's not small. - To a combination of BlackRock and Blackstone would be considered a petty, small-scale swindle.

- Did any of you guys watch the Madoff series on Netflix? - No, was it good? - No. - Oh my God, it is so depressing. I gotta say, just that Madoff series, there is no glimmer of light or hope or positivity or recourse. Everyone is a victim. Everyone suffers.

It is just so dark. - So gross. - Don't watch it. It's so depressing. - The Madoff one. - The Madoff one is so depressing. It's so awful. - Yeah, they all kill themselves and die. - They become like all the-- - No, one guy died. - Everyone was a victim.

- One guy died of cancer. - And then Irving Picard, I didn't realize all this, the trustee that went and got the money, he went and got money back from these people who were 80 years old and retired and had spent that money decades ago, and he sued them and took their homes away from them, and no one, and they had no idea-- - No one.

- That they were part of the scam. No one, no one won. It was a brutal, awful whole thing, yeah. - You know what they said. By the way, that's gonna be really interesting as we enter this SBF trial because-- - 100%. - 100%. - That is what happens if you got-- - And that's why the Southern District of New York said that this case is becoming too big for them because all the places that SBF sent money, all those PACs and all those political donations-- - Sent before.

- They have to go and investigate where that money went and see if they can get it back, and it's gonna open up an investigation into each one of these campaign finance and election and kind of interfering actions that were taken. - Wait, not Politico. ProPublica, sorry, ProPublica. - On the other end of the spectrum, I did watch this weekend, "Triangle of Sadness." Have you guys seen this?

- I watched it too! It was great! - Oh my God! - "Triangle of Sadness" is great! It's so dark. - To the Davids, listen, this is one of the, I thought it was, it didn't pay off the way I thought, but this is one of the best setups you'll see in a movie.

So basically, it's a bunch of people on a luxury yacht. So you have a bunch of rich people as the guests. Then you have the staff that interacts with them. And this is mostly Caucasian. And then in the bowels of the ship, what you see are Asian and black workers that support them, okay?

So in some ways, it's a little bit of a microcosm of the world. - Oh, I thought you were gonna say a microcosm of something else! - And then what happens is there's a shipwreck, basically. - Yeah, oh, don't spoil it, come on. - Okay, but no, but I'll just tell you.

So the plot is you have this Caucasian patriarchy that then gets flipped upside down because after the shipwreck, the only person who knows how to make a fire and catch the fish is the Filipino woman who is in charge of cleaning the toilets. So she becomes in charge. So now you flip to this immigrant matriarchy.

- It's a pretty great meditation on class and survival. It's pretty well done. - It didn't end well, I thought. I thought it could. - Well, it's hard to wrap that one up. - Well, you know what they say, boys. Steal a little and they throw you in jail.

Steal a lot and they make you king. Famous Bob Dylan quote. - There you go. - All right, well, this has been a great episode. Great to see you, besties. Austerity menu tonight, Chamath? What's on the austerity menu tonight? What are we doing, salad, some tuna sandwiches? - No, I think Kirsten is doing, I think, dorade.

- Dorade, yeah, that's, yeah. - That's a good fish. - Jake and I once had a great dorade in Venice. - In Venice. - In Venice. - The dorade from Venice. - That's one of the best meals we ever had. Am I right? - So good, I agree. - When it's done well, the dorade kicks ass.

- There's only one way to cook a dorade. Do you know what that is? You gotta, it's the way they did it in Venice. You gotta cook the whole fish. - Yeah, yeah, yeah, she's doing that. - And then after you cook the fish, then you de-bone it. And yeah, that's the way to do it.

- That was back when Sax and I used to enjoy each other's company. (laughing) - Before this podcast made us into mortal enemies. - Jake, I'm a little disappointed you couldn't agree with my take on this documents scandal. Instead of dunking in a partisan way, I tried to explain why it was a problem of our whole political system.

- I like your theory. I think, you know, you keep making me defend Biden. I think Biden's a grifter. I told you these guys are grifting. I just think your party grifts a little bit more. But yeah, compare your grift. - Are we going to play Saturday after the wild card game?

Are you guys interested in playing Saturday as well? 'Cause I got the hall pass. I can do a game on Saturday. - I don't know. I have to check with my boss. - Who's going to the, are you guys all going? Sax, are you going to come to play poker at that live stream thing for the day in LA?

- I doubt it, no. - He doesn't want to interact with humans. - That does not play well in confirmation hearings. - No. - The last time I did one of those, Alan Keating destroyed me on camera. I like, I had like, I had two left feet and every time he bluffed, I folded.

Every time he had the nuts, I called. It was brutal. - That's true. - Now you do this. - That was a bad one. - A shellacking, a classic Sax shellacking. - So it's an edge saving thing. Is that what's going on here? - No, no, no, no. - Preserving.

- No, I just said I don't have time to do it. - No, it has to do with the cabinet positions. He doesn't need to be seen recklessly gambling. It's a bad look. - If you could pick any cabinet position, Sax, which one would it be? - State. - Treasury.

- State's a lot of travel. - Say the word, Sax. - State's a lot of travel. You never stay at home. You're always on a plane. - I'll tell you, I don't know. - That's what he's looking for. - I don't know that those like cabinet positions are that important.

I mean, they run these giant bureaucracies that again are permanent. You can't fire anyone. So if you can't fire a person, do they really report to you? - Right. - I mean, only ceremonial. - Trump's idea was like, put a bunch of hardline, CEO type people in charge, have them blow up these things and make it more efficient.

It didn't really work, did it? - Yeah, well, you know why a CEO is actually in charge? Like Elon, he walks in, if he doesn't like what you're doing, he'll just fire you. You can't fire anyone. How do you manage them when they don't have to listen to anything you say?

That's our whole government right now. Our cabinet heads are figureheads for these departments, these giant departments. - Is that a no, or is that a yes, you'd still take state? Look at that. - Well, I think I have another theory. I think he's going for the ambassadorship first. What is the best ambassadorship?

- Well, you can't divest everything with no cash. - Historically, you can tell which ambassadorship is the best one based on how much they charge for it. - Yeah. - So I think London is the most expensive. I think that one's $15 million. - It's 10 million for London, 10 to 15.

- 10, 15 million, yeah. - 10, 15 million? That's what Sax's fourth least expensive home cost. - No, no, no, you have to spend that every year to run it, Jason. You only get, you get a fixed- - That's nothing for him. - You could be the ambassador to Guinea or the ambassador to the UK, you get the same budget.

- Actually, what's kind of funny is I know two people who served as ambassadors under Trump and it was really cheap to get those because no one wanted to be part of the Trump administration. - Oh, they were on fire sale, two for one. - They were on fire sale after, because of Trump.

- Who wants to be tainted? - But by the way, one of them, and you can just bleep out the name, (beep) was telling me it was the best thing because he ended up selling at the all-time highs to take the job. He was like, "I got to get out of all of this stuff." - No, but listen, let me tell you, the ambassadorships, it was a smart trade by those guys because ambassador's a lifetime title.

So you're ambassador, whatever. No one remembers under what president you were ambassador, no one cares. - So you are going for the ambassadorship. - No, no. - So Steve, I think it's fair to say. - I'm not, I'm not interested in ceremonial things. I'm interested in making an impact.

And the problem with all these positions, I mean, being a cabinet official is not much different than being an ambassador. - So you're going to enlist in the Navy? - No. What has a bigger impact? Being on the All-In pod or being an ambassador? Who's more influential? Sax on the All-In pod or (beep) as the ambassador of Sweden?

- Being on the all in pod actually. - All in pod is more impactful. - By the way, this is why I take issue with your statement about the term mainstream media. 'Cause I think you have become the mainstream media more than most of the folks that- - No, we're independent media.

We're independent media. - Trust me, it's independent. This thing's hanging on by a thread. - And stop genuflecting. (laughing) - No, it's independent. Who knows if this thing's going to last another three episodes. - I just like saying the word genuflect. - You like genuflecting, I know. - That is the top word of 2023 so far for me.

- Oh, is that, is somebody doing an analysis with ChatGPT of the words used here? - No, but Sax brought that word up. It was just, it's a wonderful word. It's not used enough. All right, everybody, we'll see you next time on the All-In Podcast. Comments are turned back on.

Have at it, you animals. Love you guys. - Enjoy, bye-bye. - Love you besties, bye. (upbeat music)