
E111: Microsoft to invest $10B in OpenAI, generative AI hype, America's over-classification problem


Chapters

0:00 Bestie intro!
0:43 Reacting to Slate's article on All-In
11:18 SF business owner caught spraying homeless person on camera
29:22 Microsoft to invest $10B into OpenAI with unique terms, generative AI VC hype cycle
1:09:57 Biden's documents, America's over-classification problem
1:27:16 Best cabinet positions/ambassadorships

Whisper Transcript | Transcript Only Page

00:00:00.000 | Is anybody else seeing a half a second lag with J. Cal?
00:00:03.360 | Like a second lag?
00:00:04.440 | Test, test, test.
00:00:05.080 | One, two, one, two.
00:00:05.840 | From like the way his mouth moves.
00:00:07.720 | Well, that always happens.
00:00:09.200 | Oh, God.
00:00:09.720 | Here it comes.
00:00:10.440 | His mouth never stops moving.
00:00:12.080 | Relax, Sax.
00:00:13.160 | Relax, Sax.
00:00:16.040 | Are we going?
00:00:16.640 | Are we recording?
00:00:17.600 | Are you ready to go?
00:00:18.600 | Don't lose this.
00:00:19.720 | Don't lose this.
00:00:20.360 | A plus material.
00:00:21.400 | Are we going to do hot sacks?
00:00:22.360 | All right, let's go.
00:00:23.000 | This is Chappelle at the punchline.
00:00:24.640 | Let's go.
00:00:25.120 | Let's go.
00:00:25.640 | I'm ready to go.
00:00:26.440 | [MUSIC PLAYING]
00:00:28.240 | Let your winners ride.
00:00:30.960 | Rain Man David Sacks.
00:00:32.240 | I'm going all in.
00:00:35.200 | And instead, we open sourced it to the fans,
00:00:37.200 | and they've just gone crazy with it.
00:00:39.160 | Love you guys.
00:00:39.660 | Queen of Quinoa.
00:00:41.160 | I'm going all in.
00:00:43.160 | All right, everybody.
00:00:44.160 | Welcome to episode 111 of the World's Greatest Podcasts,
00:00:50.720 | according to Slate, the podcast that shall not be mentioned
00:00:54.320 | by the press, apparently.
00:00:55.680 | No, what do you mean?
00:00:56.140 | They just did a profile on us.
00:00:57.600 | Well, they did.
00:00:58.480 | This is the conundrum.
00:01:01.120 | It's so much of a phenomenon that we're
00:01:03.040 | the number one business and the number one tech
00:01:05.120 | podcast in the world, hands down,
00:01:07.840 | that the press has a hard time giving us any oxygen,
00:01:13.240 | because they want to hate us.
00:01:16.760 | They want to cover it.
00:01:17.840 | You're saying they take the ideas, but not the--
00:01:20.360 | They don't want to cite it.
00:01:21.600 | They don't want to cite it.
00:01:22.680 | They don't want to cite it.
00:01:23.520 | But anyway, shout out to Slate.
00:01:25.120 | Yeah, what I thought was interesting
00:01:26.080 | was the guy pointed out that we don't
00:01:28.560 | want to subject ourselves to independent journalists asking
00:01:31.440 | us independent questions.
00:01:32.960 | Therefore, we go direct.
00:01:34.160 | And that's kind of the thing nowadays.
00:01:36.760 | When everyone says they want to go direct,
00:01:38.560 | it's because they don't want to be subject
00:01:39.680 | to independent journalists.
00:01:40.840 | Well, one might ask themselves why subjects
00:01:43.960 | don't want to go direct.
00:01:44.920 | Yeah, exactly.
00:01:45.880 | You mean don't want to go to journalists.
00:01:47.640 | Yeah, because there's a specific reason why
00:01:50.360 | principals, the subject of stories,
00:01:52.160 | do not want to have the press interpret what they're
00:01:55.800 | saying is because they don't feel
00:01:56.680 | they're getting a fair shake.
00:01:57.640 | They feel like the world should be interesting.
00:01:59.000 | The challenge is that then we avoid
00:02:00.520 | independent scrutiny of our points of view
00:02:03.160 | and our decisions.
00:02:04.160 | They're constantly writing hit pieces about us.
00:02:06.440 | The question is, when we want to present our side of it,
00:02:10.680 | do we need to go through their filter or not?
00:02:12.760 | Why would you go through their filter when it's always
00:02:15.120 | going to be a hit piece?
00:02:17.360 | Well, and also--
00:02:18.200 | They have a class hatred of basically of technology
00:02:23.400 | entrepreneurs and investors.
00:02:24.640 | Just use sex.
00:02:25.560 | By the way, I think--
00:02:26.440 | No, I don't think--
00:02:27.240 | Just use sex.
00:02:27.960 | You're right, J-Kal.
00:02:28.800 | They don't hate you because you genuflect
00:02:30.640 | to their political biases.
00:02:31.760 | You see, if--
00:02:33.080 | Hold on.
00:02:34.720 | If you do what SPF did, which is basically
00:02:38.640 | agree with all of their biases, then yes,
00:02:41.600 | they'll treat you better.
00:02:42.600 | That's the deal.
00:02:43.560 | That's how it works.
00:02:44.560 | And when you say they, you're referring
00:02:46.040 | to specific large media outlets, right, Sax?
00:02:48.240 | They all think the same way.
00:02:49.440 | He's not referring to Fox and Tucker.
00:02:52.640 | OK, you can name one.
00:02:53.880 | I'll trade you.
00:02:54.600 | I'll tell you what.
00:02:55.400 | I'll trade you Fox for MSNBC and CNN and The New York Times,
00:02:59.000 | The Washington Post, and The Atlantic Magazine,
00:03:01.000 | and on and on and on.
00:03:02.200 | You get a lot of mileage out of being able to name Fox.
00:03:04.480 | The fact of the matter is--
00:03:05.560 | Megyn Kelly?
00:03:07.480 | That's a podcaster.
00:03:08.240 | She's independent now.
00:03:09.160 | That's true.
00:03:09.680 | She is independent.
00:03:10.480 | You can name one, I mean, literally one outlet that
00:03:14.160 | is not part of this mainstream media.
00:03:17.480 | And they all think the same way.
00:03:19.480 | There are very small differences in the way they think.
00:03:21.720 | It's all about clicks.
00:03:22.920 | It's all about clicks at this point.
00:03:24.880 | And it's all about advocacy journalism.
00:03:26.560 | Not just about clicks.
00:03:27.480 | And advocacy.
00:03:28.040 | It's that combination.
00:03:29.080 | What you're calling advocacy is bias and activism.
00:03:32.760 | It's activism.
00:03:33.920 | That's what I'm talking about, activism journalism, yes.
00:03:36.200 | I think Draymond also highlights a really important point,
00:03:38.840 | which is he started his podcast.
00:03:40.920 | It's become one of the most popular forms of sports media.
00:03:45.240 | And he can speak directly without the filtering
00:03:47.640 | and classification that's done by the journalist.
00:03:52.840 | And it seems to be a really powerful trend.
00:03:54.760 | The audience really wants to hear direct
00:03:56.480 | and they want to hear unfiltered, raw points of view.
00:04:00.640 | And maybe there's still a role for, I think,
00:04:03.040 | the journalism separate from that, which
00:04:05.680 | is to then scrutinize and analyze and question and--
00:04:08.360 | It's not journalism.
00:04:09.320 | It's just activism.
00:04:10.480 | They're just activists.
00:04:13.080 | There are also journalists out there, Sax.
00:04:15.000 | Actually, well, it depends what the topic is
00:04:17.840 | and what the outlet is.
00:04:19.680 | But actually, I would argue that most of these journalists
00:04:23.040 | are doing what they're doing for the same reason
00:04:25.080 | that we're doing what we're doing,
00:04:26.480 | which is they want to have some kind of influence
00:04:29.080 | because they don't get paid very much.
00:04:31.200 | But the way they have influence is
00:04:32.720 | to push a specific political agenda.
00:04:34.720 | I mean, they're activists.
00:04:36.080 | They're basically party activists.
00:04:37.480 | It has become advocacy journalism.
00:04:38.680 | Yes, that's the term I coined for it.
00:04:40.120 | It's advocacy journalism.
00:04:41.160 | You guys see this brouhaha where Matt Yglesias
00:04:44.440 | wrote this article about the Fed and about the debt ceiling?
00:04:49.120 | And through this whole multi-hundred word,
00:04:51.960 | thousand word tome, he didn't understand
00:04:57.400 | the difference between a percentage point and a basis
00:04:59.440 | point and then he didn't calculate the interest
00:05:01.280 | correctly?
00:05:02.000 | Yeah, I did see that.
00:05:04.400 | Wow, so wait a second.
00:05:05.680 | You're saying the Fed's raising 25%?
00:05:08.760 | Yeah, that's a huge difference between--
00:05:11.520 | My mortgage is going up 25%?
00:05:13.520 | Between a principal and an outside analyst, right?
00:05:15.720 | Like a principal has a better grasp typically
00:05:18.400 | of the topics and the material.
00:05:19.840 | But the argument from a journalist--
00:05:21.960 | But he's considered, within the journalist circle,
00:05:25.640 | he's considered the conventional wisdom.
00:05:27.320 | I get it.
00:05:27.800 | But the argument from a journalist
00:05:29.200 | is that by having that direct access,
00:05:31.880 | that person is also biased.
00:05:33.360 | Because they're an agent, because they're
00:05:35.120 | a player on the field, they do have a point of view
00:05:37.120 | and they do have a direction they want to take things.
00:05:39.320 | So it is a fair commentary that journalists can theoretically
00:05:42.880 | play a role, which is they're an off-field analyst
00:05:45.480 | and don't necessarily bring bias.
00:05:47.200 | I would argue they're less educated and more biased
00:05:50.080 | than we are.
00:05:50.680 | That may or may not be true, what the two of you
00:05:52.480 | guys are debating, which is a very subjective take.
00:05:55.040 | But the thing that is categorical and you can't deny
00:05:57.480 | is that there is zero checks and balances when something
00:06:02.120 | as simple as the basis point, percentage point difference
00:06:05.520 | isn't caught in proofreading, isn't caught by any editor,
00:06:09.360 | isn't caught by the people that help them review this.
00:06:12.120 | And so what that says is all kinds of trash
00:06:15.040 | must get through because there's no way for the average person
00:06:18.920 | on Twitter to police all of this nonsensical content.
00:06:22.200 | This one was easy because it was so numerically illiterate
00:06:25.280 | that it just stood out.
00:06:27.000 | But can you imagine the number of unforced errors journalists
00:06:31.920 | make today in their search for clicks that don't get caught
00:06:35.320 | out, that may actually tip somebody to think A versus B?
00:06:39.040 | That's, I think, the thing that's kind of undeniable.
00:06:42.120 | Right.
00:06:42.880 | Yeah.
00:06:43.400 | You only need to--
00:06:44.200 | Wasn't the Rolling Stone article?
00:06:44.800 | There's a very simple test for this.
00:06:46.760 | If you read the journalists writing about a topic
00:06:50.720 | you are an expert on, whatever the topic happens to be,
00:06:54.040 | you start to understand, OK, well,
00:06:55.760 | on that story I'm reading, that they understand about 10%
00:07:00.160 | or 20% or 30% of what's going on.
00:07:03.280 | But then when you read stories that you're not involved in,
00:07:05.740 | you know, you read a story about Hollywood or, I don't know,
00:07:08.320 | pick an industry or a region you're not super aware of,
00:07:11.160 | you're like, OK, well, that must be 100% correct.
00:07:14.040 | And the truth is, journalists have access to 5 to 20--
00:07:16.400 | Yeah, there's a name for that.
00:07:17.160 | There is a name for it, yeah.
00:07:18.080 | It's called the "Gell-Mann amnesia effect."
00:07:19.680 | You just plagiarized Michael Crichton,
00:07:21.360 | who came up with that.
00:07:22.560 | Yeah, so you--
00:07:23.800 | Yeah.
00:07:24.560 | But no, he's exactly right.
00:07:25.800 | But I think it's worse than that.
00:07:27.180 | It's because now the mistakes aren't being driven just
00:07:31.120 | by sloppiness or laziness or just a lack of expertise.
00:07:34.640 | I think it's being driven by an agenda.
00:07:37.080 | So just to give you an example on the Slate thing,
00:07:39.320 | the Slate article actually wasn't bad.
00:07:40.940 | It kind of made us seem, you know, cool.
00:07:43.360 | The subheadline was, "A close listen
00:08:45.520 | to All In, the infuriating, fascinating safe space
00:07:48.400 | for Silicon Valley's money men."
00:07:50.560 | But the headline changed.
00:07:52.380 | So I don't know if you guys noticed this.
00:07:54.080 | The headline now is, "Elon Musk's Inner Circle
00:07:57.440 | is telling us exactly what it thinks."
00:07:59.140 | First of all, like, they're trying to--
00:08:00.840 | It's Elon for clicks.
00:08:01.640 | Yeah, it's Elon for-- so they're trying way too hard to,
00:08:03.900 | like, describe us in terms of Elon, which, you know,
00:08:07.000 | is maybe two episodes out of 110.
00:08:09.280 | But before Inner Circle, the word they used was cronies.
00:08:12.960 | And then somebody edited it, because I
00:08:14.560 | saw cronies in, like, one of those tweet, you know,
00:08:19.680 | summaries.
00:08:21.440 | You know, where, like, it does a capsule or whatever?
00:08:23.960 | Yeah, yeah, yeah.
00:08:24.680 | And those get frozen in time.
00:08:26.520 | So, you know, they were trying to bash us even harder.
00:08:28.920 | And then somebody took another look at it and toned it down.
00:08:30.960 | Well, here's what happened.
00:08:32.080 | I'll tell you what happens in the editorial process.
00:08:34.040 | Whoever writes the article, the article gets submitted.
00:08:36.580 | Maybe it gets edited, proofread, whatever.
00:08:38.800 | Maybe it doesn't even in some publications.
00:08:40.640 | They don't have the time for it, because they're in a race.
00:08:43.200 | Then they pick-- there's somebody who's
00:08:44.880 | really good at social media.
00:08:46.080 | They pick six or seven headlines.
00:08:47.640 | They A/B test them.
00:08:49.080 | And they even have software for this,
00:08:51.000 | where they will run a test.
00:08:52.360 | Sometimes they'll do a paid test.
00:08:53.760 | They put $5 in ads on social media.
00:08:57.400 | Whichever one performs the best, that's the one they go with.
00:09:00.520 | So it's even more cynical.
00:09:01.920 | And because people who read the headlines,
00:09:04.720 | sometimes they don't read the story, right?
00:09:06.480 | Obviously, most people just see the headline.
00:09:07.760 | They interpret that as a story.
00:09:09.280 | That's why I told you, when they did that New Republic
00:09:11.480 | piece on you with that horrific monstrosity of an illustration,
00:09:17.360 | don't worry about it.
00:09:18.280 | People just read the headline.
00:09:19.480 | They know you're important.
00:09:20.640 | Nobody reads the story anyway.
00:09:22.720 | But it wasn't a bad article, actually.
00:09:24.720 | It was well-written, actually.
00:09:25.760 | I was in shock.
00:09:26.480 | I was like, who is this writer that actually
00:09:28.320 | took the time to write some prose that was actually decent?
00:09:31.280 | Yeah, he had listened to a lot of episodes, clearly.
00:09:33.600 | That was a really good moment, actually.
00:09:34.880 | That was great advice, because you gave it to him.
00:09:36.640 | And you gave it to me, because both of us had these things.
00:09:39.100 | And Jason said the same thing.
00:09:40.880 | Just look at the picture.
00:09:42.320 | And if you're OK with the picture, just move on.
00:09:44.320 | And I thought, this can't be true.
00:09:46.200 | And it turned out to mostly be true.
00:09:47.920 | Yeah, but my picture was terrible.
00:09:50.400 | Yeah, but it's close to reality.
00:09:51.840 | So I mean, you've got to--
00:09:54.480 | It was really--
00:09:55.640 | Oh, ha, geez.
00:09:57.920 | I mean, the person who did the worst there was Peter Thiel.
00:10:00.880 | Poor Peter.
00:10:01.400 | Yeah, but that just shows how ridiculously biased it is,
00:10:03.800 | right?
00:10:05.480 | My picture wasn't so bad.
00:10:06.480 | My picture wasn't so bad.
00:10:08.200 | Hugh Grant.
00:10:09.100 | Elon, let's pull that up one more time here.
00:10:11.100 | Elon looks like Hugh Grant.
00:10:12.220 | I just kind of--
00:10:12.940 | Yeah, he does.
00:10:13.940 | Not bad.
00:10:14.900 | Kind of looks like Hugh Grant in "Notting Hill."
00:10:17.500 | I knew that article was going to be fine when
00:10:20.060 | the first item they presented as evidence of me doing something
00:10:24.700 | wrong was basically helping to oust Chasey Boudin, which
00:10:27.980 | was something that was supported by like 70% of San Francisco,
00:10:32.380 | which is a 90% Democratic city.
00:10:34.140 | So not exactly evidence of some out-of-control right-wing
00:10:38.060 | movement.
00:10:38.780 | Look at the headline.
00:10:40.420 | "The Quiet Political Rise of David Sachs, Silicon Valley's
00:10:43.500 | Prophet of Urban Doom."
00:10:44.780 | I'm just letting you know, people don't get past the six
00:10:46.940 | words and the image.
00:10:48.180 | It's 99% of people are like, oh my god,
00:10:50.620 | congrats on the "Republic" article.
00:10:52.660 | It could have literally been lorem--
00:10:54.140 | what do they call them?
00:10:55.500 | Lorem ipsums?
00:10:56.260 | You know, like it could have just
00:10:57.580 | been filler words from their second graph down
00:10:59.540 | and nobody would know.
00:11:00.900 | Yeah.
00:11:01.940 | But now apparently if you notice that San Francisco streets
00:11:06.020 | look like Walking Dead, that apparently you're
00:11:09.100 | a prophet of urban doom.
00:11:10.900 | I mean, these people are so out of touch.
00:11:12.580 | I mean, they can't even acknowledge what people
00:11:14.580 | can see with their own eyes.
00:11:16.020 | That's the bias that's gotten crazy.
00:11:17.540 | And I don't know if you guys saw this really horrible,
00:11:21.820 | dystopian video of an art gallery
00:11:24.140 | owner who's been dealing with owning a storefront in San
00:11:27.500 | Francisco, which is challenging, and having to clean up feces
00:11:31.860 | and trash and whatever every day.
00:11:36.020 | And I guess the guy snapped, and he's
00:11:38.180 | hosing down a homeless person who refuses to leave
00:11:42.020 | the front of his store.
00:11:43.700 | Oh, I saw that.
00:11:44.700 | I saw that.
00:11:45.500 | The humanity in this is just insane.
00:11:48.500 | Like, really, like you're hosing a human being down--
00:11:53.020 | It's terrible.
00:11:53.620 | --who is obviously not living a great life and is in dire--
00:11:57.980 | I can feel for both of them, J-Kal.
00:11:59.660 | I can feel for both of them.
00:12:00.940 | I agree that it's not good to hose a human being down.
00:12:03.740 | On the other hand, think about the sense of frustration
00:12:06.060 | that store owner has, because he's watching his business go
00:12:09.180 | in the toilet, because he's got homeless people living
00:12:11.780 | in front of him.
00:12:12.660 | So they're both, like, being mistreated.
00:12:15.220 | The homeless person's being mistreated.
00:12:16.900 | Say more about the store owner.
00:12:18.700 | Say more.
00:12:19.180 | The homeless person's being mistreated.
00:12:20.780 | The store owner's being mistreated
00:12:21.900 | by the city of San Francisco.
00:12:23.620 | Yeah.
00:12:24.820 | Say more about the store owner.
00:12:25.620 | That person's not in a privileged position.
00:12:27.660 | That person probably-- the store owner.
00:12:29.580 | He's probably fighting to stay in business.
00:12:32.100 | I'm just saying-- I'm not saying that's right, but--
00:12:34.180 | No, no.
00:12:34.660 | I'm just-- I'm laying the rope.
00:12:36.460 | I'm just--
00:12:36.940 | [INTERPOSING VOICES]
00:12:37.940 | --I'm just saying--
00:12:38.940 | [INTERPOSING VOICES]
00:12:39.940 | --justified in any way.
00:12:40.940 | What you're trying to do is, oh, my god,
00:12:43.260 | look at this homeless person being horribly oppressed.
00:12:46.180 | That store owner is a victim, too.
00:12:48.420 | Yeah, there's no doubt.
00:12:49.380 | It's horrible to run a business in there.
00:12:51.380 | I mean, what is that person supposed to do?
00:12:53.220 | No, wait.
00:12:53.740 | This is symbolic of the breaking down of basic society.
00:12:57.700 | Like, both of these people are obviously like--
00:13:01.780 | it's just a horrible moment to even witness.
00:13:04.060 | It's like, ugh.
00:13:06.020 | It's like something--
00:13:07.140 | Jason, do you have equal empathy for the store owner
00:13:10.180 | and the homeless person, or no?
00:13:12.700 | Under no circumstances should you
00:13:14.780 | hose a person down in the face who is homeless.
00:13:17.100 | Like, it's just horrific to watch.
00:13:18.620 | It's just inhumane.
00:13:19.700 | This is a human being.
00:13:20.980 | Now, but as a person who owns a store,
00:13:22.780 | yeah, my dad grew up in the local business.
00:13:24.540 | If people were abusing the store,
00:13:25.900 | and you're trying to make a living,
00:13:27.460 | and you've got to clean up whatever, excrement every day,
00:13:31.780 | which is horrific.
00:13:33.620 | And this thing is dystopian.
00:13:34.820 | In that moment, look, in that moment,
00:13:36.460 | the empathy is not equal.
00:13:37.740 | I think you have more empathy, obviously,
00:13:39.660 | for the person on the receiving end of that hose.
00:13:42.540 | But in general, our society has tons of empathy
00:13:47.260 | for homeless people.
00:13:48.100 | We spend billions of dollars trying to solve that problem.
00:13:50.740 | You never hear a thing about the store owners
00:13:53.340 | who are going out of business.
00:13:54.700 | So on a societal level, not in that moment, but in general,
00:13:59.740 | the lack of empathy is for these middle class store owners, who
00:14:03.740 | may not even be middle class, working class, who
00:14:06.260 | are struggling to stay afloat.
00:14:08.260 | And you look at something like, what is it,
00:14:10.220 | like a quarter or a third of the storefronts in San Francisco
00:14:13.340 | are now vacant?
00:14:14.420 | I'm just shocked this person is running--
00:14:16.980 | the shocking thing is, like, this person
00:14:18.580 | is running an art gallery storefront in San Francisco.
00:14:21.540 | Like, why would you even bother?
00:14:23.180 | Why would you bother to have a storefront in San Francisco?
00:14:25.620 | I mean, everybody's left.
00:14:26.740 | It's just--
00:14:27.260 | What do you mean, why do you bother?
00:14:28.780 | If you've opened a store, what are you supposed to do,
00:14:31.100 | start to code all of a sudden?
00:14:32.500 | Well, no.
00:14:33.020 | I mean, you would shut it down at some point and find an exit.
00:14:35.620 | And do what?
00:14:36.140 | Just like all businesses have.
00:14:37.340 | Yeah, but a store has large fixed costs, right?
00:14:38.940 | So what are they supposed to do?
00:14:40.220 | Maybe an investment he made 10 years ago.
00:14:41.940 | Exactly.
00:14:42.460 | At some point, you have to shut down your store
00:14:44.420 | in San Francisco the second you can get out of the lease.
00:14:46.220 | The solution to everything, J. Cal,
00:14:47.660 | isn't go to coding school online and then, you know,
00:14:49.900 | end up working for Google.
00:14:51.020 | Oh, I didn't say it was.
00:14:52.020 | Moving to another city is a possibility.
00:14:53.740 | So true.
00:14:54.820 | A lot of folks in Silicon Valley,
00:14:56.220 | I think, in this weirdly fucked up way,
00:14:58.140 | do believe the solution to everything is learn to code.
00:15:00.980 | Or become an Uber driver.
00:15:02.180 | Or become a masseuse.
00:15:03.980 | Get a gig job.
00:15:05.460 | Get a gig job.
00:15:07.540 | The guy spent years building his retail business.
00:15:09.940 | I mean, the thing is--
00:15:10.820 | And then a homeless person camps in front.
00:15:12.540 | And he calls the police.
00:15:13.660 | The police don't come and move the homeless person.
00:15:15.700 | The homeless person stays there.
00:15:16.700 | He asks nicely to move.
00:15:17.540 | Customers are uncomfortable going in the store as a result.
00:15:19.860 | Yeah, I stopped going to certain stores in my neighborhood
00:15:22.140 | because of homeless tents being literally fixated
00:15:24.780 | in front of the store.
00:15:25.380 | And I'd go to the store down the road
00:15:26.980 | to get my groceries or whatever.
00:15:28.300 | I mean, it's not a kind of uncommon situation
00:15:30.660 | for a lot of these small business owners.
00:15:32.380 | They don't own the real estate.
00:15:33.700 | They're paying rent.
00:15:35.140 | They've got high labor costs.
00:15:36.780 | Everything's inflating.
00:15:38.220 | Generally, city population's declining.
00:15:40.020 | It's a brutal situation all around.
00:15:43.540 | I think if everybody learns to code or drives an Uber,
00:15:46.860 | the problem is that in the absence of things
00:15:49.220 | like local stores and small businesses,
00:15:51.740 | you hollow out communities.
00:15:53.980 | You have these random detached places where you kind of live.
00:15:57.260 | And then you sit in your house, which
00:15:58.820 | becomes a prison while you order food from an app every day.
00:16:02.300 | I don't think that is the society that people want.
00:16:04.660 | So I don't know.
00:16:05.380 | I kind of want small businesses to exist.
00:16:08.180 | And I think that the homeless person should be taken care of.
00:16:10.780 | But the small business person should
00:16:12.380 | have the best chance of trying to be successful
00:16:14.340 | because it's hard enough as it is.
00:16:15.980 | The mortality rate of the small business owner
00:16:19.260 | is already 90%.
00:16:22.020 | It's impossible in San Francisco, let's just be honest.
00:16:24.260 | So stop genuflecting, J. Cal.
00:16:26.100 | I'm not genuflecting.
00:16:27.420 | Listen--
00:16:27.940 | Yeah, you are because here's--
00:16:29.460 | Wait, how am I genuflecting?
00:16:30.620 | I'm saying the guy--
00:16:31.500 | I'm just shocked that the guy even has a storefront.
00:16:33.500 | I would have left a long time ago.
00:16:34.380 | You're showing a tweet that's a moment in time.
00:16:36.460 | And you're not showing the 10 steps that led up to it.
00:16:39.620 | Oh, 1,000 steps.
00:16:40.660 | The five times he called the police to do something about it.
00:16:43.300 | I framed it as dystopian from the get.
00:16:44.940 | I'm not genuflecting.
00:16:45.980 | I'm not genuflecting.
00:17:05.020 | It's addiction.
00:17:05.740 | It's addiction.
00:17:06.700 | You say you know this, and it's mental illness.
00:17:08.620 | Shellenberger's done the work.
00:17:09.940 | It's like he said, 99% of the people he talks to,
00:17:12.740 | it's either mental illness or addiction.
00:17:14.500 | But we keep using this word homeless
00:17:16.500 | to describe the problem.
00:17:18.140 | But the issue here is not the lack of housing,
00:17:20.900 | although that's a separate problem in California.
00:17:22.980 | But it's basically the lack of treatment.
00:17:25.860 | Totally.
00:17:26.340 | So we should be calling them treatmentless.
00:17:27.860 | And mandates around this, because--
00:17:29.340 | And enforcement.
00:17:30.140 | You cannot have-- you can't have a super drug
00:17:33.540 | be available for a nominal price and give people
00:17:38.380 | a bunch of money to come here and take it and not enforce it.
00:17:40.980 | You have to draw the line at fentanyl.
00:17:43.380 | I'm sorry.
00:17:44.020 | Fentanyl is a super drug.
00:17:45.500 | There's three alternatives.
00:17:46.820 | There's mandated rehab, mandated mental health or jail,
00:17:52.140 | or housing services.
00:17:54.340 | If you're not breaking the law, you don't have mental illness,
00:17:56.860 | you don't have drug addiction.
00:17:57.700 | And then provide-- those are the four paths of outcome here
00:18:00.180 | of success.
00:18:00.900 | And if all four of those paths were both mandated
00:18:03.780 | and available in abundance, this could be a tractable problem.
00:18:08.500 | Unfortunately, the mandate-- I mean,
00:18:10.860 | you guys remember that Kevin Bacon movie,
00:18:12.820 | where Kevin Bacon was locked up in a mental institution,
00:18:16.220 | but he wasn't like--
00:18:18.100 | he wasn't mentally ill?
00:18:19.340 | It's a famous story.
00:18:20.420 | Footloose?
00:18:21.540 | What's that?
00:18:22.380 | Footloose.
00:18:23.380 | No, it's a famous story.
00:18:24.820 | You guys-- someone's probably going to call me an idiot
00:18:27.060 | for messing this whole thing up.
00:18:29.740 | But I think there's a story where mandated mental health
00:18:37.260 | services, like locking people up to take care of them
00:18:40.460 | when they have mental health issues like this,
00:18:43.260 | became kind of inhumane.
00:18:45.660 | And a lot of the institutions were shut down,
00:18:47.500 | and a lot of the laws were overturned.
00:18:49.300 | And there are many of these cases that happened,
00:18:51.620 | where they came across as torturous to what
00:18:53.860 | happened to people that weren't mentally ill.
00:18:55.980 | And so the idea was, let's just abandon the entire product.
00:18:58.500 | Like Cuckoo's Nest?
00:18:59.780 | That's another good one.
00:19:00.820 | Yeah.
00:19:01.340 | Well, that's another one, right?
00:19:02.700 | And it's unfortunate, but I think that there's some--
00:19:05.060 | we talk a lot about nuance and gray areas,
00:19:06.940 | but there's certainly some solution here
00:19:09.020 | that isn't black or white.
00:19:10.100 | It's not about not having mandated mental health
00:19:12.020 | services, and it's not about locking everyone up
00:19:14.860 | that has some slight problem.
00:19:16.540 | But there's some solution here that needs to be crafted,
00:19:19.500 | where you don't let people suffer,
00:19:21.060 | and you don't let people suffer both as the victim
00:19:24.860 | on the street, but also the victim in the community.
00:19:27.020 | You're talking about a 5150, I think,
00:19:28.020 | like when people are held because they're
00:19:30.660 | a danger to themselves or others kind of thing?
00:19:33.100 | Right, but Jacob, let's think about the power of language
00:19:34.900 | here.
00:19:35.400 | If we refer to these people as untreated persons instead
00:19:39.380 | of homeless persons, and that was the coverage 24/7
00:19:42.740 | in the media is, this is an untreated person,
00:19:45.220 | the whole policy prescription would be completely different.
00:19:47.720 | We'd realize there's a shortage of treatment.
00:19:50.020 | We'd realize there's a shortage of remedies related
00:19:52.220 | to getting people in treatment, as opposed to building housing.
00:19:56.740 | But why--
00:19:58.100 | And laws that mandate it, that don't enable it.
00:20:00.860 | Because if you don't mandate it, then you
00:20:02.780 | enable the free rein and the free living on the street
00:20:05.820 | and the open drug markets and all this sort of stuff.
00:20:08.140 | There's a really easy test for this.
00:20:10.460 | If it was yourself, and you were addicted,
00:20:14.180 | or if it was a loved one, one of your immediate family members,
00:20:16.980 | would you want yourself or somebody else
00:20:19.500 | to be picked up off the street and held with a 5150
00:20:22.260 | or whatever code involuntarily against their will
00:20:26.220 | because they were a danger?
00:20:27.340 | Would you want them to be allowed to remain on the street?
00:20:29.740 | Would you want yourself if you were in that dire straits?
00:20:32.260 | And the answer, of course, is you
00:20:34.500 | would want somebody to intervene.
00:20:35.980 | But what's the liberal policy perspective on this, J. Cal?
00:20:38.320 | So let me ask you as our diehard liberal on the show--
00:20:41.060 | No, I'm not a diehard liberal.
00:20:42.460 | No, he's an independent and only votes for Democrats.
00:20:44.780 | Please, get it right.
00:20:45.660 | 75% of the time I vote a Democrat, 25% Republican.
00:20:47.940 | Please, get it right.
00:20:48.820 | Independent votes for Democrats.
00:20:50.460 | 25% Republicans.
00:20:51.900 | Is it not that your individual liberties are infringed upon
00:20:54.540 | if you were to be, quote, "picked up and put away"?
00:20:57.420 | My position on it is if you're not thinking straight
00:21:00.780 | and you're high on fentanyl, you're
00:21:02.700 | not thinking for yourself.
00:21:03.940 | And you could lose the liberty for a small period of time--
00:21:06.820 | 72 hours, a week--
00:21:09.340 | especially if you're a danger to somebody, yourself
00:21:12.620 | or other people.
00:21:13.420 | And in this case, if you're on fentanyl, if you're on meth,
00:21:15.940 | you're a danger to society.
00:21:17.740 | I think if more people had that point of view
00:21:21.020 | and had that debate, as Sax is saying, in a more open way,
00:21:24.180 | you could get to some path to resolution on--
00:21:26.220 | Just not in San Francisco.
00:21:27.660 | It's not how it happened.
00:21:28.700 | So you guys know this.
00:21:30.940 | We won't say who it is, but someone in my family
00:21:34.500 | has some pretty severe mental health issues.
00:21:36.700 | And the problem is, because they're an adult,
00:21:41.660 | you can't get them to get any form of treatment whatsoever.
00:21:45.700 | Right, right.
00:21:47.660 | You only have the nuclear option.
00:21:49.740 | And the nuclear option is you basically take that person
00:21:52.020 | to court and try to seize their power of attorney, which
00:21:54.660 | is essentially saying that--
00:21:56.180 | Individual liberties are gone.
00:21:57.900 | And by the way, it is so unbelievably restrictive
00:22:00.940 | what happens if you lose that power of attorney
00:22:03.860 | and somebody else has it over you.
00:22:06.840 | It's just a huge burden that the legal system
00:22:10.660 | makes extremely difficult.
00:22:12.540 | And the problem--
00:22:13.340 | Well, some of the law is a backstop.
00:22:14.500 | If the person's committing something illegal,
00:22:16.460 | like camping out or doing fentanyl, meth, whatever,
00:22:20.180 | you can use the law as the backstop against personal
00:22:23.100 | liberty.
00:22:23.600 | You can't use the law.
00:22:23.900 | All that person can do is really get arrested.
00:22:25.820 | Even that is not a high enough bar
00:22:27.660 | to actually get power of attorney over somebody.
00:22:29.660 | The other thing that I just wanted you guys to know--
00:22:31.940 | I think you know this, but just a little historical context--
00:22:34.860 | is a lot of this crisis in mental health
00:22:37.580 | started because Reagan defunded all the psychiatric hospitals.
00:22:41.100 | He emptied them in California.
00:22:43.580 | And that compounded, because for whatever reason,
00:22:46.420 | his ideology was that these things should
00:22:48.780 | be treated in a different way.
00:22:50.540 | And when he got to the presidency,
00:22:52.700 | one of the things that he did was
00:22:54.100 | he repealed the Mental Health--
00:22:55.700 | I think it's called the Mental Health Systems Act, MHSA,
00:23:00.420 | which completely broke down some pretty landmark legislation
00:23:03.260 | on mental health.
00:23:04.860 | And I don't think we've ever really recovered.
00:23:06.900 | We're now 42 years onward from 1980, or 43 years onward.
00:23:11.420 | But just something for you guys to know that that's--
00:23:14.900 | Reagan had a lot of positives, but that's
00:23:16.860 | one definitely negative check in my book against his legacy
00:23:21.060 | is his stance on mental health in general
00:23:22.740 | and what he did to defund mental health.
00:23:24.540 | Well, let me make two points there.
00:23:26.740 | So I'm not defending that specific decision.
00:23:29.400 | There were a bunch of scandals in the 1970s,
00:23:32.220 | and epitomized by the movie One Flew
00:23:33.860 | Over the Cuckoo's Nest with Jack Nicholson,
00:23:35.860 | about the conditions in these mental health homes.
00:23:39.580 | And that did create a groundswell
00:23:41.780 | to change laws around that.
00:23:43.780 | But I think this idea that somehow Reagan
00:23:47.100 | is to blame when he hasn't been in office for 50 years,
00:23:50.060 | as opposed to the politicians who've
00:23:51.540 | been in office for the last 20 years,
00:23:53.540 | I just think it's letting them off the hook.
00:23:55.340 | I mean, Gavin Newsom, 10, 15 years ago
00:23:57.900 | when he was mayor of San Francisco,
00:23:59.460 | declared that he would end homelessness within 10 years.
00:24:03.240 | He just made another declaration like that as governor.
00:24:06.180 | So I just feel like--
00:24:07.500 | I'm not saying it's Reagan's fault.
00:24:09.460 | I'm just saying--
00:24:10.180 | Well, I just think it's letting these guys off the hook.
00:24:11.340 | --it's an interesting historical moment.
00:24:12.220 | I think it's letting the politicians off the hook.
00:24:14.380 | Society needs to start thinking about changing priorities.
00:24:17.260 | We didn't have this problem of massive numbers of people
00:24:20.340 | living on the streets 10, 15 years ago.
00:24:23.100 | It was a much smaller problem.
00:24:24.700 | And I think a lot of it has to do with fentanyl.
00:24:26.660 | The power of these drugs has increased massively.
00:24:29.660 | There's other things going on here.
00:24:31.540 | So in any event, I mean, you can question what Reagan did
00:24:35.780 | in light of current conditions.
00:24:37.060 | But I think this problem really started in the last 10,
00:24:39.820 | 15 years.
00:24:41.100 | Like in an order of magnitude bigger way.
00:24:43.580 | These are super drugs.
00:24:44.500 | Until people realize these are a different class of drugs,
00:24:47.500 | and they start treating them as such,
00:24:49.700 | it's going to just get worse.
00:24:50.860 | There's no path.
00:24:52.500 | Oh, as far as I know, Reagan didn't hand out
00:24:55.620 | to these addicts $800 a week to feed their addiction
00:24:59.020 | so they could live on the streets of San Francisco.
00:25:01.140 | That is the current policy of the city.
00:25:02.900 | I hear you.
00:25:03.400 | It's a terrible policy.
00:25:04.400 | All I just wanted to just provide
00:25:06.140 | was just that color that we had a system of funding
00:25:10.180 | for the mental health infrastructure,
00:25:11.700 | particularly local mental health infrastructure.
00:25:14.180 | And we took that back.
00:25:16.460 | And then we never came forward.
00:25:18.180 | And all I was saying is I'm just telling you
00:25:20.580 | where that decision was made.
00:25:21.780 | I think that's part of the solution here is, yeah,
00:25:23.860 | we're going to have to basically build up shelters.
00:25:26.380 | We're going to have to build up homes.
00:25:28.060 | And to support your point, the problem now, for example,
00:25:31.220 | is Gavin Newsom says a lot of these things.
00:25:34.180 | And now he's gone from a massive surplus
00:25:37.380 | to a $25 billion deficit overnight, which we talked
00:25:40.780 | about even a year ago because that was just
00:25:43.140 | the law of numbers catching up with the state of California.
00:25:46.940 | And he's not in a position now to do any of this stuff.
00:25:49.260 | So this homeless problem may get worse.
00:25:51.540 | Well, they did appropriate--
00:25:53.460 | I forget the number.
00:25:54.380 | It's like $10 billion or something out of that huge
00:25:56.620 | budget they had to solve the problem of homelessness.
00:25:58.460 | I would just argue they're not tackling it in the right way.
00:26:01.060 | Because what happened is there's a giant special interest
00:26:04.220 | that formed around this problem, which
00:26:06.340 | is the building industry, who gets these contracts to build
00:26:12.060 | the, quote, "affordable housing" or the--
00:26:14.460 | Homeless industrial complex is insane.
00:26:16.020 | The homeless industrial complex.
00:26:17.380 | And so they end up building 10 units at a time
00:26:20.420 | on Venice Beach, like the most expensive land you could
00:26:23.540 | possibly build because you get these contracts
00:26:26.140 | from the government.
00:26:27.060 | So there's now a giant special interest
00:26:29.020 | in lobby that's formed around this.
00:26:31.020 | If you really want to solve the problem,
00:26:33.100 | you wouldn't be building housing on Venice Beach.
00:26:36.940 | You'd be going to cheap land just outside the city.
00:26:40.020 | Totally.
00:26:40.660 | And you'd be building scale shelters.
00:26:42.820 | I mean, shelters that can house 10,000 people, not 10.
00:26:47.900 | And you'd be having treatment services--
00:26:49.660 | But they also have to get treatment.
00:26:50.140 | You have to get treatment.
00:26:50.780 | But with treatment built into them, right?
00:26:52.660 | You'd be solving this problem at scale.
00:26:54.740 | And that's not what they're doing.
00:26:56.860 | By the way, do you guys want to hear this week in Grift?
00:27:00.180 | Sure.
00:27:01.020 | We're all in.
00:27:01.820 | That's a great example of Grift.
00:27:03.260 | I read something today in Bloomberg that was unbelievable.
00:27:07.260 | There's about $2 trillion of debt
00:27:09.780 | owed by the developing world that
00:27:13.180 | has been classified by a nonprofit, the Nature
00:27:16.300 | Conservancy in this case, as eligible for what
00:27:18.980 | they called nature swaps.
00:27:20.940 | So this is $2 trillion of the umpteen trillions of debt
00:27:23.780 | that's about to get defaulted on by countries like Belize,
00:27:27.580 | Ecuador, Sri Lanka, Seychelles, you name it.
00:27:31.900 | And what happens now are the big bulge bracket,
00:27:35.220 | Wall Street banks and the Nature Conservancy,
00:27:38.700 | goes to these countries and says, listen,
00:27:41.660 | you have a billion dollar tranche of debt that's
00:27:44.140 | about to go upside down.
00:27:46.660 | And you're going to be in default with the IMF.
00:27:48.980 | We'll let you off the hook.
00:27:50.940 | And we will negotiate with those bondholders
00:27:54.340 | to give them $0.50 on the dollar.
00:27:56.900 | But in return, you have to promise
00:27:59.420 | to take some of that savings and protect the rainforest
00:28:03.260 | or protect a coral reef or protect some mangrove trees.
00:28:07.180 | All sounds good.
00:28:08.740 | Except then what these folks do is
00:28:10.540 | they take that repackaged debt.
00:28:13.380 | They call it ESG.
00:28:15.340 | They mark it back up.
00:28:16.580 | And then they sell it to folks like BlackRock
00:28:18.900 | who have decided that they must own this in the portfolio.
00:28:21.980 | So it literally just goes from one sleeve of BlackRock,
00:28:24.940 | which is now marked toxic emerging market debt.
00:28:29.100 | And then it gets into someone's 401(k) as ESG debt.
00:28:33.460 | Isn't that unbelievable?
00:28:34.660 | So you could virtue signal and buy some ESG
00:28:37.340 | to make yourself feel good, yeah.
00:28:38.820 | $2 trillion of this paper.
00:28:39.940 | Here's all you have to know about ESG
00:28:41.580 | is that Exxon is like the number seven top-ranked company
00:28:46.020 | according to ESG.
00:28:47.340 | And Tesla is even on the list.
00:28:48.740 | Disastrous.
00:28:50.060 | How crazy is that?
00:28:52.780 | It's a complete scam.
00:28:54.620 | Yeah.
00:28:55.420 | All of those-- we've said this many times,
00:28:57.180 | but each of those letters individually
00:28:58.820 | means so much and should be worth a lot to a lot of people.
00:29:02.220 | But when you stick them together,
00:29:03.620 | it creates this toxic soup where you can just hide the cheese.
00:29:07.540 | Yeah, I mean, governance is important in companies.
00:29:10.260 | Of course, the environment is important.
00:29:12.660 | Social change is important.
00:29:14.180 | I mean, but why are these things grouped together in this?
00:29:17.620 | It just perverts the whole thing.
00:29:18.980 | It's an industry, J. Cal.
00:29:19.460 | It's an industry of consultants.
00:29:20.820 | It's a grift.
00:29:21.300 | Absolutely.
00:29:22.020 | All right, speaking of grifts, Microsoft
00:29:24.500 | is going to put $10 billion or something into ChatGPT.
00:29:28.340 | Degenerate AI, as I'm calling it now,
00:29:30.820 | is the hottest thing in Silicon Valley.
00:29:34.420 | The technology is incredible.
00:29:35.740 | I mean, you can question the business model, maybe,
00:29:38.100 | but the technology is pretty--
00:29:39.660 | Well, I mean, yeah.
00:29:40.460 | So what I'd say is $29 billion for a company that's
00:29:43.180 | losing $1 billion in Azure credits a year.
00:29:45.700 | That's one way to look at it.
00:29:47.220 | That's also a naive way to look at a lot of other businesses
00:29:50.380 | that ended up being worth a lot down the road.
00:29:52.300 | I mean--
00:29:52.900 | Sure.
00:29:53.820 | You can model out the future of a business like this
00:29:56.260 | and create a lot of really compelling big outcomes.
00:30:00.340 | Potentially, yeah.
00:30:01.780 | So Microsoft is close to investing $10 billion
00:30:04.060 | in open AI in a very convoluted transaction
00:30:06.620 | that people are trying to understand.
00:30:08.780 | It turns out that they might wind up owning 59% of open AI,
00:30:13.900 | but get 75% of the cash and profits back over time.
00:30:19.300 | 49%, yeah, of open AI.
00:30:21.220 | But they would get paid back the $10 billion
00:30:23.780 | over some amount of time.
00:30:26.140 | And this obviously includes Azure credits and chat GPT.
00:30:32.340 | As everybody knows, this just incredible demonstration
00:30:36.980 | of what AI can do in terms of text-based creation of content
00:30:41.500 | and answering queries has taken the net by storm.
00:30:45.060 | People are really inspired by it.
00:30:47.100 | Sax, do you think that this is a defensible, real technology?
00:30:51.820 | Or do you think this is like a crazy hype cycle?
00:30:55.540 | Well, it's definitely the next VC hype cycle.
00:30:57.660 | Everyone's kind of glomming onto this,
00:30:59.260 | because VC really right now needs a savior.
00:31:01.620 | Just look at the public markets, everything we're investing in,
00:31:04.220 | it's in the toilets.
00:31:05.060 | So we all really want to believe that this
00:31:07.260 | is going to be the next wave.
00:31:10.140 | And just because something is a VC hype cycle
00:31:13.060 | doesn't mean that it's not true.
00:31:14.980 | So as I think one of our friends pointed out,
00:31:19.180 | mobile turned out to be very real.
00:31:21.020 | I think cloud turned out to be, I'd say, very real.
00:31:25.500 | Social was sort of real in the sense
00:31:28.020 | that it did lead to a few big winners.
00:31:30.460 | On the other hand, web 3 and crypto was a hype cycle
00:31:34.860 | and it's turned into a big bust.
00:31:36.180 | VR falls into the hype cycle.
00:31:37.500 | I think VR is probably a hype cycle so far.
00:31:39.860 | No one can even explain what web 3 is.
00:31:42.780 | In terms of AI, I think that if I had to guess,
00:31:47.300 | I would say the hype is real in terms
00:31:50.180 | of its technological potential.
00:31:53.100 | However, I'm not sure about how much potential there is yet
00:31:57.340 | for VCs to participate.
00:31:58.780 | Because right now, it seems like this
00:32:01.580 | is something that's going to be done by really big companies.
00:32:04.620 | So open AI is basically a--
00:32:07.140 | it looks like kind of a Microsoft proxy.
00:32:09.460 | You've got Google, I'm sure, will develop it
00:32:11.340 | through their DeepMind asset.
00:32:14.460 | I'm sure Facebook is going to do something huge in AI.
00:32:18.100 | So what I don't know is, is this really
00:32:20.180 | a platform that startups are going to be able to benefit
00:32:22.940 | from?
00:32:23.700 | I will say that some of the companies we've invested in
00:32:26.740 | are starting to use these tools.
00:32:28.180 | So I guess where I am is I think the technology is actually
00:32:34.460 | exciting.
00:32:35.780 | I wouldn't go overboard on the valuations.
00:32:39.220 | I wouldn't buy into that level of the hype.
00:32:41.020 | But you think there could be hundreds
00:32:42.700 | of companies built around an API for something
00:32:46.780 | like ChatGBT, Dolly--
00:32:48.140 | Maybe, yeah.
00:32:48.740 | I don't think startups are going to be able to create
00:32:51.020 | the AI themselves.
00:32:53.500 | But they might be able to benefit from the APIs.
00:32:56.940 | Maybe that's the thing that has to be proven out.
00:32:59.580 | Freebird.
00:33:00.100 | There's a lot of really fantastic machine learning
00:33:04.020 | services available through cloud vendors today.
00:33:07.100 | So Azure has been one of these kind of vendors.
00:33:09.940 | And obviously, OpenAI is building tools a little bit
00:33:13.500 | further down on the stack.
00:33:16.180 | But there's a lot of tooling that
00:33:17.660 | can be used for specific vertical applications.
00:33:19.780 | Obviously, the acquisition of InstaDeep by BioNTech
00:33:22.900 | is a really solid example.
00:33:24.180 | And most of the big dollars that are flowing in biotech
00:33:26.780 | right now are flowing into machine learning applications
00:33:29.700 | where there's some vertical application of machine
00:33:31.740 | learning tooling and techniques around some specific problem
00:33:36.180 | And the problem set of mimicking human communication
00:33:40.540 | and doing generative media is a consumer application
00:33:45.300 | set that has a whole bunch of really interesting product
00:33:47.540 | opportunities.
00:33:48.620 | But let's not kind of be blind to the fact
00:33:51.140 | that nearly every other industry and nearly every other vertical
00:33:54.660 | is being transformed today.
00:33:55.980 | And there's active progress being made in funding
00:33:59.740 | and getting liquidity on companies and progress
00:34:02.980 | with actual products being driven by machine learning
00:34:05.540 | systems.
00:34:06.020 | There's a lot of great examples of this.
00:34:07.700 | So the fundamental capabilities of large data sets
00:34:11.900 | and then using these kind of learning techniques
00:34:15.380 | in software and statistical models
00:34:16.860 | to make kind of predictions and drive businesses forward
00:34:20.900 | in a way that they're not able to with just human knowledge
00:34:23.940 | and human capability alone is really real.
00:34:27.660 | And it's here today.
00:34:29.100 | And so I think let's not get caught up in the fact
00:34:31.180 | that there's this really interesting consumer market
00:34:33.660 | hype cycle going on, where these tools are not
00:34:36.060 | being kind of validated and generating
00:34:38.580 | real value across many other verticals and segments.
00:34:41.220 | Chamath, when you look at this Microsoft OpenAI deal,
00:34:44.700 | and you see something that's this convoluted,
00:34:47.220 | hard to understand, what does that signal to you
00:34:50.780 | as a capital allocator and company builder?
00:34:52.820 | I would put deals into two categories.
00:34:55.340 | One is easy and straightforward.
00:34:57.780 | And then two is, you know, too cute by half, or the too-hard bucket.
00:35:02.300 | This is clearly in that second category,
00:35:05.100 | but it doesn't mean that--
00:35:05.940 | - Why is it in that category?
00:35:07.140 | - Well, it doesn't mean that it won't work.
00:35:08.740 | In our group chat with the rest of the guys,
00:35:11.660 | one person said, there's a lot of complex law
00:35:14.820 | when you go from a nonprofit to a for-profit.
00:35:19.360 | There's lots of complexity in deal construction.
00:35:22.260 | The original investors have certain things
00:35:24.660 | that they want to see.
00:35:26.580 | There may or may not be, you know, legal issues at play here
00:35:29.820 | that you encapsulated well in the last episode.
00:35:32.820 | I think there's a lot of stuff we don't know.
00:35:34.980 | So I think it's important to just like,
00:35:37.520 | give those folks the benefit of the doubt.
00:35:39.140 | But yeah, if you're asking me,
00:35:41.300 | it's in the too-hard bucket for me to really take seriously.
00:35:43.800 | Now, that being said, it's not like I got shown the deal.
00:35:47.300 | So I can't comment.
00:35:49.020 | Here's what I will say.
00:35:50.980 | The first part of what Sax said,
00:35:52.260 | I think is really important for entrepreneurs
00:35:54.780 | to internalize, which is where can we make money?
00:35:59.420 | The reality is that, well, let me just take a prediction.
00:36:04.140 | I think that Google will open source their models
00:36:07.220 | because the most important thing that Google can do
00:36:11.420 | is reinforce the value of search.
00:36:14.460 | And the best way to do that is to scorch the earth
00:36:17.420 | with these models, which is to make them widely available
00:36:20.380 | and as free as possible.
00:36:22.160 | That will cause Microsoft to have to catch up.
00:36:25.740 | And that will cause Facebook to have to really look
00:36:28.180 | in the mirror and decide whether they're going to cap
00:36:32.460 | the betting that they've made on AR/VR
00:36:35.820 | and reallocate very aggressively to AI.
00:36:38.220 | I mentioned this in the, I did this Lex Fridman podcast,
00:36:41.380 | but that should be what Facebook does.
00:36:43.240 | And the reason is, if Facebook and Google and Microsoft
00:36:47.340 | have roughly the same capability and the same model,
00:36:50.640 | there's an element of machine learning
00:36:53.660 | that I think is very important,
00:36:54.900 | which is called reinforcement learning.
00:36:56.780 | And specifically it's reinforcement learning
00:36:58.700 | from human feedback, right?
00:37:00.220 | So these RLHF pipelines,
00:37:01.860 | these are the things that will make your stuff unique.
00:37:05.780 | So if you're a startup,
00:37:07.420 | you can build a reinforcement learning pipeline.
00:37:11.160 | You build a product that captures a bunch of usage.
00:37:13.460 | We talked about this before.
00:37:14.900 | That data set is unique to you as a company.
00:37:17.740 | You can feed that into these models,
00:37:19.740 | get back better answers, you can make money from it.
00:37:22.380 | Facebook has an enormous amount of reinforcement learning
00:37:26.180 | inside of Facebook.
00:37:27.140 | Every click, every comment, every like, every share.
00:37:32.100 | Twitter has that data set.
00:37:33.940 | Google inside of Gmail and search.
00:37:36.300 | Microsoft inside of Minecraft and Hotmail.
00:37:40.520 | So my point is, David's right.
00:37:42.700 | The huge companies, I think, will create the substrates.
00:37:47.700 | And I think they'll be forced to scorch the earth
00:37:52.060 | and give it away for free.
00:37:54.340 | And then on top of that is where you can make money.
00:37:56.820 | And I would just encourage entrepreneurs to think,
00:37:59.340 | where is my edge in creating a data set
00:38:02.540 | that I can use for reinforcement learning?
00:38:04.200 | That I think is interesting.
00:38:05.620 | That's kind of saying,
00:38:06.900 | I buy the ingredients from the supermarket,
00:38:09.780 | but then I can still construct a dish that's unique.
00:38:12.700 | And the salt is there, the pepper is there,
00:38:15.680 | but how I use that will determine
00:38:17.540 | whether you like the thing or not.
00:38:19.240 | And I think that that is the way
00:38:21.340 | that I think we need to start thinking about it.
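A minimal Python sketch of the reinforcement-learning-from-human-feedback idea Chamath describes here, assuming a hypothetical product that logs user preferences; the class names, prompts, and export format are illustrative only, not from the episode or any specific library. The point is just that ordinary product signals (clicks, likes, shares) can be captured as preference pairs and become the proprietary dataset a startup would later feed into a reward-model or RLHF fine-tuning pipeline.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FeedbackEvent:
        # One human-feedback signal captured from product usage.
        prompt: str    # what the user asked the model
        chosen: str    # the completion the user clicked/liked/shared
        rejected: str  # the alternative the user passed over

    @dataclass
    class PreferenceDataset:
        # The proprietary dataset a company accumulates from its own usage.
        events: List[FeedbackEvent] = field(default_factory=list)

        def record(self, prompt: str, chosen: str, rejected: str) -> None:
            # Log one preference pair (e.g., on every click, like, or share).
            self.events.append(FeedbackEvent(prompt, chosen, rejected))

        def to_training_rows(self) -> List[dict]:
            # Export in a (prompt, chosen, rejected) shape commonly used by
            # reward-model / RLHF fine-tuning pipelines.
            return [
                {"prompt": e.prompt, "chosen": e.chosen, "rejected": e.rejected}
                for e in self.events
            ]

    # Usage sketch: the edge is the data you capture, not the base model itself.
    dataset = PreferenceDataset()
    dataset.record(
        prompt="Summarize this earnings call",
        chosen="Revenue grew 12% year over year, driven by cloud.",
        rejected="The call happened on a Tuesday.",
    )
    print(dataset.to_training_rows())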
00:38:23.380 | - Interestingly, as we've all pointed out here,
00:38:26.100 | OpenAI was started as a nonprofit.
00:38:28.780 | The stated philosophy was this technology is too powerful
00:38:31.620 | for any company to own.
00:38:33.660 | Therefore, we're going to make it open source.
00:38:35.900 | And then somewhere in the last couple of years,
00:38:37.300 | they said, well, you know what, actually,
00:38:38.820 | it's too powerful for it to be out there in the public.
00:38:42.020 | We need to make this a private company
00:38:44.380 | and we need to get $10 billion from Microsoft.
00:38:48.000 | That is the disconnect I am trying to understand.
00:38:51.700 | That's the most interesting part of the story, Jason,
00:38:54.060 | I think, if you go back to 2014,
00:38:56.460 | is when Google bought DeepMind.
00:38:58.540 | And immediately everyone started reacting
00:39:01.560 | to a company as powerful as Google,
00:39:03.540 | having a toolkit and a team as powerful
00:39:06.100 | as DeepMind within them.
00:39:07.940 | And that sort of power should not sit in anyone's hands.
00:39:10.660 | I heard people that I'm close with
00:39:11.980 | that are close to the organization
00:39:13.700 | and the company comment that they thought
00:39:15.680 | this is the most kind of scary, threatening,
00:39:18.420 | biggest threat to humanity
00:39:20.300 | is Google's control of DeepMind.
00:39:21.820 | And that was a naive kind of point of view,
00:39:24.660 | but it was one that was close,
00:39:25.900 | that was deeply held by a lot of people.
00:39:27.480 | So Reid Hoffman, Peter Thiel, Elon Musk,
00:39:29.820 | a lot of these guys funded the original
00:39:32.020 | kind of OpenAI business in 2015.
00:39:34.580 | And here's the link.
00:39:35.700 | So I'm putting it out here.
00:39:37.300 | You guys can pull up the original blog post.
00:39:39.140 | - Do all those people who donated get stock in?
00:39:42.300 | - So what happened was they were all in a non,
00:39:43.860 | it was all in a nonprofit.
00:39:45.040 | And then the nonprofit owned stock
00:39:46.700 | in a commercial business now.
00:39:48.280 | But your point is interesting
00:39:49.500 | because at the beginning,
00:39:50.940 | the idea was instead of having Google own all of this,
00:39:53.140 | we'll make it all available.
00:39:54.400 | And here's the statement from the original blog post in 2015,
00:39:58.260 | OpenAI is a nonprofit AI research company.
00:40:01.220 | Our goal is to advance digital intelligence
00:40:03.240 | in the way that is most likely to benefit humanity
00:40:05.180 | as a whole, unconstrained by a need
00:40:08.100 | to generate financial return.
00:40:10.180 | Since our research is free from financial obligations,
00:40:13.380 | we can better focus on a positive human impact.
00:40:16.060 | And they kind of went on and the whole thing
00:40:17.820 | about Sam, Greg, Elon, Reid, Jessica, Peter Thiel, AWS,
00:40:22.700 | YC are all donating to support OpenAI,
00:40:25.260 | including donations and commitments of over a billion
00:40:27.620 | dollars, although we expect that to only be a tiny fraction
00:40:32.060 | of what we will spend in the next few years,
00:40:35.140 | which is a really interesting kind of,
00:40:37.280 | if you look back historical perspective
00:40:39.100 | on how this thing all started seven years ago
00:40:41.380 | and how quickly it's evolved as you point out
00:40:44.060 | into the necessity to have a real commercial alignment
00:40:47.180 | to drive this thing forward without seeing any
00:40:49.220 | of these models open sourced.
00:40:50.700 | And during that same period of time,
00:40:53.420 | we've seen Google share AlphaFold and share a number
00:40:56.820 | of other predictive models and toolkits
00:40:59.740 | and make them publicly available
00:41:01.500 | and put them in Google's cloud.
00:41:04.220 | And so there's both kind of tooling and models
00:41:07.000 | and outputs of those models that Google has open sourced
00:41:09.860 | and made freely available.
00:41:11.220 | And meanwhile, OpenAI has kind of diverged
00:41:14.060 | into this deeply profitable, profit seeking
00:41:16.940 | kind of enterprise model.
00:41:18.340 | And when you invest in OpenAI,
00:41:20.940 | in the round that they did before,
00:41:22.980 | you could generate a financial return capped at 100X,
00:41:26.060 | which is still a pretty amazing financial return.
00:41:27.980 | You put a billion dollars in,
00:41:29.340 | you can make a hundred billion dollars.
00:41:30.660 | That's funding a real commercial endeavor at that point.
00:41:34.500 | - Well, and then to-
00:41:35.980 | - It is the most striking question about this whole thing,
00:41:38.260 | about what's going on in our AI.
00:41:40.220 | And it's one that Elon's talked about publicly
00:41:42.180 | and others have kind of sat on one side or the other,
00:41:45.060 | which is that AI offers a glimpse
00:41:49.300 | into one of the biggest
00:41:50.660 | and most kind of existential threats to humanity.
00:41:53.380 | And the question we're all gonna be tackling
00:41:56.180 | and the battle that's gonna be happening politically
00:41:58.580 | and regulatory wise,
00:42:00.060 | and perhaps even between nations in the years to come,
00:42:03.100 | is who owns the AI, who owns the models,
00:42:05.360 | what can they do with it?
00:42:06.340 | And what are we legally gonna be allowed to do with it?
00:42:08.460 | And this is a really important part of that story, yeah.
00:42:10.700 | - To build on what you're saying,
00:42:11.980 | I just put in PyTorch,
00:42:13.620 | for people who don't know, that's another framework, P-Y-T-O-R-C-H.
00:42:18.260 | This was largely built inside of Facebook.
00:42:20.900 | And then Facebook said,
00:42:21.740 | "Hey, we want to democratize machine learning."
00:42:24.860 | And they made,
00:42:25.700 | and I think they put a bunch of executives,
00:42:27.500 | they may have even funded those executives
00:42:29.060 | to go work on this open source project.
00:42:31.940 | So they have a huge stake in this
00:42:33.700 | and they went very open source with it.
00:42:37.260 | And then TensorFlow,
00:42:38.460 | which you have an investment in Chamath,
00:42:42.020 | TensorFlow was inside of--
00:42:43.980 | - No, I don't have an investment in TensorFlow.
00:42:46.860 | We built--
00:42:47.700 | - No, TensorFlow, the public source came out of Google
00:42:49.300 | and then you invested in another company.
00:42:51.060 | - But we were building Silicon for machine learning.
00:42:53.100 | That's different.
00:42:54.260 | - Right.
00:42:55.100 | But it's based on TensorFlow.
00:42:56.900 | - No, no, no, no.
00:42:57.740 | The founder of this company was the founder of TensorFlow.
00:43:01.820 | - Oh, got it, okay.
00:43:02.860 | - Oh, sorry, not of TensorFlow, pardon me, of TPU,
00:43:06.420 | which was Google's internal Silicon
00:43:08.300 | that they built to accelerate TensorFlow.
00:43:11.260 | - Right.
00:43:12.100 | - If that makes sense.
00:43:13.460 | - And so that's the,
00:43:15.500 | I don't mean to be cynical about the whole project or not,
00:43:18.780 | it's just the confounding part of this
00:43:20.860 | and what is happening here.
00:43:21.900 | It reminds me, I don't know if you remember this,
00:43:24.300 | but Chamath and René allowed--
00:43:26.300 | - The biggest opportunity here is for Facebook.
00:43:28.020 | I mean, they need to get in this conversation, ASAP.
00:43:31.700 | I mean, to think that, like, look,
00:43:33.940 | PyTorch was like a pretty seminal piece of technology
00:43:37.180 | that a lot of folks in AI and machine learning
00:43:39.700 | were using for a long time.
00:43:41.100 | TensorFlow before that.
00:43:44.260 | And what's so funny about Google and Facebook
00:43:46.740 | is they're a little bit kind of like,
00:43:49.080 | they're not really making that much progress.
00:43:51.220 | I mean, Facebook released this kind of like
00:43:54.180 | rando version of AlphaFold recently.
00:43:57.180 | It's not that good.
00:43:58.480 | I think these companies really need to get these products
00:44:03.740 | in the wild as soon as possible.
00:44:05.560 | It cannot be the case that--
00:44:06.400 | - Is that, yeah.
00:44:07.460 | - You have to email people and get on some list,
00:44:09.620 | I mean, this is Google and Facebook, guys, come on.
00:44:12.100 | Get going. - This is the,
00:44:14.100 | I think the big innovation of OpenAI,
00:44:16.940 | Sacks, to bring you into the conversation.
00:44:19.520 | They actually made an interface
00:44:21.300 | and let the public play with it
00:44:23.180 | to the tune of $3 million a day in cloud credits or costs.
00:44:27.380 | Which, you know-- - By the way, just on that,
00:44:30.340 | my son was telling me, he's like,
00:44:32.220 | "Hey, dad, do you want me to tell you
00:44:33.660 | when the best time to use ChatGPT is?"
00:44:35.460 | And I'm like, "Huh?"
00:44:36.340 | He's like, "Yeah, my friends and I,
00:44:38.500 | we've been using it so much,
00:44:39.460 | we know now when we can actually get resources."
00:44:43.420 | - Oh, wow. - And it's such
00:44:44.540 | an interesting thing where a 13-year-old kid knows
00:44:48.060 | when it's mostly compute-intensive that it's unusable
00:44:51.100 | and when to come back and use it.
00:44:52.380 | It's incredible. - When's the last time,
00:44:53.620 | Sacks, that technology became this mainstream
00:44:56.380 | and captured people's imagination this broadly?
00:44:59.480 | Like ChatGPT is. - It's been a while.
00:45:01.860 | I don't know, maybe the iPhone or something.
00:45:04.380 | Yeah, look, it's powerful.
00:45:06.060 | There's no question it's powerful.
00:45:07.100 | I mean, I'm of two minds about it
00:45:08.860 | because whenever something is the hype cycle,
00:45:11.420 | I just reflexively want to be skeptical of it.
00:45:14.000 | But on the other hand, we have made a few investments
00:45:16.660 | in this area.
00:45:18.100 | And I mean, I think it is powerful
00:45:20.420 | and it's going to be an enabler
00:45:21.980 | of some really cool things to come.
00:45:23.300 | There's no question about it.
00:45:24.340 | - I have two pieces of more insider information.
00:45:28.020 | One, I have a ChatGPT iOS app on my phone.
00:45:31.420 | One of the nice folks at OpenAI
00:45:33.580 | included me in the test flight.
00:45:35.540 | And it's the simplest interface you've ever seen,
00:45:37.740 | but basically you type in your question,
00:45:39.700 | but it keeps your history.
00:45:41.180 | And then you can search your history.
00:45:42.300 | So it looks, Sacks, like you're in iMessage, basically.
00:45:46.180 | And it has your threads.
00:45:47.580 | And so I asked it,
00:45:48.420 | "Hey, what are the best restaurants in Yountville?"
00:45:51.100 | A town near Napa.
00:45:53.380 | And then I said, "Which one has the best duck?"
00:45:55.540 | And it literally like gave me a great answer.
00:45:57.780 | And then I thought, "Wait a second,
00:45:58.620 | "why is this not using a Siri or Alexa-like interface?
00:46:02.840 | "And then why isn't it, oh, here's a video of it."
00:46:04.940 | And I gave the video to Nick.
00:46:06.260 | - By the way, Jason, this, what you're doing right now
00:46:08.580 | is you're creating a human feedback
00:46:10.860 | reinforcement learning pipeline for ChatGPT.
00:46:13.780 | So just the fact that you asked that question,
00:46:16.620 | and over time, if ChatGPT has access to your GPS information
00:46:21.260 | and then knows that you went to restaurant A versus B,
00:46:24.460 | it can intuit.
00:46:25.420 | And it may actually prompt you to ask,
00:46:27.880 | "Hey, Jason, we noticed you were in the area.
00:46:30.360 | "Did you go to Bottega?
00:46:31.420 | "If you did, how would you rate it one through five?"
00:46:34.020 | That reinforcement learning now allows the next person
00:46:36.780 | that asks, "What are the top five restaurants?"
00:46:38.540 | to say, "Well, over a thousand people
00:46:40.880 | "that have asked this question,
00:46:41.800 | "here's actually the best answer,"
00:46:43.860 | versus a generic rank of the open web,
00:46:46.460 | which is what the first data set is.
00:46:48.380 | That's what's so interesting about this.
00:46:50.300 | So this is why, if you're a company
00:46:52.460 | that already owns the eyeballs,
00:46:54.460 | you have to be running to get this stuff out there.
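A hedged sketch of the aggregation step being described: pooled ratings from earlier users re-rank what the next user is shown, instead of a generic rank of the open web. The restaurant names and vote counts below are invented for illustration; nothing here reflects how ChatGPT actually works internally.

```python
# Toy illustration of feedback-weighted re-ranking (all data invented).
from collections import defaultdict

# Generic, "open web" ordering a model might start from.
generic_rank = ["The French Laundry", "Bottega", "Bouchon"]

# Hypothetical ratings collected from users who were later asked
# "you were in the area -- how would you rate it, 1 through 5?"
user_ratings = [
    ("Bouchon", 5), ("Bouchon", 4), ("Bottega", 4),
    ("The French Laundry", 4), ("Bouchon", 5), ("Bottega", 3),
]

def rerank(candidates, ratings):
    """Order candidates by average observed rating, falling back to the
    generic rank (lower index wins) when two averages tie or are missing."""
    totals, counts = defaultdict(float), defaultdict(int)
    for name, score in ratings:
        totals[name] += score
        counts[name] += 1
    def key(name):
        avg = totals[name] / counts[name] if counts[name] else 0.0
        return (-avg, candidates.index(name))
    return sorted(candidates, key=key)

print(rerank(generic_rank, user_ratings))
# -> ['Bouchon', 'The French Laundry', 'Bottega'] once feedback accumulates
```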
00:46:57.740 | - Well, and then this answer, you cited Yelp.
00:47:01.100 | Well, this is the first time I've actually seen
00:47:03.140 | a ChatGPT citation,
00:47:04.740 | and this is, I think, a major legal breakthrough.
00:47:06.780 | It didn't put a link in,
00:47:08.500 | but if it's gonna use Yelp's data,
00:47:10.780 | I don't know if they have permission from Yelp,
00:47:12.180 | but it's quoting Yelp here,
00:47:13.620 | it should link to The French Laundry, Bottega, and Bouchon.
00:47:16.300 | Bouchon actually has the best duck confit, for the record.
00:47:19.620 | And I did have that duck,
00:47:20.840 | so I asked this afterwards to see,
00:47:23.060 | you know, in a scenario like this.
00:47:24.720 | But it could also, if I was talking to it,
00:47:27.060 | I could say, "Hey, which one has availability
00:47:28.900 | "this afternoon or tomorrow for dinner?"
00:47:31.980 | And make the phone call for me,
00:47:33.180 | like Google Assistant does,
00:47:34.380 | or any number of next tasks. - I was thinking about--
00:47:37.980 | - This was an incredibly powerful display
00:47:40.740 | in a 1.0 product.
00:47:42.060 | - I was thinking about what you said last week,
00:47:43.860 | and I thought back to the music industry
00:47:47.060 | in the world of Napster.
00:47:50.660 | And what happened was, there was a lot of musicians,
00:47:54.280 | I think Metallica being the most famous one,
00:47:57.480 | famously suing Napster, because it was like,
00:48:00.580 | "Hey, listen, you're allowing people to take my content,
00:48:04.080 | "which they would otherwise pay for.
00:48:05.480 | "There's economic damage that I can measure."
00:48:08.180 | That legal argument was meaningful enough
00:48:10.200 | that ultimately Napster was shut down.
00:48:11.880 | Now, there were other versions of that,
00:48:14.140 | that folks created, including us at Winamp.
00:48:16.260 | We created a headless version of that.
00:48:18.700 | But if you translate that problem set here,
00:48:21.820 | is there a claim that Yelp can make in this example,
00:48:25.700 | that they're losing money?
00:48:27.900 | That if you were going through Google,
00:48:30.520 | or if you were going through their app,
00:48:33.740 | there's the sponsored link revenue
00:48:35.560 | and the advertising revenue that they would have got
00:48:37.660 | that they wouldn't get from here.
00:48:39.060 | Now, that doesn't mean that ChatGPT can't figure that out,
00:48:42.420 | but it's those kinds of problems
00:48:43.740 | that are gonna be a little thorny in these next few years
00:48:47.100 | that have to really get figured out.
00:48:49.560 | This is a very--
00:48:51.420 | - If you were a human reading every review on Yelp
00:48:54.480 | about duck, then you could write a blog post
00:48:58.400 | in which you say, "Many reviewers on Yelp
00:49:00.520 | "say that Bouchon is the best duck."
00:49:02.440 | So the question is, is GPT held to that standard?
00:49:06.760 | - Yeah, exactly. - Or is it,
00:49:07.700 | or something different?
00:49:09.640 | And is linking to it enough?
00:49:12.200 | - This is the question that I'm asking.
00:49:13.480 | I don't know.
00:49:14.460 | It should be, I'll argue it should be,
00:49:16.680 | because if you look at the four-part test for fair use,
00:49:19.480 | which I had to go through
00:49:20.320 | because blogging had the same issue.
00:49:21.600 | We would write a blog post
00:49:23.720 | and we would mention Walt Mossberg's review of a product
00:49:26.080 | and somebody else's.
00:49:26.920 | And then people would say,
00:49:27.740 | "Oh, I don't need to read Walt Mossberg's
00:49:29.120 | "Indiana Wall Street Journal subscription."
00:49:30.720 | And we'd say, "Well, we're doing an original work.
00:49:32.160 | "We're comparing two or three different."
00:49:34.200 | You know, human is comparing two or three different reviews
00:49:39.160 | and we're adding something to it.
00:49:40.600 | It's not interfering with Walt Mossberg's ability
00:49:45.200 | to get subscribers in the Wall Street Journal.
00:49:47.440 | But the effect on the potential market
00:49:49.680 | is one of the four tests.
00:49:50.960 | And just reading from Stanford's quote on fair use,
00:49:54.320 | "Another important fair use factor
00:49:55.560 | "is whether your use deprives the copyright owner of income
00:49:59.100 | "or undermines a new or potential market
00:50:01.340 | "for the copyrighted work.
00:50:02.980 | "Depriving a copyright owner of income
00:50:05.200 | "is very likely to trigger a lawsuit."
00:50:07.320 | This is true even if you are not competing directly
00:50:09.560 | with the original work.
00:50:11.400 | And we'll put the link to Stanford here.
00:50:13.240 | This is the key issue.
00:50:14.380 | And I would not use Yelp.
00:50:16.160 | In this example, I would not open the Yelp app.
00:50:18.880 | Yelp would get no commerce and Yelp would lose this.
00:50:20.960 | So, ChatGPT and all these services must use citations
00:50:25.960 | of where they got the original work.
00:50:27.500 | They must link to them and they must get permission.
00:50:29.380 | That's where this is all gonna shake out.
00:50:31.080 | And I believe that's--
00:50:31.920 | - Using permission?
00:50:32.740 | Well, forget about permission.
00:50:33.580 | I mean, you can't get a big enough data set
00:50:35.560 | if you have to get permission in advance, right?
00:50:38.160 | You have to go out and negotiate.
00:50:39.880 | - It's gonna be the large data sets.
00:50:41.400 | Quora, Yelp, the App Store reviews, Amazon's reviews.
00:50:46.520 | So, there are large corpuses of data that you would need.
00:50:49.760 | Like Craigslist has famously never allowed anybody
00:50:52.080 | to scrape Craigslist.
00:50:53.280 | The amount of data inside Craigslist,
00:50:55.480 | but one example of a data set,
00:50:57.280 | would be extraordinary to build ChatGPT on.
00:50:59.920 | ChatGPT is not allowed to because,
00:51:02.000 | as you brought up robots.txt last week,
00:51:04.340 | there's gonna need to be an AI.txt.
00:51:08.240 | Are you allowed to use my data set in AI?
00:51:10.240 | And how will I be compensated for it?
00:51:13.700 | I'll allow you to use Craigslist,
00:51:15.800 | but you have to link to the original post,
00:51:18.560 | and you have to note that.
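There is no AI.txt standard today; robots.txt is the real precedent. Purely as a sketch of the idea being floated here, below is a hypothetical ai.txt format and parser. The directive names (allow-training, require-attribution, license-contact) and the domain are made up to show how a site could signal whether, and on what terms, its data may be used for model training.

```python
# Hypothetical "ai.txt" -- NOT a real standard; directive names are invented
# here to illustrate robots.txt-style consent for AI training.
SAMPLE_AI_TXT = """\
# Example policy a classifieds or reviews site might publish
user-agent: *
allow-training: no

user-agent: example-research-bot
allow-training: yes
require-attribution: yes
license-contact: data-licensing@example.com
"""

def parse_ai_txt(text: str) -> dict[str, dict[str, str]]:
    """Return {user_agent: {directive: value}} from an ai.txt-style policy."""
    policies: dict[str, dict[str, str]] = {}
    current = None
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()   # drop comments and blanks
        if not line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key == "user-agent":
            current = policies.setdefault(value, {})
        elif current is not None:
            current[key] = value
    return policies

def may_train(policies, agent: str) -> bool:
    """Check a crawler's permission, falling back to the wildcard rule."""
    rules = policies.get(agent, policies.get("*", {}))
    return rules.get("allow-training", "no") == "yes"

policies = parse_ai_txt(SAMPLE_AI_TXT)
print(may_train(policies, "example-research-bot"))  # True
print(may_train(policies, "some-other-crawler"))    # False (falls back to *)
```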
00:51:20.480 | - The other gray area that isn't there today,
00:51:22.640 | but may emerge, is when section 230 gets rewritten.
00:51:27.560 | Because if they take the protections away
00:51:29.560 | for the Facebook and the Googles of the world,
00:51:31.260 | for the basically for being an algorithmic publisher,
00:51:34.560 | and saying an algorithm is equivalent to a publisher.
00:51:37.580 | What it's essentially saying is that an algorithm
00:51:40.560 | is kind of like doing the work of a human
00:51:42.700 | in a certain context.
00:51:43.600 | And I wonder whether that's also an angle here,
00:51:45.640 | which now this algorithm, which today,
00:51:48.280 | David, you said the example,
00:51:49.920 | I read all these blog posts, I write something.
00:51:52.520 | But if an algorithm does it,
00:51:53.880 | maybe can you then say, no, actually there was intent there
00:51:56.880 | that's different than if a human were to do it?
00:51:58.840 | I don't know.
00:51:59.660 | My point is, very complicated issues
00:52:02.800 | that are gonna get sorted out.
00:52:04.680 | And I think the problem with the hype cycle
00:52:08.200 | is that you're gonna have to marry it
00:52:09.500 | with an economic model for VCs to really make money.
00:52:12.980 | And right now there's just too much betting on the come.
00:52:15.020 | So to the extent you're going to invest,
00:52:17.260 | it makes sense that you put money into open AI
00:52:19.400 | because that's safe.
00:52:20.440 | Because the economic model of how you make money
00:52:23.720 | for everybody else is so unclear.
00:52:25.840 | - No, it's clear actually, I have it for business.
00:52:28.880 | I just signed up for chat GPT premium.
00:52:31.280 | They had a survey that they shared on their discord server.
00:52:34.800 | And I filled out the survey
00:52:35.880 | and they did a price discovery survey, Freeberg.
00:52:38.960 | What's the least you would pay, the most you would pay,
00:52:41.320 | what would be too cheap of a price for chat GPT pro,
00:52:44.520 | and what would be too high of a price?
00:52:45.960 | I put in like 50 bucks a month would be what I would pay.
00:52:49.080 | But I was just thinking, imagine ChatGPT
00:52:51.080 | allowed you, Freeberg, to have a Slack channel
00:52:53.800 | called research.
00:52:55.040 | And you could go in there or anytime you're in Slack,
00:52:57.620 | you do slash chat or slash ChatGPT.
00:53:00.600 | And you say slash ChatGPT, tell me,
00:53:03.200 | what are the venues available in,
00:53:05.520 | which we did this actually for,
00:53:07.520 | I did this for venues for the All-In Summit.
00:53:10.280 | I can say, what are the venues that seat
00:53:11.400 | over 3000 people in Vegas?
00:53:14.440 | And it just gave us the answer.
00:53:15.760 | Okay, well, that was the job of the local event planner.
00:53:20.760 | They had that list.
00:53:21.840 | Now you can pull that list
00:53:23.280 | from a bunch of different sources.
00:53:24.720 | I mean, what would you pay for that?
00:53:26.880 | A lot.
00:53:27.720 | - Well, I think one of the big things that's happening
00:53:31.120 | is all the old business models don't make sense anymore.
00:53:36.120 | In a world where the software is no longer just doing
00:53:40.040 | what it's done for the last 60 years,
00:53:42.440 | which is what is historically defined
00:53:44.480 | as information retrieval.
00:53:46.320 | So you have this kind of hierarchical storage of data
00:53:50.320 | that you have some index against,
00:53:52.840 | and then you go and you search and you pull data out,
00:53:55.400 | and then you present that data back to the customer
00:53:57.600 | or the user of the software.
00:53:59.200 | And that's effectively been how all kind of data
00:54:04.200 | has been utilized in all systems
00:54:06.700 | for the past 60 years in computing.
00:54:09.600 | Largely, what we've really done
00:54:11.600 | is kind of built an evolution of application layers
00:54:14.080 | or software tools to interface with the fetching of that data,
00:54:17.800 | the retrieval of that data, and the display of that data.
00:54:20.400 | But what these systems are now doing,
00:54:22.600 | what AI type systems or machine learning systems now do,
00:54:25.880 | is the synthesis of that data
00:54:27.900 | and the representation of some synthesis of that data
00:54:31.720 | to you, the user, in a way that doesn't necessarily
00:54:34.760 | look anything like the original data
00:54:36.860 | that was used to make that synthesis.
00:54:39.420 | And that's where business models like a Yelp, for example,
00:54:42.380 | or like a web crawler that crawls the web
00:54:44.740 | and then presents webpage directories to you.
00:54:47.600 | Those sorts of models no longer make sense
00:54:49.740 | in a world where the software,
00:54:50.820 | the signal to noise is now greater.
00:54:52.840 | The signal is greater than the noise
00:54:54.620 | in being able to present to you a synthesis of that data
00:54:57.780 | and basically resolve what your objective is
00:55:01.200 | with your own consumption and interpretation of that data,
00:55:04.240 | which is how you've historically used these systems.
00:55:06.740 | And I think that's where there's,
00:55:08.260 | going back to the question of the hype cycle,
00:55:10.660 | I don't think it's about being a hype cycle.
00:55:12.500 | I think it's about the investment opportunity
00:55:15.220 | against fundamentally rewriting all compute tools.
00:55:18.420 | Because if all compute tools ultimately can use
00:55:21.280 | this capability in their interface and in their modeling,
00:55:24.380 | then it very much changes everything.
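To pin down the retrieval-versus-synthesis distinction being drawn here, a minimal side-by-side sketch follows. The keyword index is real, runnable code; the synthesis step is stubbed with a hypothetical generate() call standing in for whatever model API an application would actually use, since the specific interface is an assumption.

```python
# Retrieval vs. synthesis, in miniature (toy corpus, invented text).
DOCS = {
    "doc1": "Bouchon in Yountville is praised for its duck confit.",
    "doc2": "Bottega serves Italian food near Napa.",
    "doc3": "The French Laundry is a famous tasting-menu restaurant.",
}

def retrieve(query: str) -> list[str]:
    """Classic information retrieval: index, match, and hand back the
    stored documents themselves for the user to read and interpret."""
    terms = set(query.lower().split())
    return [doc_id for doc_id, text in DOCS.items()
            if terms & set(text.lower().split())]

def generate(prompt: str) -> str:
    """Stand-in for a model call (hypothetical -- no real API implied).
    A synthesis system returns new text conditioned on the sources,
    which may not resemble any single stored document."""
    return f"[model-written answer conditioned on: {prompt[:60]}...]"

def synthesize(query: str) -> str:
    """'AI-type' flow: retrieve, then ask a model to produce a synthesis
    rather than displaying the raw documents."""
    sources = " ".join(DOCS[d] for d in retrieve(query))
    return generate(f"Question: {query}\nSources: {sources}")

print(retrieve("best duck"))    # -> ['doc1']  (the document itself)
print(synthesize("best duck"))  # -> newly generated text, not a document
```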
00:55:26.320 | And one of the advantages that I think
00:55:28.300 | businesses are going to latch onto,
00:55:29.980 | which we talked about historically,
00:55:31.660 | is novelty in their data,
00:55:33.340 | in being able to build new systems and new models
00:55:36.700 | that aren't generally available.
00:55:38.980 | - Example?
00:55:40.140 | - In biotech and pharma, for example,
00:55:42.860 | having screening results from very expensive experiments
00:55:47.860 | and running lots of experiments
00:55:49.620 | and having a lot of data against those experiments
00:55:51.820 | gives a company an advantage
00:55:55.140 | in being able to do things like drug discovery.
00:55:57.140 | We're going to talk about that in a minute,
00:55:59.140 | versus everyone using publicly known screening libraries
00:56:02.180 | or publicly available protein modeling libraries,
00:56:05.580 | and then screening against those.
00:56:06.900 | And then everyone's got the same candidates
00:56:08.260 | and the same targets and the same clinical objectives
00:56:11.260 | that they're going to try and resolve from that output.
00:56:13.660 | So I think novelty in data
00:56:16.300 | is one way that advantage arises.
00:56:19.980 | But really, that's just kind of, where's there an edge?
00:56:23.580 | But fundamentally, every business model can
00:56:26.100 | and will need to be rewritten
00:56:27.900 | that's dependent on the historical,
00:56:31.460 | on the legacy of information retrieval
00:56:34.220 | as the core of what computing is used to do.
00:56:37.260 | - Sax, on my other podcast,
00:56:38.460 | I was having a discussion with Molly
00:56:39.780 | about the legal profession.
00:56:41.640 | What impact would it have if ChatGPT took every court case,
00:56:46.500 | every argument, every document,
00:56:48.820 | and somebody took all of those legal cases
00:56:53.180 | on the legal profession,
00:56:55.220 | and then the filing of a lawsuit,
00:56:56.800 | the defending of a lawsuit,
00:56:59.060 | public defenders, prosecutors,
00:57:01.820 | what data could you figure out?
00:57:03.780 | Like, just to think of the recent history,
00:57:06.420 | you look at Chesa Boudin,
00:57:07.900 | you could literally take every case,
00:57:10.140 | every argument he did, put it through it,
00:57:12.460 | and say, you know, versus an outcome in another state,
00:57:14.980 | and you could figure out what's actually going on
00:57:18.940 | with this technology.
00:57:19.780 | What impact did this have on the legal field?
00:57:22.020 | You are a non-practicing attorney,
00:57:24.020 | you have a legal degree.
00:57:25.380 | - I never practiced,
00:57:27.180 | other than one summer at a law firm.
00:57:29.060 | But no, I think--
00:57:29.900 | - Did you pass the bar?
00:57:30.740 | - What's that?
00:57:31.560 | I did pass the bar, yes.
00:57:32.500 | - First try.
00:57:33.900 | - Yes, of course.
00:57:34.740 | - First try, of course, yeah, come on.
00:57:35.980 | - You gotta be kind of dumb to fail the bar exam.
00:57:36.820 | - Look, it's like a rolling in science,
00:57:38.340 | yes, I went to Stanford, dude.
00:57:40.300 | I'm rated 1800.
00:57:41.140 | - It took two days of studying, come on.
00:57:42.340 | - I may not have passed the bar,
00:57:43.580 | but I know a little shit,
00:57:44.900 | enough to know that you can't legally--
00:57:46.620 | - I would be curious in terms of
00:57:49.620 | a very common question
00:57:52.140 | that an associate at a law firm would get asked
00:57:55.100 | would be something like,
00:57:56.580 | you know, summarize the legal precedence
00:57:58.660 | in favor of X, right?
00:58:01.860 | And you could imagine GPT doing that like instantly.
00:58:06.380 | Now, I think that the question about that,
00:58:08.380 | I think there's two questions.
00:58:09.360 | One is, can you prompt GPT in the right way
00:58:12.340 | to get the answer you want?
00:58:13.780 | And I think, you know, Chamath,
00:58:14.860 | you shared a really interesting video
00:58:16.780 | showing that people are developing some skills
00:58:19.140 | around knowing how to ask GPT questions in the right way.
00:58:23.740 | Is that prompt engineering?
00:58:26.300 | 'Cause GPT's a command line interface.
00:58:27.700 | So if you ask GPT a simple question about
00:58:30.300 | what's the best restaurant in, you know, Napa,
00:58:33.020 | it knows how to answer that.
00:58:34.200 | But there are much more complicated questions
00:58:36.560 | that you kind of need to know
00:58:37.900 | how to prompt it in the right way.
00:58:39.060 | So it's not clear to me that a command line interface
00:58:41.740 | is the best way of doing that.
00:58:42.740 | I could imagine apps developing
00:58:44.700 | that create more of like a GUI.
00:58:46.320 | So we're an investor, for example, in Copy AI,
00:58:48.320 | which is doing this for copywriters and marketers,
00:58:51.060 | helping them write blog posts and emails.
00:58:53.780 | And so, you know, imagine putting that like, you know,
00:59:00.020 | GUI on top of ChatGPT.
00:59:00.020 | They've already been kind of doing this.
00:59:01.940 | So I think that's part of it.
00:59:03.980 | I think the other part of it is on the answer side,
00:59:07.500 | you know, how accurate is it?
00:59:09.780 | Because in some professions,
00:59:12.140 | having 90 or 95 or 99% accuracy is okay.
00:59:16.640 | But in other professions, you need six nines accuracy,
00:59:19.620 | meaning 99.9999% accuracy.
00:59:23.780 | Okay, so I think for a lawyer going into court,
00:59:28.340 | you know, you probably need, I don't know,
00:59:31.180 | it depends on the question.
00:59:32.020 | - Yeah, it's a parking ticket versus a murder trial
00:59:34.900 | is two very different things.
00:59:36.060 | - Yeah, exactly.
00:59:36.900 | So is 99% accuracy good enough?
00:59:39.420 | Is 95% accuracy good enough?
00:59:41.780 | I would say probably for a court case,
00:59:44.740 | 95% is probably not good enough.
00:59:46.120 | I'm not sure GPT is at even 95% yet.
00:59:49.060 | But could it be helpful?
00:59:51.300 | Like could the associates start with ChatGPT,
00:59:55.300 | get an answer and then validate it?
00:59:57.860 | Probably, yeah.
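A hedged sketch of the "GUI on top of a command line" idea described above: a thin template layer turns a few structured fields into a well-formed prompt, so the user never has to learn prompt phrasing. The field names and template wording are invented for illustration, and the model call is stubbed; this is not Copy AI's or OpenAI's actual interface.

```python
# A prompt-template layer: structured inputs in, engineered prompt out.
# Template text and field names are illustrative assumptions.
from string import Template

LEGAL_SUMMARY_TEMPLATE = Template(
    "You are assisting a law-firm associate.\n"
    "Task: summarize the legal precedents in favor of $position.\n"
    "Jurisdiction: $jurisdiction\n"
    "Cite each case by name and year, and flag any contrary authority.\n"
    "Keep the summary under $max_words words."
)

def build_prompt(position: str, jurisdiction: str, max_words: int = 300) -> str:
    """What a GUI form does behind the scenes: validate the fields,
    then fill a vetted template instead of exposing a raw text box."""
    if not position.strip():
        raise ValueError("position is required")
    return LEGAL_SUMMARY_TEMPLATE.substitute(
        position=position, jurisdiction=jurisdiction, max_words=max_words)

def call_model(prompt: str) -> str:
    """Placeholder for whatever completion API the app actually uses."""
    return f"[completion for prompt of {len(prompt)} characters]"

prompt = build_prompt("enforcing a non-compete clause", "California")
print(prompt)
print(call_model(prompt))
```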
00:59:58.860 | - If you had a bunch of associates
01:00:00.900 | bang on some law model for a year,
01:00:04.260 | again, that's that reinforcement learning
01:00:05.940 | we just talked about.
01:00:07.300 | I think you'd get precision recall off the charts
01:00:09.740 | and it would be perfect.
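For readers who haven't seen the terms, precision and recall are the two retrieval metrics being invoked here; a minimal worked example with made-up relevance labels follows.

```python
# Precision and recall on a toy retrieval result (labels are invented).
def precision_recall(retrieved: set[str], relevant: set[str]) -> tuple[float, float]:
    """precision = fraction of retrieved items that are relevant;
    recall    = fraction of relevant items that were retrieved."""
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

retrieved = {"case_a", "case_b", "case_c", "case_d"}  # what the model returned
relevant = {"case_a", "case_b", "case_e"}             # what actually mattered
print(precision_recall(retrieved, relevant))           # (0.5, 0.666...)
```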
01:00:11.420 | By the way, just a cute thing.
01:00:13.060 | I don't know if you guys got this email.
01:00:14.940 | It came about an hour ago from Reid Hoffman.
01:00:17.180 | And Reid said to me, "Hey Chamath,
01:00:18.420 | I created Fireside Chatbots,
01:00:22.540 | a special podcast mini series
01:00:25.140 | where I will be having a set of conversations
01:00:27.460 | with ChatGPT."
01:00:28.820 | So you can go to YouTube by the way
01:00:30.420 | and see Reid having, and he's a very smart guy,
01:00:33.420 | so this should be kind of cool.
01:00:35.740 | And by the way, ChatGPT will have an AI-generated voice
01:00:39.260 | powered by the text to speech platform, play.ht.
01:00:44.140 | Go to YouTube if you want to see Reid
01:00:46.420 | have a conversation with ChatGPT.
01:00:50.660 | - Well, I mean, Chamath, we have a conversation
01:00:52.540 | with the two Davids every week.
01:00:54.100 | What's the difference?
01:00:54.940 | We know how this is gonna turn out.
01:00:57.580 | - Hey, but actually, so synthesizing Chamath's point
01:01:00.660 | about reinforcement learning
01:01:01.980 | with something you said, Jay Cal, in our chat,
01:01:04.420 | which I actually thought was pretty smart.
01:01:06.740 | - Well, that's a first.
01:01:07.860 | - Yeah, so I'm gonna give you credit here
01:01:09.700 | 'cause I don't think you've said it on this episode,
01:01:12.660 | which is you said that these OpenAI capabilities
01:01:16.340 | are eventually gonna become commoditized
01:01:18.380 | or certainly much more widely available.
01:01:21.020 | I don't know if that means
01:01:21.980 | that they'll be totally commoditized
01:01:23.220 | or there'll be four players,
01:01:24.420 | but there'll be multiple players that offer them.
01:01:26.660 | And you said the real advantage will come from
01:01:29.940 | applications that are able to get ahold
01:01:31.980 | of proprietary data sets
01:01:34.140 | and then use those proprietary data sets
01:01:36.860 | to generate insights.
01:01:37.940 | And then layering on what Chamath said
01:01:39.180 | about reinforcement learning,
01:01:40.700 | if you can be the first out there
01:01:42.700 | in a given vertical with a proprietary data set,
01:01:45.380 | and then you get the advantage,
01:01:47.020 | the moat of reinforcement learning,
01:01:49.340 | that would be the way to create,
01:01:50.540 | I think, a sustainable business.
01:01:51.980 | - Just to build on what you said,
01:01:53.540 | this week is the J.P. Morgan Conference.
01:01:56.820 | Freeberg mentioned it last week.
01:01:58.620 | I had dinner on Wednesday
01:01:59.660 | with this really interesting company based in Zurich.
01:02:03.180 | And what they have is basically a library of ligands, right?
01:02:07.700 | And so these ligands are used as a substrate
01:02:10.540 | to deliver all kinds of molecules inside the body.
01:02:13.220 | And what's interesting is that they have a portfolio
01:02:15.900 | of like a thousand of these,
01:02:17.620 | but really what they have
01:02:19.180 | is they have all the nuclear medicine
01:02:21.540 | about whether it works.
01:02:22.980 | So, you know, they target glioblastoma.
01:02:26.740 | And so all of a sudden they can say,
01:02:28.500 | "Well, this ligand can actually cross the blood-brain barrier
01:02:31.020 | "and get to the brain."
01:02:32.460 | They have an entire data set of that,
01:02:34.100 | and a whole bunch of nuclear imagery around that.
01:02:36.180 | They have something for soft-cell carcinoma.
01:02:38.500 | So then they have that data set.
01:02:40.260 | So to your point, that's really valuable
01:02:42.740 | because that's real work that Google or Microsoft
01:02:47.260 | or OpenAI won't do, right?
01:02:51.100 | And if you have that and you bring it to the problem,
01:02:53.940 | you can probably make money.
01:02:55.540 | You know, there's a business there to be built.
01:02:57.780 | - Just building on this conversation,
01:02:59.740 | I just realized like a great prompt engineer
01:03:02.900 | is gonna become a title and an actual skill,
01:03:05.100 | the ability to interface with these AIs.
01:03:06.980 | - Here you go, coding school 2.0.
01:03:08.820 | - And that prompt engineer, well, no, a prompt engineer,
01:03:11.780 | somebody who is very good at talking to these, you know,
01:03:15.780 | instances and maximizing the result for them
01:03:18.740 | and refining the results for them,
01:03:20.340 | just like a detective who asks great questions,
01:03:23.100 | that person is gonna be 10 or 20 times more valuable.
01:03:27.300 | They could be the proverbial 10X engineer
01:03:30.180 | in the future of a company.
01:03:33.340 | And as we talk about austerity and doing more with less
01:03:36.220 | and the 80% less people running Twitter now
01:03:39.140 | or Amazon laying off 18,000 people,
01:03:41.740 | Salesforce laying off 8,000,
01:03:43.180 | Facebook laying off 10 and probably another 10,000,
01:03:46.220 | what catalytic effect like could this have?
01:03:50.180 | We could be sitting here in three or four or five years
01:03:51.980 | and instead of running a company like Twitter
01:03:54.140 | with 80% less people,
01:03:55.180 | maybe you could run it with 98% less people.
01:03:57.580 | - Look, I think directionally, it's the right statement.
01:04:00.660 | I mean, you know, I've made the statement a number of times
01:04:03.780 | that I think we move from this idea of creator economy
01:04:07.540 | to narrator economy,
01:04:08.700 | where historically it was kind of labor economy,
01:04:11.060 | where humans use their physical labor to do things.
01:04:13.780 | Then we were knowledge workers,
01:04:15.100 | we used our brains to make things.
01:04:19.380 | And then ultimately we kind of, I think,
01:04:21.820 | resolve to this narrator economy
01:04:23.580 | where the way that you kind of can state intention
01:04:26.260 | and better manipulate the tools
01:04:28.380 | to drive your intentional outcome,
01:04:30.180 | the more successful you're gonna be.
01:04:31.140 | And you can kind of think about this
01:04:32.220 | as being the artist of the past.
01:04:34.140 | Da Vinci was, what made him so good
01:04:37.940 | was he was technically incredible
01:04:40.260 | at trying to reproduce photographic-like imagery
01:04:44.060 | using paint.
01:04:46.380 | And there's these really great kind of museum exhibits
01:04:48.660 | on how he did it using these really interesting
01:04:50.980 | kind of like split mirror systems.
01:04:55.020 | And then the better, the artist of the 21st century,
01:04:58.020 | the 20th century was the best user of Adobe Photoshop.
01:05:01.020 | And that person is not necessarily the best painter.
01:05:03.780 | And the artist of the 22nd century
01:05:06.540 | isn't gonna look like the Photoshop expert.
01:05:09.660 | And it's not gonna look like the painter,
01:05:12.020 | it's gonna look like something entirely different.
01:05:13.980 | It could be who's got the most creative imagination
01:05:16.900 | in driving the software to drive new outcomes.
01:05:19.140 | And I think that the same analogy can be used
01:05:21.020 | across every market and every industry.
01:05:23.180 | However, one thing to note, J. Cal,
01:05:25.220 | it's not about austerity
01:05:27.260 | 'cause the Luddite argument is when you have new tools
01:05:31.220 | and you get more leverage from those tools,
01:05:33.380 | you have less work for people to do
01:05:35.060 | and therefore everyone suffers.
01:05:36.780 | The reality is new work emerges
01:05:38.820 | and new opportunities emerge.
01:05:40.460 | And we level up as a species.
01:05:42.380 | And when we level up, we all kind of fill the gaps
01:05:44.460 | and expand our productivity and our capability set.
01:05:47.380 | - I thought what J. Cal was saying was more
01:05:49.020 | that Google will be smaller
01:05:50.500 | didn't mean that the pie wouldn't grow.
01:05:52.100 | It's just that that individual company is run differently,
01:05:55.540 | but there will be hundreds of more companies
01:05:57.780 | or thousands more, millions more.
01:05:59.220 | - Yeah, that's sort of,
01:06:00.340 | I have an actual punch up for you.
01:06:02.900 | Instead of narrator, it's the conductor economy.
01:06:05.340 | It's you're conducting a symphony.
01:06:07.980 | - Ooh, a punch up.
01:06:09.620 | - Punch up there.
01:06:10.460 | But I do think like we're gonna,
01:06:11.740 | there's gonna be somebody who's sitting there like,
01:06:14.020 | remember Tom Cruise in Minority Report
01:06:15.660 | as a detective was moving stuff around
01:06:17.460 | with an interface, you know,
01:06:20.140 | with the gloves and everything.
01:06:21.900 | This is kind of that manifested.
01:06:23.860 | You could, even if you're not an attorney,
01:06:25.380 | you can say, hey, I wanna sue this company
01:06:27.380 | for copyright infringement, give me my best arguments.
01:06:29.940 | And then on the other side say,
01:06:30.980 | hey, I wanna know what the next three features
01:06:32.860 | I should put into my product is.
01:06:34.660 | Can you examine who are my top 20 competitors?
01:06:37.180 | And then who have they hired in the last six months?
01:06:39.100 | And what are those people talking about on Twitter?
01:06:40.940 | You can have this conductor, you know,
01:06:43.660 | who becomes really good at that.
01:06:45.220 | - Well, the leveling up--
01:06:46.060 | - Massive amounts of tasks.
01:06:48.260 | - Yeah, the leveling up that happens
01:06:49.660 | in the book Ender's Game, I think is a good example
01:06:52.620 | of this where the guy goes through the entire
01:06:54.860 | kind of ground up and then ultimately
01:06:56.340 | he's commanding armies of spaceships in space
01:06:59.580 | and his orchestration of all of these armies
01:07:02.380 | is actually the skillset that wins the war.
01:07:04.860 | Yeah.
01:07:05.700 | - You predicted that there would be like all these people
01:07:07.700 | that create these next gen forms of content.
01:07:10.700 | But I think this Reid Hoffman thing could be pretty cool.
01:07:13.140 | Like what if he wins a Grammy for his, you know,
01:07:16.260 | computer created podcast miniseries?
01:07:18.860 | That's really cool.
01:07:19.700 | - The thing I'm really excited about,
01:07:21.260 | when's the first AI novel gonna get published
01:07:23.380 | by a major publisher?
01:07:24.380 | I think it happens this year.
01:07:25.740 | When's the first AI symphony gonna get performed
01:07:27.820 | by a major symphony orchestra?
01:07:29.420 | And when's the first AI generated screenplay
01:07:31.540 | get turned into an AI generated 3D movie that we all watch?
01:07:34.900 | And then the more exciting one I think is
01:07:36.340 | when do we all get to make our own AI video game
01:07:38.980 | where we instruct the video game platform
01:07:40.900 | what world we wanna live in?
01:07:42.140 | I don't think that's happening
01:07:43.060 | for the next three or four years,
01:07:44.460 | but when it does, I think everyone's got
01:07:46.020 | these new immersive environments that they can live in.
01:07:48.180 | - I have a question.
01:07:49.180 | When, you know, when you think--
01:07:50.020 | - When I say live in, I mean video game wise.
01:07:51.620 | Yeah, sorry, go ahead.
01:07:52.460 | - When you have these computer systems,
01:07:54.860 | just like to use a question of game theory for a second,
01:07:57.380 | they're, these models are iterating rapidly.
01:07:59.980 | These are all mathematical models.
01:08:01.180 | So inherent in, let's just say this,
01:08:04.700 | the perfect answer, right?
01:08:06.260 | Like if you had perfect precision recall,
01:08:10.540 | if multiple models get there at a system-wide level,
01:08:13.340 | everybody is sort of like,
01:08:15.020 | they get to the game theory optimal.
01:08:17.300 | They're all at Nash equilibrium, right?
01:08:19.060 | All these systems working at the same time.
01:08:21.580 | Then the real question would then be
01:08:24.260 | what the hell do you do then?
01:08:26.180 | 'Cause if you keep getting the same answer,
01:08:27.820 | if everybody then knows how to ask the exact right question
01:08:30.980 | and you start to go through these iterations
01:08:32.660 | where you're like, maybe there is a dystopian hellscape
01:08:35.340 | where there are no jobs.
01:08:37.020 | Maybe that's the Elon world, which is you can,
01:08:40.060 | you can recursively find a logical argument
01:08:44.340 | where there is no job that's possible, right?
01:08:47.940 | And now I'm not saying that that path is the likely path,
01:08:50.940 | but I'm saying it is important to keep in mind
01:08:52.620 | that that path of outcomes is still very important
01:08:55.860 | to keep in the back of our mind
01:08:57.300 | as we figure these things out.
01:08:58.660 | - Well, Freeberg, you were asking before about this,
01:09:01.180 | like, will more work be created?
01:09:03.700 | Of course, artistic pursuits,
01:09:05.420 | podcasting is a job now, being an influencer is a job,
01:09:08.540 | yada, yada, new things emerge in the world.
01:09:10.420 | But here in the United States in 1970,
01:09:13.460 | I'm looking at FRED, I'm looking at the St. Louis Fed,
01:09:16.820 | 1970, 26.4% of the country was working in a factory,
01:09:21.820 | was working in manufacturing.
01:09:23.820 | You want to guess what that is in 2012?
01:09:27.800 | - Sorry, what percentage?
01:09:29.240 | - It was 26% in 1970.
01:09:32.140 | And in 2015, when they stopped tracking the percentage
01:09:34.900 | in manufacturing, they discontinued this series,
01:09:37.020 | it was at 10%.
01:09:38.500 | So it's possible we could just see, you know,
01:09:42.020 | the concept of office work,
01:09:44.060 | the concept of knowledge work is going to follow,
01:09:47.260 | pretty inevitable, the path of manufacturing.
01:09:50.580 | That seems like a pretty logical theory or no?
01:09:52.880 | - I think we should move on, but yes.
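The share Jason is citing can be reproduced from the St. Louis Fed's FRED database. A minimal sketch is below, assuming the pandas_datareader package and the FRED series IDs MANEMP (manufacturing employees) and PAYEMS (total nonfarm employees); it requires network access, and the ratio comes out to roughly a quarter of nonfarm jobs in 1970 falling to roughly a tenth by 2015, in line with the numbers quoted.

```python
# Rough check of the manufacturing-employment share using FRED data.
# Assumes pandas_datareader and the FRED series MANEMP and PAYEMS;
# requires network access to fred.stlouisfed.org.
from datetime import datetime
from pandas_datareader import data as pdr

start, end = datetime(1970, 1, 1), datetime(2015, 12, 31)
manufacturing = pdr.DataReader("MANEMP", "fred", start, end)["MANEMP"]
total_nonfarm = pdr.DataReader("PAYEMS", "fred", start, end)["PAYEMS"]

share = (manufacturing / total_nonfarm) * 100  # percent of nonfarm jobs
print(f"1970: {share.loc['1970-01-01']:.1f}%")  # roughly a quarter of jobs
print(f"2015: {share.loc['2015-12-01']:.1f}%")  # roughly a tenth
```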
01:09:56.820 | - Okay, so how would we like to ruin the show now?
01:09:59.500 | Should we talk about Biden and the documents
01:10:02.460 | and ruin the show with political talk?
01:10:04.780 | Or should we talk about?
01:10:07.480 | Since it's been such a great episode so far,
01:10:10.280 | what do we want to talk about next?
01:10:12.160 | I'll give you a couple of choices.
01:10:13.000 | - I know what you don't want to talk about.
01:10:14.880 | (laughing)
01:10:16.560 | - Some of these network guys are talking about productivity.
01:10:19.200 | - Give it to him, give it to him.
01:10:20.880 | - Wait, we all know, we all know J. Cal.
01:10:23.920 | - Apple cash, time-max, hourglasses?
01:10:24.760 | - Hold on a second, we all know J. Cal,
01:10:26.600 | that according to you, when a president
01:10:29.200 | is in possession of classified documents in his home,
01:10:33.760 | that apparently have been taken in an unauthorized manner,
01:10:36.760 | basically stolen, he should have his home raided by the FBI.
01:10:40.040 | - Almost, close, close.
01:10:42.640 | So anyway, Biden, as of the taping of this,
01:10:49.400 | has now said there's a third batch of classified documents.
01:10:54.160 | This group, I guess there was one at an office,
01:10:56.760 | one at a library, now this third group is in his garage
01:10:59.200 | with his Corvette, certainly not looking good.
01:11:01.960 | - Well, they say in his defense,
01:11:04.240 | they say the garage was locked,
01:11:05.640 | meaning that you could use a garage door opener
01:11:08.400 | to open and close it.
01:11:11.000 | It was locked when it went closed.
01:11:12.320 | - So pretty much as secure as the documents at Mar-a-Lago,
01:11:15.840 | same equivalency.
01:11:16.880 | - No, no, no, actually, I mean, just to be perfectly fair,
01:11:19.880 | the documents at Mar-a-Lago were locked in a basement.
01:11:22.200 | The FBI came, checked it out,
01:11:24.160 | said we'd like you to lock those up, they locked 'em up.
01:11:26.200 | So a little safer than being in a garage with a Corvette.
01:11:31.280 | - But functionally the same, functionally the same.
01:11:34.480 | - The only difference here would be what,
01:11:36.680 | Sachs, when you look at these two cases?
01:11:39.320 | - Well, that in one case, Merrick Garland
01:11:41.800 | is appointed an independent counsel to investigate Trump,
01:11:44.640 | and there's no such a special counsel
01:11:47.120 | or investigator appointed to investigate Biden.
01:11:49.640 | I mean, these things are functionally the same.
01:11:50.480 | - Didn't he put somebody on it, though?
01:11:52.280 | Wait, did he put somebody on it?
01:11:54.120 | - No, I don't think they've appointed
01:11:55.280 | a special counsel yet.
01:11:56.480 | - No, they did.
01:11:57.320 | As of an hour ago, a special counsel was appointed.
01:12:00.320 | - Okay.
01:12:01.160 | - Did that just happen?
01:12:02.000 | - Yeah, one hour ago.
01:12:03.000 | Robert Herr is his name.
01:12:03.960 | - Okay, I guess there are real questions to look into here.
01:12:07.080 | The documents apparently were moved twice.
01:12:10.080 | Why were they moved?
01:12:11.120 | Who ordered that?
01:12:12.640 | What was a classified document doing
01:12:14.280 | in Biden's personal library?
01:12:16.680 | What do the documents pertain to?
01:12:18.480 | Do they touch on the Biden family's business dealings
01:12:22.040 | in Ukraine and China?
01:12:23.760 | So there are real things to look into here.
01:12:25.680 | But let me just take a step back.
01:12:28.880 | Now that the last three presidential candidates
01:12:31.480 | have been ensnared in these classified document problems,
01:12:35.920 | remember, it's Biden now, and then Trump,
01:12:39.440 | and Hillary Clinton before Trump.
01:12:42.040 | I think it's time to step back and ask,
01:12:44.920 | are we over-classifying documents?
01:12:47.280 | I mean, are we fetishizing these documents?
01:12:49.880 | Are they all really that sensitive?
01:12:51.840 | It seems to me that we have an over-classification problem,
01:12:54.760 | meaning that ever since FOIA was passed,
01:12:58.560 | the Freedom of Information Act,
01:13:00.960 | the government can avoid accountability and prying eyes
01:13:04.280 | by simply labeling any document as classified.
01:13:07.720 | So over-classification was a logical response
01:13:10.640 | by the permanent government
01:13:12.240 | to the Freedom of Information Act.
01:13:13.720 | And now it's gotten to the point
01:13:16.160 | where just about everything handed
01:13:18.160 | to a president or vice president is classified.
01:13:20.880 | So I think I can understand
01:13:22.600 | why they're all making this mistake.
01:13:25.320 | And I think a compounding problem
01:13:26.880 | is that we never declassify anything.
01:13:28.480 | There's still all these records
01:13:30.120 | from the Kennedy assassination
01:13:31.800 | that have never been declassified.
01:13:34.720 | - And they're supposed to have declassified these.
01:13:37.280 | The CIA keeps filibustering
01:13:40.640 | on the release of the JFK assassination documents,
01:13:44.640 | and they've been told they have to stop,
01:13:47.120 | and they have to release them,
01:13:47.960 | and then they keep redacting stuff,
01:13:50.560 | which is making it... - They're not releasing them.
01:13:53.000 | - I hate to be a conspiracy theorist here,
01:13:54.680 | but what are they trying to cover up?
01:13:57.040 | I mean, this is a long time ago.
01:13:58.760 | - That's the only way to interpret it.
01:14:00.960 | But even for more mundane documents,
01:14:02.920 | there are very few documents
01:14:04.400 | that need to be classified after even, say, five years.
01:14:07.320 | You could argue that we should be
01:14:08.280 | automatically declassifying them after five years,
01:14:11.160 | unless they go through a process to get reclassified.
01:14:14.280 | I mean, I'd say, like, you guys in business,
01:14:17.800 | I know it's not government, in business,
01:14:19.440 | how many of the documents that you deal with
01:14:22.080 | are still sensitive, are trade secrets, five years later?
01:14:26.400 | - Certainly 20 years later, they're not, right?
01:14:29.440 | Like, in almost all cases.
01:14:30.280 | - No, but I wouldn't even say, like, five years.
01:14:31.280 | I mean, the only documents-- - The Coca-Cola formula.
01:14:33.440 | - The only documents in business that I think I deal with
01:14:36.880 | that you could call sensitive
01:14:38.760 | are the ones that pertain to the company's future plans,
01:14:42.280 | right, because you wouldn't want a competitor--
01:14:43.480 | - Cap table. - To get those.
01:14:45.280 | - Yeah, cap table. - There's a handful of things.
01:14:46.640 | - Legal issues, yeah.
01:14:48.160 | - Even cap table is not that sensitive,
01:14:49.920 | because by the time you go public,
01:14:51.160 | it legally has to be public.
01:14:52.760 | - Yeah, so at some point-- - It's on Carta,
01:14:54.320 | like, there's 100 people who have that.
01:14:55.760 | I mean, it's-- - Exactly.
01:14:56.840 | - So, like, in business, I think our experience has been
01:14:59.520 | there's very few documents that stay sensitive,
01:15:02.840 | that need to remain secret.
01:15:04.800 | Now, look, if Biden or Trump or whoever,
01:15:07.960 | they're reviewing the schematics
01:15:09.960 | to the Javelin missile system or to, you know,
01:15:13.280 | how we make our nuclear bombs or something,
01:15:15.400 | obviously that needs to stay secret forever,
01:15:17.440 | but I don't believe our politicians
01:15:19.080 | are reviewing those kinds of documents.
01:15:21.440 | - Well, I mean, we both--
01:15:22.760 | - I don't really understand what it is
01:15:24.400 | that they're reviewing-- - Why are they keeping them--
01:15:27.040 | - That needs to be classified five years later.
01:15:28.880 | - Well, and why are they keeping them
01:15:29.880 | was the issue we discussed previously.
01:15:31.200 | We actually agreed on that.
01:15:32.640 | I think they're just keeping mementos.
01:15:34.160 | - I think there's a simple explanation
01:15:35.480 | for why they're keeping them, Jason,
01:15:36.920 | which is that everything is more classified,
01:15:41.440 | and there's a zillion documents,
01:15:43.880 | and if you look, like, both Biden and Trump,
01:15:47.160 | these documents were mixed in
01:15:48.360 | with a bunch of personal effects and mementos.
01:15:51.240 | My point is, if you work in government
01:15:53.920 | and handle documents, they're all classified.
01:15:57.200 | So, I mean-- - And if the National Archive
01:15:58.920 | asks for them back, or you find them,
01:16:01.120 | you should just give them back.
01:16:02.160 | I mean, that's gonna wind up being the rub here
01:16:04.400 | is Trump didn't give them back, and Biden did.
01:16:07.440 | So that's the only difference here.
01:16:09.040 | - Well, no, no, no, hold on.
01:16:09.880 | The FBI went to Trump's basement,
01:16:11.320 | they looked around, they said, "Put a lock on this."
01:16:13.280 | They seemed to be okay with it initially.
01:16:15.040 | Then maybe they changed their minds, I don't know.
01:16:17.120 | I'm not defending Trump, but--
01:16:17.960 | - Well, it's pretty clear that he wouldn't give them back in.
01:16:19.880 | That was the issue. - The point I'm making
01:16:20.920 | is that now that Biden, Trump, and Hillary Clinton
01:16:24.320 | have all been ensnared in this,
01:16:26.360 | is it time to rethink the fact
01:16:28.880 | that we're over-classifying so many documents?
01:16:31.760 | I mean, just think about the incentives
01:16:33.360 | that we're creating for our politicians.
01:16:36.440 | Okay, just think about the incentives.
01:16:38.140 | Number one, never use email.
01:16:40.120 | Remember Hillary Clinton and the whole email server?
01:16:42.240 | You gotta be nuts to use email.
01:16:44.060 | Number two, never touch a document.
01:16:46.520 | Never touch a document.
01:16:48.160 | Never let anyone hand you a document.
01:16:50.000 | - Flush them down the toilet.
01:16:51.400 | - Never let anyone hand you a document.
01:16:53.840 | I mean, if you're a politician, an elected official,
01:16:58.840 | the only time you should ever be handling anything
01:17:01.080 | is go into a clean room, make an appointment,
01:17:04.320 | go in there, read something, don't take notes,
01:17:06.680 | don't bring a camera, and then leave.
01:17:08.840 | I mean, this is no way to run a government.
01:17:11.560 | - It's crazy. - Who does this benefit?
01:17:13.280 | Hold on, who does this benefit?
01:17:15.240 | It doesn't benefit our elected officials.
01:17:17.200 | It makes it almost impossible
01:17:18.880 | for them to act like normal people.
01:17:21.000 | - That's why. - It benefits the insiders,
01:17:22.560 | the permanent government.
01:19:23.840 | - You're missing the most important part about this, Sacks.
01:17:26.840 | This was, if you wanna go into conspiracy theories,
01:17:29.960 | this was a setup.
01:17:31.400 | Biden planted the documents
01:17:33.320 | so that we could create the false equivalency
01:17:36.040 | and start up Biden versus Trump 2024.
01:17:38.960 | This ensures that now Trump has something
01:17:41.960 | to fight with Biden about, and this is gonna help Trump.
01:17:45.040 | - 'Cause they're both tainted, equally tainted
01:17:48.280 | from the same source. - Yes, yes.
01:17:49.800 | - They are equally tainted now, but I--
01:17:51.240 | - It puts Trump in the new cycle.
01:17:52.800 | - No, I think it's the opposite.
01:17:54.760 | I think Merrick Garland now is gonna have to drop
01:17:57.800 | the prosecution against Trump for the stolen documents,
01:18:01.120 | or at least that part of what they're investigating him for.
01:18:04.120 | They might still investigate him over January 6th
01:18:05.880 | or something, but they can't investigate Trump
01:18:07.640 | over documents now. - And Georgia.
01:18:08.480 | Those two seem more sticky, yeah.
01:18:10.040 | I agree with that, actually.
01:18:10.880 | I think it's gonna be hard to do.
01:18:12.480 | - But my point is, just think about, look,
01:18:14.640 | both sides are engaged in hyper partisanship.
01:18:16.680 | The way right now that the conservatives on the right,
01:18:20.640 | they're attacking Biden now for the same thing
01:18:23.320 | that the left was attacking Trump for.
01:18:25.160 | My point is, just take a step back,
01:18:27.840 | and again, think about the incentives we're creating
01:18:30.520 | about how to run our government.
01:18:31.920 | You can't use email, and you can't touch documents.
01:18:35.000 | - And everything's an investigation
01:18:37.160 | the second you get into office.
01:18:38.600 | - And by the way, don't ever go into politics
01:18:41.440 | if you're a business person, because they'll investigate
01:18:43.920 | every deal you ever did prior to getting into politics.
01:18:48.320 | I mean, just think about the incentives we're creating.
01:18:49.160 | - So what are you gonna do when you try to get
01:18:51.400 | your treasury position?
01:18:52.840 | What's gonna happen? - You gotta be nuts.
01:18:53.880 | You gotta be nuts to go into government.
01:18:54.720 | - So you're not gonna take a position
01:18:56.560 | in DeSantis' cabinet? - My point is that
01:18:58.640 | the Washington insiders, by which I mean
01:19:00.280 | the permanent Washington establishment, i.e. the deep state,
01:19:03.720 | they're creating a system in which they're running things,
01:19:07.640 | and the elected officials barely can operate
01:19:09.760 | like normal functioning humans there.
01:19:13.040 | - Interesting. - That's what's going on.
01:19:15.160 | - I heard a great rumor.
01:19:16.720 | This is total gossip mongering.
01:19:18.880 | - Oh, here we go.
01:19:19.720 | - That one of Ken Griffin's best outs
01:19:25.920 | is to get DeSantis elected so that he can become
01:19:29.240 | treasury secretary.
01:19:30.320 | I mean, Ken Griffin would get that if he wanted it.
01:19:33.600 | And then he would be able to divest
01:19:35.280 | all of Citadel tax-free.
01:19:37.880 | So he would mark to market like $30 billion,
01:19:40.600 | which is a genius way to go out.
01:19:43.020 | Now, then it occurred to me, oh my God,
01:19:45.520 | that is me and Sachs' path too.
01:19:47.800 | (laughing)
01:19:48.640 | With a lot less money, but the same path.
01:19:51.000 | - Why would it be tax-free?
01:19:54.880 | - When you get appointed to those senior posts,
01:19:58.840 | you're allowed to either stick it in a blind trust,
01:20:01.400 | or you can sell with no capital gains.
01:20:03.640 | - What?
01:20:04.480 | What?
01:20:07.120 | - Well, because they want you to divest.
01:20:09.400 | - They want you unconflicted.
01:20:10.800 | - Yes, anything that presents a conflict,
01:20:12.560 | they want you to divest.
01:20:13.620 | And so the argument is, if you're forced to divest it
01:20:17.100 | to enter government, you shouldn't be forcibly taxed.
01:20:19.120 | - Wait, if I become mayor of San Francisco or Austin,
01:20:21.400 | I can go to federal government.
01:20:22.240 | - No, no, no, wait, wait, wait.
01:20:23.060 | - It's like having a--
01:20:23.900 | - Secretary of Transportation, J. Cal, you can do that.
01:20:26.360 | - Oh, I'm qualified for that.
01:20:28.320 | I'd take the bus, I got an electric bike.
01:20:30.520 | - To answer Freeberg's point, I think Citadel Securities,
01:20:33.480 | there's a lot of folks that would buy that
01:20:34.960 | because that's just a securities trading business.
01:20:36.760 | And then Citadel, the hedge fund,
01:20:38.720 | probably something like a big bulge bracket bank
01:20:41.000 | or Blackstone, probably Blackstone in fact,
01:20:43.360 | because now Blackstone can plug it into
01:20:45.600 | a trillion dollar asset machine.
01:20:47.600 | I think there would be buyers out the door.
01:20:50.440 | - This is an incredible grift.
01:20:52.360 | Now I know why Sachs--
01:20:53.720 | - It's not a grift at all, but it's an incredible--
01:20:56.120 | - Oh, come on, man, a cabinet position for no cap gains?
01:21:00.040 | - Well, that's not a grift, those are the laws.
01:21:02.800 | They force you to sell everything.
01:21:03.640 | - It so feels grifty to me.
01:21:05.360 | - And then you do public service.
01:21:06.960 | - I think you're misusing the word
01:21:08.560 | to continue to genuflect to the left wing.
01:21:10.600 | - No, I'm not genuflecting.
01:21:11.800 | I think you're being a little defensive
01:21:13.680 | 'cause you see this as a bad--
01:21:14.520 | - Is it genuflect?
01:21:15.360 | - That or you're dumb.
01:21:16.280 | - Big bulge.
01:21:17.120 | (laughing)
01:21:17.960 | - I'm not stupid, man, you don't even know.
01:21:18.800 | - I know grift when I see it.
01:21:20.320 | You take a cabinet position--
01:21:21.800 | - Sachs, would you take a cabinet position?
01:21:22.640 | - You don't pay cap gains?
01:21:23.600 | - Would you be secretary of the treasury?
01:21:25.000 | - Where does that exist?
01:21:26.520 | - Yeah, Sachs, if you were asked to serve--
01:21:28.320 | - Look, any normal person who wants to serve in government,
01:21:31.400 | you can't use email and you can't touch a document,
01:21:34.360 | and every deal you've ever done gets investigated.
01:21:36.720 | Why not? - That's a yes.
01:21:37.560 | - Why would you wanna do it?
01:21:39.560 | I mean, all of a sudden, you get to divest tax free.
01:21:42.400 | - Methinks thou doth protesteth too much, David Sachs.
01:21:45.800 | - The fact that you two know this rule,
01:21:48.080 | and Freeberg and I don't.
01:21:50.200 | - No, I know it.
01:21:51.040 | It's like a well-known--
01:21:51.880 | - I'm the only person who doesn't know this.
01:21:53.360 | Rich people know it.
01:21:54.200 | - Jake, I looked up grift.
01:21:55.840 | It means to engage-- - Everyone knows this.
01:21:57.640 | - It means that to engage in a petty
01:21:59.440 | or small-scale swindle.
01:22:01.240 | I don't think selling a $31 billion entity--
01:22:03.400 | - Yeah, no, it's not small.
01:22:04.240 | - To a combination of BlackRock and Blackstone
01:22:06.760 | would be considered a petty, small-scale swindle.
01:22:08.840 | - Did any of you guys watch the Madoff series on Netflix?
01:22:12.720 | - No, was it good? - No.
01:22:14.800 | - Oh my God, it is so depressing.
01:22:17.240 | I gotta say, just that Madoff series,
01:22:19.920 | there is no glimmer of light or hope
01:22:24.360 | or positivity or recourse.
01:22:27.200 | Everyone is a victim.
01:22:28.400 | Everyone suffers.
01:22:29.360 | It is just so dark. - So gross.
01:22:31.080 | - Don't watch it.
01:22:31.920 | It's so depressing.
01:22:32.840 | - The Madoff one.
01:22:34.160 | - The Madoff one is so depressing.
01:22:35.720 | It's so awful. - Yeah, they all kill
01:22:36.800 | themselves and die.
01:22:37.640 | - They become like all the-- - No, one guy died.
01:22:39.080 | - Everyone was a victim. - One guy died of cancer.
01:22:41.960 | - And then Irving Picard, I didn't realize all this,
01:22:43.720 | the trustee that went and got the money,
01:22:45.480 | he went and got money back from these people
01:22:47.400 | who were 80 years old and retired
01:22:48.920 | and had spent that money decades ago,
01:22:50.680 | and he sued them and took their homes away from them,
01:22:53.320 | and no one, and they had no idea--
01:22:55.280 | - No one. - That they were part
01:22:56.160 | of the scam.
01:22:57.000 | No one, no one won.
01:22:58.160 | It was a brutal, awful whole thing, yeah.
01:23:01.000 | - You know what they said.
01:23:02.240 | By the way, that's gonna be really interesting
01:23:03.840 | as we enter this SBF trial because--
01:23:05.480 | - 100%. - 100%.
01:23:07.880 | - That is what happens if you got--
01:23:10.280 | - And that's why the Southern District of New York said
01:23:12.400 | that this case is becoming too big for them
01:23:14.760 | because all the places that SBF sent money,
01:23:17.240 | all those PACs and all those political donations--
01:23:20.160 | - Sent before.
01:23:21.000 | - They have to go and investigate where that money went
01:23:24.000 | and see if they can get it back,
01:23:25.320 | and it's gonna open up an investigation
01:23:26.880 | into each one of these campaign finance
01:23:28.920 | and election and kind of interfering actions
01:23:31.760 | that were taken. - Wait, not Politico.
01:23:32.720 | ProPublica, sorry, ProPublica.
01:23:34.560 | - On the other end of the spectrum,
01:23:36.200 | I did watch this weekend, "Triangle of Sadness."
01:23:39.400 | Have you guys seen this? - I watched it too!
01:23:40.680 | It was great! - Oh my God!
01:23:42.240 | - "Triangle of Sadness" is great!
01:23:43.880 | It's so dark.
01:23:44.880 | - To the Davids, listen, this is one of the,
01:23:48.480 | I thought it was, it didn't pay off the way I thought,
01:23:50.580 | but this is one of the best setups you'll see in a movie.
01:23:53.280 | So basically, it's a bunch of people on a luxury yacht.
01:23:57.800 | So you have a bunch of rich people as the guests.
01:24:00.400 | Then you have the staff that interacts with them.
01:24:04.120 | And this is mostly Caucasian.
01:24:07.280 | And then in the bowels of the ship,
01:24:09.320 | what you see are Asian and black workers
01:24:11.680 | that support them, okay?
01:24:13.160 | So in some ways, it's a little bit
01:24:15.360 | of a microcosm of the world.
01:24:17.160 | - Oh, I thought you were gonna say
01:24:18.240 | a microcosm of something else!
01:24:20.520 | - And then what happens is there's a shipwreck, basically.
01:24:23.920 | - Yeah, oh, don't spoil it, come on.
01:24:25.840 | - Okay, but no, but I'll just tell you.
01:24:28.080 | So the plot is you have this Caucasian patriarchy
01:24:33.080 | that then gets flipped upside down
01:24:36.040 | because after the shipwreck,
01:24:37.200 | the only person who knows how to make a fire
01:24:39.040 | and catch the fish is the Filipino woman
01:24:41.360 | who is in charge of cleaning the toilets.
01:24:43.420 | So she becomes in charge.
01:24:44.700 | So now you flip to this immigrant matriarchy.
01:24:47.440 | - It's a pretty great meditation on class and survival.
01:24:50.880 | It's pretty well done.
01:24:52.200 | - It didn't end well, I thought.
01:24:53.680 | I thought it could.
01:24:54.800 | - Well, it's hard to wrap that one up.
01:24:57.320 | - Well, you know what they say, boys.
01:24:58.600 | Steal a little and they throw you in jail.
01:25:00.200 | Steal a lot and they make you king.
01:25:02.000 | Famous Bob Dylan quote.
01:25:04.160 | - There you go.
01:25:05.000 | - All right, well, this has been a great episode.
01:25:06.520 | Great to see you, besties.
01:25:08.360 | Austerity menu tonight, Chamath?
01:25:10.280 | What's on the austerity menu tonight?
01:25:12.280 | What are we doing, salad, some tuna sandwiches?
01:25:15.420 | - No, I think Kirsten is doing, I think, dorade.
01:25:19.340 | - Dorade, yeah, that's, yeah.
01:25:22.320 | - That's a good fish.
01:25:24.800 | - Jake and I once had a great dorade in Venice.
01:25:28.440 | - In Venice.
01:25:29.280 | - In Venice.
01:25:30.120 | - The dorade from Venice.
01:25:30.940 | - That's one of the best meals we ever had.
01:25:32.360 | Am I right?
01:25:33.200 | - So good, I agree.
01:25:34.120 | - When it's done well, the dorade kicks ass.
01:25:36.000 | - There's only one way to cook a dorade.
01:25:38.120 | Do you know what that is?
01:25:40.200 | You gotta, it's the way they did in Venice.
01:25:42.560 | You gotta cook the whole fish.
01:25:44.280 | - Yeah, yeah, yeah, she's doing that.
01:25:45.480 | - And then after you cook the fish,
01:25:47.320 | then you de-bone it.
01:25:49.720 | And yeah, that's the way to do it.
01:25:52.000 | - That was back when Sax and I
01:25:52.960 | used to enjoy each other's company.
01:25:54.320 | (laughing)
01:25:56.560 | - Before this podcast made us into mortal enemies.
01:25:59.280 | - Jake, I'm a little disappointed
01:26:00.800 | you couldn't agree with my take on this documents scandal.
01:26:03.720 | Instead of dunking in a partisan way,
01:26:06.000 | I tried to explain why it was a problem
01:26:08.600 | of our whole political system.
01:26:09.920 | - I like your theory.
01:26:10.760 | I think, you know, you keep making me defend Biden.
01:26:15.120 | I think Biden's a grifter.
01:26:16.840 | I told you these guys are grifting.
01:26:19.160 | Just think you're, I just think your party
01:26:20.840 | grifts a little bit more.
01:26:23.480 | But yeah, compare your grift.
01:26:24.880 | - Are we going to play Saturday after the wild card game?
01:26:27.000 | Are you guys interested in playing Saturday as well?
01:26:28.600 | 'Cause I got the hall pass.
01:26:29.560 | I can do a game on Saturday.
01:26:32.000 | - I don't know.
01:26:32.840 | I have to check with my boss.
01:26:34.280 | - Who's going to the, are you guys all going?
01:26:36.520 | Sax, are you going to come to play poker
01:26:38.440 | at that live stream thing for the day in LA?
01:26:42.040 | - I doubt it, no.
01:26:43.160 | - He doesn't want to interact with humans.
01:26:46.040 | - That does not play well in confirmation hearings.
01:26:49.480 | - No.
01:26:50.320 | - The last time I did one of those,
01:26:51.440 | Alan Keating destroyed me on camera.
01:26:53.560 | I like, I had like, I had two left feet
01:26:56.920 | and every time he bluffed, I folded.
01:26:59.400 | Every time he had the nuts, I called.
01:27:01.480 | It was brutal.
01:27:02.320 | - That's true.
01:27:03.160 | - Now you do this.
01:27:04.000 | - That was a bad one.
01:27:04.840 | - A shellacking, a classic Sax shellacking.
01:27:07.680 | - So it's an edge saving thing.
01:27:08.680 | Is that what's going on here?
01:27:09.640 | - No, no, no, no.
01:27:10.480 | - Preserving.
01:27:11.320 | - No, I just said I don't have time to do it.
01:27:12.160 | - No, it has to do with the cabinet positions.
01:27:13.680 | He doesn't need to be seen recklessly gambling.
01:27:16.480 | It's a bad look.
01:27:17.320 | - If you could pick any cabinet position, Sax,
01:27:19.120 | which one would it be?
01:27:20.320 | - State.
01:27:21.160 | - Treasury.
01:27:23.120 | - State's a lot of travel.
01:27:24.120 | - Say the word, Sax.
01:27:24.960 | - State's a lot of travel.
01:27:25.800 | You never stay at home.
01:27:26.620 | You're always on a plane.
01:27:27.460 | - I'll tell you, I don't know.
01:27:28.300 | - That's what he's looking for.
01:27:30.400 | - I don't know that those like cabinet positions
01:27:32.800 | are that important.
01:27:33.680 | I mean, they run these giant bureaucracies
01:27:36.440 | that again are permanent.
01:27:37.920 | You can't fire anyone.
01:27:39.320 | So if you can't fire a person,
01:27:41.120 | do they really report to you?
01:27:42.600 | - Right.
01:27:43.440 | - I mean, only ceremonial.
01:27:44.640 | - Trump's idea was like, put a bunch of hardline,
01:27:47.240 | CEO type people in charge,
01:27:49.000 | have them blow up these things and make it more efficient.
01:27:51.040 | It didn't really work, did it?
01:27:52.560 | - Yeah, well, you know why a CEO is actually in charge?
01:27:55.680 | Like Elon, he walks in,
01:27:57.400 | if he doesn't like what you're doing, he'll just fire you.
01:27:59.640 | You can't fire anyone.
01:28:00.720 | How do you manage them
01:28:01.840 | when they don't have to listen to anything you say?
01:28:04.600 | That's our whole government right now.
01:28:06.500 | Our cabinet heads are figureheads
01:28:11.000 | for these departments, these giant departments.
01:28:13.000 | - Is that a no, or is that a yes, you'd still take state?
01:28:16.240 | Look at that.
01:28:18.760 | - Well, I think I have another theory.
01:28:20.840 | I think he's going for the ambassadorship first.
01:28:23.240 | What is the best ambassadorship?
01:28:24.960 | - Well, you can't divest everything with no cap gains.
01:28:27.520 | - Historically, you can tell which ambassadorship
01:28:30.440 | is the best one based on how much they charge for it.
01:28:32.720 | - Yeah.
01:28:33.560 | - So I think London is the most expensive.
01:28:36.040 | I think that one's $15 million.
01:28:36.880 | - It's 10 million for London, 10 to 15.
01:28:38.960 | - 10, 15 million, yeah.
01:28:40.760 | - 10, 15 million?
01:28:42.640 | That's what Sax's fourth least expensive home cost.
01:28:47.280 | - No, no, no, you have to spend that every year
01:28:48.680 | to run it, Jason.
01:28:50.040 | You only get, you get a fixed-
01:28:50.880 | - That's nothing for him.
01:28:51.840 | - You could be the ambassador to Guinea
01:28:53.880 | or the ambassador to the UK, you get the same budget.
01:28:56.800 | - Actually, what's kind of funny is I know two people
01:28:58.680 | who served as ambassadors under Trump
01:29:00.560 | and it was really cheap to get those
01:29:02.080 | because no one wanted to be part
01:29:04.520 | of the Trump administration.
01:29:06.480 | - Oh, they were on fire sale, two for one.
01:29:08.400 | - They were on fire sale after, because of Trump.
01:29:12.080 | - Who wants to be tainted?
01:29:13.680 | - But by the way, one of them,
01:29:14.720 | and you can just bleep out the name,
01:29:16.560 | (beep)
01:29:17.560 | was telling me it was the best thing
01:29:19.040 | because he ended up selling,
01:29:20.160 | they were already at the all-time highs, to take the job.
01:29:22.480 | He was like, "I got to get out of all of this stuff."
01:29:24.240 | - No, but listen, let me tell you, the ambassadorships,
01:29:26.920 | it was a smart trade by those guys
01:29:28.560 | because ambassador's a lifetime title.
01:29:31.040 | So you're ambassador, whatever.
01:29:33.400 | No one remembers under which president you were ambassador,
01:29:35.680 | no one cares.
01:29:36.520 | - So you are going for the ambassador.
01:29:37.760 | - No, no. - So Steve,
01:29:38.600 | I think it's fair to say.
01:29:39.440 | - I'm not, I'm not interested in ceremonial things.
01:29:43.840 | I'm interested in making an impact.
01:29:46.120 | And the problem with all these positions,
01:29:48.400 | I mean, being a cabinet official
01:29:51.120 | is not much different than being an ambassador.
01:29:53.480 | - So you're going to enlist in the Navy?
01:29:56.680 | - No.
01:29:57.520 | What has a bigger impact?
01:29:59.880 | Being on all in pod or being an ambassador?
01:30:02.760 | Who's more influential?
01:30:08.480 | Sax on the all in pod or (beep) as the ambassador of Sweden?
01:30:08.480 | - Being on the all in pod actually.
01:30:10.000 | - All in pod is more impactful.
01:30:13.280 | - By the way, this is why I take issue
01:30:14.760 | with your statement about the term mainstream media.
01:30:17.040 | 'Cause I think you have become the mainstream media
01:30:19.600 | more than most of the folks that-
01:30:20.920 | - No, we're independent media.
01:30:22.320 | We're independent media.
01:30:23.280 | - Trust me, it's independent.
01:30:24.320 | This thing's hanging on by a thread.
01:30:26.240 | - And stop genuflecting.
01:30:27.920 | (laughing)
01:30:29.200 | - No, it's independent.
01:30:30.240 | Who knows if this thing's going to last
01:30:32.640 | another three episodes.
01:30:34.120 | - I just like saying the word genuflect.
01:30:35.840 | - You like genuflecting, I know.
01:30:37.440 | - That is the top word of 2023 so far for me.
01:30:40.360 | - Oh, is that, is somebody doing an analysis
01:30:44.560 | with ChatGPT of the words used here?
01:30:44.560 | - No, but Saks brought that word up.
01:30:45.840 | It was just, it's a wonderful word.
01:30:47.520 | It's not used enough.
01:30:48.720 | All right, everybody, we'll see you next time
01:30:51.920 | on the all in podcast.
01:30:53.080 | Comments are turned back on.
01:30:54.920 | Have at it, you animals.
01:30:56.520 | Love you guys.
01:30:57.360 | - Enjoy, bye-bye. - Love you besties, bye.
01:30:59.000 | (upbeat music)