E111: Microsoft to invest $10B in OpenAI, generative AI hype, America's over-classification problem
Chapters
0:00 Bestie intro!
0:43 Reacting to Slate's article on All-In
11:18 SF business owner caught spraying homeless person on camera
29:22 Microsoft to invest $10B into OpenAI with unique terms, generative AI VC hype cycle
1:09:57 Biden's documents, America's over-classification problem
1:27:16 Best cabinet positions/ambassadorships
Is anybody else seeing a half a second lag with J. Cal? 00:00:35.200 |
And it's said we open sourced it to the fans, 00:00:44.160 |
Welcome to episode 111 of the World's Greatest Podcast, 00:00:50.720 |
according to Slate, the podcast that shall not be mentioned 00:01:03.040 |
the number one business and the number one tech 00:01:07.840 |
that the press has a hard time giving us any oxygen, 00:01:17.840 |
You're saying they take the ideas, but not the-- 00:01:28.560 |
want to subject ourselves to independent journalists asking 00:01:52.160 |
do not want to have the press interpret what they're 00:01:57.640 |
They feel like the world should be interesting. 00:02:04.160 |
They're constantly writing hit pieces about us. 00:02:06.440 |
The question is, when we want to present our side of it, 00:02:10.680 |
do we need to go through their filter or not? 00:02:12.760 |
Why would you go through their filter when it's always 00:02:18.200 |
They have a class hatred, basically, of technology 00:02:46.040 |
to specific large media outlets, right, Sacks? 00:02:55.400 |
I'll trade you Fox for MSNBC and CNN and The New York Times, 00:02:59.000 |
The Washington Post, and The Atlantic Magazine, 00:03:02.200 |
You get a lot of mileage out of being able to name Fox. 00:03:10.480 |
You can name one, I mean, literally one outlet that 00:03:19.480 |
There are very small differences in the way they think. 00:03:29.080 |
What you're calling advocacy is bias and activism. 00:03:33.920 |
That's what I'm talking about, activism journalism, yes. 00:03:36.200 |
I think Draymond also highlights a really important point, 00:03:40.920 |
It's become one of the most popular forms of sports media. 00:03:45.240 |
And he can speak directly without the filtering 00:03:47.640 |
and classification that's done by the journalist. 00:03:56.480 |
and they want to hear unfiltered, raw points of view. 00:04:05.680 |
is to then scrutinize and analyze and question and-- 00:04:19.680 |
But actually, I would argue that most of these journalists 00:04:23.040 |
are doing what they're doing for the same reason 00:04:26.480 |
which is they want to have some kind of influence 00:04:41.160 |
You guys see this brouhaha where Matt Yglesias 00:04:44.440 |
wrote this article about the Fed and about the debt ceiling? 00:04:57.400 |
the difference between a percentage point and a basis 00:04:59.440 |
point and then he didn't calculate the interest 00:05:13.520 |
Between a principal and an outside analyst, right? 00:05:15.720 |
Like a principal has a better grasp typically 00:05:21.960 |
But he's considered, within the journalist circle, 00:05:35.120 |
a player on the field, they do have a point of view 00:05:37.120 |
and they do have a direction they want to take things. 00:05:39.320 |
So it is a fair commentary that journalists can theoretically 00:05:42.880 |
play a role, which is they're an off-field analyst 00:05:47.200 |
I would argue they're less educated and more biased 00:05:50.680 |
That may or may not be true, what the two of you 00:05:52.480 |
guys are debating, which is a very subjective take. 00:05:55.040 |
But the thing that is categorical and you can't deny 00:05:57.480 |
is that there is zero checks and balances when something 00:06:02.120 |
as simple as the basis point, percentage point difference 00:06:05.520 |
isn't caught in proofreading, isn't caught by any editor, 00:06:09.360 |
isn't caught by the people that help them review this. 00:06:15.040 |
must get through because there's no way for the average person 00:06:18.920 |
on Twitter to police all of this nonsensical content. 00:06:22.200 |
This one was easy because it was so numerically illiterate 00:06:27.000 |
But can you imagine the number of unforced errors journalists 00:06:31.920 |
make today in their search for clicks that don't get caught 00:06:35.320 |
out, that may actually tip somebody to think A versus B? 00:06:39.040 |
That's, I think, the thing that's kind of undeniable. 00:06:46.760 |
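[Editor's note: for reference, the unit mix-up discussed above is a factor-of-100 error. Below is a minimal worked example in Python; the principal figure is illustrative, not from the episode.]

```python
# Percentage point vs. basis point: 1 percentage point = 100 basis points,
# so confusing the two scales an interest figure by 100x.

principal = 31_000_000_000_000   # hypothetical, roughly US-debt-scale, dollars

one_percentage_point = 0.01      # 1.00% = 100 bps
one_basis_point = 0.0001         # 0.01% = 1 bp

print(principal * one_percentage_point)  # 310,000,000,000  -> ~$310B
print(principal * one_basis_point)       # 3,100,000,000    -> ~$3.1B
```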
If you read the journalists writing about a topic 00:06:50.720 |
you are an expert on, whatever the topic happens to be, 00:06:55.760 |
on that story I'm reading, that they understand about 10% 00:07:03.280 |
But then when you read stories that you're not involved in, 00:07:05.740 |
you know, you read a story about Hollywood or, I don't know, 00:07:08.320 |
pick an industry or a region you're not super aware of, 00:07:11.160 |
you're like, OK, well, that must be 100% correct. 00:07:14.040 |
And the truth is, journalists have access to 5 to 20-- 00:07:27.180 |
It's because now the mistakes aren't being driven just 00:07:31.120 |
by sloppiness or laziness or just a lack of expertise. 00:07:37.080 |
So just to give you an example on the Slate thing, 00:07:45.520 |
to All-In, the infuriating, fascinating, safe space 00:07:54.080 |
The headline now is, "Elon Musk's Inner Circle 00:08:01.640 |
Yeah, it's Elon for-- so they're trying way too hard to, 00:08:03.900 |
like, describe us in terms of Elon, which, you know, 00:08:09.280 |
But before Inner Circle, the word they used was cronies. 00:08:14.560 |
saw cronies in, like, one of those tweet, you know, 00:08:21.440 |
You know, where, like, it does a capsule or whatever? 00:08:26.520 |
So, you know, they were trying to bash us even harder. 00:08:28.920 |
And then somebody took another look at it and toned it down. 00:08:32.080 |
I'll tell you what happens in the editorial process. 00:08:34.040 |
Whoever writes the article, the article gets submitted. 00:08:40.640 |
They don't have the time for it, because they're in a race. 00:08:57.400 |
Whichever one performs the best, that's the one they go with. 00:09:06.480 |
Obviously, most people just see the headline. 00:09:09.280 |
That's why I told you, when they did that New Republic 00:09:11.480 |
piece on you with that horrific monstrosity of an illustration, 00:09:28.320 |
took the time to write some prose that was actually decent? 00:09:31.280 |
Yeah, he had listened to a lot of episodes, clearly. 00:09:34.880 |
That was great advice, because you gave it to him. 00:09:36.640 |
And you gave it to me, because both of us had these things. 00:09:42.320 |
And if you're OK with the picture, just move on. 00:09:57.920 |
I mean, the person who did the worst there was Peter Thiel. 00:10:01.400 |
Yeah, but that just shows how ridiculously biased it is, 00:10:14.900 |
Kind of looks like Hugh Grant in "Notting Hill." 00:10:17.500 |
I knew that article was going to be fine when 00:10:20.060 |
the first item they presented as evidence of me doing something 00:10:24.700 |
wrong was basically helping to oust Chesa Boudin, which 00:10:27.980 |
was something that was supported by like 70% of San Francisco, 00:10:34.140 |
So not exactly evidence of some out-of-control right-wing 00:10:40.420 |
"The Quiet Political Rise of David Sachs, Silicon Valley's 00:10:44.780 |
I'm just letting you know, people don't get past the six 00:10:57.580 |
been filler words from their second graph down 00:11:01.940 |
But now apparently if you notice that San Francisco streets 00:11:06.020 |
look like The Walking Dead, that apparently you're 00:11:12.580 |
I mean, they can't even acknowledge what people 00:11:17.540 |
And I don't know if you guys saw this really horrible, 00:11:24.140 |
owner who's been dealing with owning a storefront in San 00:11:27.500 |
Francisco, which is challenging, and having to clean up feces 00:11:38.180 |
hosing down a homeless person who refuses to leave 00:11:48.500 |
Like, really, like you're hosing a human being down-- 00:11:53.620 |
--who is obviously not living a great life and is in dire-- 00:12:00.940 |
I agree that it's not good to hose a human being down. 00:12:03.740 |
On the other hand, think about the sense of frustration 00:12:06.060 |
that store owner has, because he's watching his business go 00:12:09.180 |
in the toilet, because he's got homeless people living 00:12:32.100 |
I'm just saying-- I'm not saying that's right, but-- 00:12:43.260 |
look at this homeless person being horribly oppressed. 00:12:53.740 |
This is symbolic of the breaking down of basic society. 00:12:57.700 |
Like, both of these people are obviously like-- 00:13:07.140 |
Jason, do you have equal empathy for the store owner 00:13:14.780 |
hose a person down in the face who is homeless. 00:13:27.460 |
and you've got to clean up whatever, excrement every day, 00:13:39.660 |
for the person on the receiving end of that hose. 00:13:42.540 |
But in general, our society has tons of empathy 00:13:48.100 |
We spend billions of dollars trying to solve that problem. 00:13:50.740 |
You never hear a thing about the store owners 00:13:54.700 |
So on a societal level, not in that moment, but in general, 00:13:59.740 |
the lack of empathy is for these middle class store owners, who 00:14:03.740 |
may not even be middle class, working class, who 00:14:10.220 |
like a quarter or a third of the storefronts in San Francisco 00:14:18.580 |
is running an art gallery storefront in San Francisco. 00:14:23.180 |
Why would you bother to have a storefront in San Francisco? 00:14:28.780 |
If you've opened a store, what are you supposed to do, 00:14:33.020 |
I mean, you would shut it down at some point and find an exit. 00:14:37.340 |
Yeah, but a store has large fixed costs, right? 00:14:42.460 |
At some point, you have to shut down your store 00:14:44.420 |
in San Francisco the second you can get out of the lease. 00:14:47.660 |
isn't go to coding school online and then, you know, 00:14:58.140 |
do believe the solution to everything is learn to code. 00:15:07.540 |
The guy spent years building his retail business. 00:15:13.660 |
The police don't come and move the homeless person. 00:15:17.540 |
Customers are uncomfortable going in the store as a result. 00:15:19.860 |
Yeah, I stopped going to certain stores in my neighborhood 00:15:22.140 |
because of homeless tents being literally fixated 00:15:28.300 |
I mean, it's not a kind of uncommon situation 00:15:43.540 |
I think if everybody learns to code or drives an Uber, 00:15:53.980 |
You have these random detached places where you kind of live. 00:15:58.820 |
becomes a prison while you order food from an app every day. 00:16:02.300 |
I don't think that is the society that people want. 00:16:08.180 |
And I think that the homeless person should be taken care of. 00:16:12.380 |
have the best chance of trying to be successful 00:16:15.980 |
The mortality rate of the small business owner 00:16:22.020 |
It's impossible in San Francisco, let's just be honest. 00:16:31.500 |
I'm just shocked that the guy even has a storefront. 00:16:34.380 |
You're showing a tweet that's a moment in time. 00:16:36.460 |
And you're not showing the 10 steps that led up to it. 00:16:40.660 |
The five times he called the police to do something about it. 00:17:06.700 |
You say you know this, and it's mental illness. 00:17:09.940 |
It's like he said, 99% of the people he talks to, 00:17:18.140 |
But the issue here is not the lack of housing, 00:17:20.900 |
although that's a separate problem in California. 00:17:30.140 |
You cannot have-- you can't have a super drug 00:17:33.540 |
be available for a nominal price and give people 00:17:38.380 |
a bunch of money to come here and take it and not enforce it. 00:17:46.820 |
There's mandated rehab, mandated mental health or jail, 00:17:54.340 |
If you're not breaking the law, you don't have mental illness, 00:17:57.700 |
And then provide-- those are the four paths of outcome here 00:18:00.900 |
And if all four of those paths were both mandated 00:18:03.780 |
and available in abundance, this could be a tractable problem. 00:18:12.820 |
where Kevin Bacon was locked up in a mental institution, 00:18:24.820 |
You guys-- someone's probably going to call me an idiot 00:18:29.740 |
But I think there's a story where mandated mental health 00:18:37.260 |
services, like locking people up to take care of them 00:18:40.460 |
when they have mental health issues like this, 00:18:45.660 |
And a lot of the institutions were shut down, 00:18:49.300 |
And there are many of these cases that happened, 00:18:53.860 |
happened to people that weren't mentally ill. 00:18:55.980 |
And so the idea was, let's just abandon the entire product. 00:19:02.700 |
And it's unfortunate, but I think that there's some-- 00:19:10.100 |
It's not about not having mandated mental health 00:19:12.020 |
services, and it's not about locking everyone up 00:19:16.540 |
But there's some solution here that needs to be crafted, 00:19:21.060 |
and you don't let people suffer both as the victim 00:19:24.860 |
on the street, but also the victim in the community. 00:19:30.660 |
a danger to themselves or others kind of thing? 00:19:33.100 |
Right, but Jacob, let's think about the power of language 00:19:35.400 |
If we refer to these people as untreated persons instead 00:19:39.380 |
of homeless persons, and that was the coverage 24/7 00:19:42.740 |
in the media is, this is an untreated person, 00:19:45.220 |
the whole policy prescription would be completely different. 00:19:47.720 |
We'd realize there's a shortage of treatment. 00:19:50.020 |
We'd realize there's a shortage of remedies related 00:19:52.220 |
to getting people in treatment, as opposed to building housing. 00:19:58.100 |
And laws that mandate it, that don't enable it. 00:20:02.780 |
enable the free rein and the free living on the street 00:20:05.820 |
and the open drug markets and all this sort of stuff. 00:20:14.180 |
or if it was a loved one, one of your immediate family members, 00:20:19.500 |
to be picked up off the street and held with a 5150 00:20:22.260 |
or whatever code involuntarily against their will 00:20:27.340 |
Would you want them to be allowed to remain on the street? 00:20:29.740 |
Would you want yourself if you were in that dire straits? 00:20:35.980 |
But what's the liberal policy perspective on this, J. Cal? 00:20:38.320 |
So let me ask you as our diehard liberal on the show-- 00:20:42.460 |
No, he's an independent and only votes for Democrats. 00:20:45.660 |
75% of the time I vote a Democrat, 25% Republican. 00:20:51.900 |
Is it not that your individual liberties are infringed upon 00:20:54.540 |
if you were to be, quote, "picked up and put away"? 00:20:57.420 |
My position on it is if you're not thinking straight 00:21:03.940 |
And you could lose the liberty for a small period of time-- 00:21:09.340 |
especially if you're a danger to somebody, yourself 00:21:13.420 |
And in this case, if you're on fentanyl, if you're on meth, 00:21:17.740 |
I think if more people had that point of view 00:21:21.020 |
and had that debate, as Sacks is saying, in a more open way, 00:21:24.180 |
you could get to some path to resolution on-- 00:21:30.940 |
We won't say who it is, but someone in my family 00:21:36.700 |
And the problem is, because they're an adult, 00:21:41.660 |
you can't get them to get any form of treatment whatsoever. 00:21:49.740 |
And the nuclear option is you basically take that person 00:21:52.020 |
to court and try to seize their power of attorney, which 00:21:57.900 |
And by the way, it is so unbelievably restrictive 00:22:00.940 |
what happens if you lose that power of attorney 00:22:06.840 |
It's just a huge burden that the legal system 00:22:14.500 |
If the person's committing something illegal, 00:22:16.460 |
like camping out or doing fentanyl, meth, whatever, 00:22:20.180 |
you can use the law as the backstop against personal 00:22:23.900 |
All that person can do is really get arrested. 00:22:27.660 |
to actually get power of attorney over somebody. 00:22:29.660 |
The other thing that I just wanted you guys to know-- 00:22:31.940 |
I think you know this, but just a little historical context-- 00:22:37.580 |
started because Reagan defunded all the psychiatric hospitals. 00:22:43.580 |
And that compounded, because for whatever reason, 00:22:55.700 |
I think it's called the Mental Health Systems Act, MHSA, 00:23:00.420 |
which completely broke down some pretty landmark legislation 00:23:04.860 |
And I don't think we've ever really recovered. 00:23:06.900 |
We're now 42 years onward from 1980, or 43 years onward. 00:23:11.420 |
But just something for you guys to know that that's-- 00:23:16.860 |
one definitely negative check in my book against his legacy 00:23:35.860 |
about the conditions in these mental health homes. 00:23:47.100 |
is to blame when he hasn't been in office for 50 years, 00:23:59.460 |
declared that he would end homelessness within 10 years. 00:24:03.240 |
He just made another declaration like that as governor. 00:24:10.180 |
Well, I just think it's letting these guys off the hook. 00:24:12.220 |
I think it's letting the politicians off the hook. 00:24:14.380 |
Society needs to start thinking about changing priorities. 00:24:17.260 |
We didn't have this problem of massive numbers of people 00:24:24.700 |
And I think a lot of it has to do with fentanyl. 00:24:26.660 |
The power of these drugs has increased massively. 00:24:31.540 |
So in any event, I mean, you can question what Reagan did 00:24:37.060 |
But I think this problem really started in the last 10, 00:24:44.500 |
Until people realize these are a different class of drugs, 00:24:55.620 |
to these addicts $800 a week to feed their addiction 00:24:59.020 |
so they could live on the streets of San Francisco. 00:25:06.140 |
was just that color that we had a system of funding 00:25:11.700 |
particularly local mental health infrastructure. 00:25:21.780 |
I think that's part of the solution here is, yeah, 00:25:23.860 |
we're going to have to basically build up shelters. 00:25:28.060 |
And to support your point, the problem now, for example, 00:25:37.380 |
to a $25 billion deficit overnight, which we talked 00:25:43.140 |
the law of numbers catching up with the state of California. 00:25:46.940 |
And he's not in a position now to do any of this stuff. 00:25:54.380 |
It's like $10 billion or something out of that huge 00:25:56.620 |
budget they had to solve the problem of homelessness. 00:25:58.460 |
I would just argue they're not tackling it in the right way. 00:26:01.060 |
Because what happened is there's a giant special interest 00:26:06.340 |
is the building industry, who gets these contracts to build 00:26:17.380 |
And so they end up building 10 units at a time 00:26:20.420 |
on Venice Beach, like the most expensive land you could 00:26:23.540 |
possibly build because you get these contracts 00:26:33.100 |
you wouldn't be building housing on Venice Beach. 00:26:36.940 |
You'd be going to cheap land just outside the city. 00:26:42.820 |
I mean, shelters that can house 10,000 people, not 10. 00:26:56.860 |
By the way, do you guys want to hear this week in Grift? 00:27:03.260 |
I read something today in Bloomberg that was unbelievable. 00:27:13.180 |
has been classified by a nonprofit, the Nature 00:27:16.300 |
Conservancy in this case, as eligible for what 00:27:20.940 |
So this is $2 trillion of the umpteen trillions of debt 00:27:23.780 |
that's about to get defaulted on by countries like Belize, 00:27:31.900 |
And what happens now are the big bulge bracket, 00:27:35.220 |
Wall Street banks and the Nature Conservancy, 00:27:41.660 |
you have a billion dollar tranche of debt that's 00:27:46.660 |
And you're going to be in default with the IMF. 00:27:59.420 |
to take some of that savings and protect the rainforest 00:28:03.260 |
or protect a coral reef or protect some mangrove trees. 00:28:16.580 |
And then they sell it to folks like BlackRock 00:28:18.900 |
who have decided that they must own this in the portfolio. 00:28:21.980 |
So it literally just goes from one sleeve of BlackRock, 00:28:24.940 |
which is now marked toxic emerging market debt. 00:28:29.100 |
And then it gets into someone's 401(k) as ESG debt. 00:28:41.580 |
is that Exxon is like the number seven top-ranked company 00:28:58.820 |
means so much and should be worth a lot to a lot of people. 00:29:03.620 |
it creates this toxic soup where you can just hide the cheese. 00:29:07.540 |
Yeah, I mean, governance is important in companies. 00:29:14.180 |
I mean, but why are these things grouped together in this? 00:29:24.500 |
is going to put $10 billion or something into ChatGPT. 00:29:35.740 |
I mean, you can question the business model, maybe, 00:29:40.460 |
So what I'd say is $29 billion for a company that's 00:29:47.220 |
That's also a naive way to look at a lot of other businesses 00:29:50.380 |
that ended up being worth a lot down the road. 00:29:53.820 |
You can model out the future of a business like this 00:29:56.260 |
and create a lot of really compelling big outcomes. 00:30:01.780 |
So Microsoft is close to investing $10 billion 00:30:08.780 |
It turns out that they might wind up owning 49% of OpenAI, 00:30:13.900 |
but get 75% of the cash and profits back over time. 00:30:26.140 |
And this obviously includes Azure credits and ChatGPT. 00:30:32.340 |
As everybody knows, this just incredible demonstration 00:30:36.980 |
of what AI can do in terms of text-based creation of content 00:30:41.500 |
and answering queries has taken the net by storm. 00:30:47.100 |
Sax, do you think that this is a defensible, real technology? 00:30:51.820 |
Or do you think this is like a crazy hype cycle? 00:30:55.540 |
Well, it's definitely the next VC hype cycle. 00:31:01.620 |
Just look at the public markets, everything we're investing in, 00:31:10.140 |
And just because something is a VC hype cycle 00:31:14.980 |
So as I think one of our friends pointed out, 00:31:21.020 |
I think cloud turned out to be, I'd say, very real. 00:31:30.460 |
On the other hand, web3 and crypto was a hype cycle 00:31:42.780 |
In terms of AI, I think that if I had to guess, 00:31:53.100 |
However, I'm not sure about how much potential there is yet 00:32:01.580 |
is something that's going to be done by really big companies. 00:32:14.460 |
I'm sure Facebook is going to do something huge in AI. 00:32:20.180 |
a platform that startups are going to be able to benefit 00:32:23.700 |
I will say that some of the companies we've invested in 00:32:28.180 |
So I guess where I am is I think the technology is actually 00:32:42.700 |
of companies built around an API for something 00:32:48.740 |
I don't think startups are going to be able to create 00:32:53.500 |
But they might be able to benefit from the APIs. 00:32:56.940 |
Maybe that's the thing that has to be proven out. 00:33:00.100 |
There's a lot of really fantastic machine learning 00:33:04.020 |
services available through cloud vendors today. 00:33:07.100 |
So Azure has been one of these kind of vendors. 00:33:09.940 |
And obviously, OpenAI is building tools a little bit 00:33:17.660 |
can be used for specific vertical applications. 00:33:19.780 |
Obviously, the acquisition of InstaDeep by BioNTech 00:33:24.180 |
And most of the big dollars that are flowing in biotech 00:33:26.780 |
right now are flowing into machine learning applications 00:33:29.700 |
where there's some vertical application of machine 00:33:31.740 |
learning tooling and techniques around some specific problem 00:33:36.180 |
And the problem set of mimicking human communication 00:33:40.540 |
and doing generative media is a consumer application 00:33:45.300 |
set that has a whole bunch of really interesting product 00:33:51.140 |
that nearly every other industry and nearly every other vertical 00:33:55.980 |
And there's active progress being made in funding 00:33:59.740 |
and getting liquidity on companies and progress 00:34:02.980 |
with actual products being driven by machine learning 00:34:07.700 |
So the fundamental capabilities of large data sets 00:34:11.900 |
and then using these kind of learning techniques 00:34:16.860 |
to make kind of predictions and drive businesses forward 00:34:20.900 |
in a way that they're not able to with just human knowledge 00:34:29.100 |
And so I think let's not get caught up in the fact 00:34:31.180 |
that there's this really interesting consumer market 00:34:33.660 |
hype cycle going on, where these tools are not 00:34:38.580 |
real value across many other verticals and segments. 00:34:41.220 |
Chamath, when you look at this Microsoft OpenAI deal, 00:34:44.700 |
and you see something that's this convoluted, 00:34:47.220 |
hard to understand, what does that signal to you 00:34:57.780 |
And then two is, you know, too cute by half, or the too-hard bucket. 00:35:11.660 |
one person said, there's a lot of complex law 00:35:14.820 |
when you go from a nonprofit to a for-profit. 00:35:19.360 |
There's lots of complexity in deal construction. 00:35:26.580 |
There may or may not be, you know, legal issues at play here 00:35:29.820 |
that you encapsulated well in the last episode. 00:35:32.820 |
I think there's a lot of stuff we don't know. 00:35:41.300 |
it's in the too-hard bucket for me to really take seriously. 00:35:43.800 |
Now, that being said, it's not like I got shown the deal. 00:35:52.260 |
I think is really important for entrepreneurs 00:35:54.780 |
to internalize, which is where can we make money? 00:35:59.420 |
The reality is that, well, let me just make a prediction. 00:36:04.140 |
I think that Google will open source their models 00:36:07.220 |
because the most important thing that Google can do 00:36:14.460 |
And the best way to do that is to scorch the earth 00:36:17.420 |
with these models, which is to make them widely available 00:36:22.160 |
That will cause Microsoft to have to catch up. 00:36:25.740 |
And that will cause Facebook to have to really look 00:36:28.180 |
in the mirror and decide whether they're going to cap 00:36:38.220 |
I mentioned this in the, I did this Lex Fridman podcast, 00:36:43.240 |
And the reason is, if Facebook and Google and Microsoft 00:36:47.340 |
have roughly the same capability and the same model, 00:37:01.860 |
these are the things that will make your stuff unique. 00:37:07.420 |
you can build a reinforcement learning pipeline. 00:37:11.160 |
You build a product that captures a bunch of usage. 00:37:19.740 |
get back better answers, you can make money from it. 00:37:22.380 |
Facebook has an enormous amount of reinforcement learning 00:37:27.140 |
Every click, every comment, every like, every share. 00:37:42.700 |
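[Editor's note: a minimal sketch of the feedback-capture loop Chamath is describing: serve answers from a commodity model, log user reactions, and accumulate (prompt, answer, reward) triples that a reinforcement-learning fine-tune could later consume. All names here (FeedbackLoop, base_model, and so on) are hypothetical, not a real API.]

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class FeedbackRecord:
    prompt: str
    answer: str
    reward: float  # e.g. +1.0 for a click or like, -1.0 for a skip or report

@dataclass
class FeedbackLoop:
    base_model: Callable[[str], str]           # any commodity LLM endpoint
    log: List[FeedbackRecord] = field(default_factory=list)

    def serve(self, prompt: str) -> str:
        # Answer the user with the shared, commoditized model.
        return self.base_model(prompt)

    def record(self, prompt: str, answer: str, reward: float) -> None:
        # Capture the user's reaction; this proprietary usage data is the
        # layer on top of shared models where the episode argues value sits.
        self.log.append(FeedbackRecord(prompt, answer, reward))

    def training_batch(self) -> List[FeedbackRecord]:
        # Hand the accumulated triples to a fine-tuning job.
        return list(self.log)

# Usage sketch:
# loop = FeedbackLoop(base_model=lambda p: "stub answer")
# ans = loop.serve("What are the best restaurants in Yountville?")
# loop.record("What are the best restaurants in Yountville?", ans, reward=1.0)
```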
The huge companies, I think, will create the substrates. 00:37:47.700 |
And I think they'll be forced to scorch the earth 00:37:54.340 |
And then on top of that is where you can make money. 00:37:56.820 |
And I would just encourage entrepreneurs to think, 00:38:09.780 |
but then I can still construct a dish that's unique. 00:38:21.340 |
that I think we need to start thinking about it. 00:38:23.380 |
- Interestingly, as we've all pointed out here, 00:38:28.780 |
The stated philosophy was this technology is too powerful 00:38:33.660 |
Therefore, we're going to make it open source. 00:38:35.900 |
And then somewhere in the last couple of years, 00:38:38.820 |
it's too powerful for it to be out there in the public. 00:38:44.380 |
and we need to get $10 billion from Microsoft. 00:38:48.000 |
That is the disconnect I am trying to understand. 00:38:51.700 |
That's the most interesting part of the story, Jason, 00:39:07.940 |
And that sort of power should not sit in anyone's hands. 00:39:39.140 |
- Do all those people who donated get stock in? 00:39:42.300 |
- So what happened was they were all in a non, 00:39:50.940 |
the idea was instead of having Google own all of this, 00:39:54.400 |
And here's the statement from the original blog post in 2015, 00:40:03.240 |
in the way that is most likely to benefit humanity 00:40:10.180 |
Since our research is free from financial obligations, 00:40:13.380 |
we can better focus on a positive human impact. 00:40:17.820 |
about Sam, Greg, Elon, Reid, Jessica, Peter Thiel, AWS, 00:40:17.820 |
including donations and commitments of over a billion 00:40:27.620 |
dollars, although we expect that to only be a tiny fraction 00:40:39.100 |
on how this thing all started seven years ago 00:40:41.380 |
and how quickly it's evolved as you point out 00:40:44.060 |
into the necessity to have a real commercial alignment 00:40:47.180 |
to drive this thing forward without seeing any 00:40:53.420 |
we've seen Google share AlphaFold and share a number 00:41:04.220 |
And so there's both kind of tooling and models 00:41:07.000 |
and outputs of those models that Google has open sourced 00:41:22.980 |
you could generate a financial return capped at 100X, 00:41:26.060 |
which is still a pretty amazing financial return. 00:41:30.660 |
That's funding a real commercial endeavor at that point. 00:41:35.980 |
- It is the most striking question about this whole thing, 00:41:40.220 |
And it's one that Elon's talked about publicly 00:41:42.180 |
and others have kind of sat on one side or the other, 00:41:50.660 |
and most kind of existential threats to humanity. 00:41:56.180 |
and the battle that's gonna be happening politically 00:42:00.060 |
and perhaps even between nations in the years to come, 00:42:06.340 |
And what are we legally gonna be allowed to do with it? 00:42:08.460 |
And this is a really important part of that story, yeah. 00:42:13.620 |
people don't know that's another framework, P-Y-T-O-R-C-H. 00:42:21.740 |
"Hey, we want to democratize machine learning." 00:42:43.980 |
- No, I don't have an investment in TensorFlow. 00:42:47.700 |
- No, TensorFlow, the public source came out of Google 00:42:51.060 |
- But we were building Silicon for machine learning. 00:42:57.740 |
The founder of this company was the founder of TensorFlow. 00:43:02.860 |
- Oh, sorry, not of TensorFlow, pardon me, of TPU, 00:43:15.500 |
I don't mean to be cynical about the whole project or not, 00:43:21.900 |
It reminds me, I don't know if you remember this, 00:43:26.300 |
- The biggest opportunity here is for Facebook. 00:43:28.020 |
I mean, they need to get in this conversation, ASAP. 00:43:33.940 |
PyTorch was like a pretty seminal piece of technology 00:43:37.180 |
that a lot of folks in AI and machine learning 00:43:44.260 |
And what's so funny about Google and Facebook 00:43:49.080 |
they're not really making that much progress. 00:43:58.480 |
I think these companies really need to get these products 00:44:07.460 |
- You have to email people and get on some list, 00:44:09.620 |
I mean, this is Google and Facebook, guys, come on. 00:44:23.180 |
to the tune of $3 million a day in cloud credits or costs. 00:44:27.380 |
Which, you know-- - By the way, just on that, 00:44:39.460 |
we know now when we can actually get resources." 00:44:44.540 |
an interesting thing where a 13-year-old kid knows 00:44:48.060 |
when it's mostly compute-intensive that it's unusable 00:44:56.380 |
and captured people's imagination this broadly? 00:45:08.860 |
because whenever something is the hype cycle, 00:45:11.420 |
I just reflexively want to be skeptical of it. 00:45:14.000 |
But on the other hand, we have made a few investments 00:45:24.340 |
- I have two pieces of more insider information. 00:45:35.540 |
And it's the simplest interface you've ever seen, 00:45:42.300 |
So it looks, Sacks, like you're in iMessage, basically. 00:45:42.300 |
"Hey, what are the best restaurants in Yawnville?" 00:45:53.380 |
And then I said, "Which one has the best duck?" 00:45:55.540 |
And it literally like gave me a great answer. 00:45:58.620 |
"why is this not using a Siri or Alexa-like interface? 00:46:02.840 |
"And then why isn't it, oh, here's a video of it." 00:46:06.260 |
- By the way, Jason, this, what you're doing right now 00:46:13.780 |
So just the fact that you asked that question, 00:46:16.620 |
and over time, if ChatGPT has access to your GPS information 00:46:21.260 |
and then knows that you went to restaurant A versus B, 00:46:27.880 |
"Hey, Jason, we noticed you were in the area. 00:46:31.420 |
"If you did, how would you rate it one through five?" 00:46:34.020 |
That reinforcement learning now allows the next person 00:46:36.780 |
that asks, "What are the top five restaurants?" 00:46:54.460 |
you have to be running to get this stuff out there. 00:46:57.740 |
- Well, and then this answer, you cited Yelp. 00:47:01.100 |
Well, this is the first time I've actually seen 00:47:04.740 |
and this is, I think, a major legal breakthrough. 00:47:10.780 |
I don't know if they have permission from Yelp, 00:47:13.620 |
it should link to French Laundry, Bottega, and Bouchon. 00:47:16.300 |
Bouchon actually has the best duck confit, for the record. 00:47:27.060 |
I could say, "Hey, which one has availability 00:47:34.380 |
or any number of next tasks. - I was thinking about-- 00:47:42.060 |
- I was thinking about what you said last week, 00:47:50.660 |
And what happened was, there was a lot of musicians, 00:48:00.580 |
"Hey, listen, you're allowing people to take my content, 00:48:05.480 |
"There's economic damage that I can measure." 00:48:21.820 |
is there a claim that Yelp can make in this example, 00:48:35.560 |
and the advertising revenue that they would have got 00:48:39.060 |
Now, that doesn't mean that ChatGPT can't figure that out, 00:48:43.740 |
that are gonna be a little thorny in these next few years 00:48:51.420 |
- If you were a human reading every review on Yelp 00:49:02.440 |
So the question is, is GPT held to that standard? 00:49:16.680 |
because if you look at the four-part test for fair use, 00:49:23.720 |
and we would mention Walt Mossberg's review of a product 00:49:30.720 |
And we'd say, "Well, we're doing an original work. 00:49:34.200 |
You know, human is comparing two or three different reviews 00:49:40.600 |
It's not interfering with Walt Mossberg's ability 00:49:45.200 |
to get subscribers in the Wall Street Journal. 00:49:50.960 |
And just reading from Stanford's quote on fair use, 00:49:55.560 |
"is whether your use deprives the copyright owner of income 00:50:07.320 |
This is true even if you are not competing directly 00:50:16.160 |
In this example, I would not open the Yelp app. 00:50:18.880 |
Yelp would get no commerce and Yelp would lose this. 00:50:20.960 |
So, ChatGPT and all these services must use citations 00:50:27.500 |
They must link to them and they must get permission. 00:50:35.560 |
if you have to get permission in advance, right? 00:50:41.400 |
Quora, Yelp, the App Store reviews, Amazon's reviews. 00:50:46.520 |
So, there are large corpuses of data that you would need. 00:50:49.760 |
Like Craigslist has famously never allowed anybody 00:51:20.480 |
- The other gray area that isn't there today, 00:51:22.640 |
but may emerge, is when Section 230 gets rewritten. 00:51:29.560 |
for the Facebook and the Googles of the world, 00:51:31.260 |
for basically being an algorithmic publisher, 00:51:34.560 |
and saying an algorithm is equivalent to a publisher. 00:51:37.580 |
What it's essentially saying is that an algorithm 00:51:43.600 |
And I wonder whether that's also an angle here, 00:51:49.920 |
I read all these blog posts, I write something. 00:51:53.880 |
maybe can you then say, no, actually there was intent there 00:51:56.880 |
that's different than if a human were to do it? 00:52:09.500 |
with an economic model for VCs to really make money. 00:52:12.980 |
And right now there's just too much betting on the come. 00:52:17.260 |
it makes sense that you put money into OpenAI 00:52:20.440 |
Because the economic model of how you make money 00:52:25.840 |
- No, it's clear actually, I have it for business. 00:52:31.280 |
They had a survey that they shared on their discord server. 00:52:35.880 |
and they did a price discovery survey, Freeberg. 00:52:38.960 |
What's the least you would pay, the most you would pay, 00:52:41.320 |
what would be too cheap of a price for ChatGPT Pro, 00:52:45.960 |
I put in like 50 bucks a month would be what I would pay. 00:52:51.080 |
allowed you, Freeberg, to have a Slack channel 00:52:55.040 |
And you could go in there or anytime you're in Slack, 00:53:15.760 |
Okay, well, that was the job of the local event planner. 00:53:27.720 |
- Well, I think one of the big things that's happening 00:53:31.120 |
is all the old business models don't make sense anymore. 00:53:36.120 |
In a world where the software is no longer just doing 00:53:46.320 |
So you have this kind of hierarchical storage of data 00:53:52.840 |
and then you go and you search and you pull data out, 00:53:55.400 |
and then you present that data back to the customer 00:53:59.200 |
And that's effectively been how all kind of data 00:54:11.600 |
is kind of built an evolution of application layers 00:54:14.080 |
or software tools to interface with the fetching of that data, 00:54:17.800 |
the retrieval of that data, and the display of that data. 00:54:22.600 |
what AI type systems or machine learning systems now do, 00:54:27.900 |
and the representation of some synthesis of that data 00:54:31.720 |
to you, the user, in a way that doesn't necessarily 00:54:39.420 |
And that's where business models like a Yelp, for example, 00:54:44.740 |
and then presents webpage directories to you. 00:54:54.620 |
in being able to present to you a synthesis of that data 00:55:01.200 |
with your own consumption and interpretation of that data, 00:55:04.240 |
which is how you've historically used these systems. 00:55:08.260 |
going back to the question of the hype cycle, 00:55:12.500 |
I think it's about the investment opportunity 00:55:15.220 |
against fundamentally rewriting all compute tools. 00:55:18.420 |
Because if all compute tools ultimately can use 00:55:21.280 |
this capability in their interface and in their modeling, 00:55:33.340 |
in being able to build new systems and new models 00:55:42.860 |
having screening results from very expensive experiments 00:55:49.620 |
and having a lot of data against those experiments 00:55:55.140 |
in being able to do things like drug discovering. 00:55:59.140 |
versus everyone using publicly known screening libraries 00:56:02.180 |
or publicly available protein modeling libraries, 00:56:08.260 |
and the same targets and the same clinical objectives 00:56:11.260 |
that they're going to try and resolve from that output. 00:56:19.980 |
But really, that's just kind of, where's there an edge? 00:56:41.640 |
What impact would it be if ChatGPT took every court case, 00:57:12.460 |
and say, you know, versus an outcome in another state, 00:57:14.980 |
and you could figure out what's actually going on 00:57:19.780 |
What impact did this have on the legal field? 00:57:35.980 |
- You gotta be kind of dumb to fail the bar exam. 00:57:52.140 |
that an associate at a law firm would get asked 00:58:01.860 |
And you could imagine GPT doing that like instantly. 00:58:16.780 |
showing that people are developing some skills 00:58:19.140 |
around knowing how to ask GPT questions in the right way. 00:58:30.300 |
what's the best restaurant in, you know, Napa, 00:58:34.200 |
But there are much more complicated questions 00:58:39.060 |
So it's not clear to me that a command line interface 00:58:46.320 |
So we're an investor, for example, in Copy AI, 00:58:48.320 |
which is doing this for copywriters and marketers, 00:58:53.780 |
And so, you know, imagine putting that like, you know, 00:59:03.980 |
I think the other part of it is on the answer side, 00:59:16.640 |
But in other professions, you need six nines accuracy, 00:59:23.780 |
Okay, so I think for a lawyer going into court, 00:59:32.020 |
- Yeah, it's a parking ticket versus a murder trial 00:59:51.300 |
Like could the associates start with ChatGPT, 01:00:07.300 |
I think you'd get precision recall off the charts 01:00:25.140 |
where I will be having a set of conversations 01:00:30.420 |
and see Reid having, and he's a very smart guy, 01:00:35.740 |
And by the way, ChatGPT will have an AI-generated voice 01:00:39.260 |
powered by the text-to-speech platform, play.ht. 01:00:50.660 |
- Well, I mean, Chamath, we have a conversation 01:00:57.580 |
- Hey, but actually, so synthesizing Chamath's point 01:01:01.980 |
with something you said, J. Cal, in our chat, 01:01:09.700 |
'cause I don't think you've said it on this episode, 01:01:12.660 |
which is you said that these OpenAI capabilities 01:01:24.420 |
but there'll be multiple players that offer them. 01:01:26.660 |
And you said the real advantage will come from 01:01:42.700 |
in a given vertical with a proprietary data set, 01:01:59.660 |
with this really interesting company based in Zurich. 01:02:03.180 |
And what they have is basically a library of ligands, right? 01:02:10.540 |
to deliver all kinds of molecules inside the body. 01:02:13.220 |
And what's interesting is that they have a portfolio 01:02:22.980 |
So, you know, they target glioblastoma. 01:02:22.980 |
"Well, this ligand can actually cross the blood-brain barrier 01:02:34.100 |
and a whole bunch of nuclear imagery around that. 01:02:42.740 |
because that's real work that Google or Microsoft 01:02:51.100 |
And if you have that and you bring it to the problem, 01:02:55.540 |
You know, there's a business there to be built. 01:03:08.820 |
- And that prompt engineer, well, no, a prompt engineer, 01:03:11.780 |
somebody who is very good at talking to these, you know, 01:03:20.340 |
just like a detective who asks great questions, 01:03:23.100 |
that person is gonna be 10 or 20 times more valuable. 01:03:33.340 |
And as we talk about austerity and doing more with less 01:03:43.180 |
Facebook laying off 10,000 and probably another 10,000, 01:03:50.180 |
We could be sitting here in three or four or five years 01:03:51.980 |
and instead of running a company like Twitter 01:03:57.580 |
- Look, I think directionally, it's the right statement. 01:04:00.660 |
I mean, you know, I've made the statement a number of times 01:04:03.780 |
that I think we move from this idea of creator economy 01:04:08.700 |
where historically it was kind of labor economy, 01:04:11.060 |
where humans use their physical labor to do things. 01:04:23.580 |
where the way that you kind of can state intention 01:04:40.260 |
at trying to reproduce a photographic like imagery 01:04:46.380 |
And there's these really great kind of museum exhibits 01:04:48.660 |
on how he did it using these really interesting 01:04:55.020 |
And then the artist of the 20th century 01:04:58.020 |
was the best user of Adobe Photoshop. 01:05:01.020 |
And that person is not necessarily the best painter. 01:05:12.020 |
it's gonna look like something entirely different. 01:05:13.980 |
It could be who's got the most creative imagination 01:05:16.900 |
in driving the software to drive new outcomes. 01:05:19.140 |
And I think that the same analogy can be used 01:05:27.260 |
'cause the Luddite argument is when you have new tools 01:05:42.380 |
And when we level up, we all kind of fill the gaps 01:05:44.460 |
and expand our productivity and our capability set. 01:05:52.100 |
It's just that that individual company is run differently, 01:06:02.900 |
Instead of the creator economy, it's the conductor economy. 01:06:11.740 |
there's gonna be somebody who's sitting there like, 01:06:27.380 |
for copyright infringement, give me my best arguments. 01:06:30.980 |
hey, I wanna know what the next three features 01:06:34.660 |
Can you examine who are my top 20 competitors? 01:06:37.180 |
And then who have they hired in the last six months? 01:06:39.100 |
And what are those people talking about on Twitter? 01:06:49.660 |
in the book Ender's Game, I think is a good example 01:06:52.620 |
of this where the guy goes through the entire 01:06:56.340 |
he's commanding armies of spaceships in space 01:07:05.700 |
- You predicted that there would be like all these people 01:07:10.700 |
But I think this Reid Hoffman thing could be pretty cool. 01:07:13.140 |
Like what if he wins a Grammy for his, you know, 01:07:21.260 |
when's the first AI novel gonna get published 01:07:25.740 |
When's the first AI symphony gonna get performed 01:07:31.540 |
get turned into an AI generated 3D movie that we all watch? 01:07:36.340 |
when do we all get to make our own AI video game 01:07:46.020 |
these new immersive environments that they can live in. 01:07:50.020 |
- When I say live in, I mean video game wise. 01:07:54.860 |
just like to use a question of game theory for a second, 01:08:10.540 |
if multiple models get there at a system-wide level, 01:08:27.820 |
if everybody then knows how to ask the exact right question 01:08:32.660 |
where you're like, maybe there is a dystopian hellscape 01:08:37.020 |
Maybe that's the Elon world, which is you can, 01:08:44.340 |
where there is no job that's possible, right? 01:08:47.940 |
And now I'm not saying that that path is the likely path, 01:08:50.940 |
but I'm saying it is important to keep in mind 01:08:52.620 |
that that path of outcomes is still very important 01:08:58.660 |
- Well, Freeberg, you were asking before about this, 01:09:05.420 |
podcasting is a job now, being an influencer is a job, 01:09:13.460 |
I'm looking at FRED, I'm looking at the St. Louis Fed, 01:09:16.820 |
1970, 26.4% of the country was working in a factory, 01:09:32.140 |
And in 2015, when they stopped the percentage 01:09:34.900 |
in manufacturing, I say it's, they discontinued this, 01:09:38.500 |
So it's possible we could just see, you know, 01:09:44.060 |
the concept of knowledge work is going to follow, 01:09:47.260 |
pretty inevitable, the path of manufacturing. 01:09:50.580 |
That seems like a pretty logical theory or no? 01:09:56.820 |
- Okay, so how would we like to ruin the show now? 01:10:16.560 |
- Some of these network guys are talking about productivity. 01:10:29.200 |
is in possession of classified documents in his home, 01:10:33.760 |
that apparently have been taken in an unauthorized manner, 01:10:36.760 |
basically stolen, he should have his home raided by the FBI. 01:10:49.400 |
has now said there's a third batch of classified documents. 01:10:54.160 |
This group, I guess there was one at an office, 01:10:56.760 |
one at a library, now this third group is in his garage 01:10:59.200 |
with his Corvette, certainly not looking good. 01:11:05.640 |
meaning that you could use a garage door opener 01:11:12.320 |
- So pretty much as secure as the documents at Mar-a-Lago, 01:11:16.880 |
- No, no, no, actually, I mean, just to be perfectly fair, 01:11:19.880 |
the documents at Mar-a-Lago were locked in a basement. 01:11:24.160 |
said we'd like you to lock those up, they locked 'em up. 01:11:26.200 |
So a little safer than being in a garage with a Corvette. 01:11:31.280 |
- But functionally the same, functionally the same. 01:11:41.800 |
is appointed an independent counsel to investigate Trump, 01:11:47.120 |
or investigator appointed to investigate Biden. 01:11:49.640 |
I mean, these things are functionally the same. 01:11:57.320 |
As of an hour ago, a special counsel was appointed. 01:12:03.960 |
- Okay, I guess there are real questions to look into here. 01:12:18.480 |
Do they touch on the Biden family's business dealings 01:12:28.880 |
Now that the last three presidential candidates 01:12:31.480 |
have been ensnared in these classified document problems, 01:12:51.840 |
It seems to me that we have an over-classification problem, 01:13:00.960 |
the government can avoid accountability and prying eyes 01:13:04.280 |
by simply labeling any document as classified. 01:13:07.720 |
So over-classification was a logical response 01:13:18.160 |
to a president or vice president is classified. 01:13:34.720 |
- And they're supposed to have declassified these. 01:13:40.640 |
on the release of the JFK assassination documents, 01:13:50.560 |
which is making it... - They're not releasing them. 01:14:04.400 |
that need to be classified after even, say, five years. 01:14:08.280 |
automatically declassifying them after five years, 01:14:11.160 |
unless they go through a process to get reclassified. 01:14:22.080 |
are still sensitive, are trade secrets, five years later? 01:14:26.400 |
- Certainly 20 years later, they're not, right? 01:14:30.280 |
- No, but I wouldn't even say, like, five years. 01:14:31.280 |
I mean, the only documents-- - The Coca-Cola formula. 01:14:33.440 |
- The only documents in business that I think I deal with 01:14:38.760 |
are the ones that pertain to the company's future plans, 01:14:42.280 |
right, because you wouldn't want a competitor-- 01:14:45.280 |
- Yeah, cap table. - There's a handful of things. 01:14:56.840 |
- So, like, in business, I think our experience has been 01:14:59.520 |
there's very few documents that stay sensitive, 01:15:09.960 |
to the Javelin missile system or to, you know, 01:15:24.400 |
that they're reviewing-- - Why are they keeping them-- 01:15:27.040 |
- That needs to be classified five years later. 01:15:48.360 |
with a bunch of personal effects and mementos. 01:15:53.920 |
and handle documents, they're all classified. 01:16:02.160 |
I mean, that's gonna wind up being the rub here 01:16:04.400 |
is Trump didn't give them back, and Biden did. 01:16:11.320 |
they looked around, they said, "Put a lock on this." 01:16:15.040 |
Then maybe they changed their minds, I don't know. 01:16:17.960 |
- Well, it's pretty clear that he wouldn't give them back. 01:16:20.920 |
is that now that Biden, Trump, and Hillary Clinton 01:16:28.880 |
that we're over-classifying so many documents? 01:16:40.120 |
Remember Hillary Clinton and the whole email server? 01:16:53.840 |
I mean, if you're a politician, an elected official, 01:16:58.840 |
the only time you should ever be handling anything 01:17:01.080 |
is go into a clean room, make an appointment, 01:17:04.320 |
go in there, read something, don't take notes, 01:17:23.840 |
- You're missing the most important part about this, Sacks. 01:17:26.840 |
This was, if you wanna go into conspiracy theories, 01:17:33.320 |
so that we could create the false equivalency 01:17:41.960 |
to fight with Biden about, and this is gonna help Trump. 01:17:45.040 |
- 'Cause they're both tainted, equally tainted 01:17:54.760 |
I think Merrick Garland now is gonna have to drop 01:17:57.800 |
the prosecution against Trump for the stolen documents, 01:18:01.120 |
or at least that part of what they're investigating him for. 01:18:04.120 |
They might still investigate him over January 6th 01:18:05.880 |
or something, but they can't investigate Trump 01:18:14.640 |
both sides are engaged in hyper partisanship. 01:18:16.680 |
The way right now that the conservatives on the right, 01:18:20.640 |
they're attacking Biden now for the same thing 01:18:27.840 |
and again, think about the incentives we're creating 01:18:31.920 |
You can't use email, and you can't touch documents. 01:18:38.600 |
- And by the way, don't ever go into politics 01:18:41.440 |
if you're a business person, because they'll investigate 01:18:43.920 |
every deal you ever did prior to getting into politics. 01:18:48.320 |
I mean, just think about the incentives we're creating. 01:18:49.160 |
- So what are you gonna do when you try to get 01:19:00.280 |
the permanent Washington establishment, i.e. the deep state, 01:19:03.720 |
they're creating a system in which they're running things, 01:19:25.920 |
is to get DeSantis elected so that he can become 01:19:30.320 |
I mean, Ken Griffin would get that if he wanted it. 01:19:37.880 |
So he would mark to market like $30 billion, 01:19:54.880 |
- When you get appointed to those senior posts, 01:19:58.840 |
you're allowed to either stick it in a blind trust, 01:20:13.620 |
And so the argument is, if you're forced to divest it 01:20:17.100 |
to enter a government, you shouldn't be forcibly taxed. 01:20:19.120 |
- Wait, if I become mayor of San Francisco or Austin, 01:20:23.900 |
- Secretary of Transportation, J. Cal, you can do that. 01:20:30.520 |
- To answer Freeberg's point, I think Citadel Securities, 01:20:34.960 |
because that's just a securities trading business. 01:20:38.720 |
probably something like a big bulge bracket bank 01:20:53.720 |
- It's not a grift at all, but it's an incredible-- 01:20:56.120 |
- Oh, come on, man, a cabinet position for no cap gains? 01:21:00.040 |
- Well, that's not a grift, those are the laws. 01:21:28.320 |
- Look, any normal person who wants to serve in government, 01:21:31.400 |
you can't use email and you can't touch a document, 01:21:34.360 |
and every deal you've ever done gets investigated. 01:21:39.560 |
I mean, all of a sudden, you get to divest tax free. 01:21:42.400 |
- Methink thou doth protesteth too much, David Sacks. 01:21:28.320 |
- To a combination of BlackRock and Blackstone 01:22:06.760 |
would be considered a petty, small-scale swindle. 01:22:08.840 |
- Did any of you guys watch the Madoff series on Netflix? 01:22:37.640 |
- They become like all the-- - No, one guy died. 01:22:39.080 |
- Everyone was a victim. - One guy died of cancer. 01:22:41.960 |
- And then Irving Picard, I didn't realize all this, 01:22:50.680 |
and he sued them and took their homes away from them, 01:23:02.240 |
By the way, that's gonna be really interesting 01:23:10.280 |
- And that's why the Southern District of New York said 01:23:17.240 |
all those PACs and all those political donations-- 01:23:21.000 |
- They have to go and investigate where that money went 01:23:36.200 |
I did watch this weekend, "Triangle of Sadness." 01:23:48.480 |
I thought it was, it didn't pay off the way I thought, 01:23:50.580 |
but this is one of the best setups you'll see in a movie. 01:23:53.280 |
So basically, it's a bunch of people on a luxury yacht. 01:23:57.800 |
So you have a bunch of rich people as the guests. 01:24:00.400 |
Then you have the staff that interacts with them. 01:24:20.520 |
- And then what happens is there's a shipwreck, basically. 01:24:28.080 |
So the plot is you have this Caucasian patriarchy 01:24:44.700 |
So now you flip to this immigrant matriarchy. 01:24:47.440 |
- It's a pretty great meditation on class and survival. 01:25:05.000 |
- All right, well, this has been a great episode. 01:25:12.280 |
What are we doing, salad, some tuna sandwiches? 01:25:15.420 |
- No, I think Kirsten is doing, I think, dorade. 01:25:24.800 |
- Jake and I once had a great dorade in Venice. 01:25:56.560 |
- Before this podcast made us into mortal enemies. 01:26:00.800 |
you couldn't agree with my take on this documents scandal. 01:26:10.760 |
I think, you know, you keep making me defend Biden. 01:26:24.880 |
- Are we going to play Saturday after the wild card game? 01:26:27.000 |
Are you guys interested in playing Saturday as well? 01:26:34.280 |
- Who's going to the, are you guys all going? 01:26:46.040 |
- That does not play well in confirmation hearings. 01:27:11.320 |
- No, I just said I don't have time to do it. 01:27:12.160 |
- No, it has to do with the cabinet positions. 01:27:13.680 |
He doesn't need to be seen recklessly gambling. 01:27:17.320 |
- If you could pick any cabinet position, Sax, 01:27:30.400 |
- I don't know that those like cabinet positions 01:27:44.640 |
- Trump's idea was like, put a bunch of hardline, 01:27:49.000 |
have them blow up these things and make it more efficient. 01:27:52.560 |
- Yeah, well, you know why a CEO is actually in charge? 01:27:57.400 |
if he doesn't like what you're doing, he'll just fire you. 01:28:01.840 |
when they don't have to listen to anything you say? 01:28:11.000 |
for these departments, these giant departments. 01:28:13.000 |
- Is that a no, or is that a yes, you'd still take state? 01:28:20.840 |
I think he's going for the ambassadorship first. 01:28:24.960 |
- Well, you can't divest everything with no cash. 01:28:27.520 |
- Historically, you can tell which ambassadorship 01:28:30.440 |
is the best one based on how much they charge for it. 01:28:42.640 |
That's what Sacks's fourth least expensive home cost. 01:28:47.280 |
- No, no, no, you have to spend that every year 01:28:53.880 |
or the ambassador to the UK, you get the same budget. 01:28:56.800 |
- Actually, what's kind of funny is I know two people 01:29:08.400 |
- They were on fire sale after, because of Trump. 01:29:20.160 |
they already got the all-time highs to take the job. 01:29:22.480 |
He was like, "I got to get out of all of this stuff." 01:29:24.240 |
- No, but listen, let me tell you, the ambassadorships, 01:29:33.400 |
No one remembers what president when you were ambassador, 01:29:39.440 |
- I'm not, I'm not interested in ceremonial things. 01:29:51.120 |
is not much different than being an ambassador. 01:30:04.200 |
Sacks on the All-In pod or [bleep] as the ambassador of Sweden? 01:30:14.760 |
with your statement about the term mainstream media. 01:30:17.040 |
'Cause I think you have become the mainstream media 01:30:37.440 |
- That is the top word of 2023 so far for me. 01:30:48.720 |
All right, everybody, we'll see you next time