E169: Elon sues OpenAI, Apple's decline, TikTok ban, Bitcoin $100K?, Science corner: Microplastics
Chapters
0:00 Bestie intros!
0:55 Elon sues OpenAI: complex structure, tax issues, damages, past comparables
37:06 OpenAI's focus on AGI, different interpretations of AGI in tech
44:46 Groq update with Sunny Madra!
49:53 Have we hit peak Apple?: Losing regulatory battles, iPhone stagnation, Buffett starts trimming
66:25 TikTok ban: New proposed House bill would force ByteDance to divest TikTok, or ban the app outright
81:08 Bitcoin hits new all-time high: impact of ETFs and an upcoming halving event
85:06 Science Corner: More data on the negative impacts of microplastics in the bloodstream
She did a Uranus joke into my chat and I lost it. 00:00:22.960 |
Uranus is right there at the edge of the universe. 00:00:30.080 |
Oh no, we could go to a great restaurant, Uranus. 00:00:56.080 |
Welcome back to your favorite podcast of all time. 00:01:12.280 |
Definitely David Sacks has his Moncler hat back again. 00:01:20.920 |
Issue one, Elon has sued OpenAI, begun the meme wars have. 00:01:43.880 |
As you know, he gave them something like 50 or 75 million 00:01:52.840 |
They became a closed-source, for-profit venture 00:02:00.260 |
after benefiting from nonprofit tax breaks while building the tech, 00:02:09.240 |
There's a for-profit, it's called OpenAI Global LLC. 00:02:13.900 |
And there's all this funky relationship between the two. 00:02:18.520 |
Elon said that if OpenAI is allowed to do this, 00:02:24.400 |
You can start by donating money to a nonprofit 00:02:45.080 |
And we can get more into a bunch of the OpenAI 00:02:54.880 |
But I want to just get your initial reaction, Sax. 00:03:01.240 |
is just coming from following what's on Twitter. 00:03:03.480 |
And the last time I commented on one of Elon's lawsuits 00:03:07.320 |
on this pod, I ended up in six hours of deposition. 00:03:10.080 |
So remember that was the whole Twitter lawsuit. 00:03:17.060 |
from the tweets that are flying back and forth 00:03:31.260 |
and to promote AI as an open source technology 00:03:40.540 |
In that case, Elon was primarily concerned about Google. 00:03:43.320 |
I think now he's more concerned about Microsoft. 00:03:45.560 |
Nonetheless, the idea was that this would be open source. 00:03:54.060 |
after he donated 40 something million dollars to this. 00:03:58.040 |
They completely changed what it was gonna be. 00:04:00.380 |
And I think he's used the word swindled before. 00:04:05.120 |
And I think usually when there's a lawsuit like this, 00:04:13.700 |
The second is that I think that Elon's making is, 00:04:16.320 |
well, wait a second, if you can start a company 00:04:23.160 |
and then all of a sudden convert to for-profit, 00:04:38.220 |
And they think that they have a smoking gun here 00:04:45.280 |
in order to have a chance of taking on Google/DeepMind. 00:04:49.540 |
I just don't know whether that's the smoking gun 00:05:08.860 |
And then I think Marc Andreessen is responding to Vinod. 00:05:17.260 |
Anyway, it's turned into a whole maelstrom on X, 00:05:25.780 |
So I don't know who's gonna win this case in court, 00:05:31.500 |
And maybe the funniest one is that Elon has said 00:05:45.040 |
which is you ended up doing something different 00:05:55.620 |
So I think that sort of sums up Elon's position on this. 00:06:06.500 |
which is that they used his name pretty aggressively 00:06:24.300 |
So that's probably, Sax, where also some of this feeling 00:06:33.180 |
But I think that all of those emotions matter less 00:06:37.540 |
than the rule of law, which is his second point, 00:06:41.540 |
irrespective of whatever one email says or another email. 00:06:52.620 |
I think that there's a huge economic incentive 00:06:58.100 |
every other state where these open AI employees lived 00:07:00.960 |
and have gotten equity and now have gotten paid. 00:07:09.020 |
It touches a lot of aspects of very complicated tax law. 00:07:13.480 |
And so I think that's why there will be a resolution 00:07:16.740 |
to this case, because I think that that answer 00:07:20.780 |
And I think it will, as he has correctly identified, 00:07:24.660 |
motivate a lot of people's actions going forward. 00:07:27.800 |
So independent of what the emotional situation is 00:07:49.180 |
we transferred a lot of our IP to Ireland at one moment 00:07:54.100 |
and there was a transfer payment and whatnot. 00:07:56.620 |
And then a few years afterwards when everybody realized, 00:08:02.160 |
but when everybody realized the value of Facebook, 00:08:07.380 |
And it was a huge tax arb that we had facilitated, right? 00:08:12.140 |
Because all of that IP sitting inside of Ireland 00:08:18.140 |
than that IP would have gotten taxed in the United States. 00:08:24.420 |
Similar to you, David, I was in years of depositions 00:08:30.460 |
So the point is that the government really cares 00:08:43.860 |
because there's just too much money at stake. 00:09:02.100 |
where Elon acknowledged and recognized the necessity 00:09:08.120 |
that could pursue the interests of the foundation 00:09:12.460 |
by raising billions of dollars of outside capital. 00:09:16.000 |
I think it's a really interesting set of facts 00:09:20.080 |
that provides a different light on the story. 00:09:23.300 |
And it was really important that OpenAI released it. 00:09:30.560 |
My point is it doesn't allow you to break the law. 00:09:32.720 |
- Yeah, so let me just tell you guys a little bit 00:09:53.760 |
the Cystic Fibrosis Foundation established in 1955. 00:09:56.360 |
They did tons of work on drug development research. 00:09:58.540 |
And it was a nonprofit for years until they realized 00:10:00.860 |
that there needed to be a market-based system 00:10:11.480 |
they invested out of their foundation endowment, 00:10:14.060 |
$40 million in a company called Aurora Biosciences. 00:10:20.280 |
and they continued to invest another $60 million, 00:10:22.560 |
so a total of 100 million bucks in the lifetime 00:10:25.100 |
of the development of a drug that could cure this disease. 00:10:37.560 |
their interest in this for-profit entity for $3.3 billion. 00:10:49.660 |
- Who do they sell it to, back to the company? 00:10:52.700 |
- I actually think they sold the rights to Royalty Pharma, 00:11:04.620 |
and they make direct investments and other things. 00:11:06.680 |
But it really set the benchmark for this concept 00:11:10.500 |
where a nonprofit parent company can make investments 00:11:13.260 |
in for-profits that can raise additional capital 00:11:17.060 |
that's needed to pursue the broad, difficult interests 00:11:21.860 |
And I think that this argument really kind of ties 00:11:25.500 |
And you can see it in the email exchanges with Elon, 00:11:29.460 |
that I give Elon extraordinary credit for this, 00:11:31.720 |
that he saw that this is going to take billions of dollars 00:11:37.620 |
And there was no way that Elon was going to be able 00:11:39.420 |
to generate that cash himself or that Reid or others 00:11:42.060 |
were going to just be able to pony up that money. 00:11:44.060 |
They needed to have some sort of for-profit vehicle 00:11:50.660 |
into this organization to make this investment interest. 00:12:08.820 |
- So if they do, then they have an investment interest. 00:12:10.900 |
The second question is does the nonprofit parent 00:12:16.280 |
Because if it doesn't, then there's a real question 00:12:20.220 |
I think you have to have a certain amount of your assets 00:12:22.220 |
deployed every year in order to qualify for 501(c)(3). 00:12:25.860 |
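For context on that payout test, a minimal sketch of the arithmetic, assuming the IRC Section 4942 rule for private foundations (roughly 5% of net investment assets per year), which may or may not be the classification that applies here; the endowment figure is invented:

```python
# Hypothetical minimum-distribution math. The ~5% annual payout
# requirement (IRC Section 4942) applies to private foundations,
# not to every 501(c)(3); the asset figure below is made up.
MIN_PAYOUT_RATE = 0.05
hypothetical_assets = 10_000_000_000  # $10B endowment, invented

required_payout = hypothetical_assets * MIN_PAYOUT_RATE
print(f"Required annual distribution: ${required_payout:,.0f}")
# Required annual distribution: $500,000,000
```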
So I think the test and the courts will likely end up being, 00:12:28.600 |
look, it's totally reasonable to have a for-profit entity, 00:12:33.820 |
particularly when you need to attract billions of dollars 00:12:42.560 |
Freiburg, do you think that when they booted off 00:12:51.020 |
that they may have crossed a trip wire there? 00:12:58.100 |
and I talked to a couple of people who are in and around 00:13:08.020 |
it wasn't like that cystic fibrosis organization 00:13:23.180 |
- And so what happened here was you had the IP 00:13:29.680 |
they were transferred over to the for-profit organization. 00:13:44.720 |
but in both of their cases, they set up separate boards. 00:13:56.060 |
to all of the employees, which then sold them in secondary. 00:14:00.180 |
So what's happened here is all of the IP in all likelihood 00:14:09.440 |
And this actually triggers a lot of IRS investigations. 00:14:14.140 |
The Mozilla Foundation, which was making the Netscape 00:14:17.060 |
browser, which many of you probably have used 00:14:18.700 |
over the years, they were making hundreds of millions 00:14:20.820 |
of dollars a year, David, from advertising with Google. 00:14:29.060 |
All of the, and Mitchell Baker and the team over there 00:14:35.700 |
into the Mozilla Foundation in their nonprofit efforts. 00:14:44.540 |
But it was very important that these things be separated 00:14:53.320 |
And I think the IRS is gonna have a field day here. 00:14:55.780 |
- Here's the, Nick, can you please throw the image up? 00:15:05.940 |
But it has been reported that Microsoft owns 49%, right? 00:15:14.180 |
In this example, Microsoft, where it says minority 00:15:17.280 |
economic interest, that number there would be 49 00:15:20.000 |
and that's what they own of this OpenAI Global LLC. 00:15:24.400 |
And the majority owner is this holding company. 00:15:40.220 |
we can know very precisely how much the nonprofit owns 00:15:44.880 |
by just Xing out the investors and employees. 00:15:52.900 |
I think that that's probably a pretty reasonable guess. 00:16:01.940 |
Is that fairly reasonable guess at the maximum? 00:16:08.220 |
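As a purely illustrative sketch of that "X-ing out" decomposition, the only sourced number below is Microsoft's reported 49% minority economic interest; every holding-company split is a hypothetical placeholder, not a claim about OpenAI's actual cap table:

```python
# Illustrative only: decompose who holds the operating company's
# economics. Microsoft's 49% is the reported figure; the
# holding-company splits are hypothetical placeholders.
microsoft = 0.49
holding_co = 1.0 - microsoft            # remaining 51%

investors_in_holdco = 0.60              # hypothetical
employees_in_holdco = 0.30              # hypothetical
nonprofit_in_holdco = 1.0 - investors_in_holdco - employees_in_holdco

# "X out" the investors and employees to see what's left:
nonprofit_economics = holding_co * nonprofit_in_holdco
print(f"Implied nonprofit economics: {nonprofit_economics:.1%}")  # ~5.1%
```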
And so I think that they're gonna have to show 00:16:10.720 |
that ownership structure and decompose all these entities 00:16:15.860 |
If it's a lot less than that, if it's say like a 1% thing, 00:16:26.480 |
they're still gonna have to prove that all of this was done 00:16:29.960 |
in a clean way because the big thing that these guys did 00:16:38.660 |
it's clear that they were trying to avoid something. 00:17:00.380 |
And the discipline is always to tell these lawyers 00:17:04.820 |
And they'll be like, let's set up a master feeder 00:17:06.740 |
and you'll go through Bermuda and this and that. 00:17:09.980 |
And it's so easy to say yes because it's very seductive 00:17:14.380 |
but it's always about trying to avoid something. 00:17:16.320 |
So the real question is why this convoluted structure, 00:17:23.700 |
Like all that IP, Sax, came from a nonprofit. 00:17:26.900 |
All those employees worked for the nonprofit. 00:17:29.260 |
Then they decided to go from an open source model to closed. 00:17:32.380 |
And then by doing that, they capture all the value 00:17:39.860 |
That's where I think something doesn't seem right. 00:17:44.380 |
the majority of the IP that exists at OpenAI today 00:17:48.220 |
was generated after the for-profit was set up. 00:17:50.660 |
The real question, was there a fair transfer at the time 00:17:53.340 |
when it was set up of value into this for-profit entity? 00:17:56.680 |
And my guess is that GP box that you just looked at, 00:17:59.800 |
the compensation that they're earning through that GP box 00:18:08.100 |
relative to what was contributed at the time. 00:18:26.340 |
then I think there's a real question on what's the activity. 00:18:28.820 |
And I don't mean to speak out of turn on all this stuff. 00:18:44.960 |
There's a tried and true way of creating a startup, 00:18:49.860 |
When you try to innovate on legal instead of product, 00:18:59.620 |
damned if you do, damned if you don't situation. 00:19:02.220 |
To go back to Freeberg's point about the nonprofits 00:19:08.980 |
when Sam briefly got fired as CEO and then came back, 00:19:12.560 |
is they had these directors from the nonprofit world 00:19:16.340 |
who didn't really seem to understand how startups work. 00:19:29.340 |
Either way, that whole thing was a spectacle. 00:19:41.220 |
a Silicon Valley startup to maximize the outcome, 00:20:08.900 |
that they've completely changed the original mission 00:20:14.760 |
Right, it was supposed to be-- - Yeah, and the mission. 00:20:28.880 |
and then you realize you've got something super valuable, 00:20:31.720 |
and you close source it, create a for-profit, 00:20:34.740 |
and then take all the employees and all the IP 00:20:37.380 |
and put it into the for-profit, lock it down, 00:20:44.600 |
So if you donate money here and we get the tax exemption, 00:21:01.360 |
which seems to be one of the triggers according to reports, 00:21:06.480 |
maybe that deal-making made the nonprofit people say, 00:21:08.520 |
hey, listen, you're doing even more for-profit stuff 00:21:15.260 |
- He started a venture fund to invest in companies 00:21:21.540 |
which they're saying now it's a clerical mistake 00:21:23.800 |
or something, but he's invested in a bunch of startups 00:21:42.720 |
They're now saying that that was done in error. 00:21:45.640 |
So, okay, fine, maybe it was, maybe it wasn't, who knows? 00:21:54.100 |
- Sam raised the money to create an OpenAI fund 00:21:58.620 |
to invest in companies using OpenAI software. 00:22:08.060 |
where OpenAI's fund-- - Self-dealing, self-dealing. 00:22:16.180 |
of the OpenAI Startup Fund should belong to OpenAI 00:22:18.580 |
if Sam created a separate fund with separate LPs 00:22:22.080 |
that he's the GP of and gets economics in that, 00:22:30.760 |
I'm just saying that's a slightly different issue. 00:22:33.520 |
To go back to the point-- - And it uses the OpenAI name. 00:22:35.900 |
- Yeah, that's where the corporate opportunity 00:22:40.760 |
To go back to the point you were making a minute ago 00:22:48.080 |
I mean, I think we have to complement the employees 00:22:54.780 |
including Sam, including Greg, including Ilya, 00:23:02.960 |
- And they've created, I think, an amazing ecosystem, 00:23:05.860 |
and there's a lot of developers building on top of this, 00:23:16.480 |
The question, and I think you have to include Sam in that 00:23:18.820 |
and say that he's done a great job, okay, as CEO, 00:23:36.260 |
and whether Elon was told something at the beginning 00:23:47.240 |
under which he initially contributed all this money. 00:23:55.220 |
the most cynical interpretation of what happened here. 00:23:57.820 |
There is the most benign and benevolent interpretation 00:24:08.840 |
to give this intellectual property to the world. 00:24:14.540 |
All of humanity is supposed to benefit from it. 00:24:23.940 |
- Right, that's a good point. - The employees. 00:24:28.240 |
and if you really wanna-- - Or the investors. 00:24:36.640 |
to invest in something that's gonna be open sourced, 00:24:38.640 |
and there's no return. - And then the employees 00:24:45.000 |
the most cynical approach to this or interpretation, 00:24:50.840 |
they took an open source project, they closed it, 00:24:54.320 |
they raised money, and then within the next two years 00:24:59.500 |
they sold $2 billion and put that in their pockets. 00:25:02.720 |
Now, those employees, if it was a for-profit, 00:25:18.120 |
and this is where the IRS-- - I think that's a good point. 00:25:20.060 |
- Yeah, and the IRS also interpreted Mozilla, 00:25:35.940 |
So I think the IRS is gonna be on this like crazy 00:25:38.900 |
based upon what happened to Mozilla, which was-- 00:25:41.360 |
- Jason, do you think individual employee sellers 00:25:45.020 |
are gonna get audited by the IRS because of this? 00:25:48.400 |
- I don't, I mean, we're in uncharted territory here. 00:26:06.440 |
or a God King like Sam doing all kinds of deals 00:26:15.220 |
Hey, we'll just pay the employees an extra 50K 00:26:18.560 |
and they'll be a little overpaid in Silicon Valley, 00:26:20.260 |
but they're not getting billions of dollars in equity, 00:26:24.580 |
with people getting billions of dollars in equity 00:26:31.960 |
If this was open source and they took the two billion, 00:26:34.520 |
I would actually not have much of a problem with it 00:26:36.280 |
because we could all be looking at that mission 00:26:50.640 |
So if they're producing, Apple's pretty competent 00:27:00.340 |
it's too dangerous to show us the open source code. 00:27:04.320 |
Their code is not too dangerous for us to see. 00:27:47.300 |
- I mean, that seems like what happened here to me. 00:27:50.120 |
I mean, it's Occam's Razor kind of situation. 00:27:54.700 |
I think they probably regretted making this a nonprofit 00:27:57.340 |
and then tried to figure out a way to reverse it. 00:27:59.200 |
That's actually what I think is going on here. 00:28:00.920 |
And I do think there is part of it, Chamath, you're right, 00:28:02.880 |
that they needed servers and they needed capacity, 00:28:11.020 |
keeping it open source and then telling Microsoft, 00:28:14.460 |
hey, if you want access to this open source or whatever, 00:28:21.660 |
And the reason is that if it was open source, 00:28:28.420 |
and done its own training and just paid for it. 00:28:36.840 |
- Because OpenAI, yeah, OpenAI dropped a few of the emails 00:28:45.180 |
What we don't know is what do the other emails show? 00:28:55.480 |
Could they have remained open source, for example? 00:28:59.020 |
Also, are there any emails where employees talk 00:29:02.740 |
about the potential benefit to them of going- 00:29:15.020 |
they talked about being able to swap this phantom equity 00:29:24.540 |
- Well, then also like Microsoft coming in and buying 49%, 00:29:29.540 |
This is the other piece that I would wanna see 00:29:32.580 |
And again, I don't have any problem with any of the people. 00:29:37.740 |
I think what they've done for humanity is great. 00:29:39.700 |
I just think they should have kept this thing open source, 00:29:53.420 |
Like, you wanna talk about taking this nonprofit's IP, 00:30:00.040 |
gets given to the employees for billions of dollars, 00:30:01.900 |
and then Microsoft gets 49% of all that nonprofit's effort 00:30:06.900 |
and Microsoft has added, what, $500 billion in market cap 00:30:16.940 |
- And I think this is where you gotta understand 00:30:18.860 |
why Elon feels swindled, is because not only are we going 00:30:25.820 |
to closed source, he was specifically concerned 00:30:36.700 |
There's not really that much of a difference. 00:30:41.740 |
in the hands of one really powerful tech company. 00:30:48.520 |
the biggest company in the world by market cap? 00:30:50.300 |
So this is like the opposite of what he intended. 00:30:52.620 |
- They gifted a trillion dollars to Microsoft, probably. 00:30:56.420 |
If they are people of good faith and they're doing this 00:31:05.620 |
that may solve the lawsuit, and Elon may drop the lawsuit, 00:31:08.300 |
but it's opened a can of worms with respect to tax 00:31:11.800 |
and structuring that's much bigger than just open AI. 00:31:19.300 |
It's every other entrepreneur that studies this model 00:31:21.700 |
and tries to replicate it for their own personal gain, 00:31:34.600 |
- There's the IRS issues, then there's what's morally right, 00:31:38.540 |
And if it is a for-profit company and Elon put in 50 million 00:31:41.180 |
when it was a seed round, what would he own, Sax? 00:31:44.900 |
What would his ownership in this for-profit be? 00:31:52.260 |
- Yeah, we need to know the total size of that round. 00:32:00.860 |
let's just put a crazy valuation on it, 500 million. 00:32:17.220 |
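The back-of-envelope they're gesturing at, using only these made-up round numbers ($50M in at a $500M valuation) and ignoring all later dilution:

```python
# Hypothetical seed math with the hosts' made-up numbers.
investment = 50_000_000          # Elon's hypothetical check
post_money = 500_000_000         # the "crazy valuation" floated

ownership = investment / post_money
print(f"Implied ownership: {ownership:.0%}")  # 10%

# Ignoring every later round's dilution (which would be large),
# 10% of a company reported to be valued around $80B:
print(f"Undiluted value: ${ownership * 80e9 / 1e9:.0f}B")  # $8B
```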
- That's the way that a very capex-intensive deal 00:32:19.540 |
in a space that wasn't thought to yield big outcomes, 00:32:31.260 |
- We talked about on the show when that whole fracas 00:32:37.060 |
is we said, go back and restructure the whole thing 00:32:48.180 |
'cause it never made sense that Sam had no equity either. 00:32:49.900 |
- Sam has no equity, but he's got a venture firm, 00:33:08.680 |
- Okay, that's a really interesting thing you just said. 00:33:10.700 |
So basically, yeah, it's like he famously has no equity, 00:33:18.440 |
So even though he's not monetizing the thing, 00:33:31.820 |
but that's not really the way it should work either. 00:33:38.280 |
the value of an ecosystem is when the economic value 00:33:41.620 |
generated by the ecosystem exceeds that of the platform. 00:33:44.820 |
Now, in this case, you'd actually rather have 00:33:48.340 |
50 basis points of the ecosystem than 5% of open AI, 00:34:00.460 |
- Sam should just be given like a huge option grant 00:34:07.000 |
- Yeah, and the SEC is looking into all this stuff. 00:34:11.020 |
You know, they look into a lot of things in fairness, 00:34:18.580 |
if you guys know this, but Nick, just as we close here, 00:34:21.960 |
the most insane part of open AI's LP investment agreement, 00:34:24.660 |
which is on their website, you can just search 00:34:29.180 |
The partnership exists to advance OpenAI Inc.'s mission 00:34:31.740 |
of ensuring that safe artificial general intelligence is developed, 00:34:37.780 |
and the principles advanced in OpenAI Inc.'s charter, 00:34:39.900 |
yada yada yada, take precedence over the obligation 00:34:45.180 |
and the general partner is under no obligation to do so. 00:34:53.200 |
and/or related expenses without any obligation 00:35:04.220 |
and we can do whatever we want with the profits 00:35:10.340 |
- Do you guys think that when the investors came in, 00:35:20.920 |
Or are they just sort of hand-waved over it and said-- 00:35:32.500 |
or Ernst & Young to do the full financial diligence packet. 00:35:38.980 |
of late-stage organizations document that they've done 00:35:49.160 |
you'll sign a term sheet, you'll turn it over to Deloitte, 00:35:56.740 |
And then what they do is they will furnish a report 00:35:58.740 |
that says, "Yes, this meets all the customary expectations." 00:36:01.960 |
I suspect that if these folks were doing a decent job 00:36:18.380 |
he has a really great commentary on Deloitte and KPMG 00:36:34.360 |
They go to a KPMG or a Deloitte or a Ernst & Young and say... 00:36:49.760 |
In this case, I doubt anything was amuck anyways. 00:36:52.160 |
I doubt though that they looked at the structure 00:37:06.840 |
- The other thing that's completely hypocritical here 00:37:18.400 |
and they're gonna no longer be a nonprofit, et cetera. 00:37:26.960 |
So it should be open source if they haven't hit AGI 00:37:29.560 |
and you don't think they've hit general intelligence, right? 00:37:36.600 |
Maybe you could educate the audience on what that is 00:37:40.840 |
- No, I don't think they have. - Should have shut off 00:37:48.400 |
of the models should be open source versus closed source. 00:37:56.020 |
There may have been some anecdotal conversation 00:38:00.620 |
about we're gonna make the models open source. 00:38:03.020 |
But there was a reason that that change was made 00:38:10.040 |
And those dollars need to have some return of capital 00:38:15.040 |
available to them because they're private investor dollars. 00:38:18.720 |
And so I don't think that that was necessarily, 00:38:23.020 |
that the open AI software models will be open source. 00:38:28.920 |
could probably be interpreted in a lot of different ways 00:38:33.400 |
But no, I don't think that anyone has achieved 00:38:49.320 |
is that their mission is explicitly to create AGI, 00:38:49.320 |
And I think this has raised the fear factor around AI 00:39:09.820 |
Now, I think they defined AGI in a different way. 00:39:12.240 |
They say it's something that can replace 80% of the jobs, 00:39:15.960 |
but I think we all kind of know what it really is. 00:39:33.020 |
of a bunch of really highly qualified knowledge workers, 00:39:36.560 |
and I can sit in front of a computer terminal 00:39:39.200 |
and I can say, "Let's design a mission to Mars." 00:39:41.800 |
A mission to Mars could be a 20-year engineering project 00:39:45.120 |
with hundreds of people involved to design the buildings, 00:39:47.520 |
to design the flight path, to figure out the fuel needs, 00:39:49.840 |
to figure out how you would then be able to terraform Mars. 00:39:52.920 |
And what if one person could interact with a computer 00:39:58.960 |
All of the technical detail docs could be produced, 00:40:01.400 |
all of the engineering specifications could be generated, 00:40:04.260 |
all of the operating plans, all of the dates, 00:40:10.220 |
What would otherwise take NASA or some international 00:40:13.900 |
or well-funded private company many, many decades to do, 00:40:17.620 |
a piece of software could do in a very short order. 00:40:22.140 |
like for me, poignant example of the potential 00:40:31.060 |
We could say, I want to develop a city underneath the ocean 00:40:51.380 |
general intelligence type tooling is extraordinary. 00:40:54.100 |
That one individual starts to have an entire cohort 00:40:59.240 |
of knowledge workers available at their disposal 00:41:01.580 |
to do things that we can't even imagine today. 00:41:07.400 |
a steady state of the world today, that nothing changes. 00:41:09.640 |
Therefore, a piece of software replaces all of us. 00:41:12.280 |
But the potential of humanity starts to stretch 00:41:14.360 |
into a new era that we're not really comfortable with 00:41:17.200 |
because we don't really know it or understand it yet. 00:41:19.440 |
- I'm not saying it's nefarious to want to develop AI 00:41:26.940 |
I'm saying there's something a little bit cultish 00:41:29.040 |
and weird about explicitly devoting yourself to AGI, 00:41:33.200 |
which I think in common parlance means Skynet. 00:41:37.340 |
- Yeah, it means something sentient smarter than humans. 00:41:41.740 |
- Maybe that parlance is what needs to be addressed, 00:41:56.620 |
than the smartest human being who ever lived. 00:41:59.880 |
who's in a position who said the definition of AGI 00:42:03.000 |
is very fuzzy, that there isn't a clear definition. 00:42:06.020 |
And therefore, it allows every side to kind of anchor 00:42:09.240 |
on their interpretation of what that term means 00:42:12.000 |
and therefore kind of justifies their position. 00:42:14.280 |
So I don't really feel great about like just saying, 00:42:18.180 |
We don't have a clear sense of what it means. 00:42:20.080 |
I do think if you look at some of the work that was done 00:42:22.500 |
by Anthropic and published with the Claude 3 model this week, 00:42:25.660 |
did any of you guys see the demos that were done 00:42:28.960 |
There was a guy who wrote a thesis in quantum physics 00:42:34.820 |
And he asked Claude 3 to solve this problem set 00:42:47.840 |
And that's the sort of thing that very few people on earth 00:42:52.020 |
And the Claude 3 model was able to kind of recreate 00:42:54.760 |
the basis, the buildup and then the output of his thesis 00:43:02.840 |
And it's like, oh yeah, this is the third act, 00:43:05.480 |
Like it's pretty impressive, the reasoning ability. 00:43:09.900 |
when these guys say they're gonna create AGI, 00:43:31.440 |
- I think there's a meaningful number of people 00:43:34.000 |
in the tech community who deliberately wanna give rise 00:43:40.000 |
- There's another point of view of super intelligence 00:43:41.760 |
where super intelligence means that the software 00:43:46.860 |
And as a result, the software may have its own motivations 00:43:49.680 |
to figure out how to supersede humans on earth. 00:44:09.200 |
that like, you know, we need to maintain human supremacy, 00:44:21.720 |
but Mustafa came up with the modern Turing test, 00:44:35.040 |
He said the IKEA test, the flat pack furniture test, 00:44:40.840 |
where an AI views the parts and instructions of an IKEA flat-pack product 00:44:43.120 |
and controls a robot to assemble the furniture correctly. 00:44:48.680 |
It's now time to do just a quick little congratulations 00:45:01.760 |
Sundeep is the first person to collect all four besties. 00:45:12.640 |
and we thought we would give him his flowers. 00:45:14.640 |
Sonny, you want to tell us what happened this week 00:45:23.340 |
- Yeah, well, you know, with your guys' support 00:45:25.420 |
and, you know, we've been growing our company 00:45:30.900 |
and we've been working with them for a couple of months 00:45:32.880 |
and all the hype that you've seen has been built on 00:45:37.640 |
building the cloud offering, the API offerings. 00:45:40.280 |
And so, you know, we've decided to merge with them 00:45:44.240 |
And all the besties are now not only shareholders 00:45:48.120 |
in Definitive previously, but now shareholders in Groq. 00:46:20.760 |
from voice to real-time translation of web pages. 00:46:35.960 |
that developers saw when we went from dial-up internet 00:46:40.040 |
and from using like traditional APIs for LLMs 00:46:45.760 |
- You guys support the latest Anthropic models 00:46:48.400 |
that they just launched that seemed pretty kick-ass. 00:46:51.240 |
- No, we don't have those yet, but we're in just, you know, 00:46:53.820 |
we're having discussions with everyone out there 00:46:57.960 |
Right now, what we've done given all the demand 00:46:59.940 |
is we've kind of limited it to Llama 2 70B and Mixtral. 00:47:04.840 |
that we make available in private mode for folks. 00:47:09.320 |
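For anyone who wants to try the hosted models he mentions, here is a minimal sketch of calling Groq's OpenAI-compatible chat endpoint. The URL, model id, and response shape follow the OpenAI convention and are assumptions here, so check Groq's current docs before relying on them:

```python
import os
import requests

# Assumed OpenAI-compatible endpoint and model id; verify against
# Groq's current documentation, since both may have changed.
API_URL = "https://api.groq.com/openai/v1/chat/completions"

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"},
    json={
        "model": "llama2-70b-4096",  # one of the models mentioned above
        "messages": [{"role": "user", "content": "Why is low latency nice for LLMs?"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```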
that wants to have us, you know, give us a call 00:47:19.920 |
And it's always fun to kind of be on the journey with you. 00:47:23.020 |
This is my fourth business that I've done with Sonny Madra. 00:47:30.520 |
- No, no, no, there was a company in between. 00:47:33.800 |
because Sonny went to school where I grew up. 00:47:37.080 |
I grew up there and we met through a mutual friend 00:47:53.760 |
Yeah, I mean, whatever I had, I didn't have much. 00:48:06.760 |
- Well, I mean, I've been in two of Sundeep's companies 00:48:12.020 |
And then Extreme Labs, actually, to thank you again. 00:48:14.920 |
You sponsored a lot of my events 10 years ago 00:48:23.400 |
who I work with on this podcast keep winning. 00:48:30.580 |
It means a lot to us, you guys getting the developers 00:48:33.520 |
and keep putting it out there and keep us honest as well. 00:48:37.360 |
So if we're not doing something right, let us know. 00:48:40.840 |
- Sax, you wanna say something motivational to Sonny here? 00:48:44.080 |
I know you always have a great motivational word 00:48:46.380 |
for your friends, something you say that just gives people 00:48:51.460 |
Go ahead, Sax, you always have something kind to say. 00:48:53.560 |
- We're gonna ship 20 million into your safe note. 00:48:56.400 |
I know you closed it, but we're gonna pry it open 00:49:01.960 |
- Wait a second, if you're prying open that safe, 00:49:06.800 |
I know a guy with a podcast who's got 800,000 followers, 00:49:14.120 |
- I don't know, we did a really big announcement 00:49:15.720 |
with the Saudis this week, so that price might be up. 00:49:18.480 |
- So let us ship something into that safe note 00:49:20.340 |
so we actually have some real skin in the game. 00:49:28.400 |
- Jason wants in for 500, by that he means $500, 00:49:32.600 |
- No, I got a gift card for you from Starbucks. 00:49:37.380 |
I think I got like 150,000 in free credits. 00:49:39.220 |
- J-Cal's gonna redeem that Starbucks gift card. 00:49:48.760 |
All right, everybody, thanks to Sundeep for jumping on. 00:50:01.960 |
You may know about the Apple versus Epic Games saga. 00:50:06.440 |
Epic Games is planning to create a custom app store on iOS 00:50:10.360 |
because Europe's DMA, the Digital Markets Act, 00:50:19.120 |
So Epic created a developer account based in Sweden 00:50:22.080 |
and Apple actually approved the account two weeks ago. 00:50:30.160 |
the reason they terminated the account was because Epic's CEO 00:50:32.100 |
publicly criticized their DMA compliance plan. 00:50:35.520 |
Additionally, on Monday, Apple was fined 2 billion 00:50:40.800 |
and was forced to remove its anti-steering rules 00:50:45.920 |
Basically, Apple has been restricting music apps 00:50:47.640 |
from informing users about pricing and discounts 00:50:50.600 |
and the European Commission considered this anti-competitive 00:51:03.600 |
about how they can't charge 30% more, yada yada. 00:51:06.520 |
Sax, you've spoken about Apple's monopoly before. 00:51:13.040 |
and then we'll get into Apple's wider problems. 00:51:16.640 |
- I mean, did you just say that Apple booted Epic 00:51:20.740 |
from their app store because they didn't like 00:51:24.240 |
I mean, talk about-- - Their feelings were hurt. 00:51:26.420 |
- Well, like, they're violating Epic's free speech 00:51:47.840 |
'cause you don't like their criticism of you. 00:51:53.280 |
exactly what everyone's been saying about Apple, 00:51:55.660 |
which is they're too powerful and heavy-handed. 00:52:13.880 |
- I think it's the beginning of the decay of Apple. 00:52:26.520 |
- The thing is Apple for the last couple of years 00:52:36.440 |
Maybe they can grow by a couple of percentage points 00:52:38.440 |
more than that, but they are effectively levered to GDP. 00:52:42.160 |
Meaning when you look at a Facebook or an Nvidia, 00:52:56.400 |
So that's not super great for its future prospects 00:52:58.760 |
unless it can expand the surface area of where they operate. 00:53:09.880 |
And they just announced that they've killed a project 00:53:11.960 |
in one of those areas, which is autos, right? 00:53:29.880 |
And Nick and I were talking about it this week. 00:53:32.800 |
And what was interesting about Buffett's letter is that 00:53:35.500 |
you can tell when Buffett has gotten disengaged 00:53:48.060 |
this is the number of times Apple was mentioned. 00:53:50.060 |
And just to be clear, what I mean by mentions 00:53:54.780 |
What I mean is when Warren actually explicitly mentions it 00:54:11.500 |
And now what you can start to see is this shrinking. 00:54:14.660 |
And it's gone from basically a bunch of times 00:54:19.140 |
but he mentioned it in the context of talking positively 00:54:26.260 |
but just mentioning that they were not as large 00:54:29.860 |
That's the only mention in this year's annual letter. 00:54:33.440 |
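The "mentions" metric Chamath describes is easy to reproduce: count only explicit, whole-word references to Apple in each year's shareholder letter. A trivial sketch, with a hypothetical directory of plain-text letters:

```python
import re
from pathlib import Path

# Hypothetical layout: one plain-text Berkshire letter per year,
# e.g. berkshire_letters/2023.txt. Counts whole-word "Apple" only,
# so it may still overcount vs. Chamath's stricter "explicit" standard.
for letter in sorted(Path("berkshire_letters").glob("*.txt")):
    text = letter.read_text(encoding="utf-8")
    mentions = len(re.findall(r"\bApple\b", text))
    print(f"{letter.stem}: {mentions}")
```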
What's interesting about that is the last time 00:54:44.140 |
I think he was able to weather the vicissitudes 00:54:58.200 |
And the number of times it basically was mentioned 00:55:18.180 |
were all the Japanese trading companies that Buffett owns, 00:55:23.300 |
So that is a person that understands the economy, 00:55:27.440 |
I think, better than anybody else in the world. 00:55:29.940 |
And so if you're basically taking a lever bet 00:55:34.620 |
and the person that understands the economy the most 00:55:42.700 |
antitrust rules, killing projects in trillion-dollar TAMs. 00:56:00.100 |
And let's just monitor when he stops talking about them. 00:56:03.900 |
I think this is really just about peak iPhone. 00:56:06.900 |
If you look at the majority of their revenue, 00:56:10.660 |
which has become massively profitable over time. 00:56:17.540 |
And they're starting to make their money from services. 00:56:40.900 |
and then I would buy every like medium upgrade, 00:56:44.380 |
like when they would do like a 12 and then a 12S 00:56:51.560 |
And I just forgot to buy the 14, didn't need it. 00:56:57.020 |
I really couldn't tell the difference between them. 00:57:06.500 |
and I have family members who would take two generations off, 00:57:28.820 |
- You go to general in your settings and about, 00:57:41.420 |
And these things have gotten absurdly expensive. 00:57:49.060 |
- I don't like the hassle of having to reset everything 00:57:59.820 |
- But then on my apps, I have to re-log in and-- 00:58:13.940 |
I've only been on Macintosh and Apple products since then. 00:58:31.660 |
If you worked at Google, were you kind of forced 00:58:35.500 |
Or did you just keep your iPhone in your pocket 00:58:38.260 |
- There weren't a lot of Androids on the market 00:58:40.620 |
when I was at Google, because we acquired the company 00:58:43.900 |
when I was there, and oh, we acquired Andy Rubin's company 00:59:10.620 |
but on a global basis, iOS only has a 27% market share 00:59:18.100 |
Android is 72%, and all other is less than 1%. 00:59:22.180 |
- But look at this, it's all the future GDP is on Android. 00:59:29.660 |
- But I think this is one counter to your point, Chamath, 00:59:32.300 |
that the macro driver is as the economic position, 00:59:36.740 |
the GDP per capita scales in these BRICS nations, 00:59:44.380 |
which are generally some multiple of the Android devices 00:59:52.260 |
if the emerging markets continue to grow GDP per capita 00:59:55.180 |
and iOS continues to be the superior product, 00:59:58.080 |
you'll see Apple able to steal into more share over time. 01:00:01.740 |
But how do you do that when you have messaging groups 01:00:04.620 |
on Android, when you have photos in Google Photos? 01:00:07.860 |
Does the switching cost stop that, do you think? 01:00:13.680 |
They're down to $15 to $25 for these Android phones. 01:00:18.860 |
- That's the difference, right, to these Android devices. 01:00:24.060 |
so they need to figure out some way to grow superior to that. 01:00:31.740 |
- A company that generally has a bunch of cash flow 01:00:36.500 |
being generated by some set of products today, 01:00:42.040 |
the market share for those products could be eaten away, 01:00:50.280 |
The technology value arises from the value of the brand 01:00:52.960 |
that you can launch new products, leverage your brand, 01:00:56.100 |
leverage the sale of new services, and new products. 01:01:01.860 |
the challenge Apple is facing is that the pool of options, 01:01:07.700 |
that you would get a new product coming out of Apple 01:01:09.860 |
is shrinking with the car being taken out of that pool now. 01:01:19.540 |
I feel like there's gonna be a market for that device, 01:01:24.100 |
It has to become more cheap to be more ubiquitous. 01:01:26.980 |
- The thing that a lot of these companies confront, though, 01:01:29.380 |
is that you can also grow inorganically, right? 01:01:31.720 |
You don't necessarily have to incubate these projects. 01:01:34.580 |
We can remember the moment Apple had a chance to buy Tesla, 01:01:44.300 |
pick your company that would take a sweet acquisition offer, 01:01:51.460 |
I'm not saying that these companies are good or not good. 01:02:02.920 |
if they are proving that they can't execute internally, 01:02:06.420 |
the market is going to demand that they prove 01:02:12.460 |
- Yeah, I think the challenge- - And if they don't do that, 01:02:16.520 |
- The challenge is that there's a certain discipline 01:02:21.000 |
and quality to the products and the businesses 01:02:26.060 |
And it's very hard to see that in other markets. 01:02:28.260 |
I mean, why would they go buy a money-losing, 01:02:35.840 |
- It depends on the reasons Project Titan failed. 01:02:42.940 |
that the reason that they abandoned the project 01:02:48.580 |
how to go from where they were to a full production vehicle 01:02:54.040 |
And then when it was proposed that they step down 01:02:56.140 |
and just launch a level three autonomous vehicle, 01:02:58.700 |
everybody said no, that it wasn't disruptive enough. 01:03:02.640 |
the thing that they made a decision about there 01:03:05.440 |
was not going into a market because of regulation, 01:03:10.240 |
And I don't think that that's necessarily a smart decision. 01:03:13.160 |
They would have been better off going into the car market, 01:03:18.560 |
They probably, just like they were able to do in music, 01:03:24.240 |
especially if they stepped in there with their rigor. 01:03:39.400 |
Services seems like the best place for them to make money, 01:03:46.360 |
- My idea was just that they should launch a huge competitor 01:03:50.600 |
So if you look, every developer that Apple has, 01:03:54.360 |
theoretically could have been running on an Apple, 01:03:59.160 |
but they should be running in an Apple cloud. 01:04:02.520 |
and it would have made a ton of sense for app developers 01:04:14.920 |
but here's a bunch of subsidized access to hardware 01:04:19.660 |
And in this AI shift, they can still do that, 01:04:22.240 |
where now people are chipping away at the 30%, right? 01:04:25.880 |
People are saying, "Well, I can just build around you." 01:04:29.120 |
They need to do something and services, Jason, 01:04:36.640 |
- Yeah, I mean, they have all the app developers 01:04:40.200 |
They just email them and say, "Hey, here's your free storage." 01:04:42.400 |
I mean, they could just slowly add features, right? 01:04:48.040 |
I think it's called Maggi or Magpie or something, 01:04:52.400 |
- Well, look, if they could just make Siri work 01:05:04.460 |
I don't think you need to buy a new iPhone for that 01:05:07.240 |
but just getting Siri to work would be a big win. 01:05:12.160 |
- I think that's where they're going with their silicon, 01:05:13.960 |
and they just announced the M3 on the MacBook Air. 01:05:18.000 |
and I think it's going to power LLMs locally on the devices, 01:05:25.640 |
and in Apple Photos, there's now on certain images, 01:05:29.360 |
if you swipe through them, you'll see the little AI icon. 01:05:32.400 |
When you click it, it tells you things in the photo, 01:05:38.120 |
And so they're already subtly adding these features. 01:05:48.140 |
or whatever type of dog you got, you know, Labrador, 01:06:01.120 |
to get actually good AI built into the software, 01:06:05.200 |
then everyone's going to have to upgrade for that. 01:06:20.240 |
and figure out how to customize them for their own products. 01:06:25.120 |
All right, let's go to issue four, TikTok bipartisan ban. 01:06:34.280 |
introduced a bill that would effectively ban TikTok. 01:06:36.760 |
In the House this week, the bill is officially called the Protecting Americans from Foreign Adversary Controlled Applications Act. 01:06:43.120 |
Gives ByteDance 165 days to divest from TikTok. 01:06:53.320 |
It would make it illegal for companies like Apple and Google 01:07:01.960 |
and they claim that they're headquartered in Singapore. 01:07:06.960 |
or do work at TikTok, and they said that's nonsense to me. 01:07:11.160 |
The company said it has not and will not share user data 01:07:23.280 |
And then in 2022, ByteDance admitted that it accessed 01:07:26.040 |
IP addresses and data of journalists covering TikTok 01:07:31.040 |
to match them against ByteDance employees, obviously to find leakers. 01:07:33.940 |
And ByteDance claims they fired the people involved, 01:07:36.720 |
Last year, a former head of engineering of ByteDance US 01:07:39.280 |
said CCP members had god mode access to user data in 2018. 01:08:03.800 |
Do you think the US is crazy for allowing this product? 01:08:06.600 |
Here in the United States, when we're not allowed to put 01:08:09.960 |
Twitter, Facebook, Instagram, pick your social network, 01:08:16.880 |
Look, if it's true that TikTok is sharing data with the CCP, 01:08:23.620 |
within its rights to either ban it or cause it to be divested. 01:08:28.360 |
And I personally like the divestiture option. 01:08:30.720 |
I mean, that's what Trump was suggesting during his term. 01:08:33.520 |
Because we don't, I think, in the United States like to just 01:08:36.560 |
essentially confiscate or destroy people's property. 01:08:45.560 |
that we know is completely separate from and won't 01:08:58.400 |
But it was weirdly prohibiting Americans from using VPNs 01:09:03.240 |
and gave the government the right to go after Americans 01:09:06.920 |
So this one seems cleaner and better and more narrowly 01:09:12.040 |
Now, at the beginning of my response to you, I did say if. 01:09:19.680 |
that it's true that TikTok is sharing data with the CCP. 01:09:25.160 |
But I just want to confirm that that is the case. 01:09:33.200 |
But since we do have a concept of due process in America, 01:09:37.100 |
I do think some evidence should be provided that that actually 01:09:45.000 |
But Chamath, if the CCP is spying on their own citizens, 01:09:48.480 |
what are the chances that they wouldn't take the opportunity 01:10:04.840 |
in America under CCP control or this kind of influence 01:10:09.520 |
I think you're asking the exact right question. 01:10:22.520 |
Basically what happened was that the DOJ filed an indictment. 01:10:29.280 |
Actually, I think I sent you guys the actual indictment. 01:10:34.480 |
But essentially what happened is there was an engineer at Google 01:10:37.800 |
that has been charged with stealing AI secrets for China. 01:10:43.160 |
And I don't know whether he's back in China now or not, 01:10:47.080 |
but the whole point is that if it's happened at Google, where 01:10:51.680 |
there is a motivation for the Chinese intelligence apparatus 01:10:57.480 |
to infiltrate that organization and get access 01:11:04.000 |
presume by default that all of these organizations 01:11:09.120 |
And I think that that's probably a more conservative and 01:11:12.360 |
So Facebook is infiltrated, Google is infiltrated, 01:11:21.640 |
should be considered 100% certainty that this data is 01:11:35.680 |
And I think Palmer Luckey did a very good job of simplifying 01:11:39.840 |
this down to its essence, which is essentially 01:11:42.040 |
what he called the law of equivalent exchange. 01:11:43.960 |
If you want to just play this, it's like just a few seconds. 01:11:46.000 |
I was kind of frustrated that people made TikTok 01:11:49.920 |
By the way, I'm totally on the culture war side of it. 01:11:53.960 |
But I was saying, practically speaking, you should not 01:11:58.080 |
Don't talk about how it's ruining our youths' ideals. 01:12:04.040 |
We cannot allow them to sell this thing to us if we can't 01:12:15.720 |
Right, so this is what he calls the law of equivalent exchange, 01:12:19.880 |
So on the face, what I would say is, J-Cal, to add to my responses, 01:12:22.760 |
I think that the CCP, but also other intelligence 01:12:26.080 |
organizations, have infiltrated all of these big companies, 01:12:33.520 |
I'm not going to say on a whim, but I think it's accessible. 01:12:37.080 |
I think you have to deal with TikTok as a business issue, 01:12:44.640 |
And I think that that's a fair principle that we can live on. 01:12:49.840 |
Friedberg, let me use your creativity, your love of cinema. 01:12:55.600 |
If you were to use this tool, let's take the most cynical 01:13:07.000 |
Let's say in a conflict like the one going on in Ukraine 01:13:15.320 |
What could they do using the algorithm, using videos? 01:13:19.800 |
What would be the doomsday scenario for America? 01:13:24.840 |
the management of this company and tells them 01:13:36.800 |
that there was a significant surge in pro-Hamas videos 01:13:46.080 |
That's the sort of thing where you could kind of see something 01:13:49.540 |
that sets an opinion that may be disruptive to the social fabric, 01:14:01.160 |
Unlike Facebook and other places where there's a linear feed 01:14:03.840 |
where you can scroll up and select what you want to watch, 01:14:06.160 |
as you know, TikTok has already lined the videos up. 01:14:13.280 |
So the ranking really matters in terms of viewership on TikTok, 01:14:17.240 |
unlike a lot of other kind of select-to-play social media 01:14:38.080 |
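A toy illustration of that structural point, not TikTok's actual algorithm: in a ranked autoplay queue, a small hidden boost on one topic changes what plays first for everyone, with nothing visible in the product.

```python
# Toy ranked autoplay feed -- purely illustrative, not TikTok's system.
videos = [
    {"id": "cooking_clip",   "engagement": 0.90, "hidden_boost": 0.00},
    {"id": "sports_clip",    "engagement": 0.85, "hidden_boost": 0.00},
    {"id": "political_clip", "engagement": 0.70, "hidden_boost": 0.25},
]

# The ranker, not the viewer, decides what autoplays next.
queue = sorted(videos, key=lambda v: v["engagement"] + v["hidden_boost"],
               reverse=True)
print([v["id"] for v in queue])
# ['political_clip', 'cooking_clip', 'sports_clip']
```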
have told me I'm an idiot for saying it, but-- 01:14:45.860 |
But think about it, like people don't individually 01:14:48.340 |
go and gather data and then make an informed opinion about who 01:15:00.180 |
That's why they prevented it from happening until-- 01:15:05.120 |
But I mean, I think that's what's so nuts is that there's 01:15:08.000 |
no longer a forced discourse that kind of makes people go 01:15:13.000 |
out and choose what content they want to consume, 01:15:15.880 |
what they want to hear, debate stages, et cetera. 01:15:18.520 |
That now it's about who spends the most money 01:15:22.200 |
and that that actually influences someone's decision 01:15:24.600 |
on who to vote for is what's so compelling to me about why 01:15:29.120 |
all of these systems have such extraordinary power. 01:15:31.360 |
It's just so amazing to me that the more frequently someone 01:15:33.880 |
sees an ad, the more likely they are to buy something 01:15:40.080 |
So the more frequently you show someone a memetic on TikTok, 01:15:42.040 |
the more likely they are to vote something differently. 01:15:48.680 |
for the Chinese government to have any kind of access to it. 01:15:55.520 |
not being partisan in here at all, but we've said 01:16:02.480 |
That was suppressed, obviously, on social networks. 01:16:05.160 |
That could have been amplified in social networks, 01:16:09.160 |
Hillary Clinton's emails, Trump this, Hillary this, 01:16:16.800 |
by putting in specific, subtle information, let alone 01:16:25.400 |
So those things might not have swayed an election. 01:16:35.280 |
Is Putin suddenly pulling the strings of our election? 01:16:40.040 |
This is like the biggest threat inflation ever. 01:16:50.080 |
They'll say, yeah, of course, people are brainwashed. 01:16:59.800 |
They'll say, yeah, of course, I'm not brainwashed. 01:17:02.840 |
And I believe that people are closer to telling the truth 01:17:10.160 |
better than they understand everyone else's situation. 01:17:19.640 |
through all of the channels, both online and offline, 01:17:23.680 |
Some of those data points come from advertisements, 01:17:27.080 |
but I don't think we take ads very seriously. 01:17:29.560 |
We're trained to kind of even just block out the ads. 01:17:33.000 |
When I see banner ads or even ads in my stream, 01:17:44.960 |
but there are accounts that I've chosen to follow 01:17:51.280 |
the more I disregard them and take them less seriously 01:18:01.960 |
and secretly influenced by maligned foreign actors, 01:18:07.840 |
and that entire narrative might just be completely bogus. 01:18:11.120 |
Nonetheless, I do agree that for data collection reasons 01:18:17.440 |
and reciprocity reasons, I think it's, like I said, 01:18:25.320 |
- Yeah, I don't, you know, it's actually, you said on this-- 01:18:29.120 |
Again, I think this whole disinformation narrative, 01:18:31.520 |
by the way, you wanna know why they push it so hard? 01:18:37.520 |
wants to be involved, and there are political actors 01:18:41.640 |
in the US who wanna regulate, quote, disinformation 01:18:46.760 |
I'm not talking about TikTok, I'm talking about X, 01:18:48.960 |
I'm talking about Facebook, Insta, and so on. 01:19:04.480 |
and what they were censoring. - They blocked the 01:19:07.400 |
And you've said on this program that you believe 01:19:10.840 |
- Well, I think it was, I think that actually 01:19:21.240 |
- I think it was a story that deserved to be published 01:19:26.800 |
so that the public could take that into account 01:19:48.720 |
were deprived of information that they had every right to-- 01:19:52.240 |
- Most Republicans believe if that had come out, 01:20:08.320 |
with the intelligence community published a bogus letter 01:20:10.920 |
saying that it was Russian disinformation, which it wasn't, 01:20:14.080 |
and then that caused our social media sites to suppress it. 01:20:18.080 |
So that, to me, is as concerning, if not more concerning, 01:20:22.960 |
than whatever it is that TikTok's accused of. 01:20:25.740 |
So I don't want social media companies being used 01:20:29.800 |
by the intelligence communities of either China 01:20:33.140 |
or the United States to swing or to influence our elections. 01:20:37.180 |
And we need to be equally concerned about that 01:20:45.420 |
I mean, that kind of sunk Dukakis, if you remember that. 01:20:47.620 |
I mean, media and these ads can really have a big impact. 01:20:54.460 |
whether it's Nixon sweating on TV or the Willie Horton ad, 01:20:57.840 |
there have been many moments where video can do this. 01:20:59.620 |
And I think they could be done even more subtly 01:21:03.120 |
by the Chinese by just promoting certain videos. 01:21:19.500 |
As we're taping, there'd be 75 by the time you hear this, 01:21:36.260 |
They were approved by the SEC on January 10th. 01:21:45.680 |
BlackRock's Bitcoin ETF became the fastest ETF 01:21:51.340 |
Also, as the technical crypto heads in the audience know, 01:21:58.660 |
This happens about every four years at the current pace. 01:22:08.900 |
entering circulation and can cause some swings. 01:22:15.460 |
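For the non-crypto-heads: the "halving" cuts the new-coin subsidy per block in half every 210,000 blocks, which works out to roughly four years at the ten-minute block target. A minimal sketch of the schedule:

```python
# Bitcoin block subsidy: starts at 50 BTC and halves every
# 210,000 blocks (~4 years at 10-minute blocks).
HALVING_INTERVAL = 210_000

def block_subsidy(height: int) -> float:
    """New BTC minted per block at the given block height."""
    return 50.0 / (2 ** (height // HALVING_INTERVAL))

for era in range(5):
    height = era * HALVING_INTERVAL
    print(f"block {height:>9,}: {block_subsidy(height):7.3f} BTC")
# 50 -> 25 -> 12.5 -> 6.25 -> 3.125 (the 2024 halving)
```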
just about your prediction, and you nailed it again. 01:22:20.340 |
You said this would be a big year for Bitcoin. 01:22:33.100 |
that seem to have a very good pulse and touch 01:22:45.980 |
I'm not sure whether that price is realistic or not 01:22:59.740 |
But when you see the inflows into these ETFs, 01:23:04.400 |
because it just allows every mom and pop individual 01:23:08.780 |
to buy some to the extent that they want to own it 01:23:11.220 |
or they want to speculate on it, whatever it is. 01:23:21.740 |
and it's a setup for something really constructive. 01:23:26.280 |
The other thing I'll say is that it's not just Bitcoin, 01:23:30.540 |
but as goes Bitcoin, there are a handful of other things. 01:23:34.860 |
People are now speculating that there's gonna be 01:23:41.920 |
there's probably legitimate cause to approve a few others. 01:23:44.500 |
So these things are becoming part of the financial fabric. 01:23:48.100 |
And I think that that should not be underestimated. 01:23:55.940 |
but Bitcoin, Friedberg, is incredibly resilient, 01:24:01.140 |
The fact that it hasn't broken down under stress, 01:24:04.860 |
it hasn't had a denial of service type of attack, 01:24:13.220 |
51% of the mining, or some great amount of it, 01:24:19.380 |
you have to be impressed by the fundamental technology here, 01:24:22.180 |
Friedberg, maybe you could speak to that level of success, 01:24:37.540 |
If Bitcoin price drops, transaction fees will decline, 01:24:48.440 |
I think the real question is, in the last couple of years, 01:25:04.100 |
and it's become this kind of stored value asset. 01:25:10.860 |
and since that time, I refuse to open plastic bottles. 01:25:18.540 |
I already did glass bottles in my house 'cause I'm cheap, 01:25:28.080 |
that microplastics are in our blood streams in some cases, 01:25:36.320 |
- A team of scientists in Italy collected samples 01:25:48.100 |
where you get plaque that blocks up in your carotid, 01:25:53.700 |
So a total of 304 patients agreed to have the plaque 01:25:57.360 |
that was removed from their artery submitted for analysis. 01:26:01.300 |
And then what this team did is they took that plaque 01:26:10.880 |
And they used a bunch of measurement techniques to do this, 01:26:17.960 |
'cause it's really hard to find these molecules. 01:26:25.280 |
with a mean level of 21 micrograms per milligram of plaque. 01:26:30.280 |
Roughly one per 50 is the ratio of plastic to plaque 01:26:39.020 |
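That "one per 50" is just a unit conversion on the reported mean; a quick check of the arithmetic:

```python
# Reported mean: 21 micrograms of plastic per milligram of plaque.
# 1 milligram = 1,000 micrograms.
plastic_ug_per_mg = 21
ug_per_mg = 1_000

fraction = plastic_ug_per_mg / ug_per_mg
print(f"{fraction:.1%} plastic by mass")                       # 2.1%
print(f"~1 part plastic per {1 / fraction:.0f} parts plaque")  # ~1 per 48
```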
these little nano and microplastics are accumulating- 01:26:45.660 |
these nanoplastics are accumulating in the human body. 01:27:11.580 |
This was published in the New England Journal of Medicine, 01:27:18.060 |
and that the cumulative problem is likely leading 01:27:24.860 |
from a team in Germany and Norway back in May of 2022. 01:27:29.140 |
And this team tried to figure out how plastics 01:27:31.540 |
are causing adverse health effects in the body. 01:27:34.260 |
And they had a theory like let's put little microplastics 01:27:36.720 |
or nanoplastics together with all the human cells 01:27:38.940 |
that we know, shake it up and see what happens. 01:27:41.860 |
And what they found were that these little plastic fragments 01:27:44.440 |
were binding to dendritic cells and monocytes, 01:27:54.540 |
and the pro-inflammatory signals go through the roof. 01:27:59.500 |
increases inflammation and the cascading effects 01:28:06.180 |
were measured in this set of patients in Italy. 01:28:08.620 |
So again, we're just starting to uncover these effects, 01:28:12.380 |
this concept that microplastics and nanoplastics 01:28:16.240 |
And let me just say these plastics are mostly PET, 01:28:28.700 |
And so as little tiny bits of these plastic materials 01:28:33.140 |
and end up in our water and food supply and we consume them, 01:28:39.060 |
and they may be driving inflammatory response, 01:28:45.160 |
and really studying this, understanding and analyzing it. 01:28:47.620 |
But here's another really interesting empirical data set 01:28:50.680 |
that highlights that this really is a pretty significant, 01:28:57.280 |
they had a four and a half times higher chance of dying 01:29:03.980 |
- Yeah, the thing that that study said, which was nuts, 01:29:12.280 |
were effectively acting as scaffolding for plaque. 01:29:15.080 |
So it was almost like a shim that allowed it to grow. 01:29:34.900 |
- I was drinking water from plastic water bottles this week 01:29:37.520 |
and every time I drink water out of a plastic bottle now, 01:29:45.400 |
of all cause mortality by drinking a Fiji water. 01:29:53.420 |
- Think about the cumulative effect over time. 01:29:56.820 |
- Imagine drinking water out of a plastic bottle, 01:30:02.860 |
And you do that for 20 years, you may be killing yourself. 01:30:10.440 |
- So wait, what's the, if you, if the water's-- 01:30:13.420 |
- Glass bottles, you must use, you cannot use plastic. 01:30:24.840 |
Plastic, no. Stainless steel, stainless steel is fine. 01:30:28.580 |
- Just like I mentioned when we talked about this 01:30:33.020 |
the environmental cost, the cash cost is much higher 01:30:39.600 |
to having some big, massive response to plastics 01:30:46.540 |
that we all get to live in, we get to have that choice 01:30:52.240 |
because plastics are so ubiquitous in so many things 01:30:54.940 |
and they've enabled affordability of consumer goods. 01:30:59.880 |
Honestly, like all you have to do is have glass bottles 01:31:04.380 |
Like I have a Contigo one I like, I carry it with me, 01:31:12.660 |
and we fill water bottles and put them in the fridge. 01:31:21.260 |
- We try to in our house. - That's what's so scary. 01:31:22.920 |
- We do have a French yogurt that comes in glass bottles, 01:31:25.940 |
- Ah, yes, we do the French yogurt in glass bottles. 01:31:31.760 |
- There's a French yogurt that comes in glass bottles. 01:31:35.760 |
- It's called like Le Benoit or something, Le Benoit. 01:31:38.760 |
- The water, you just install a filter system, 01:31:47.860 |
and put them in the fridge and don't throw them away. 01:31:49.860 |
And give your kids like some of these thermoses or whatever, 01:31:55.460 |
It's been a disaster in the poker game in some ways. 01:31:57.940 |
We got rid of it and there's been way more broken glass. 01:32:01.740 |
People knock over the, you know, the side tables. 01:32:06.100 |
It's been a huge pain, but I will not go back. 01:32:25.980 |
And then if it hits Chamath's threshold, which is higher, 01:32:31.940 |
- As your bestie, I would like you to stop using plastic 01:32:40.260 |
- Can all you (beep)ers come back to the Bay Area 01:32:47.840 |
I miss you guys, too, for the Sultan of Science, 01:32:50.840 |
the Chairman Dictator, and the Rain Man, David Sacks.