
E169: Elon sues OpenAI, Apple's decline, TikTok ban, Bitcoin $100K?, Science corner: Microplastics


Chapters

0:00 Bestie intros!
0:55 Elon sues OpenAI: complex structure, tax issues, damages, past comparables
37:06 OpenAI's focus on AGI, different interpretations of AGI in tech
44:46 Groq update with Sunny Madra!
49:53 Have we hit peak Apple?: Losing regulatory battles, iPhone stagnation, Buffett starts trimming
66:25 TikTok ban: New proposed House bill would force ByteDance to divest TikTok, or ban the app outright
81:08 Bitcoin hits new all-time high: impact of ETFs and an upcoming halving event
85:06 Science Corner: More data on the negative impacts of microplastics in the bloodstream


00:00:00.000 | - Chamath, who are you giggling with?
00:00:01.700 | Are you with your kids?
00:00:02.540 | What's going on here?
00:00:03.540 | You're making googly face.
00:00:04.540 | - I got a message from that.
00:00:06.420 | - Oh no, is it Hellmuth?
00:00:08.120 | - No, no, I got a message from that.
00:00:09.860 | She's so funny.
00:00:10.820 | So I was like, blah, blah, blah.
00:00:14.500 | She did a Uranus joke into my chat and I lost it.
00:00:19.500 | - It is like Friedberg always says,
00:00:22.960 | Uranus is right there at the edge of the universe.
00:00:26.760 | No, she's like, are you hungry?
00:00:28.200 | I could eat a Uranus.
00:00:30.080 | Oh no, we could go to a great restaurant, Uranus.
00:00:33.880 | They serve great meals there.
00:00:36.520 | They have a great chocolate lava cake.
00:00:38.540 | (upbeat music)
00:00:41.120 | - All right, everybody.
00:00:56.080 | Welcome back to your favorite podcast of all time.
00:00:59.800 | It's episode 169 of All In with me again,
00:01:04.040 | the chairman dictator, Chamath Palihapitiya,
00:01:06.980 | David Friedberg, your Sultan of Science
00:01:09.480 | and the Rain Man, yeah.
00:01:12.280 | Definitely David Sacks has his Moncler hat back again.
00:01:15.680 | I guess they're back in stock.
00:01:16.980 | How we doing boys?
00:01:17.820 | Welcome back to the show.
00:01:19.080 | Let's get to the docket here.
00:01:20.920 | Issue one, Elon has sued OpenAI, begun the meme wars have.
00:01:26.020 | After we finished the recording last week,
00:01:28.800 | the memes are incredible.
00:01:30.520 | After we finished recording last week,
00:01:31.760 | Elon sued Sam Altman, Greg Brockman
00:01:33.960 | and the OpenAI organization.
00:01:36.020 | He's suing for breach of contract,
00:01:38.240 | fiduciary duty and unfair competition.
00:01:40.420 | And his argument is basically OpenAI started
00:01:42.760 | as an open source nonprofit.
00:01:43.880 | As you know, he gave them something like 50 or 75 million
00:01:47.120 | both of those numbers have been quoted.
00:01:49.320 | And then of course, everybody knows OpenAI
00:01:51.560 | turned into closed AI.
00:01:52.840 | They became a closed source for profit venture
00:01:55.280 | after the tech was deployed.
00:01:57.420 | This enabled them obviously to benefit
00:02:00.260 | from nonprofit tax breaks while building the tech,
00:02:03.820 | but they now have two corporate entities.
00:02:07.260 | There's this nonprofit called OpenAI.
00:02:09.240 | There's a for-profit, it's called OpenAI Global LLC.
00:02:11.640 | That was created in 2019.
00:02:13.900 | And there's all this funky relationship between the two.
00:02:16.320 | We'll get into that
00:02:17.160 | 'cause it's kind of interesting actually.
00:02:18.520 | Elon said that if OpenAI is allowed to do this,
00:02:21.020 | then it should be the standard
00:02:22.020 | for every company going forward.
00:02:23.160 | That's an interesting point.
00:02:24.400 | You can start by donating money to a nonprofit
00:02:26.240 | and then make a for-profit.
00:02:27.160 | It seems pretty capital efficient.
00:02:28.640 | So what does Elon want to get out of this?
00:02:30.620 | According to the lawsuit,
00:02:31.680 | he wants OpenAI to open source their models.
00:02:34.380 | By the way, that's what Facebook and Apple
00:02:35.760 | are doing right now.
00:02:36.600 | So that seems more than reasonable
00:02:37.920 | since that's how the company was formed.
00:02:41.280 | And he wants to make sure that shareholders
00:02:42.720 | receive no financial benefit from OpenAI.
00:02:45.080 | And we can get more into a bunch of the OpenAI
00:02:50.260 | nonprofit status and their structure.
00:02:52.400 | It's super convoluted.
00:02:54.880 | But I want to just get your initial reaction, Sax.
00:02:58.080 | I haven't read the legal filings, Jason.
00:03:00.280 | My knowledge of the case
00:03:01.240 | is just coming from following what's on Twitter.
00:03:03.480 | And the last time I commented on one of Elon's lawsuits
00:03:07.320 | on this pod, I ended up in six hours of deposition.
00:03:10.080 | So remember that was the whole Twitter lawsuit.
00:03:15.080 | At any event, my understanding of the case
00:03:17.060 | from the tweets that are flying back and forth
00:03:21.920 | is I think Elon is making two points.
00:03:24.260 | One is that when OpenAI was set up,
00:03:28.540 | it was set up to be a nonprofit
00:03:31.260 | and to promote AI as an open source technology
00:03:35.160 | that everyone could benefit from
00:03:36.840 | and no one large tech company would control.
00:03:40.540 | In that case, Elon was primarily concerned about Google.
00:03:43.320 | I think now he's more concerned about Microsoft.
00:03:45.560 | Nonetheless, the idea was that this would be open source.
00:03:49.380 | So I think point number one is Elon feels
00:03:52.020 | like the rug was pulled out from under him
00:03:54.060 | after he donated 40 something million dollars to this.
00:03:58.040 | They completely changed what it was gonna be.
00:04:00.380 | And I think he's used the word swindled before.
00:04:02.620 | He feels like he was cheated.
00:04:05.120 | And I think usually when there's a lawsuit like this,
00:04:07.620 | it's usually because somebody does feel
00:04:09.420 | fundamentally cheated by what happened.
00:04:12.140 | So I think that's the first point.
00:04:13.700 | The second is that I think that Elon's making is,
00:04:16.320 | well, wait a second, if you can start a company
00:04:20.020 | as a nonprofit using pre-tax dollars
00:04:23.160 | and then all of a sudden convert to for-profit,
00:04:25.640 | then why wouldn't everybody do this
00:04:27.080 | in order to circumvent paying taxes?
00:04:29.160 | And I think that's another, I'd say,
00:04:30.880 | valid and interesting point that he's made.
00:04:33.020 | Now, OpenAI has responded to this
00:04:35.520 | by publishing some of Elon's emails to them.
00:04:38.220 | And they think that they have a smoking gun here
00:04:40.300 | because apparently Elon told them
00:04:43.440 | they need to raise far more money
00:04:45.280 | in order to have a chance of taking on Google/DeepMind.
00:04:49.540 | I just don't know whether that's the smoking gun
00:04:51.160 | they claim it is, but that's basically
00:04:52.940 | what they're putting out there.
00:04:54.580 | And then to add fuel to the fire,
00:04:57.460 | you've got VCs coming over the top
00:04:59.080 | debating different aspects of this.
00:05:01.060 | You've got sort of Vinod defending OpenAI.
00:05:04.020 | He was the first VC investor in it.
00:05:06.240 | And so he's been out there attacking Elon.
00:05:08.860 | And then I think Marc Andreessen is responding to Vinod.
00:05:15.060 | I don't know if he's exactly defending Elon.
00:05:17.260 | Anyway, it's turned into a whole maelstrom on X,
00:05:20.560 | and then everyone else is starting
00:05:22.340 | to put out a whole bunch of crazy memes.
00:05:25.780 | So I don't know who's gonna win this case in court,
00:05:28.380 | but the memes are definitely lit.
00:05:31.500 | And maybe the funniest one is that Elon has said
00:05:36.020 | that if OpenAI will simply change their name
00:05:38.420 | to ClosedAI, he'll drop the lawsuit.
00:05:40.800 | - I think he's serious about that.
00:05:42.280 | - I think he is serious about it,
00:05:43.660 | but I think he's making his point,
00:05:45.040 | which is you ended up doing something different
00:05:48.900 | than what you told me when you got me
00:05:51.620 | to write this big donation,
00:05:52.780 | and when we co-founded this thing together.
00:05:55.620 | So I think that sort of sums up Elon's position on this.
00:05:59.720 | - Chamath, a year from now,
00:06:02.460 | where do you think we'll be with this?
00:06:04.340 | - Well, I think there's one more thing,
00:06:06.500 | which is that they used his name pretty aggressively
00:06:11.500 | to get more money.
00:06:13.120 | And he was very instrumental
00:06:15.380 | in getting some of the early critical hires,
00:06:18.340 | particularly Ilya, which has been, I think,
00:06:20.420 | well-documented to leave Google, right?
00:07:24.300 | So that's probably, Sacks, where also some of this feeling
00:06:28.020 | that he was bamboozled comes from.
00:06:33.180 | But I think that all of those emotions matter less
00:06:37.540 | than the rule of law, which is his second point,
00:06:39.800 | which is the important point,
00:06:41.540 | irrespective of whatever one email says or another email.
00:06:44.880 | It's not great for the US tax system
00:06:47.720 | if all of a sudden a big gaping loophole
00:06:50.480 | is identified and taken advantage of.
00:06:52.620 | I think that there's a huge economic incentive
00:06:55.680 | for the state of California,
00:06:58.100 | every other state where these open AI employees lived
00:07:00.960 | and have gotten equity and now have gotten paid.
00:07:03.740 | There's an incentive for treasury.
00:07:06.000 | There's an incentive for the IRS.
00:07:09.020 | It touches a lot of aspects of very complicated tax law.
00:07:13.480 | And so I think that's why there will be a resolution
00:07:16.740 | to this case, because I think that that answer
00:07:19.820 | is very important.
00:07:20.780 | And I think it will, as he has correctly identified,
00:07:24.660 | motivate a lot of people's actions going forward.
00:07:27.800 | So independent of what the emotional situation is
00:07:31.460 | amongst all the actors in this play,
00:07:34.340 | the reality is he's identified a loophole
00:07:36.540 | and that loophole needs to get fixed.
00:07:37.860 | I'll give you a different example of this
00:07:40.100 | that's much more benign,
00:07:41.140 | but it dragged on for eight or nine years.
00:07:44.360 | At Facebook, there was a period where,
00:07:46.740 | and this was public, so I can tell you,
00:07:49.180 | we transferred a lot of our IP to Ireland at one moment
00:07:54.100 | and there was a transfer payment and whatnot.
00:07:56.620 | And then a few years afterwards when everybody realized,
00:08:00.260 | including us, and we had miscalculated,
00:08:02.160 | but when everybody realized the value of Facebook,
00:08:05.140 | that transfer payment did not seem correct.
00:08:07.380 | And it was a huge tax arb that we had facilitated, right?
00:08:12.140 | Because all of that IP sitting inside of Ireland
00:08:15.140 | gets taxed at a very different rate
00:08:18.140 | than that IP would have gotten taxed in the United States.
00:08:20.440 | And other companies had copied this.
00:08:22.580 | Long story short, the IRS sued.
00:08:24.420 | Similar to you, David, I was in years of depositions
00:08:28.380 | and interviews and all of this stuff.
00:08:30.460 | So the point is that the government really cares
00:08:33.060 | about these kinds of things
00:08:34.240 | because so much money is on the line.
00:08:36.960 | And if OpenAI turns out to be
00:08:38.980 | this multi-hundred billion dollar behemoth,
00:08:41.780 | this will get figured out in court
00:08:43.860 | because there's just too much money at stake.
00:08:46.240 | - Freeberg, any thoughts and maybe Steelman,
00:08:50.060 | if you feel like it, OpenAI's position?
00:08:53.140 | - I think we're all ignoring the documents
00:08:57.000 | that OpenAI put out yesterday,
00:08:58.700 | showing emails and interactions with Elon,
00:09:02.100 | where Elon acknowledged and recognized the necessity
00:09:05.100 | of having a for-profit subsidiary
00:09:08.120 | that could pursue the interests of the foundation
00:09:12.460 | by raising billions of dollars of outside capital.
00:09:16.000 | I think it's a really interesting set of facts
00:09:20.080 | that provides a different light on the story.
00:09:23.300 | And it was really important that OpenAI released it.
00:09:25.860 | I'm gonna share a-
00:09:27.080 | - Hold on, I didn't mention that.
00:09:29.220 | - Freeberg, I mentioned that too.
00:09:30.560 | My point is it doesn't allow you to break the law.
00:09:32.720 | - Yeah, so let me just tell you guys a little bit
00:09:35.580 | about another example.
00:09:37.720 | My sister works at a nonprofit
00:09:39.820 | called the Cystic Fibrosis Foundation.
00:09:41.980 | Cystic Fibrosis is an inherited disorder,
00:09:44.860 | affects the lungs and digestive tract.
00:09:46.760 | It's debilitating, really affects children.
00:09:50.660 | There was a nonprofit called
00:09:53.760 | the Cystic Fibrosis Foundation established in 1955.
00:09:56.360 | They did tons of work on drug development research.
00:09:58.540 | And it was a nonprofit for years until they realized
00:10:00.860 | that there needed to be a market-based system
00:10:02.680 | to create the necessary incentives
00:10:04.240 | to drive the capital needed to find a drug
00:10:07.000 | that could cure cystic fibrosis.
00:10:09.300 | And in the year 2000,
00:10:11.480 | they invested out of their foundation endowment,
00:10:14.060 | $40 million in a company called Aurora Biosciences.
00:10:18.140 | Subsequently, it was acquired by Vertex
00:10:20.280 | and they continued to invest another $60 million,
00:10:22.560 | so a total of 100 million bucks in the lifetime
00:10:25.100 | of the development of a drug that could cure this disease.
00:10:28.740 | In 2005, a drug was discovered,
00:10:32.300 | and in 2012, the FDA approved it.
00:10:35.180 | The nonprofit then sold their rights,
00:10:37.560 | their interest in this for-profit entity for $3.3 billion.
00:10:42.560 | It was an incredible return.
00:10:44.860 | It was the largest investment return
00:10:47.500 | by a nonprofit in history.
00:10:49.660 | - Who do they sell it to, back to the company?
00:10:52.700 | - I actually think they sold the rights to royalty pharma,
00:10:55.260 | who you and I know well.
00:10:57.340 | And that $3.3 billion opened up this ability
00:11:01.180 | for them to continue to make investments.
00:11:02.620 | They now fund biotech VCs
00:11:04.620 | and they make direct investments and other things.
00:11:06.680 | But it really set the benchmark for this concept
00:11:08.680 | of what's called venture philanthropy,
00:11:10.500 | where a nonprofit parent company can make investments
00:11:13.260 | in for-profits that can raise additional capital
00:11:17.060 | that's needed to pursue the broad, difficult interests
00:11:20.400 | of the nonprofit.
00:11:21.860 | And I think that this argument really kind of ties
00:11:24.180 | into what happened with OpenAI.
00:11:25.500 | And you can see it in the email exchanges with Elon,
00:11:28.060 | where he was so prescient
00:11:29.460 | that I give Elon extraordinary credit for this,
00:11:31.720 | that he saw that this is going to take billions of dollars
00:11:33.740 | a year of investment to realize the pursuit
00:11:36.260 | that OpenAI was going after.
00:11:37.620 | And there was no way that Elon was going to be able
00:11:39.420 | to generate that cash himself or that Reid or others
00:11:42.060 | were going to just be able to pony up that money.
00:11:44.060 | They needed to have some sort of for-profit vehicle
00:11:46.600 | that would allow the market to work
00:11:48.100 | and allow capitalists to find their capital
00:11:50.660 | into this organization to make this investment interest.
00:11:53.540 | So I think the real question with respect
00:11:56.340 | to is OpenAI in trouble as a foundation
00:11:59.580 | is does the nonprofit own a meaningful piece
00:12:03.100 | of the for-profit entity?
00:12:04.420 | And I don't know the answer to that.
00:12:05.420 | I don't know if you guys do.
00:12:06.640 | - Sounds like it.
00:12:07.480 | I mean, yeah, it must, right?
00:12:08.820 | - So if they do, then they have an investment interest.
00:12:10.900 | The second question is does the nonprofit parent
00:12:14.180 | still do charitable stuff?
00:12:16.280 | Because if it doesn't, then there's a real question
00:12:18.900 | on the nonprofit status.
00:12:20.220 | I think you have to have a certain amount of your assets
00:12:22.220 | deployed every year in order to qualify for 501(c)(3).
00:12:25.860 | So I think the test and the courts will likely end up being,
00:12:28.600 | look, it's totally reasonable to have a for-profit entity,
00:12:31.260 | to fund a for-profit entity.
00:12:32.420 | Other nonprofits have done it,
00:12:33.820 | particularly when you need to attract billions of dollars
00:12:36.100 | of outside capital to make it work.
00:12:37.780 | The real question is does the nonprofit
00:12:39.300 | still do nonprofit stuff?
00:12:40.540 | - That's an interesting question because,
00:12:42.560 | Friedberg, do you think that when they booted off
00:12:46.780 | the nonprofit experts from the board,
00:12:51.020 | that they may have crossed a trip wire there?
00:12:53.860 | - Yes.
00:12:54.700 | - Yeah, I'm the lawyer.
00:12:55.540 | I'm the lawyer.
00:12:56.360 | That's a great question.
00:12:57.200 | - I did a little journalism,
00:12:58.100 | and I talked to a couple of people who are in and around
00:13:02.140 | two other of these scenarios.
00:13:03.420 | One of the key issues here is that the IP,
00:13:06.180 | in your example, Friedberg,
00:13:08.020 | it wasn't like that cystic fibrosis organization
00:13:11.380 | did the drug discovery, right?
00:13:12.860 | They didn't do it, correct?
00:13:15.740 | Well, they funded it.
00:13:16.580 | - I think they handed over some research
00:13:18.720 | that was open to the public at the time,
00:13:20.300 | but they funded an independent for-profit.
00:13:23.180 | - And so what happened here was you had the IP
00:13:26.220 | and the employees of the nonprofit gifted,
00:13:29.680 | they were transferred over to the for-profit organization.
00:13:33.460 | And when the Mozilla Foundation did this
00:13:36.260 | and another company called Samasource,
00:13:38.900 | and these both have two structures,
00:13:43.060 | a nonprofit and a for-profit,
00:13:44.720 | but in both of their cases, they set up separate boards.
00:13:47.940 | And the two different boards,
00:13:49.860 | the answer to the for-profit concerns
00:13:51.740 | and the nonprofit concerns,
00:13:53.120 | they also didn't give huge chunks of equity
00:13:56.060 | to all of the employees, which then sold them in secondary.
00:14:00.180 | So what's happened here is all of the IP in all likelihood
00:14:04.220 | has been, and the wealth transfer
00:14:05.940 | has gone to all these employees.
00:14:07.660 | They've now done the secondary.
00:14:09.440 | And this actually triggers a lot of IRS investigations.
00:14:14.140 | The Mozilla Foundation, which was making the Netscape
00:14:17.060 | browser, which many of you probably have used
00:14:18.700 | for the years, they were making hundreds of millions
00:14:20.820 | of dollars a year, David, from advertising with Google.
00:14:23.780 | The IRS said, "Hey, wait a second,
00:14:24.900 | "you look like a for-profit to us."
00:14:26.700 | So they made a for-profit entity.
00:14:29.060 | All of the, and Mitchell Baker and the team over there
00:14:33.740 | set this up so all the money went back
00:14:35.700 | into the Mozilla Foundation in their nonprofit efforts.
00:14:40.460 | And they just paid above market salaries.
00:14:44.540 | But it was very important that these things be separated
00:14:47.160 | and that the employees not have equity
00:14:48.780 | and that all that money flowed back up.
00:14:50.580 | And I think this is where OpenAI
00:14:51.700 | is gonna get super tripped up.
00:14:53.320 | And I think the IRS is gonna have a field day here.
00:14:55.780 | - Here's the, Nick, can you please throw the image up?
00:14:57.820 | This is from the OpenAI website
00:15:00.380 | that explains their structure.
00:15:02.140 | So let's just speculate for a second
00:15:04.520 | 'cause we're just guessing.
00:15:05.940 | But it has been reported that Microsoft owns 49%, right?
00:15:10.940 | That's the number?
00:15:12.480 | - Of the for-profit.
00:15:14.180 | In this example, Microsoft, where it says minority
00:15:17.280 | economic interest, that number there would be 49
00:15:20.000 | and that's what they own of this OpenAI Global LLC.
00:15:24.400 | And the majority owner is this holding company.
00:15:27.040 | Okay, so that means 51% is there.
00:15:29.660 | But it says here that that 51% is owned
00:15:32.400 | by a combination of the nonprofit,
00:15:35.180 | employees and investors.
00:15:37.540 | So the question is, back to Sax's question,
00:15:40.220 | we can know very precisely how much the nonprofit owns
00:15:44.880 | by just Xing out the investors and employees.
00:15:48.900 | And I'm gonna guess it's at least 30 or 40%.
00:15:52.900 | I think that that's probably a pretty reasonable guess.
00:15:55.700 | So it means that the OpenAI Foundation
00:15:57.720 | probably owns somewhere between five to 20%.
00:16:01.940 | Is that fairly reasonable guess at the maximum?
00:16:06.420 | - That seems right, yeah.
00:16:08.220 | And so I think that they're gonna have to show
00:16:10.720 | that ownership structure and decompose all these entities
00:16:14.040 | and show the actual percentages.
00:16:15.860 | If it's a lot less than that, if it's say like a 1% thing,
00:16:21.180 | then it's a little bit more sketch, I think.
00:16:23.220 | But if it's really like 15 or 20%,
00:16:26.480 | they're still gonna have to prove that all of this was done
00:16:29.960 | in a clean way because the big thing that these guys did
00:16:33.940 | was because by creating this LP/GP structure
00:16:36.460 | and these capped profits,
00:16:38.660 | it's clear that they were trying to avoid something.
00:16:41.300 | And the question is in the discovery,
00:16:43.980 | will everybody learn what it is
00:16:46.540 | that they were trying to avoid?
00:16:47.660 | Because I've been through this,
00:16:49.500 | I think Sax, you've been through it,
00:16:50.780 | Friedberg, you were through it.
00:16:52.220 | When you set up these large vehicles
00:16:55.420 | and pooled amounts of money,
00:16:57.100 | folks come out of the woodwork
00:16:58.340 | with all kinds of convoluted structures.
00:17:00.380 | And the discipline is always to tell these lawyers
00:17:02.580 | and accountants, no.
00:17:04.820 | And they'll be like, let's set up a master feeder
00:17:06.740 | and you'll go through Bermuda and this and that.
00:17:09.980 | And it's so easy to say yes because it's very seductive
00:17:12.860 | what they're selling you,
00:17:14.380 | but it's always about trying to avoid something.
00:17:16.320 | So the real question is why this convoluted structure,
00:17:19.480 | what was it trying to avoid?
00:17:21.260 | - And where did this IP come from?
00:17:23.700 | Like all that IP, Sax, came from a nonprofit.
00:17:26.900 | All those employees worked for the nonprofit.
00:17:29.260 | Then they decided to go from an open source model to closed.
00:17:32.380 | And then by doing that, they capture all the value
00:17:35.940 | and then it's gifted to these employees
00:17:38.760 | and this investor base.
00:17:39.860 | That's where I think something doesn't seem right.
00:17:43.380 | - Well, it's important to remember
00:17:44.380 | the majority of the IP that exists at OpenAI today
00:17:48.220 | was generated after the for-profit was set up.
00:17:50.660 | The real question, was there a fair transfer at the time
00:17:53.340 | when it was set up of value into this for-profit entity?
00:17:56.680 | And my guess is that GP box that you just looked at,
00:17:59.800 | the compensation that they're earning through that GP box
00:18:02.460 | that the nonprofit earns,
00:18:04.080 | is likely meant to be kind of a fair amount
00:18:08.100 | relative to what was contributed at the time.
00:18:10.280 | The real question for me still remains,
00:18:12.420 | what other nonprofit and charitable work
00:18:15.480 | does the OpenAI 501(c)(3) actually do?
00:18:18.740 | And if the answer is nothing
00:18:20.600 | and it's just a shell nonprofit
00:18:23.000 | and under it the only thing that it owns
00:18:24.560 | is a for-profit interest,
00:18:26.340 | then I think there's a real question on what's the activity.
00:18:28.820 | And I don't mean to speak out of turn on all this stuff.
00:18:30.900 | I don't know enough,
00:18:32.760 | but I think that's the only sort of lens
00:18:34.340 | I would kind of look at this stuff through.
00:18:35.940 | - Sax, you wanna say something?
00:18:37.100 | - Well, a couple of points.
00:18:38.200 | One is, one of my contentions has been
00:18:41.220 | never innovate on structure.
00:18:43.060 | All you do is create legal problems.
00:18:44.960 | There's a tried and true way of creating a startup,
00:18:48.160 | which is a C corp.
00:18:49.860 | When you try to innovate on legal instead of product,
00:18:53.060 | it almost always backfires.
00:18:55.560 | The second point here is,
00:18:57.180 | I think OpenAI is in a little bit of a
00:18:59.620 | damned if you do, damned if you don't situation.
00:19:02.220 | To go back to Freeberg's point about the nonprofits
00:19:05.500 | sitting at the top there,
00:19:07.000 | remember what created all the problems
00:19:08.980 | when Sam briefly got fired as CEO and then came back,
00:19:12.560 | is they had these directors from the nonprofit world
00:19:16.340 | who didn't really seem to understand how startups work.
00:19:19.140 | And so either they fired Sam for no reason,
00:19:23.260 | which was kind of incompetent,
00:19:24.620 | or they had a really good reason
00:19:26.220 | but didn't communicate it properly,
00:19:27.920 | which was also incompetent.
00:19:29.340 | Either way, that whole thing was a spectacle.
00:19:32.740 | And it just kind of showed
00:19:34.180 | that there was this culture clash going on
00:19:36.580 | between the for-profit company
00:19:39.140 | and the standard way of running
00:19:41.220 | a Silicon Valley startup to maximize the outcome,
00:19:43.900 | and then this nonprofit board
00:19:45.460 | that was sitting on top of the whole thing.
00:19:47.560 | So I think that now they're in the situation
00:19:50.900 | where they've changed that nonprofit board,
00:19:54.680 | they've booted off the nonprofit people,
00:19:59.160 | and that may have been the right thing
00:20:01.140 | for the for-profit entity,
00:20:03.260 | but now it might get them in trouble
00:20:05.100 | because it lends credence to Elon's lawsuit
00:20:08.900 | that they've completely changed the original mission
00:20:12.240 | of this organization.
00:20:14.760 | Right, it was supposed to be-- - Yeah, and the mission.
00:20:16.100 | - It was supposed to be open source,
00:20:18.220 | and it was supposed to be nonprofit,
00:20:20.700 | and now it's for-profit and closed source.
00:20:22.580 | - Think about this, and again,
00:20:24.140 | we are speculating here,
00:20:25.500 | but if the mission was to be open source,
00:20:28.880 | and then you realize you've got something super valuable,
00:20:31.720 | and you close source it, create a for-profit,
00:20:34.740 | and then take all the employees and all the IP
00:20:37.380 | and put it into the for-profit, lock it down,
00:20:39.940 | break the original mission that,
00:20:41.220 | hey, this was gonna help humanity
00:20:42.920 | because we're gonna make it open source.
00:20:44.600 | So if you donate money here and we get the tax exemption,
00:20:47.560 | humanity benefits because everybody gets
00:20:50.020 | to look at that code at the same time
00:20:51.600 | and work on it and iterate.
00:20:53.400 | Now they've just basically stolen all that
00:20:56.400 | in order to enrich themselves.
00:20:57.820 | And I think then you add to it,
00:20:59.280 | Sam maybe doing some deal-making,
00:21:01.360 | which seems to be one of the triggers according to reports,
00:21:03.660 | and this OpenAI Venture Fund, et cetera,
00:21:06.480 | maybe that deal-making made the nonprofit people say,
00:21:08.520 | hey, listen, you're doing even more for-profit stuff
00:21:10.940 | with the OpenAI name.
00:21:12.360 | We're supposed to be a nonprofit here.
00:21:13.900 | - What's the venture fund thing?
00:21:15.260 | - He started a venture fund to invest in companies
00:21:17.740 | called the OpenAI Ventures Fund,
00:21:19.580 | and he was the sole owner of it,
00:21:21.540 | which they're saying now it's a clerical mistake
00:21:23.800 | or something, but he's invested in a bunch of startups
00:21:26.720 | that have unique access to the OpenAI,
00:21:30.200 | you know, I think infrastructure.
00:21:32.880 | - He's the GP on behalf of OpenAI.
00:21:34.500 | - Yes, the OpenAI.fund, yes.
00:21:36.780 | - No, but he's the GP on behalf of himself
00:21:38.820 | or on behalf of OpenAI.
00:21:40.160 | - The way the paperwork was done,
00:21:41.480 | it was on behalf of himself.
00:21:42.720 | They're now saying that that was done in error.
00:21:45.640 | So, okay, fine, maybe it was, maybe it wasn't, who knows?
00:21:47.880 | - Whoa, whoa, whoa, wait, the OpenAI fund,
00:21:52.820 | - Whose money is that?
00:21:54.100 | - Sam raised the money to create an OpenAI fund
00:21:58.620 | to invest in companies using OpenAI software.
00:22:02.160 | - Okay, so that's a different kind of issue.
00:22:04.820 | That's called a corporate opportunity issue
00:22:08.060 | where OpenAI's fund-- - Self-dealing, self-dealing.
00:22:11.140 | - It's basically a conflict of interest
00:22:13.700 | where the economic opportunity
00:22:16.180 | of the OpenAI Startup Fund should belong to OpenAI
00:22:18.580 | if Sam created a separate fund with separate LPs
00:22:22.080 | that he's the GP of and gets economics in that,
00:22:24.760 | then that's a potential usurpation
00:22:26.380 | of a corporate opportunity.
00:22:27.640 | I don't know the truth of that matter.
00:22:30.760 | I'm just saying that's a slightly different issue.
00:22:33.520 | To go back to the point-- - And it uses the OpenAI name.
00:22:35.900 | - Yeah, that's where the corporate opportunity
00:22:38.400 | becomes really explicit.
00:22:40.760 | To go back to the point you were making a minute ago
00:22:43.200 | about the employees, I just think,
00:22:45.240 | I mean, you used the word stealing.
00:22:46.580 | I think that's a really strong word.
00:22:48.080 | I mean, I think we have to compliment the employees
00:22:51.940 | of this company, if it is a company,
00:22:54.780 | including Sam, including Greg, including Ilya,
00:22:57.480 | for creating an amazing product, right?
00:23:00.460 | - Something incredible.
00:23:01.300 | - They've created something incredible.
00:23:02.140 | - Sure, absolutely.
00:23:02.960 | - And they've created, I think, an amazing ecosystem,
00:23:05.860 | and there's a lot of developers building on top of this,
00:23:08.160 | and I think no one is attacking their work.
00:23:11.660 | And I think it's a little too harsh to say
00:23:13.280 | that anyone's stealing anything
00:23:14.300 | 'cause they have created all the value.
00:23:16.480 | The question, and I think you have to include Sam in that
00:23:18.820 | and say that he's done a great job, okay, as CEO,
00:23:21.740 | because they've built something amazing.
00:23:23.260 | - And, Sax, to your point, it's not stealing
00:23:25.400 | because very smart investors
00:23:27.100 | are putting in money eyes wide open.
00:23:29.740 | - Yeah, I think that the criticism here
00:23:32.160 | is related to structure, from my standpoint.
00:23:34.820 | It's related to structure,
00:23:36.260 | and whether Elon was told something at the beginning
00:23:39.620 | that's different, it got changed,
00:23:41.700 | and clearly it got changed in a way
00:23:44.440 | that was not consistent with the terms
00:23:47.240 | under which he initially contributed all this money.
00:23:50.560 | - Yeah, mine is, I'm stating what is
00:23:55.220 | the most cynical interpretation of what happened here.
00:23:57.820 | There is the most benign and benevolent interpretation
00:24:01.820 | you could have as well,
00:24:02.660 | and maybe the truth is in between the two,
00:24:04.540 | but the series of events that occurred
00:24:06.780 | does not look good when you have a mission
00:24:08.840 | to give this intellectual property to the world.
00:24:12.300 | So, no one person benefits from it.
00:24:14.540 | All of humanity is supposed to benefit from it.
00:24:16.640 | That was the point of being open source.
00:24:20.120 | - Right. - Then you close it.
00:24:21.720 | Now, who gets the benefit if it's closed?
00:24:23.940 | - Right, that's a good point. - The employees.
00:24:25.980 | - Right, that's a good point.
00:24:26.800 | - And that's gonna be the crux of this,
00:24:28.240 | and if you really wanna-- - Or the investors.
00:24:30.800 | The investors. - The investors.
00:24:32.200 | - But that's how they were able
00:24:33.220 | to attract billions of dollars,
00:24:34.500 | 'cause you can't get billions of dollars
00:24:36.640 | to invest in something that's gonna be open sourced,
00:24:38.640 | and there's no return. - And then the employees
00:24:40.720 | sell two billion in secondary.
00:24:42.460 | So now you, if you really wanna take
00:24:45.000 | the most cynical approach to this or interpretation,
00:24:48.300 | and again, this is just one interpretation,
00:24:50.840 | they took an open source project, they closed it,
00:24:54.320 | they raised money, and then within the next two years
00:24:57.520 | on this incredible innovation,
00:24:59.500 | they sold $2 billion and put that in their pockets.
00:25:02.720 | Now, those employees, if it was a for-profit,
00:25:05.220 | obviously no problem with that,
00:25:07.280 | but they took the stated mission
00:25:09.540 | of giving this to all of humanity,
00:25:11.500 | and then took 3% or 4% of that gain
00:25:13.760 | and put it in their pockets.
00:25:14.640 | That's where I think this whole thing
00:25:16.240 | is gonna get really crazy,
00:25:18.120 | and this is where the IRS-- - I think that's a good point.
00:25:20.060 | - Yeah, and the IRS also interpreted Mozilla,
00:25:24.080 | which was just $100 million,
00:25:25.680 | and none of the employees took any of it.
00:25:27.640 | The 100 million that came out of Mozilla,
00:25:30.100 | they were just saying,
00:25:30.940 | "I think you should pay tax on that.
00:25:32.720 | "Should you pay tax on that?"
00:25:33.780 | And then they were investigated for years.
00:25:35.940 | So I think the IRS is gonna be on this like crazy
00:25:38.900 | based upon what happened to Mozilla, which was--
00:25:41.360 | - Jason, do you think individual employee sellers
00:25:45.020 | are gonna get audited by the IRS because of this?
00:25:48.400 | - I don't, I mean, we're in uncharted territory here.
00:25:51.380 | When I did some tweets and said,
00:25:52.720 | "Hey, can everybody contact me
00:25:53.800 | "who has information on these structures?"
00:25:55.780 | I was able to find the Samasource example,
00:25:59.520 | and they did it very clean,
00:26:01.080 | and there was no IP transfer
00:26:03.760 | or employees enriching themselves,
00:26:06.440 | or a God King like Sam doing all kinds of deals
00:26:09.480 | and enriching himself.
00:26:11.480 | It was a very clean structure.
00:26:12.500 | And then Mitchell Baker who set up Mozilla
00:26:14.200 | also did the same thing.
00:26:15.220 | Hey, we'll just pay the employees an extra 50K
00:26:17.240 | on their salaries,
00:26:18.560 | and they'll be a little overpaid in Silicon Valley,
00:26:20.260 | but they're not getting billions of dollars in equity,
00:26:22.920 | which again, I have no problem
00:26:24.580 | with people getting billions of dollars in equity
00:26:26.760 | in a for-profit,
00:26:27.860 | but this was supposed to benefit humanity.
00:26:30.940 | It was supposed to be open.
00:26:31.960 | If this was open source and they took the two billion,
00:26:34.520 | I would actually not have much of a problem with it
00:26:36.280 | because we could all be looking at that mission
00:26:38.960 | and then Saks and myself and Friedberg
00:26:41.000 | and anybody who wanted to use that code
00:26:42.840 | could go in there and adapt it.
00:26:44.680 | And you know what?
00:26:45.520 | There are for-profits, Apple and Meta,
00:26:48.640 | which are producing open source software.
00:26:50.640 | So if they're producing, Apple's pretty competent
00:26:53.080 | and Meta's super competent in this respect.
00:26:55.860 | They're able to do open source,
00:26:57.520 | but Sam is claiming or OpenAI is claiming
00:27:00.340 | it's too dangerous to show us the open source code.
00:27:02.480 | I call bull (beep) on that.
00:27:04.320 | Their code is not too dangerous for us to see.
00:27:06.720 | That's complete utter bull (beep)
00:27:08.120 | - So you think that there is a motivation
00:27:10.200 | for profit, basically.
00:27:11.220 | What you're saying is essentially
00:27:12.640 | that this was highly motivated by profit.
00:27:15.140 | - I think. - Versus like even,
00:27:18.120 | like some of the emails,
00:27:20.240 | the emails try to paint a picture
00:27:23.480 | of very industrious people on both sides
00:27:27.400 | who are trying to solve a very hard problem
00:27:29.840 | and dealing with a conundrum
00:27:31.300 | that's really around capital, right?
00:27:33.700 | - Yeah.
00:27:34.540 | - We need to do all this training,
00:27:35.740 | it's gonna be so expensive.
00:27:37.520 | But I think what you're saying,
00:27:38.500 | which is a deviation from that,
00:27:40.420 | is actually, come on guys,
00:27:41.380 | these are all very smart people
00:27:42.920 | and those folks saw an opportunity
00:27:44.560 | to make a ton of money and they took it.
00:27:47.300 | - I mean, that seems like what happened here to me.
00:27:50.120 | I mean, it's Occam's Razor kind of situation.
00:27:52.420 | I think they, if you look at it,
00:27:54.700 | I think they probably regretted making this a nonprofit
00:27:57.340 | and then tried to figure out a way to reverse it.
00:27:59.200 | That's actually what I think is going on here.
00:28:00.920 | And I do think there is part of it, Chamath, you're right,
00:28:02.880 | that they needed servers and they needed capacity,
00:28:05.820 | but they could have done that
00:28:07.560 | without giving the employees tons of equity,
00:28:09.440 | without selling billions of dollars,
00:28:11.020 | keeping it open source and then telling Microsoft,
00:28:14.460 | hey, if you want access to this open source or whatever,
00:28:17.140 | and they could give them-
00:28:19.580 | - You're making a very, very good point.
00:28:21.660 | And the reason is that if it was open source,
00:28:26.420 | Microsoft could have just taken it
00:28:28.420 | and done its own training and just paid for it.
00:28:31.100 | - Yeah.
00:28:32.060 | - Or make a donation to the foundation.
00:28:34.060 | - This is interesting 'cause-
00:28:36.000 | - It's a very good point, Jacob.
00:28:36.840 | - Because OpenAI, yeah, OpenAI dropped a few of the emails
00:28:41.080 | that they had with Elon.
00:28:42.200 | Obviously, those are cherry picked
00:28:43.460 | to help their case the most.
00:28:45.180 | What we don't know is what are the other emails show?
00:28:47.960 | Like, what were their alternatives
00:28:49.320 | to this particular structure?
00:28:50.740 | Could they have raised the necessary funds
00:28:52.040 | in a way that was more consistent
00:28:53.620 | with the original mission of the company?
00:28:55.480 | Could they have remained open source, for example?
00:28:59.020 | Also, are there any emails where employees talk
00:29:02.740 | about the potential benefit to them of going-
00:29:06.900 | - Private.
00:29:07.740 | - Going for profit, right?
00:29:09.260 | - Those emails have already been leaked.
00:29:10.940 | There was this thing where,
00:29:12.860 | in the OpenAI emails that were leaked,
00:29:15.020 | they talked about being able to swap this phantom equity
00:29:18.700 | in OpenAI into either YC equity,
00:29:21.100 | and then Elon said, "Oh, maybe SpaceX too,
00:29:23.140 | "but I'll have to figure that out."
00:29:24.540 | - Well, then also like Microsoft coming in and buying 49%,
00:29:28.700 | and then it's closed.
00:29:29.540 | This is the other piece that I would wanna see
00:29:31.700 | in discovery as well.
00:29:32.580 | And again, I don't have any problem with any of the people.
00:29:34.900 | I love Sam Altman, I think he's great.
00:29:36.740 | I think all the people there are great.
00:29:37.740 | I think what they've done for humanity is great.
00:29:39.700 | I just think they should have kept this thing open source,
00:29:41.640 | which was the mission.
00:29:43.060 | But then they closed-sourced it, Chamath,
00:29:45.900 | and then give 49% of it, all the weights,
00:29:48.460 | all the source code to Microsoft.
00:29:50.900 | So that, to me, was like a really...
00:29:53.420 | Like, you wanna talk about taking this nonprofit's IP,
00:29:57.560 | and then some amount of that bag
00:30:00.040 | gets given to the employees for billions of dollars,
00:30:01.900 | and then Microsoft gets 49% of all that nonprofit's effort
00:30:05.340 | to then go commercialize,
00:30:06.900 | and Microsoft has added, what, $500 billion in market cap
00:30:10.240 | since this whole thing has been announced?
00:30:12.880 | So now all that profit has been aggregated
00:30:15.680 | into Microsoft stock.
00:30:16.940 | - And I think this is where you gotta understand
00:30:18.860 | why Elon feels swindled, is because not only are we going
00:30:22.840 | from nonprofit to for-profit and open source
00:30:25.820 | to closed source, he was specifically concerned
00:30:31.160 | about all the benefits of AI accruing
00:30:31.160 | to one powerful, big tech company.
00:30:33.400 | Now, at that time, he thought it was Google.
00:30:35.400 | Now it's Microsoft.
00:30:36.700 | There's not really that much of a difference.
00:30:38.580 | He never wanted all the benefits of AI
00:30:41.740 | in the hands of one really powerful tech company.
00:30:45.780 | And Microsoft is the most, it's now, what,
00:30:48.520 | the biggest company in the world by market cap?
00:30:50.300 | So this is like the opposite of what he intended.
00:30:52.620 | - They gifted a trillion dollars to Microsoft, probably.
00:30:54.800 | There's an easy solution.
00:30:56.420 | If they are people of good faith and they're doing this
00:30:59.460 | for the right reasons, open source it.
00:31:02.160 | Just go back and open source it.
00:31:04.020 | - Jason, but to your earlier point,
00:31:05.620 | that may solve the lawsuit, and Elon may drop the lawsuit,
00:31:08.300 | but it's opened a can of worms with respect to tax
00:31:11.800 | and structuring that's much bigger than just open AI.
00:31:14.820 | - Yeah, I don't think it solves that.
00:31:15.660 | - It's the entire nonprofit industry.
00:31:16.820 | It's a tax problem for every new company.
00:31:19.300 | It's every other entrepreneur that studies this model
00:31:21.700 | and tries to replicate it for their own personal gain,
00:31:24.380 | even if that wasn't intentional here.
00:31:26.760 | So there's a whole set of issues
00:31:29.240 | that we've really cracked the egg here.
00:31:32.560 | We gotta figure out how to unscramble it.
00:31:34.600 | - There's the IRS issues, then there's what's morally right,
00:31:37.340 | and then there's Elon's beef.
00:31:38.540 | And if it is a for-profit company and Elon put in 50 million
00:31:41.180 | when it was a seed round, what would he own, Sacks?
00:31:44.900 | What would his ownership in this for-profit be?
00:31:47.540 | - Oh my God, I mean, probably--
00:31:49.300 | - 98%.
00:31:50.820 | - Okay, and so--
00:31:52.260 | - Yeah, we need to know the total size of that round.
00:31:54.360 | He put in the first 40 million.
00:31:55.580 | Do we know what other people put in?
00:31:57.240 | It was most of the money, right?
00:31:58.640 | - Yeah, it was most of the money.
00:31:59.480 | So, I mean, if you put,
00:32:00.860 | let's just put a crazy valuation on it, 500 million.
00:32:04.220 | Okay, he owns whatever, 10%--
00:32:05.620 | - No, no, at that point, it would,
00:32:09.460 | AI was not what it is today.
00:32:11.060 | - Of course not.
00:32:11.900 | - So you would've raised that like,
00:32:13.060 | you would've raised 150 on 50 pre.
00:32:15.980 | - Okay.
00:32:17.220 | - That's the way that a very capex-intensive deal
00:32:19.540 | in a space that wasn't thought to yield big outcomes,
00:32:22.860 | that's unfortunately the cost of capital.
00:32:23.700 | - Elon would own half the company, at least.
00:32:25.900 | - At least.
00:32:26.740 | - All right, so give him 20% of the company,
00:32:28.060 | it's $20 billion since even half
00:32:30.120 | of what you're saying, Sacks, right?
00:32:31.260 | - We talked about on the show when that whole fracas
00:32:34.180 | with these non-profit directors went down,
00:32:37.060 | is we said, go back and restructure the whole thing
00:32:40.660 | to make it what it always should've been,
00:32:42.620 | which is just a clean for-profit entity,
00:32:45.060 | give Elon his equity, give Sam his equity,
00:32:48.180 | 'cause it never made sense that Sam had no equity either.
00:32:49.900 | - Sam has no equity, but he's got a venture firm,
00:32:52.120 | it's all so convoluted.
00:32:53.160 | - It's a weird form of compensation in which
00:32:55.980 | they're giving him corporate opportunities
00:32:59.440 | in effect as like a type of compensation,
00:33:01.800 | when really he should just have compensation
00:33:03.720 | in the corporation, and then the corporation
00:33:06.420 | should own all of its opportunities.
00:33:08.680 | - Okay, that's a really interesting thing you just said.
00:33:10.700 | So basically, yeah, it's like he famously has no equity,
00:33:14.560 | but then he has this retained optionality
00:33:16.800 | to monetize the ecosystem.
00:33:18.440 | So even though he's not monetizing the thing,
00:33:20.720 | he gets to monetize everything around him.
00:33:23.400 | - And it seems like the board,
00:33:24.980 | or at least the new board's okay with that,
00:33:26.580 | because he is being undercompensated
00:33:28.860 | with respect to the main thing.
00:33:30.440 | So then he gets the side things,
00:33:31.820 | but that's not really the way it should work either.
00:33:34.280 | - You know, Bill Gates said to us famously
00:33:35.940 | when we were at, when I was at Facebook,
00:33:38.280 | the value of an ecosystem is when the economic value
00:33:41.620 | generated by the ecosystem exceeds that of the platform.
00:33:44.820 | Now, in this case, you'd actually rather have
00:33:48.340 | 50 basis points of the ecosystem than 5% of open AI,
00:33:51.580 | if that's true, 'cause if this thing
00:33:52.900 | could be so revolutionary, you're talking
00:33:55.340 | 10, 20, 30 trillion dollars.
00:33:57.060 | - What a mess, oy oy oy.
00:34:00.460 | - Sam should just be given like a huge option grant
00:34:03.380 | in open AI, but then open AI should own
00:34:05.900 | its own venture fund.
00:34:07.000 | - Yeah, and the SEC is looking into all this stuff.
00:34:11.020 | You know, they look into a lot of things in fairness,
00:34:12.860 | but they're looking into it, what a mess,
00:34:14.700 | and we'll keep track of it.
00:34:16.420 | The other thing that's crazy, I don't know
00:34:18.580 | if you guys know this, but Nick, just as we close here,
00:34:21.960 | the most insane part of open AI's LP investment agreement,
00:34:24.660 | which is on their website, you can just search
00:34:25.980 | for open AI LP agreement, is this part.
00:34:29.180 | The partnership exists to advance open AI Inc.'s mission
00:34:31.740 | of ensuring that safe artificial general intelligence
00:34:33.900 | is deployed and benefits all of humanity.
00:34:35.860 | The general partners' duty to this mission
00:34:37.780 | and principles advanced in OpenAI Inc.'s charter,
00:34:39.900 | yada yada yada, take precedence over the obligation
00:34:42.220 | to generate a profit.
00:34:43.980 | The partnership may never make a profit
00:34:45.180 | and the general partner is under no obligation to do so.
00:34:46.940 | The general partner is free to reinvest any
00:34:49.300 | and all of the operating entities' cash flow
00:34:51.660 | into research and development activities
00:34:53.200 | and/or related expenses without any obligation
00:34:55.760 | to the limited partners.
00:34:57.140 | And so they basically told everybody,
00:35:00.140 | Vinod and employees or whatever,
00:35:01.640 | we can just basically wipe your equity out
00:35:04.220 | and we can do whatever we want with the profits
00:35:06.020 | and you're probably gonna lose your money.
00:35:07.380 | So this structure is weird.
00:35:10.340 | - Do you guys think that when the investors came in,
00:35:13.580 | especially in this latest round,
00:35:15.100 | the $86 billion round,
00:35:16.560 | do you think they underwrote the legal risk
00:35:19.300 | inherent in the structure?
00:35:20.920 | Or are they just sort of hand-waved over it and said--
00:35:24.500 | - No, typically what happens in these deals
00:35:27.500 | is you hire somebody like KPMG or Deloitte
00:35:32.500 | or Ernst & Young to do the full financial diligence packet.
00:35:35.820 | And that typically tends to be how a lot
00:35:38.980 | of late-stage organizations document that they've done
00:35:43.080 | and manage their fiduciary responsibilities
00:35:45.240 | on behalf of their limited partners.
00:35:47.060 | So what happens is you will do a deal,
00:35:49.160 | you'll sign a term sheet, you'll turn it over to Deloitte,
00:35:52.120 | you'll turn it over to KPMG and say,
00:35:54.500 | "Please go and run this down."
00:35:56.740 | And then what they do is they will furnish a report
00:35:58.740 | that says, "Yes, this meets all the customary expectations."
00:36:01.960 | I suspect that if these folks were doing a decent job
00:36:06.960 | of running late-stage money,
00:36:09.920 | they probably sent it to those folks.
00:36:12.380 | And those folks probably produced something
00:36:14.420 | that said, "This looks fine."
00:36:15.820 | Now, if you read Buffett's letter this year,
00:36:18.380 | he has a really great commentary on Deloitte and KPMG
00:36:22.760 | and these sorts of letters,
00:36:24.740 | which is not exactly the most supportive
00:36:28.360 | is the best way to say it.
00:36:29.400 | Nick, you can find that mentioned.
00:36:31.380 | In his case, it's Deloitte and Touche.
00:36:33.000 | But typically, SACS, that's what they do.
00:36:34.360 | They go to a KPMG or a Deloitte or a Ernst & Young and say...
00:36:37.660 | And by the way, 99% of the time,
00:36:41.840 | I mean, they do really good work,
00:36:43.360 | but it's a standard structure.
00:36:45.220 | And they just wanna make sure
00:36:48.160 | that nothing nefarious is amok.
00:36:49.760 | In this case, I doubt anything was amok anyways.
00:36:52.160 | I doubt though that they looked at the structure
00:36:54.660 | and then also elevated the litigation risk
00:36:57.660 | of this getting unwound.
00:36:58.800 | I just don't see that that diligence report,
00:37:01.240 | and I've seen enough of them,
00:37:02.640 | typically has that section in it.
00:37:06.840 | - The other thing that's completely hypocritical here
00:37:08.660 | is they said when they hit AGI
00:37:11.080 | and they're gonna be like a sentient,
00:37:13.520 | artificial intelligence going on here,
00:37:16.060 | Friedberg, that they would wrap up shop
00:37:18.400 | and they're gonna no longer be a nonprofit, et cetera.
00:37:21.760 | But they're claiming they haven't hit that,
00:37:24.200 | but they closed the software.
00:37:26.960 | So it should be open source if they haven't hit AGI
00:37:29.560 | and you don't think they've hit general intelligence, right?
00:37:34.320 | Friedberg, for anything close to it.
00:37:36.600 | Maybe you could educate the audience on what that is
00:37:39.080 | and that claim that they have to-
00:37:40.840 | - No, I don't think they have. - Should have shut off
00:37:41.680 | the for-profit.
00:37:42.840 | Yeah, they haven't.
00:37:43.800 | - But I think we keep repeating this concept
00:37:48.400 | of the models should be open source versus closed source.
00:37:52.000 | Making AI for the benefit of humanity
00:37:54.000 | can be interpreted in a lot of ways.
00:37:56.020 | There may have been some anecdotal conversation
00:37:58.240 | at some point with Elon or others
00:38:00.620 | about we're gonna make the models open source.
00:38:03.020 | But there was a reason that that change was made
00:38:07.140 | along the way, which was to attract dollars.
00:38:10.040 | And those dollars need to have some return of capital
00:38:15.040 | available to them because they're private investor dollars.
00:38:18.720 | And so I don't think that that was necessarily,
00:38:20.880 | and correct me if I'm wrong,
00:38:21.720 | I don't think that's in the mission,
00:38:23.020 | that the open AI software models will be open source.
00:38:26.280 | Making AI for the benefit of humanity
00:38:28.920 | could probably be interpreted in a lot of different ways
00:38:31.040 | and we'll see.
00:38:33.400 | But no, I don't think that anyone has achieved
00:38:36.800 | this holy grail of general intelligence.
00:38:41.800 | - Yeah.
00:38:43.840 | - I think one of the more interesting
00:38:46.360 | and kind of wacky things about open AI
00:38:49.320 | is that their mission is explicitly to create AGI,
00:38:52.800 | which most people would associate
00:38:56.280 | with some sort of sci-fi dystopian outcome.
00:38:59.480 | And I think this has raised the fear factor around AI
00:39:04.480 | because they're explicitly trying to create
00:39:06.500 | the sentience that's gonna replace humanity.
00:39:09.820 | Now, I think they defined AGI in a different way.
00:39:12.240 | They say it's something that can replace 80% of the jobs,
00:39:15.960 | but I think we all kind of know what it really is.
00:39:18.980 | So I just wonder if--
00:39:20.240 | - I don't know, I think that that's like,
00:39:22.780 | that assumes a steady state in the world.
00:39:24.920 | So if you end up with a system
00:39:28.880 | and the system has all the capabilities
00:39:33.020 | of a bunch of really highly qualified knowledge workers,
00:39:36.560 | and I can sit in front of a computer terminal
00:39:39.200 | and I can say, "Let's design a mission to Mars."
00:39:41.800 | A mission to Mars could be a 20-year engineering project
00:39:45.120 | with hundreds of people involved to design the buildings,
00:39:47.520 | to design the flight path, to figure out the fuel needs,
00:39:49.840 | to figure out how you would then be able to terraform Mars.
00:39:52.920 | And what if one person could interact with a computer
00:39:55.680 | and design a plan to go and inhabit Mars?
00:39:58.960 | All of the technical detail docs could be produced,
00:40:01.400 | all of the engineering specifications could be generated,
00:40:04.260 | all of the operating plans, all of the dates,
00:40:06.620 | the amount of labor needed,
00:40:07.820 | the amount of production needed,
00:40:08.860 | the amount of capital needed.
00:40:10.220 | What would otherwise take NASA or some international
00:40:13.900 | or well-funded private company many, many decades to do,
00:40:17.620 | a piece of software could do in a very short order.
00:40:20.260 | I think that's like a really,
00:40:22.140 | like for me, poignant example of the potential
00:40:25.140 | of having these tools broadly available,
00:40:27.860 | that the potential of humanity
00:40:29.660 | starts to become much broader.
00:40:31.060 | We could say, I want to develop a city underneath the ocean
00:40:34.820 | because I want to explore more of the earth.
00:40:36.680 | I think humans need to go solve cancer,
00:40:39.840 | figure out the biologic drugs
00:40:41.200 | and the combination of biologic drugs
00:40:42.600 | that would be needed to solve cancer
00:40:44.120 | based on this patient's genotype.
00:40:46.940 | The extensibility of highly knowledgeable
00:40:50.320 | or what other people might call
00:40:51.380 | general intelligence type tooling is extraordinary.
00:40:54.100 | That one individual starts to have an entire cohort
00:40:59.240 | of knowledge workers available at their disposal
00:41:01.580 | to do things that we can't even imagine today.
00:41:04.580 | So I don't think that it is nefarious.
00:41:06.140 | It only seems nefarious because we assume
00:41:07.400 | a steady state of the world today, that nothing changes.
00:41:09.640 | Therefore, a piece of software replaces all of us.
00:41:12.280 | But the potential of humanity starts to stretch
00:41:14.360 | into a new era that we're not really comfortable with
00:41:17.200 | because we don't really know it or understand it yet.
00:41:19.440 | - I'm not saying it's nefarious to want to develop AI
00:41:22.960 | because I agree with you
00:41:24.720 | about all the extraordinary potential of it.
00:41:26.940 | I'm saying there's something a little bit cultish
00:41:29.040 | and weird about explicitly devoting yourself to AGI,
00:41:33.200 | which I think in common parlance means Skynet.
00:41:37.340 | - Yeah, it means something sentient smarter than humans.
00:41:41.740 | - Maybe that parlance is what needs to be addressed,
00:41:44.160 | which is AGI effectively enables equivalence
00:41:47.200 | to a human knowledge worker.
00:41:49.300 | And that can kind of unleash a new kind of
00:41:52.640 | set of opportunities.
00:41:53.480 | - So you think that's what it is?
00:41:54.640 | - My definition for AGI is smarter
00:41:56.620 | than the smartest human being who ever lived.
00:41:58.540 | - Yeah, I was talking to somebody this week
00:41:59.880 | who's in a position to know, who said the definition of AGI
00:42:03.000 | is very fuzzy, that there isn't a clear definition.
00:42:06.020 | And therefore, it allows every side to kind of anchor
00:42:09.240 | on their interpretation of what that term means
00:42:12.000 | and therefore kind of justifies their position.
00:42:14.280 | So I don't really feel great about like just saying,
00:42:16.800 | are we at AGI?
00:42:18.180 | We don't have a clear sense of what it means.
00:42:20.080 | I do think if you look at some of the work that was done
00:42:22.500 | by Anthropic and published with the Claude 3 model this week,
00:42:25.660 | did any of you guys see the demos that were done
00:42:27.600 | of the output of that model?
00:42:28.960 | There was a guy who wrote a thesis in quantum physics
00:42:32.240 | on a very esoteric, complicated problem set.
00:42:34.820 | And he asked Claude 3 to solve this problem set
00:42:37.500 | and it came up with his thesis.
00:42:39.480 | It was really like extraordinary.
00:42:40.880 | And this is something he's like,
00:42:41.820 | no one in the world knows this stuff.
00:42:43.180 | And he's like, I can't believe this model
00:42:45.380 | like came up with my thesis.
00:42:47.840 | And that's the sort of thing that very few people on earth
00:42:50.280 | even read or understand.
00:42:52.020 | And the Claude 3 model was able to kind of recreate
00:42:54.760 | the basis, the buildup and then the output of his thesis
00:42:58.360 | was really extraordinary.
00:42:59.200 | - It'd be like somebody writing a screenplay
00:43:00.280 | and then giving it the first two acts
00:43:01.800 | and say, guess the third act.
00:43:02.840 | And it's like, oh yeah, this is the third act,
00:43:04.320 | here's what happens.
00:43:05.480 | Like it's pretty impressive, the reasoning ability.
00:43:07.680 | - Yeah, complicated.
00:43:08.520 | - Let me ask you, like deep down,
00:43:09.900 | when these guys say they're gonna create AGI,
00:43:12.260 | what do you think they really mean
00:43:14.460 | in their heart of hearts?
00:43:16.180 | - Oh, they mean the Terminator.
00:43:17.320 | Yeah, they mean the sentient God.
00:43:19.480 | - You guys are propagating some bad (beep)
00:43:22.080 | you guys shouldn't be saying that.
00:43:23.400 | - I think that's what they think.
00:43:24.560 | - Remember what Larry Page said to Elon,
00:43:28.760 | don't be speciesist.
00:43:30.600 | - Yeah.
00:43:31.440 | - I think there's a meaningful number of people
00:43:34.000 | in the tech community who deliberately wanna give rise
00:43:37.920 | to the super intelligence.
00:43:40.000 | - There's another point of view of super intelligence
00:43:41.760 | where super intelligence means that the software
00:43:44.640 | is now more intelligent than all humans.
00:43:46.860 | And as a result, the software may have its own motivations
00:43:49.680 | to figure out how to supersede humans on earth.
00:43:53.320 | Now, the Larry Page statement,
00:43:55.840 | which I don't know firsthand,
00:43:57.600 | I read the same article you did,
00:43:59.540 | is one that a group of people might say,
00:44:01.880 | evolution is evolution.
00:44:03.260 | You know, there are people-
00:44:05.280 | - Yeah, they would say that.
00:44:06.400 | - That's right.
00:44:07.380 | And I know Elon's taking this point of view
00:44:09.200 | that like, you know, we need to maintain human supremacy,
00:44:13.720 | but my favorite test is, you know,
00:44:17.560 | there's the Turing test,
00:44:18.520 | which like you can't tell if it's a human
00:44:20.400 | or if it's a robot,
00:44:21.720 | but Mustafa came up with the modern Turing test,
00:44:25.120 | which is an AI model is given $100,000
00:44:27.600 | and has to obtain 1 million, go.
00:44:30.520 | - Kind of interesting.
00:44:31.360 | - That's a good idea.
00:44:32.760 | - The other interesting one was Gary Marcus.
00:44:35.040 | He said the IKEA test, the flat pack furniture test,
00:44:39.680 | an AI views the parts
00:44:40.840 | and instructions of an IKEA flat pack product
00:44:43.120 | and controls a robot to assemble the furniture correctly.
00:44:46.480 | - Oh, look, what's up, Sonny?
00:44:48.680 | It's now time to do just a quick little congratulations
00:44:52.440 | to our dear friend, Sundeep Madra.
00:44:54.760 | We call him Sonny, that's his nickname.
00:44:56.660 | And he's one of our poker buddies.
00:44:57.960 | He keeps building great companies.
00:44:59.980 | And in this case,
00:45:01.760 | Sundeep is the first person to collect all four besties.
00:45:04.360 | We all invested in his company, Definitive.
00:45:06.900 | And Definitive was working in AI,
00:45:10.000 | but we got some great news this week
00:45:12.640 | and we thought we would give him his flowers.
00:45:14.640 | Sonny, you want to tell us what happened this week
00:45:16.660 | with our investment in your company,
00:45:19.180 | Definitive Intelligence at definitive.io,
00:45:21.840 | I believe is your domain.
00:45:23.340 | - Yeah, well, you know, with your guys' support
00:45:25.420 | and, you know, we've been growing our company
00:45:27.500 | and we saw a really great opportunity
00:45:29.660 | to work together with Groq
00:45:30.900 | and we've been working with them for a couple of months
00:45:32.880 | and all the hype that you've seen has been built on
00:45:35.920 | the collaborations that we've done,
00:45:37.640 | building the cloud offering, the API offerings.
00:45:40.280 | And so, you know, we've decided to merge with them
00:45:43.180 | and we're super excited.
00:45:44.240 | And all the besties are now not only shareholders
00:45:48.120 | in Definitive previously, but now shareholders in Groq.
00:45:50.720 | - This is the first investment, I think,
00:45:52.660 | where we're all on the cap table, right?
00:45:54.440 | So we're all rooting in the same direction.
00:45:56.480 | What's happening in the developer side?
00:45:58.160 | How's the momentum going?
00:45:59.000 | - Well, the momentum's incredible.
00:46:00.860 | You know, we have now 16,000 plus developers
00:46:04.520 | in our self-serve playground.
00:46:07.400 | There's well over like a thousand apps
00:46:09.800 | that people have developed using the API
00:46:12.320 | and all kinds of new functionality.
00:46:14.240 | The API allows people to get a higher rate
00:46:17.120 | of throughput on tokens and low latency.
00:46:19.320 | So there's all kinds of new applications
00:46:20.760 | from voice to real-time translation of web pages.
00:46:25.240 | We're collecting them on our Discord.
00:46:26.720 | We have 3,000 people in the Discord as well.
00:46:30.360 | It is a real community that's come together
00:46:32.320 | building around Groq.
00:46:33.240 | And what I will say is sort of the same jump
00:46:35.960 | that developers saw when we went from dial-up internet
00:46:38.240 | to broadband, they're seeing that now
00:46:40.040 | and from using like traditional APIs for LLMs
00:46:42.960 | to using, you know, the ones that we offer.
00:46:45.760 | - You guys support the latest Anthropic models
00:46:48.400 | that they just launched that seemed pretty kick-ass.
00:46:51.240 | - No, we don't have those yet, but we're in just, you know,
00:46:53.820 | we're having discussions with everyone out there
00:46:56.000 | and we want to support their models.
00:46:57.960 | Right now, what we've done given all the demand
00:46:59.940 | is we've kind of limited it to Llama 2 70B and Mixtral.
00:47:03.320 | And we actually have a bunch of other models
00:47:04.840 | that we make available in private mode for folks.
00:47:07.080 | We're pretty excited.
00:47:08.240 | But if there's anyone out there
00:47:09.320 | that wants to have us, you know, give us a call
00:47:11.400 | and we'll get you going on our systems.
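For readers who want to try what Sonny is describing, here is a minimal sketch (not from the episode) of a single request against an OpenAI-compatible chat-completions endpoint of the kind Groq exposes. The base URL, model id, and environment-variable name are illustrative assumptions; check Groq's current docs and the self-serve playground for the real values.

```python
# Minimal sketch: one chat-completions call against an assumed
# OpenAI-compatible endpoint. Base URL, model id, and env var are
# illustrative placeholders, not confirmed values from the episode.
import os
import requests

BASE_URL = "https://api.groq.com/openai/v1"   # assumed base URL
API_KEY = os.environ["GROQ_API_KEY"]          # assumed environment variable

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "mixtral-8x7b-32768",        # illustrative Mixtral model id
        "messages": [{"role": "user", "content": "Say hello in three languages."}],
        "temperature": 0.2,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If the endpoint really is OpenAI-compatible, pointing an existing app at it is mostly a matter of swapping the base URL and model name, which makes the throughput and latency difference Sonny mentions easy to compare side by side.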
00:47:13.840 | - Well, Sonny, congrats to you.
00:47:16.420 | You're an incredible entrepreneur.
00:47:19.920 | And it's always fun to kind of be on the journey with you.
00:47:23.020 | This is my fourth business that I've done with Sonny Madra.
00:47:27.080 | - That's amazing.
00:47:27.920 | Extreme Labs. - Yeah.
00:47:29.080 | - Then there was a company in between.
00:47:30.520 | - No, no, no, there was a company in between.
00:47:32.500 | You have to understand, Sonny and I met
00:47:33.800 | because Sonny went to school where I grew up.
00:47:35.800 | He went to the University of Ottawa.
00:47:37.080 | I grew up there and we met through a mutual friend
00:47:39.960 | and his first company was called Spongefish,
00:47:42.200 | which I backed, I think in 2006 maybe.
00:47:46.820 | I mean, I had no money.
00:47:47.880 | I may have written a $10,000 check.
00:47:49.360 | - Would you give me $8,000?
00:47:50.920 | - 10K maybe. - Maybe less.
00:47:52.480 | - 10K check? - Maybe less, 5K.
00:47:53.760 | Yeah, I mean, whatever I had, I didn't have much.
00:47:55.880 | - I like it, scraped together.
00:47:57.440 | - And it's our fourth business.
00:47:58.680 | - Sonny's a veritable rabbit's foot.
00:48:00.680 | You're a rabbit's foot, Sonny.
00:48:02.480 | - It is true, he is.
00:48:03.320 | He's a luck bot.
00:48:04.320 | - Luck prefers the prepared.
00:48:06.760 | - Well, I mean, I've been in two of Sundeep's companies
00:48:10.560 | and I thank him for that.
00:48:12.020 | And then Extreme Labs, actually, to thank you again.
00:48:14.920 | You sponsored a lot of my events 10 years ago
00:48:17.360 | when I was coming up, so I appreciate that.
00:48:18.740 | So it's just great to see a nice guy win
00:48:21.100 | as opposed to the three miserable people
00:48:23.400 | who I work with on this podcast keep winning.
00:48:25.720 | So that's good.
00:48:26.560 | Now, finally, a good guy wins
00:48:27.560 | as opposed to the four of us.
00:48:28.720 | - Well, I appreciate all your guys' support.
00:48:30.580 | It means a lot to us, you guys getting the developers
00:48:33.520 | and keep putting it out there and keep us honest as well.
00:48:37.360 | So if we're not doing something right, let us know.
00:48:39.080 | Really appreciate you guys all.
00:48:40.840 | - Sax, you wanna say something motivational to Sonny here?
00:48:44.080 | I know you always have a great motivational word
00:48:46.380 | for your friends, something you say that just gives people
00:48:49.540 | that thrill of being in the game.
00:48:51.460 | Go ahead, Sax, you always have something kind to say.
00:48:53.560 | - We're gonna ship 20 million into your safe note.
00:48:56.400 | I know you closed it, but we're gonna pry it open
00:48:58.360 | and get some allocation.
00:49:01.960 | - Wait a second, if you're prying open that safe,
00:49:04.680 | the safe's, if there's a wedge,
00:49:06.800 | I know a guy with a podcast who's got 800,000 followers,
00:49:09.640 | maybe I could slip in a quick 500,000.
00:49:11.360 | Is that possible?
00:49:12.360 | What do you think, Sundeep?
00:49:13.280 | Let us know right now.
00:49:14.120 | - I don't know, we did a really big announcement
00:49:15.720 | with the Saudis this week, so that price might be up.
00:49:18.480 | - So let us ship something into that safe note
00:49:20.340 | so we actually have some real skin in the game.
00:49:21.840 | - David, text me.
00:49:22.960 | - All right, we'll do.
00:49:23.800 | - Well, no, but seriously, hey, Sundeep,
00:49:25.240 | wait a second, if he gets in--
00:49:26.060 | - I mean, if you're gonna let him text you--
00:49:26.900 | - I gotta get a second chip in the 500,000.
00:49:28.400 | - Jason wants in for 500, by that he means $500,
00:49:31.760 | so we'll let you into it.
00:49:32.600 | - No, I got a gift card for you from Starbucks.
00:49:34.840 | I got a $500 gift card for you and the team.
00:49:37.380 | I think I got like 150,000 in the brief credits.
00:49:39.220 | - Jacob's gonna redeem that Starbucks gift card.
00:49:42.040 | He'll sell it for 20 cents on the dollar
00:49:43.580 | and then ship it into the--
00:49:44.420 | - Yeah, let me zip it in.
00:49:45.240 | Let me zip it in, maybe I can 100X it.
00:49:46.360 | No, seriously, 500K from your boy, Jay Cal.
00:49:48.760 | All right, everybody, thanks to Sundeep for jumping on.
00:49:51.920 | Congratulations on the merger.
00:49:53.040 | - Thanks, guys.
00:49:53.880 | - All right, issue two, issue two.
00:49:55.800 | Apple is battling two major iOS developers
00:49:59.080 | and regulators are siding with the devs.
00:50:01.960 | You may know about the Apple versus Epic Games saga.
00:50:04.520 | We've talked about it here.
00:50:06.440 | Epic Games is planning to create a custom app store on iOS
00:50:10.360 | because Europe's DMA, the Digital Markets Act,
00:50:14.340 | has said that Apple now has to allow
00:50:16.560 | third-party app stores in the EU.
00:50:19.120 | So Epic created a developer account based in Sweden
00:50:22.080 | and Apple actually approved the account two weeks ago.
00:50:24.400 | Then on Wednesday, Apple flipped,
00:50:26.080 | terminated Epic's EU developer account,
00:50:28.800 | and Apple said one of the reasons
00:50:30.160 | they terminated the account was because Epic's CEO
00:50:32.100 | publicly criticized their DMA compliance plan.
00:50:35.520 | Additionally, on Monday, Apple was fined 2 billion
00:50:38.600 | by the EU's antitrust regulators
00:50:40.800 | and was forced to remove its anti-steering rules
00:50:43.960 | from music apps like Spotify.
00:50:45.920 | Basically, Apple has been restricting music apps
00:50:47.640 | from informing users about pricing and discounts
00:50:50.600 | and the European Commission considered this anti-competitive
00:50:53.580 | since Apple runs Apple Music.
00:50:55.900 | And they want Spotify to pay 30%.
00:50:58.660 | Daniel Ek, a friend of the pod,
00:51:00.520 | basically did a whole video on this
00:51:03.600 | about how they can't charge 30% more, yada yada.
00:51:06.520 | Sax, you've spoken about Apple's monopoly before.
00:51:10.540 | Your thoughts on what's happening in the EU
00:51:13.040 | and then we'll get into Apple's wider problems.
00:51:16.640 | - I mean, did you just say that Apple booted Epic
00:51:20.740 | from their app store because they didn't like
00:51:22.320 | what Epic said about them?
00:51:24.240 | I mean, talk about-- - Their feelings were hurt.
00:51:26.420 | - Well, like, they're violating Epic's free speech
00:51:29.560 | because they don't like what Epic is saying.
00:51:31.220 | I mean, this is-- - According to Epic, yes.
00:51:32.960 | - This is crazily heavy-handed by Apple.
00:51:36.060 | - Apple heavy-handed with developers, yeah.
00:51:38.040 | - Have they lost their minds?
00:51:39.180 | I mean, this is right out of power corrupts
00:51:41.600 | and absolute power corrupts, absolutely.
00:51:43.940 | I mean, whatever dispute you have with Epic,
00:51:46.320 | you don't boot them out of your app store
00:51:47.840 | 'cause you don't like their criticism of you.
00:51:50.740 | I mean, this is basically proving
00:51:53.280 | exactly what everyone's been saying about Apple,
00:51:55.660 | which is they're too powerful and heavy-handed.
00:51:58.440 | And Apple's coming along and saying,
00:51:59.800 | "Let me confirm it for you guys
00:52:01.380 | "by acting tyrannically against Epic."
00:52:04.900 | I mean, talk about a backfire.
00:52:06.480 | This seems insane to me.
00:52:08.280 | - Yeah, it's super nuts.
00:52:10.280 | Chamath, any thoughts on this
00:52:11.560 | before we get into Apple's other problems?
00:52:13.880 | - I think it's the beginning of the decay of Apple.
00:52:18.880 | Peak Apple, well, let's get into that.
00:52:21.080 | There's tons of headwinds facing Apple.
00:52:22.700 | - You can add a bunch of other things
00:52:23.920 | to the list as well, Jason.
00:52:25.460 | - Yeah.
00:52:26.520 | - The thing is Apple for the last couple of years
00:52:28.220 | has been what is effectively
00:52:30.160 | what we call a GDP plus growth company,
00:52:32.960 | which means that take GDP, two, three, 4%.
00:52:36.440 | Maybe they can grow by a couple of percentage points
00:52:38.440 | more than that, but they are effectively levered to GDP.
00:52:42.160 | Meaning when you look at a Facebook or an Nvidia,
00:52:45.260 | they're growing at 50%, 200%, 2,000%,
00:52:48.280 | whatever it is, that's not tied to GDP.
00:52:50.920 | They're just taking share.
00:52:52.700 | But Apple is a company now that grows
00:52:54.600 | as the economy of the world grows.
00:52:56.400 | So that's not super great for its future prospects
00:52:58.760 | unless it can expand the surface area of where they operate.
00:53:03.800 | And then on that dimension,
00:53:05.680 | there are a few trillion dollar markets
00:53:07.840 | that they can really penetrate.
00:53:09.880 | And they just announced that they've killed a project
00:53:11.960 | in one of those areas, which is autos, right?
00:53:13.820 | Project Titan, which was $10 billion
00:53:16.280 | turned out to be a failure.
00:53:18.580 | So all of these things I think mean to me
00:53:20.260 | that it is effectively becoming a cyclical,
00:53:24.140 | rate sensitive stock.
00:53:25.540 | And then the coup de grâce is Warren Buffett.
00:53:29.880 | And Nick and I were talking about it this week.
00:53:32.800 | And what was interesting about Buffett's letter is that
00:53:35.500 | you can tell when Buffett has gotten disengaged
00:53:41.140 | with a company based on the number of times
00:53:43.860 | he mentions it in his annual letter.
00:53:46.500 | So in this example,
00:53:48.060 | this is the number of times Apple was mentioned.
00:53:50.060 | And just to be clear, what I mean by mentions
00:53:52.260 | is not when it's included in a chart
00:53:53.780 | or part of a disclosure.
00:53:54.780 | What I mean is when Warren actually explicitly mentions it
00:53:58.340 | in a positive or even negative way,
00:54:01.300 | or he doesn't mention it at all,
00:54:02.560 | which I think rings very loudly.
00:54:04.820 | He went from basically saying,
00:54:07.180 | Apple was the absolute end all and be all.
00:54:11.500 | And now what you can start to see is this shrinking.
00:54:14.660 | And it's gone from basically a bunch of times
00:54:16.940 | to almost none.
00:54:17.860 | He did mention it once,
00:54:19.140 | but he mentioned it in the context of talking positively
00:54:21.700 | about Coca-Cola and Amex.
00:54:23.560 | And he was lauding these two positions,
00:54:26.260 | but just mentioning that they were not as large
00:54:28.860 | in comparison to Apple.
00:54:29.860 | That's the only mention in this year's annual letter.
00:54:33.440 | What's interesting about that is the last time
00:54:35.060 | that that happened was with, drum roll,
00:54:37.000 | Wells Fargo.
00:54:38.840 | Over 15 and 20 years,
00:54:40.860 | Buffett built up a huge position.
00:54:44.140 | I think he was able to weather the vicissitudes
00:54:47.280 | of the market.
00:54:48.120 | So even when the markets would contract,
00:54:49.780 | he knew when to hold onto that company
00:54:52.120 | until he realized that that company
00:54:53.820 | was not really one of his forever stocks.
00:54:56.060 | And he got out of it.
00:54:58.200 | And the number of times it basically was mentioned
00:55:01.960 | in his letter went to zero.
00:55:03.040 | So interestingly, I think this Buffett index
00:55:07.300 | is a really important one for Apple,
00:55:08.780 | which is it went from a forever holding
00:55:11.620 | that he said he would own forever
00:55:14.380 | to barely getting mentioned.
00:55:15.800 | And above Apple on that list
00:55:18.180 | were all the Japanese trading companies that Buffett owns,
00:55:21.740 | American Express, Coca-Cola.
00:55:23.300 | So that is a person that understands the economy,
00:55:27.440 | I think, better than anybody else in the world.
00:55:29.940 | And so if you're basically taking a levered bet
00:55:32.640 | to the economy as a reason to own Apple
00:55:34.620 | and the person that understands the economy the most
00:55:37.500 | has now started to pivot away,
00:55:39.260 | he started to sell in quarter four.
00:55:41.420 | And then you see all of these things,
00:55:42.700 | antitrust rules, killing projects in trillion dollar terms.
00:55:47.340 | Unfortunately, it speaks for a very bad
00:55:50.360 | next five to 10 years for this company
00:55:51.980 | unless they figure something out.
00:55:53.340 | - I think it's just such a great insight
00:55:55.180 | with the Warren Buffett mentions.
00:55:57.100 | And we should do that forward looking,
00:55:58.380 | like who is he talking about now?
00:56:00.100 | And let's monitor when he stops talking about them.
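As a rough sketch of that kind of monitoring (nothing the besties actually built), counting explicit name mentions in the plain text of an annual letter takes only a few lines. The file name and company list below are hypothetical placeholders, and a naive whole-word count like this cannot separate a real discussion from a mention inside a chart or disclosure, which Chamath's tally deliberately excludes.

```python
# Rough sketch of a "mention index": whole-word, case-insensitive counts of
# company names in an annual letter. File path and names are hypothetical.
# Caveat: this naive count also picks up chart and disclosure mentions.
import re
from collections import Counter

def count_mentions(text: str, names: list[str]) -> Counter:
    counts = Counter()
    for name in names:
        pattern = re.compile(rf"\b{re.escape(name)}\b", re.IGNORECASE)
        counts[name] = len(pattern.findall(text))
    return counts

with open("berkshire_letter_2023.txt", encoding="utf-8") as f:  # hypothetical file
    letter = f.read()

print(count_mentions(letter, ["Apple", "Coca-Cola", "American Express"]))
```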
00:56:02.540 | I have a theory about this.
00:56:03.900 | I think this is really just about peak iPhone.
00:56:06.900 | If you look at the majority of their revenue,
00:56:08.700 | it's obviously from the iPhone,
00:56:10.660 | which has become massively profitable over time.
00:56:13.380 | But the iPhone revenue has been flat
00:56:15.820 | for a couple of years now.
00:56:17.540 | And they're starting to make their money from services.
00:56:19.820 | Everybody knows Apple One, iCloud storage,
00:56:22.780 | Apple Arcade, Apple Music, Apple Plus TV.
00:56:26.080 | They've got a great collection of services
00:56:28.060 | and services revenues growing.
00:56:29.860 | But I don't think that they are filling in
00:56:33.060 | this growth problem with the iPhone.
00:56:34.740 | And this happened to me for the first time.
00:56:36.380 | And I don't know if you gentlemen
00:56:37.740 | have had the same experience,
00:56:38.660 | but I was buying every Apple phone
00:56:40.900 | and then I would buy every like medium upgrade,
00:56:44.380 | like when they would do like a 12 and then a 12S
00:56:46.740 | or an 11, 11S.
00:56:48.440 | And when I got to iPhone 13,
00:56:49.700 | I was like, this thing's kind of peaked.
00:56:51.560 | And I just forgot to buy the 14, didn't need it.
00:56:54.320 | And then I bought the 15
00:56:55.400 | and I'm sitting here on my 15 and my 13,
00:56:57.020 | I really couldn't tell the difference between them.
00:56:58.500 | And so now I'm a technologist
00:57:00.380 | who would buy it every year, the latest one,
00:57:02.820 | and I don't feel the need to upgrade it.
00:57:04.980 | And I think that a lot of people,
00:57:06.500 | and I have family members who would take two generations off,
00:57:09.780 | now they take three or four generations off.
00:57:11.620 | - I don't even know what number I'm on.
00:57:14.100 | - I don't even know either.
00:57:15.140 | It all looks the same to me.
00:57:16.660 | Okay, so I mean, that's the problem here.
00:57:18.940 | And then if you look at their roadmap,
00:57:20.860 | what is the device post the Steve Jobs era
00:57:24.900 | that they have launched?
00:57:26.220 | How do you know what device you have?
00:57:28.820 | - You go to general in your settings and about,
00:57:31.060 | I just checked it, iPhone 13 Pro.
00:57:33.780 | What version are they on?
00:57:35.380 | - 15 general.
00:57:36.220 | - I'm on 15 and I'm on 13 Pro.
00:57:37.660 | I don't even know.
00:57:38.500 | - Yeah, and you're not price sensitive.
00:57:41.420 | And these things have gotten absurdly expensive.
00:57:43.300 | So they're trying to really extract--
00:57:44.860 | - What is the latest?
00:57:46.660 | - 15.
00:57:47.860 | - 15, I'm on a 14.
00:57:49.060 | - I don't like the hassle of having to reset everything
00:57:52.180 | when I get a new phone.
00:57:53.020 | - Yes, and it's easy to do now.
00:57:55.020 | It's all in the cloud, but even that,
00:57:57.060 | going to the store, buying it, who cares?
00:57:58.980 | It's not differential.
00:57:59.820 | - But then on my apps, I have to re-log in and--
00:58:01.980 | - Friedberg, what Android phone are you on?
00:58:05.340 | - Oh, you're a Googler.
00:58:06.260 | Didn't the Googlers all have Android phones?
00:58:08.100 | Or are you your team iPhone?
00:58:11.220 | - I got the Macintosh original in 1984.
00:58:13.940 | I've only been on Macintosh and Apple products since then.
00:58:18.820 | - What iPhone are you on, be honest?
00:58:20.660 | - 15 Pro.
00:58:22.020 | - Okay, so--
00:58:22.860 | - Have you ever tried a Pixel, Friedberg?
00:58:26.300 | - I don't like it.
00:58:28.460 | But I just, I'm not used to the OS, I just--
00:58:30.540 | - What was the rule at Google?
00:58:31.660 | If you worked at Google, were you kind of forced
00:58:34.300 | to have an Android phone?
00:58:35.500 | Or did you just keep your iPhone in your pocket
00:58:37.180 | and not take out?
00:58:38.260 | - There weren't a lot of Androids on the market
00:58:40.620 | when I was at Google, because we acquired the company
00:58:43.900 | when I was there, and oh, we acquired Andy Rubin's company
00:58:47.820 | in '04, and I think the first major devices
00:58:52.820 | started to roll out in '07.
00:58:56.340 | And so this was all post my era.
00:58:58.180 | I left at the end of '06.
00:59:00.340 | - Wow, Friedberg, that's an amazing chart.
00:59:02.460 | Holy (beep)
00:59:03.300 | - Look at this chart.
00:59:04.300 | Now, here's the global statistics.
00:59:07.140 | The U.S. market share is 57% for iOS,
00:59:10.620 | but on a global basis, iOS only has a 27% market share
00:59:15.620 | in mobile operating systems.
00:59:18.100 | Android is 72%, and all other is less than 1%.
00:59:22.180 | - But look at this, all the future GDP is on Android.
00:59:25.960 | Brazil, Nigeria, India, holy mackerel.
00:59:29.660 | - But I think this is one counter to your point, Chamath,
00:59:32.300 | that the macro driver is as the economic position,
00:59:36.740 | the GDP per capita scales in these BRICS nations,
00:59:41.000 | they can start to afford to buy iPhones,
00:59:44.380 | which are generally some multiple of the Android devices
00:59:47.300 | in these markets that you see here.
00:59:49.020 | So while today Android is 72% of the market,
00:59:52.260 | if the emerging markets continue to grow GDP per capita
00:59:55.180 | and iOS continues to be the superior product,
00:59:58.080 | you'll see Apple able to steal into more share over time.
01:00:01.740 | But how do you do that when you have messaging groups
01:00:04.620 | on Android, when you have photos in Google Photos?
01:00:07.860 | Does the switching cost stop that, do you think?
01:00:11.380 | - I don't know the answer to that.
01:00:12.340 | - It's a good question.
01:00:13.680 | They're down to $15 to $25 for these Android phones.
01:00:17.420 | In India, there's a famous-
01:00:18.860 | - That's the difference, right, to these Android devices.
01:00:21.880 | - I think Apple is levered to global GDP,
01:00:24.060 | so they need to figure out some way to grow superior to that.
01:00:27.700 | So what is the market?
01:00:29.020 | What could they do?
01:00:30.300 | Services?
01:00:31.740 | - A company that generally has a bunch of cash flow
01:00:36.500 | being generated by some set of products today,
01:00:39.300 | and you assume stasis over time,
01:00:42.040 | the market share for those products could be eaten away,
01:00:44.260 | you'll see the profits per year go down,
01:00:46.580 | and a company like that will trade anywhere
01:00:48.140 | from seven to 20 times.
01:00:50.280 | The technology value arises from the value of the brand
01:00:52.960 | that you can launch new products, leverage your brand,
01:00:55.080 | leverage your distribution,
01:00:56.100 | leverage the sale of new services, and new products.
01:01:00.340 | I think, to your point, Chamath,
01:01:01.860 | the challenge Apple is facing is that the pool of options,
01:01:05.580 | the portfolio of call options
01:01:07.700 | that you would get a new product coming out of Apple
01:01:09.860 | is shrinking with the car being taken out of that pool now.
01:01:14.860 | Apple Vision Pro, a lot of question marks
01:01:17.340 | on how much it can scale.
01:01:18.700 | As you guys know,
01:01:19.540 | I feel like there's gonna be a market for that device,
01:01:22.020 | but it's a high-end device today.
01:01:24.100 | It has to become more cheap to be more ubiquitous.
01:01:26.980 | - The thing that a lot of these companies confront, though,
01:01:29.380 | is that you can also grow inorganically, right?
01:01:31.720 | You don't necessarily have to incubate these projects.
01:01:34.580 | We can remember the moment Apple had a chance to buy Tesla,
01:01:38.200 | right? - Didn't take it.
01:01:40.700 | - Then they didn't take it.
01:01:42.240 | Apple still has a chance to buy,
01:01:44.300 | pick your company that would take a sweet acquisition offer,
01:01:48.900 | Rivian, Lucid, Polestar, whatever it is.
01:01:51.460 | I'm not saying that these companies are good or not good.
01:01:53.240 | I'm just saying that Apple has the one tool
01:01:56.480 | that they've never used in their toolbox
01:01:58.340 | which is the large inorganic acquisition.
01:02:00.880 | And at this point,
01:02:02.920 | if they are proving that they can't execute internally,
01:02:06.420 | the market is going to demand that they prove
01:02:08.340 | they can use that balance sheet as a cudgel
01:02:10.380 | to go and execute it externally.
01:02:12.460 | - Yeah, I think the challenge- - And if they don't do that,
01:02:14.420 | they're gonna discount that cash to zero.
01:02:16.520 | - The challenge is that there's a certain discipline
01:02:21.000 | and quality to the products and the businesses
01:02:23.420 | that Apple produces and runs.
01:02:26.060 | And it's very hard to see that in other markets.
01:02:28.260 | I mean, why would they go buy a money-losing,
01:02:30.340 | low-margin car hardware company
01:02:33.780 | when they are proven to make high quality-
01:02:35.840 | - It depends on the reasons Project Titan failed.
01:02:38.400 | - Yeah, I think-
01:02:40.980 | - The article in Bloomberg basically said
01:02:42.940 | that the reason that they abandoned the project
01:02:45.000 | was in part that they did not know
01:02:48.580 | how to go from where they were to a full production vehicle
01:02:52.920 | that could be level five.
01:02:54.040 | And then when it was proposed that they step down
01:02:56.140 | and just launch a level three autonomous vehicle,
01:02:58.700 | everybody said no, that it wasn't disruptive enough.
01:03:01.400 | Well, if that's the case,
01:03:02.640 | the thing that they made a decision about there
01:03:05.440 | was not going into a market because of regulation,
01:03:07.680 | not because of technical capability.
01:03:10.240 | And I don't think that that's necessarily a smart decision.
01:03:13.160 | They would have been better off going into the car market,
01:03:16.040 | launching a level three vehicle
01:03:17.360 | and just letting the market play out.
01:03:18.560 | They probably, just like they were able to do in music,
01:03:21.500 | have the influence to change the laws,
01:03:24.240 | especially if they stepped in there with their rigor.
01:03:27.100 | And they're not doing that.
01:03:28.140 | So it's a little bit of a head scratcher
01:03:29.920 | what's going on over there.
01:03:30.760 | - Yeah, they need some new products.
01:03:32.120 | I mean, maybe Apple Vision Pro someday
01:03:34.420 | becomes the platform of the future,
01:03:36.420 | but they've squeezed as much revenue
01:03:38.020 | as they can out of the iPhone.
01:03:39.400 | Services seems like the best place for them to make money,
01:03:41.700 | but as we see-
01:03:43.100 | - Well, my idea for them was-
01:03:43.940 | - The app source under assault, yeah.
01:03:46.360 | - My idea was just that they should launch a huge competitor
01:03:48.660 | to AWS, Azure and GCP.
01:03:50.600 | So if you look, every developer that Apple has,
01:03:54.360 | theoretically could have been running on an Apple,
01:03:56.960 | not just on Apple's SDK and APIs,
01:03:59.160 | but they should be running in an Apple cloud.
01:04:00.780 | You could have made that claim
01:04:02.520 | and it would have made a ton of sense for app developers
01:04:04.680 | to have turnkey access to that, right?
01:04:06.760 | And they could have subsidized it.
01:04:08.060 | And by the way, to Sax's point,
01:04:09.960 | that would have been an incredible way
01:04:11.460 | to defend a 30% rev share.
01:04:13.040 | Okay, listen, I'm taking 30%,
01:04:14.920 | but here's a bunch of subsidized access to hardware
01:04:17.640 | and the market would have loved it.
01:04:19.660 | And in this AI shift, they can still do that,
01:04:22.240 | where now people are chipping away at the 30%, right?
01:04:25.880 | People are saying, "Well, I can just build around you."
01:04:29.120 | They need to do something and services, Jason,
01:04:31.280 | the most valuable thing they could do
01:04:34.360 | would be to launch a big cloud, I think.
01:04:36.640 | - Yeah, I mean, they have all the app developers
01:04:38.640 | on TestFlight ready to go.
01:04:40.200 | They just email them and say, "Hey, here's your free storage."
01:04:42.400 | I mean, they could just slowly add features, right?
01:04:45.080 | And they're doing this open source,
01:04:48.040 | I think it's called Maggi or Magpie or something,
01:04:50.800 | but they're doing an image one.
01:04:52.400 | - Well, look, if they could just make Siri work
01:04:56.560 | like the way it was supposed to using an LLM
01:04:59.880 | and have it be like snappy,
01:05:01.560 | that'd be a major upgrade to the iPhone.
01:05:04.460 | I don't think you need to buy a new iPhone for that
01:05:05.920 | 'cause it'd just be a software upgrade,
01:05:07.240 | but just getting Siri to work would be a big win.
01:05:12.160 | - I think that's where they're going with their silicon,
01:05:13.960 | and they just announced the M3 on the MacBook Air.
01:05:16.720 | It's like they're making their own silicon,
01:05:18.000 | and I think it's going to power LLMs locally on the devices,
01:05:21.480 | and that's so powerful.
01:05:22.880 | I was using iPhoto this week,
01:05:25.640 | and in Apple Photos, there's now on certain images,
01:05:29.360 | if you swipe through them, you'll see the little AI icon.
01:05:32.400 | When you click it, it tells you things in the photo,
01:05:35.640 | like that's a bulldog, that's pasta.
01:05:38.120 | And so they're already subtly adding these features.
01:05:40.320 | Now, that's a feature Google's had
01:05:42.120 | in Google Photos for five years,
01:05:44.640 | but go ahead and open your phone right now
01:05:46.280 | and search for a dog or a bulldog
01:05:48.140 | or whatever type of dog you got, you know, Labrador,
01:05:51.400 | or Golden Retriever, and then watch:
01:05:53.160 | it actually knows how to do it.
01:05:54.360 | They never announced it.
01:05:55.360 | It's just subtly being put in there,
01:05:57.700 | so there's so many opportunities.
01:05:59.200 | - If they required a hardware upgrade
01:06:01.120 | to get actually good AI built into the software,
01:06:05.200 | then everyone's going to have to upgrade for that.
01:06:07.020 | - You'd be on 16.
01:06:07.860 | - That's a pretty big motivation to upgrade.
01:06:10.280 | - There you have it, Tim Cook.
01:06:11.160 | Somebody send this clip to Tim Cook.
01:06:12.840 | Put AI chips on your phone.
01:06:15.280 | - But this isn't hard, right?
01:06:16.380 | I mean, all they gotta do is just
01:06:18.760 | take the latest open source models
01:06:20.240 | and figure out how to customize them for their own products.
01:06:23.240 | - Yeah, and they're doing it.
01:06:24.080 | They're doing it right now.
01:06:25.120 | All right, let's go to issue four, TikTok bipartisan ban.
01:06:28.640 | Will the CCP agree?
01:06:30.920 | A bipartisan group of a dozen plus lawmakers
01:06:34.280 | introduced a bill that would effectively ban TikTok.
01:06:36.760 | In the House this week, the bill is officially called
01:06:38.800 | "Protecting Americans from Foreign Adversary
01:06:41.360 | "Controlled Applications Act."
01:06:43.120 | Gives ByteDance 165 days to divest from TikTok.
01:06:46.480 | That's the parent company,
01:06:47.600 | the Chinese company that owns TikTok,
01:06:49.480 | the app that's very popular in the West,
01:06:51.600 | especially here in America.
01:06:53.320 | It would make it illegal for companies like Apple and Google
01:06:55.720 | to show TikTok in their app stores.
01:06:58.140 | As you know, TikTok, 170 million US users,
01:07:01.960 | and they claim that they're headquartered in Singapore.
01:07:05.120 | I know people who have worked at TikTok
01:07:06.960 | or do work at TikTok, and they said that's nonsense to me.
01:07:11.160 | The company said it has not and will not share user data
01:07:14.800 | with the CCP.
01:07:16.280 | In my mind, it's an obvious lie.
01:07:17.720 | In 2021, the CCP took a board seat
01:07:19.800 | on ByteDance's Beijing-based subsidiary.
01:07:22.260 | That's according to Reuters.
01:07:23.280 | And then in 2022, ByteDance admitted that it accessed
01:07:26.040 | IP addresses and data of journalists covering TikTok
01:07:29.600 | to see if they'd been in the same places
01:07:31.040 | as ByteDance employees, obviously to find leakers.
01:07:33.940 | And ByteDance claims they fired the people involved,
01:07:35.880 | yada yada.
01:07:36.720 | Last year, a former head of engineering of ByteDance US
01:07:39.280 | said CCP members had god mode access to user data in 2018.
01:07:43.420 | Sax, I can go explain more of this,
01:07:46.000 | but I just got to go to you here.
01:07:48.640 | If ByteDance is not spying on Americans
01:07:52.980 | and the CCP is on the board, that
01:07:54.800 | would make no logical sense to me.
01:07:57.120 | And why are they fighting the divestiture
01:07:59.120 | if it's just the financial reason?
01:08:00.920 | Why would they take the CCP off their board?
01:08:02.760 | Do you think it's spyware?
01:08:03.800 | Do you think the US is crazy for allowing this product?
01:08:06.600 | Here in the United States, when we're not allowed to put
01:08:09.960 | Twitter, Facebook, Instagram, pick your social network,
01:08:13.000 | in China, we have zero reciprocity here.
01:08:16.880 | - Look, if it's true that TikTok is sharing data with the CCP,
01:08:21.960 | then I think the United States is well
01:08:23.620 | within its rights to either ban it or cause it to be divested.
01:08:28.360 | And I personally like the divestiture option.
01:08:30.720 | I mean, that's what Trump was suggesting during his term.
01:08:33.520 | Because we don't, I think, in the United States like to just
01:08:36.560 | essentially confiscate or destroy people's property.
01:08:40.960 | But I think we are within our rights
01:08:42.520 | to require it to be divested to an entity
01:08:45.560 | that we know is completely separate from and won't
01:08:48.240 | cooperate with the CCP.
01:08:51.120 | This bill is better than the last bill
01:08:54.320 | we talked about that was targeted at TikTok.
01:08:56.680 | I don't know if you remember that one.
01:08:58.400 | But it was weirdly prohibiting Americans from using VPNs
01:09:03.240 | and gave the government the right to go after Americans
01:09:06.040 | who were using VPNs.
01:09:06.920 | So this one seems cleaner and better and more narrowly
01:09:10.200 | targeted at divestiture.
01:09:12.040 | Now, at the beginning of my response to you, I did say if.
01:09:17.460 | I know that everyone's just assuming
01:09:19.680 | that it's true that TikTok is sharing data with the CCP.
01:09:25.160 | But I just want to confirm that that is the case.
01:09:27.920 | Because they are denying it.
01:09:29.160 | And I can understand why people think it
01:09:31.280 | and why it might even be likely.
01:09:33.200 | But since we do have a concept of due process in America,
01:09:37.100 | I do think some evidence should be provided that that actually
01:09:40.640 | is taking place.
01:09:42.120 | - Well, we've had whistleblowers inside there.
01:09:45.000 | But Chamath, if the CCP is spying on their own citizens,
01:09:48.480 | what are the chances that they wouldn't take the opportunity
01:09:51.020 | to spy on government officials who
01:09:53.280 | have TikTok on their phone or their kids
01:09:55.760 | and get compromise on them possibly
01:09:57.960 | or no locations of people?
01:10:00.520 | What does your gut tell you?
01:10:02.480 | Is this too dangerous for us to have here
01:10:04.840 | in America under CCP control or this kind of influence
01:10:07.900 | being on their board?
01:10:09.520 | - I think you're asking the exact right question.
01:10:13.280 | I have two comments to make.
01:10:15.280 | One is, Nick, if you can bring up
01:10:18.680 | this article about the Google AI IP case.
01:10:22.520 | Basically what happened was that the DOJ filed an indictment.
01:10:29.280 | Actually, I think I sent you guys the actual indictment.
01:10:34.480 | But essentially what happened is there was an engineer at Google
01:10:37.800 | that has been charged with stealing AI secrets for China.
01:10:43.160 | And I don't know whether he's back in China now or not,
01:10:47.080 | but the whole point is that if it's happened at Google, where
01:10:51.680 | there is a motivation for the Chinese intelligence apparatus
01:10:54.960 | and, frankly, every intelligence apparatus
01:10:57.480 | to infiltrate that organization and get access
01:10:59.960 | to all kinds of data, I think we should
01:11:04.000 | presume by default that all of these organizations
01:11:07.800 | are infiltrated.
01:11:09.120 | And I think that that's probably a more conservative and
01:11:11.480 | reasonable posture.
01:11:12.360 | So Facebook is infiltrated, Google is infiltrated,
01:11:15.640 | Apple is infiltrated, TikTok is infiltrated.
01:11:18.840 | So on that dimension, I think that it
01:11:21.640 | should be considered 100% certainty that this data is
01:11:26.040 | getting back to not just the Chinese,
01:11:28.120 | but multiple state-sponsored actors.
01:11:30.960 | So the question about TikTok then, I think,
01:11:34.520 | should be one of business.
01:11:35.680 | And I think Palmer Luckey did a very good job of simplifying
01:11:39.840 | this down to its essence, which is essentially
01:11:42.040 | what he called the law of equivalent exchange.
01:11:43.960 | If you want to just play this, it's like just a few seconds.
01:11:46.000 | - I was kind of frustrated that people made TikTok
01:11:48.120 | into a cultural issue.
01:11:49.920 | By the way, I'm totally on the culture war side of it.
01:11:53.960 | But I was saying, practically speaking, you should not
01:11:56.400 | make this a culture war issue.
01:11:58.080 | Don't talk about how it's ruining our used ideals.
01:12:01.040 | Just say strictly on a trade basis.
01:12:04.040 | We cannot allow them to sell this thing to us if we can't
01:12:09.000 | sell the same thing to them.
01:12:10.520 | That should be totally fair.
01:12:11.680 | That's reciprocity.
01:12:13.240 | That's reciprocity.
01:12:14.240 | Me and Palmer Luckey are in sync.
01:12:15.720 | - Right, so this is what he calls the law of equivalent exchange,
01:12:18.200 | and I think it just makes a lot of sense.
01:12:19.880 | So on the face, what I would say is, Jake, out of my responses,
01:12:22.760 | I think that the CCP, but also other intelligence
01:12:26.080 | organizations, have infiltrated all of these big companies,
01:12:29.440 | and all of our data is accessible by them.
01:12:33.520 | I'm not going to say on a whim, but I think it's accessible.
01:12:37.080 | I think you have to deal with TikTok as a business issue,
01:12:39.480 | and I agree with Palmer Luckey, which
01:12:41.120 | is they should not be able to sell to us,
01:12:43.520 | but we cannot sell to them.
01:12:44.640 | And I think that that's a fair principle that we can live on.
01:12:47.120 | Reciprocity is a very simple position.
01:12:49.840 | Friedberg, let me use your creativity, your love of cinema.
01:12:55.600 | If you were to use this tool, let's take the most cynical
01:12:58.800 | approach here, or interpretation.
01:13:01.000 | CCP has complete access to the algorithms,
01:13:03.560 | and they want to do maximum damage,
01:13:05.600 | let's say, during the election.
01:13:07.000 | Let's say in a conflict like the one going on in Ukraine
01:13:11.380 | with Russia or in Gaza.
01:13:15.320 | What could they do using the algorithm, using videos?
01:13:19.800 | What would be the doomsday scenario for America?
01:13:22.560 | As in the CCP comes in and influences
01:13:24.840 | the management of this company and tells them
01:13:27.280 | what to tweak and how and why?
01:13:28.920 | Yes, what would they do?
01:13:31.280 | - I think we saw this after October 7
01:13:36.800 | that there was a significant surge in pro-Hamas videos
01:13:43.400 | relative to Israel support video.
01:13:46.080 | That's the sort of thing where you could kind of see something
01:13:49.540 | that sets an opinion that may be disruptive to the social fabric,
01:13:55.420 | to the election cycle, that starts
01:13:57.360 | to get shared more frequently and shows up
01:13:59.640 | in feeds more frequently.
01:14:01.160 | Unlike Facebook and other places where there's a linear feed
01:14:03.840 | where you can scroll up and select what you want to watch,
01:14:06.160 | as you know, TikTok has already lined the videos up.
01:14:09.000 | So when you scroll up, they automatically
01:14:11.680 | play the next thing for you.
01:14:13.280 | So the ranking really matters in terms of viewership on TikTok,
01:14:17.240 | unlike a lot of other kind of select-to-play social media
01:14:20.960 | type networks.
01:14:21.500 | - Could it shift an election?
01:14:23.960 | Hearts and minds in a war?
01:14:25.740 | - Well, I've always said this.
01:14:26.900 | I think it's the craziest thing in the world
01:14:29.700 | that someone can spend advertising dollars
01:14:32.000 | and change someone's vote.
01:14:34.760 | Just think about that fact for a second.
01:14:36.420 | I mean, I've said this before, and people
01:14:38.080 | have told me I'm an idiot for saying it, but--
01:14:40.020 | Meaning that you can, or that it does?
01:14:42.240 | That both.
01:14:43.320 | [INTERPOSING VOICES]
01:14:45.860 | But think about it, like people don't individually
01:14:48.340 | go and gather data and then make an informed opinion about who
01:14:51.800 | they're going to vote for.
01:14:52.880 | Their opinion changes based on seeing an ad.
01:14:55.720 | It is so crazy to me that that's the truth.
01:14:58.580 | I think the country knew that it does.
01:15:00.180 | That's why they prevented it from happening until--
01:15:02.680 | For so long.
01:15:03.620 | Yeah, exactly.
01:15:05.120 | But I mean, I think that's what's so nuts is that there's
01:15:08.000 | no longer a forced discourse that kind of makes people go
01:15:13.000 | out and choose what content they want to consume,
01:15:15.880 | what they want to hear, debate stages, et cetera.
01:15:18.520 | That now it's about who spends the most money
01:15:20.360 | to get the most views in front of someone,
01:15:22.200 | and that that actually influences someone's decision
01:15:24.600 | on who to vote for is what's so compelling to me about why
01:15:29.120 | all of these systems have such extraordinary power.
01:15:31.360 | It's just so amazing to me that the more frequently someone
01:15:33.880 | sees an ad, the more likely they are to buy something
01:15:36.080 | or do something.
01:15:38.000 | Here's the bottom line.
01:15:39.000 | This thing is incredibly--
01:15:40.080 | So the more frequently you show someone a memetic on TikTok,
01:15:42.040 | the more likely they are to vote something differently.
01:15:44.480 | This thing is far too powerful for the CCP
01:15:47.080 | to have any kind of access to it,
01:15:48.680 | for the Chinese government to have any kind of access to it.
01:15:51.100 | It has to be divested.
01:15:52.320 | If you look at what we went through
01:15:53.840 | in the last couple of election cycles,
01:15:55.520 | not being partisan in here at all, but we've said
01:15:58.880 | and we've talked about Hunter Biden's laptop
01:16:00.720 | is but one example.
01:16:02.480 | That was suppressed, obviously, on social networks.
01:16:05.160 | That could have been amplified in social networks,
01:16:07.240 | and it could have had the opposite effect.
01:16:09.160 | Hillary Clinton's emails, Trump this, Hillary this,
01:16:12.760 | Biden that.
01:16:13.880 | You could really sway an election
01:16:16.800 | by putting in specific, subtle information, let alone
01:16:21.320 | taking a hack and releasing it like they did
01:16:24.240 | with Hillary Clinton's email.
01:16:25.400 | So those things might not have swayed an election.
01:16:27.720 | But, Sax, if Putin had access to this,
01:16:30.240 | like somebody who's super capable,
01:16:32.000 | and he had access to 170--
01:16:33.000 | Of course, that's always where it goes.
01:16:34.680 | No, I just--
01:16:35.280 | Putin is suddenly pulling the strings of our election?
01:16:38.720 | Iran, okay, Iran, North Korea.
01:16:40.040 | This is like the biggest threat inflation ever.
01:16:42.840 | Okay, look, if you ask people out there,
01:16:47.480 | do you think other people's votes
01:16:49.040 | are influenced by social media?
01:16:50.080 | They'll say, yeah, of course, people are brainwashed.
01:16:51.800 | If you say to them, is your own vote
01:16:54.920 | influenced by social media?
01:16:56.160 | Do you make up your own mind
01:16:57.920 | based on all the information you have?
01:16:59.800 | They'll say, yeah, of course, I'm not brainwashed.
01:17:01.760 | Everybody else is.
01:17:02.840 | And I believe that people are closer to telling the truth
01:17:06.520 | when they're talking about themselves
01:17:07.920 | because they understand their own situation
01:17:10.160 | better than they understand everyone else's situation.
01:17:12.680 | The fact of the matter is that all of us
01:17:14.640 | are constantly bombarded with information,
01:17:17.960 | 24 hours a day, seven days a week,
01:17:19.640 | through all of the channels, both online and offline,
01:17:22.400 | where we get information.
01:17:23.680 | Some of those data points come from advertisements,
01:17:27.080 | but I don't think we take ads very seriously.
01:17:29.560 | We're trained to kind of even just block out the ads.
01:17:33.000 | When I see banner ads or even ads in my stream,
01:17:35.800 | I just scroll past them.
01:17:38.680 | They don't even factor in my consciousness.
01:17:41.280 | I am influenced by accounts that I follow,
01:17:44.960 | but there are accounts that I've chosen to follow
01:17:46.840 | because I think they have signal over noise.
01:17:49.200 | And the more noisy those accounts are,
01:17:51.280 | the more I disregard them and take them less seriously
01:17:54.080 | with advertisements being the most noisy
01:17:56.960 | and least useful channels.
01:17:58.120 | So look, I think at the end of the day,
01:17:59.880 | this idea that we're all being brainwashed
01:18:01.960 | and secretly influenced by malign foreign actors,
01:18:04.720 | I think, is at a minimum threat inflation,
01:18:07.840 | and that entire narrative might just be completely bogus.
01:18:11.120 | Nonetheless, I do agree that for data collection reasons
01:18:17.440 | and reciprocity reasons, I think it's, like I said,
01:18:20.760 | we're within our rights to require
01:18:22.800 | the divestment of TikTok.
01:18:25.320 | - Yeah, I don't, you know, it's actually, you said on this--
01:18:27.080 | - But don't make this into more than it is.
01:18:29.120 | Again, I think this whole disinformation narrative,
01:18:31.520 | by the way, you wanna know why they push it so hard?
01:18:34.080 | It's because our own intelligence community
01:18:37.520 | wants to be involved, and there are political actors
01:18:41.640 | in the US who wanna regulate, quote, disinformation
01:18:45.440 | on our own social networks.
01:18:46.760 | I'm not talking about TikTok, I'm talking about X,
01:18:48.960 | I'm talking about Facebook, Insta, and so on.
01:18:51.520 | And we saw this in the Twitter files.
01:18:53.360 | We saw what a cozy relationship
01:18:55.440 | the intelligence community had with Twitter.
01:18:57.760 | They were all up in there trying to control
01:19:00.920 | what legacy Twitter was allowing
01:19:04.480 | and what they were censoring. - They blocked the
01:19:05.320 | New York Post URL.
01:19:06.160 | They blocked the New York Post URL.
01:19:07.400 | And you've said on this program that you believe
01:19:09.240 | that could have swung the election.
01:19:10.840 | - Well, I think it was, I think that actually
01:19:13.680 | was genuine election interference.
01:19:15.240 | I actually said that I didn't know
01:19:16.400 | whether it could swing the election.
01:19:17.640 | I think that, but I think that--
01:19:18.720 | - Well, do you think it could have?
01:19:21.240 | - I think it was a story that deserved to be published
01:19:24.600 | so that, and distributed online,
01:19:26.800 | so that the public could take that into account
01:19:28.960 | when they voted, I mean--
01:19:30.480 | - And so if it had been widely received,
01:19:32.760 | do you think, I'm asking a question,
01:19:34.000 | you're not answering it. - How would I know?
01:19:36.520 | Okay, you're asking me a question
01:19:38.440 | that I personally think is irrelevant.
01:19:39.480 | How would I know whether the suppression
01:19:42.520 | of that story swung the election?
01:19:45.300 | How could I know that?
01:19:46.320 | The point is that the American people
01:19:48.720 | were deprived of information that they had every right to--
01:19:52.240 | - Most Republicans believe if that had come out,
01:19:55.000 | it would have swung the election.
01:19:56.520 | - Well, I can't prove.
01:19:57.840 | I can't prove that it swung the election.
01:19:59.720 | What I'm saying is that it was
01:20:01.200 | a type of election interference.
01:20:02.640 | Why was that story suppressed?
01:20:03.920 | Because 51 former intelligence officials
01:20:06.720 | who still maintain close relations
01:20:08.320 | with the intelligence community published a bogus letter
01:20:10.920 | saying that it was Russian disinformation, which it wasn't,
01:20:14.080 | and then that caused our social media sites to suppress it.
01:20:18.080 | So that, to me, is as concerning, if not more concerning,
01:20:22.960 | than whatever it is that TikTok's accused of.
01:20:25.740 | So I don't want social media companies being used
01:20:29.800 | by the intelligence communities of either China
01:20:33.140 | or the United States to swing or to influence our elections.
01:20:37.180 | And we need to be equally concerned about that
01:20:40.140 | as we are about supposed Chinese influence.
01:20:42.900 | - Just to go back in time,
01:20:43.740 | you probably remember the Willie Horton ad.
01:20:45.420 | I mean, that kind of sunk Dukakis, if you remember that.
01:20:47.620 | I mean, media and these ads can really have a big impact.
01:20:50.620 | So people should just look historically.
01:20:52.600 | There have been many moments,
01:20:54.460 | whether it's Nixon sweating on TV or the Willie Horton ad,
01:20:57.840 | where video can do this.
01:20:59.620 | And I think they could be done even more subtly
01:21:03.120 | by the Chinese by just promoting certain videos.
01:21:06.860 | All right, let's move on to Bitcoin.
01:21:08.540 | Issue five, Bitcoin's back, baby.
01:21:10.700 | It hits an all-time high.
01:21:12.140 | 69,000, interesting number on Tuesday.
01:21:15.740 | 69,000 before dropping, sitting around 68K.
01:21:19.500 | As we're taping, there'd be 75 by the time you hear this,
01:21:23.060 | or 50, it's been ripping in 2024,
01:21:26.700 | up 70% since January 25th.
01:21:30.940 | There's two things going on here
01:21:32.540 | that you've probably heard about.
01:21:33.620 | The Bitcoin ETFs finally arrived in the US.
01:21:36.260 | They were approved by the SEC on January 10th.
01:21:38.900 | We'll get into them in a minute, Chamath.
01:21:40.820 | Obviously, if it's an ETF,
01:21:42.940 | it's super easy to buy them and sell them.
01:21:44.780 | They've been a total hit.
01:21:45.680 | BlackRock's Bitcoin ETF became the fastest ETF
01:21:49.060 | to ever reach 10 billion in assets.
01:21:51.340 | Also, as the technical crypto heads in the audience know,
01:21:55.180 | there is a halving happening in April.
01:21:58.660 | This happens about every four years at the current pace.
01:22:00.900 | Last one was May of 2020.
01:22:02.300 | When these halvings happen,
01:22:04.900 | the mining rewards are cut in half.
01:22:07.180 | This reduces the supply of new Bitcoins
01:22:08.900 | entering circulation and can cause some swings.
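For the mechanics behind that: the block reward started at 50 BTC in 2009 and is cut in half every 210,000 blocks, which at roughly ten minutes per block works out to about every four years. Here's a minimal Python sketch of that schedule (real node software does this in integer satoshis, so the floats here are purely illustrative):

```python
HALVING_INTERVAL = 210_000   # blocks between halvings (~4 years at 10 min/block)
INITIAL_SUBSIDY = 50.0       # BTC per block when the network launched in 2009

def block_subsidy(height: int) -> float:
    """Mining reward, in BTC, for a block at the given height (simplified)."""
    halvings = height // HALVING_INTERVAL
    return INITIAL_SUBSIDY / (2 ** halvings)

# The May 2020 halving (block 630,000) cut the reward to 6.25 BTC;
# the April 2024 halving (block 840,000) cuts it to 3.125 BTC.
for h in (0, 210_000, 420_000, 630_000, 840_000):
    print(f"block {h:>7,}: {block_subsidy(h)} BTC per block")
```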
01:22:12.940 | Chamath, we released a clip, I think,
01:22:15.460 | just about your prediction, and you nailed it again.
01:22:20.340 | You said this would be a big year for Bitcoin.
01:22:21.860 | So your thoughts on being right.
01:22:24.920 | - I don't have many thoughts on that,
01:22:27.860 | but my two comments are that
01:22:29.360 | I talk to a lot of Bitcoin traders and folks
01:22:33.100 | that seem to have a very good pulse and touch
01:22:36.860 | on this market.
01:22:37.700 | I don't say that I do
01:22:39.020 | because I don't really look at it every day,
01:22:42.620 | but they seem to think that this thing
01:22:44.100 | is on a death march to 100K.
01:22:45.980 | I'm not sure whether that price is realistic or not
01:22:48.880 | in a year, but I will say that
01:22:50.420 | we're gonna get to a tipping point
01:22:53.980 | where everybody really talks about this.
01:22:56.660 | I still don't think we're there yet.
01:22:57.940 | I think we're just at the beginning.
01:22:59.740 | But when you see the inflows into these ETFs,
01:23:02.560 | Jay Cal, it's like a very big deal
01:23:04.400 | because it just allows every mom and pop individual
01:23:08.780 | to buy some to the extent that they want to own it
01:23:11.220 | or they want to speculate on it, whatever it is.
01:23:13.620 | So I think it's been a very big year,
01:23:17.220 | and I think that psychologically
01:23:19.160 | it's proven a lot of folks wrong
01:23:21.740 | and it's a setup for something really constructive.
01:23:26.280 | The other thing I'll say is that it's not just Bitcoin,
01:23:30.540 | but as goes Bitcoin, there are a handful of other things.
01:23:34.860 | People are now speculating that there's gonna be
01:23:37.320 | an Ethereum ETF that gets approved as well,
01:23:40.120 | because if you approve one,
01:23:41.920 | there's probably legitimate cause to approve a few others.
01:23:44.500 | So these things are becoming part of the financial fabric.
01:23:48.100 | And I think that that should not be underestimated.
01:23:50.460 | - And the SEC is still taking action
01:23:52.220 | against certain bad actors in crypto.
01:23:54.660 | There were a couple of those this week,
01:23:55.940 | but Bitcoin, Friedberg, is incredibly resilient,
01:23:59.120 | just on a technological basis.
01:24:01.140 | The fact that it hasn't broken down under stress,
01:24:04.860 | it hasn't had a denial of service type of attack,
01:24:08.680 | or a government hasn't been able to capture
01:24:13.220 | 51% of the mining, or some great amount of it,
01:24:16.680 | or just even be hacked in any way,
01:24:19.380 | you have to be impressed by the fundamental technology here,
01:24:22.180 | Friedberg, maybe you could speak to that level of success,
01:24:25.380 | that this thing is so stable
01:24:26.860 | and trustworthy and reliable to date.
01:24:29.900 | - It's got a great incentive model.
01:24:32.180 | As long as Bitcoin price remains high,
01:24:34.120 | the miners will still be there
01:24:35.220 | and the system will keep running.
01:24:37.540 | If Bitcoin price drops, transaction fees will decline,
01:24:41.960 | the value of mining will decline,
01:24:44.520 | and it kind of goes the other way as well.
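To put rough numbers on that incentive loop (these are illustrative figures, not from the episode): a miner's per-block revenue is the subsidy plus fees, times the Bitcoin price. A quick sketch, assuming an average of 0.5 BTC in fees per block and a price near where Bitcoin sat at taping:

```python
def revenue_per_block(subsidy_btc: float, fees_btc: float, price_usd: float) -> float:
    """Rough USD revenue a miner earns for finding one block."""
    return (subsidy_btc + fees_btc) * price_usd

PRICE = 68_000   # roughly where Bitcoin sat as this episode was taped
FEES = 0.5       # assumed average fees per block, in BTC (purely illustrative)

pre_halving = revenue_per_block(6.25, FEES, PRICE)    # subsidy before April 2024
post_halving = revenue_per_block(3.125, FEES, PRICE)  # subsidy after the halving

print(f"Before the halving: ${pre_halving:,.0f} per block")
print(f"After the halving:  ${post_halving:,.0f} per block")
# If the price (or fee revenue) doesn't rise, per-block revenue roughly halves,
# squeezing higher-cost miners out until the difficulty adjustment catches up.
```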
01:24:48.440 | I think the real question is, in the last couple of years,
01:24:51.100 | have we really seen a change in Bitcoin
01:24:53.340 | being used for transactions or for commerce
01:24:57.600 | in any meaningful way?
01:24:59.060 | I think the answer is still likely not.
01:25:01.340 | - No, definitely not.
01:25:02.180 | - And it's really a store-of-value system,
01:25:04.100 | and it's become this kind of store-of-value asset.
01:25:06.300 | - Friedberg, this microplastics thing,
01:25:08.860 | we talked about it on the show,
01:25:10.860 | and since that time, I refuse to open plastic bottles.
01:25:15.020 | I'm doing all glass.
01:25:16.300 | I'm getting rid of all this goddamn plastic.
01:25:18.540 | I already did glass bottles in my house 'cause I'm cheap,
01:25:21.400 | and I like to fill it from my water filter,
01:25:22.860 | but we're uncovering more information.
01:25:26.360 | And then I saw this headline this week
01:25:28.080 | that microplastics are in our blood streams in some cases,
01:25:32.460 | and what the heck does that mean?
01:25:35.400 | - It's worse than that.
01:25:36.320 | - A team of scientists in Italy collected samples
01:25:40.380 | from patients who had plaque removed
01:25:43.340 | from their carotid artery.
01:25:44.580 | It's a kind of common vascular procedure:
01:25:48.100 | when plaque builds up and blocks your carotid,
01:25:51.820 | they go in and they remove the plaque.
01:25:53.700 | So a total of 304 patients agreed to have the plaque
01:25:57.360 | that was removed from their artery submitted for analysis.
01:26:01.300 | And then what this team did is they took that plaque
01:26:03.500 | and they studied it to see
01:26:06.160 | how much plastic was found in that plaque.
01:26:10.880 | And they used a bunch of measurement techniques to do this,
01:26:14.960 | including electron microscopy and mass spec,
01:26:17.960 | 'cause it's really hard to find these molecules.
01:26:20.500 | And microplastics, remember,
01:26:22.820 | are less than five millimeters in size, and nanoplastics are smaller still.
01:26:25.280 | They found a mean level of 21 micrograms of plastic per milligram of plaque.
01:26:30.280 | Roughly one part per 50 is the ratio of plastic to plaque
01:26:35.120 | that they found, which is really incredible
01:26:37.280 | because it shows that these plastics,
01:26:39.020 | these little nano and microplastics, are accumulating-
01:26:40.860 | - Incredible good or incredible bad?
01:26:43.140 | - Incredible bad, that these microplastics,
01:26:45.660 | these nanoplastics are accumulating in the human body.
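To unpack that "roughly one per 50" figure, it's just a unit conversion: a milligram is 1,000 micrograms, so

$$
\frac{21~\mu\text{g plastic}}{1~\text{mg plaque}}
= \frac{21~\mu\text{g}}{1000~\mu\text{g}}
\approx 2.1\% \approx \frac{1}{48},
$$

or about one part plastic per 50 parts plaque by mass.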
01:26:48.640 | Now here's the scary part.
01:26:50.520 | They then did a follow-up 34 months later.
01:26:53.200 | The patients who had plastic in their plaque
01:26:56.100 | had a four-and-a-half-times higher likelihood
01:26:58.940 | of having a heart attack,
01:27:01.540 | stroke, or death from any cause.
01:27:04.100 | So all of these major health outcomes
01:27:07.660 | were four and a half times elevated
01:27:09.480 | in patients who had plastic in their plaque.
01:27:11.580 | This was published in the New England Journal of Medicine,
01:27:13.580 | if I didn't say it.
01:27:14.420 | It really indicates that there is
01:27:15.700 | this kind of cumulative problem
01:27:18.060 | and that the cumulative problem is likely leading
01:27:20.940 | to really adverse health outcomes.
01:27:23.260 | And I'll just highlight one other paper
01:27:24.860 | from a team in Germany and Norway back in May of 2022.
01:27:29.140 | And this team tried to figure out how plastics
01:27:31.540 | are causing adverse health effects in the body.
01:27:34.260 | And they had a theory like let's put little microplastics
01:27:36.720 | or nanoplastics together with all the human cells
01:27:38.940 | that we know, shake it up and see what happens.
01:27:41.860 | And what they found was that these little plastic fragments
01:27:44.440 | were binding to dendritic cells and monocytes,
01:27:47.300 | key cells in the immune system.
01:27:49.500 | And when those cells were bound by plastic,
01:27:53.020 | they release these cytokines
01:27:54.540 | and the pro-inflammatory signals go through the roof.
01:27:57.340 | It causes the immune system to go haywire,
01:27:59.500 | increases inflammation, and the cascading effects
01:28:02.220 | of that obviously can ultimately lead
01:28:04.300 | to many of the events that, as we mentioned,
01:28:06.180 | were measured in this set of patients in Italy.
01:28:08.620 | So again, we're just starting to uncover these effects,
01:28:12.380 | this concept that microplastics and nanoplastics
01:28:15.420 | are accumulating.
01:28:16.240 | And let me just say these plastics are mostly PET,
01:28:19.120 | which is what we use to make plastic bottles
01:28:20.940 | that we drink water and drinks out of,
01:28:22.540 | and PVC or polyvinyl chloride,
01:28:24.820 | which is what a lot of our plastic plumbing
01:28:27.100 | and piping is made from.
01:28:28.700 | And so as little tiny bits of these plastic materials
01:28:31.220 | are exposed to sunlight, break off,
01:28:33.140 | and end up in our water and food supply, we consume them,
01:28:36.420 | they slowly accumulate in our bodies,
01:28:39.060 | and they may be driving inflammatory responses
01:28:41.200 | and adverse health outcomes.
01:28:43.540 | We're really at the tip of the iceberg
01:28:45.160 | in studying, understanding, and analyzing this.
01:28:47.620 | But here's another really interesting empirical data set
01:28:50.680 | that highlights that this really is a pretty significant problem:
01:28:54.300 | half the patients had it, and of that half,
01:28:57.280 | they had a four and a half times higher chance of dying
01:28:59.680 | or having a heart attack or a stroke
01:29:02.000 | in the 34 months that followed.
01:29:03.980 | - Yeah, the thing that that study said, which was nuts,
01:29:07.880 | is it looked like the nanoparticles,
01:29:10.640 | the nanoplastics and microplastics
01:29:12.280 | were effectively acting as scaffolding for plaque.
01:29:15.080 | It was almost like a shim that allowed it to grow.
01:29:20.080 | The question is, did it grow faster
01:29:22.740 | than it would have otherwise?
01:29:24.100 | That's even scarier.
01:29:25.260 | So I did not like reading that paper.
01:29:30.260 | That really freaked me out.
01:29:32.140 | Really, really freaked me out.
01:29:34.900 | - I was drinking water from plastic water bottles this week
01:29:37.520 | and every time I drink water out of a plastic bottle now,
01:29:40.260 | I'm like nervous every time I take a sip.
01:29:43.660 | - You're not supposed to double the risk
01:29:45.400 | of all cause mortality by drinking a Fiji water.
01:29:48.780 | You know what I mean?
01:29:49.620 | - Four and a half X, four and a half X.
01:29:51.340 | - It's crazy.
01:29:52.260 | - In 34 months.
01:29:53.420 | - Think about the cumulative effect over time.
01:29:55.420 | - Over time.
01:29:56.820 | - Imagine drinking water out of a plastic bottle,
01:29:59.040 | thinking you're doing the right thing
01:30:00.260 | and then trotting over to the recycling bin.
01:30:02.860 | And you do that for 20 years, you may be killing yourself.
01:30:07.520 | - Well, there it is.
01:30:10.440 | - So wait, what's the, if you, if the water's--
01:30:13.420 | - Glass bottles, you must use, you cannot use plastic.
01:30:18.060 | You just cannot.
01:30:18.900 | - It's over, no plastic, no plastic.
01:30:20.780 | - You can't, you gotta stop.
01:30:21.620 | It's over, it's done.
01:30:22.440 | - We're drinking, it's done.
01:30:23.280 | Glass, cans, good.
01:30:28.580 | Plastic, no. Stainless steel, stainless steel is fine.
01:30:28.580 | - Just like I mentioned when we talked about this
01:30:30.140 | a few weeks ago, the carbon footprint,
01:30:33.020 | the environmental cost, and the cash cost are much higher
01:30:35.600 | with all these alternatives to plastic.
01:30:37.380 | So there are big challenges with respect
01:30:39.600 | to having some big, massive response to plastics
01:30:43.020 | used in our supply.
01:30:43.880 | But, you know, living in the luxury world
01:30:46.540 | that we all get to live in, we get to have that choice
01:30:48.480 | and we'll make that choice.
01:30:49.940 | But it's a real problem for humanity
01:30:52.240 | because plastics are so ubiquitous in so many things
01:30:54.940 | and they've enabled affordability of consumer goods.
01:30:58.680 | - This is such bull (beep)
01:30:59.880 | Honestly, like all you have to do is have glass bottles
01:31:02.500 | or carry a water bottle with you.
01:31:04.380 | Like I have a Contigo one I like, I carry it with me,
01:31:07.980 | I empty it, I fill it. - No, but you can't.
01:31:10.140 | - And I have a water filter in my house
01:31:12.660 | and we fill water bottles and put them in the fridge.
01:31:14.820 | - No, but what if you like yogurt?
01:31:15.960 | Yogurt comes in a plastic container.
01:31:17.400 | There's all kinds of stuff.
01:31:18.700 | You can't avoid plastic.
01:31:20.420 | That's what's so scary.
01:31:21.260 | - We try to in our house. - That's what's so scary.
01:31:22.920 | - We do have a French yogurt that comes in glass bottles,
01:31:25.100 | but yes, it is hard.
01:31:25.940 | - Ah, yes, we do the French yogurt in glass bottles.
01:31:27.700 | - It's so good.
01:31:28.540 | - It's so good.
01:31:29.360 | - That's delicious.
01:31:30.200 | - Wait, where do you get it from?
01:31:31.760 | - There's a French yogurt that comes in glass bottles.
01:31:33.760 | - A French yogurt.
01:31:35.760 | - But how do we get water that's not in plastic?
01:31:35.760 | - It's called like Le Benoit or something, Le Benoit.
01:31:38.760 | - The water, you just install a filter system,
01:31:40.860 | which you have at the mausoleum
01:31:42.820 | and just fill huge glass bottles.
01:31:45.860 | - Just have your staff fill glass bottles
01:31:47.860 | and put them in the fridge and don't throw them away.
01:31:49.860 | And give your kids like some of these thermoses or whatever,
01:31:52.260 | but don't have the water bottles.
01:31:54.260 | Like, you know, it's so funny.
01:31:55.460 | It's been a disaster in the poker game in some ways.
01:31:57.940 | We got rid of it and there's been way more broken glass.
01:32:01.740 | People knock over the, you know, the side tables.
01:32:05.260 | I get it.
01:32:06.100 | It's been a huge pain, but I will not go back.
01:32:08.700 | - No, it's the right thing to do.
01:32:10.580 | - Absolutely. - It's the right thing to do.
01:32:11.620 | - Okay, well, if Chamath is endorsing this,
01:32:14.520 | I guess I'm gonna take it seriously.
01:32:16.880 | - I think you gotta do it.
01:32:18.100 | - My opinion doesn't matter, Sacks.
01:32:19.620 | I'm not saying you're wrong.
01:32:20.940 | I'm just saying that your threshold
01:32:22.860 | for becoming concerned is lower.
01:32:25.980 | And then if it hits Chamath's threshold, which is higher,
01:32:29.540 | I'm gonna take it more seriously.
01:32:31.940 | - As your bestie, I would like you to stop using plastic
01:32:34.900 | for the rest of your life.
01:32:35.980 | - Okay. - Please.
01:32:36.800 | - If you're in a position to avoid it,
01:32:37.740 | I would ask you not to use it.
01:32:39.420 | - All right, everybody.
01:32:40.260 | - Can all you (beep)ers come back to the Bay Area
01:32:43.060 | so we can see each other? - Yes, absolutely.
01:32:44.340 | I miss you guys. - I heard this.
01:32:46.180 | - Love you guys.
01:32:47.020 | I miss you.
01:32:47.840 | I miss you guys, too, for the Sultan of Science,
01:32:50.840 | the Chairman Dictator, and the Rain Man, David Sacks.
01:32:54.880 | I am the world's greatest moderator.
01:32:57.280 | Okay, love you guys.
01:32:58.160 | I gotta go. - Love you.
01:32:59.000 | - Love you. - See you next time.
01:32:59.820 | Bye-bye.
01:33:00.660 | (upbeat music)
01:33:03.240 | ♪ I'm going all in ♪
01:33:05.240 | ♪ Rain Man, David Sax ♪
01:33:06.560 | ♪ I'm going all in ♪
01:33:09.220 | ♪ And it said ♪
01:33:10.060 | ♪ We open sourced it to the fans ♪
01:33:11.520 | ♪ And they've just gone crazy with it ♪
01:33:13.380 | ♪ Love you besties ♪
01:33:14.220 | ♪ I'm the queen of quinoa ♪
01:33:15.480 | ♪ I'm going all in ♪
01:33:16.640 | ♪ Let your winners ride ♪
01:33:18.340 | ♪ Let your winners ride ♪
01:33:20.580 | ♪ Let your winners ride ♪
01:33:22.240 | ♪ Besties are gone ♪
01:33:23.320 | ♪ That's cold 13 ♪
01:33:24.980 | ♪ That's my dog taking a notice ♪
01:33:26.200 | ♪ In your driveway ♪
01:33:27.040 | ♪ Sacks ♪
01:33:27.860 | ♪ Oh, man ♪
01:33:29.980 | ♪ Oh, man ♪
01:33:30.820 | ♪ My appetizer will meet me ♪
01:33:32.100 | ♪ We should all just get a room ♪
01:33:33.580 | ♪ And just have one big huge orgy ♪
01:33:34.980 | ♪ Because they're all just useless ♪
01:33:36.400 | ♪ It's like this like sexual tension ♪
01:33:37.800 | ♪ That we just need to release somehow ♪
01:33:40.480 | ♪ Wet your beak ♪
01:33:42.300 | ♪ Wet your beak ♪
01:33:44.220 | ♪ Beak ♪
01:33:45.780 | ♪ We need to get merch ♪
01:33:46.740 | ♪ Besties are back ♪
01:33:47.580 | ♪ I'm going all in ♪
01:33:52.580 | ♪ I'm going all in ♪