
Scarlett Johansson vs OpenAI, Nvidia's trillion-dollar problem, a vibecession, plastic in our balls


Chapters

0:00 Bestie intros: Recapping "General AI Hospital"
2:46 Scarlett Johansson vs. OpenAI
14:37 OpenAI's novel off-boarding agreements, ex-employee equity problem, and safety team resignations
25:35 Nvidia crushes earnings again, but it faces a trillion-dollar problem
40:05 Understanding why economic sentiment is so negative among US citizens, while the broader data is positive
62:36 New study shows plastics in testicles

Whisper Transcript

00:00:00.000 | I just want to be clear here, I'm trying to do a docket. And we
00:00:04.480 | have to put the kibosh on this insanity of the soap opera that
00:00:09.880 | is becoming OpenAI, Sacks. Because every week, it's three,
00:00:13.240 | four or five stories. Have you seen what's happened this week?
00:00:16.960 | Yeah, of course.
00:00:18.120 | Got to catch the audience up here. Let's catch the audience
00:00:20.320 | up on what's happened here.
00:00:21.560 | This week, on General AI Hospital,
00:00:27.480 | is Sam Altman's job security in jeopardy? Whose data was stolen
00:00:32.440 | this time? What did Ilya say? Why isn't he talking about it?
00:00:37.800 | And with our special guest, will our special guest get her
00:00:41.640 | revenge? General AI Hospital, brought to you by the drama
00:00:46.600 | queens at OpenAI.
00:00:49.320 | Who made that? That was great. Did you make that? Yeah, that
00:00:54.200 | was me.
00:00:54.640 | 10 out of 10.
00:00:55.280 | That was my idea. But Nick's and Lon's execution. So shout
00:00:58.960 | out to Nick Calacanis.
00:01:00.560 | You finally landed the plane.
00:01:02.680 | Broken clock's right twice a day.
00:01:04.240 | Took four years.
00:01:05.160 | Broken clock is right twice a day. The Jason Calacanis story.
00:01:09.680 | That's my autobiography.
00:01:10.520 | But seriously, there was a big drama, Sax. I don't know if you
00:01:30.840 | saw this, but you couldn't miss it with ScarJo. So they made
00:01:35.960 | an emergency, they had an emergency meeting, got all the
00:01:38.040 | developers together, and they've reset. They took Scarlett
00:01:40.560 | Johansson out, and they got a new person, I think, arguably
00:01:43.520 | better. It's Freeberg. I'm curious, your take on this: is it
00:01:45.760 | better?
00:01:46.080 | Hey, J-GPT, how's it going?
00:01:47.400 | Good. Good. Yeah, good. What's going on?
00:01:50.840 | I'm doing fine. I'm gonna be a father real soon. And I think I
00:01:55.280 | can have your help with some dad jokes. I'm going to tell you a
00:01:59.280 | joke. And you tell me if it passes as a dad joke.
00:02:02.840 | I've never done that before.
00:02:04.880 | All right. What do you call a giant pile of kittens?
00:02:08.280 | Give it to me.
00:02:09.640 | A meownton.
00:02:11.680 | No, it doesn't quite land.
00:02:14.720 | All right, there it is, folks. If you want, you can switch to
00:02:18.240 | Saxy Poo. So just go into OpenAI.
00:02:20.560 | You're saying they stole my voice now?
00:02:22.120 | Yeah, they stole yours. Go into OpenAI, go to Voices, and then
00:02:26.200 | just pick Saxy Poo. It's right there between Putin and Tucker.
00:02:32.120 | Saxy Poo Tucker. You can find all your favorite MAGA guests on
00:02:36.520 | the number one MAGA program, the All-In Podcast. Here we go. All
00:02:41.160 | right, we're off to a strong start here. Everybody's in a
00:02:43.120 | good mood. Let's, let's keep the good times rolling here. And
00:02:46.640 | let's go over the ScarJo saga. To recap, if you're living under
00:02:51.040 | a rock this week, it came out that OpenAI, specifically Sam,
00:02:55.200 | had contacted Scarlett Johansson multiple times about lending her
00:02:59.280 | voice for one of OpenAI's chatbots. Obviously, you know,
00:03:02.840 | she famously was the voice, Samantha, in the awesome film
00:03:06.440 | Her. And according to ScarJo, Altman told her she could,
00:03:10.000 | quote, bridge the gap between tech companies and creatives and
00:03:14.160 | help consumers to feel comfortable with the seismic
00:03:17.440 | shift concerning humans and AI, and that her voice, quote, would
00:03:21.960 | be comforting to people. Although she declined the offer,
00:03:24.920 | OpenAI released a chatbot named Sky, which had a similar voice
00:03:29.000 | to ScarJo's. According to ScarJo, her friends and family
00:03:31.560 | thought the voice was her. She released a statement, yada yada.
00:03:34.720 | On May 13, the day OpenAI launched GPT-4o (Omni),
00:03:39.080 | which we talked about last week, Altman tweeted Her, a reference
00:03:43.040 | to the film. Obviously, now ScarJo is threatening legal
00:03:46.760 | action against OpenAI. Altman put out a statement
00:03:49.040 | apologizing, and saying the voice was never intended to
00:03:52.240 | resemble hers. His quote, "We're sorry to Ms. Johansson that we
00:03:57.720 | didn't communicate better." OpenAI showed documents to the
00:04:00.680 | Washington Post that confirmed the voice was provided by a
00:04:02.640 | different actress who is anonymous. Post reporters also
00:04:05.520 | spoke to the unnamed actress's agent who confirmed the story. I
00:04:10.240 | guess, Sax, you sent us a comparison clip. Maybe we start
00:04:13.400 | there and see what we think.
00:04:14.920 | Yeah. Do you guys think they sound the same?
00:04:16.880 | I can understand and generate human-like text pretty well. It
00:04:21.200 | really depends on what you're looking for in an assistant.
00:04:24.120 | What specific tasks?
00:04:25.840 | When I was like, oh, sexuality. Like, I was like, open my eyes
00:04:32.080 | to like, some other thing.
00:04:34.400 | Samantha.
00:04:35.120 | Lily, where'd you get that name from?
00:04:37.680 | I gave it to myself, actually.
00:04:39.600 | What do you think?
00:04:40.360 | I think it sounds pretty darn similar.
00:04:42.520 | Dead on.
00:04:43.560 | I mean, I don't know if it's dead on. Honestly, it sounds
00:04:47.560 | like a digitally altered version of her voice. That's what it
00:04:51.160 | sounds like to me.
00:04:51.760 | They didn't get it perfect, right? Like, they got it to,
00:04:54.120 | what, 90%?
00:04:55.160 | It sounds like it was her voice, but then they changed it. So
00:05:01.960 | either that, or they hired a voice actor who sounds like her.
00:05:05.640 | And that's what the company has said, is that they hired a voice
00:05:08.680 | actor. But they won't tell us who the voice actor is, they
00:05:11.600 | said, because of privacy concerns. Which doesn't quite
00:05:13.960 | make sense to me, because when you hire an actor, they want the
00:05:18.320 | credit, you know.
00:05:19.320 | Yeah. Get more work.
00:05:21.720 | The company could just clear this whole thing up by saying
00:05:24.240 | exactly who the voice actor is. And why wouldn't the voice actor
00:05:27.560 | want that? You know,
00:05:29.880 | Because she doesn't exist.
00:05:32.120 | She was generated by OpenAI.
00:05:36.960 | Wait, they made this actress up?
00:05:41.120 | I mean, she doesn't exist. Of course, it's a digitally
00:05:45.120 | altered version of ScarJo. And they got caught, okay?
00:05:50.200 | This is why Sam's call to Scarlett's agents or her two
00:05:54.960 | days before they launched is such a damning piece of
00:05:58.440 | evidence, because it sounds like he's trying to shut the barn
00:06:00.600 | door on something they've already done, right? They've got
00:06:03.920 | this demo ready. They're gonna launch it in two days. They're
00:06:07.600 | realizing maybe they don't have the rights to use her voice. So
00:06:11.360 | you have to contact her to get those rights. But anyone who
00:06:15.840 | knows anything about Hollywood knows you're not gonna be able
00:06:17.480 | to make a deal with a major star to use her name and likeness in
00:06:21.000 | two days. It's impossible. So this seems like a really crazy
00:06:24.760 | thing to do. I mean, I think the mere fact that he contacted them
00:06:28.720 | and then tweeted out "her," which shows that Scarlett was on the
00:06:32.560 | brain. Those are really damning pieces of evidence, I think, in
00:06:36.160 | this lawsuit. And I think it's going to feed into her case.
00:06:39.000 | Look, here's the thing, this company is going to go down in
00:06:41.880 | the history books. In one part, because the technical
00:06:46.600 | inventions that they've created are just next level, and
00:06:49.840 | frankly, created an entire industry. And I think they
00:06:52.840 | deserve a ton of credit for that. But they're also going to
00:06:55.560 | be written in the history books for two other things that are
00:06:58.640 | probably less aspirational. I think the first is that there's
00:07:05.400 | just all kinds of dustups and unnecessary drama that just
00:07:10.280 | seemed to kick around every few weeks or months. And then the
00:07:13.600 | second is the sheer quantum of value capture that the employees
00:07:18.440 | have seen through secondaries before a fully functional
00:07:22.840 | business has been really created. And so I think it
00:07:26.240 | explains why folks circle the wagons consistently. I think
00:07:29.560 | it's a very rational organization. They're
00:07:31.880 | technically ahead of everybody else. A lot of people want to
00:07:35.120 | put in money at, you know, crazy prices. A bunch of that value is
00:07:39.920 | transferred immediately to the employees who circle the wagons
00:07:43.640 | and do what's necessary to keep the taps flowing. And I think
00:07:47.840 | that that explains the whole thing. And I think that that
00:07:49.840 | explains many Silicon Valley companies, quite honestly,
00:07:52.720 | circle the wagons, defend the company, sell your shares and
00:07:56.320 | secondary at $90 billion to Thrive or whoever. What's
00:07:59.720 | your take, Freeberg, on all of this craziness and drama?
00:08:03.160 | I think what we will see over time is that rather than have
00:08:08.400 | the ability to sue for their likeness, a lot of AI that is
00:08:15.400 | like a celebrity, the value of it will arise from the
00:08:19.800 | celebrity's endorsement, not actually using the celebrity's
00:08:23.280 | features. So without the endorsement, I know everyone
00:08:27.040 | kind of wants to point to this idea of likeness. But I think
00:08:29.840 | that there's something about the authentic aspect of having the
00:08:32.440 | celebrity actually endorse and provide their, their signature,
00:08:36.800 | their stamp on it. Restaurants are a good example. Some
00:08:39.120 | celebrity chef says I was involved in making this menu.
00:08:42.000 | That's a lot different than mimicking the celebrity chef's
00:08:44.640 | menu from his restaurant, putting his or her name and
00:08:47.520 | brand on it. There's a bunch of videos on YouTube now. I don't
00:08:50.240 | know if you guys... you guys probably don't watch these. But
00:08:52.480 | I love watching these videos where like music producers make
00:08:54.760 | tracks and how they do it. And so many of these producers now
00:08:58.400 | are using AI tools, taking samples off of old records or
00:09:02.720 | other tracks, and then telling the AI make something that
00:09:05.480 | sounds like this, or looks like this, but isn't like this. And
00:09:08.520 | so there's enough of a transformation happening, that
00:09:11.560 | it isn't a direct likeness. And then they're able to create
00:09:14.240 | entire vocal tracks without needing a singer, or without
00:09:16.960 | needing the celebrity singer. So you're in the "much ado about
00:09:20.520 | nothing" camp on the Sky thing? Yeah, I'm in the camp that the
00:09:23.280 | content itself, I think, is probably less, like, compelling.
00:09:27.320 | Oh, you go after her because the voice sounds the same. But I
00:09:30.360 | do think that there's this element of like, what if you
00:09:33.040 | could then say, hey, you know, Britney Spears actually lent her
00:09:36.280 | voice to this track. So do you think, if she sues OpenAI,
00:09:39.880 | Do you think it should just get thrown out? It's not a real
00:09:41.920 | case. No, I think there's probably going to be a lot of
00:09:44.240 | discovery, to Sax's point, that's going to show that they probably
00:09:46.880 | did. Yeah, that discovery is gonna be juicy. Yeah, yeah. No,
00:09:49.520 | no, no, I'm not. I'm not saying that. I'm saying I'm asking you
00:09:51.880 | more than that. Yeah, even if they find it, your point is, it
00:09:54.240 | shouldn't matter. My point is, it should be thrown out. Well, I
00:09:57.640 | don't know. I mean, hold on. Let's say you're casting a
00:09:59.520 | movie. And you know, you can't get Scarlett Johansson for it.
00:10:02.360 | And so you tell the casting director, get me a Scarlett
00:10:04.120 | Johansson type, I think you can do that. Okay, obviously, you
00:10:07.320 | can't, you can't use her name, you can't use her likeness, but
00:10:09.960 | you could hire a different actor, who might look or sound
00:10:13.160 | like Scarlett Johansson. If the company actually did that, and
00:10:17.040 | they did it nine months ago. This is what the statement they
00:10:19.480 | put out. I think they've got a decent defense. Yeah. But
00:10:22.920 | because they may, I don't know if we believe that. I mean,
00:10:27.280 | again, why? Why don't you just put out the name of the actual
00:10:30.400 | actor, that voice actor that you used?
00:10:32.360 | And there's a very simple test here: if there's
00:10:35.480 | confusion amongst the public, which is what ScarJo put in
00:10:38.880 | her letter, and that was a deftly written legal
00:10:43.000 | letter to set up a huge settlement, because she said the
00:10:46.240 | morning this came out, all of my friends said, Oh, my God,
00:10:49.280 | congratulations on your ChatGPT deal. This is great. The
00:10:53.600 | public being confused is the key issue here. And there's something
00:10:56.440 | called the right of publicity. This is basically how
00:10:58.880 | celebrities defend their rights. It's happening all the time to
00:11:01.640 | podcasters, by the way, there was a company that put me in
00:11:04.120 | their heads based on something I said in a show. And then they
00:11:07.040 | put ads against it. And Huberman has been having this happen.
00:11:10.160 | Joe, it's happening. It's happening to me right now. I'm
00:11:12.680 | in the middle of this crazy thing with Facebook, and they've
00:11:15.800 | been doing a very good job, the team at Meta. Ashley, thank
00:11:18.040 | you. But hundreds and hundreds and hundreds of automated
00:11:21.360 | accounts, pretending to be me selling all kinds of random
00:11:24.880 | stuff. It's predominantly on WhatsApp and Facebook and Meta.
00:11:28.320 | And I don't know what to do, because I, we work together with
00:11:32.040 | them, we shut it down. I've actually had to reactivate my
00:11:35.240 | Facebook and Instagram accounts, which were dormant, so that we
00:11:39.120 | could actually have them be verified, so that then it's
00:11:41.760 | easier to shut them down. But it is an impossible task when
00:11:44.760 | somebody is impersonating you. Yeah. To fight it. At least in my
00:11:49.040 | experience, it's been a month, and it's just like, every
00:11:51.240 | hundred we take down, it's another thousand.
00:11:53.160 | In both of those cases, you guys have your exact image of you
00:11:57.560 | being shown to sell stuff. I think that in this
00:12:00.280 | case, it's also unique to ScarJo, because she was the voice
00:12:04.160 | from Her. There is no... like, if this were a voice sort of
00:12:07.240 | like Cameron Diaz, or sort of like Julia Roberts, it wouldn't
00:12:10.160 | be as big a deal, because it would certainly have some
00:12:12.600 | differences to it. But it's because they're trying to mimic
00:12:15.120 | the movie Her. Well, didn't Meta use Morgan Freeman
00:12:18.040 | when, like, Zuck created an AI or something as a project? You
00:12:21.360 | guys remember that? Wasn't Morgan Freeman the voice? They paid for
00:12:23.800 | it. And there's a company, Speechify. I'm not an investor
00:12:27.480 | or anything like that. But they have Gwyneth Paltrow as a
00:12:29.520 | licensed voice to read to you. So
00:12:32.160 | they should just license Gwyneth's voice.
00:12:34.840 | Obviously. Like, just whoever wants to get paid. Here's
00:12:39.800 | the opportunity if three people say no. The fact that they're
00:12:42.680 | not trying to say this is this person's voice, but they want to
00:12:45.320 | say, like, hey, let's say you can just prompt the AI and say,
00:12:48.600 | generate a voice that is sort of like movie voices that are comfortable
00:12:52.520 | and calming for people to listen to, and that we
00:12:56.200 | think people will be comfortable with, and the AI
00:12:58.320 | generates something that sounds like... I've said this before, but
00:13:00.840 | it's not deliberately trained on ScarJo. Using the computer to
00:13:03.760 | probabilistically copy something is still copying something. Yes.
00:13:07.320 | Come on, guys. Let's not make this too complicated.
00:13:09.040 | "Is the public confused?" is the only test you need. What if
00:13:11.600 | there's two public actresses that both have similar voices,
00:13:15.240 | and then they both claim, hey, you tried to make this sound
00:13:17.440 | like me? What do you do in that case, Chamath?
00:13:19.080 | Well, which one did you call two days before the demo?
00:13:21.480 | Yeah, exactly. And did the CEO tweet? I think your whole point
00:13:25.160 | about discovery is what is going to get them in trouble in this
00:13:27.280 | case, because they were clearly they were clearly trying to do
00:13:30.320 | an impersonation of her, right?
00:13:31.800 | Yeah, if they never reached out to Scarlett, they claim it's
00:13:36.160 | just a coincidence. But they called her twice. They called
00:13:39.560 | her some months before and then two days before, which indicates
00:13:42.960 | panic.
00:13:43.440 | Judge sacks. Give your verdict. I'm starting a new show. Judge
00:13:46.800 | sacks says
00:13:47.600 | guilty.
00:13:49.360 | Sentence now. What's the sentence?
00:13:54.920 | The sentence is that Scarlett Johansson is going to end up
00:13:57.360 | owning more of this company than Sam Altman.
00:13:59.240 | They called her two days before. Why do you do that? Guilty.
00:14:08.640 | Because you know, you have a problem. And you're trying to
00:14:11.240 | put the horse back in the barn. What they should have done is, as soon
00:14:14.720 | as she says no, you just change the voice completely.
00:14:17.920 | Or you get a fixer, get Michael Cohen in there and get a
00:14:20.600 | settlement going. Let's get some fixer in there to fix
00:14:23.960 | it. Okay, listen, enough with OpenAI. Oh, wait, there's more.
00:14:28.200 | Jeez, it's ruining the docket. I guess we have to talk about the
00:14:33.440 | next drama from OpenAI this week. Former employees signed an
00:14:40.040 | agreement that they're forbidden forever from criticizing the
00:14:42.760 | company or even acknowledging that the NDA exists. If a
00:14:45.960 | departing employee declines to sign the document, or if they
00:14:48.880 | violate it, they can lose all vested equity they earned during
00:14:52.580 | their time at the company. In practice, this means
00:14:55.720 | ex-employees have to choose between giving up millions of dollars
00:14:57.960 | they've already earned or agreeing not to criticize the
00:15:00.160 | company in perpetuity.
00:15:02.280 | Sacks, there's a lot of details here.
00:15:05.320 | Yeah, well, you can see why they wrote that in there. That makes
00:15:08.840 | sense. If you think... hold on a second. If an employee can leave
00:15:11.560 | with, you know, a random copy of, like, some old weights as a
00:15:16.800 | starting point to rebuild the model, that could be very
00:15:19.560 | valuable. And then, well, there's all kinds of
00:15:23.740 | like, kind of quasi-confidential information or knowledge or
00:15:27.940 | know-how that you leave a place like that with. So I could see
00:15:32.620 | why there was a justification to be very heavy handed about it.
00:15:39.300 | But again, the question isn't whether you're allowed to be
00:15:41.740 | heavy handed, you are. The question is why backtrack and
00:15:45.220 | then obfuscate and lie after you get caught.
00:15:47.940 | Well, right. Well, they claim it's an accident. Once they get
00:15:51.600 | caught. Yeah, with their hand in the cookie jar. Which, like you
00:15:54.280 | said, Chamath, is just a heavy-handed agreement. It's in the
00:15:56.340 | company's interest to do this. Yeah, they just say it was an
00:15:59.240 | accident, just like the Scarlet thing is an accident or a
00:16:01.320 | coincidence. It's just getting hard to believe.
00:16:03.520 | Just own it and say, you know what, guys, it's a really
00:16:05.800 | valuable company. There's a ton of very valuable trade secret
00:16:08.760 | know-how, IP, confidential information. And we're going to
00:16:11.520 | be extremely litigious on the offense and protect it, because
00:16:17.320 | it's correlated to the importance of the company and
00:16:19.180 | the ecosystem. You could have said that and people could have
00:16:21.580 | been upset, but they would have understood.
00:16:23.100 | Here's what Sam Altman said. There was a provision about
00:16:26.080 | potential equity cancellation in our previous exit docs.
00:16:29.380 | Although we never clawed anything back, it should never
00:16:32.820 | have been something we had in any documents or communication.
00:16:36.660 | This is on me. And one of the few times I've genuinely been
00:16:41.900 | embarrassed running OpenAI. I did not know this was happening, and
00:16:46.420 | I should have. If any former employee who signed one of these
00:16:48.760 | old agreements is worried about it, they can contact me and
00:16:51.280 | we'll fix that too. Very sorry about this. So he's very, very
00:16:54.640 | sorry. Sorry, it's starting to be like BP oil, this company,
00:16:57.100 | like, they're just so sorry about everything.
00:16:59.480 | Here's the question: these are form documents at the end of
00:17:03.520 | the day. And form documents don't write themselves. Lawyers
00:17:07.680 | write them. And when you get a novel change in one of these
00:17:10.480 | documents, somebody thought that through and thought it'd be a
00:17:12.640 | good idea and put it in there. Right. And like Chamath said,
00:17:16.660 | there is a way to potentially defend that. It's not like these
00:17:19.420 | provisions don't exist. It's just a novel application to try
00:17:22.660 | and claw back someone's already vested equity from a company.
00:17:25.660 | Well, that is not half of these things. Yeah, that is not
00:17:28.900 | exactly standard. Sometimes that's completely nonstandard.
00:17:31.700 | I'm just saying that these provisions exist in other
00:17:33.460 | contexts. And their application as a clawback of vested
00:17:38.220 | employee equity is something that I don't think any of us
00:17:40.960 | have heard of before. No, I haven't. Yeah, exactly. So my point
00:17:44.820 | is just it doesn't happen as an accident. Somebody made a
00:17:47.920 | strategic business decision to do this because they thought it
00:17:50.500 | be in the company's interest.
00:17:51.620 | Yeah. So just in layman's terms, the clawback means you had 10
00:17:55.420 | million in equity. You earned 75%, so you got 7.5 million in equity
00:18:00.820 | there. You say something disparaging about the company.
00:18:03.820 | And they can take it back from you.
00:18:07.140 | Once you leave a company and you've vested equity, you don't
00:18:11.640 | lose it. I mean, as long as you have some period in which to
00:18:14.360 | exercise your option if it's an option rather than stock. But
00:18:18.000 | other than that, I've never heard of a situation where
00:18:20.880 | employees can lose their vested equity.
00:18:22.720 | Even in a situation, Sacks, we've seen this where somebody commits
00:18:26.240 | fraud, they still get their vested equity, and then it's up
00:18:30.040 | to the company to sue them for fraud separately, right? Like
00:18:32.680 | we've seen, I guess. Yeah, I mean, I guess that's right.
00:18:35.580 | Like committing a crime.
00:18:37.200 | Look, I think the question here is, is it credible that they
00:18:41.600 | keep having these accidents and coincidences?
00:18:44.480 | What does Judge Sachs say?
00:18:46.120 | Judge Sacks says there's one too many coincidences. Look, I think,
00:18:53.160 | like Chamath said, you could have just owned this and
00:18:55.840 | defended it in the way he said, and said, but you
00:18:58.440 | know what, it was too aggressive. And we pulled it
00:19:00.060 | back.
00:19:00.280 | I think the thing is, look, if you think about the
00:19:03.180 | pendulum of culture in Silicon Valley, we used to have a very
00:19:09.800 | tough culture of founder led businesses, where there was
00:19:13.320 | extremely high expectations. And if you transgressed, it was very
00:19:18.540 | punitive. And then the pendulum swung all the way to the
00:19:22.520 | opposite end, where you had this, like, coddling daycare-type
00:19:26.200 | approach that existed for like the last 15 or 20 years. And
00:19:29.640 | probably what OpenAI is, is an example of a company that needs
00:19:33.700 | to be run a little bit more like the former, but is stuck with a
00:19:37.700 | bunch of people that still pull it towards being the latter. And
00:19:41.420 | that's the cultural tension that they're going to have to sort
00:19:43.700 | out. Because in order to be this incredible bastion of like AGI
00:19:48.300 | and innovation, I suspect that it's going to look more like a
00:19:52.180 | three letter agency in terms of security and protocols in the
00:19:55.340 | next five or 10 years than it is going to look like the Google
00:19:58.220 | Plex. And I think they just need to own that. And this is
00:20:00.980 | probably a little bit of an insight into that tension. And
00:20:04.460 | they're gonna have to go in more in that direction. It's a good
00:20:06.740 | insight, you're not going to be allowed to build these
00:20:09.060 | incredibly crazy, world-beating technologies while people are
00:20:12.580 | riding around on an eight-seater bicycle. It's not going
00:20:15.540 | to happen. And by the way, I mean, because it is such an industry
00:20:18.740 | leading company, I think we could end up with some very bad
00:20:22.020 | fair use precedents or laws. Because Scarlett Johansson is so
00:20:26.760 | sympathetic, as a plaintiff, compared to open AI. And it's,
00:20:32.780 | you know, unless they show us some discovery, that proves that
00:20:37.380 | they really did hire the voice actor and all the rest of it. I
00:20:39.360 | mean, this could lead to some very bad precedents for the
00:20:42.240 | industry around fair use.
00:20:43.820 | Well, and here we go. I think Microsoft will pay the
00:20:46.740 | speeding ticket, and we'll just all move on. But Freeberg, my
00:20:49.620 | God, can you imagine, like, being two or three PhDs deep in, you know,
00:20:55.060 | machine learning or whatever you study your whole life, you're
00:20:57.600 | pursuing general AI, and like, people are coming up to your
00:21:00.280 | desk and creating all this drama and nonsense. And you're in the
00:21:03.480 | middle of a soap opera while you're trying to create the
00:21:05.800 | technology that creates super intelligence. It's nuts.
00:21:08.860 | Yeah, I find it annoying just listening to it.
00:21:11.880 | Okay, well, then in that case, we will move on from OpenAI. Oh,
00:21:16.280 | there's one more drama going on.
00:21:19.800 | It's not just drama. I mean, there's now a lawsuit. I
00:21:23.140 | think it's a very interesting case. The fair use
00:21:25.560 | case is interesting. I think it's a legitimately interesting
00:21:28.000 | case that's gonna if it goes all the way, if it goes the
00:21:31.240 | distance, it's going to create some really interesting
00:21:33.640 | precedents.
00:21:34.360 | Well, I mean, yeah, but what happens in these
00:21:38.840 | content cases is they get settled almost every single
00:21:42.280 | time. So the case law doesn't get codified, they just get
00:21:45.280 | settled out of court, you go look at all the fair use cases,
00:21:48.200 | they almost never go to the mat. And so this one will just be
00:21:51.820 | settled. It'll just be a question.
00:21:53.620 | The interesting part of the other story is that the reason
00:21:57.720 | all this stuff came out about the equity clawbacks was because
00:22:00.360 | the safety team quit and it got leaked during that process.
00:22:03.120 | Alright, so this is the third dramatic story of the week. That
00:22:05.600 | one, I think, begs a little bit more of a discussion. So
00:22:09.320 | let me ask you. Yeah.
00:22:11.200 | Two heads of open eyes super alignment team left the company
00:22:15.960 | last week, the day after GPT for a launch, Ilya announced he was
00:22:22.260 | leaving the company. He was our chief scientist a few hours
00:22:25.220 | later, his partner on the alignment team, john like also
00:22:29.820 | announced he was resigning in a later thread, like explained
00:22:33.260 | that he left due to, quote, safety culture and processes
00:22:37.700 | have taken a backseat to shiny products. Okay, there's a little
00:22:42.360 | bit of disparagement, to our former point about non
00:22:45.860 | disparagements, non D and NDAs. So AP open AI lost both its heads
00:22:51.300 | of AI alignment. One day after it launched that new product. Is
00:22:56.180 | that a coincidence? It's interesting. If you don't know
00:22:59.340 | what super alignment is, it's basically making sure that the
00:23:03.600 | software doesn't go terminator. What are your thoughts on this?
00:23:06.980 | Friedberg?
00:23:07.480 | I don't know. I mean, it could be some bad bureaucracy, bad
00:23:13.960 | politicking, not being listened to. But I think the real
00:23:17.120 | interesting question is, who's going to ask these guys? What's
00:23:24.080 | really going on from a technology perspective? And what
00:23:27.160 | is that going to reveal? Because these guys clearly are on the
00:23:29.840 | frontier of model development, and the performance of models.
00:23:33.920 | And so my guess is, there are certain regulatory people who
00:23:38.480 | are going to have interest in the fact that this team just
00:23:40.400 | left, they're going to make a phone call, they're gonna ask
00:23:42.600 | this team to come in and have a conversation. And they're going
00:23:44.720 | to start to ask a lot of questions about what the state
00:23:47.220 | of technology is over there. And I suspect that some things are
00:23:50.400 | going to start to come out.
00:23:51.280 | Sacks, it was reported that Ilya was on the side of the
00:23:57.560 | nonprofit, the "slow down AI, be cautious" group, when they fired
00:24:03.680 | Sam. So what's your take on what's going on here? With super
00:24:08.160 | alignment inside of OpenAI? Judge Sacks?
00:24:11.280 | Let's call this what it is: a mass resignation. And we don't
00:24:15.040 | really know why. I mean, apparently, they were promised
00:24:17.600 | something like 20% of the computing resources of OpenAI.
00:24:21.440 | And they didn't get that. I definitely read that somewhere.
00:24:23.960 | And so that is part of it, I think, but we don't really know
00:24:29.200 | the whole story. And you know, when you look at this issue of
00:24:34.320 | the mass resignation, and then you look at the issue of the
00:24:37.040 | clawback of vested employee equity, you're like, well, wait
00:24:41.200 | a second, maybe they felt like they needed that clawback. In
00:24:45.320 | order to deter all these people who are leaving from spilling
00:24:48.440 | the beans about right, whatever was upsetting them. So I clearly
00:24:52.320 | upset them, right? Yeah,
00:24:53.600 | That's why people are saying there's this meme: what did
00:24:55.840 | Ilya see? What did he say? And then, like you said, the board
00:24:58.720 | did fire him. And the only explanation they provided was
00:25:02.200 | that he wasn't being candid, which at the time we thought was
00:25:05.640 | an incredibly damning statement. And we thought we'd get some
00:25:09.120 | explanation of it. We never got any explanation whatsoever. You
00:25:12.320 | know, I thought that the board was being incompetent, because I
00:25:15.400 | thought that either they fired him overly hastily, or they had
00:25:19.160 | reason to fire him, but then they communicated poorly. And
00:25:22.720 | you know, as you add all these things up, it definitely
00:25:26.080 | seems like where there's smoke, there's fire.
00:25:27.440 | Sam's a straight shooter, we should just have him on the pod
00:25:29.680 | to explain and
00:25:30.120 | clear everything up. Stop me if you've heard this before:
00:25:37.160 | Nvidia just smashed all expectations while reporting
00:25:40.200 | record profits and revenue. The AI train continues. On Wednesday,
00:25:43.800 | Nvidia reported earnings for fiscal Q1. Revenue was $26
00:25:48.400 | billion, up 18% quarter over quarter and 260% year over year.
00:25:52.800 | Basically, they quadrupled year over year on billions of
00:25:56.800 | revenue. This chart is bonkers. We've never seen anything like
00:26:00.280 | this in the history of Silicon Valley or corporate America. This
00:26:03.920 | is as if somebody was literally mining coal and then found a
00:26:08.440 | diamond and gold mine underneath it. It's bonkers
00:26:11.760 | what's happened here. When you look at the revenue there, you
00:26:14.800 | see the sort of slow or moderate revenue growth that
00:26:19.160 | they experienced. That was all because Nvidia was primarily
00:26:22.520 | providing GPUs for people playing video games or mining
00:26:26.480 | crypto. And then what you see with this unbelievable five or
00:26:29.880 | six quarter run is companies like
00:26:34.440 | Microsoft, Google, Tesla, OpenAI, etc., buying just billions
00:26:40.160 | and billions of dollars worth of hardware. I'll end on this, and
00:26:45.320 | Chamath, I'll get your take on it. Here are the 2019 top companies by
00:26:50.040 | market cap in the world: obviously, Microsoft, Apple,
00:26:53.200 | Amazon, Google, Berkshire, Facebook, Alibaba, Tencent, and
00:26:57.320 | then you get some of the, you know, incumbents and legacy
00:27:00.640 | companies: J&J, Exxon, JP Morgan, Visa. Way down on the
00:27:05.160 | list in 2019, at number 84, was Nvidia. Today, Nvidia is the
00:27:10.240 | third largest company by market cap, behind Microsoft and Apple
00:27:13.560 | and ahead of Google, aka Alphabet, and Saudi Aramco.
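As a quick sanity check on the growth figures quoted above ($26B quarterly revenue, up 18% quarter over quarter and 260% year over year), the implied prior quarters follow from simple division; this sketch uses only the numbers from the discussion:

```python
# Back-of-the-envelope check on Nvidia's fiscal Q1 figures as quoted
# in the discussion; all values are approximate.
revenue = 26.0        # $B, fiscal Q1 revenue
qoq_growth = 0.18     # up 18% quarter over quarter
yoy_growth = 2.60     # up 260% year over year

prior_quarter = revenue / (1 + qoq_growth)      # implied previous quarter
year_ago_quarter = revenue / (1 + yoy_growth)   # implied same quarter last year

print(f"Prior quarter:    ${prior_quarter:.1f}B")     # ~$22.0B
print(f"Year-ago quarter: ${year_ago_quarter:.1f}B")  # ~$7.2B
# 3.6x on the year-ago quarter, i.e. "basically quadrupled"
print(f"Multiple on a year ago: {revenue / year_ago_quarter:.1f}x")
```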
00:27:19.240 | Chamath, what's your take on this? Will it continue? And how
00:27:24.120 | do you conceptualize this level of growth on such a big number?
00:27:28.520 | I mean, I think it's a really, really incredibly fun moment if
00:27:33.440 | you're involved in anything AI related, just because it shows
00:27:39.160 | the level of investment that Nvidia's customers are making
00:27:43.120 | into making this new reality available for everybody, right?
00:27:49.800 | So when you're spending effectively $100 billion a year
00:27:53.200 | on the capex of chips, and then a couple hundred billion more on
00:27:57.000 | all the related infrastructure, and then another couple hundred
00:27:59.800 | billion more on power, you're talking about half a trillion to
00:28:04.520 | three quarters of a trillion dollars a year being spent to
00:28:08.120 | bring AI forward to the masses. So I think that's the really
00:28:11.080 | positive take. The other exciting thing is, if you're on
00:28:15.640 | the other side of the Nvidia trade, which is you're working
00:28:18.880 | on something that does what they do, cheaper, faster or better.
00:28:24.040 | It's also really exciting, because at some point, the laws
00:28:27.520 | of capitalism kick in, right, we've talked about this, when
00:28:30.040 | you are over earning so massively, the rational thing to
00:28:34.200 | do for other actors in the arena is to come and attack that
00:28:37.520 | margin, and give it to people for slightly cheaper, slightly
00:28:41.720 | faster, slightly better. So you can take share. Yes. So I think
00:28:45.240 | what you're seeing, and what you'll see even more now is this
00:28:48.200 | incentive for Silicon Valley, who has been really reticent to
00:28:51.560 | put money into chips, really reticent to put money into
00:28:54.880 | hardware, they're going to get pulled into investing in this
00:28:58.240 | space, because there's no choice. You have a company that
00:29:01.480 | went from $100 billion in market cap to two and a half trillion
00:29:05.160 | in four years. There's just too much value there to
00:29:09.960 | then be leaked back. You know, the interesting thing to
00:29:12.200 | remember, during the PC revolution, which is really
00:29:15.400 | mostly the 90s, right? It ended in the late 90s, I would say
00:29:18.880 | like '98, '99, right before the dotcom bubble took over. Intel's
00:29:22.600 | peak market cap, I think, got to about $200 billion. And
00:29:27.480 | then their average growth rate from 1998 to today was negative
00:29:32.680 | 1.4% a year. Right? So it went from about 200 billion to about
00:29:36.600 | 130-odd billion. And why? It's not that Intel was a worse
00:29:41.400 | company. But it's that everything else caught up. And
00:29:45.560 | the economic value went to things that sat above them in
00:29:49.280 | the stack, then it went to Cisco for a while, right? Then after
00:29:52.800 | Cisco, it went to the browser companies for a little bit, then
00:29:56.000 | it went to the app companies, then it went to the device
00:29:58.320 | companies, then it went to the mobile companies. So you see this
00:30:01.760 | natural tendency for value to push up the stack over time. So
00:30:06.600 | it seems to me that we've done step one, which is you've
00:30:09.400 | given all this value to Nvidia, and now we're going to
00:30:12.400 | see it being reallocated.
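The Intel figure above can be sanity-checked with a quick compound-annual-growth-rate calculation; the ~$200B peak and ~$130B values are from the conversation, and the 26-year span (1998 to 2024) is an assumption:

```python
# Implied CAGR for the Intel anecdote: ~$200B peak (late 1998) to
# ~$130B today. Dollar figures are as quoted in the discussion;
# the 26-year span is an assumption.
peak_cap = 200.0   # $B, approximate peak market cap
today_cap = 130.0  # $B, approximate current market cap
years = 26         # assumed span, 1998 -> 2024

cagr = (today_cap / peak_cap) ** (1 / years) - 1
# ~ -1.6% a year, in the ballpark of the quoted -1.4%
print(f"Implied CAGR: {cagr:.2%}")
```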
00:30:13.600 | So Chamath, who's in the arena trying some of these
00:30:17.320 | things? Well, everybody's working on it. Yeah.
00:30:19.600 | So right now, what you do is you speculatively bet on anything
00:30:24.440 | that, quote unquote, rhymes with Nvidia. So AMD is
00:30:28.160 | ripping. The companies that make HBM are ripping. So all of that
00:30:33.640 | stuff. The folks that make optical cables, this Japanese
00:30:36.520 | company that I found that makes the high bandwidth optical
00:30:39.680 | cables, is ripping. So anything related to that
00:30:43.480 | ecosystem right now is at all-time highs. But at the same
00:30:48.000 | time, what you find now is, like every other day, when you wake
00:30:51.680 | up and read the trades in Techland, you find that there's
00:30:55.640 | a company that's gotten seeded with five to 50 million bucks to
00:30:59.120 | create a new chip, right? You're also starting to see folks that
00:31:02.760 | are working a little bit above them in the stack, building better
00:31:05.160 | compilers, right, things that will allow you to actually build
00:31:09.840 | once and run in many different compute environments. So all of
00:31:14.040 | this stuff is starting to happen. At some point, the spread
00:31:17.720 | trade will be that Nvidia loses share to these upstarts, even
00:31:22.520 | though revenues keep compounding. Yeah, death by 1000 startups.
00:31:26.880 | All right, so I guess one of the questions people are asking
00:31:30.400 | right now is, have we ever seen a company at this scale? And the
00:31:35.000 | impact it's having is not just in technology, which Chamath just
00:31:37.360 | pointed out beautifully; it's also having a huge impact
00:31:40.320 | on Wall Street, on the stock market, on finance.
00:31:43.520 | Well, the company that everyone compares Nvidia to, or
00:31:50.280 | asks whether a historical comparison should be
00:31:53.400 | made to, is Cisco. There's an article in Motley Fool asking, is
00:31:57.840 | Nvidia doomed to be the next Cisco? There was one in Morning
00:32:01.800 | Star called "Nvidia 2023 versus Cisco 1999: Will history repeat
00:32:06.000 | itself?" The reason they're asking these questions is that
00:32:09.360 | if you go back to the dot-com boom in 1999 and pull up the stock
00:32:14.880 | performance chart, you can see that Cisco had this incredible
00:32:18.200 | run. And if you overlay the stock price of Nvidia, it seems
00:32:24.440 | to be following that same trajectory. And what happened
00:32:26.840 | with Cisco is that when the dot-com crash came in 2000, Cisco's stock
00:32:30.840 | lost a huge part of its value. Obviously, Cisco is still around
00:32:33.000 | today. It's a valuable company, but it just hasn't ever regained
00:32:36.880 | the type of market cap it had. The reason this happened is
00:32:39.600 | because Cisco got commoditized. So to Chamath's point, the
00:32:43.000 | success and market cap of that company attracted a whole bunch
00:32:45.680 | of new entrants. And they copied Cisco's products until they were
00:32:48.560 | total commodities. So the question is, will that happen to
00:32:52.400 | Nvidia? Yeah. And I think the difference here is that, at the
00:32:56.800 | end of the day, networking equipment, which Cisco produced,
00:32:59.720 | was pretty easy, pretty one-dimensional: just
00:33:03.320 | moving data around. Yeah, right. Whereas if you look at
00:33:06.040 | Nvidia, these GPU cores are really complicated to make.
00:33:11.040 | And Jensen makes this point that the H100, for example, has
00:33:15.920 | thousands of components, and it weighs like 70 pounds or
00:33:19.880 | something like that. I mean, that's like a giant oven. I
00:33:22.080 | mean, it's like a mainframe. It's not just like a
00:33:24.160 | little chip. So it's a much more complicated product to copy. And
00:33:28.480 | then on top of that, they're already in the R&D cycle for the
00:33:31.960 | next chip, right? Whatever is gonna be the H200, or whatever
00:33:34.640 | it is. And so as people try to catch up with the H100, they're
00:33:39.480 | gonna be on to the H200. So I think you can make the case that
00:33:42.840 | Nvidia has a much better moat than Cisco. And by the way,
00:33:47.040 | on the Cisco comparison, just to finish the thought, people
00:33:50.400 | were making this comparison six months ago. And what's happened
00:33:53.920 | since then? Nvidia has had two blowout quarters. Yes. And the competitors
00:33:57.040 | don't seem to be that much closer, maybe a little bit
00:33:58.880 | closer. So, you know, look, I think it's an open question.
00:34:03.600 | So there's a counter here, Freeberg, which is, obviously, if
00:34:07.000 | you follow the Cisco analogy, one of the things that also
00:34:10.440 | sunk Cisco was that once people had bought all that capacity, there
00:34:14.080 | was no need: there was no file size so great that it
00:34:18.080 | couldn't be moved easily around the internet. You know, you make
00:34:21.720 | movies HD, super HD, 2K, 4K; the bet was we'd created too much
00:34:27.160 | bandwidth and there was no use for it. So I guess that's a
00:34:29.840 | counterargument: maybe if we build up too much capacity,
00:34:33.880 | Nvidia gets hit not by competitors, but just by the
00:34:37.000 | buildout being enough. So what's your take on that counter
00:34:40.080 | argument? Yeah,
00:34:41.680 | I think that the Cisco analogy, it's a pretty different
00:34:44.880 | situation, because Cisco evolved the business to become much more
00:34:47.880 | enterprise centric. And they were able to run an M&A process
00:34:52.720 | like we see with enterprise software, where they could
00:34:55.280 | acquire and roll up lots of different product companies and
00:34:59.480 | sell into their enterprise channel. So do a lot of cross
00:35:01.880 | selling. Nvidia is not a super acquisitive business. And it
00:35:05.640 | doesn't make as much sense because they're selling much
00:35:07.840 | more kind of infrastructure tools, whereas Cisco moved
00:35:10.440 | really high up in the in the enterprise stack, they were
00:35:13.040 | selling stuff into office buildings, they were selling
00:35:16.120 | software, they did acquisitions to kind of fully integrate,
00:35:19.040 | they had a very diverse set of products that they were selling
00:35:21.520 | through an enterprise channel, further up the value stack, and
00:35:25.080 | a pretty distributed customer base with no serious
00:35:27.640 | concentration. Even though they did sell a lot into data
00:35:30.120 | centers, they were also selling to telcos, they were selling to
00:35:32.760 | enterprises, they were selling to governments and so on. If you
00:35:35.400 | look at Nvidia's revenue, they did $26 billion of total revenue
00:35:38.440 | in the quarter, 22 billion of which was data center. And about
00:35:42.040 | 40% of that was from the top four hyperscalers. So a full
00:35:47.000 | one third of Nvidia's revenue in the quarter came from, I believe,
00:35:50.880 | Google, Amazon, Microsoft and Meta. And so between those
00:35:55.200 | four businesses, you know that those companies each have, I
00:35:59.080 | believe, at or close to $100 billion of cash sitting
00:36:03.720 | on their balance sheet. They can't find great places to
00:36:06.480 | invest that cash to grow revenue. And so they've
00:36:08.960 | rationalized the idea that they will make capex investments
00:36:11.920 | to build over the next five to 10 years. And this is where that
00:36:14.720 | money flows.
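The "one third" figure follows from the numbers quoted above; a minimal check using only the figures from the discussion:

```python
# Revenue-concentration arithmetic from the quoted figures.
total_revenue = 26.0      # $B, Nvidia's quarterly total revenue
data_center = 22.0        # $B, the data center segment
hyperscaler_share = 0.40  # top four hyperscalers' share of data center

hyperscaler_revenue = data_center * hyperscaler_share  # $8.8B
share_of_total = hyperscaler_revenue / total_revenue
# ~34% of total revenue, i.e. roughly one third
print(f"Top-4 hyperscalers: ${hyperscaler_revenue:.1f}B "
      f"({share_of_total:.0%} of total revenue)")
```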
00:36:15.400 | Yeah, we talked about that on a previous episode. There's
00:36:17.240 | no M&A, to your point; the US is not gonna let you buy
00:36:19.800 | stuff, and the UK is not gonna let you buy stuff.
00:36:21.920 | So I think that they're gonna have less maneuvering
00:36:23.960 | capability than Cisco had in the future. And obviously, there's
00:36:28.040 | this deep concentration risk, which is going to be deeply
00:36:30.400 | challenging.
00:36:30.920 | I think Nvidia, and this is to build on Sacks's point, is going to get
00:36:35.440 | pulled into competing directly with the hyperscalers. If you
00:36:38.920 | were just selling chips, you probably wouldn't. But Sacks is
00:36:42.840 | right: these are big, bulky, actual machines. Then all
00:36:48.160 | of a sudden, you're like, well, why don't I just create my own
00:36:50.360 | physical plant and just stack these things and create racks
00:36:54.080 | and racks of these machines and go head to head with an AWS instead
00:36:57.000 | of selling to them. It's not a far stretch, especially because
00:36:59.680 | Nvidia actually has the software interface that everybody uses,
00:37:04.600 | which is CUDA. So I think it's likely that Nvidia goes on
00:37:08.880 | a full frontal assault against GCP and Amazon and Microsoft,
00:37:12.640 | that's going to really complicate the relationship that
00:37:15.960 | those folks have with each other. But I think it's
00:37:18.520 | inevitable, because, how do you defend? It's kind of
00:37:21.440 | the Apple problem. How do you defend an enormously large
00:37:25.320 | market cap? You're forced to go into businesses that are equally
00:37:29.120 | lucrative. Now, if I look inside of compute, and look at the
00:37:33.560 | adjacent categories, they're not going to all of a sudden start a
00:37:36.040 | competitor to TikTok, right, or a social network. But if you
00:37:40.080 | look at the multi-$100 billion revenue businesses that are
00:37:42.680 | adjacent to the markets that Nvidia enables, the most obvious
00:37:46.360 | one is the hyperscalers, which are multi-$100 billion revenue
00:37:49.240 | businesses. So they're going to be forced to compete. Otherwise,
00:37:53.720 | your market cap will shrink, and I don't think they want that.
00:37:57.240 | And then it's going to create a very complicated set of
00:37:59.960 | incentives for Microsoft and Google and Meta and Apple and
00:38:04.880 | all the rest. And that's also then going to be an accelerant:
00:38:07.600 | they're going to pump so much money to help all of these
00:38:10.640 | upstarts, to your point, Jason, chip away and nip at the
00:38:15.000 | Achilles' heel of Nvidia until they fall.
00:38:18.080 | Yeah, and there's a great precedent, or clues, for what you're
00:38:21.280 | saying: Amazon is making chips, Google is making
00:38:25.400 | chips, Apple is making chips, Tesla's making chips. Yeah.
00:38:29.000 | Everybody's making their own chips, and they got rid of Intel. And so
00:38:33.000 | this is how it's going to go. Your margin is my opportunity.
00:38:36.880 | And you know, with all this market cap increase, the good
00:38:39.720 | news is, it's just been reported that Jensen has bought a second
00:38:42.920 | leather jacket. So this market has enabled him to expand
00:38:47.440 | the wardrobe a bit.
00:38:48.280 | I gotta say, he looks really good. He looks super fit. Great.
00:38:50.840 | You know, he's, what,
00:38:51.600 | in his late 50s? He's going into his Harrison Ford era. He
00:38:56.200 | looks great.
00:38:58.800 | He's 61. He looks amazing. I'll tell you
00:39:04.920 | something: in the zombie apocalypse draft, I'm picking
00:39:07.160 | him. That guy seems crafty. You know,
00:39:09.400 | But one thing he doesn't have: plastic balls. No, no, no,
00:39:15.880 | his balls aren't plastic.
00:39:16.880 | No, no, his are brass. He's literally got brass
00:39:22.920 | balls. Oh, all of our balls have plastic. Okay, well, we have two
00:39:27.720 | choices here. We can go with another tech story, or we can go
00:39:31.400 | directly to science corner. And
00:39:32.840 | Well, no, I think you segue to the economy and then do science
00:39:34.920 | corner.
00:39:35.320 | Okay, so you want to talk about our pocketbooks. And then we'll
00:39:38.520 | talk about our pockets. And then we'll just make a quick detour
00:39:41.040 | to the right and then talk about our balls. We're in the same
00:39:43.440 | vicinity. So
00:39:44.160 | Then there's our balls. Okay, great. Yeah, let's end
00:39:47.520 | with our balls. That's science.
00:39:48.440 | Or start with the balls? I don't know. Everybody's got different
00:39:52.440 | kinds of vibes here.
00:39:54.680 | The ballplay should come a little bit later. And the pro...
00:39:57.920 | You want to save the ballplay for later, Sacks?
00:39:59.720 | Having a hard time keeping this together.
00:40:04.640 | More than half of Americans think we're in a recession. I
00:40:09.720 | think we're in a vibecession right now, because we're not in a
00:40:12.360 | recession, but people are feeling really bad. A Harris poll
00:40:16.360 | conducted for the Guardian shows 56% of Americans
00:40:21.560 | wrongly believe the US is in a recession. Not surprisingly,
00:40:24.840 | they blame Biden. The poll highlighted a bunch of
00:40:27.920 | misconceptions: 55% believe the US economy is shrinking; it's
00:40:32.080 | obviously not. 56% think the US is experiencing a recession; it's
00:40:36.000 | obviously not. 49% of people believe the S&P 500 stock
00:40:40.280 | market index is down for the year; it's up 12% this year and was up
00:40:43.160 | 24% in 2023. And 49% believe that unemployment is at a
00:40:48.880 | 50-year high, when it's in fact at a 50-year low. And Americans are
00:40:53.920 | really concerned about the cost of living and inflation. Fair
00:40:56.760 | enough: 70% said the biggest economic concern was the cost of
00:41:00.200 | living; 68% said that inflation was the biggest economic
00:41:04.800 | concern. Important quote here: a majority of respondents agreed
00:41:08.480 | "it's difficult to be happy about positive economic news when I
00:41:11.160 | feel financially squeezed each month" and that the economy was
00:41:14.960 | worse than the media made it out to be. According to the poll,
00:41:17.760 | 70% of Republicans and 40% of Democrats think Biden is making
00:41:21.480 | the economy worse. Chamath, you have some thoughts on this?
00:41:25.280 | I'm going to go out on a limb and speculate that a lot of the
00:41:30.320 | big numbers that we use to gauge how we should feel about things
00:41:35.960 | in today's day and age are pretty brittle. Okay, explain.
00:41:41.640 | They're fragile and may actually just be totally wrong. So what's
00:41:46.080 | an example? So for example, if you look at something
00:41:49.160 | like non-farm payrolls, right: the first Friday of every
00:41:52.800 | month, you get this report that comes out from the Department of
00:41:55.720 | Labor, and it shows where unemployment is. But how do they
00:41:59.840 | calculate that? Do you think that they have a real-time sense
00:42:02.840 | of exactly every person in that month that entered the workforce
00:42:05.960 | or exited the workforce? No, they do a survey and then they
00:42:09.080 | extrapolate. And if you do that survey incorrectly, and Jason,
00:42:12.920 | you've commented on this before, for example, if you don't
00:42:15.200 | capture adequately, the number of people that are on the
00:42:17.920 | sidelines and never joined the workforce, or the number of
00:42:20.560 | people that are part of the gig economy, so they are kind of
00:42:23.080 | working, you get an inaccurate sense of where the real economy
00:42:26.760 | is. I think that GDP is somewhat similar. Because if you just
00:42:31.960 | break down what GDP is (Nick, there's a very simple pie chart
00:42:35.720 | I sent to you), what is GDP? It's the sum of four things. Most of
00:42:41.440 | it is what people spend. Okay, then the next big chunk is what
00:42:46.480 | companies and governments spend. And then the last is what we
00:42:52.320 | export to other countries. So let's just pause for one second
00:42:55.080 | and think: what do you think happens when rates are zero
00:42:58.280 | versus when rates are at 6%? People spend a lot more. Well,
00:43:02.000 | people tend to save when interest rates are high.
00:43:05.640 | Just the natural thing: like, why would I buy a pair of these
00:43:08.600 | Nike shoes? I'll just put it in the bank, I get 6%. But when the
00:43:12.160 | bank pays you zero, you're like, ah, let me buy these Air Force
00:43:15.040 | Ones and move on, right? It turns out it's the same for
00:43:18.160 | companies: companies find it easier to invest when rates are
00:43:21.400 | at zero, because it's cheaper. It's much more expensive when
00:43:24.040 | they're borrowing money at 6% versus at 0%, or more, if a
00:43:27.400 | corporate gets charged a higher rate, right? Yeah. Then when you
00:43:30.640 | have high interest rates, you have a currency that
00:43:34.760 | appreciates, which makes exports less attractive to other people,
00:43:39.400 | which means then you become a net importer. Okay, so what is the
00:43:42.800 | last thing that's left? The last thing that's left is government
00:43:45.040 | spending. And you have to ask the question, what should
00:43:47.360 | governments do when rates are high? There was a chart I
00:43:52.880 | published in my annual letter, if you just go to that for a
00:43:55.680 | second.
00:43:56.080 | And just going back to the chart right before it, just so
00:43:58.600 | that people who are listening know (put the pie chart in there):
00:44:00.720 | important for people to know, the consumer is about 70% of the
00:44:04.240 | economy. And if you put investment and government
00:44:06.880 | together, that's just over 34% or 35%. So it is a
00:44:13.160 | consumer-driven economy. But hey, you know, corporate and
00:44:15.800 | government spending is a major piece as well.
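The breakdown being described is the standard expenditure identity, GDP = C + I + G + (X - M). A minimal sketch; only the ~70% consumption figure comes from the discussion, and the other component shares below are rough illustrative placeholders, not official data:

```python
# The expenditure approach to GDP: GDP = C + I + G + (X - M).
# Shares are illustrative placeholders except the ~70% consumer
# figure mentioned in the discussion.
consumption = 0.68   # C: what people spend
investment  = 0.18   # I: what companies spend (capex, etc.)
government  = 0.17   # G: what governments spend
net_exports = -0.03  # X - M: negative when you're a net importer

gdp_share = consumption + investment + government + net_exports
print(f"Components sum to {gdp_share:.0%} of GDP")  # 100% by construction
```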
00:44:17.520 | And then I just wanted just to highlight that when interest
00:44:22.680 | rates are very high, all of a sudden, governments are faced
00:44:26.520 | with this very difficult problem, which is, oh, man, I
00:44:28.680 | have to spend a ton of money on interest, just like if you had a
00:44:31.120 | bunch of credit cards, and all of a sudden, the interest rates
00:44:33.120 | went up. So the choice is twofold. Do governments spend
00:44:38.280 | less? But unfortunately, it turns out that our governments
00:44:43.320 | in America just keep spending more and more. So whether
00:44:46.720 | net interest expense is small or net interest expense is
00:44:49.600 | high, they're just like, forget it, the taps are on. So what
00:44:52.720 | does this all mean? I think what it really means is that we do a
00:44:55.120 | very poor job of measuring all these dynamics together. And so
00:44:59.280 | I actually trust the survey data of these individuals more than I
00:45:04.800 | trust the GDP report, in the sense that I think it more
00:45:08.160 | accurately captures this dynamic: rates are at 6%, people are
00:45:12.160 | saving more, they're not getting paid more, things are
00:45:15.440 | costing more, the government is giving you free money. So you
00:45:18.600 | kind of feel like everything is moving. So the GDP
00:45:21.880 | measurement, the way that it's classically done, shows that,
00:45:24.800 | wow, we grew at 3 or 4%. But the average individual American
00:45:29.120 | isn't feeling that; they're actually feeling that they have
00:45:31.680 | less money. So I would actually go with them and say that
00:45:36.280 | if we don't revisit this thing from first principles, we're
00:45:39.920 | going to get this dynamic where we think one thing is happening
00:45:44.640 | but the exact opposite is happening. In this case, I do
00:45:47.200 | think we're in a quasi-synthetic recession.
00:45:50.160 | Saks, what's your take on the vibe session?
00:45:53.800 | Well, look, I tend to agree with Chamath on this. I think
00:45:56.800 | this is a classic story of who do you believe? Do you believe
00:46:00.240 | the experts? Or do you believe in the intuitions of the
00:46:02.680 | American people? And the experts have some statistics on their
00:46:07.600 | side. But you know, the old saying goes, there's lies, damn
00:46:10.600 | lies and statistics. And then the American people have their
00:46:14.040 | actual lived experience on their side, they know what they're
00:46:18.000 | feeling. And I tend to trust in that. And obviously, we're in an
00:46:21.920 | election year, and the press knows that. So they're trying to
00:46:24.040 | do this big cleanup effort for Biden. But why is it that people
00:46:27.560 | are feeling this way? Number one is inflation. And if you look at
00:46:30.320 | this chart, you can see that if you look at household net worth
00:46:35.880 | since the start of the Biden presidency and compare it to the
00:46:39.960 | change in household net worth at a similar point in Trump's
00:46:43.080 | presidency, in nominal terms, it appears to be the same. But then
00:46:46.920 | if you adjust for inflation, in other words, you look at the
00:46:49.160 | real household net worth, you can see that household net worth
00:46:53.400 | during the Biden term has been flat, actually down, because of
00:46:57.520 | inflation, right? Because of inflation. Where did the
00:46:59.880 | inflation come from? Yeah, well,
00:47:03.800 | Larry Summers warned in the first quarter of the Biden
00:47:06.680 | administration that if you passed an unnecessary $2 trillion of
00:47:11.980 | COVID stimulus, you would produce inflation. The inflation
00:47:15.720 | rate when Biden came into office was 1.7%. We had a rip-roaring
00:47:19.040 | economy, but he started stimulating, and we talked about
00:47:21.920 | Bidenomics, this new policy of pumping trillions of dollars
00:47:26.480 | of stimulus into a healthy economy, which we've never done
00:47:29.080 | before. What happened? Inflation went all the way to 9%. So
00:47:33.200 | people's wages have not kept up with the rate of inflation. This
00:47:36.080 | is why they feel worse off. When you actually look at purchasing
00:47:39.220 | power, people are worse off in terms of their actual ability to
00:47:42.520 | buy things. Their purchasing power has gone down, wages may
00:47:46.000 | have gone up a little bit, but they have not gone up as much as
00:47:47.960 | inflation. So people feel worse off. Now, Larry also had that, I
00:47:53.720 | think, really informative study showing that inflation would
00:47:58.040 | have peaked at 18% if you include cost of borrowing. So
00:48:00.620 | again, to Chamath's point, if you're trying to get a mortgage and
00:48:04.040 | you're paying seven and a half, 8%, you feel way worse off.
00:48:07.080 | If you need to buy a car and make a car payment, you feel
00:48:09.880 | much worse off. If you've got credit card debt, which has now
00:48:12.840 | hit an all-time record of something like 1.1 trillion,
00:48:16.280 | your credit card rates have never been higher. So the
00:48:19.160 | average American feels worse off because cost of borrowing has a
00:48:21.880 | huge impact on their household finances. And that's why, if you
00:48:25.400 | read one of the last paragraphs in that story that
00:48:28.080 | you referred to, they use the key words: the consumer feels
00:48:31.320 | squeezed, the average household feels squeezed. They may not
00:48:34.480 | have lost their job yet, but they've lost purchasing power,
00:48:38.280 | and they're under-earning. They're under-earning.
00:48:41.120 | And so the problem is obvious. And you know, the
00:48:44.480 | press can gaslight us all day long about how wonderful things
00:48:47.240 | are under Biden. But the average American, I think, understands
00:48:51.040 | differently, based on their own experience.
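The nominal-versus-real adjustment being described is just deflating nominal growth by cumulative inflation over the same period. A minimal sketch with hypothetical numbers chosen only to illustrate the mechanics, not taken from the chart being discussed:

```python
# Deflating nominal growth by cumulative inflation to get real growth.
# Both numbers below are hypothetical illustrations, not chart data.
nominal_growth = 0.19        # net worth up 19% in nominal terms
cumulative_inflation = 0.19  # prices up 19% over the same period

real_growth = (1 + nominal_growth) / (1 + cumulative_inflation) - 1
# 0.0%: flat once inflation is removed, which is the point being made
print(f"Real growth: {real_growth:.1%}")
```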
00:48:53.160 | I think the whole thing comes down to the projection by an
00:48:56.240 | individual or a household of their lived experience onto the
00:49:00.160 | economy. You assume that because you're having a tough time, the
00:49:03.440 | economy is bad. And the economy, as a definition for them,
00:49:08.000 | is: how do I earn and how do I spend? And if I'm under-earning,
00:49:12.600 | that means there must be serious job loss. And things are more
00:49:16.680 | expensive, and my ability to purchase isn't improving. And so
00:49:20.760 | I think we're all kind of going to end up on the same take on
00:49:23.200 | this one. I mean, Nick, if you want to pull this image up, this
00:49:25.680 | is, I think, a helpful one: disposable personal
00:49:28.440 | income relative to outlays. Folks are needing to spend more
00:49:32.480 | than they're making, clearly indicating that they're feeling
00:49:36.960 | like they're under earning. So the projection of that is the
00:49:40.000 | economy is bad, without recognizing that it is an
00:49:42.520 | inflationary experience, whereas economists use the definition of
00:49:46.840 | quote, economic growth, being gross production, gross product.
00:49:50.680 | And so if gross product or gross revenue is going up, they're
00:49:54.040 | like, Oh, the economy is healthy, we're growing. But the
00:49:56.400 | truth is, we're funding that growth with leverage at the
00:50:00.320 | national level, the federal level, and at the household and
00:50:03.560 | domestic level, we are borrowing money to inflate the revenue
00:50:08.920 | numbers. And so the GDP goes up, but the debt is going up higher.
00:50:14.040 | And so the ability for folks to support themselves and buy
00:50:17.280 | things that they want to buy, and continue to improve their
00:50:20.520 | condition in life has declined; things are getting worse. And
00:50:24.760 | if you go to the next image, as Saks pointed out already, here's
00:50:27.800 | the image of total outstanding credit card debt, over a
00:50:31.400 | trillion dollars, it's totally spiked. And it's going to
00:50:34.040 | continue to spike, just like federal debt because of the next
00:50:37.120 | chart, which is the sudden jump in interest rates. So we've seen
00:50:41.160 | credit card interest rates jump from 12%, on average, 10 years
00:50:46.120 | ago to 21.19% right now, and it was at 14% at the end of 2022.
00:50:53.320 | So we've gone from 14% average credit card interest rates to
00:50:56.680 | 22% now, in just about 20 to 24 months. And so the
00:51:03.280 | projection that the quote-unquote economy must be bad
00:51:06.680 | results from the fact that income to spending is actually
00:51:11.040 | pretty negative. So here's the real median family income. This
00:51:14.840 | actually only goes through 2019. So it doesn't even capture the
00:51:17.440 | era, the era that we're talking about. But this has been going
00:51:19.840 | on for quite some time, that the average American's ability to
00:51:23.680 | improve their condition has largely been driven by their
00:51:27.240 | ability to borrow, not by their earnings. And this has created a
00:51:31.120 | substantial set of precedents that we're now running into a
00:51:34.280 | wall with interest rates spiking and inflation hitting us because
00:51:37.680 | of the overall federal debt that we've taken on.
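The rate jump described above can be put in rough dollar terms with a quick sketch. This assumes the round numbers quoted in the discussion (a roughly $1 trillion balance, ~14% average APR at the end of 2022 vs ~21% now) and uses simple interest on a static balance, an obvious simplification since real balances revolve and compound monthly; `annual_interest` is just an illustrative helper, not anyone's actual model.

```python
# Back-of-the-envelope: extra annual interest carried on ~$1 trillion of
# outstanding credit card debt when the average APR moves from ~14% to ~21%.
# Simple interest on a static balance -- a deliberate simplification.

def annual_interest(balance: float, apr: float) -> float:
    """Interest carried for one year at a flat APR."""
    return balance * apr

BALANCE = 1.0e12  # ~$1 trillion outstanding, per the chart discussed

cost_then = annual_interest(BALANCE, 0.14)  # end of 2022
cost_now = annual_interest(BALANCE, 0.21)   # today

print(f"extra interest per year: ${cost_now - cost_then:,.0f}")
```

On these assumed figures the rate move alone adds on the order of $70 billion a year in interest carried by households.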
00:51:39.480 | I think we're probably going to go around the horn and all agree.
00:51:42.160 | Obviously, the crazy spending started in the Trump and COVID
00:51:46.200 | era. And that caused a lot of the inflation as well, just to
00:51:48.480 | be fair to two administrations that are just out of control
00:51:52.040 | with spending. But the way I look at this is the Mickey D
00:51:54.680 | economy. People may not know this, but 96% of Americans eat
00:51:59.880 | meals at least once a year at McDonald's; 8% of Americans eat
00:52:04.160 | at McDonald's on an average day. And when you look at the prices
00:52:07.680 | of McDonald's here, if we look at this image, this is
00:52:10.240 | incredible. This is unbelievable: medium French fries
00:52:14.080 | at McDonald's, $1.79 in 2019, and now $4.19. And then if we
00:52:18.960 | look at just McNuggets, gosh, $4.49 to $7.58, a 68% increase. McChicken,
00:52:25.400 | $1.29 to $3.89. And then here is a very interesting one. This is
00:52:29.840 | CPI versus McDonald's Big Mac prices. Take a look at that. As
00:52:34.240 | much as the consumer price index has surged. Big Macs have
00:52:37.600 | exceeded that. And so Americans are seeing this over and over
00:52:42.120 | again, when they go to McDonald's and other places. And
00:52:46.840 | that's what's causing the feeling.
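The percentage jumps being described can be reproduced from the menu prices cited (2019 price vs current price, e.g. medium fries at $1.79 then and $4.19 now); treat the dollar figures as approximate quotes from the discussion, not exact menu data. The "68%" quoted for McNuggets is roughly what the arithmetic gives.

```python
# Percent increases implied by the McDonald's prices quoted above
# (2019 price -> current price, as cited in the discussion; approximate).

PRICES_2019_VS_NOW = {
    "medium fries": (1.79, 4.19),
    "McNuggets": (4.49, 7.58),
    "McChicken": (1.29, 3.89),
}

def pct_increase(old: float, new: float) -> float:
    """Percent change from old to new."""
    return (new - old) / old * 100.0

for item, (old, new) in PRICES_2019_VS_NOW.items():
    print(f"{item}: ${old:.2f} -> ${new:.2f} (+{pct_increase(old, new):.0f}%)")
```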
00:52:48.520 | Go back to that McNugget chart. I just want to see, what is
00:52:52.840 | it, end of 2019? So this is basically, you'd want to look at
00:52:55.600 | the four year stock price, right? So like these guys have
00:52:58.320 | jacked up prices massively. If you look at what's happened to
00:53:02.040 | the stock, the stock is way up. They've been
00:53:04.360 | very motivated and rewarded as a company for just rewarding the
00:53:08.940 | shareholder and kind of screwing over the customer.
00:53:11.600 | If you own equities in McDonald's, and you're in the
00:53:13.880 | top third or half of maybe half of Americans who have equity
00:53:17.520 | exposure, you're feeling great. If you're on the bottom third or
00:53:20.720 | half, and you're buying at McDonald's, and you don't own
00:53:23.320 | equity in McDonald's, you feel terrible free break, you had
00:53:25.800 | some additional thoughts.
00:53:26.800 | Yeah, but remember what McDonald's and other fast food
00:53:30.240 | companies have said is that labor costs have climbed. Here's
00:53:33.120 | a chart on labor costs we can pull up: workers at Walmart and
00:53:37.300 | McDonald's have had pay increases. But this has really
00:53:41.280 | been to try and keep up with inflation, the inflation of
00:53:44.360 | other costs. So a lot of people will say, oh, they're price
00:53:47.360 | gouging, they're ripping off consumers to make profits for
00:53:50.100 | shareholders. But the truth is, the biggest component of running
00:53:52.960 | those restaurants is labor. And labor has gotten more expensive
00:53:57.040 | because the employees that work there have to earn enough to pay
00:54:01.040 | their bills and to afford their food. And this is the circular
00:54:04.640 | effect of inflation. It finds its way all through the economy,
00:54:08.000 | it filters down, and it eventually hits everyone.
00:54:10.720 | Look, fast food is a pretty competitive business. I mean, I
00:54:13.160 | think the reason why McDonald's is raising prices is because
00:54:15.960 | everyone else is raising prices. I mean, otherwise, they'd be
00:54:17.760 | losing share. And just look, go to the grocery store and look at
00:54:20.840 | the price of steak or chicken or whatever, or eggs. It's gone up
00:54:24.280 | tremendously over the last few years. I mean,
00:54:27.000 | let me ask you a candid question. When's the last time you were in
00:54:30.560 | a supermarket? Be honest, when's the last time?
00:54:34.760 | I'm just curious.
00:54:36.120 | It's not relevant. I mean, yeah, look, obviously, we know that
00:54:41.440 | the price of eggs doesn't affect me. Okay, great. I'm in a
00:54:43.800 | fortunate position. That's not what the topic is. The topic is,
00:54:46.440 | what is the impact on the American people? And why? Why do
00:54:49.960 | 70% of the people in that poll feel that we're in a recession,
00:54:53.640 | even though the experts tell us we're not? And I've explained
00:54:56.400 | it. Yeah. And the other thing is, there's a way to
00:54:59.600 | I like on this market, I take
00:55:02.520 | I cannot believe how expensive things I mean, some of the
00:55:06.440 | stuff I was just blown away how expensive it's
00:55:08.920 | look at the look at the package I sent you guys the Driscoll's
00:55:12.360 | super sweet strawberries
00:55:13.680 | I scoured the market to try to find them here in San Mateo.
00:55:17.880 | There's none left. They said they've come off the market.
00:55:20.200 | And I go to Sigona's
00:55:23.200 | every week. A because we like buying our own fruit. Yeah, but
00:55:27.640 | also because our kids like to do it. And they like to see what
00:55:30.280 | things cost and they like to pick stuff. But you know, this
00:55:33.320 | sweetest batch was seven bucks. For how much, by the way? Well,
00:55:37.200 | well, so here's what I'll tell you quite honestly, I'm like, I
00:55:40.440 | think that you need to put these guys on notice. Oh, on blast. The
00:55:46.680 | perfume, so the nose, like, it's incredible. Okay,
00:55:52.680 | the smell, the aroma, it's just absolutely incredible. Yeah.
00:55:56.760 | But to be totally honest with you, the mouthfeel and the
00:56:00.000 | sweetness is not what this label would imply. Yeah. What were you
00:56:05.240 | expecting in terms of mouthfeel? I was expecting, like, something
00:56:10.640 | juicier, more succulent.
00:56:13.480 | A question: do you have a sommelier for your
00:56:15.640 | fruit? Yeah, I mean, you seem like a real connoisseur here.
00:56:18.760 | What did you do?
00:56:19.360 | No, I don't. I've been texting for months. I am a connoisseur
00:56:27.040 | of fruit. Okay. It's very important to me. I like good
00:56:31.120 | fruit. So you know, my wife and I go and find good fruit for our
00:56:34.480 | family. And it's really expensive. And even this over
00:56:38.200 | promises and under-delivers. So Freeberg, you need to... if
00:56:40.920 | you can land a GMO strawberry, I'm gonna put it in my belly.
00:56:44.760 | Give me a GMO strawberry, Freeberg, twice as big, three times as
00:56:49.160 | sweet. Get in my belly.
00:56:51.480 | I think we should all go to Tokyo for a weekend and do some
00:56:54.960 | fruit tasting in Tokyo.
00:56:56.000 | You have to get on the Hokkaido strawberries. This is where it's
00:57:00.040 | at. You have no idea like what you can spend on strawberries.
00:57:03.760 | People are spending 10 bucks on a strawberry. It is bonkers.
00:57:06.880 | I bought $100 mango once in Tokyo.
00:57:09.360 | Yeah, incredible. You know, the other thing I think with these
00:57:12.800 | numbers is, if you're an economist, and you're like
00:57:15.520 | inflation has gone down, that means the rate of inflation has
00:57:20.520 | gone down from six or 7% down to 3%, or 2.9, or 3.1. That doesn't
00:57:24.680 | mean prices aren't still going up. And so the question is,
00:57:27.800 | if you want to
00:57:30.040 | replace GDP, there was a very good article in the Wall Street
00:57:33.880 | Journal a few weeks ago about how there's been a just a total
00:57:36.400 | breakdown between what these high-level numbers say, kind of what
00:57:39.640 | we've been saying and how Americans feel. And they
00:57:42.480 | introduced a different score. Well, they gave it publicity.
00:57:45.560 | It's not their score. But it's something that they call the
00:57:48.040 | core score. Okay. And what that does, it's, I'll just read it to
00:57:52.000 | you just so you can understand it. It's a county level index of
00:57:55.720 | well being using measures of economic security, economic
00:57:59.880 | opportunity, health and political voice. And so the
00:58:02.760 | lowest possible score is zero, the highest possible score is 10
00:58:06.000 | as it turns out, when they use this across every county in
00:58:08.880 | America, the distribution is basically as follows: the
00:58:12.600 | most, quote unquote, prosperous county is Falls Church,
00:58:16.240 | Virginia, which is 7.86 out of 10. And the lowest county is
00:58:21.280 | Jim Hogg County in Texas, which has a score of 2.25. So to the
00:58:26.320 | extent that you want to start to look at granular measures, this
00:58:29.080 | is one, I'm not going to advocate for it, but it's an
00:58:31.520 | example. But what do you notice in here, what I noticed is that
00:58:35.360 | there's a lot of patches of, like, meh to not good in most
00:58:41.040 | parts of America. And so other than a very few small pockets
00:58:45.320 | where people feel great, most of the country is sort of
00:58:50.200 | dissatisfied. And I think that that's a really important thing
00:58:53.160 | to internalize. Yeah, some of this is classic psychology, you
00:58:57.280 | know, people do focus on the negative, the media focuses on
00:59:00.120 | the negative. And then with social media, people are seeing
00:59:03.400 | lifestyles that are unattainable, just like they're
00:59:06.080 | seeing body types that are unattainable, because people are
00:59:08.400 | doing filters, you're also seeing people living a lifestyle
00:59:10.960 | that's unattainable. And then it makes people because their
00:59:14.320 | expectation of their life is so high, when they then subtract
00:59:18.560 | the reality of their life, they've got a deficit. And
00:59:20.640 | really, happiness is like, expectations minus reality
00:59:24.480 | equals happiness.
00:59:25.440 | Yeah, but you had that same dynamic when consumer sentiment
00:59:28.880 | was much higher in a previous administration, people are
00:59:30.880 | feeling much better about the economy. So that's a constant,
00:59:33.400 | you know, when people were having free money,
00:59:35.240 | You had Trump drop free money on people's heads. So
00:59:39.280 | your argument is that crazy spending...? No, no, I'm talking
00:59:42.280 | about the checks with his name on it. Yeah, when the economy, when the
00:59:45.840 | economy was down 33% year over year, but you would spend way
00:59:50.120 | too much money, right? You admit he spent both parties, both
00:59:53.000 | parties thought that we were headed for depression. And so we
00:59:55.440 | had a bipartisan stimulus bill during an actual crisis. Once
01:00:00.240 | the crisis was over, there was no need to keep spending.
01:00:03.000 | When Trump sent... Jason, Jason, let's be honest, he wasn't responsible.
01:00:09.920 | And once this once the seal has been broken, yeah, I think it's
01:00:13.240 | fair to say, both sides of the aisle now believe that they can
01:00:18.960 | give away an enormous amount of money. Absolutely. There's
01:00:23.000 | nobody that feels like they have a responsibility to stop. But I
01:00:26.960 | think I think what sax is, what sax is right in the sense that
01:00:29.640 | so if you take that as a constant, right, that there will
01:00:33.080 | always be handouts now of all kinds. Yep, there'll be
01:00:36.760 | different flavors, depending on whether it's a Republican or
01:00:39.000 | whether it's a Democrat, and it's all rationalized. It's all...
01:00:42.520 | But so then, my point is, this
01:00:45.680 | abstraction, though, doesn't solve what's happening now,
01:00:48.680 | because this free money is in the system. It's constantly in
01:00:51.800 | the system, it comes in different ways. So people should
01:00:55.080 | feel better. The fact that they don't in the face of this
01:00:57.640 | constant money train, yeah, I think is actually quite
01:01:00.640 | alarming. This is I think what the point is, which is we are
01:01:03.200 | economically, in a very complicated moment in the sense
01:01:06.640 | that there is no pandemic to blame. There's no economy
01:01:10.040 | that's totally shut down. In fact, there's an economy that
01:01:12.800 | seems to be moving, but leaving an enormous number of people
01:01:16.240 | behind. So however, it has been structured, just sitting here
01:01:19.560 | today in 2024. It's broken for more people than it's working
01:01:23.400 | for. Yeah,
01:01:24.600 | Trump, just to give facts, Trump will have spent 7.8
01:01:27.640 | trillion, Biden will spend slightly less like six point x
01:01:30.600 | trillion. At the end of these things, they're both going to
01:01:32.440 | have added
01:01:32.920 | you're missing the fact that Trump had 2020, when we had
01:01:37.600 | the COVID depression, or what could have been when the economy
01:01:41.560 | is down 30% year over year.
01:01:43.360 | Yeah, and a terribly timed tax break. By the way, did you guys
01:01:46.800 | see...
01:01:47.120 | Hold on a second. Let me just make this one point. Yeah, just
01:01:49.920 | because both parties have been irresponsible and spending
01:01:53.160 | doesn't mean that we can't make further judgments about who's
01:01:56.720 | been worse. Sure. What's happened in the Biden
01:01:59.440 | administration is just quantitatively worse.
01:02:01.720 | Well, no, quantitatively, Trump spent more. But you're saying
01:02:05.200 | qualitatively, Biden's is worse because he didn't need to.
01:02:07.760 | There was no crisis. Right? Okay. So one spent more one
01:02:11.680 | spent less, but one didn't need to. Look at Trump
01:02:15.160 | spending for COVID. Look at Trump spending before COVID. It
01:02:18.640 | was like a 5% bump on Obama spending.
01:02:21.840 | Well, the tax break was the one that accounted for a lot of
01:02:25.720 | Yeah, but anyway, we can sit here and debate Trump versus
01:02:28.840 | Biden. Let's talk about my balls, boys. Yeah, we're
01:02:31.440 | wasting time here, wasting time on Trump and Biden when we could
01:02:35.040 | be talking about my balls. If we're going to talk about our
01:02:38.320 | balls, we need to go to somebody who's an expert on our
01:02:43.280 | testicles. Freeberg. Let's go right to the corner. Let's talk
01:02:48.120 | about our balls. With the Sultan of science. There's been a study
01:02:53.360 | on phthalates, our balls, and plastic in our balls. Freeberg,
01:02:57.080 | tee this up.
01:02:57.600 | Yeah, Consumer Reports put out a really interesting, and
01:03:02.040 | now pretty widely covered, story a couple weeks
01:03:04.960 | ago, where they measured phthalates in common foods. And
01:03:08.640 | Nick, if you want to just pull up the image, that's been
01:03:10.880 | repeated in a lot of media, a lot of press people were going
01:03:13.240 | nuts over this. phthalates are these chemical compounds that
01:03:17.760 | are used with plastics to soften them. So
01:03:22.000 | when you make plastics, you can kind of, you know, make them
01:03:24.080 | softer and form them into all sorts of different shapes and
01:03:26.680 | use them for different applications like plastic bags
01:03:29.400 | or wraps or cubes or all sorts of things. And phthalates are
01:03:33.440 | these kind of smaller molecules that go along with
01:03:37.280 | the polymers that are the basis of the plastics. And they
01:03:40.640 | measured phthalates, which are known to be toxic, in terms of
01:03:44.280 | if you get enough of them, they can be carcinogenic and cause
01:03:46.680 | cancer. And they show that every product they tested had phthalates
01:03:51.760 | in it. Wendy's chicken nuggets had, you know, 33,000 nanograms
01:03:56.560 | per serving. If you scroll up to the top thing, wait, look at
01:03:59.800 | the Chipotle chicken burrito. Oh my god. And just to be clear,
01:04:03.600 | guys, this is not just about packaging, packaging plays a
01:04:07.320 | role. But the whole food supply chain, the way we wrap food, all
01:04:11.640 | of our water, all of our dust, all of the air we breathe, we
01:04:15.400 | have measured phthalates in everything. So these phthalates
01:04:18.840 | end up in the animals that are used to make milk and the
01:04:21.640 | animals that people eat, they end up in the water that goes
01:04:24.040 | into the vegetables that we grow in the ground, they end up being
01:04:27.280 | used to make the little plastic jars that we feed our kids out
01:04:30.040 | of the little yogurt pouches that our kids drink out of
01:04:32.800 | everything, all just to move food around in plastic packaging.
01:04:36.440 | Freeberg, why would the chicken nuggets from Wendy's,
01:04:39.600 | though, be so high? Is it because they are eating... They are eating
01:04:44.200 | things that have plastics in them, but we
01:04:46.920 | also don't know. And so we're eating the chicken, so we're
01:04:49.760 | eating the plastic. We're eating the plastic. And it's also the
01:04:52.120 | fact that the way that they process the chicken and the
01:04:54.080 | material that they use and the packaging that they use and how
01:04:56.520 | they move this stuff from one place to another, you got bags
01:04:59.240 | that are holding chicken breasts that then get put in the thing.
01:05:01.720 | Everything. And then the oil, you know, the oil is transported
01:05:05.720 | in plastic, we don't know. So it's plastic all the way down.
01:05:08.160 | It's guys look at this. I mean, these are the things like, look
01:05:11.640 | at Annie's. First of all, I really dislike Annie's labeling
01:05:15.360 | and packaging. I think it's very ugly. So I've never
01:05:18.120 | bought it for that reason. But I know that there's a lot of
01:05:21.160 | private equity moms that buy Annie's because it's supposed to
01:05:24.360 | be better.
01:05:24.800 | Keep going. This is a really good point. Yeah, I'm making a
01:05:29.800 | really good point. Go ahead, Chamath. It's really good. And
01:05:31.880 | so the problem is you see organic when you go like we
01:05:34.960 | again, we go to Draeger's. That's typically where we go. Sometimes
01:05:38.280 | we go to Whole Foods, but we go to Draeger's and Sigona's. So those are
01:05:40.440 | the places we go to in our neighborhood. And when you go
01:05:43.920 | and you look at these things like prepared meals, as an
01:05:46.080 | example, the thing that has always attracted me to Annie's
01:05:50.160 | is because it is positioned as it is cleaner and better for
01:05:53.440 | you. And you see everybody buying it. And what you actually
01:05:57.200 | see sitting on the shelf that's left over is actually the Chef
01:06:00.920 | Boyardee and the Campbell's. And I always thought to myself, I
01:06:04.680 | won't buy Annie's because I actually don't like the label.
01:06:06.600 | To be honest, that was really why. I just
01:06:09.840 | deeply dislike the packaging. But it turns out it's the
01:06:12.840 | actual worst for you. Yeah, it is. And let me just let me tell
01:06:16.840 | you guys some stats about this. So we produce about 3 million
01:06:19.760 | tons of phthalates a year that we use in our
01:06:22.280 | industrial supply chain. The global market for phthalates is
01:06:25.200 | about $10 billion per year. We find it everywhere in our tap
01:06:29.520 | water, as measured in the US in multiple places; there's about
01:06:34.120 | one microgram. So those food numbers were nanograms; you kind of
01:06:37.120 | divide by 1,000. So that Annie's thing has 50 micrograms
01:06:40.240 | of phthalates in it, but there's about one microgram of phthalates
01:06:43.600 | per liter of water that you're drinking. Now, here's a study
01:06:47.200 | done out of Germany, where they basically tried to
01:06:49.480 | estimate how much people were consuming. And on average,
01:06:53.280 | people consume or ingest about six micrograms of phthalates per
01:06:57.400 | kilogram of your body weight per day. So an adult male is
01:07:00.680 | consuming about 500 micrograms of phthalates per day; that's
01:07:04.720 | half a milligram per day. And the human body metabolizes and
01:07:09.040 | excretes it; it comes out. The EPA, all of the administrative
01:07:12.120 | agencies that oversee this stuff, they're like, it's okay,
01:07:15.080 | we metabolize it, as long as we don't consume more than we can
01:07:18.400 | metabolize, it's going to be safe, because it's not going to
01:07:21.120 | stay in our bodies, it's going to wash out. Here's the problem.
01:07:23.640 | While it's in your body, while it's moving through your body
01:07:26.800 | being metabolized, it is what's called an endocrine disruptor.
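The dose arithmetic a moment ago can be sanity-checked in a few lines. The 83 kg adult body weight below is an assumed figure, chosen only because it reproduces the ~500 microgram/day number quoted; the 6 microgram/kg/day rate is the one cited from the German study. Note the unit: 500 micrograms is half a milligram, not half a gram.

```python
# Checking the phthalate dose arithmetic: ~6 micrograms ingested per kg of
# body weight per day, scaled to an assumed adult body weight.

DOSE_UG_PER_KG_PER_DAY = 6.0
ASSUMED_BODY_WEIGHT_KG = 83.0  # assumption, not from the study

daily_ug = DOSE_UG_PER_KG_PER_DAY * ASSUMED_BODY_WEIGHT_KG  # micrograms/day
daily_mg = daily_ug / 1000.0  # micrograms -> milligrams

print(f"{daily_ug:.0f} ug/day = {daily_mg:.3f} mg/day")
```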
01:07:29.320 | And we talked about this in the past with respect to the
01:07:31.800 | sunscreens. These phthalates actually interfere with the
01:07:35.480 | hormones that are made by things like your pituitary gland, your
01:07:38.160 | thyroid, and even some of the hormones that are produced in
01:07:42.280 | testicle cells. There was another study done that really
01:07:46.640 | tried to estimate what the impact was. And here is a study
01:07:50.760 | that showed how do phthalates actually interact with different
01:07:54.880 | parts of the endocrine system. And they went through and they
01:07:58.440 | found all these places that biological hormones, and the
01:08:02.840 | endocrine system are disrupted by the phthalates. And we'll put
01:08:07.320 | credit for everyone that shared these papers here later. And
01:08:12.480 | they basically showed the mechanism by which the phthalates
01:08:16.040 | are actually disrupting endocrine systems. Now, what is
01:08:21.080 | the endocrine system, we talked about endocrine disruptors in
01:08:23.560 | the past, the endocrine system is the interaction of hormones
01:08:27.280 | with cells in your body produced by all these different glands in
01:08:30.200 | your body, like your thyroid, your pituitary gland, and so on,
01:08:32.720 | control things like growth, tissue development, reproductive
01:08:36.560 | tissue activity, like making sperm cells, autonomic function,
01:08:40.240 | like body temperature, blood pressure, sleep, heart rate
01:08:43.120 | regulation, injury and stress response, your mood, all of
01:08:46.320 | those things are regulated by your endocrine system. And so
01:08:48.920 | when the hormones or the proteins or peptides that are
01:08:51.360 | made by those glands are disrupted by these phthalates,
01:08:54.680 | it can actually disrupt those systems and mess them up. So
01:08:57.400 | while we're not consuming, generally speaking, enough
01:09:00.520 | phthalates to cause cancer, and therefore, we all say, hey, it's
01:09:04.400 | okay, these phthalates aren't that bad, we're not going to all
01:09:06.320 | die from cancer. The truth is, there is demonstrations now on
01:09:10.600 | how they can actually disrupt the activity of your endocrine
01:09:13.200 | system. And as a result, have all of these deleterious
01:09:15.840 | effects. Another study done on 125 men out of China, this paper
01:09:20.440 | was done out of China. They saw testicle cells that
01:09:26.160 | would die, testicle cells that would produce fewer sperm, and
01:09:30.160 | then testicle cells that produced sperm with extra
01:09:32.600 | nuclei, and they actually demonstrated this in rats. So
01:09:36.240 | the set of compounds can be fairly disruptive. So now we'll
01:09:40.200 | go to the next story. And the next story is the one that
01:09:42.600 | everyone's writing about, which is oh my god, there's plastic
01:09:44.400 | in balls. So a team at the University of New Mexico,
01:09:47.800 | published in the journal Toxicological Sciences just
01:09:51.240 | last week, took 47 neutered dogs' testicles from a local pet
01:09:55.240 | clinic where they were getting neutered. And they found on
01:09:57.800 | average 128 micrograms per gram of microplastics in those
01:10:03.000 | testicles. And it was mostly you know, polyvinyl chloride or one
01:10:06.800 | of the main plastics and polyethylene. And again, phthalates
01:10:10.920 | leach out of these plastics and leach into the cells. And then
01:10:14.320 | they went to the medical investigators office and they
01:10:16.720 | found these, the testicles that were frozen for seven years,
01:10:20.480 | because when they do a medical investigation, and they keep all
01:10:22.520 | the body parts, they keep them on ice, and then they throw them
01:10:24.760 | away after seven years. So before they throw them away,
01:10:26.520 | they got permission of humans, they got permission to use these
01:10:29.280 | testicles to figure out are there plastics and they measure
01:10:32.960 | in the frozen balls, and the frozen balls. These are ancient
01:10:36.600 | frozen balls. How old are these balls? 23 frozen balls about
01:10:39.880 | seven years old. Yeah, seven year old balls frozen. Are
01:10:43.000 | people donating their balls to science? Is that when there's
01:10:46.200 | like a homicide, or you don't know how someone dies, or
01:10:50.040 | there's an investigation into why someone died, the coroner, the
01:10:53.560 | coroner keeps the body parts in case it's needed for, like, a
01:10:57.680 | police case later.
01:10:58.640 | On the record, I don't want my balls used that way.
01:11:01.960 | You don't have the frozen balls on your driver's license like
01:11:06.080 | freeze my balls.
01:11:06.720 | They asked me to store my balls. I think you got such huge balls.
01:11:10.680 | Anyway, they got 23, 23 of these balls from these bodies,
01:11:14.840 | and they found plastics, on average 328 micrograms per gram
01:11:21.000 | of testicle in these balls.
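The two concentration figures cited can be compared directly with a little unit conversion: ~128 micrograms of microplastic per gram of tissue in the dog testicles vs ~328 in the human samples. The 20 g testis mass used for the total below is an illustrative assumption for scale only, not a number from the study.

```python
# Comparing the microplastic concentrations cited for dog vs human testicles.

DOG_UG_PER_G = 128.0
HUMAN_UG_PER_G = 328.0
ASSUMED_TESTIS_MASS_G = 20.0  # hypothetical tissue mass, for scale only

ratio = HUMAN_UG_PER_G / DOG_UG_PER_G                        # concentration ratio
human_total_mg = HUMAN_UG_PER_G * ASSUMED_TESTIS_MASS_G / 1000.0  # ug -> mg

print(f"human/dog ratio: {ratio:.2f}x")
print(f"~{human_total_mg:.2f} mg of plastic in an assumed 20 g testis")
```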
01:11:25.200 | What do you usually find? Freeberg? What do you usually
01:11:27.720 | find in these balls?
01:11:28.520 | Well, we don't we don't know because we've never taken human
01:11:31.000 | tissue and try to take it apart in a very detailed way to figure
01:11:33.680 | out like, how much plastic is there? You know, what is it
01:11:36.440 | doing to our body? But now I just want to connect the dots.
01:11:39.120 | So now we have a sense that there's these phthalates and
01:11:41.520 | these other compounds that come with plastics that leak in that
01:11:43.800 | cause all this disruption. Separately, we're seeing this
01:11:46.240 | accumulation of these little plastic particles. And remember,
01:11:49.720 | plastics are polymers, they're long chains of monomers. And so
01:11:53.820 | they can be short chains, they can be long, so they break
01:11:55.840 | apart, break apart, break apart, and little tiny bits of them end
01:11:58.440 | up and they're very hard to metabolize. And they sit in your
01:12:01.000 | tissue, and then they can cause all this disruption. So, you
01:12:05.040 | know, I think these are like, generally just going to say it.
01:12:07.800 | I'm just gonna say, first of all, I really appreciate that
01:12:09.800 | you did this. I think it's so important. We talked about
01:12:13.080 | microplastics a little bit ago, you know, J. Cal moved his whole
01:12:16.480 | family away, because a few years ago from plastics, I've started
01:12:20.000 | to do it four or five months ago, but you can't get away from
01:12:22.920 | it. No, and everywhere, you know, supply chain is your
01:12:26.120 | point. It's in the water, it's in the air, it's everywhere. I
01:12:29.400 | was gonna say, I think our food supply, I think we should just
01:12:32.240 | say it out loud, is totally corrupted. And I think there's
01:12:36.440 | all of these other factors we look at the rise in the use of
01:12:40.200 | SSRIs, the lack of sexual function in young men, the lack
01:12:45.680 | of sex, the low birth rate, I think these are all related. And
01:12:50.640 | part of it is the food supply. And part of the food supply
01:12:53.880 | problem is the fact that it is corrupted by these materials
01:12:57.520 | that should never be in our body.
01:12:58.920 | I want to just push back on this, because I think it's a
01:13:01.440 | guess. But I honestly think it's a truth. I don't want to limit
01:13:04.520 | it. I don't want to limit it to the food supply. Because here's
01:13:06.760 | the other thing. All of us are wearing clothes that use
01:13:09.560 | polymers, which are plastics. All of us are sitting at desks
01:13:13.280 | that have coatings of polymers on them. All of us have iPhones
01:13:16.120 | that use polymers, all of us drive cars and the rubber, one
01:13:21.360 | of the ways that they found that plastic, these microparticles
01:13:24.360 | are getting in the air is through tires. When we drive
01:13:27.360 | little particulates end up in the atmosphere, we breathe them
01:13:30.240 | in, and then they end up in our body. Every part of our
01:13:33.160 | industrial supply chain uses polymers, every part of our
01:13:36.520 | industrial why do you but the concentration, I'm going to guess
01:13:39.640 | that the concentration when you actually put it in your body,
01:13:42.760 | and then your intestines and your organs are bathed in this
01:13:47.520 | stuff. I'm going to guess that the food supply has a huge part
01:13:52.680 | to do with this. Yeah, but let's
01:13:54.760 | say you're wearing clothing. Almost all of our clothes now,
01:13:57.880 | many of our clothes, have polymers in them. We put it in
01:14:00.280 | the washing machine, it ends up in the water supply chain, we
01:14:02.480 | consume that water. It's very hard to say that there's a
01:14:05.880 | specific action. Our whole system has been inundated with
01:14:10.640 | these lower
01:14:11.240 | let me say it in a way that maybe you will agree with me, then.
01:14:14.000 | We need to fix something. My starting point would be the food
01:14:17.880 | supply. Yeah, where do you start? Yeah, my big my big
01:14:22.000 | takeaways is sort of like yours, which is almost impossible to
01:14:24.640 | alter this industry overnight, given how ubiquitous these
01:24:27.000 | compounds are in everything we do and touch: tires, phones,
01:14:30.480 | clothing, etc. But I think that this is going to trigger and is
01:14:34.280 | the beginning of a wave, I'm noticing that a lot of folks are
01:14:37.800 | going to start to pay attention in the food industry, and start
01:14:41.640 | to figure out ways to represent low plastic, low phthalate food
01:14:46.680 | products as a way to kind of sell a more premium solution. I
01:14:49.520 | think that's been the trend historically with the food
01:14:51.560 | industry, Chamath, is to respond to your ask right now, and to
01:14:55.400 | then show up with with solutions. So I do think that
01:14:58.360 | that's and just just taking a step back. We don't have to use
01:15:03.760 | these, these are all based on fossil fuels. So the way we make
01:15:06.560 | plastics is we basically pull oil out of the ground, and we
01:15:09.200 | turn it into these polymers. That's the basis of this
01:15:11.520 | chemical industry. We don't have to do that. With the same
01:15:15.600 | function, we can get the same function from what are called
01:15:17.760 | bioplastics. So these can these are compounds that can actually
01:15:20.800 | be much more biodegradable that are made with biological
01:15:23.800 | systems, and not made from oil using synthetic chemical
01:15:27.640 | systems. So I do think that there's a really big
01:15:31.400 | opportunity for a wave of bioplastic alternatives, given
01:15:35.400 | that this is now becoming a little bit more obvious to
01:15:38.680 | folks that there is this kind of systemic problem, that this is
01:15:41.960 | ubiquitous, and that we do need to kind of address it.
01:15:43.960 | Okay, so just zooming out here for a minute, talked about the
01:15:47.320 | phthalates. But what about the bofas? Freeberg? The study on
01:15:52.240 | the bofas?
01:15:53.720 | Let me play in what are the bofas, Jason?
01:15:56.000 | Like what is bofas?
01:16:13.760 | That's awesome.
01:16:20.440 | Why not just start laughing before you deliver the joke next
01:16:23.440 | time?
01:16:23.760 | Let's get serious here for a second.
01:16:29.600 | I have a real question. Part of the thing that I think is broken
01:16:32.480 | is that I think somewhere along the way we got screwed up in how
01:16:36.120 | food is labeled. Right. And we and then it was, it was gamed
01:16:40.160 | effectively, right. So for for years, even when I was growing
01:16:42.760 | up, I thought you should not buy food that had that was high in
01:16:45.800 | fat, as an example. And little did I know I was ingesting all
01:16:49.360 | these sugars as a substitute to fat, it was a total mistake.
01:16:52.520 | Does the labeling need to become simpler and focus on these
01:16:56.480 | things that are just fundamentally carcinogenic for
01:16:59.440 | us? One, two, are there like lawsuits that need to happen
01:17:03.280 | à la cigarettes where you kind of connect the dots between
01:17:07.600 | these phthalates and, and a bunch of these diseases? Because
01:17:11.560 | it just seems like I think are people have thrown their hands
01:17:14.120 | in the air for years, right? Some people say autism and diet
01:17:17.720 | are correlated, right? Other people. So they're, you know,
01:17:20.920 | Crohn's, the rise of Crohn's, there's so many of these
01:17:23.480 | conditions, that there is a cohort of people that attribute
01:17:28.040 | most of the reason what the pathology of the disease exists
01:17:32.360 | to food. So what do we do?
01:17:34.200 | There were pesticides, right? And that was in there, too.
01:17:36.400 | Is that your actual office behind you? Or is that a
01:17:38.760 | background?
01:17:39.160 | That's my office? Yeah. So like, every one of those books is
01:17:45.120 | coded in some of these compounds. Now, these are
01:17:48.360 | phthalate free, because these are from the 1400s. They didn't
01:17:51.960 | have that back then.
01:17:52.560 | Okay, good. All right. Well, everything else at the desk is
01:17:56.000 | made of all of those items. Yeah. Yeah. These are
01:17:59.360 | collector's items, bro. These are these are from an era where
01:18:01.640 | that's up. And you're, you're, you're pure, I'm assuming you're
01:18:04.360 | you're wearing a pure baby wool sweater, which doesn't have any
01:18:08.440 | polymers, but baby cashmere. Yeah. But I think what's,
01:18:11.920 | what's overwhelming about... I think, I think baby cashmere
01:18:15.920 | is biodegradable.
01:18:16.840 | What's overwhelming about this problem is the ubiquity of the
01:18:20.640 | problem. It's almost like asking, tell me everywhere that
01:18:24.200 | carbon is used. Like imagine if you had to label every
01:18:27.280 | No, I know. I'm trying to hone you into this one area that I
01:18:30.520 | actually I get the tires and the this and that. I'm trying to get
01:18:34.000 | to something that I fundamentally care about. I have
01:18:36.200 | young children, I feed them food every day. I don't trust my food
01:18:40.320 | supply. I've never really trusted it. And this kind of
01:18:43.880 | stuff adds to this body of evidence where I'm worried that
01:18:47.640 | if my kids go through some kind of an issue, at the core of it
01:18:51.720 | will actually be something dietary. And it's typically
01:18:54.760 | overlooked by modern medicine. Because you'll treat it
01:18:57.720 | symptomologically, you'll try to give it some kind of pill. It's
01:19:01.560 | not how you treat a lot of these things. It could turn out where
01:19:05.480 | I think restructuring someone's diet can actually have an
01:19:08.480 | enormous impact. So I'm just trying to figure out what is
01:19:10.960 | something that we can all start to do to get a handle on this
01:19:15.000 | because you're putting food in your body every day.
01:19:17.280 | Yeah, it seems like you're saying it's, it's helpless here,
01:19:21.600 | Freeberg, there's nothing we can do. And I think what
01:19:23.800 | Chamath and I are saying, asking you is like, where do we start?
01:19:26.440 | How can we start to get off of plastics,
01:19:29.200 | I think we got to go into the source. So biopolymers are made
01:19:32.880 | by living organisms. They're, they're typically longer chains
01:19:37.280 | of what are more like sugar molecules. And they can be used
01:19:41.560 | in a similar way that we're, they're not going to be as good
01:19:44.200 | as synthetic polymers that we use today. So a lot of our
01:19:47.480 | applications, a lot of our industry would have to be
01:19:49.360 | rebuilt, if we really wanted to go back to redesign the whole
01:19:52.520 | system. But we got to redesign the whole system tomorrow. We're
01:19:55.200 | making everything out of these products, because they're cheap.
01:19:57.760 | And because we can pull oil out of the ground and turn it into
01:20:00.240 | cheap stuff. And then it makes things affordable for everyone
01:20:02.960 | on earth. And that's how this industry emerged. You know, it
01:20:05.680 | was not like some, someone randomly came along and said,
01:20:08.480 | let's put plastics and everything, because it's going
01:20:10.400 | to be good for people. It was a way to make products more
01:20:13.080 | accessible and more available and cheaper, and it's
01:20:14.920 | everywhere. And so I think there's this real question of
01:20:17.880 | like, what industrial synthetic chemistries do we use today as a
01:20:22.840 | species that we should rethink using and start at that level
01:20:26.240 | and then rebuild from there. And I think shining a light on this
01:20:29.280 | stuff and just talking about what these products are, and I
01:20:31.320 | think I think that's, I think that's laudable, but
01:20:34.480 | too complicated. I want something simpler, which is like,
01:20:36.920 | can we get a law passed so that chickens cannot eat certain
01:20:40.920 | kinds of food that are known to be high in phthalates?
01:20:43.200 | Yeah. And here's an idea. Look at this, Chamath. Like, look at
01:20:45.880 | this banana. Like, I just, as a newsflash, a banana already has
01:20:49.960 | a wrapper called the peel. And then people are wrapping
01:20:53.080 | plastics on this stuff. Like I think consumers need to demand
01:20:56.720 | that, like, why did you take a picture of a banana? Why did you
01:20:59.520 | take a picture of a banana? I found that on the internet. I
01:21:01.680 | didn't actually take it myself. Okay. I thought you were like,
01:21:04.040 | sitting at the store.
01:21:04.920 | You know, when you see this kind of packaging, this is what has
01:21:09.040 | made me nuts in my life is all this crazy packaging going on.
01:21:13.040 | And in Europe, you are required at the supermarket. And people
01:21:16.960 | do this when they get to the end of the counter, they take the
01:21:19.200 | packaging off the supermarket has to take the packaging,
01:21:21.800 | right, so they have to bear the burden of it. So if you get a
01:21:24.400 | tube of toothpaste, you can take the packaging off and hand them
01:21:26.760 | the thing. Other places are now saying, hey, if you're coming
01:21:29.720 | for peanut butter, or grains or flour or sugar, they have a
01:21:32.960 | barrel of sugar, they put it in a brown bag. And you get this
01:21:37.120 | like more clean experience. I think we have to have like both
01:21:40.400 | ends of this the supply chain. There's also consumers. There's
01:21:43.240 | a there's like a marketing...
01:21:44.440 | I want my bananas to come with packaging on them. You do you
01:21:47.280 | don't want anyone else's fingerprints. I don't want
01:21:49.080 | anyone's fingerprints.
01:21:49.960 | Brother, you don't eat the peel.
01:21:54.440 | Yeah, but I could touch it.
01:21:56.320 | I haven't we haven't bought water bottles.
01:21:58.840 | I want everything to come in hermetically sealed plastic.
01:22:02.960 | Yeah. But then how many people in your house handle your
01:22:04.920 | banana? Pause?
01:22:06.080 | Whoa, whoa. How many people in his house handled his banana?
01:22:10.800 | No, did he know? But the real the real issue is not that it's
01:22:14.760 | like when the girls in your family have puberty younger and
01:22:18.560 | younger. And you're like, why is that happening? Or inconsistent
01:22:21.480 | periods? Or when the boys go through these weird, you know,
01:22:24.480 | moments where they're like, not really growing tick tock.
01:22:28.000 | No, I'm just telling you, like,
01:22:29.520 | wait, there's so many things that we have to panic about. It's
01:22:33.040 | hard, to your point. It's hard to attribute. Yeah. To one.
01:22:36.640 | Yeah. But I think I think that I think the thing that everybody
01:22:38.960 | could get or get focused on is how can you correct at least the
01:22:42.240 | marketing versus the reality in our food supply? You know, in a
01:22:45.280 | different example, I remember not telling me something which
01:22:47.960 | was along the lines of like hormone free is something that's
01:22:51.360 | marketed, but like chickens have been hormone free since like the
01:22:54.680 | 50s. But it's like, there's some latent hormones left inside of
01:22:58.720 | them. And then some of the feed is really poorly constructed. And
01:23:03.200 | you should be focused on like air chilled versus water chilled
01:23:06.880 | or whatever. There's just so much bullshit out there. And I
01:23:11.960 | so I think it's hard if you're like trying to take care of your
01:23:14.320 | family, to make sense of it all. Yeah, make sense of it all.
01:23:16.720 | It's just like, everyone, everyone feels helpless. And
01:23:19.280 | everyone wants to cry. I find it super frustrating, because it's
01:23:22.320 | something that I really care about my like what I eat. Right,
01:23:25.920 | totally. And it came from a place where a lot of disease in
01:23:29.560 | my family, and I was overweight when I was young. And so I just
01:23:33.080 | want to kind of like toe of No, I mean, you eat and it's
01:23:35.720 | impossible. You if you're eating vegetables, but my my takeaway
01:23:39.520 | is, there's going to be a lot of phthalates in my balls. Yes,
01:23:44.680 | absolutely. All the stuff that I do. I'm no better off than
01:23:48.760 | somebody eating at Wendy's in the end of the day. And I feel
01:23:51.480 | like, well, what is all that time and expense and difficulty?
01:23:55.040 | Is it's not worth it? Well, there's other health benefits
01:23:58.160 | to it, of course. And there's environmental benefits. But you
01:24:00.880 | know, we went all glass bottles, as I told you. And, you know,
01:24:04.920 | then I find out that some of the cans we have, because it's a
01:24:07.080 | couple of things we like certain natural sodas, they got plastic
01:24:10.880 | on the inside of the aluminum plastic on the inside. Exactly.
01:24:12.920 | I'm like, I thought I was doing the right thing here by going
01:24:14.480 | aluminum. Exactly. So the moral of the story is don't try to do
01:24:20.200 | the right thing. But anyway, I just want to I think this stuff
01:24:25.680 | is unavoidable. I really do. That's why I think I think I
01:24:30.080 | think you're right. And I think that's why you see all of these
01:24:33.960 | kinds of diseases, these chronic and acute conditions just
01:24:37.560 | ticking up tick, tick, tick, tick, tick, tick.
01:24:39.520 | It's well, anyway, I think this was a fascinating science
01:24:42.840 | corner. And I took a screenshot of Sacks during it. This is
01:24:48.360 | Sacks' interest level; you can always tell how good it is.
01:24:50.560 | I thought this was a science corner I could finally use, you
01:24:57.600 | know,
01:24:57.680 | absolutely. They actually checked Sacks' balls for the
01:25:02.040 | plastics and all they found was steel. So there it is. Yeah,
01:25:05.960 | it's just brass balls going around the horn here. What's
01:25:08.680 | your favorite balls in pop culture? For me? It's got to be
01:25:12.640 | Idiocracy. I love it. Have you guys seen Idiocracy, Mike Judge's
01:25:16.560 | film?
01:25:17.640 | I haven't seen it.
01:25:18.440 | Okay, so in the film, I'll just keep this up. Society has gone
01:25:22.720 | to the lowest possible IQ, everybody's got an 80 IQ. And
01:25:26.040 | like people, like a reality TV star is running the country into
01:25:29.880 | the ground. That's what happens in Idiocracy. And the number one
01:25:34.960 | television show is essentially a TikTok called "Ouch, My Balls."
01:25:39.880 | Here it is.
01:25:40.680 | It's basically the number one television show in this society,
01:26:03.320 | this dystopian society, where all the crops have died, and
01:26:09.160 | they don't know how to make crops anymore, is ouch, my
01:26:12.720 | balls. It's just a super cut of a guy getting kicked in the
01:26:15.720 | nuts. Sacks, what's your favorite ball moment in pop
01:26:18.320 | culture?
01:26:18.680 | Glengarry Glen Ross.
01:26:21.840 | All right, here it is, folks.
01:26:23.040 | That's Alec Baldwin, yeah.
01:26:26.280 | Freeberg, you got a favorite ball clip from pop culture for
01:26:28.960 | yourself? That tickles you? No. Chamath, you got one?
01:26:33.120 | I'm gonna find one.
01:26:34.760 | Yeah. Okay, everybody, this has been a spectacular episode of
01:26:44.640 | the world's number one podcast. It's episode 180 of the All In
01:26:47.560 | podcast. With you again, for the Sultan of Science, David
01:26:53.680 | Freeberg. David Sacks.
01:26:56.600 | There's a compilation on YouTube of all these Austin
01:27:02.800 | Powers moments of Austin Powers getting kicked in the balls.
01:27:05.320 | Yeah, bad news. And we'll see you all at the All In Summit in
01:27:08.640 | September. Bye bye.
01:27:11.680 | Love you, besties.
01:27:13.840 | Bye bye.
01:27:14.320 | All In. We'll let your winners ride. Rain Man, David Sacks.
01:27:20.120 | And they said we open sourced it to the fans and they've just gone
01:27:26.120 | crazy with it. Love you, besties.
01:27:27.840 | The queen of quinoa.
01:27:29.240 | I'm going all in. Let your winners ride. Let your winners
01:27:33.200 | ride. Besties are gone.
01:27:37.240 | That is my dog taking a notice in your driveway, Sacks.
01:27:41.240 | Wait a minute. Oh man.
01:27:44.520 | My avatars will meet me at the All In Summit.
01:27:46.480 | We should all just get a room and just have one big huge orgy
01:27:49.120 | because they're all just useless. It's like this like
01:27:50.960 | sexual tension that they just need to release somehow.
01:27:53.240 | Wet your beak. Wet your beak.
01:27:57.920 | We need to get merch.
01:28:00.840 | Besties are gone.
01:28:01.440 | I'm going all in.
01:28:03.000 | I'm going all in.
01:28:11.280 | And now the plugs the All In Summit is taking place in Los
01:28:15.320 | Angeles on September 8th through the 10th.
01:28:17.520 | You can apply for a ticket at summit.allinpodcast.co.
01:28:22.000 | Scholarships will be coming soon.
01:28:24.440 | You can actually see the video of this podcast on YouTube,
01:28:27.840 | youtube.com/@allin or just search All In Podcast and hit
01:28:32.840 | the alert bell and you'll get updates when we post and we're
01:28:36.480 | going to do a party in Vegas.
01:28:38.680 | My understanding is when we hit a million subscribers, so look for
01:28:41.560 | that as well.
01:28:42.320 | You can follow us on X at x.com/theallinpod. TikTok is
01:28:48.040 | all_in_talk, Instagram, the All In Pod.
01:28:51.400 | And on LinkedIn, just search for the All In Podcast.
01:28:54.720 | You can follow Chamath at x.com/chamath and you can sign up
01:28:58.480 | for his Substack at chamath.substack.com. I do.
01:29:01.520 | Freeberg can be followed at x.com/freeberg, and Ohalo is
01:29:05.000 | hiring. Click on the careers page at ohalogenetics.com. And you
01:29:09.520 | can follow Sacks at x.com/DavidSacks. Sacks recently spoke
01:29:13.240 | at the American Moment conference and people are going
01:29:16.000 | crazy for it.
01:29:16.640 | The talk is pinned to the top of his X profile.
01:29:18.600 | I'm Jason Calacanis.
01:29:20.120 | I am x.com/Jason and if you want to see pictures of my
01:29:23.640 | bulldogs and the food I'm eating, go to instagram.com/jason,
01:29:26.800 | in the first name club.
01:29:29.200 | You can listen to my other podcast, This Week in Startups,
01:29:32.160 | just search for it on YouTube or your favorite podcast player.
01:29:34.560 | We are hiring a researcher. Apply to be a researcher doing
01:29:38.520 | primary research and working with me and producer Nick,
01:29:41.040 | working in data and science and being able to do great research,
01:29:44.560 | finance, etc.
01:29:45.640 | Apply at allinpodcast.co/research.
01:29:48.320 | It's a full-time job working with us, the besties. And I'm really
01:29:52.040 | excited about my investment in Athena. Go to
01:29:55.360 | AthenaWow.com and get yourself a bit of a discount from
01:29:59.920 | your boy J Cal.
01:30:01.440 | AthenaWow.com. We'll see you all next time on the All In podcast.