
In conversation with Reid Hoffman & Robert F. Kennedy Jr.


Chapters

0:00 Bestie intros: Buttons are back for fall
1:48 Reid Hoffman joins the show, reminiscing on PayPal stories with Sacks
7:52 State of AI: Nvidia, cluster buildouts, competition
19:51 OpenAI's corporate structure and thoughts on Elon's lawsuit
29:09 Inflection AI's deal structure with Microsoft, Lina Khan's impact on the tech industry
41:27 Reid's perspective on Kamala being hot swapped for Biden, funding groups that attempted to keep RFK Jr. off ballots
52:02 Reid's thoughts on growing antisemitism
55:03 Thoughts on Kamala's economic proposals: price caps, wealth tax, etc.
64:19 How Silicon Valley views both candidates, why Reid funded legal action against Trump
79:03 Robert F. Kennedy Jr. joins the show and recaps his campaign and decision to back Trump
91:13 Falling out with the Democratic Party
97:26 Potential role in the Trump Administration, Make America Healthy Again agenda
118:01 Sacks recaps RFK Jr's campaign, RFK Jr. on Trump's legacy


00:00:00.000 | Welcome back to the All-In Podcast, the number one business,
00:00:03.740 | technology, and political podcast in the world. I am your host,
00:00:08.360 | J-Cal, Jason Calacanis. And with us today, three of my
00:00:12.540 | besties. You got David Freeberg cackling over there. He is your
00:00:17.060 | Sultan of Science, previously known as the Queen of Quinoa,
00:00:19.620 | but he sold the quinoa business, made a killing in quinoa. Also
00:00:23.520 | with us, back from Italy, back from Italy, Chamath Palihapitiya.
00:00:28.220 | He's at 67% buttoned. And he's not happy about it. But the hair
00:00:33.260 | looks great. You still got a little sea salt from the yacht.
00:00:37.300 | I think I'm going to try to keep my hair long. Let's see what
00:00:40.140 | happens. Did you bring any of the sea salt back with you from
00:00:43.260 | the Mediterranean? Put it in a little bottle to spray or no?
00:00:46.020 | No, but I do. Oh, okay, great. And have you showered in the
00:00:50.360 | last week? Or is it still you got the Mediterranean glove
00:00:52.660 | every day? I showered since I've gotten back. See, that's the
00:00:55.340 | problem. You don't have the sea to use as a
00:00:57.740 | natural, you know, disinfectant and deodorant. Exfoliant.
00:01:01.660 | Exfoliant also.
00:01:03.140 | Look how many buttons he's got going. I know, it's just tragic.
00:01:07.180 | I feel uncomfortable for your neck. I mean, it's like
00:01:09.940 | creeping all the way up. Your neck looks like a prisoner's.
00:01:13.340 | Rain Man David Sacks.
00:01:19.140 | I walked here but I had it totally unbuttoned and I thought
00:01:33.340 | this is completely inappropriate for Menlo Park in August. So I
00:01:36.820 | buttoned two buttons back in business mode. He's in business
00:01:39.420 | casual mode. He went from casual to business. Okay, and
00:01:43.180 | with us, of course, the Dark Knight himself. Yeah, the Rain
00:01:46.180 | Man David Sacks. And we have a bestie guest before you folks,
00:01:50.700 | friend of my other pod, This Week in Startups. Reid Hoffman is
00:01:53.780 | here. And you know him as a venture capitalist, board member
00:01:57.940 | at Microsoft. And you were the co-founder or the founder of
00:02:01.780 | LinkedIn. I don't know if you had a co-founder. Co-founder, co-
00:02:03.900 | founder of LinkedIn, now owned by Microsoft. He's got his own
00:02:07.780 | podcast, Masters of Scale. And he and David Sacks worked together at
00:02:14.260 | PayPal. Reid, welcome to the program, and give us a little
00:02:18.740 | story. What is your fondest memory, or the most quirky
00:02:22.860 | memory, of David Sacks and all those weirdos? I'm sorry, I'm
00:02:27.980 | not supposed to use the word weird anymore, I'll get banned on
00:02:29.900 | X. All of those unique personalities at PayPal. Tell us
00:02:33.860 | about that moment in time. And do you remember the first time
00:02:36.260 | you met David Sachs? Yeah, I met David, because Peter Dunham from
00:02:42.060 | Stanford and hired him in. And, you know, David, very quickly,
00:02:49.260 | because he, you know, has a strong learning curve as he
00:02:53.340 | plays these things kind of got the instinct of what the game we
00:02:57.300 | were playing with PayPal was. And it's part of the reason why
00:03:00.180 | I think, you know, each of the execs have had, you know, kind
00:03:04.060 | of key contributions to making, you know, kind of PayPal
00:03:09.020 | successful. And David's was this kind of like, maniacal focus
00:03:14.740 | on the kind of the cycle of how the product worked on eBay. And
00:03:20.740 | like, like, there was just a whole bunch of stuff I learned
00:03:23.380 | from him. It's part of how I track, you know, kind of, you
00:03:27.460 | know, people I respect, is what do I learn from them. And that
00:03:31.020 | was one of the things that I would say I learned from David
00:03:33.820 | at PayPal. That's nice.
00:03:34.980 | David, tell us your first memory of meeting Reid Hoffman. Do
00:03:40.220 | you remember where you were? Do you remember the conversation?
00:03:42.940 | Yeah, I think when we met through Peter, you know, and
00:03:45.580 | Reid, I think Reid was on the board of Confinity back
00:03:49.300 | then, and then joined full time.
00:03:50.580 | Were we like 28? 27? 29?
00:03:54.820 | No, I mean, let's see. This would have been 2000. I guess
00:03:59.940 | I would have been 27 when I first joined PayPal. 27, 28, I
00:04:03.940 | guess, something like that. Yeah, '99. So whatever that was, in any
00:04:08.620 | event, I mean, I'll just return the compliment, you know, PayPal
00:04:11.820 | had all these existential issues, where you had these
00:04:15.220 | larger entities trying to kill us visa, MasterCard, eBay, who
00:04:20.140 | else?
00:04:20.620 | Oh, the list goes on Citibank.
00:04:23.100 | Yeah. And Reid was kind of our emissary who kept all these
00:04:28.340 | dogs at bay and managed to, I guess, be friends with them, I
00:04:32.420 | guess, to some degree, even though they wanted to kill us.
00:04:34.740 | And Reid was kind of in charge of making sure that these
00:04:37.820 | existential issues didn't blow up on us. And they didn't. So we
00:04:40.820 | got pretty lucky there.
00:04:42.020 | It's the Will Rogers line: politics is the art of
00:04:45.300 | saying "nice doggy" while you hunt for a stick. Tell us, like, a
00:04:48.820 | moment, Reid, that is incredibly memorable to you from that PayPal
00:04:55.060 | era, you know, some existential moment or one or more difficult
00:04:59.140 | or funny moments, late night moments, that would be
00:05:01.980 | indicative of that era and whatever was in the water that
00:05:05.060 | drew all that talent to one place.
00:05:07.020 | Well, part of it is that, I mean, this is, you know, among
00:05:11.900 | the things I was learning from Peter, was that Peter and
00:05:16.420 | Max recruited with just a tremendous focus on, like, intense
00:05:24.940 | learning curves. So, you know, it's one of the things that
00:05:27.500 | Peter later is like, okay, I guess you have to interview for
00:05:29.820 | being on sports teams and so forth, because this teamwork
00:05:32.420 | thing does matter. But like high performers, and it was kind of
00:05:36.220 | like a, like, and that was part of the reason why there was such
00:05:39.140 | intense, you know, kind of innovation and capability. You
00:05:44.980 | know, probably the most stunning memory I had at PayPal is, we,
00:05:49.500 | you know, we're all young, we're all first-timers, we're kind
00:05:53.140 | of doing a startup that matters, you know, kind of making this
00:05:56.540 | stuff happen. And we do this merger with x.com. And, and, you
00:06:01.740 | know, like, pre the merger closing, you know, Elon is
00:06:06.620 | saying, oh, I've got the CEO, Bill Harris, he's the best ever, that's
00:06:09.340 | part of the reason why you give so much percentage of the
00:06:11.140 | company to x.com in the merger, you know, dah, dah, dah, dah. And
00:06:14.740 | then after the merger, literally, the first meeting I
00:06:19.180 | had with Elon is, Bill Harris is a complete disaster, we need to
00:06:21.340 | fire him right away. Before we get to the first board meeting,
00:06:25.820 | we need him fired. And I'm like, uh, Elon, you need to talk to
00:06:29.860 | Peter about this.
00:06:30.820 | Well, I mean, he is decisive. That's for sure. Yes. All right.
00:06:37.700 | Well, let's get into it. You know, we're gonna go a
00:06:39.500 | little bit mullet here, Friedberg, we're going to start
00:06:41.780 | with business. And then maybe... We had a fun meeting about that
00:06:45.860 | topic at a place in Palo Alto that no longer exists called
00:06:49.140 | Antonio's Nut House. Yes, exactly. Yes, the legendary
00:06:53.140 | Antonio's Nut House. And when Bill eventually did meet his demise
00:06:57.380 | at PayPal, it was called the Nut House Coup.
00:07:01.500 | He got whacked at the Nut House. With the pool tables in the
00:07:05.860 | back, and the bar in the front.
00:07:08.100 | Well, he wasn't whacked. He was whacked at a board
00:07:10.620 | meeting, not at the Nut House. But certain plans were formulated
00:07:15.220 | at the back of Antonio's nut house.
00:07:17.460 | Antonio's Nut House is, yeah, the most unhygienic bar in
00:07:24.420 | the Bay Area. And that's a pretty low benchmark.
00:07:27.700 | Let's just leave it at that. We will start with some business
00:07:30.420 | here, talk a little AI. And then since two of our panelists have
00:07:34.220 | a passion, we'll do the party, political parties, at the end.
00:07:38.340 | Everybody knows that Reid was a co-founder of Inflection AI and is
00:07:42.260 | a general partner at Greylock and one of the founding
00:07:46.500 | investors also in OpenAI. There's a good story there, I'm
00:07:49.780 | sure. And we just got results, Reid, from Nvidia. Results were
00:07:55.380 | good. They beat across the board. Stock was down after
00:07:59.060 | hours. Analysts said probably profit taking. Putting that
00:08:03.420 | aside, we've never seen a chart like this in the history of
00:08:07.900 | business, I would say. Data center revenue 26.3 billion, 87%
00:08:12.940 | of their revenue. Now, you remember Nvidia started
00:08:15.100 | obviously with, you know, video games, and didn't have a
00:08:18.820 | major data center business; that has exploded. Net income 16.6
00:08:23.740 | billion, gross margin 75%. And here's your chart. On a total
00:08:30.460 | basis, Nvidia's revenue scale-up is basically unlike anything
00:08:34.540 | we've seen. But if you look at their quarter-over-quarter
00:08:38.260 | revenue over the past couple years, things are starting to
00:08:40.900 | cool off significantly after that giant boom. Reid, what's your
00:08:45.300 | take on Nvidia's just incredible run here? Is it
00:08:49.660 | sustainable? Will they have competitors? And do you think
00:08:53.540 | this build out, this massive build that we're seeing from startups
00:08:56.660 | to sovereigns, you know, to Microsoft, which you're on the
00:08:59.740 | board of, Google, Apple, etc.? Is this sustainable? And is this
00:09:03.580 | going to keep going?
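A quick back-of-the-envelope check of the figures quoted above; a minimal illustrative sketch, assuming only the numbers as quoted, nothing beyond that:

```python
# Back-of-the-envelope check on the quoted Nvidia figures (illustrative only;
# the inputs are the numbers as quoted in the conversation).

data_center_rev = 26.3   # $B data center revenue, as quoted
dc_share = 0.87          # data center's share of total revenue, as quoted
net_income = 16.6        # $B net income, as quoted

total_rev = data_center_rev / dc_share  # implied total revenue
print(f"implied total revenue: ${total_rev:.1f}B")          # ~$30.2B
print(f"implied net margin: {net_income / total_rev:.0%}")  # ~55%
```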
00:09:04.300 | Well, I got asked that question, unsurprisingly, by many public
00:09:08.500 | market investors over the year. Yeah. And I basically told
00:09:11.700 | them, hey, look, it's sustainable for two years, which
00:09:13.860 | for you guys means forever.
00:09:15.180 | Yeah. Eight full quarters.
00:09:18.300 | Yes, exactly. So that's infinity, right? In terms of
00:09:21.700 | time. You know, Nvidia has a very sharp, you know, kind of
00:09:27.420 | lead on the importance of the chips for the training
00:09:31.140 | clusters. You know, they're, they're effective on inference.
00:09:35.060 | But I do think that as you kind of scale the demand, there'll be
00:09:38.780 | a lot of inference chips coming in. You know, I think Chamath
00:09:43.820 | you're invested in one of those. Oh, yeah. And I think there's
00:09:47.580 | going to be a bunch of those kind of coming in, and
00:09:50.740 | the bulk of the demand will be on the inference side. And then
00:09:54.420 | Nvidia will have this challenge of, do I try to keep my prices
00:09:58.300 | and my margin? Or do I, this is why we like competition, do I
00:10:02.820 | have to respond to the competitive market? And then
00:10:04.820 | that I think will play out, you know, start playing out
00:10:08.020 | probably in a year, two at the latest, and then kind of go. So
00:10:12.660 | I think it's not sustainable. The pure heat is not
00:10:16.260 | sustainable. But I think, you know, Nvidia has got a very
00:10:19.380 | strong position. And, you know, I definitely would
00:10:23.580 | recommend people not be short on Nvidia.
00:10:25.900 | Yeah, so there's growth left, competition is coming. And
00:10:30.820 | this is probably not the type of stock you would want to short at
00:10:34.060 | this moment in time. Freeberg, what are your thoughts on this
00:10:37.420 | build out, as well as the software build out that's
00:10:40.420 | occurring? And when do you think we're going to see some
00:10:43.860 | competition come into the space?
00:10:45.660 | I don't know if there's competition in the build out. I
00:10:47.460 | think we talked about this in the past. And I don't know if
00:10:49.180 | you guys saw these quotes this week, or recently on, we don't
00:10:53.340 | think about this build out in terms of ROI. Gavin Baker, in
00:10:58.100 | conversation on Invest Like the Best. Is that the name of the
00:11:01.860 | podcast? Yeah. I think he referenced some conversations he's been
00:11:05.260 | having with the leaders of these companies regarding the build
00:11:08.940 | out: it is so important, because ultimately, if you create this,
00:11:11.940 | quote, digital God, the, you know, return is how many
00:11:17.140 | trillions. So it doesn't matter how many 10s of billions you're
00:11:20.180 | spending each quarter right now, you have to get there, you have
00:11:22.660 | to make sure you don't miss the boat. I guess read a question
00:11:25.260 | for you. You're on the board of Microsoft still, right? Like,
00:11:27.620 | yes, indeed. Has Microsoft or Satya publicly talked about how
00:11:31.460 | they rationalize the investing principles associated with
00:11:36.060 | building out AI infrastructure in the cloud. Is it ROI based
00:11:40.180 | like, hey, in the next two years, we're gonna make this
00:11:41.820 | much additional incremental profit or, or we got to get this
00:11:46.100 | thing working. Right, right. To be more precise, is the
00:11:48.260 | investment driven by ROI? Or does everyone just say this is
00:11:50.740 | so strategic, we just have to win it. And we'll throw all the
00:11:53.540 | resources we have to at this.
00:11:55.100 | Well, so, one thing: board members speaking for Microsoft
00:12:00.020 | is, you know, forbidden. So I'm just speaking for me. Right.
00:12:04.700 | So, for all cloud computing platform companies, then. Yeah,
00:12:09.100 | what's your sense on how they're thinking? Just one of the
00:12:10.660 | principles in the Microsoft thing is the company speaks for
00:12:12.780 | itself, board members don't speak for it. But
00:12:16.620 | you know, I think Satya is like the best public market CEO of
00:12:20.340 | our generation. I think he is stunning in kind of blending
00:12:27.260 | common combination of strategic insight with also kind of being,
00:12:30.140 | you know, kind of return on capital, you know, sensible
00:12:33.100 | risk taking, etc. And so the actual thing between your guys
00:12:36.780 | questions in terms of because I can comment on how Satya things
00:12:39.380 | with this stuff is, he's both thinking about like, it's a
00:12:41.860 | platform change. And you have to be there for the platform
00:12:44.540 | change for productivity for cloud, etc. And, okay, let's
00:12:49.340 | rationalize the capital to when are we expecting revenue? How
00:12:52.420 | do we get revenue sooner to have that as a good, productive
00:12:55.620 | cycle? How do we, you know, be not trying to, you know, just
00:13:01.260 | spend like drunken sailors, which is easy to do, right. But
00:13:05.620 | but to be targeting, you know, kind of business outcomes. And
00:13:08.740 | it's part of the reason why, you know, they're like, like, you
00:13:12.140 | know, he's very focused on what are we doing with office? What
00:13:14.740 | are we doing with cloud? What are we doing with, you know, as
00:13:17.380 | opposed to like, you rarely hear him talking about AGI or never
00:13:22.860 | digital gods, because it's kind of the question of, I am, I am
00:13:27.060 | focused on this on a business sense. And I think that's,
00:13:29.140 | that's kind of the way he's doing it. But there is
00:13:30.620 | obviously a, you know, kind of a, it's hard to predict the
00:13:33.860 | future when it's novel and unknown and platforms. And it's
00:13:37.020 | part of the reason why you have all the hyperscalers. Now, you
00:13:40.540 | know, kind of fully engaged and intelligently engaged, because
00:13:43.300 | you say, well, even if it's just the new platform by which, you
00:13:47.460 | know, kind of software, everything with a
00:13:50.660 | computer unit in it, whether it's a phone or a speaker or a
00:13:54.940 | computer or anything else, anything with a
00:13:58.740 | kind of a CPU or a GPU gets more intelligent. Like, you
00:14:03.500 | can't miss out on that platform. And so that's, that's, I think
00:14:07.420 | the thing that's motivating everybody, but it's obviously,
00:14:09.940 | you know, how to do that smart is one of the things that, you
00:14:13.820 | know, everybody is, I'd say obsessing about every week.
00:14:17.380 | What do you think about the open source movement versus closed
00:14:21.740 | source, you were one of the original donators to open AI,
00:14:25.460 | you were originally on the board. And there's a couple of
00:14:28.220 | ways to go with this question. But I just want to start with,
00:14:30.420 | forget about the corporate structure over there, we'll get
00:14:33.020 | to that in a second. But I want to talk specifically about open
00:14:36.660 | source. Meta, obviously far behind OpenAI, far
00:14:40.820 | behind Google, Microsoft, so they went open source. When
00:14:44.220 | you're behind you go open source, I guess is the idea
00:14:46.220 | here. But they're making some big progress. Who do you think
00:14:50.300 | is going to win this ultimately: an open source provider of LLMs,
00:14:54.780 | or proprietary closed source like OpenAI is? And it's
00:15:00.220 | confounding to say OpenAI is closed, but, closed AI.
00:15:03.780 | Yeah, um, you look at from the very founding, open AI was
00:15:08.380 | never claiming it was gonna be open source was claiming it was
00:15:10.780 | going to be one safety open access, and not differential or
00:15:15.260 | controlling access for that. And I think that's they've stayed
00:15:18.820 | true to that principle, which is I think what the genesis of the
00:15:21.940 | word open is there. And, look, I think the key thing is going to
00:15:26.580 | be winners all over the place. I think there's gonna be winners
00:15:28.700 | in the open source side. And, you know, I don't, I don't know
00:15:32.580 | if llama is going to win from its open source thing as much as
00:15:35.020 | it's just saying, hey, we're training these models. So we're
00:15:36.860 | gonna, you know, put them out there because our closed system
00:15:40.260 | closed loop, you know, doesn't require selling for tokens and
00:15:43.500 | so forth. But there's also, you know, Mistral and other folks
00:15:46.300 | who are doing competent models. And then,
00:15:51.620 | you know, there'll be wins in different ways. So it's not
00:15:54.700 | like I think, like, for example, you know, I think there's going
00:15:57.460 | to be a bunch of different startups, they're going to win,
00:15:59.340 | whether it's coding agents, or, you know, kind of very specific
00:16:03.100 | applications within medical or other kinds of things. And I
00:16:06.500 | think they will, you know, generate big companies. And I
00:16:09.100 | think large companies, like, you know, the hyperscalers are
00:16:11.940 | gonna, are gonna succeed as well. Now, in the pure model
00:16:15.100 | competition, the question is, when do we start seeing an
00:16:20.460 | asymptote to scale? And my guess is, and, you know, kind of
00:16:25.380 | the GPT landmarks is each order of magnitude, my guess is the
00:16:29.740 | soonest will be GPT-6. And it may even be
00:16:34.260 | after that. And that's part of the bet that OpenAI
00:16:37.620 | and Anthropic and the hyperscalers are all making is
00:16:40.580 | that return to scale. And then that has a lot of
00:16:43.700 | downstream effects. Because even if you say we can train smaller
00:16:46.940 | models, to do effective things, part of what's going to be
00:16:50.180 | really instrumental for training those smaller models is the
00:16:52.700 | larger models. So, like, even if there's a bunch of smaller
00:16:56.020 | models that are specifically capturing other kinds of market
00:16:58.580 | opportunities, which is part of what I've been doing and
00:17:00.940 | investing in AI since, you know, 2014, 2015. You know,
00:17:06.860 | there's going to be a set of those things that are
00:17:10.580 | a whole bunch of startup opportunities. So I think that
00:17:12.500 | the A versus B is a good dramatic framing. But
00:17:17.740 | it's really on which specific opportunities because there's
00:17:20.340 | going to be wins and opportunities across them.
00:17:23.060 | Do you think, sorry, just real quick, do you think there's
00:17:25.180 | one LLM or one foundational model, Reid, that effectively does
00:17:32.060 | everything like a meta model that starts to take most of the
00:17:35.220 | market? Or do different versions of smaller models or
00:17:40.740 | small agents that kind of network together, end up being
00:17:44.260 | the best solution for specific applications and verticals?
00:17:47.340 | Like, how does this evolve over time? Like, everyone's got this
00:17:50.140 | concept that there's a God model that does everything and wins,
00:17:52.660 | and whoever gets the God model wins everything. But the reality
00:17:56.500 | of software and principles of biology would indicate that
00:18:00.420 | you'll see like smaller network things that are better at doing
00:18:03.340 | things than any one big thing. And so I hear your point of view
00:18:07.100 | on the philosophy of that. Yeah,
00:18:08.300 | I think the mistake that people make is they think precisely that it's
00:18:11.620 | like the one model to rule them all. It's like Sauron's ring.
00:18:15.380 | And actually, in fact, already today, like, for example, one of
00:18:20.740 | the things that happens with all the model providers at Microsoft
00:18:24.100 | and OpenAI, which I've seen, is they sometimes sub in, like, GPT-3.5
00:18:28.340 | as opposed to 4, to see what the answers are, because there's
00:18:30.900 | a cost of compute. And even as you bring the cost of
00:18:34.100 | compute for the larger models down, the larger models are
00:18:36.740 | always going to be a lot more expensive. And by the way,
00:18:38.340 | they're going to be more expensive, kind of probably
00:18:41.740 | loosely on the order of magnitude, right? So it's like,
00:18:44.300 | well, it's 10x larger, it's 10x more expensive. Totally. And
00:18:47.180 | so when you're trying to say, hey, I'm trying to make business
00:18:49.460 | models work,
00:18:50.180 | Like language translation, right? If I just want to do language
00:18:53.180 | translation, I don't need a massive model, I just need a
00:18:55.260 | model that's really good at language translation.
00:18:56.940 | Exactly. And so what I think you're going to see is
00:19:00.300 | networks of models, and like kind of traffic control and
00:19:04.140 | escalation, all the rest and agents are not going to be one
00:19:07.220 | model, they're gonna be blends of models. And that's one of the
00:19:09.820 | reasons why you say, well, there's actually, in fact, a lot
00:19:12.100 | of room for startups, because it's not like we say,
00:19:14.380 | well, we take GPT-7, and we just serve it for everything.
00:19:17.940 | It's like, well, it's gonna be super expensive. And there's a
00:19:19.820 | whole bunch of things about like serving it more cheaply. And
00:19:23.100 | like, for example, one of the really great technical
00:19:25.460 | papers that I love from Microsoft is, you know, Textbooks Are
00:19:27.820 | All You Need. It's like, you can train very specific models
00:19:32.180 | on kind of like high quality data, along with, by the way,
00:19:36.540 | the larger model helping train it that all of a sudden, you
00:19:38.580 | have a functional smaller model. And you know, the question will
00:19:42.460 | be a blend of these things. So I think the multi-model
00:19:46.460 | approach is, I think, going to be, you know, quickly
00:19:50.340 | universal.
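The "networks of models, traffic control and escalation" pattern described above can be made concrete with a minimal sketch: try the cheapest model first and escalate to a roughly 10x-more-expensive tier only when the answer looks weak. All model names, relative costs, and the confidence check here are hypothetical placeholders, not any vendor's actual API:

```python
# A minimal sketch of cost-aware model routing with escalation, per the
# discussion above. Model names, costs, and the confidence heuristic are
# all hypothetical placeholders.

from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Model:
    name: str
    cost: float                                   # illustrative relative cost per call
    generate: Callable[[str], Tuple[str, float]]  # returns (answer, confidence)

def route(prompt: str, tiers: List[Model], min_conf: float = 0.8) -> str:
    """Walk up the tiers, cheapest first, until an answer clears the confidence bar."""
    spent, answer = 0.0, ""
    for model in tiers:
        answer, conf = model.generate(prompt)
        spent += model.cost
        if conf >= min_conf:
            break                                 # good enough; skip the pricier tiers
    print(f"served by {model.name} at relative cost {spent:.1f}")
    return answer

# Toy stand-ins: a small specialist (e.g., translation) and a 10x-cost generalist.
small = Model("small-translator", 1.0,
              lambda p: ("bonjour", 0.9 if "translate" in p else 0.3))
large = Model("big-generalist", 10.0,
              lambda p: ("a long, careful answer", 0.95))

route("translate 'hello' to French", [small, large])  # stays on the cheap model
route("draft a merger analysis", [small, large])      # escalates to the big one
```

The same cost logic is why the Textbooks Are All You Need point matters: the big model at the top of the escalation ladder can also help train the cheap specialists that end up handling most of the traffic.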
00:19:51.020 | What is your take on IP in this new era? We see OpenAI, and
00:19:55.580 | you're not on the OpenAI board anymore, right? You're not, so
00:19:58.140 | you're independent of that, even though you made a big donation
00:20:00.980 | at some point.
00:20:01.540 | Donation and investment. I led the first commercial
00:20:05.900 | series. So you're an investor in it, and you donated to it.
00:20:08.820 | Actually, let's start there. What's up with that corporate
00:20:13.580 | structure? How do we make sense of that? Something's a
00:20:17.900 | nonprofit, you donated to it, and then you invested in it, and
00:20:21.180 | everybody's making money and selling in secondary at 100
00:20:23.540 | billion. How does that work in the world?
00:20:26.460 | So, so it's, it's a 501(c)(3) as the governing thing,
00:20:31.500 | is what it started as. And, you know, when, you know, kind of Elon and
00:20:35.580 | Sam were starting this and said, Look, we need, you know,
00:20:38.180 | philanthropic support. And we're trying to make sure that there's
00:20:41.700 | like open access to AI, which is going to be an instrumental
00:20:44.220 | technology. And we've got some great technologists who want to
00:20:46.540 | come do this. We started as a, as a, you know, kind of as a
00:20:50.580 | 501(c)(3) for doing it. That persists, as far as I
00:20:54.740 | know, till today. Then, you know, one of Sam Altman's, you
00:20:59.340 | know, pieces of genius was that he kind of said, Look, we're
00:21:01.980 | going to need scale capital. And I'm trying to go out to raise,
00:21:06.020 | and the commercial round was 600 million. I'm trying to raise 600
00:21:09.700 | million in philanthropy and it's not working. Right. So, so I have
00:21:12.940 | this idea, which is the 501(c)(3), which is doing this kind
00:21:15.900 | of research mission of, you know, AGI for humanity is also
00:21:19.700 | producing commercial benefits. And we can create initially an
00:21:23.340 | LP, which has a kind of a revenue right on the commercial
00:21:28.740 | things that investors can invest in. And, you know, you know,
00:21:33.580 | Reid, it'd be really helpful if you led this. Because, you know,
00:21:37.580 | and I was like, Oh, but you don't have a go to market plan.
00:21:39.780 | You don't have a product plan, you know, business plan. Yeah,
00:21:44.580 | but you know, like, we, we need to show that we're actually
00:21:47.220 | serious about the business. I said, Alright, fine, I will, I
00:21:49.500 | will lead it from my foundation. You know, because even though
00:21:53.420 | none of these things we like to see as investors were there, but
00:21:56.580 | I was like, look, okay, I'll lead the investment, I'll manage
00:21:59.340 | it as an investment, but I'll do it as an investment for my
00:22:01.540 | foundation, in order to do this. And, you
00:22:07.460 | know, that was kind of as we were beginning to get into it.
00:22:09.980 | You know, we hadn't seen anything, they were
00:22:13.580 | still doing Dota and robot hands and much of this. So it's
00:22:16.940 | like, we're betting on the scale thesis of generating something
00:22:20.380 | magical. And so we hadn't seen GPT-3 yet. And of course,
00:22:23.780 | once that started coming, then it was like, well, we need a
00:22:25.940 | bunch more capital, let's do a strategic, you know,
00:22:30.300 | connection, and let's talk to all the hyperscalers. And let's
00:22:36.460 | work out a deal by which one of them invests in us. And
00:22:36.460 | then, you know, the, the Microsoft OpenAI deal came
00:22:38.700 | together with, you know, converting the LP into a
00:22:42.780 | subsidiary of the nonprofit, you know, kind of saying, Look,
00:22:46.220 | there's all kinds of benefits that both OpenAI and Microsoft
00:22:49.540 | can get from a business deal. And so that's
00:22:53.300 | what's led to, you know, the structure, you know, that I was
00:22:57.500 | familiar with before I left the board.
00:22:58.900 | Do you, what did you think of Elon's first lawsuit, and then
00:23:03.660 | he dropped it, and then he refiled, where, where do you
00:23:06.580 | think he's coming from?
00:23:07.540 | Well, you know, I'm not very charitable about those
00:23:16.660 | lawsuits. You know, I would like to be, because, you know,
00:23:19.980 | Elon's one of the entrepreneurial heroes of our
00:23:22.780 | time and generation. But I think it's the, I think it's, you
00:23:26.940 | know, frankly, I think probably the most charitable thing to say
00:23:29.060 | is sour grapes. Because, you know, for example, I know, Sam
00:23:33.580 | offered him as much of the investment round as he wanted,
00:23:36.620 | right? Like, he could have done the whole thing, he could have
00:23:38.820 | led it, you know, it was kind of like, Hey, look, we still love
00:23:41.900 | you. And he was like, No, it's not a company that I control,
00:23:45.420 | it's gonna fail. So I'm not interested in investing. I was
00:23:48.420 | like, Okay, right. And so now you're getting these lawsuits
00:23:51.500 | that are like, you know, like, I was misled. And it was like, you
00:23:56.580 | were offered everything at every opportunity other than converting
00:23:59.500 | OpenAI into a company that you completely owned. And so,
00:24:02.860 | you know, I think it's without basis, without merit.
00:24:07.460 | But why do you think he would have dropped it and then
00:24:09.980 | refiled? Where do you think that comes from? Was it, is there
00:24:12.660 | new information, do you think?
00:24:14.100 | I think it was jurisdictional. Chamath?
00:24:16.100 | Oh, that makes sense. I got
00:24:18.860 | buried. I mean, Elon put in the first, what, $44 million, and he
00:24:22.380 | doesn't have any shares.
00:24:23.660 | Yeah. I, by the way, put in 10 million at the same time, and I
00:24:27.820 | don't have any shares from those 10 million.
00:24:29.220 | But do you think that he kind of got screwed because he doesn't
00:24:32.180 | have any shares? I mean, at the time he put in the 44 million,
00:24:34.900 | it was never going to be a for profit. Now it's a for profit. A
00:24:38.620 | lot of people are profiting, you know, assuming the paper market
00:24:42.060 | ends up being realized. So he doesn't own anything. I mean, if
00:24:47.020 | you were a seed investor, and put up 44 million and something,
00:24:50.260 | and then everyone's making money, and you don't have any
00:24:52.260 | shares, forget about the legal technicalities, wouldn't you
00:24:55.060 | have a feeling of being screwed?
00:24:56.460 | Well, look, I can understand the emotion of that. But like, it's
00:25:00.660 | not like Elon's short of money. Right. And so, look,
00:25:04.300 | I'd like to have shares; like, I did invest in the other thing. I
00:25:07.500 | didn't get any shares for the 10 million that I put in. And by
00:25:10.420 | the way, it's not just legal technicalities. It's actually
00:25:12.860 | really important that you're not doing private enrichment off
00:25:15.060 | philanthropic donations. And so, you know, it's... But isn't
00:25:19.420 | that what's happened?
00:25:20.180 | No, from a viewpoint of, they're held separate, right. And so,
00:25:29.260 | you know, the 501(c)(3) continues to control the,
00:25:33.940 | you know, kind of control the mission
00:25:38.420 | and destiny and so forth. And so the question about its mission
00:25:41.460 | is still guiding things. And you're essentially
00:25:43.740 | investing in that mission. And you recruited
00:25:46.900 | people, you know, on that mission. And so, you know, I
00:25:50.100 | think, you know, like I said,
00:25:54.540 | sour grapes.
00:25:55.380 | Okay, so let's get into some political stuff. First, I want
00:26:00.780 | to get to the IP question, then I want to talk about Lina Khan. So
00:26:04.900 | how do you think about IP in this, you know, briefly in this
00:26:07.740 | new world. OpenAI and the New York Times can't come to terms.
00:26:11.460 | The New York Times caught them red-handed, hand in the cookie jar, according to
00:26:14.780 | their lawsuit, having indexed a ton of their content. It's
00:26:17.980 | pretty crystal clear that their content's behind a paywall. And
00:26:20.700 | that's how they make money. I also subscribe to chat GPT, I
00:26:23.900 | give them 20 bucks a month, maybe 30 bucks a month for every
00:26:26.460 | employee in my firm. And I get New York Times content from
00:26:30.500 | there all the time. I will ask it, what does the Wirecutter think
00:26:33.540 | is the best choice, in ChatGPT, and then I get the
00:26:36.700 | answer. And I don't need my New York Times subscription, I don't
00:26:39.300 | visit the New York Times anymore. Feels pretty clear cut
00:26:43.220 | to me. But how do you think about IP? Should an LLM be able
00:26:48.140 | to ingest whatever they please? Or should they be required to
00:26:51.300 | get permission in advance and pay a royalty to content
00:26:55.100 | creators?
00:26:55.740 | Well, as a content creator too, look, I think that it
00:27:03.460 | tends to be a little bit of a, we do want content creators to
00:27:08.780 | benefit economically from their work. It's part of the reason
00:27:10.940 | why we have copyright, it's part of the reason why we have
00:27:12.860 | paywalls, you know, other kinds of things that I think are
00:27:16.220 | very important. And I think it's, it's a complicated thing
00:27:18.380 | that needs to be sorted out. Now, that being said, I think we
00:27:21.060 | also want to say that we can train these models, like, you
00:27:24.740 | know, like training is like reading, and like reading things
00:27:28.140 | is, you know, like when something's available to be read,
00:27:31.220 | and you've engaged in the right economic thing for reading it, I
00:27:33.620 | think that's a kind of a reasonable fair use thing. Now,
00:27:36.580 | maybe we update the terms of service, maybe we update, you
00:27:39.220 | know, copyright law or other things to say, well, okay, that
00:27:41.140 | now changes, you know, I think we don't want to forbid changes
00:27:44.740 | in the future. You know, this is one of the problems we get:
00:27:48.780 | it blocks innovation when we do that. Books block innovation,
00:27:51.620 | you know, kind of Hollywood blocks innovation, and in music it
00:27:54.780 | blocks innovation. So you want to allow some new, you
00:27:59.020 | know, changing landscape. And I think this is a changing
00:28:01.340 | landscape that arguably is reading. So I think that, that
00:28:04.380 | both of those things are true in terms of what do we, what do we
00:28:07.540 | want to sort through? I think that one of the reasons why this
00:28:10.980 | is kind of, like, you know, when I give advice to, you
00:28:14.820 | know, various news organizations, I say,
00:28:16.980 | look, don't try to hold out for money on the training side of
00:28:20.180 | things. Because, you know, we're going to create synthetic data,
00:28:23.500 | we're going to do all kinds of other things that are going to
00:28:25.220 | mean that no one's particular data is really going to matter
00:28:28.980 | what you should be is on freshness, on brand, on other
00:28:32.220 | things, and we should work out ongoing sustaining economic
00:28:35.580 | arrangements like that, that would be my two cents, you know,
00:28:38.780 | suggestion for it. And I do think we want to design an
00:28:41.140 | ecosystem that includes that. And, you know, when I was
00:28:43.860 | involved in those conversations at OpenAI, they, they agreed
00:28:47.180 | with that, Microsoft certainly agrees with that in terms of,
00:28:49.820 | you know, how do we make sure that economics are fairly
00:28:52.380 | apportioned, and so forth, for, you know, what we're doing for,
00:28:55.980 | you know, this phase of, you know, and ongoing, but like, you
00:28:59.620 | know, there's a current tech, new technological wave that's
00:29:02.380 | coming, and how do you do that? So, you know, that's a messy
00:29:05.100 | answer. But unfortunately, it's a messy subject.
00:29:06.860 | It's a pretty messy subject. Before we move to politics, I
00:29:09.980 | just wanted to actually ask you about Inflection. So is it still
00:29:14.260 | running? Yeah. And so what basically happened, there was
00:29:18.260 | like some transfer payment from Microsoft, and a couple of the
00:29:21.620 | people left. And then it seems like whatever that deal was
00:29:25.180 | seems to have been copied a little bit by Google when they did
00:29:27.820 | this Character.AI thing. So just trying to get a sense from you,
00:29:30.700 | are these deals to structurally avoid FTC scrutiny in terms of
00:29:34.660 | the building blocks of it? Or how, how did you think about it?
00:29:38.140 | And what is the, what is the pattern and the trend on these
00:29:41.180 | things?
00:29:41.700 | Well, the thing I think that's happened was, you know, very
00:29:45.860 | early days, you had things like, you know, we're doing an agent,
00:29:48.620 | and if Pi had launched before ChatGPT, it'd probably be in a
00:29:52.620 | different circumstance. But, like, ChatGPT got the "oh my God."
00:29:56.380 | Pi is the Inflection AI?
00:29:57.580 | Pi is the Inflection agent. Yeah. And so, so by the time
00:30:01.020 | that Pi... we got the trend right, and the interest of the
00:30:05.780 | market right, but we got the timing, you know, too late.
00:30:09.100 | Happens with startups. So it's like, okay, we need to pivot, we
00:30:11.620 | need to pivot from a B2C model to a B2B. And we have a unique
00:30:15.300 | model, but let's sell that to other people who already have
00:30:17.340 | audiences, because we're not going to be able to easily grow
00:30:19.580 | our audience. And then, you know, once we had that as a
00:30:21.900 | conversation, there were employees like, well, we want to
00:30:23.980 | do the direct agent thing. And that's what we want to do. And
00:30:26.180 | we will go somewhere in order to do that. And we're like, okay,
00:30:28.860 | how do we fund this? And how do we make that work? And how do
00:30:30.940 | we make it work for investors? And we said, hey, there's a deal
00:30:33.380 | structure that could work, which is, you know, with a, you know,
00:30:36.860 | kind of outside party, you can get paid enough in a non
00:30:40.180 | exclusive IP license and an ability to selectively hire
00:30:43.660 | folks. And then you can dividend some of that out to investors.
00:30:47.140 | So investors, you know, get back a, you know, kind of a
00:30:49.780 | 1x, and then kind of an ongoing position. So, you know, as
00:30:54.740 | investors, it's great to have a kind of optionality on its B2B
00:30:57.500 | business in order to play that out. And this is a structure
00:31:01.060 | that works for everybody in this pivot to B2B. And that's
00:31:04.380 | essentially the structure that we did.
00:31:06.860 | I see. Great pivot, Chamath, into Lina Khan. I think one of the
00:31:12.180 | things that is quite paradoxical about your relationship with
00:31:16.780 | David Sacks, is you both agree on something in politics, which
00:31:19.660 | is Lina Khan, and her concepts around future competition, and
00:31:25.100 | maybe how she's running this issue for the United States is
00:31:29.420 | leading to basically a freeze on the market, we're seeing weird
00:31:33.780 | deal structures, like some of the ones we're talking about
00:31:36.500 | here that could have just been acquisitions. And I'm curious
00:31:41.300 | your thoughts on, you know, this sort of breakup of
00:31:45.420 | Google. Now we're seeing that emerge at the same time that
00:31:50.300 | they're facing the biggest existential crisis of their
00:31:52.740 | career, which is language models competing with them. And then I
00:31:56.660 | would say half of my Google searches have already moved to,
00:32:00.380 | you know, ChatGPT-like services. So what's your
00:32:04.340 | take on Lina Khan's approach to M&A? And what impact, if it's
00:32:09.020 | continued and sustained? Will that have on capital allocation?
00:32:12.940 | Because I don't know what happened to the singles and
00:32:15.140 | doubles M&A market, but it seems to be completely gone.
00:32:19.340 | Everything from Adobe and Figma, to other mergers that could be
00:32:24.220 | happening are essentially frozen. So what's your take?
00:32:26.900 | So it was funny, because I kind of made an off the cuff, you
00:32:31.700 | know, kind of remark about Lina Khan, which turned into a whole news
00:32:34.220 | cycle.
00:32:34.740 | I saw you on CNN, where they were like, Are you telling
00:32:37.780 | Kamala and Biden what they have to do? And
00:32:40.780 | I'm like, no, because I don't believe in that kind of
00:32:43.180 | corruption of politics. The only way she's gonna learn about
00:32:46.580 | it is if she asks me or she watches this television show.
00:32:49.100 | And so, so she's done a good job on the price cartels.
00:32:54.820 | She did a good job on the non-competes, both of which I
00:32:56.580 | think are very good for, you know, competitive markets. The
00:32:59.660 | problem is, I think she has a misunderstanding of these large
00:33:02.980 | tech companies. And, for example, on the M&A thing, you
00:33:07.620 | know, her theory is, you got to prevent the aggregation of
00:33:10.220 | power. So you got to, you got to fight every acquisition of
00:33:13.260 | note. And the problem, of course, is that actually quells
00:33:16.660 | venture capital investment, because it's like, okay, part of
00:33:19.980 | the returns is, if I'm going to invest in something that might
00:33:24.220 | be competing with, you know, one or more of the large tech
00:33:27.940 | companies, I need to have acquisition exits as part of
00:33:32.300 | being able to fund enough capital to really make that
00:33:35.380 | acquisition, you know, that investment
00:33:37.700 | possible. Because if it doesn't work, I want to be able to at
00:33:39.620 | least recover my capital by an acquisition. So the right
00:33:43.100 | way to look at it is, is there competition amongst the top tech
00:33:47.060 | companies? Because, you know, if one of them is like squashing
00:33:49.980 | all the other ones, that's a problem. If we're, if we're five
00:33:53.500 | large tech companies heading to three, then I'm much more
00:33:55.660 | sympathetic to her point of view. But we're actually five
00:33:58.300 | heading to 10, right, or five to seven heading to 12. Because
00:34:01.860 | like NVIDIA is now in the mix, and others, I think, are coming
00:34:04.980 | in the mix.
00:34:05.180 | Tesla is, you know, now over 500 billion. Yeah.
00:34:07.660 | Yes. And so you have this ability. So the thing is,
00:34:10.900 | they're competing on the acquisitions, just like they're
00:34:13.620 | competing in the market, in the marketplace. And if you're
00:34:16.020 | trying to quell the whole thing, because your theory is, like,
00:34:20.060 | like, they should just, you know, the startup should just be
00:34:22.180 | able to grow up to compete. That actually means that those will
00:34:24.980 | never get the capital that they need in order to do that, which
00:34:27.700 | means you're actually having the opposite of your intent, right?
00:34:31.620 | What you're doing is you're actually making there to be less
00:34:34.340 | competition. Because, you know, capitalists can't say if I'm
00:34:37.740 | going to put 100 million, 500 million, a billion dollars into
00:34:41.220 | this company, I at least have a chance of getting my capital
00:34:43.940 | back, or I can possibly create a competitor. And that's, that's
00:34:47.580 | the reason I was speaking out against it as an expert.
00:34:49.980 | It was, like, interesting. I saw you, I think it was Jake
00:34:54.540 | Tapper, who kind of grilled you on it. I thought you
00:34:58.140 | did an exceptional job of just saying, Listen, I made a
00:35:00.180 | donation. This is how I feel about it. But obviously, she's
00:35:02.700 | going to do what she wants to do. And that's just how politics
00:35:06.620 | works. So I thought that was actually pretty well done. And I
00:35:09.260 | actually appreciate you fighting for more M&A, because it'd be
00:35:12.300 | great for the industry. Do you want to throw out a
00:35:16.580 | political topic here, or do you want me to?
00:35:17.940 | Just to stick on Lina Khan for a second. So I agree
00:35:22.100 | that her approach has been overly broad and has had a
00:35:26.020 | chilling effect on M&A. And so, J-Cal, like you said, we've lost
00:35:30.180 | those base-hit acquisitions that I think are important to the
00:35:33.940 | venture capital market that help new startups get funded. I mean,
00:35:37.020 | if the returns on risk capital go down, there's going to be
00:35:39.820 | less of it.
00:35:40.540 | Yeah, so I agree with Reid on that. The area where I'm not sure
00:35:46.300 | we agree, and where I do agree with Lina Khan, is I do
00:35:49.740 | think the big tech companies have too much power. I do think
00:35:52.860 | that they are monopolies or have monopolies. And I do think they
00:35:56.740 | need to be controlled. I just think that, you know, I
00:36:00.100 | wouldn't prevent them from doing any M&A whatsoever. I'm curious
00:36:03.060 | if Reid, like, agrees with that, that the big tech companies have
00:36:06.180 | too much power, or agrees with Lina Khan on that. And I guess
00:36:10.260 | specifically, do you think any big tech companies should be
00:36:12.260 | broken up? If so, which ones? I mean, I would actually
00:36:15.620 | entertain that idea of deconglomerating or breaking up
00:36:19.940 | some of these big tech companies. Do you think,
00:36:21.940 | do you agree that big tech has too much power or not?
00:36:26.620 | I think it's TBD. But I'd say, let's take the opposite
00:36:32.100 | point of view, which would say no. The thesis for no is that they are
00:36:38.020 | very strong American companies that get, you know, in most
00:36:42.180 | cases, over half their revenue from overseas. They create
00:36:46.900 | technology platforms that beneficially differentiate the
00:36:50.900 | US versus, you know, many other countries and kind of global
00:36:56.420 | circumstances, like the internet and other kinds of things. I
00:36:59.700 | think that they are competing ferociously with each other. I
00:37:02.100 | mean, you know, Jason just mentioned that, you know, it's
00:37:06.260 | kind of like, look, we've already got, like, ChatGPT
00:37:08.740 | competing with Google search, and other kinds of things. And I
00:37:11.140 | think it's competitive pressure, right? This is, I think what
00:37:13.860 | capitalism is about is competitive pressure, that
00:37:16.500 | essentially creates the thing. And that's the reason why, like,
00:37:18.740 | if we were shrinking, like it was Google über alles,
00:37:23.300 | or, actually, frankly, I think that, you know, everyone likes to
00:37:26.100 | talk about, you know, Google, but I think that the prime
00:37:29.540 | candidate is likely to be and I'm speaking as an individual,
00:37:32.340 | and as a venture capitalist here is Apple with the App Store.
00:37:35.460 | Right. Okay, so wait, so that brings up an interesting point.
00:37:38.500 | One of the things we've talked about in this pod is that we
00:37:40.660 | shouldn't shut down M&A, but the the FTC should limit
00:37:45.460 | anti-competitive tactics by these big tech companies.
00:37:48.900 | Apple, really good example, because they drive everything
00:37:53.300 | through the App Store, you're not allowed to do side loading,
00:37:55.780 | they want to take what is a 30% piece of any sales, you're not
00:38:01.060 | even allowed to have a link inside an application to drive
00:38:03.940 | people to your website. Yeah, in Europe now you can. So would
00:38:08.900 | you at least want to crack down on those anti-competitive
00:38:11.860 | tactics? Yeah, no, for sure. And look, especially when, you
00:38:15.620 | know, we all know it's nonsense. It's like, look, you could just
00:38:18.340 | give the consumers the option to, to, to allow side loading,
00:38:22.340 | you could just say, it's technically very simple to do.
00:38:25.220 | And you can say, look, we, we don't want you to side load,
00:38:29.460 | because we view it to be safety and security. But we're giving
00:38:33.380 | you the option. Right? Fine. Give people the option. Right?
00:38:38.580 | Reid, were you surprised that the first target where
00:38:41.940 | there was like some successful antitrust pushback was against
00:38:45.060 | Google versus Apple? And then second, do you think that
00:38:48.260 | there's a chance like a meaningful chance that the
00:38:52.100 | government tries to break Google up? Or do you think it
00:38:54.980 | looks something maybe more similar to what happened to
00:38:57.300 | Microsoft? So I think in mandating breakups, you know,
00:39:03.620 | like, look, I think we should operate through
00:39:07.780 | competitive networks and competitive ecosystems. I think
00:39:09.860 | it's part of what's smart about capitalism. And I think
00:39:12.180 | mandating breakups is only when essentially capitalism is
00:39:15.140 | failing on specific things, you want to do the least, the least
00:39:19.700 | you can, to get back to competitive networks, in terms
00:39:23.540 | of how you're operating. And so, you know, you say, hey, look,
00:39:26.820 | iOS has this kind of monopoly, and you say, there's no side
00:39:29.380 | loading, you have to use App Store, you have to use the
00:39:30.900 | payment mechanism, you know, etc, etc. It's like, well, that
00:39:33.620 | quells a ton of startup innovation. We all know this as
00:39:36.900 | investors, because we look at anyone who's prospectively doing
00:39:40.020 | a business like this and say, no chance, it's you know, you're
00:39:43.220 | not going to succeed. And so so then you say, well, what's the
00:39:46.500 | least thing that we can do? Right? And you know, a classic
00:39:50.180 | and you're like, well, let's break off the App Store from
00:39:52.740 | Apple as well. Right? Unclear that that would really fully
00:39:56.260 | work. You know, that's like socialism mandating how the
00:39:59.460 | thing should work. Let's try to get it so that we allow
00:40:01.700 | competition to determine these things. And like, for example,
00:40:05.620 | saying, hey, like, you have to allow consumers the option of
00:40:09.060 | side loading, you have to allow consumers the option of
00:40:11.140 | installing an alternative App Store, right? Like that kind of
00:40:14.740 | stuff. I think, you know, what's the minimal set? I think
00:40:16.900 | that's the kind of intervention we want to have. Because I think
00:40:19.540 | there's all kinds of benefits that come from
00:40:21.700 | But why do you think that the case against Apple has made
00:40:25.620 | less progress than the case against Google?
00:40:27.460 | I think it's kind of it's less politically easy, right? Like,
00:40:32.100 | it's kind of like, everybody loves our iOS phone. And, you
00:40:35.700 | know, there's less of a blue and red, you
00:40:39.940 | know, kind of combo tackle, where, you know, the blue
00:40:44.420 | team people are like, big companies... Less offensive,
00:40:48.340 | basically. Yes. Yes. More stylish. They're
00:40:51.380 | prettier. I kind of like your approach, though, with the App
00:40:54.820 | Store, if you were to think of least harm to the ecosystem,
00:40:58.980 | Epic Games has their own App Store for games, they charge
00:41:02.260 | 88%. They give, I'm sorry, they give developers 88%. They
00:41:06.020 | only take 12. And forcing Apple to allow, you know, a startup to
00:41:10.980 | do an App Store would solve the entire problem. And it seems
00:41:14.740 | like that's where it's going to go. And all five of us would
00:41:16.980 | invest instantly in an App Store that would say 0% take rate,
00:41:22.100 | and all advertising based. What a great idea that would be.
00:41:25.380 | Yeah, I have a question. A few weeks ago, you said something to
00:41:30.900 | the effect very publicly that you had had a one hour or
00:41:33.540 | multi hour lunch with Biden. And he just seemed like super on
00:41:37.620 | his game. And then he was kind of dumped. Was that just a
00:41:43.220 | moment in time where he was really great with you? Or how do
00:41:47.940 | you reconcile that with Pelosi and all of these other folks?
00:41:51.300 | And what happened to Biden? Well, like most of us, I was
00:41:56.980 | pretty dismayed by the debate performance. Because when I
00:42:01.940 | talked to him, like detailed, thoughtful analysis with no
00:42:05.300 | notes on Gaza questions about AI, you know, and what kinds of
00:42:09.540 | things, you know, what did I think about what the progress,
00:42:12.180 | you know, the thing they were doing with the voluntary
00:42:14.340 | commitments, the executive order, and, you know, what kinds
00:42:16.740 | of things should happen in the future, and all that kind of
00:42:18.980 | stuff. Being on the game, a little slower, right, than, you
00:42:22.420 | know, a 50 year old would be, but, you know, like, cogent and
00:42:26.660 | totally with it. And then you kind of looked at the debate.
00:42:31.140 | Oh, my gosh, this is a disaster. And so it
00:42:37.940 | was like, look, is the debate a one-off thing? Is it, did you,
00:42:40.420 | were you ill? You know, like, trying to reconcile the two and,
00:42:44.020 | you know, spend a little bit of time trying to figure that out
00:42:47.380 | as to what was going on, because it was the first time I'd
00:42:50.820 | seen something like that. And, you know, you know, I don't, you
00:42:56.100 | know, I'm not enough of a DC insider to know exactly what the
00:43:00.260 | set of conclusions were, other than, you know, I applauded, you
00:43:03.540 | know, Biden for having the kind of integrity to go, look,
00:43:06.340 | maybe I'm ill, maybe I'm old, maybe I'm slower, but, you know,
00:43:09.140 | it's about the country more than it's about me, because,
00:43:12.660 | you know, it's important to be about
00:43:16.100 | the country, not about yourself. I'll step aside. And
00:43:18.900 | ultimately, his decision. There's nothing that anyone can
00:43:21.460 | force; Pelosi couldn't force it, nor anyone else. It's ultimately his
00:43:24.020 | decision. He came to that decision. So do you want that?
00:43:27.700 | Do you think that they should have run an open primary after
00:43:30.660 | that? And would Kamala have won an open primary?
00:43:34.580 | Well, it's hard to know. I mean, I think they were definitely
00:43:37.940 | leaning towards an open primary, and then all the people who
00:43:40.180 | would be the most natural contenders all endorsed Kamala.
00:43:42.820 | So, and by the way, if you say, well, what kind of democratic process
00:43:46.660 | was it? Well, there was a democratic process that picked
00:43:48.580 | the Biden-Harris ticket, which turned into the Harris-Walz
00:43:50.660 | ticket. And so that's not anti-democratic. But I think if you
00:43:54.580 | look at the sequence of events, it was kind of like, well, you
00:43:57.540 | know, we're going to sort out, you know, what we're
00:44:00.020 | going to do. And then, you know, all of the key folks, you know,
00:44:03.780 | Shapiro and Whitmer and everyone else, all endorsed
00:44:07.140 | Kamala, and it was like, okay, let's just get back to, you
00:44:10.180 | know, kind of the choice of two candidates. And so, you
00:44:13.220 | know, do you think the voters
00:44:17.220 | felt left out? The Democratic voters? Well, I mean,
00:44:24.900 | post facto it seems not, right? With the level of
00:44:28.020 | energy and all the rest, um,
00:44:33.620 | you know, with the pure polling and the
00:44:38.420 | level of energy and kind of what's going on, they're happy
00:44:41.060 | with what they got. Yeah, they're happy with what they
00:44:42.580 | got. I would have liked to have seen that speed run. Do you think it
00:44:44.900 | sets a bad precedent that there were these back room
00:44:49.700 | conversations? Obviously, the staff of Whitmer or Shapiro,
00:44:56.100 | their offices speak with Democratic Party leadership,
00:44:58.500 | speak with big donors. And there was effectively a
00:45:01.140 | coalescing that took place over a period of time that said, we
00:45:04.580 | should all stand behind and endorse one person instead of
00:45:07.220 | infighting and creating a split in the party. And does that not
00:45:11.380 | set a bad precedent that there is a small group of people in
00:45:14.740 | either party that in a primary process effectively get to
00:45:18.660 | nominate their candidate, get their candidate to become the
00:45:21.620 | nominee. And therefore, there's only two people for the country
00:45:24.500 | to choose from. And as we have seen recently with RFK Jr. And
00:45:27.940 | the lawsuits against him in being on the ballot in different
00:45:31.220 | states, it makes it very difficult, maybe for the people
00:45:34.180 | to have their choice. And is that a bad way for democracy to
00:45:37.940 | work? And I just love your philosophical view on this. I'm
00:45:40.580 | like, what's the best way for democracy in the United States
00:45:42.820 | to work? So for the president, yes, we do
00:45:47.860 | live in a republic, right? And there's variance, like, you
00:45:51.780 | know, some people have much more influence than others,
00:45:54.020 | whether it's media platforms, whether it's, you know,
00:45:58.340 | economics and the ability to spend, whether it's, you know, history
00:46:01.940 | and a brand and other things. And so, you know, there's this
00:46:06.100 | melee and kind of, you know, whole integrated set
00:46:10.180 | of things. Now, ultimately, you know, voters are going to
00:46:13.060 | decide in November, right? So, you know, people do have a say, you
00:46:16.500 | know, and I think that staying true to our democratic
00:46:19.780 | process is what's really key, like, you know, people going to
00:46:23.460 | the polls. You know, I think we should want to live in a
00:46:25.860 | country where everyone who is
00:46:28.820 | legally allowed to vote does vote. And I think that's,
00:46:32.740 | you know, ultimately a good thing. Now, you know, are there
00:46:35.220 | things that I would like to change? Sure.
00:46:38.180 | I'd like to have ranked-choice voting. You know, I'd like to
00:46:40.660 | have open primaries. There's a set of things. Like,
00:46:43.540 | actually, my principal frustration in all this stuff
00:46:46.100 | is, you know, what's one of the fundamental things that
00:46:48.740 | the two parties agree on that shouldn't be? It's
00:46:52.900 | that there should be only two parties, right? And I think
00:46:56.020 | that's something you need to fix. And you
00:46:59.380 | can't fix it. Unfortunately, I think with independent
00:47:01.700 | candidates, because because the whole system is really set up
00:47:04.580 | for, you know, kind of two parties and independent
00:47:07.620 | candidates are almost always spoilers one way or the other.
00:47:10.740 | I mean, like on the RFK stuff, I understand it was a bunch of
00:47:13.300 | Democrats who were trying to, you know, prevent him from
00:47:16.740 | getting on the ballot. I actually prefer him on the
00:47:18.260 | ballot because I actually think his anti-vax stance,
00:47:22.180 | well, you know, really fit very well with Trump. And so I
00:47:25.860 | think he was taking more from Trump. Just to address that, Reid, because I
00:47:28.500 | think there was a rumor that you were funding some of these
00:47:31.860 | lawsuits to keep him off the ballot or whatever. Like, have
00:47:34.340 | you spent any money to try to impact RFK one way or the other?
00:47:38.020 | I wouldn't be surprised if, when we look at all the money that goes
00:47:42.500 | to all the different organizations, some organization X
00:47:44.820 | kind of had some kind of ballot thing. My voice, my
00:47:49.700 | instruction, was always like, no, no, no, don't do that. That's
00:47:52.660 | anti-democratic. But you know, you can't control everything,
00:47:55.860 | just like when you invest in a company and the CEO is sometimes a
00:47:58.740 | dumbass that you can't do anything about.
00:48:00.980 | Because you give money to folks that then
00:48:04.180 | execute their own strategy. So you can't control on the ground
00:48:07.940 | tactics, right? Yeah.
00:48:09.140 | So that happens, and you're like, no, don't do that.
00:48:12.180 | Well, okay, that's a good segue. Let's talk
00:48:16.660 | about the five cases against Trump. There are five lawsuits.
00:48:20.420 | No, hold on, Jake. Can we just stay on this topic for a second?
00:48:23.540 | I think this is important. Okay, so in in Michigan and
00:48:26.820 | Wisconsin, you had democratic groups, they fought RFK juniors
00:48:32.340 | bid to get on the ballot. Okay, they failed. Now he wants to get
00:48:36.180 | off the ballot, but they won't take him off, now that they
00:48:39.140 | think that his presence hurts Trump. And at the same time,
00:48:43.300 | Michigan's trying to remove Cornel West, and Wisconsin's
00:48:46.900 | trying to remove Jill Stein. So I'm curious, do you think
00:48:49.700 | there's any principle on display here besides naked partisan
00:48:53.940 | hackery? I mean, basically, the Democrats fought having third
00:48:59.300 | parties on the ballot when they thought it would hurt Biden.
00:49:02.820 | And now they want to keep them on the ballot when they think
00:49:06.420 | it's going to hurt Trump, except for those third party
00:49:09.300 | candidates who they still think will hurt Harris. So what is
00:49:12.900 | there any principle here? Or is this just partisan hackery?
00:49:15.860 | I, frankly, you know, think that
00:49:19.780 | everyone who follows the legal process to get on the ballot
00:49:22.260 | should be on the ballot. And, you know, we should follow the
00:49:24.420 | legal process. I'm very much of a legal process kind of person.
00:49:28.580 | What I'm opposed to is like, you know, calling Raffensperger
00:49:31.060 | and asking for 11,000 votes, right, which is not legal.
00:49:35.060 | Right. So like, yeah, sure. Is that bad? And
00:49:39.140 | do I advocate against that? The answer is absolutely yes. But
00:49:42.980 | it's, you know, follow the legal process.
00:49:44.660 | But if the Secretary of State of Colorado throws Trump off the
00:49:47.860 | ballot, for example, is that legal process if it's then
00:49:50.980 | overruled by the Supreme Court? Or can we just say,
00:49:53.300 | substantively, that states shouldn't be removing candidates
00:49:56.660 | from the ballot? That's anti democratic.
00:49:58.340 | Well, but you want them to remove RFK from the ballot?
00:50:02.500 | No, that's not what I said.
00:50:04.740 | Oh, okay. I'm just trying to figure out.
00:50:06.980 | No, the rule is that, well, first of all, I don't think that
00:50:10.740 | Democratic groups should be suing RFK to keep him off the
00:50:14.340 | ballot. And what he said is that Democratic groups were
00:50:17.380 | suing to keep him off the ballot. And they were trying to
00:50:19.380 | exhaust his resources. So he couldn't mount an effective
00:50:22.180 | campaign. And some of those groups you funded, right? So
00:50:25.460 | maybe you don't know what they were doing. But in any event, I
00:50:28.340 | consider that to be anti democratic. RFK is now trying
00:50:30.980 | to remove his name from the ballot. I think as a candidate,
00:50:33.060 | you're allowed to do that. And those same groups that once
00:50:36.660 | fought to keep him off the ballot are trying to keep his
00:50:39.700 | name on the ballot. Because now they perceive, yeah, because
00:50:42.660 | now they perceive the political calculation to be a little
00:50:45.060 | different. So I don't see any of this as being democratic. This
00:50:49.300 | to me is just partisan hackery, isn't it?
00:50:51.460 | Yes, fundamentally. From the viewpoint of, for example,
00:50:57.860 | my direct actions: yes, you fund a whole
00:51:00.980 | bunch of different groups, and you have different groups doing
00:51:03.380 | different things, but you funded them to do this thing that you
00:51:05.460 | were thinking of, and things happen, just like companies. You
00:51:08.980 | know, my thing was actually, in fact, making people aware of
00:51:14.100 | RFK's anti-vax, you know, statements, his anti-science stuff,
00:51:18.100 | because I thought that would be relevant in the polls in
00:51:20.820 | November. That was the actual strategy that I
00:51:24.340 | believed in, and I think that would differentially, you know, hit
00:51:28.260 | Trump more, and so therefore he would be a spoiler, right?
00:51:32.180 | I have no problem with drawing attention to
00:51:35.220 | issues. But I do think fundamentally, it's anti
00:51:37.460 | democratic to sue third party candidates to the point where
00:51:40.660 | they can't be on the ballot. Okay, let me ask you directly,
00:51:43.940 | Cornel West. There's an effort right now to remove Cornel West
00:51:48.820 | from the ballot in Michigan. Do you support that? Or would you
00:51:51.940 | oppose that? I mean, is that democracy?
00:51:54.020 | By default, I would oppose it. I don't know any of the details.
00:51:58.260 | Okay, fair enough.
00:51:58.980 | Reid, I have a question for you. It's more of a
00:52:03.140 | statement, actually; I'd just love to get your reaction.
00:52:05.300 | One of the most divisive issues that we have right now is
00:52:10.500 | people's position on October 7, Israel, Palestine. There is a
00:52:17.700 | sense that there's a growing kind of like virulent strain of
00:52:22.180 | antisemitism in America. A lot of people point to the extreme
00:52:25.620 | left as where that's really gestating. There was thoughts
00:52:30.100 | that Josh Shapiro would have been an exceptional candidate.
00:52:32.980 | But one of the large reasons why he was not really meaningfully
00:52:36.980 | considered was his religion. I just want you to comment on the
00:52:41.140 | broad issue. And whether you see it in the Democratic Party,
00:52:44.180 | whether you see it in the Republican Party, whether you
00:52:46.100 | see it at all, just give us a sense of where we stand
00:52:49.220 | culturally on this issue.
00:52:50.340 | Well, so, like, I know Josh Shapiro. I think he's great.
00:52:54.580 | You know, I've broken bread with him. And, you know,
00:52:59.540 | he was meaningfully considered. You know, I think, you
00:53:04.900 | know, we should be so lucky that he would, you know,
00:53:07.620 | run for the presidency someday, some year. You know, I actually
00:53:12.500 | didn't know Walz at all. And, you know, I was
00:53:16.740 | initially kind of surprised because, you know, I was like,
00:53:19.700 | oh, I thought it was probably gonna be Shapiro. And I was
00:53:21.940 | like, well, you know, I think it was probably a
00:53:24.020 | close call down to those two. And it looks like, you
00:53:27.060 | know, in making the decision, I think, you know,
00:53:30.660 | Harris, you know, made a good decision with Walz. So, you
00:53:34.500 | know, I think it's, you know... now, on the
00:53:38.020 | antisemitism topic, I do worry that, you know, broadly, we're
00:53:42.980 | seeing, you know, kind of more of a rise of antisemitism. And
00:53:48.900 | that's extremely important to fight. You know, because I
00:53:52.980 | think there are people, you know,
00:53:56.180 | it's weird, like, there's some lefties
00:53:59.620 | doing it, and there's some righties doing it. It's both a
00:54:01.940 | blue and a red issue, in different shapes. And
00:54:05.300 | I think it's very important that we, you know, stand
00:54:08.660 | against that as a country. And so, you know, I've been, you
00:54:14.100 | know, kind of mostly just trying to say, hey, look, we
00:54:16.340 | gotta be anti-racism, anti-antisemitism, and also
00:54:20.580 | anti-genocide. And we got to figure that out.
00:54:22.820 | What do you think of Kamala's handling of that issue in her
00:54:26.100 | speech? She basically seemed to, I don't know, both-sides
00:54:30.420 | it, but she said, hey, you can believe that the people of
00:54:33.700 | Gaza should be treated more humanely. And that, you know,
00:54:36.900 | Israel has a right to defend herself. What do you think of
00:54:39.300 | her handling that?
00:54:39.940 | I think that's rational, right? Like, you should be anti-
00:54:42.340 | genocide, both of Palestinians and of Jews, right? And
00:54:47.620 | like, it's obviously a very, very thorny topic. Yes.
00:54:51.540 | Right. So I think, you know, saying that I'm going to
00:54:54.820 | try to protect civilians on both sides, anti genocide, I
00:54:57.780 | think that's a human, caring place to be looking out for
00:55:02.420 | people.
00:55:02.900 | Reed, do you think that, generally speaking, Marxist
00:55:06.260 | socialist principles are taking a firmer hold on the Democratic
00:55:11.220 | Party, and kind of those principles are starting to
00:55:14.900 | showcase, not just in the cultural phenomena that
00:55:18.180 | Chamath is referencing, but also in some of the policymaking
00:55:20.740 | that's going on, and concepts of equity, rooted in concepts
00:55:27.780 | of social justice, ultimately rooted in Marxist principles
00:55:31.300 | emerging from the Industrial Revolution.
00:55:33.140 | What about price fixing?
00:55:34.340 | So as an example, the price gouging, you know, price caps
00:55:37.380 | on food proposal, the concept of a wealth tax, not necessarily
00:55:42.580 | the unrealized capital gains tax, but separately a tax on
00:55:45.140 | wealth, all of these concepts of the degradation of power
00:55:48.820 | structure through policy. And in part, some have argued that
00:55:54.420 | the antisemitism arises from these principles and that the
00:55:57.380 | Jews are considered a privileged and powerful cultural
00:56:00.900 | class. Is that not being observed? Do
00:56:04.500 | you not think that there's some tendencies that are emerging
00:56:06.580 | in the Democratic Party and may be influenced by a louder far
00:56:10.020 | left and that far left is becoming more loud and better
00:56:13.140 | represented in the party?
00:56:14.180 | Look, I think we should speak out against both the far left
00:56:18.500 | and the far right. I think it's important to do both. And so,
00:56:21.940 | you know, since, you know, I'm playing the Democrat in
00:56:27.220 | this conversation, I'll ask you guys, or especially
00:56:31.940 | Sacks, to play the Republican and speak out against the far right
00:56:36.020 | too. But the short answer is yes. There are, amongst the
00:56:41.060 | extreme left, and that's not everybody in the Democratic
00:56:43.060 | Party, but the extreme left, some, like, you know,
00:56:47.460 | misunderstandings about, you know, why it's important to
00:56:50.340 | defend, you know, kind of an anti-genocide position. Like, "from the river
00:56:53.460 | to the sea," it's like, yeah, that's a genocidal statement.
00:56:55.940 | Don't use that one. Right? You understand what language
00:56:58.820 | you're using. And to be like, look, you know, we've had
00:57:03.140 | a great genocidal moment with, you know, World War Two, and
00:57:06.180 | we're still trying to recover from it to, you know, questions
00:57:09.540 | around, like, what I think is a foolish wealth tax, even
00:57:14.260 | though it's, by the way, narrowed to like 80%. And then
00:57:16.660 | on, like, the price gouging stuff, you know, one of the
00:57:20.900 | things is, as I started scratching at it, you know, it was
00:57:22.900 | interesting. I think this week Kroger said, yes, we did
00:57:25.380 | actually artificially raise prices to profit from the
00:57:29.140 | pandemic. And, you know, yeah, you should stop price
00:57:31.780 | gouging. It's not quite the same thing as price capping. And
00:57:34.100 | apparently there's laws like that in effect even in Florida, right,
00:57:37.700 | or in Texas, where some of, you know, you guys are living. So,
00:57:42.420 | like, it's kind of, you know, okay, I need to
00:57:45.860 | understand this issue in more depth, but I don't think it's
00:57:47.940 | as simplistic as the political headlines are making it.
00:57:51.060 | - Well, but the reason why Kamala Harris proposed the
00:57:54.500 | price fixing proposal, price gouging, whatever you want to
00:57:56.740 | call it, was in response to inflation. In other words,
00:57:59.700 | we've had 20% erosion in purchasing power over the last
00:58:02.900 | four years, Harris needs a response to that. So she came
00:58:05.940 | forward with this new economic proposal. So it's in that
00:58:08.980 | context, this came up, and this wasn't some proposal by the far
00:58:11.940 | left of the party, unless you consider Kamala Harris to be
00:58:15.380 | far left, I actually do. But okay, fair enough. But my point
00:58:18.580 | is just, this is her proposal. And it's in response to
00:58:22.180 | inflation. I mean, you don't, you understand what causes
00:58:25.540 | inflation, right? It's like the government printing too much
00:58:27.540 | money. It's not, it's not greedy corporations raising their
00:58:30.580 | prices too much. I mean, do you agree with that?
00:58:32.500 | - Look, I agree that you have to have good monetary policy. And
00:58:35.860 | so I think we probably agree on that. And I think some printing
00:58:39.220 | of money is part of the normal functioning economy, but too
00:58:41.780 | much is bad. And look,
00:58:46.820 | part of the reason why we just talked about antitrust stuff
00:58:48.740 | earlier is you do have to look at places where there's a
00:58:53.220 | possibility of kind of commanding stuff from your
00:58:55.860 | privileged position. And like, you know, I want to
00:58:59.620 | - We all agree, we all agree that monopolies have to be
00:59:01.620 | controlled. No, no, no debate there. But that's not what's
00:59:04.500 | caused the inflation, right? Because we've had inflation of
00:59:06.500 | commodities, not just monopoly products, but commodities, like
00:59:09.860 | just food staples, eggs, you know, chicken, stuff like that.
00:59:14.180 | - Driven by fuel and labor and all the other inflationary, you
00:59:18.420 | know, underpinnings of those markets. And I think we tried to
00:59:22.260 | highlight that. I don't know if you saw Elizabeth Warren's
00:59:23.940 | interview on CNBC where she got taken apart because she made
00:59:26.820 | some claims about profiteering by Kraft Heinz and the CNBC
00:59:30.420 | anchors pointed out she was actually incorrect. Kraft Heinz
00:59:33.220 | has seen a reduction in profit over this period of time. And so
00:59:36.980 | like there were factual inaccuracies in these belief
00:59:39.220 | systems. But, you know, for me, it feels a lot like the
00:59:42.180 | government setting prices in free markets is one of those
00:59:45.700 | steps towards socialist principles that worry me the
00:59:48.500 | most.
00:59:48.740 | - Yeah. And look, generally speaking, as I was saying
00:59:52.260 | earlier, I'm like, make sure the network sorts it
00:59:56.260 | out versus, you know, centralized control.
00:59:58.900 | - Totally, totally.
00:59:59.780 | - So it's kind of like, you have to look at, is there a
01:00:02.660 | place where you're going, okay? That's the reason I
01:00:04.900 | focused on her words, which were price gouging. And if you're focused
01:00:08.020 | on the kind of gouging side of it, it's like, oh, there might be a
01:00:10.580 | market inefficiency that you're essentially correcting, then
01:00:14.180 | that's, I think the same kind of thing we were talking about
01:00:16.500 | with like the FTC and the Apple app store and so forth. If
01:00:19.940 | it's like the, I'm just going to set a fixed price on eggs,
01:00:23.220 | right? That's a bad idea. And by the way, there are bad
01:00:26.420 | ideas, like the wealth tax thing, that I disagree
01:00:29.300 | with. Her economic thing also had housing, which I think
01:00:32.100 | is a, you know, good kind of thing to
01:00:34.740 | lower costs for Americans and, you know, kind of make that kind
01:00:38.820 | of stability work. Like, I think she's been good on
01:00:42.660 | immigration. I think the Lankford-Sinema bill,
01:00:46.260 | which was from, you know, the Republican side, was
01:00:49.940 | something they were fully prepared to endorse. And, you
01:00:52.980 | know, Trump killed it because he wanted to campaign on it.
01:00:55.060 | It's like, look, we care about the actual running of the
01:00:57.300 | country. And so, look, I think there's a bunch of good
01:01:00.260 | things, but if you said, do I defend price capping? The
01:01:02.900 | answer is: not as an independent principle by itself. And by
01:01:06.500 | the way, are there lefties? Like, you know, a lot
01:01:09.220 | of what Elizabeth Warren says about capitalism, I disagree
01:01:11.860 | with, right. I mean, I could disagree with you on the
01:01:14.020 | border. I think, you know, Kamala Harris used to be
01:01:16.580 | considered the border czar; that's gotten scrubbed. I don't
01:01:18.580 | think she's done a great job on that, but whatever. I want
01:01:21.060 | to go back to issues that affect Silicon Valley: the 25%
01:01:24.340 | unrealized gains tax. It seems like most of Silicon Valley,
01:01:27.940 | almost all of it, either disagrees with this or is up in
01:01:30.820 | arms about this. I think, J Cal, you said that this
01:01:34.020 | is disqualifying. It's disqualifying for me, for sure.
01:01:36.820 | Yeah. So I mean, do you agree that a large unrealized gains
01:01:41.060 | tax 25% would be a disaster for Silicon Valley and the whole
01:01:45.540 | startup ecosystem? Or I mean, how do you come down on that?
01:01:48.420 | Well, as I understand it, the way that tax is proposed, it
01:01:53.300 | turns on whether 80% of your net worth is liquid. That's right.
01:01:58.180 | Yeah, if 80% or more is illiquid, then
01:02:02.100 | you get to defer the tax, but there's a penalty.
01:02:04.980 | Yeah, you get to defer the tax with a penalty. That's right.
01:02:07.940 | Look, I think it definitely has a chilling impact. And it's
01:02:10.500 | definitely stupid and definitely shouldn't happen.
01:02:12.340 | You know, so is it? Yeah, I think we got your position on
01:02:16.900 | it. It's stupid.
01:02:17.860 | Why isn't it? Why isn't it disqualifying, the way that J
01:02:24.580 | Cal says? Are we just supposed to hope that she doesn't do what
01:02:28.660 | she says she is going to do? I'll tell you why. I think that both the
01:02:32.500 | Republicans and the Democrats have realized that there's
01:02:36.420 | actually very little difference on a lot of the major things
01:02:40.180 | that they actually talk about. So what they're both being
01:02:44.420 | forced to do is realize that because the core of
01:02:47.860 | what they say is the same, they each have to go
01:02:50.820 | to their flanks to get the n-plus-one vote. And so Kamala
01:02:55.220 | goes to the left and spouts all this stuff that seems so
01:02:58.580 | socialist or communist because she has to get
01:03:02.420 | those people to vote for her. Ultimately, I think what ends
01:03:06.980 | up happening is most of the stuff in the middle has a decent
01:03:10.260 | chance of happening. The stuff at the fringes, I think, they
01:03:13.380 | put up, Sacks, almost as like a sacrificial lamb. A good
01:03:16.980 | example, I think, is like all of the stuff that's happening
01:03:19.540 | with the student loan reform, a half a trillion dollar plan, it
01:03:23.460 | gets shot down by the Supreme Court, this new plan, another
01:03:26.660 | $100 billion, not even being heard yet by the Supreme Court.
01:03:30.580 | So I think they know this. I mean, it's not like the Biden
01:03:33.780 | administration is dumb. The Trump administration is not
01:03:36.580 | dumb either. So I think what they're doing is, deporting 10 million
01:03:38.820 | people would be an example on the right, and taking away a
01:03:41.300 | woman's right to choose would be the other one. Yeah. And by
01:03:44.180 | the way, you keep bringing that up, but
01:03:46.420 | Trump has said that he would veto, he would not support, a
01:03:49.380 | national ban. I'm talking about what he already did; he
01:03:52.260 | already overturned it. I'm talking about that. Yeah. And by the
01:03:54.980 | way, that just returned the issue to the states. It's not outlawing
01:03:57.460 | abortion. And by the way, ask the people in Austin, Texas.
01:04:02.340 | Well, but that's a ballot initiative. They have not had a
01:04:07.540 | ballot initiative. Just about everywhere there's been
01:04:09.460 | a ballot initiative, the pro-choice forces have won. And
01:04:12.660 | besides, that's a state issue now, J Cal, not federal. Yeah,
01:04:15.220 | no, it's a state issue. And Trump succeeded in taking away
01:04:17.220 | a woman's right to choose in Texas. But one thing, by the
01:04:19.940 | way, look, in the spirit of the All-In podcast, I wanted to be
01:04:22.580 | clear: there's this stuff on
01:04:25.140 | the Dems' side, some of their economic policy for the far
01:04:28.420 | left people, that, you know, they're
01:04:30.580 | advocating for, that I'm opposed to. You know, Sacks, I'd love to
01:04:33.300 | hear from you, what parts of Trump's thing are you opposed to?
01:04:37.060 | There we go. Well, I mean, I have been consistent on this pod
01:04:41.140 | for years that I thought that the, let's call it the like,
01:04:45.860 | extreme pro life side was not good for the Republican Party,
01:04:49.380 | and I've been opposed to it. I don't think it's what J. Cal
01:04:52.580 | says, I think that overturning Roe v. Wade did not abolish
01:04:55.780 | abortion, it basically returned the issue to the states. And if
01:04:58.500 | you look at the referenda that have happened, they've pretty
01:05:00.580 | much all gone the pro choice direction. So I think that the
01:05:04.260 | overturning of Roe v. Wade has actually allowed the country to
01:05:06.660 | sort of sort out that issue, although it's not completely
01:05:09.860 | sorted out. But look, I would not support a national
01:05:13.780 | abortion ban; I would not support refederalizing the issue. I
01:05:17.940 | think there's a lot of issues about, you know, war and peace
01:05:21.620 | where I do not support the, you could say the establishment
01:05:25.460 | neocon strand within the party. I do not support all these
01:05:28.900 | interventions, I do not support these forever wars. And there
01:05:32.420 | is a big debate in the party about that. Now, one of the
01:05:34.900 | reasons why at the end of the day, I support Trump, is I know
01:05:38.820 | this will strike some people as counterintuitive, but I think
01:05:45.060 | he is the moderate within the Republican Party. He's a
01:05:47.700 | moderate on abortion. I know, J. Cal, you're still bitter about
01:05:50.420 | that Supreme Court case. However, he's been very, very
01:05:53.860 | clear that he will not support a national abortion ban. Moreover,
01:05:56.980 | he took the abortion language out of the Republican platform.
01:06:00.260 | I think he's the moderate on issues of war. He was the first
01:06:06.100 | Republican candidate to run opposing Bush's forever wars.
01:06:10.580 | So I give him credit on those things. On style, he may not
01:06:14.260 | come across as a moderate, but those are style points. I think
01:06:17.300 | on issues, he is the moderate. The issue I have with Kamala
01:06:21.060 | Harris is I don't think she's a moderate, you know? So like,
01:06:23.940 | just to take this 25% unrealized gains tax first, when this issue
01:06:29.060 | came up, we were assured, well, she doesn't really believe that,
01:06:31.780 | even though it was in the Democratic platform, and it was
01:06:34.340 | in the Biden-Harris budget. Then people said, well, maybe it's
01:06:38.740 | part of her platform, but it's not a priority for her. And we
01:06:42.420 | just had one of her, like, top economic advisors come out on,
01:06:45.620 | I think it was CNBC, defending it, and her campaign confirmed
01:06:50.260 | that she supports it, okay? So now the argument has become,
01:06:54.260 | well, she supports it. It is really part of the platform. She
01:06:57.700 | would do it if she could, but she's not going to be able to do
01:06:59.860 | it. I just don't think that's a ringing endorsement of a
01:07:03.300 | candidate. I don't think you want to support a candidate,
01:07:05.540 | because they're not going to be able to do what they really
01:07:07.620 | want to do.
01:07:08.180 | Do you think she's a moderate? Or do you think she's a socialist,
01:07:12.420 | you know, going to take the country very far left?
01:07:15.380 | By the way, yeah, but what Sachs didn't address is Trump's
01:07:19.620 | tariff policy, which is also inflationary, almost equivalent
01:07:23.220 | to the price gouging, you know, food price caps. I think that
01:07:25.940 | they're both inflationary, and they're both bad policy. That's
01:07:28.580 | my personal point of view.
01:07:29.780 | Tariffs is where I thought it was going to go.
01:07:31.380 | But yeah, anyway, yeah.
01:07:32.900 | Honestly, I'm not sure what I think of that proposal. You
01:07:37.220 | know, I guess it depends on the details.
01:07:38.580 | What do you think Reid is?
01:07:39.380 | I'm not endorsing it, but I'm not opposing it. But just back
01:07:41.940 | to this point: should we support Kamala Harris, even
01:07:45.620 | though we oppose all the policies that her campaign says
01:07:49.140 | she supports? Because it seems like that's the argument now:
01:07:53.140 | that Silicon Valley is expected to support Harris, even though
01:07:56.420 | she wants, and her campaign has confirmed, she wants a 44%
01:08:00.500 | capital gains tax. She wants a 25% unrealized gains tax.
01:08:04.740 | These are things that I think the vast majority of Silicon
01:08:07.300 | Valley considers to be disastrous for the startup
01:08:10.180 | ecosystem. Should we support her in spite of those things?
01:08:13.540 | Well, look, The Information did an actual data poll, as opposed
01:08:18.900 | to us being talking heads saying Silicon
01:08:21.700 | Valley does X or Y. And, you know, The Information's poll
01:08:24.980 | showed that there was, you know, much broader support for the
01:08:28.500 | Democratic ticket than the Republican ticket.
01:08:30.420 | Is that the thing that Ron Conway just tweeted?
01:08:32.900 | He might have. I don't know.
01:08:34.740 | No, no, no, no, that's different. That's a subset.
01:08:39.140 | That's a different group. That's a group to counteract you and
01:08:42.660 | Chamath throwing a fundraiser for Trump.
01:08:45.700 | But The Information, a news source that ran a poll, you
01:08:53.540 | know, did it objectively, ran the whole thing to try to answer
01:08:55.940 | the question, came out with more folks in favor of, you know,
01:09:00.660 | the Biden-Harris ticket, I believe.
01:09:02.580 | Why do you think that is?
01:09:03.780 | Why do I believe that is?
01:09:04.740 | Well, because, look, taxes is an important issue.
01:09:07.220 | And I think if you ask any Silicon Valley business person,
01:09:09.540 | they'll say, look, lower capital gains, promote long-term
01:09:12.260 | investment. Ask me, that's what I would say too.
01:09:14.340 | But, you know, you kind of go, well, what actually, in fact,
01:09:17.220 | you most need for business is stability, rule of law, not
01:09:22.180 | grifter capitalism, where it's like, you know, give me the
01:09:24.740 | ability to launch my own NFT, you know, et cetera, et cetera.
01:09:28.580 | You know, that's where they go.
01:09:30.500 | We want that.
01:09:31.700 | And by the way, we can navigate a higher tax rate.
01:09:34.420 | It'll mean less fast growth and everything else, but we can
01:09:36.980 | still invest, create businesses, you know, et cetera, et cetera.
01:09:40.660 | But we can't do it with, you know, kind of a corroding of the
01:09:44.500 | rule of law, right?
01:09:46.020 | Like, you know, I think both of you, David, you and Chamath,
01:09:49.940 | spoke out against the January 6th stuff.
01:09:51.620 | I'm curious where you are on that now.
01:09:53.380 | It's still top of mind for me.
01:09:55.300 | That's the reason why the rule of law thing is my
01:09:57.620 | red line, not a tax policy.
01:09:59.380 | Well, let me ask you about that formally here.
01:10:01.300 | There are five cases against Trump.
01:10:03.700 | You have the insurrection case.
01:10:05.540 | You have the New York taxes case.
01:10:06.900 | You have the hush money case.
01:10:08.340 | You have the E. Jean Carroll case.
01:10:09.780 | And what am I missing there?
01:10:13.540 | Oh, and the documents case.
01:10:15.540 | You funded, like Peter Thiel funded the Gawker case, the E.
01:10:18.740 | Jean Carroll case, which Trump lost.
01:10:21.540 | And just to ask you, why did you choose to fund that?
01:10:26.180 | And do you believe Trump sexually assaulted E.
01:10:29.300 | Jean Carroll?
01:10:29.860 | Well, it's kind of not relevant whether I do or not.
01:10:33.700 | What I funded was an ability to have, you know, kind of a woman
01:10:38.340 | who doesn't have power, who's being threatened by a rich man
01:10:40.980 | with a lot of money and power to try to silence her, to have
01:10:43.940 | her day in court where 12 everyday Americans, right, can
01:10:48.340 | come to a judgment.
01:10:49.220 | And their judgment was that there was an assault and there
01:10:52.660 | was slander about the assault.
01:10:54.340 | And they did it twice.
01:10:56.340 | And so that was the reason I funded it.
01:10:59.060 | And, you know, I think that that's important.
01:11:01.540 | You know, it's more important that the laws apply to rich and
01:11:04.900 | powerful people than it is that they apply to poor people.
01:11:07.620 | That's what's important about it. Like, one thing I love about
01:11:11.300 | America is it's a rule-of-law system.
01:11:13.220 | And I think that's what's most important.
01:11:15.220 | And that's what's really fundamental.
01:11:16.740 | That's my red line relative to the kind of lines in the sand
01:11:19.620 | that we're talking about.
01:11:20.340 | And, you know, that's the reason why, in
01:11:25.300 | the various kinds of lawsuits where that seemed to be
01:11:28.340 | what's being emphasized,
01:11:30.180 | then, you know, I'm happy to support them.
01:11:32.660 | I don't see how it's rule of law when you have a district
01:11:38.500 | attorney, Alvin Bragg, who's elected on a promise to get
01:11:41.380 | Trump, he then takes what is at most a bookkeeping misdemeanor
01:11:48.500 | whose statute of limitations has expired,
01:11:51.300 | and he turns it into 34 felony charges on a legal theory that
01:11:55.380 | was never explained to the jury.
01:11:57.300 | And then basically Trump is convicted in a sham trial by a
01:12:03.140 | hyper partisan New York jury system so that Democrats can
01:12:07.220 | then run.
01:12:07.700 | What do you think of that?
01:12:08.420 | On the branding, on the branding that he's a, quote,
01:12:10.500 | convicted felon.
01:12:11.540 | So there are four other trials.
01:12:12.900 | I don't think it's rule of law.
01:12:15.860 | Hold on.
01:12:16.420 | I don't want to get feedback.
01:12:18.420 | I want to get Reid's feedback too.
01:12:19.460 | But let me just finish my point.
01:12:20.580 | I don't think it's rule of law when Trump is prosecuted on a
01:12:24.260 | documents charge that Biden himself is guilty of.
01:12:26.660 | He's got all these documents in his garage for decades, which
01:12:30.260 | the judge has thrown out.
01:12:32.900 | And we've seen a bunch of these lawfare cases where Trump has
01:12:36.420 | ultimately prevailed.
01:12:38.020 | The judge has thrown it out or he's won it on appeal.
01:12:40.100 | So that seems to me like abuse of the legal system for a
01:12:44.180 | partisan political goal, not rule of law.
01:12:46.740 | OK, so, Reid.
01:12:47.300 | There's four other cases.
01:12:48.500 | Two of them Trump's been convicted in.
01:12:50.260 | Two of them are outstanding.
01:12:51.380 | What's your take on the four cases?
01:12:52.900 | You've heard Sacks's take.
01:12:53.940 | So what are the two that he's been convicted in?
01:12:56.580 | Jake, what's the other one besides the three days?
01:13:00.420 | In the Alvin Bragg case, he was convicted.
01:13:02.100 | And then the Trump organization with the CFO committing tax
01:13:07.620 | fraud.
01:13:08.180 | He was convicted in that one as well, or the Trump organization
01:13:10.900 | was convicted.
01:13:11.540 | And people say that's lawfare by Letitia James.
01:13:14.260 | So guilty, guilty, guilty in those three of five.
01:13:17.220 | So what's your take on the four that we haven't discussed yet
01:13:19.620 | and heard your opinion on?
01:13:20.420 | So look, I think it's, you know, it's definitely possible to
01:13:26.580 | have some versions of lawfare, although I think most people
01:13:29.940 | use the term when it's the legal process and the law
01:13:33.060 | enforcement that they don't like.
01:13:35.300 | You know, I think that in the Bragg case, you had, you know,
01:13:39.220 | an indictment and 12 jurors.
01:13:42.740 | I think, as I recall, one of the jurors said that
01:13:46.580 | they got their principal news from Truth Social.
01:13:49.060 | It was a unanimous conviction.
01:13:50.420 | I think that, you know, you have Vice President Pence, you
01:13:56.900 | know, comes out and says, you know, Trump asked me to
01:14:00.420 | overturn the election illegally.
01:14:02.660 | Right.
01:14:02.900 | That's your own vice president.
01:14:04.580 | So I don't think that this is
01:14:07.540 | just rampant political persecution; there's a
01:14:10.420 | lot of fire where there's all this smoke. It doesn't mean that
01:14:12.980 | every single thing is, you know, kind of, Democrats trying
01:14:16.260 | to put Trump in jail for 700 years.
01:14:18.660 | These cases are still outstanding.
01:14:20.260 | They want to put him in jail.
01:14:21.300 | Reid.
01:14:21.860 | Do you think he should go to jail?
01:14:23.060 | I think if he broke laws, then yes, he should go to jail. I
01:14:27.380 | think the laws apply to powerful people as much as they
01:14:30.180 | apply to everyday people.
01:14:31.780 | Right.
01:14:31.940 | So, why were these cases...
01:14:33.940 | Why did they wait for two years on these cases so they could
01:14:36.820 | bring them in an election year?
01:14:38.500 | Actually, I don't think so. If you look at it, like, look,
01:14:41.780 | speaking factually, Trump's lawyers are always trying to
01:14:44.420 | delay the stuff.
01:14:45.300 | Right.
01:14:45.540 | I think they were trying to follow every legal process, and
01:14:47.700 | Trump's lawyers kept asking for delays so he could campaign this year instead
01:14:50.580 | of being stuck in a courtroom.
01:14:51.540 | Look, this was last year and the year before:
01:14:54.660 | asking for deferrals, setting out trial time. Like, all of
01:14:57.940 | the stuff was from his side trying to delay it.
01:15:01.380 | If it got delayed into this year, that's a bad judgment on
01:15:04.420 | his part.
01:15:05.220 | Jack Smith just filed new charges, and all
01:15:09.700 | this stems from January 6th.
01:15:11.140 | In the wake of January 6th, Merrick Garland's Justice
01:15:13.140 | Department did an analysis of whether Trump could be
01:15:16.260 | prosecuted for incitement, whether he incited that mob.
01:15:19.620 | And the legal memo came back and they said, no, we don't
01:15:21.940 | have a case here.
01:15:22.500 | It does not meet the legal bar for incitement.
01:15:24.580 | Then it was reported by The New York Times that Biden
01:15:27.620 | thought that Merrick Garland was basically being a wimp and
01:15:30.580 | they need to go after Trump.
01:15:32.180 | So the hyper-partisan DA or prosecutor Jack Smith was
01:15:35.620 | hired and he came up with a novel legal theory that somehow
01:15:39.220 | Trump had perpetrated a fraud on the American people, never
01:15:41.780 | been seen before.
01:15:42.900 | And since then, he's been prosecuting Trump and seeking
01:15:44.900 | to put him in prison.
01:15:45.940 | And when the Supreme Court just kicked the legs out from
01:15:48.580 | under his case with a recent decision, he just refiled
01:15:51.700 | charges.
01:15:52.660 | I don't understand how anyone can look at this and say,
01:15:54.420 | yeah, look, what happened on January 6th wasn't great, but
01:15:56.580 | the DOJ looked at it, it wasn't criminal, but yet they've
01:15:59.940 | been pursuing this guy, seeking to put him away for the rest
01:16:02.500 | of his life, seeking to interfere with this election,
01:16:06.020 | seeking to deprive the American people of a choice.
01:16:07.860 | On a separate track, you've got Democrats in states like
01:16:12.180 | Colorado literally removing Trump from the ballot.
01:16:14.580 | - Okay, Reid, your thoughts?
01:16:16.420 | - So look, the first thing is January 6th, I think is a
01:16:23.220 | red line.
01:16:24.020 | I think he did incite a riot, whether or not the
01:16:26.740 | legal--
01:16:26.980 | - Then why not prosecute him for it?
01:16:28.580 | - Let's let Reid finish.
01:16:29.620 | - Yeah, you know, I was--
01:16:31.700 | - Fair enough, go ahead, sorry.
01:16:32.740 | - Right.
01:16:33.620 | So I think there was, you know, an incitement
01:16:37.700 | of a riot.
01:16:38.260 | I think that the rioters went in and, you know, killed
01:16:42.900 | police officers, were looking to kill Vice President Pence,
01:16:46.180 | you know, from the court testimony, courts are the best
01:16:48.420 | proxy that we have for finding truth in this stuff.
01:16:51.060 | It's one of the reasons why, you know, by the way, and when,
01:16:53.780 | for example, the Supreme Court says, no, that's great,
01:16:55.780 | that's legal process.
01:16:56.900 | - I just have to fact check that no police officers
01:16:59.460 | were killed.
01:16:59.940 | Where are you getting that from?
01:17:01.700 | - I think there was the one who died from his
01:17:05.780 | injuries, you know, very soon after, and then--
01:17:08.020 | - No, no, no, no, there was one cop who had a seizure later.
01:17:11.220 | It wasn't part of the riot.
01:17:13.220 | No police officers were killed as a part of the riot.
01:17:15.460 | I just have to fact check that.
01:17:16.900 | It's just not true.
01:17:17.700 | - Well, and then there's the one who committed suicide too,
01:17:20.740 | which is a question of, you know--
01:17:22.100 | - I don't know how you can attribute that.
01:17:23.940 | - Yeah, so anyway, so you got, you know, the storming
01:17:29.700 | of the Capitol, you know, he says these people
01:17:32.980 | are American heroes.
01:17:35.140 | He's going to pardon them.
01:17:36.100 | He's going to hire them into his administration, right?
01:17:39.060 | And if that's not encouragement for other people
01:17:40.980 | doing similar things, you know--
01:17:42.500 | - Wait, he's going to hire January 6th rioters?
01:17:44.660 | He's going to hire them?
01:17:45.460 | - Yeah, yeah, well, we'll get you the Trump speech.
01:17:50.660 | There's all kinds of wonderful things in Trump's speeches.
01:17:52.500 | - Let me just move to one thing.
01:17:56.180 | So, Reid, I think this has been an amazingly robust
01:18:00.020 | conversation, and I think you, as always,
01:18:02.020 | as long as I've known you now for 20 years,
01:18:05.300 | have been really intellectually honest.
01:18:06.980 | I want to ask you a favor, which is,
01:18:10.420 | can you stay for 10 extra minutes
01:18:12.340 | and talk to Bobby Kennedy with us?
01:18:14.500 | And the reason I want you to do that is,
01:18:16.340 | I think that there is a bunch of misinformation.
01:18:18.820 | I asked you about these things.
01:18:19.860 | I think it's important to hear maybe from Bobby,
01:18:22.820 | and just for him to know what you said,
01:18:24.820 | because I do think it's important
01:18:25.940 | to hear it from the horse's mouth.
01:18:27.140 | Can you just give us like five, 10 minutes
01:18:28.580 | so that we can do that?
01:18:29.940 | 'Cause I think it would be an important thing to do.
01:18:32.020 | - Sorry, when is that?
01:18:32.900 | - No, just right now.
01:18:33.860 | - We're rolling into an interview with RFK Jr.
01:18:36.900 | - It wasn't designed this way.
01:18:39.380 | - It's just that at the last minute RFK Jr., who's on vacation,
01:18:42.180 | said that he would talk to us about what it was like
01:18:44.660 | to kind of withdraw and all this sort of stuff.
01:18:46.900 | And so we booked it right after you,
01:18:49.300 | but he's in the waiting room.
01:18:50.340 | - I'm fine to do it.
01:18:51.380 | I mean, it's- - Oh, okay, totally your choice.
01:18:53.540 | - It's one of the things I like about your all-in podcast
01:18:57.940 | is kind of like, let's try to speak truth.
01:19:01.540 | - Okay. - Yeah, right, so.
01:19:02.660 | - Hey, Bobby Kennedy is here.
01:19:04.660 | Mr. Kennedy, it's great to have you
01:19:06.100 | on the all-in podcast for a second time.
01:19:08.100 | May I introduce you to Reid Hoffman,
01:19:11.220 | who you may know of,
01:19:12.420 | but I don't think you two have ever met.
01:19:14.180 | - We have not.
01:19:15.220 | - Pleasure to meet you.
01:19:16.660 | - Likewise, pleasure.
01:19:18.660 | - Mr. Kennedy, you dropped out of the race.
01:19:22.260 | Perhaps you could tell us,
01:19:23.620 | and I was quite disappointed about it.
01:19:25.220 | I really wanted to see a third-party candidate
01:19:28.500 | get into double digits again.
01:19:30.020 | I just want to commend you on the effort
01:19:31.860 | that you put into it.
01:19:32.740 | Maybe you could tell the audience
01:19:34.900 | why you, as a reported never-Trumper,
01:19:37.940 | joined the Trump team and dropped out.
01:19:39.460 | - So, Jason, you know,
01:19:43.220 | I haven't actually terminated my campaign.
01:19:47.140 | I suspended it.
01:19:48.260 | We've taken ourselves,
01:19:50.180 | or we're trying to take ourselves off the ballot
01:19:53.300 | in about 11 states.
01:19:56.100 | So we'll remain on the ballot in 39 states.
01:19:58.740 | In all red and all blue states we'll be on the ballot;
01:20:03.460 | in the states where we felt
01:20:06.420 | we were going to hurt President Trump,
01:20:10.340 | where the polling showed that, we're getting off there,
01:20:13.940 | mainly the battleground states.
01:20:16.180 | Ironically now, the same people
01:20:19.540 | who've been trying to get me off the ballot
01:20:21.460 | for a year, since October,
01:20:24.500 | are now fighting to keep me on the ballot
01:20:28.260 | in those states.
01:20:29.140 | So that's one of the sort of ironies.
01:20:31.940 | You know,
01:20:35.140 | about two months ago,
01:20:39.860 | it became clear
01:20:43.940 | that I was not going to be allowed
01:20:45.140 | on the debating stage.
01:20:46.340 | And I pretty much had a shutout
01:20:50.500 | in the mainstream media.
01:20:52.660 | So the mainstream media,
01:20:54.740 | ABC, NBC, CBS, CNN, MSNBC,
01:21:00.500 | in 16 or 17 months,
01:21:04.180 | I had only two live interviews.
01:21:07.060 | Ross Perot during his 10 month campaign
01:21:12.020 | had 34 interviews.
01:21:14.660 | And then, you know, all of them
01:21:16.580 | were very, very much aligned
01:21:18.660 | with the Democratic National Committee.
01:21:20.740 | And so when they did mention my name,
01:21:23.620 | which was pretty often,
01:21:24.820 | it was accompanied by a lot of defamations
01:21:27.700 | and pejoratives and mischaracterizations, et cetera.
01:21:32.660 | So I never really had a chance
01:21:35.060 | to reach those audiences.
01:21:37.140 | The audiences that I was reaching,
01:21:38.980 | I was dominating in.
01:21:40.340 | I was beating all the candidates
01:21:42.900 | among independents,
01:21:43.860 | which is now the largest demographic.
01:21:45.860 | I was beating them among young people.
01:21:49.140 | So your audience was supporting me.
01:21:52.180 | The audiences that were listening
01:21:55.460 | to long-form interviews,
01:21:57.060 | I was dominating.
01:21:58.340 | But in the older audiences,
01:22:00.900 | which is critically the baby boomers,
01:22:02.900 | people who really should have been for me,
01:22:04.900 | because they are people who,
01:22:08.180 | from my generation,
01:22:09.460 | who remember the Kennedy administration,
01:22:12.340 | they were part of Camelot.
01:22:13.620 | Also, I was very, very popular with them
01:22:17.860 | for many years when I was
01:22:20.100 | the environmental champion alone.
01:22:21.860 | And I should have had good inroads,
01:22:25.300 | but I was never able to communicate with them
01:22:27.540 | because they watch,
01:22:29.380 | they get their news from the mainstream media.
01:22:32.340 | And if you're living in that information ecosystem,
01:22:36.260 | you're going to have a very, very low opinion of me.
01:22:40.340 | I mean, if I was getting my information
01:22:42.420 | from those networks,
01:22:43.540 | I wouldn't vote for myself.
01:22:44.820 | And then President Trump reached out to me
01:22:52.900 | through this guy, Calley Means,
01:22:55.140 | who's a food advocate,
01:22:57.620 | a safe food advocate.
01:22:59.140 | About three hours after the shooting in Butler,
01:23:04.580 | I got a call from him.
01:23:08.740 | And he asked me if I was still interested
01:23:12.180 | in, specifically, the VP slot.
01:23:18.820 | And I said, "No."
01:23:19.620 | I would not have taken
01:23:22.500 | the vice president slot.
01:23:26.100 | And he asked, would I
01:23:31.140 | be willing to talk to the administration?
01:23:32.980 | To President Trump, rather.
01:23:36.980 | And at first I said, "No."
01:23:38.900 | And I talked with some of my family members,
01:23:43.860 | including my kids.
01:23:47.380 | And I then sent Calley Means a note saying,
01:23:50.820 | "You know, I'm interested in talking."
01:23:53.220 | And I got a call almost immediately
01:23:56.820 | from President Trump.
01:23:57.780 | I spent about, I don't know,
01:24:00.260 | 30 minutes on the phone with him.
01:24:02.500 | And we met the next day in Minneapolis.
01:24:07.060 | We had continuing talks with him.
01:24:13.220 | And then we met again more recently
01:24:16.180 | for a very, very intensive
01:24:17.780 | and long meeting at Mar-a-Lago
01:24:19.460 | with some of his family members.
01:24:21.060 | And during those meetings,
01:24:23.140 | during the first meeting,
01:24:24.340 | we talked about the idea
01:24:27.140 | of having a unity ticket
01:24:28.500 | where I would remain on the ballot,
01:24:31.380 | where we would ally ourselves
01:24:33.700 | on certain critical issues.
01:24:36.740 | But we would be able to continue
01:24:39.380 | to criticize each other on the issues
01:24:41.700 | that we did not align on.
01:24:44.500 | And President Trump was very happy
01:24:47.700 | with that arrangement.
01:24:48.740 | And the issues, the critical issues
01:24:52.740 | on which we agreed,
01:24:53.780 | and I was really stunned to see
01:24:56.740 | the level of his commitment
01:24:58.100 | to those issues.
01:25:00.260 | And there were three issues
01:25:02.020 | that really got me
01:25:03.140 | into the presidential campaign.
01:25:04.660 | One was ending the war in Ukraine.
01:25:08.660 | The second issue was ending the censorship.
01:25:13.220 | And the third issue,
01:25:15.700 | and most important to me,
01:25:16.980 | was addressing the childhood
01:25:20.500 | chronic disease epidemic.
01:25:22.580 | And these connected issues
01:25:25.860 | about soil health
01:25:27.380 | and the corruption
01:25:29.300 | in our regulatory agencies,
01:25:31.140 | the USDA and FDA, NIH, CDC,
01:25:35.220 | and HHS,
01:25:37.300 | which have become sock puppets
01:25:39.060 | for the big pharmaceutical,
01:25:42.180 | big ag, big food processing industries
01:25:45.060 | that they're supposed to be regulating.
01:25:46.740 | He was very, very much aligned
01:25:50.340 | on those issues,
01:25:51.380 | and it gave us essentially a beachhead
01:25:54.500 | from which to construct,
01:25:57.860 | you know, this alliance.
01:25:59.780 | Bobby, let me ask you a question.
01:26:01.380 | I just want to go back a little bit
01:26:02.500 | because I just want to make sure
01:26:03.620 | I heard it properly.
01:26:04.500 | When Calley called you,
01:26:05.700 | was it to be the VP
01:26:10.020 | on the Trump ticket?
01:26:11.300 | And was that asked?
01:26:14.340 | And did you consider it?
01:26:15.220 | And why did you say no to that,
01:26:17.220 | but then say yes to this?
01:26:18.900 | Well, I had no interest
01:26:20.660 | in being a vice president.
01:26:23.540 | If you're a vice president,
01:26:25.220 | it's a, you know,
01:26:26.180 | I grew up in politics,
01:26:27.620 | and vice president is the worst job
01:26:29.220 | in Washington.
01:26:29.940 | You have no budget.
01:26:32.340 | You have no staff except what the White House gives you;
01:26:34.340 | your budget actually all comes
01:26:35.780 | through the White House.
01:26:36.660 | So if you do something
01:26:39.700 | that offends the president,
01:26:41.860 | he can literally, you know,
01:26:43.620 | he can take away your plane.
01:26:44.820 | He can take away your staff.
01:26:47.220 | The only thing
01:26:49.540 | you really have is the Naval Observatory,
01:26:51.780 | which is the official residence
01:26:54.500 | of the vice president.
01:26:55.620 | And he can essentially put you
01:26:56.900 | under house arrest.
01:26:57.860 | And, you know, I have
01:26:59.940 | very strong views on issues.
01:27:01.940 | And I, you know, I felt like
01:27:03.940 | if I took that job,
01:27:04.900 | I'd be on house arrest
01:27:06.260 | probably on day three.
01:27:07.460 | So, you know,
01:27:13.220 | I was never interested in that.
01:27:14.660 | - Reid had to run,
01:27:15.540 | but let's just thank him
01:27:16.420 | for appearing on the pod.
01:27:17.860 | And I thought it was a great conversation.
01:27:19.780 | - Yeah, thanks Reid.
01:27:21.220 | - But I thought he approached it
01:27:23.460 | in good faith and kudos to him
01:27:24.740 | for stepping into the lion's den.
01:27:26.660 | So he was great.
01:27:27.300 | - Yeah.
01:27:27.940 | - Sorry, Bobby, keep going.
01:27:29.700 | - So Bobby, let me ask you a question.
01:27:31.380 | You are reportedly a never Trumper.
01:27:35.300 | There's massive fallout,
01:27:38.500 | which you cited personally
01:27:40.500 | as a resident of Malibu.
01:27:42.100 | The extremely talented woman
01:27:44.820 | you're married to,
01:27:45.860 | based on everything I can tell,
01:27:48.100 | is maybe not a fan of Trump.
01:27:49.300 | So this is maybe causing
01:27:50.340 | some domestic and some
01:27:52.820 | local town issues for you.
01:27:54.180 | Tell me about your journey
01:27:55.380 | from a never Trumper,
01:27:56.740 | all your friends are,
01:27:57.700 | I think never Trumpers
01:27:58.900 | to now joining with Trump.
01:28:00.660 | That's gotta be a hard decision, no?
01:28:02.740 | - Yeah, it was a very hard decision.
01:28:05.860 | But you know,
01:28:07.300 | my whole kind of journey
01:28:08.580 | over the past 17 months
01:28:12.980 | was just
01:28:15.060 | kind of a series of
01:28:16.820 | very, very difficult
01:28:23.380 | transitions, you know,
01:28:25.140 | away from the Democratic Party.
01:28:26.660 | The Democratic Party was,
01:28:27.940 | you know, our party;
01:28:30.660 | my family is one of the central
01:28:33.380 | pillars of the Democratic Party.
01:28:35.220 | My family has been
01:28:36.980 | in the Democratic Party since 1848,
01:28:39.380 | since my great-grandparents came over.
01:28:41.460 | And my great-grandfather,
01:28:44.580 | Honey Fitz was the first
01:28:45.860 | Irish Catholic mayor of Boston.
01:28:47.540 | His contemporary, Patrick Joseph Kennedy,
01:28:51.220 | was a state senator
01:28:52.340 | and political boss in Massachusetts.
01:28:54.340 | My grandfather, Joseph Kennedy,
01:28:57.060 | was, you know, FDR's treasurer.
01:28:59.380 | He was the first head of the SEC.
01:29:02.660 | He was the ambassador
01:29:03.860 | to the court of St. James,
01:29:07.460 | deeply, deeply immersed
01:29:08.900 | in Democratic Party politics.
01:29:11.380 | All of my uncles, Joe Kennedy,
01:29:13.860 | who was a delegate to the 1940 convention,
01:29:17.140 | who spoke there,
01:29:18.980 | who was a featured speaker
01:29:20.100 | and then was killed in World War II.
01:29:21.700 | My uncle, John Kennedy,
01:29:23.780 | was the first Irish Catholic
01:29:25.620 | president of the United States.
01:29:29.940 | My other uncle, Ted Kennedy,
01:29:33.380 | who was one of the longest,
01:29:34.740 | I think the second or third
01:29:36.020 | longest serving member
01:29:37.140 | of the United States Senate.
01:29:38.980 | His name is on more bills
01:29:42.420 | than any other senator
01:29:43.460 | in United States history.
01:29:44.740 | And then, of course,
01:29:46.820 | my father, who was attorney general
01:29:48.740 | and so to walk away
01:29:51.300 | from that party was,
01:29:54.420 | you know, I guess,
01:29:57.060 | very, very difficult for me.
01:29:58.900 | And I was actually
01:30:00.020 | the last person in my campaign
01:30:01.860 | to see that, to understand
01:30:04.900 | the necessity of that,
01:30:06.020 | that the Democratic Party
01:30:07.380 | was not going to allow me
01:30:08.740 | to compete fairly,
01:30:12.420 | that they, you know,
01:30:13.460 | they had rigged the system against us
01:30:15.780 | in ways that were really
01:30:17.300 | quite extraordinary.
01:30:19.140 | They had just walked away
01:30:20.420 | from democracy.
01:30:21.380 | They were cancelling primaries.
01:30:23.140 | They had chosen their candidate
01:30:26.580 | and it was going to be
01:30:27.380 | President Biden and I was
01:30:30.020 | really a nuisance to them.
01:30:31.380 | And so my voice was not
01:30:34.100 | allowed out there.
01:30:35.780 | And so that was difficult.
01:30:40.340 | And then leaving,
01:30:44.980 | you know, I declared
01:30:46.180 | independence in October,
01:30:47.540 | and then joining
01:30:50.900 | President Trump,
01:30:52.660 | I burned a lot of bridges.
01:30:55.940 | I burned my boats.
01:30:57.060 | Let me put it that way.
01:30:58.100 | >>Clearly, it is definitely
01:31:00.580 | a challenging thing to go from
01:31:03.060 | your family being the bedrock
01:31:05.700 | of the Democratic Party
01:31:06.900 | and Trump being, you know,
01:31:09.140 | obviously seen as an existential risk
01:31:10.820 | by the Democratic Party.
01:31:13.540 | So what should Americans
01:31:16.340 | know about the state of politics
01:31:19.620 | and fairness in America
01:31:21.460 | based on what you've learned?
01:31:22.580 | What do you want
01:31:23.700 | the American people to know
01:31:25.300 | about the process
01:31:26.500 | of selecting a president?
01:31:27.620 | >>Well, and, you know,
01:31:29.700 | I do want to say that I feel
01:31:31.300 | like I didn't really
01:31:34.180 | leave the Democratic Party.
01:31:35.220 | The Democratic Party left me
01:31:37.460 | and left in ruins
01:31:43.380 | the infrastructure
01:31:45.620 | that I think my uncle and father had,
01:31:48.340 | that had made them Democrats.
01:31:49.940 | If you went on a list
01:31:52.260 | of all of the priorities
01:31:53.540 | that Robert Kennedy,
01:31:54.580 | that John Kennedy had,
01:31:55.940 | I would check every box.
01:31:58.420 | You know, they were anti-war.
01:32:00.500 | They were anti-censorship.
01:32:02.100 | They were, they were against
01:32:04.820 | the corporate control of,
01:32:06.500 | of our country.
01:32:07.860 | This corrupt merger
01:32:10.900 | of state and corporate power
01:32:13.060 | that now has emerged
01:32:14.340 | as a dominant governing model
01:32:16.180 | in our country.
01:32:16.980 | The Democratic Party
01:32:19.780 | has changed demographically.
01:32:21.380 | When I grew up
01:32:24.340 | in a Democratic Party
01:32:25.540 | that was the party
01:32:26.420 | of the working class in our country,
01:32:28.260 | that was the party of small businesses,
01:32:32.340 | that was the party of the poor.
01:32:34.660 | In the last election,
01:32:37.460 | President Biden got roughly half
01:32:40.660 | of the country voting for him.
01:32:43.540 | But that half controlled 70% of GDP.
01:32:46.900 | And President Trump got about
01:32:50.260 | half the country voting for him.
01:32:52.340 | And that half represents
01:32:54.100 | about 30% of GDP.
01:32:55.860 | So we've had this inversion
01:32:59.700 | where the Democratic Party
01:33:02.340 | has become the party of wealth,
01:33:04.340 | of elites.
01:33:05.380 | And I would say very insular elites.
01:33:10.900 | And the, and the Republican Party
01:33:13.780 | is now the party of the poor,
01:33:15.220 | the working class.
01:33:16.420 | And, you know,
01:33:18.580 | I've had to watch that;
01:33:19.700 | I've been on the front lines
01:33:21.460 | of watching it.
01:33:23.780 | And, you know, the values
01:33:27.540 | that held the Democratic Party together
01:33:29.620 | are no longer there.
01:33:30.660 | It's held together by
01:33:32.740 | a sense of tribalism,
01:33:36.260 | and a great, great sense
01:33:38.820 | of what I would call
01:33:39.860 | orchestrated fear of Donald Trump.
01:33:42.980 | It's the only value
01:33:44.180 | that really dominates
01:33:45.460 | any discussion.
01:33:46.500 | If I talk about censorship
01:33:49.380 | to a Democrat,
01:33:50.260 | they'll say, yes,
01:33:52.180 | but Donald Trump
01:33:53.780 | is going to become a dictator.
01:33:55.140 | If you talk about children's health,
01:33:59.700 | they'll say, never mind that.
01:33:59.700 | Donald Trump is the only thing
01:34:01.060 | we can worry about.
01:34:02.100 | If you talk about,
01:34:02.980 | you know, about the history
01:34:05.380 | of the Democratic Party's
01:34:06.740 | opposition to war,
01:34:09.060 | they'll say, forget all that.
01:34:10.580 | The only thing we can focus on
01:34:12.420 | is Donald Trump.
01:34:13.700 | And that's a very, very
01:34:17.140 | dismaying,
01:34:18.100 | and I would say,
01:34:18.820 | dangerous form of
01:34:21.380 | orchestrated tribalism.
01:34:23.380 | And one of the other features
01:34:25.460 | of the Democratic Party
01:34:26.500 | is this need to control,
01:34:29.380 | this mistrust of the plebiscite,
01:34:33.300 | this mistrust of the demos.
01:34:34.900 | You know, demos is a
01:34:36.100 | Greek word for people.
01:34:38.740 | And the Democratic Party
01:34:39.780 | doesn't trust the people.
01:34:40.740 | That's why they have to
01:34:41.540 | get rid of elections.
01:34:42.660 | That's why they had to
01:34:43.460 | get me off the ballot.
01:34:44.340 | I did something everybody said,
01:34:48.260 | all the pundits said
01:34:49.460 | could never be done.
01:34:50.340 | So I got on the ballot
01:34:51.940 | in every state.
01:34:52.660 | I got a million people
01:34:53.940 | that signed their signatures
01:34:56.340 | petitioning me on the ballot.
01:34:57.540 | And the Democratic Party's strategy,
01:35:00.500 | rather than to use
01:35:01.540 | the $3 billion it had
01:35:03.220 | to amplify a message
01:35:05.700 | and inspire people
01:35:06.820 | and talk about a vision
01:35:08.340 | and the virtues of its candidates,
01:35:10.340 | was instead to use that money
01:35:12.740 | to try to get me off the ballot,
01:35:14.660 | to get Cornel West off,
01:35:17.300 | to get Jill Stein off,
01:35:19.060 | to use the courts,
01:35:20.820 | to use the enforcement agencies,
01:35:23.620 | including the Secret Service,
01:35:24.900 | the CIA, the FBI,
01:35:26.340 | to try to rig the election.
01:35:29.140 | And it ultimately comes down
01:35:33.460 | to this mistrust of the people,
01:35:35.140 | which we're seeing now
01:35:37.620 | all over.
01:35:38.260 | We're seeing the kind of
01:35:39.380 | two big forces emerge.
01:35:41.060 | One is a populist force,
01:35:42.500 | and the other is a force of control,
01:35:45.700 | of ironclad control.
01:35:46.820 | We saw Europe has already fallen.
01:35:48.660 | You know, you saw the arrest
01:35:51.860 | of Pavel Durov last week,
01:35:54.260 | which was extraordinary.
01:35:55.780 | You know, the arrest of
01:35:57.940 | the guy who founded Telegram,
01:36:01.780 | because he was hosting
01:36:06.740 | political dissent.
01:36:07.780 | And, you know, the European Commission
01:36:11.540 | is already openly censoring content,
01:36:14.740 | so they did not need to arrest him.
01:36:16.900 | They can take off whatever they want.
01:36:19.300 | But they went through the trouble
01:36:23.300 | of actually,
01:36:24.100 | and probably with U.S. encouragement,
01:36:27.460 | you know, catching him
01:36:28.580 | when he happened to land
01:36:29.780 | for a refueling stop in France.
01:36:31.620 | France has this extraordinary tradition
01:36:34.420 | of free speech
01:36:35.540 | that, you know, began
01:36:36.580 | with the French Revolution.
01:36:37.940 | And then again, in the 1880s,
01:36:39.620 | they passed all these incredible laws.
01:36:41.540 | Their commitment to free speech
01:36:44.180 | is as robust as that
01:36:45.540 | in the United States.
01:36:46.740 | And yet now, you know,
01:36:49.060 | and then two weeks before that,
01:36:51.460 | you had this crazy European Commissioner,
01:36:54.820 | Thierry Breton,
01:36:56.500 | ordering Elon Musk
01:37:01.220 | not to interview Donald Trump,
01:37:03.620 | a former President of the United States,
01:37:05.300 | the nominee of one of the two
01:37:09.460 | major political parties,
01:37:10.900 | and the world is not allowed
01:37:12.580 | to hear his point of view.
01:37:13.860 | >> Yeah.
01:37:14.340 | >> It's extraordinary.
01:37:15.780 | That is what's coming to this country.
01:37:17.700 | And you can already see that
01:37:18.500 | the Democratic Party
01:37:19.780 | is that party of control,
01:37:23.140 | and it's the party
01:37:23.940 | of not trusting the people.
01:37:25.380 | >> Bobby, let's talk about something then.
01:37:28.260 | Let's assume that this election
01:37:31.460 | goes in the direction of now
01:37:33.540 | your preferred direction,
01:37:34.660 | which is Donald Trump wins.
01:37:36.420 | What role would you play?
01:37:39.620 | And what is your agenda?
01:37:42.100 | What is it that you want to accomplish?
01:37:43.540 | And explain,
01:37:44.340 | make America healthy again
01:37:46.580 | in that context, maybe.
01:37:47.660 | >> I mean, it's the three issues,
01:37:51.700 | ending censorship,
01:37:52.980 | and that's pretty easy to do.
01:37:54.420 | You can do it with a series
01:37:56.580 | of executive orders,
01:37:57.700 | ending the Ukraine war,
01:37:59.380 | which is complex,
01:38:01.220 | but I think can be done
01:38:02.580 | very, very quickly.
01:38:03.620 | And then the food issue.
01:38:07.380 | Now, it's the food,
01:38:09.300 | it's medicines,
01:38:10.180 | it's corruption
01:38:10.980 | in the regulatory agencies.
01:38:12.500 | >> But would you be a secretary
01:38:14.020 | in that administration?
01:38:15.300 | Would you be a special advisor?
01:38:17.060 | >> There is no deal
01:38:23.380 | in terms of me
01:38:24.420 | getting a particular post.
01:38:25.700 | So there's just an understanding
01:38:28.660 | that there would be
01:38:30.420 | some kind of co-governance.
01:38:31.940 | And the Trump people
01:38:34.420 | have already demonstrated
01:38:36.180 | their good faith
01:38:37.300 | by inviting me to be
01:38:40.660 | on the transition team
01:38:41.780 | as one of the co-chairs.
01:38:44.340 | And they've done something
01:38:45.380 | really wonderful,
01:38:46.260 | which is to bring Tulsi Gabbard in,
01:38:50.420 | who shares a lot of my views
01:38:53.060 | on this issue
01:38:53.700 | as the other co-chair.
01:38:54.980 | And I think that's a signal
01:38:57.700 | that they're sending
01:38:58.740 | that they are sincere
01:39:00.180 | about making a commitment
01:39:04.180 | to these issues.
01:39:04.900 | >> What should parents know
01:39:07.780 | about your thoughts
01:39:09.620 | on how to raise kids
01:39:11.620 | in a healthy way,
01:39:12.500 | make America healthy again?
01:39:13.940 | Are there any vaccines
01:39:15.060 | that you would say
01:39:15.540 | you would advise parents
01:39:17.060 | to take for their kids?
01:39:18.100 | And how should they look at
01:39:19.700 | the industrial food complex?
01:39:21.860 | If you were sitting with us
01:39:24.020 | as parents just having lunch,
01:39:27.700 | what would you tell us
01:39:28.740 | we should do with our own kids?
01:39:29.940 | >> I mean, the big problem is
01:39:31.780 | you can't really trust
01:39:32.740 | the government to tell you the truth.
01:39:34.180 | The agencies are all compromised.
01:39:37.300 | They all have very, very bad conflicts.
01:39:40.340 | Almost all the people, for example,
01:39:41.940 | on the food recommendation committees
01:39:48.580 | at FDA
01:39:49.620 | are people who are
01:39:52.100 | part of the food industry.
01:39:54.100 | And the same is true
01:39:55.220 | on the pharmaceutical side.
01:39:58.660 | The people who are making decisions
01:40:01.300 | about what's good for you
01:40:02.580 | are actually people
01:40:04.900 | who are making huge amounts of profit
01:40:07.140 | on those recommendations.
01:40:08.580 | And so you can't really trust
01:40:09.940 | that the recommendations
01:40:11.140 | are in your best interest.
01:40:14.740 | And what we know is that
01:40:17.540 | there is no more profitable,
01:40:21.460 | there's no bigger profit center
01:40:24.820 | or industry in this country
01:40:26.420 | than a sick child.
01:40:27.460 | And a sick child is a lifetime customer,
01:40:31.780 | a lifetime consumer
01:40:33.460 | of very, very expensive products.
01:40:35.300 | And you have this alliance
01:40:37.860 | between the food industry
01:40:39.380 | and the pharmaceutical industry
01:40:40.740 | to keep our children sick,
01:40:42.980 | get them addicted.
01:40:43.860 | You know, in the '70s and '80s,
01:40:46.100 | the tobacco industry was under attack.
01:40:52.100 | And the two biggest tobacco companies
01:40:55.060 | went out
01:40:55.860 | and bought all the big food companies.
01:40:59.220 | - RJR Nabisco, you're referring to.
01:41:01.380 | - Yeah, and Kraft.
01:41:03.060 | And, you know,
01:41:03.540 | you had Philip Morris buy Kraft.
01:41:06.660 | And they took a lot of the scientists
01:41:09.700 | from the tobacco industry
01:41:11.140 | who were experts
01:41:12.020 | on making products addictive.
01:41:13.460 | And they put them to work
01:41:15.540 | on making food addictive,
01:41:16.820 | on making food ultra-processed,
01:41:18.340 | adding ingredients
01:41:19.700 | that destroy
01:41:23.540 | the satiability of food,
01:41:27.620 | so that food doesn't fill you up,
01:41:29.140 | so you're always craving more.
01:41:30.660 | And those products,
01:41:33.620 | many of them are products,
01:41:35.300 | you know, we have
01:41:35.860 | almost a thousand chemicals in our foods
01:41:38.980 | that are banned in Europe
01:41:40.180 | and banned in other countries.
01:41:41.540 | And those products,
01:41:47.620 | introduced by chemists,
01:41:52.820 | are products that did not exist before,
01:41:56.340 | and the body does not handle them well.
01:41:58.500 | And, you know,
01:41:59.220 | we're seeing this explosion
01:42:00.900 | in chronic disease.
01:42:01.860 | When my uncle was president,
01:42:04.660 | 6% of Americans had chronic disease.
01:42:06.900 | You know what the budget was
01:42:08.580 | for chronic disease
01:42:10.020 | when my uncle was president?
01:42:11.940 | Zero.
01:42:12.740 | There weren't any drugs for it.
01:42:14.180 | There was no expenditures
01:42:17.860 | on chronic disease.
01:42:18.740 | Today, it's $4.3 trillion.
01:42:21.860 | It's five times our military budget.
01:42:23.620 | And the people who are making money
01:42:26.100 | are the pharmaceutical companies,
01:42:29.300 | the insurance companies,
01:42:31.380 | which you would think
01:42:32.980 | would want people to be well,
01:42:33.940 | but they actually make more money
01:42:35.140 | if they're sick,
01:42:36.820 | the hospitals,
01:42:37.460 | the medical cartel.
01:42:42.020 | The people we trust
01:42:44.500 | to give advice to us
01:42:45.780 | about our health
01:42:46.660 | are actually compromised.
01:42:48.580 | And that's the difficult part.
01:42:49.860 | You've got to unravel
01:42:51.540 | that corporate capture.
01:42:52.500 | Freeberg, when you hear this,
01:42:54.100 | some people might say
01:42:55.140 | this sounds like a grand conspiracy theory.
01:42:57.780 | But much of it rings true
01:43:00.500 | to I think many of us who are parents
01:43:02.340 | watching kids, you know,
01:43:04.260 | and watching the prevalence of obesity,
01:43:06.020 | watching pharmaceutical drugs
01:43:08.180 | to counter that
01:43:09.380 | and all the money that's made from it.
01:43:10.660 | And then seeing when people eat clean
01:43:12.020 | and they're healthy,
01:43:13.540 | maybe there's less there.
01:43:15.780 | So Freeberg,
01:43:16.740 | when you hear Bobby's position here
01:43:20.180 | as a scientist
01:43:21.300 | who sold a company to Monsanto,
01:43:23.140 | climate.com
01:43:23.940 | and who's working on food today,
01:43:26.260 | what rings true about what Bobby's saying?
01:43:28.260 | And what do you disagree with,
01:43:29.780 | if anything?
01:43:30.260 | Yeah, there are aspects
01:43:31.620 | of industrialized food
01:43:32.580 | and processed food
01:43:33.380 | that are bad for people.
01:43:34.580 | And I do agree
01:43:36.260 | it should be changed.
01:43:38.900 | I know a lot of people
01:43:42.020 | that work at the USDA,
01:43:43.060 | a lot of people that work
01:43:44.340 | in other government organizations
01:43:45.940 | that don't make a lot of money.
01:43:47.220 | They may or may not have worked
01:43:50.180 | at other companies.
01:43:50.900 | But I think that there's no economic
01:43:52.740 | incentive for them to do harm or wrong.
01:43:55.140 | So I don't think that there's
01:43:57.700 | a constructive design
01:43:59.060 | to do bad things by any individual.
01:44:01.140 | I think that there is
01:44:02.500 | an unfortunate circumstance
01:44:04.340 | where people eat bad stuff,
01:44:05.700 | stuff that tastes better.
01:44:07.780 | They like it more, it sells better.
01:44:10.100 | And the economic incentive
01:44:11.300 | and capitalism is to make more
01:44:12.660 | of that stuff and sell more of it.
01:44:13.940 | And as a result,
01:44:15.140 | the stuff that people like
01:44:16.180 | that isn't good for them,
01:44:17.460 | they buy more of
01:44:18.100 | and the companies make more money.
01:44:19.220 | And so they continue to invest
01:44:21.060 | in selling more and more of that stuff.
01:44:22.420 | And this goes for most processed foods.
01:44:24.020 | It's terrible.
01:44:24.900 | It's not good.
01:44:25.620 | And so I do agree
01:44:26.500 | that much of this processed food industry
01:44:29.460 | is very adverse to health.
01:44:32.340 | But I don't think
01:44:32.820 | that there's a grand design
01:44:34.260 | by individuals that are malicious
01:44:35.700 | in their intent and trying to do it.
01:44:36.980 | I think that there are people
01:44:37.860 | that are doing their job on,
01:44:39.220 | "Hey, this is what the market wants,
01:44:40.500 | let's give them more."
01:44:41.220 | - In the government, right, yeah.
01:44:41.860 | - I'm talking about the government
01:44:42.900 | and private industry.
01:44:43.700 | - You don't think people
01:44:44.980 | in private industry
01:44:45.620 | are trying to make addicting foods?
01:44:46.740 | - No, no, I think that, yeah,
01:44:48.180 | the point is like,
01:44:48.820 | if people buy more of it,
01:44:49.780 | they're like, "Let's sell more of it."
01:44:50.980 | - Yeah, okay.
01:44:51.780 | - And if that were illegal,
01:44:53.540 | if it was illegal to say,
01:44:54.580 | "Hey, this sort of food product
01:44:55.940 | should not be made."
01:44:56.740 | But look, alcohol fits the bill too, right?
01:44:59.540 | And we keep making alcohol
01:45:01.220 | and sugar fits the bill.
01:45:02.180 | The more sugar...
01:45:03.220 | Coca-Cola did a study years ago
01:45:04.820 | where they kept increasing
01:45:07.540 | the amount of sugar in Coca-Cola
01:45:09.300 | until they maximized sell-through.
01:45:13.060 | So some kids liked 60 grams of sugar
01:45:15.460 | in 12 ounces of Coke,
01:45:16.500 | some kids like 30,
01:45:17.940 | but the perfect level was at 42 grams.
01:45:20.660 | And so that study was done
01:45:22.260 | by the scientists that worked at Coca-Cola
01:45:24.180 | and then they said,
01:45:25.140 | "That's the product, it'll sell the most."
01:45:27.460 | And that's the incentive inside of that company.
01:45:29.140 | That's how that company operates.
01:45:30.340 | Now, you could ask yourself the question,
01:45:31.940 | "Is that evil? Is that bad?"
01:45:33.380 | We now know that sugar in general is bad.
01:45:35.700 | The executives at Coca-Cola,
01:45:36.900 | at AB InBev and other places
01:45:38.100 | are trying to make alcohol-free,
01:45:39.700 | sugar-free alternatives.
01:45:40.980 | So there's a lot of push
01:45:42.100 | by these people. Unilever has tried to
01:45:44.180 | make a big push towards good food.
01:45:46.020 | Nestle's tried to do the same.
01:45:47.060 | They've all made these stated commitments
01:45:48.420 | to improve the health of the food
01:45:49.620 | that they produce.
01:45:50.420 | But it is quite difficult
01:45:51.940 | to be successful in doing that
01:45:53.300 | and returning money to shareholders.
01:45:54.900 | The shareholders are like, "Where's the money?"
01:45:56.260 | - Oh, I think this is the key point
01:45:59.220 | you're making, Chamath.
01:46:00.340 | I think about your clean food effort,
01:46:02.340 | and it took me a decade to unravel
01:46:04.420 | my eating everything in sight
01:46:06.340 | and lose the 40 pounds.
01:46:07.540 | But Chamath, when you hear
01:46:09.060 | sort of this back and forth
01:46:11.700 | between Bobby and Friedberg,
01:46:13.220 | what's your take on it in terms of,
01:46:15.940 | and also the European lifestyle
01:46:17.620 | that you live for 10 weeks of the year,
01:46:19.940 | what's your take on what should happen here
01:46:21.620 | and how Bobby can be successful?
01:46:23.380 | - I think what Bobby says rings true
01:46:25.700 | in the way that I live my life
01:46:26.980 | and I just see it demonstrated
01:46:28.420 | on my own body.
01:46:29.300 | You're right, Jason.
01:46:31.060 | You know, my wife's Italian.
01:46:32.420 | She runs an Italian company.
01:46:33.860 | She works Italian and American hours
01:46:38.660 | for 10 months out of the year.
01:46:39.780 | And for those 10 weeks,
01:46:41.540 | we go there and we flip schedule.
01:46:43.140 | But when I'm there,
01:46:44.020 | I'm consuming Italian produce
01:46:46.420 | that isn't packed in plastic.
01:46:47.860 | I go to a local fruit store.
01:46:49.700 | I go to the local fish vendor
01:46:51.300 | and my body changes.
01:46:54.740 | And I know that because
01:46:56.500 | the people that see me when I get back,
01:46:58.260 | they always comment,
01:47:00.180 | "Oh, did you lose weight?"
01:47:01.300 | "Oh, do you look thinner?"
01:47:02.980 | Or this or that.
01:47:04.180 | And what's interesting is
01:47:05.140 | I actually do a body composition
01:47:06.980 | before I leave and after.
01:47:08.180 | I've done this for seven years now.
01:47:09.940 | And I can tell you that
01:47:11.940 | my weight doesn't change that much,
01:47:13.860 | but my body composition
01:47:15.220 | is completely different.
01:47:16.500 | And I don't know what it is
01:47:18.820 | except the things
01:47:19.780 | that I'm putting in my body
01:47:20.900 | that's different.
01:47:21.620 | And so I see it
01:47:22.740 | and I'm running an A/B test every day.
01:47:24.980 | - What's the price of the food, Chamath?
01:47:27.540 | Like, is it more in Italy?
01:47:28.900 | Like, you pay more, do you think?
01:47:30.020 | - Well, I've already commented on this.
01:47:31.460 | - The fishmonger is no joke.
01:47:33.860 | - But there are ways to eat
01:47:36.820 | at a materially lower price
01:47:38.420 | than there is here.
01:47:39.380 | And the access to the ultra-processed food
01:47:42.980 | is different there.
01:47:44.180 | You can't get the stuff.
01:47:45.540 | And when you do find that stuff,
01:47:48.660 | it doesn't have the same, you know,
01:47:51.300 | glycemic and metabolic load on your body.
01:47:53.380 | - I'm curious what you think of, you know,
01:47:55.140 | Ozempic and this category of drugs
01:47:57.380 | breaking the cycle.
01:47:59.620 | I think you've been against them
01:48:01.300 | or they certainly helped me
01:48:04.020 | with half of my weight loss.
01:48:05.060 | I know Sax had a good experience as well
01:48:06.660 | and he's been public about it.
01:48:08.100 | What are your thoughts on that?
01:48:09.140 | Because it does seem,
01:48:10.420 | when people take the GLPs,
01:48:12.180 | which exist in your body,
01:48:13.220 | that they, I'm sure there's more research
01:48:15.860 | that needs to be done,
01:48:16.580 | they do break this habit.
01:48:18.500 | I know anecdotally with me,
01:48:19.780 | I don't crave the foods I craved previously.
01:48:22.340 | And it did kind of rewire my brain
01:48:24.420 | in how I look at food,
01:48:25.940 | even when I'm off of it.
01:48:27.460 | So your thoughts on those
01:48:28.740 | and those potentially being a way
01:48:30.420 | to break the cycle?
01:48:31.220 | - Yeah, and this goes to David's point
01:48:35.060 | that, you know,
01:48:36.500 | we need to have cheap food,
01:48:38.660 | and that that is kind of
01:48:41.620 | an admirable or virtuous outcome.
01:48:44.660 | The problem is that food isn't cheap.
01:48:47.300 | It's cheap on the shelf,
01:48:49.620 | but it imposes costs on the rest of us
01:48:52.020 | that were the externalities
01:48:54.660 | that we're paying elsewhere.
01:48:55.860 | So when I was a kid,
01:48:57.460 | the typical pediatrician
01:49:00.420 | would see one case of juvenile diabetes
01:49:03.380 | in his lifetime.
01:49:04.340 | Over a 40 or 50 year career,
01:49:07.460 | one case; it was an essentially
01:49:09.940 | non-existent disease.
01:49:11.220 | Today, one out of every three children
01:49:14.260 | who walks through his office door
01:49:16.180 | is diabetic or pre-diabetic.
01:49:18.900 | When I was a kid,
01:49:21.460 | the autism rates were between
01:49:23.620 | one in 1,500 to one in 10,000 Americans.
01:49:28.020 | And that is still true in my generation,
01:49:30.340 | as a 70-year-old man.
01:49:31.300 | And in my kids' generation,
01:49:33.620 | according to CDC,
01:49:34.660 | it's one out of every 34 kids.
01:49:37.620 | In some states like California,
01:49:39.060 | it's one out of every 22.
01:49:40.660 | 77% of Americans are now,
01:49:45.460 | or 74% of adults, are obese.
01:49:47.620 | Half our kids have obesity.
01:49:50.180 | And, you know, 100 years ago,
01:49:53.540 | if you were obese,
01:49:55.620 | you could get a job in the circus.
01:49:57.700 | It was so unusual.
01:49:58.740 | So here we are now,
01:50:01.380 | and who's making profit now?
01:50:03.220 | Ozempic.
01:50:04.100 | Ozempic's not going to fix that.
01:50:05.540 | Obesity is absolutely,
01:50:08.260 | and diabetes is absolutely,
01:50:09.940 | treatable by good food.
01:50:11.460 | That's the cause.
01:50:13.860 | Now, Ozempic is a good profit center
01:50:17.300 | for pharma.
01:50:21.460 | There's a bill now,
01:50:23.300 | which has been paid for by the company
01:50:25.300 | that makes it,
01:50:25.860 | which is the biggest company in Europe,
01:50:27.860 | Novo Nordisk.
01:50:28.820 | In Denmark,
01:50:31.540 | where that company is based,
01:50:32.660 | they do not recommend it.
01:50:34.580 | For the treatment of diabetes,
01:50:40.260 | the standard of care
01:50:41.300 | is diet and exercise.
01:50:44.100 | But that company's entire value
01:50:47.860 | is based upon the projections
01:50:49.540 | of what it's going to sell
01:50:50.420 | to the United States.
01:50:54.820 | And that company is pouring
01:50:54.820 | tens of millions of dollars
01:50:56.340 | into lobbying to pass this bill
01:50:58.980 | that will make Medicare pay for it
01:51:00.980 | for every American who's obese.
01:51:04.020 | That could be 74% of people
01:51:05.780 | now eligible.
01:51:06.660 | I think it's $1,500 a week.
01:51:09.460 | The cost of that will be $3 trillion a year.
01:51:12.660 | If you took $3 trillion a year,
01:51:15.300 | a tiny fraction of that,
01:51:16.500 | you could buy organic food
01:51:18.180 | three meals a day
01:51:19.220 | for every human being
01:51:20.660 | in the United States.
01:51:21.700 | So wouldn't that be a better
01:51:23.540 | expenditure of our money?
01:51:26.020 | And, you know,
01:51:27.460 | what I would say is,
01:51:30.660 | with the food producers,
01:51:32.340 | it's not a conspiracy,
01:51:33.620 | it's just people following
01:51:35.860 | perverse incentives.
01:51:36.900 | And there are conspiracies.
01:51:39.060 | I mean, when I sued Monsanto,
01:51:41.780 | we got emails that showed
01:51:43.460 | that the head of the pesticide division,
01:51:45.460 | Jess Rowland, for a decade at EPA
01:51:49.220 | was secretly working for Monsanto
01:51:51.620 | the entire time,
01:51:52.500 | sabotaging studies,
01:51:54.580 | creating false science,
01:51:55.940 | which hid the carcinogenic nature
01:51:59.540 | of Roundup.
01:52:00.340 | So there are those kinds of instances
01:52:02.820 | throughout the federal government,
01:52:04.100 | but mainly it's just perverse incentives.
01:52:06.020 | Almost all, almost close to 100%
01:52:09.380 | of our food agricultural subsidies
01:52:13.940 | go to processed food.
01:52:15.220 | I mean, go to commodity agriculture,
01:52:18.020 | which is the feedstock for processed food.
01:52:20.180 | So, and then if you look at--
01:52:22.500 | - Mostly for meat.
01:52:24.340 | - The industry controls,
01:52:28.020 | through lobbying
01:52:30.100 | and through, you know,
01:52:31.060 | all these other mechanisms
01:52:32.340 | for corporate capture,
01:52:33.540 | the expenditures
01:52:36.100 | in the food stamp program.
01:52:38.180 | So 70% of the food stamp program
01:52:40.180 | goes for processed food.
01:52:41.380 | 10% goes for sugar drinks like Coca-Cola,
01:52:44.900 | which are just diabetes machines.
01:52:47.940 | So why are we poisoning
01:52:49.620 | poor kids in this country?
01:52:50.900 | The school lunch program,
01:52:53.860 | the same thing, almost 80%,
01:52:55.300 | I think 70% to 80% of the food
01:52:56.900 | in the lunch program, is terrible foods
01:53:00.020 | that are actually poisoning our children.
01:53:02.420 | And, you know,
01:53:04.580 | don't we care enough about our kids to say
01:53:07.140 | we need to care about them,
01:53:09.540 | we want to care about them?
01:53:11.380 | We want to make sure that they're not sick.
01:53:13.220 | They are the most precious things in our country.
01:53:16.980 | Shouldn't that be the focus?
01:53:18.580 | And, you know, we should stop whatever we're doing
01:53:20.260 | to make them so sick.
01:53:21.460 | When I was a kid,
01:53:23.700 | 6% of American kids had chronic disease.
01:53:26.820 | Today, 60%.
01:53:28.580 | Is that not an alarm?
01:53:29.780 | Is that not, you know,
01:53:31.460 | something that we should all be concerned about?
01:53:33.300 | - I will agree on an important point.
01:53:37.380 | You know, the food stamp program,
01:53:39.860 | the SNAP program provides food stamps
01:53:42.980 | to support 42 million Americans.
01:53:46.740 | 42 million people rely on food stamps.
01:53:48.500 | It costs $120 billion of federal money per year.
01:53:52.020 | And as Bobby said,
01:53:55.060 | the number one product bought
01:53:56.660 | on the food stamp program is
01:53:58.260 | soda, canned soda.
01:53:59.540 | And there was an important debate a few years ago
01:54:02.660 | about whether or not canned soda,
01:54:05.300 | by the way, Bobby,
01:54:06.580 | you and I probably agree on a lot of things.
01:54:08.020 | There are definitely a lot of things we don't agree on,
01:54:09.700 | but like these aspects,
01:54:11.140 | I think are just no brainers.
01:54:12.820 | There was a debate a few years ago
01:54:14.900 | about whether or not canned soda
01:54:16.420 | should be allowed as a purchase
01:54:17.940 | on the food stamp program,
01:54:18.900 | or whether it should be fresh fruits and vegetables
01:54:21.380 | and grains and other things.
01:54:22.580 | And ultimately there was a food lobbying effort made
01:54:27.060 | that kept canned soda on the food stamp program.
01:54:29.940 | And it is again,
01:54:30.980 | $120 billion of annual federal spend
01:54:33.620 | with the biggest line item going to canned soda
01:54:36.660 | to feed 42 million Americans.
01:54:38.980 | And the connection is completely direct.
01:54:42.020 | High sugar, high glycemic index,
01:54:44.340 | diabetes and other chronic health conditions
01:54:47.060 | arise from that connection.
01:54:48.740 | So I'm definitely aligned with you
01:54:51.300 | on the misincentives and the disincentives
01:54:54.340 | in these programs that have been created.
01:54:55.860 | And they only expand every year.
01:54:57.060 | - Bobby, can you comment on
01:54:59.220 | when people talk about revamping the food supply,
01:55:02.180 | one of the things that sometimes
01:55:05.140 | is not allowed to be said
01:55:06.260 | is that focusing on organic food and produce
01:55:10.260 | can exclude certain communities.
01:55:13.220 | And so there's like this DEI filter
01:55:15.060 | that preferring that is almost racist in some way.
01:55:17.540 | Like, can you just comment on that whole vein of thinking
01:55:20.020 | and your thought on that?
01:55:20.900 | - Yeah, I mean, I think feeding people poisonous food
01:55:25.540 | is racist.
01:55:26.420 | And by the way, the NAACP gets huge amounts of money
01:55:30.340 | every year from the food industry.
01:55:32.900 | It may be one of the biggest,
01:55:34.420 | I think Coca-Cola is the biggest supporter of NAACP.
01:55:39.300 | So a lot of the NGOs that are supposed to be concerned
01:55:43.940 | about the disproportionate impact
01:55:47.300 | on minority communities of federal policies
01:55:49.940 | have actually been bought off
01:55:51.300 | and bought into the process.
01:55:52.820 | And a lot of times those are the voices
01:55:55.300 | that you hear saying this is racist.
01:55:57.060 | What's really racist is poisoning black Americans
01:56:02.180 | because these are communities that are food deserts.
01:56:04.900 | The school lunch program is,
01:56:08.180 | you know, oftentimes in those communities,
01:56:10.180 | the biggest access that they have to food,
01:56:13.060 | and we're giving them poison food.
01:56:14.580 | Many of these communities have no grocery stores.
01:56:18.580 | You know, there are no big stores;
01:56:22.260 | they definitely don't have Whole Foods.
01:56:24.900 | They don't have access to those kinds of foods.
01:56:28.180 | Shouldn't we have national policies
01:56:30.020 | that make sure our people are healthy?
01:56:34.900 | And, you know, policies that use, of course, market dynamics
01:56:39.140 | but also supports.
01:56:40.500 | We're giving billions of dollars
01:56:43.300 | in agricultural subsidies
01:56:47.140 | to addict farmers to growing commodity agriculture,
01:56:51.220 | which is bad food.
01:56:52.740 | It's low in nutrients, it's high in chemicals,
01:56:55.300 | it's high in pesticides.
01:56:56.740 | And, you know, we need to change these perverse incentives
01:57:01.460 | and feed America.
01:57:04.100 | How can anybody argue with this?
01:57:08.500 | How can anybody say that
01:57:09.860 | we should not have healthy children,
01:57:12.020 | that we should be giving people food that is hurting them?
01:57:14.820 | It's just, it doesn't make any sense.
01:57:16.580 | - Sax, any questions from you for Bobby
01:57:20.020 | about democracy, what went down with Biden,
01:57:25.060 | the fairness of the Democratic Party?
01:57:26.980 | I know you've got some strong feelings on it.
01:57:28.420 | So I just wanted to give you your red meat
01:57:29.860 | in your window here.
01:57:30.980 | I'm sure a lot of it's confirming for you what he's saying.
01:57:33.940 | And, you know, I'll be honest.
01:57:36.340 | I think what the Democratic Party did to you, Mr. Kennedy,
01:57:39.540 | was absolutely abhorrent and disgusting.
01:57:41.620 | And it really is infuriating to me,
01:57:45.060 | especially what the mainstream media did.
01:57:47.700 | And I'm glad that we got to have you on early
01:57:50.740 | on our podcast, at least to let some of your ideas
01:57:53.060 | get out there.
01:57:53.700 | But let's give Sax his red meat here
01:57:55.940 | because you joining Trump is a wild card.
01:57:59.540 | I don't think any of us saw it coming.
01:58:01.140 | - Well, let me pick up on those themes, J Cal.
01:58:03.540 | First of all, I want to commend Bobby
01:58:04.820 | on running an upbeat and positive campaign.
01:58:06.900 | You know, you were, and still are,
01:58:10.660 | the most articulate and powerful champion
01:58:13.700 | of free speech over censorship,
01:58:15.860 | civil liberties over the surveillance state,
01:58:18.340 | peace over war.
01:58:20.020 | You've spoken about the issue of chronic health,
01:58:22.660 | which, to be honest, is an issue
01:58:24.100 | I didn't know that much about.
01:58:24.980 | But I think you've put it now on the political radar screen
01:58:27.620 | in a way that it's not going away.
01:58:28.980 | So I think you ran a very noble and effective campaign.
01:58:34.980 | And I think, like you said, it was a campaign
01:58:37.220 | for the soul of the Democratic Party.
01:58:38.580 | You know, I think you represented issues
01:58:42.900 | that in the days of your father
01:58:45.220 | and President John F. Kennedy,
01:58:46.660 | these would have been Democratic Party issues.
01:58:49.620 | How did the Democratic Party respond?
01:58:51.140 | They effectively ran you out.
01:58:53.060 | They did not give you the chance.
01:58:54.180 | They conducted lawfare to keep you off the ballot.
01:58:57.540 | They didn't let you debate.
01:58:58.980 | I heard your running mate say
01:59:00.100 | that they even tried to infiltrate your campaign.
01:59:02.980 | And all the while, this party was claiming
01:59:05.060 | to be the party of democracy.
01:59:06.500 | I find that incredibly hypocritical.
01:59:11.060 | I think that if they had given you the opportunity to debate,
01:59:14.660 | I think we now know what would have happened.
01:59:17.220 | I mean, we saw what happened
01:59:19.140 | when Biden actually debated Trump:
01:59:20.740 | there was a complete implosion of Biden's campaign.
01:59:25.060 | We discovered that, indeed, the Democratic Party
01:59:27.780 | had been hiding his condition for a long time.
01:59:31.860 | And when he was finally forced to debate,
01:59:34.020 | didn't have a teleprompter or script,
01:59:36.660 | it became extremely obvious.
01:59:38.580 | What happened then?
01:59:39.620 | They basically put in a new nominee
01:59:42.580 | who's never been voted on.
01:59:44.900 | Kamala Harris has never received one primary vote.
01:59:47.780 | It was done through a process that was opaque.
01:59:50.420 | We still don't know how it went down.
01:59:52.340 | I have to disagree with Reid
01:59:54.260 | that Biden did it in a voluntary way.
01:59:57.140 | Biden went kicking and screaming.
01:59:59.060 | I mean, he basically said publicly over and over again,
02:00:01.300 | I'm not leaving the race.
02:00:02.420 | I'm in this race.
02:00:03.460 | He tweeted it, he said it.
02:00:04.820 | He said, only God Almighty could get me out of the race.
02:00:07.220 | And then it was reported that Nancy Pelosi went to him
02:00:09.380 | and said, we can do this the easy way or the hard way.
02:00:11.300 | - He said God Almighty.
02:00:13.060 | - She might be God Almighty.
02:00:15.380 | - He did say God Almighty.
02:00:16.660 | - But the point is,
02:00:17.860 | there was nothing Democratic about this.
02:00:19.380 | And now we have a new Democratic Party nominee
02:00:22.420 | who refuses to do press conferences,
02:00:24.980 | refuses to do solo interviews,
02:00:27.540 | refuses to take questions from the press,
02:00:30.260 | who's hiding herself effectively.
02:00:32.020 | And yet this party, again,
02:00:34.260 | claims to be the party of democracy.
02:00:36.420 | I find it just almost maddening or galling again
02:00:40.740 | in its hypocrisy.
02:00:41.780 | And I just, I don't see how everybody
02:00:44.580 | can't see through this.
02:00:45.620 | It's just not the way that democracy is supposed to work.
02:00:49.460 | And I certainly don't think that the people
02:00:52.420 | engaging in these tactics can be cloaking themselves
02:00:55.380 | in all this high-falutin rhetoric of democracy.
02:00:58.420 | It's just absurd.
02:00:59.140 | And so I feel like I'm in the place that you are, Bobby.
02:01:02.420 | You know, as viewers of this pod know,
02:01:04.820 | I did not start off supporting Trump in the primaries.
02:01:08.020 | I supported you in the Democratic primary
02:01:09.620 | and I did fundraisers for DeSantis and Vivek.
02:01:12.580 | And that was in large part
02:01:14.740 | because, I think, of the job that DeSantis did as governor.
02:01:17.220 | But when it came to the general,
02:01:19.140 | the realization that I came to
02:01:21.300 | is that Trump is the indispensable figure
02:01:24.020 | in our current politics
02:01:25.460 | for marshaling this populist energy
02:01:27.620 | to resist this hypocritical elite authoritarianism
02:01:34.580 | that wants to engage in censorship over debate,
02:01:37.220 | that seems to want to protect and defend
02:01:40.420 | this surveillance state over anything it wants to do,
02:01:44.100 | that wants to keep all these wars going,
02:01:45.860 | even when they don't make sense,
02:01:47.700 | when we could have found a way
02:01:48.900 | to negotiate a diplomatic end to them.
02:01:51.060 | And so I'm kind of delighted
02:01:53.860 | that you've kind of come around to this opinion too.
02:01:55.380 | I know you have your reservations about Trump.
02:01:57.220 | I'm not saying that Trump is perfect.
02:01:58.500 | I mean, I think he's a, he's human.
02:02:00.900 | I mean, he's a flawed vessel.
02:02:01.860 | But at the end of the day, he is the choice that represents,
02:02:06.180 | again, these populist forces resisting authoritarianism.
02:02:09.380 | Sorry, this is more of a statement than a question,
02:02:10.820 | but I'll let you react to all of that.
02:02:12.100 | - Well, let me react to the last thing
02:02:15.300 | you said about President Trump.
02:02:16.500 | I think if President Trump wins,
02:02:19.940 | that people are gonna see a very different President Trump
02:02:23.620 | than they did during the first term.
02:02:25.140 | I think he's changed as a person
02:02:27.700 | and I've known him for, you know, 30 years.
02:02:31.140 | I've sued him, I've litigated against him
02:02:33.460 | and had a friendship with him
02:02:35.300 | even when I was litigating against him.
02:02:37.060 | And by the way, successfully against him.
02:02:40.500 | But I think he is, he's focused on his legacy.
02:02:47.140 | He said many interesting things to me
02:02:49.140 | about what he did wrong the last time
02:02:51.140 | and about how he filled his administration. You know,
02:02:54.340 | he had no idea he was gonna win.
02:02:56.100 | He had no idea how to govern
02:02:57.700 | and people descended on him the day that he got elected.
02:03:00.900 | And said, you gotta appoint this guy, appoint this guy.
02:03:03.780 | And he said, you know, I appointed a lot of people
02:03:06.580 | I shouldn't have appointed.
02:03:07.380 | I know who they are now.
02:03:08.500 | He also said something interesting to me.
02:03:11.540 | He said, the Democrats,
02:03:14.420 | one of the big sort of fulcrums of their terror of Trump
02:03:18.180 | is that he's gonna implement this Heritage Foundation,
02:03:22.340 | you know, blueprint, which is called Project 2025.
02:03:26.340 | And he brought this issue up to me and he said,
02:03:32.420 | you know, they're always telling me I'm for Project 2025.
02:03:32.420 | I never read Project 2025
02:03:34.340 | until they started accusing me of it.
02:03:40.580 | He said, it was written by a right-wing asshole.
02:03:40.580 | That's what he said.
02:03:42.660 | He said, there are left-wing assholes
02:03:44.500 | and there was, there are right-wing assholes.
02:03:46.500 | And it was a right-wing asshole who wrote that thing.
02:03:48.820 | And then he started going through it.
02:03:51.460 | So I, you know, I think there's a lot.
02:03:54.740 | And I think he's interested in his legacy now.
02:03:57.780 | He wants to leave behind some accomplishments
02:04:02.500 | and he wants to make our country better.
02:04:04.420 | And I think he's, you know,
02:04:07.380 | he's listening to a wider range of voices.
02:04:10.100 | And as he's preparing to govern right now,
02:04:13.700 | and, you know, I'm gonna be on the transition committee,
02:04:16.820 | picking the people who are gonna govern.
02:04:19.300 | Tulsi's gonna be there.
02:04:21.460 | There's gonna be a wide diversity of stakeholders,
02:04:25.060 | but he's listening to more than just
02:04:26.980 | that kind of narrow right-wing band
02:04:28.740 | that people are terrified of.
02:04:29.620 | - It would be great if you could get to him
02:04:31.300 | because, you know, he really did present well
02:04:33.860 | on this podcast and had like a very good moment
02:04:36.100 | in that first half of the RNC.
02:04:37.540 | And then he started defaulting back to,
02:04:40.180 | and I know a lot of people, like myself,
02:04:42.260 | hate this about him.
02:04:43.700 | And a big part of why we don't like him
02:04:45.620 | is he goes back to insult comedy,
02:04:47.540 | goes back to race, goes back to gender.
02:04:49.380 | You know, and it's just like, dude,
02:04:51.700 | that Trump 1.0 is what people don't want.
02:04:54.580 | They don't want chaotic Trump.
02:04:56.180 | They want, you know, post-assassination attempt Trump.
02:05:00.180 | And it's just so infuriating.
02:05:02.180 | - Well, a lot of people feel that way, J Cal.
02:05:04.980 | I think at the end of the day,
02:05:06.260 | you and others are gonna have to decide,
02:05:08.020 | do you wanna support the candidate
02:05:10.260 | who has the right policies,
02:05:11.300 | but maybe there's style points
02:05:13.220 | that you don't like about him?
02:05:15.220 | 'Cause I think that the things you're talking about,
02:05:17.220 | the mean tweets and so forth,
02:05:18.820 | at the end of the day,
02:05:19.780 | I think they're stylistic things.
02:05:22.180 | I don't think they go deep to policy
02:05:24.340 | or how he would govern.
02:05:25.620 | Or do you wanna support a campaign
02:05:28.580 | that is running on vibes and joy,
02:05:31.380 | you know, that has the superficiality that you like,
02:05:34.980 | but there's nothing underneath it?
02:05:36.420 | And when we do learn something underneath it,
02:05:38.820 | when we actually learn a policy,
02:05:40.260 | then all the people who are supporting her
02:05:43.780 | have to say, oh, well, she's not really gonna do that.
02:05:45.620 | She's not gonna do that.
02:05:47.220 | So the best thing you can say about her campaign
02:05:49.380 | is that she's not gonna be able to accomplish
02:05:51.700 | the things that she says she wants to accomplish.
02:05:54.420 | - Good news is-- - There's no basis
02:05:55.620 | on which to vote for a president.
02:05:57.220 | - Good news is I'm in Texas,
02:05:58.740 | so I could put in my Bobby Kennedy vote
02:06:01.060 | as a protest vote and it doesn't make a difference.
02:06:02.980 | - Can I respond to the other part
02:06:05.940 | of David's question? - Sure.
02:06:07.380 | - And, you know, I think to me,
02:06:11.460 | the most troubling thing about what's happening now,
02:06:13.460 | you've had two Democratic candidates
02:06:15.540 | who have not been able to give unscripted interviews,
02:06:19.380 | which is extraordinary.
02:06:20.500 | I mean, my father and uncle were so proud of,
02:06:24.740 | you know, the United States for our capacity
02:06:26.660 | to engage in debate, to defend who we were in the world,
02:06:29.620 | to defend a vision of our country,
02:06:31.300 | to articulate it to the rest of the world,
02:06:32.980 | to be the leaders of the free world
02:06:34.740 | and have a command of the facts and of knowledge
02:06:39.300 | and to be eloquent.
02:06:40.260 | And how can you be a leader in the world?
02:06:44.660 | What does the rest of the world think of us right now?
02:06:47.540 | I mean, what could they possibly think?
02:06:49.380 | We have two Democratic Party candidates
02:06:51.700 | who are not able to explain themselves in an interview.
02:06:56.740 | - It boggles the mind. - Bill O'Neill said
02:06:59.060 | the other day something really, I think, poignant,
02:07:03.060 | which is if you want the job of handling the nuclear code,
02:07:07.780 | you gotta do an interview first.
02:07:09.700 | And, you know, how can you go 30, 39 days
02:07:13.780 | without talking to the press,
02:07:16.660 | without being able to defend your record,
02:07:19.460 | to explain who you are to the American people?
02:07:21.940 | And if you talk to Democrats about this
02:07:24.260 | and you can get past the anger and past the vitriol,
02:07:28.020 | past this kind of wall of tribal resistance
02:07:32.180 | to any new knowledge coming in or any contrary facts,
02:07:37.940 | what they'll say is,
02:07:39.620 | "Well, we're not really voting for Kamala,
02:07:43.060 | we're voting for the apparatus."
02:07:45.860 | And you ask the next question,
02:07:47.140 | has that apparatus served you?
02:07:50.740 | You know, has the open border served you?
02:07:53.140 | Has the $35 trillion debt served you?
02:07:55.700 | Have all the endless wars served you?
02:07:58.660 | Has the destruction of the American middle class,
02:08:00.980 | the highest inflation rate in a generation,
02:08:04.340 | has any of that actually, you know,
02:08:07.700 | has that apparatus produced something
02:08:10.020 | for the United States that you're so proud of
02:08:12.660 | that you wanna blindly vote
02:08:14.020 | without knowing who you're voting for?
02:08:16.420 | Anyway, that's--
02:08:17.380 | - Well, I mean, Kamala and Walz
02:08:20.820 | will do an interview with Dana Bash tonight,
02:08:22.500 | the night we're taping this on Thursday.
02:08:24.820 | So we'll see, maybe she'll miraculously do 10 podcasts
02:08:27.940 | and she'll be dynamic,
02:08:30.100 | but it certainly doesn't look good
02:08:32.020 | that they filibustered with Biden
02:08:33.700 | and gave him only most favored nation interviews.
02:08:36.260 | I do respect the fact
02:08:37.300 | that we've had so many great candidates come on this pod
02:08:39.940 | and have 90 minute, two hour discussions.
02:08:42.260 | I'm very proud of the work we've done here.
02:08:44.100 | And Bobby, you are a key piece of that.
02:08:46.020 | And we really appreciate you coming on early
02:08:47.700 | and having these debates
02:08:48.580 | and coming here today to talk about it.
02:08:50.180 | Just means the world for you to come back
02:08:52.420 | and talk about this.
02:08:52.980 | Wish you great success with Make America Healthy Again.
02:08:55.300 | I think it's incredibly noble,
02:08:56.820 | independent of how I feel about Trump,
02:08:58.180 | January 6th, abortion, any of those issues.
02:09:00.820 | I respect the fact that you wanna make America healthy again
02:09:03.940 | and I wish you great continued success with that.
02:09:08.340 | And we will see you all next time
02:09:10.020 | on the All-In Podcast.
02:09:10.900 | Bye-bye.
02:09:11.480 | ♪ I'm going all in ♪
02:09:13.000 | ♪ We'll let your winners ride ♪
02:09:15.800 | ♪ Rain Man, David Sachs ♪
02:09:17.960 | ♪ I'm going all in ♪
02:09:20.120 | ♪ And instead we open source it to the fans ♪
02:09:22.360 | ♪ And they've just gone crazy with it ♪
02:09:24.200 | ♪ Love you, West Coast ♪
02:09:25.000 | ♪ Queen of Quinoa ♪
02:09:26.360 | ♪ I'm going all in ♪
02:09:27.560 | ♪ Let your winners ride ♪
02:09:29.160 | ♪ Let your winners ride ♪
02:09:31.320 | ♪ I'm going all in ♪
02:09:33.000 | ♪ Besties are gone ♪
02:09:35.880 | ♪ That is my dog taking a notice in your driveway, Sachs ♪
02:09:38.360 | ♪ Win it all ♪
02:09:40.680 | ♪ Oh, man ♪
02:09:41.560 | ♪ My avatar will meet me at Blitz ♪
02:09:43.400 | ♪ We should all just get a room ♪
02:09:44.520 | ♪ And just have one big huge orgy ♪
02:09:46.120 | ♪ 'Cause they're all just useless ♪
02:09:47.320 | ♪ It's like this, like, sexual tension ♪
02:09:48.760 | ♪ That they just need to release somehow ♪
02:09:50.200 | ♪ Let the beat ♪
02:09:53.320 | ♪ Let your beat ♪
02:09:54.360 | ♪ Let your beat ♪
02:09:55.160 | ♪ Beat, that's gonna be good ♪
02:09:56.760 | ♪ We need to get merch ♪
02:09:57.560 | ♪ Besties are gone ♪
02:09:58.200 | ♪ I'm going all in ♪
02:10:00.200 | ♪ I'm going all in ♪
02:10:09.000 | And now, the plugs!
02:10:11.320 | The All In Summit is taking place in Los Angeles,
02:10:14.360 | September 8th, 9th, and 10th.
02:10:16.360 | You can apply for tickets, summit.allinpodcast.co.
02:10:21.320 | And you can subscribe to this show on YouTube.
02:10:23.720 | Yes, watch all the videos.
02:10:24.840 | Our YouTube channel has passed 500,000 subscribers.
02:10:28.200 | Shout out to Donnie from Queens.
02:10:29.960 | Follow the show, x.com/theallinpod.
02:10:33.160 | TikTok, theallinpod.
02:10:34.760 | Instagram, theallinpod.
02:10:36.360 | LinkedIn, search for All In Podcast.
02:10:38.200 | And to follow Chamath, he's x.com/chamath.
02:10:41.320 | Sign up for his weekly email,
02:10:43.480 | What I Read This Week, at chamath.substack.com.
02:10:46.520 | And sign up for a developer account at console.groq.com
02:10:50.440 | and see what all the excitement is about.
02:10:52.600 | Follow Sax at x.com/DavidSacks.
02:10:55.160 | And sign up for Glue at glue.ai.
02:10:57.320 | Follow Friedberg, x.com/friedberg.
02:11:00.040 | And Ohalo is hiring.
02:11:01.560 | Click on the careers page at ohalogenetics.com.
02:11:05.080 | I am the world's greatest moderator, Jason Calacanis.
02:11:07.560 | If you are a founder and you want to come
02:11:10.120 | to my accelerators and my programs,
02:11:12.200 | founder.university, launch.co/apply
02:11:14.600 | to apply for funding from your boy J Cal for your startup.
02:11:17.320 | And check out athenawow.com.
02:11:19.560 | This is the company I am most excited about at the moment.
02:11:22.440 | athenawow.com to get a virtual assistant
02:11:25.160 | for about $3,000 a month.
02:11:27.080 | I have two of them.
02:11:28.120 | Thanks for tuning in to the world's number one podcast.
02:11:30.360 | You can help by telling just two friends.
02:11:32.600 | That's all I ask.
02:11:33.400 | Forward this podcast to two friends and say,
02:11:36.840 | this is the world's greatest podcast.
02:11:38.440 | You got to check it out.
02:11:39.080 | It'll make you smarter and make you laugh.
02:11:40.520 | Laugh while learning.
02:11:41.640 | We'll see you all next time.