In conversation with Reid Hoffman & Robert F. Kennedy Jr.
Chapters
0:00 Bestie intros: Buttons are back for fall
1:48 Reid Hoffman joins the show, reminiscing on PayPal stories with Sacks
7:52 State of AI: Nvidia, cluster buildouts, competition
19:51 OpenAI's corporate structure and thoughts on Elon's lawsuit
29:09 Inflection AI's deal structure with Microsoft, Lina Khan's impact on the tech industry
41:27 Reid's perspective on Kamala being hot-swapped for Biden, funding groups that attempted to keep RFK Jr. off ballots
52:02 Reid's thoughts on growing antisemitism
55:03 Thoughts on Kamala's economic proposals: price caps, wealth tax, etc.
64:19 How Silicon Valley views both candidates, why Reid funded legal action against Trump
79:03 Robert F. Kennedy Jr. joins the show and recaps his campaign and decision to back Trump
91:13 Falling out with the Democratic Party
97:26 Potential role in the Trump Administration, Make America Healthy Again agenda
118:01 Sacks recaps RFK Jr.'s campaign, RFK Jr. on Trump's legacy
Welcome back to the All-In Podcast, the number one business, technology, and political podcast in the world. I am your host, J-Cal, Jason Calacanis. And with us today, three of my besties. You've got David Friedberg cackling over there. He is your Sultan of Science, previously known as the Queen of Quinoa, but he sold the quinoa business, made a killing in quinoa. Also with us, back from Italy, back from Italy, Chamath Palihapitiya. He's at 67% buttoned, and he's not happy about it. But the hair looks great. You've still got a little sea salt from the yacht.

I think I'm going to try to keep my hair long. Let's see what happens.

Did you bring any of the sea salt back with you from the Mediterranean? Put it in a little bottle to spray, or no? No, but I do... Oh, okay, great. And have you showered in the last week? Or is it still, you've got the Mediterranean glow every day?

I have showered since I've gotten back.

See, that's the problem. You don't have the sea to use as a natural, you know, disinfectant and deodorant. Exfoliant. Look how many buttons he's got going. I know, it's just tragic. I'm beginning... I feel uncomfortable for your neck. I mean, it's, like, creeping all the way up. Your neck looks like a prisoner.
I walked here, but I had it totally unbuttoned, and I thought, this is completely inappropriate for Menlo Park in August. So I buttoned two buttons.

Back in business mode. He's in business casual mode. He went from casual to business. Okay, and with us, of course, the Dark Knight himself, yeah, the Rain Man, David Sacks. And we have a bestie guest. Before you, folks, a friend of my other pod, This Week in Startups: Reid Hoffman is here. And you know him as a venture capitalist, a board member at Microsoft. And you were the co-founder, or the founder, of LinkedIn. I don't know if you had a co-founder. Co-founder. Co-founder of LinkedIn, now owned by Microsoft. He's got his own podcast, Masters of Scale. And he and David Sacks worked together at PayPal. Reid, welcome to the program, and give us a little story. What is your fondest memory, or the most quirky memory, of David Sacks and all those weirdos... I'm sorry, I'm not supposed to use the word "weird" anymore, I'll get banned on X. All of those unique personalities at PayPal. Tell us about that moment in time. And do you remember the first time you met David Sacks?

Yeah, I met David because Peter knew him from Stanford and hired him in. And, you know, David very quickly, because he, you know, has a strong learning curve as he plays these things, kind of got the instinct of what the game we were playing with PayPal was. And it's part of the reason why, I think, each of the execs had, you know, kind of key contributions to making PayPal successful. And David's was this kind of, like, maniacal focus on the cycle of how the product worked on eBay. And, like, there was just a whole bunch of stuff I learned from him. It's part of how I track, you know, people I respect: what do I learn from them? And that was one of the things that I would say I learned from David.
David, tell us your first memory of meeting Reid Hoffman. Do you remember where you were? Do you remember the conversation?

Yeah, I think we met through Peter, you know, and Reid... I think Reid was on the board of Confinity back... No, I mean, let's see. This would have been... when I first joined PayPal, '98, '99? I guess, something like that. Yeah, '99. So whatever that was. In any event, I mean, I'll just return the compliment. You know, PayPal had all these existential issues, where you had these larger entities trying to kill us: Visa, MasterCard, eBay, who... Yeah. And Reid was kind of our emissary who kept all these dogs at bay and managed to, I guess, be friends with them to some degree, even though they wanted to kill us. And Reid was kind of in charge of making sure that these existential issues didn't blow up on us. And they didn't. So we...
It's the Will Rogers line: politics is the art of saying "nice doggy" while you hunt for a stick. Tell us, like, a moment, Reid, that is incredibly memorable to you from that PayPal era. You know, some existential moment, or one or more difficult or funny moments, late-night moments, that would be indicative of that era and whatever was in the water that...
Well, part of it is that... I mean, among the things I was learning from Peter was that Peter and Max recruited with just a tremendous focus on, like, intense learning curves. So, you know, it's one of the things where Peter later is like, okay, I guess you have to interview for having been on sports teams and so forth, because this teamwork thing does matter. But, like, high performers. And that was part of the reason why there was such intense, you know, kind of innovation and capability. You know, probably the most stunning memory I have of PayPal is... we're all young, we're all first-time, we're kind of doing a startup that matters, you know, kind of making this stuff happen. And we do this merger with X.com. And, you know, pre the merger closing, Elon is saying, oh, I've got the CEO, Bill Harris, he's the best ever, that's part of the reason why you give so much percentage of the company to X.com in the merger, you know, dah, dah, dah. And then, after the merger, literally the first meeting I had with Elon is: Bill Harris is a complete disaster, we need to fire him right away. Before we get to the first board meeting, we need him fired. And I'm like, uh, Elon, you need to talk to...
Well, I mean, he is decisive. That's for sure. Yes. All right. Well, let's get into... we're gonna go a little bit mullet here, Friedberg: we're going to start with business, and then maybe...

We had a fun meeting about that topic at a place in Palo Alto that no longer exists, called Antonio's Nut House. Yes, exactly. Yes, the legendary Antonio's. Done. And when Bill eventually did meet his demise...

He got whacked at the Nut House. We were at the pool tables in the...

Well, he wasn't whacked... he was whacked at a board meeting, not at the Nut House. But certain plans were formulated...

Antonio's Nut House is, yeah, the most unhygienic bar in the Bay Area. And that's a pretty low benchmark.
Let's just leave it at that. We will start with some business here, talk a little AI, and then, since two of our panelists have a passion, we'll do the politics, the political parties, at the end. Everybody knows that Reid was a co-founder of Inflection AI, is a general partner at Greylock, and was one of the founding investors also in OpenAI. There's a good story there, I'm sure. And we just got results, Reid, from Nvidia. Results were good. They beat across the board. Stock was down after hours; analysts said probably profit-taking. Putting that aside, we've never seen a chart like this in the history of business, I would say. Data center revenue: 26.3 billion, 87% of their revenue. Now, you remember, Nvidia started obviously with, you know, video games, and didn't have a major data center business. That has exploded. Net income: 16.6 billion. Gross margin: 75%. And here's your chart. On a total basis, Nvidia's revenue scale-up is basically unlike anything we've seen. But if you look at their quarter-over-quarter revenue over the past couple of years, things are starting to cool off significantly after that giant boom. Reid, what's your take on Nvidia's just incredible run here? Is it sustainable? Will they have competitors? And do you think this build-out, this massive build that we're seeing from startups to sovereigns, you know, to Microsoft, which you're on the board of, Google, Apple, etc., is this sustainable? And is this...
Well, I get asked that question, unsurprisingly, by many public market investors over the year. Yeah. And I basically told them, hey, look, it's sustainable for two years, which...

Yes, exactly. So that's infinity, right, in terms of time.

You know, Nvidia has a very sharp, you know, kind of lead on the importance of the chips for the training clusters. You know, they're effective on inference. But I do think that as you kind of scale the demand, there'll be a lot of inference chips coming in. You know, I think, Chamath, you're invested in one of those. Oh, yeah. And I think there's going to be a bunch of those kind of coming in, and the bulk of the demand will be on the inference side. And then Nvidia will have this challenge of: do I try to keep my prices and my margin, or do I do... this is why we like competition... do I have to respond to the competitive market? And that, I think, will start playing out probably in a year, two at the latest, and then kind of go. So I think it's not sustainable. The pure heat is not sustainable. But, you know, Nvidia has got a very strong position. And, you know, I definitely, I would...
Yeah. So, yeah, there's growth left, competition is coming, and this is probably not the type of stock you would want to short at this moment in time. Friedberg, what are your thoughts on this build-out, as well as the software build-out that's occurring? And when do you think we're going to see some...
I don't know if there's competition in the build-out. I think we talked about this in the past. And I don't know if you guys saw these quotes this week, or recently, on... we don't think about this build-out in terms of ROI. Gavin Baker, in a conversation on Invest Like the Best... is that the name of the podcast? Yeah... referenced some conversations he's been having with the leaders of these companies regarding why the build-out is so important: because ultimately, if you create this, quote, "digital god," the, you know, return is how many trillions. So it doesn't matter how many tens of billions you're spending each quarter right now; you have to get there, you have to make sure you don't miss the boat. I guess, Reid, a question for you. You're on the board of Microsoft still, right?

Yes, indeed.

Has Microsoft, or Satya, publicly talked about how they rationalize the investing principles associated with building out AI infrastructure in the cloud? Is it ROI-based, like, hey, in the next two years we're gonna make this much additional incremental profit? Or is it, we've got to get this thing working? Right, right. To be more precise: is the investment driven by ROI, or does everyone just say, this is so strategic, we just have to win it, and we'll throw all the...
Well, so, one: a board member speaking for Microsoft is, you know, forbidden. So I'm speaking for me.

Right. Generalize it to all cloud computing platform companies. Yeah, what's your sense on how they're thinking?

Just one of the principles in the Microsoft thing is the company speaks for itself; board members don't speak for it. But, you know, I think Satya is, like, the best public-market CEO of our generation. I think he is stunning in kind of blending a combination of strategic insight with also kind of being, you know, kind of return-on-capital, you know, sensible risk-taking, etc. And so the actual thing, between your guys' questions... because I can comment on how Satya thinks about this stuff... is, he's both thinking about, like, it's a platform change, and you have to be there for the platform change, for productivity, for cloud, etc. And, okay, let's rationalize the capital to: when are we expecting revenue? How do we get revenue sooner, to have that as a good, productive cycle? How do we, you know, not just spend like drunken sailors, which is easy to do, right, but be targeting, you know, kind of business outcomes? And it's part of the reason why, you know, they're like... like, you know, he's very focused on: what are we doing with Office? What are we doing with cloud? What are we doing with, you know... as opposed to, like... you rarely hear him talking about AGI, or never digital gods, because it's kind of the question of: I am focused on this in a business sense. And I think that's kind of the way he's doing it. But there is obviously a, you know, kind of a... it's hard to predict the future when it's novel and unknown, in platforms. And it's part of the reason why you have all the hyperscalers now, you know, kind of fully engaged and intelligently engaged. Because you say, well, even if it's just the new platform by which, you know, kind of software... everything with a computer unit in it, whether it's a phone or a speaker or a computer or anything else, anything with a kind of a CPU or a GPU, gets more intelligent... like, you can't miss out on that platform. And so that's, I think, the thing that's motivating everybody. But obviously, you know, how to do that smart is one of the things that, you know, everybody is, I'd say, obsessing about every week.
What do you think about the open source movement versus closed source? You were one of the original donators to OpenAI; you were originally on the board. And there's a couple of ways to go with this question, but I just want to start with... forget about the corporate structure over there, we'll get to that in a second. But I want to talk specifically about open source. Meta... Meta, obviously far behind OpenAI, far behind Google, Microsoft, so they went open source. When you're behind, you go open source, I guess, is the idea here. But they're making some big progress. Who do you think is going to win this ultimately: an open source provider of LLMs, or proprietary, closed source, like OpenAI is? And it's confounding to say OpenAI is closed, but... closed AI.
Yeah. Um, look, from the very founding, OpenAI was never claiming it was going to be open source. It was claiming it was going to be, one, safety, and open access, and not differential or controlling access. And I think they've stayed true to that principle, which is, I think, what the genesis of the word "open" is there. And, look, I think the key thing is, there are going to be winners all over the place. I think there are going to be winners on the open source side. And, you know, I don't know if Llama is going to win from its open source thing, as much as it's just saying, hey, we're training these models, so we're gonna, you know, put them out there, because our closed system, closed loop, you know, doesn't require selling for tokens and so forth. But there's also, you know, Mistral and other folks who are doing competent models. And then, you know, there'll be wins in different ways. So, like, for example, you know, I think there's going to be a bunch of different startups that are going to win, whether it's coding agents or, you know, kind of very specific applications within medical or other kinds of things. And I think they will, you know, generate big companies. And I think large companies, like, you know, the hyperscalers, are going to succeed as well. Now, in the pure model competition, the question is: when do we start seeing an asymptote to scale? And my guess is... and, you know, the GPT landmarks, each is kind of an order of magnitude... my guess is the soonest will be GPT-6. And it may even be after that. And that's part of the bet that OpenAI and Anthropic and the hyperscalers are all making: that return to scale. And then that has a lot of downstream effects. Because even if you say we can train smaller models to do effective things, part of what's going to be really instrumental for training those smaller models is the larger models. So, like, even if there's a bunch of smaller models that are specifically capturing other kinds of market opportunities... which is part of what I've been doing, investing in AI since, you know, 2014, 2015... there's going to be a set of those things that are a whole bunch of startup opportunities. So I think the A-versus-B is a good dramatic framing, but it's really about which specific opportunities, because there are going to be wins and opportunities across them.
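The point that "part of what's going to be really instrumental for training those smaller models is the larger models" is the idea usually called knowledge distillation: a small model learns from the big model's softened output distribution rather than from hard labels alone. A minimal sketch of the core loss in plain Python; the logit values and temperature below are invented for illustration, not any lab's actual recipe:

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to a probability distribution, optionally softened."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence from the teacher's softened distribution to the student's.

    A higher temperature exposes the teacher's relative probabilities on
    wrong answers, which is the extra signal the student learns from.
    """
    p = softmax(teacher_logits, temperature)   # teacher soft targets
    q = softmax(student_logits, temperature)   # student predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Invented logits: a student that disagrees with the teacher pays a
# higher loss than one that roughly matches it.
teacher = [4.0, 1.0, 0.5]
good_student = [3.9, 1.1, 0.4]
bad_student = [0.5, 4.0, 1.0]
assert distillation_loss(good_student, teacher) < distillation_loss(bad_student, teacher)
assert distillation_loss(teacher, teacher) < 1e-9  # identical logits: zero loss
```

In practice this term is blended with an ordinary cross-entropy loss on real labels, but the sketch shows why the larger model is instrumental: it supplies the targets the smaller model trains against.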
Do you think... sorry, just real quick... do you think there's one LLM, or one foundational model, Reid, that effectively does everything, like a meta-model that starts to take most of the market? Or do different versions of smaller models, or small agents that kind of network together, end up being the best solution for specific applications and verticals? Like, how does this evolve over time? Everyone's got this concept that there's a god model that does everything and wins, and whoever gets the god model wins everything. But the reality of software, and principles of biology, would indicate that you'll see, like, smaller networked things that are better at doing things than any one big thing. And so I hear your point of view...
I think the mistake that people make is they think precisely that: like, the one model to rule them all. It's like Sauron's ring. And actually, in fact, already today... like, for example, one of the things that happens with all the model providers, at Microsoft and OpenAI, which I've seen, is they sometimes sub in, like, GPT-3.5 as opposed to 4, to see what the answers are, because there's a cost of compute. Even as you learn to bring the cost of compute for the larger models down, the larger models are always going to be a lot more expensive. And, by the way, they're going to be more expensive probably loosely on the order of magnitude, right? So it's like, well, it's 10x larger, it's 10x more expensive.

Totally. And so when you're trying to say, hey, I'm trying to make a business out of language translation, right? If I just want to do language translation, I don't need a massive model; I just need a model that's really good at language translation.

Exactly. And so what I think you're going to see is networks of models, and, like, kind of traffic control and escalation and all the rest. And agents are not going to be one model; they're going to be blends of models. And that's one of the reasons why you say, well, there's actually, in fact, a lot of room for startups. Because it's not like we say, well, we take GPT-7 and we just serve it for everything. It's like, well, it's going to be super expensive, and there's a whole bunch of things about, like, serving it more cheaply. And, like, for example, one of the really great technical papers that I love from Microsoft is, you know, "Textbooks Are All You Need." It's like, you can train very specific models on kind of, like, high-quality data, along with, by the way, the larger model helping train it, and all of a sudden you have a functional smaller model. And, you know, it will be a blend of these things. So I think the multi-model approach is, I think, going to be, you know, quickly...
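The "traffic control and escalation" pattern described here can be sketched in a few lines: try a cheap specialist first, and escalate to the expensive general model only when confidence is low. A toy illustration, not any provider's real router; the model stubs, confidence scores, and relative costs (echoing the 10x-larger, 10x-more-expensive rule of thumb) are all invented:

```python
# Toy model router: try a cheap model first, escalate to a larger one
# only when the cheap model's confidence is low. Model names, costs,
# and confidence heuristics are invented for illustration.

def small_model(prompt):
    # Stand-in for a cheap specialist: confident only on short prompts.
    confidence = 0.9 if len(prompt.split()) < 10 else 0.3
    return {"answer": f"small: {prompt}", "confidence": confidence, "cost": 1}

def large_model(prompt):
    # Stand-in for the expensive general model, at roughly 10x the cost.
    return {"answer": f"large: {prompt}", "confidence": 0.95, "cost": 10}

def route(prompt, threshold=0.8):
    """Return the cheapest answer whose confidence clears the threshold."""
    result = small_model(prompt)
    if result["confidence"] >= threshold:
        return result
    escalated = large_model(prompt)
    escalated["cost"] += result["cost"]  # we still paid for the failed attempt
    return escalated

assert route("translate hello")["cost"] == 1          # easy query stays cheap
assert route(" ".join(["word"] * 50))["cost"] == 11   # hard query escalates
```

Real routers score confidence with a classifier or the model's own logprobs rather than prompt length, but the economics are the same: most traffic never touches the big model.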
What is your take on IP in this new era? We see OpenAI... and you're not on the OpenAI board anymore, right? You're not. So you're independent of that, even though you made a big donation...

A donation and an investment. I led the first commercial series for...

Special. So you're an investor in it, and you donated to it. Actually, let's start there. What's up with that corporate structure? How do we make sense of that? Something's a nonprofit, you donated to it, and then you invested in it, and everybody's making money and selling in secondary at 100...
So, it's... the 501(c)(3) is the governing thing; that's what it started as. And, you know, when, you know, kind of Elon and Sam were starting this, they said, look, we need, you know, philanthropic support, and we're trying to make sure that there's, like, open access to AI, which is going to be an instrumental technology, and we've got some great technologists who want to come do this. We started as a, you know, kind of a 501(c)(3) for doing it. That persists, as far as I know, till today. Then, you know, one of Sam Altman's, you know, pieces of genius was that he kind of said, look, we're going to need scale capital, and I'm trying to go out to raise... the commercial round was 600 million... I'm trying to raise 600 million in philanthropy, and it's not working, right? So I have this idea, which is: the 501(c)(3), which is doing this kind of research mission of, you know, AGI for humanity, is also producing commercial benefits, and we can create, initially, an LP, which has kind of a revenue right on the commercial things, that investors can invest in. And, you know, Reid, it'd be really helpful if you led this. Because, you know... and I was like, oh, but you don't have a go-to-market plan, you don't have a product plan, you know, a business plan. Yeah, but, you know, like, we need to show that we're actually serious about the business. I said, all right, fine, I will lead it from my foundation. You know, because even though none of these things we like to see as investors were there, I was like, look, okay, I'll lead the investment, I'll manage it as an investment, but I'll do it as an investment from my foundation, in order to do this. And, you know, kind of that was as we were beginning to get into... you know, we hadn't seen anything; they were still doing Dota and robot hands and much of this. So it's like, we're betting on the scale thesis of generating something magical. And so we hadn't seen GPT-3 yet. And of course, once that started coming, then it was like, well, we need a bunch more capital, let's do a strategic, you know, connection, let's talk to all the hyperscalers, and let's work out a deal by which one of them invests in us. And then, you know, the Microsoft-OpenAI deal came together, with, you know, converting the LP into a subsidiary of the nonprofit, you know, kind of saying, look, there's all kinds of benefits that both OpenAI and Microsoft can get from a business deal. And so that's what's led to, you know, the structure, you know, that I was...
What did you think of Elon's first lawsuit? And then he dropped it, and then he refiled. Where do you...

Well, you know, I'm not very charitable about those lawsuits. I would like to be, because, you know, Elon's one of the entrepreneurial heroes of our time and generation. But I think, you know, frankly, probably the most charitable thing to say is sour grapes. Because, you know, for example, I know Sam offered him as much of the investment round as he wanted, right? Like, he could have done the whole thing, he could have led it. You know, it was kind of like, hey, look, we still love you. And he was like, no, it's not a company that I control, it's gonna fail, so I'm not interested in investing. I was like, okay, right. And so now you're getting these lawsuits that are like, you know, I was misled. And it was like: you were offered everything, at every opportunity, other than converting OpenAI into a company that you completely owned. And so, you know, I think it's without basis, without merit.
But why do you think he would have dropped it and then refiled? Where do you think that comes from? Was it... is there... buried. I mean, Elon put in the first, what, $44 million, and he...

Yeah. I, by the way, put in 10 million at the same time, and I...

But do you think that he kind of got screwed, because he doesn't have any shares? I mean, at the time he put in the 44 million, it was never going to be a for-profit. Now it's a for-profit. A lot of people are profiting, you know, assuming the paper valuation ends up being realized. So he doesn't own anything. I mean, if you were a seed investor and put up 44 million in something, and then everyone's making money and you don't have any shares... forget about the legal technicalities... wouldn't you...
Well, look, I can understand the emotion of that. But, like, it's not like Elon's short of money, right? And so, look, I'd like to have shares too; like, I did invest in the other thing. I didn't get any shares for the 10 million that I put in. And, by the way, it's not just legal technicalities. It's actually really important that you're not doing private enrichment off philanthropic donations. And so, you know, it's... But isn't...

No. From a viewpoint of... they're held separate, right? And so, you know, the 501(c)(3) continues to, you know, kind of control the mission and destiny and so forth. And so the question about its mission is still guiding things. And you're essentially investing in that mission, and you recruited people, you know, on that mission. And so, you know, I think that the, you know... I think, like I said...
Okay, so let's get into some political stuff. First, I want 00:26:00.780 |
to get the IP question, then I want to talk about Lina Khan. So 00:26:04.900 |
how do you think about IP in this, you know, briefly in this 00:26:07.740 |
new world, OpenAI and the New York Times can't come to terms, 00:26:11.460 |
the New York Times caught them red-handed, hand in the cookie jar, according to 00:26:14.780 |
their lawsuit, having indexed a ton of their content, it's 00:26:17.980 |
pretty crystal clear that their contents behind a paywall. And 00:26:20.700 |
that's how they make money. I also subscribe to ChatGPT, I 00:26:23.900 |
give them 20 bucks a month, maybe 30 bucks a month for every 00:26:26.460 |
employee in my firm. And I get New York Times content from 00:26:30.500 |
there all the time. I will ask it what the Wirecutter thinks 00:26:33.540 |
is the best choice in ChatGPT, I get that. And then I get the 00:26:36.700 |
answer. And I don't need my New York Times subscription, I don't 00:26:39.300 |
visit the New York Times anymore. Feels pretty clear-cut 00:26:43.220 |
to me. But how do you think about IP? Should an LLM be able 00:26:48.140 |
to ingest whatever they please? Or should they be required to 00:26:51.300 |
get permission in advance and pay a royalty to content 00:26:55.740 |
Well, as a content creator to look, I think, I think that it 00:27:03.460 |
tends to be a little bit of a, we do want content creators to 00:27:08.780 |
benefit economically from the work. It's part of the reason 00:27:10.940 |
why we have copyright, it's part of the reason why we have 00:27:12.860 |
paywalls, you know, other kinds of things that I think are, are 00:27:16.220 |
very important. And I think it's, it's a complicated thing 00:27:18.380 |
that needs to be sorted out. Now, that being said, I think we 00:27:21.060 |
also want to say that we can train these models, like, you 00:27:24.740 |
know, like training is like reading, and like reading things 00:27:28.140 |
is, you know, like when something's available to be read, 00:27:31.220 |
and you've engaged in the right economic thing for reading it, I 00:27:33.620 |
think that's a kind of a reasonable fair use thing. Now, 00:27:36.580 |
maybe we update the terms of service, maybe we update, you 00:27:39.220 |
know, copyright law or other things to say, well, okay, that 00:27:41.140 |
now changes, you know, I think we don't want to forbid changes 00:27:44.740 |
in the future. You know, this is one of the problems we get with 00:27:48.780 |
it, it blocks innovation when we do that, and, 00:27:51.620 |
you know, kind of Hollywood blocks innovation, in music it 00:27:54.780 |
blocks innovation. So you want to allow some new chain, you 00:27:59.020 |
know, changing landscape. And I think this is a changing 00:28:01.340 |
landscape that arguably is reading. So I think that, that 00:28:04.380 |
both of those things are true in terms of what do we, what do we 00:28:07.540 |
want to sort through? I think that one of the reasons why this 00:28:10.980 |
is kind of like, you know, over, like, when I give advice to, you 00:28:14.820 |
know, various news organizations, and so I say, 00:28:16.980 |
look, don't try to hold out for money on the training side of 00:28:20.180 |
things. Because, you know, we're going to create synthetic data, 00:28:23.500 |
we're going to do all kinds of other things that are going to 00:28:25.220 |
mean that no one's particular data is really going to matter. 00:28:28.980 |
What you should focus on is freshness, on brand, on other 00:28:32.220 |
things, and we should work out ongoing sustaining economic 00:28:35.580 |
arrangements like that, that would be my two cents, you know, 00:28:38.780 |
suggestion for it. And I do think we want to design an 00:28:41.140 |
ecosystem that includes that. And, you know, when I was 00:28:43.860 |
involved in those conversations at OpenAI, they, they agreed 00:28:47.180 |
with that, Microsoft certainly agrees with that in terms of, 00:28:49.820 |
you know, how do we make sure that economics are fairly 00:28:52.380 |
apportioned, and so forth, for, you know, what we're doing for, 00:28:55.980 |
you know, this phase of, you know, and ongoing, but like, you 00:28:59.620 |
know, there's a current tech, new technological wave that's 00:29:02.380 |
coming, and how do you do that? So, you know, that's a messy 00:29:05.100 |
answer. But unfortunately, it's a messy subject. 00:29:06.860 |
It's a pretty messy subject. Before we move to politics, I 00:29:09.980 |
just wanted to actually ask you about inflection. So is it still 00:29:14.260 |
running? Yeah. And so what basically happened, there was 00:29:18.260 |
like some transfer payment from Microsoft and a couple of the 00:29:21.620 |
people like, and then it seems like whatever that deal was, a 00:29:25.180 |
little bit seems to have been copied by Google when they did 00:29:27.820 |
this character AI thing. So just trying to get a sense from you, 00:29:30.700 |
are these deals to structurally avoid FTC scrutiny in terms of 00:29:34.660 |
the building blocks of it? Or how, how did you think about it? 00:29:38.140 |
And what is the, what is the pattern and the trend on these 00:29:41.700 |
Well, the thing I think that's happened was, you know, very 00:29:45.860 |
early days, you had things like, you know, we're doing an agent, 00:29:48.620 |
and if Pi had launched before ChatGPT, it'd probably be in a 00:29:52.620 |
different circumstance. But like ChatGPT got the, oh, my God, 00:29:57.580 |
Pi is the inflection agent. Yeah. And so, so by the time 00:30:01.020 |
that Pi, we got the trend right, and the, and the interest of the 00:30:05.780 |
market, right, but we got the timing, you know, too late 00:30:09.100 |
happens with startups. So it's like, okay, we need to pivot, we 00:30:11.620 |
need to pivot from a B2C model to a B2B. And we have a unique 00:30:15.300 |
model, but let's sell that to other people who already have 00:30:17.340 |
audiences, because we're not going to be able to easily grow 00:30:19.580 |
our audience. And then, you know, once we had that as a 00:30:21.900 |
conversation, there were employees like, well, we want to 00:30:23.980 |
do the direct agent thing. And that's what we want to do. And 00:30:26.180 |
we will go somewhere in order to do that. And we're like, okay, 00:30:28.860 |
how do we fund this? And how do we make that work? And how do 00:30:30.940 |
we make it work for investors? And we said, hey, there's a deal 00:30:33.380 |
structure that could work, which is, you know, with a, you know, 00:30:36.860 |
kind of outside party, you can get paid enough in a non 00:30:40.180 |
exclusive IP license and an ability to selectively hire 00:30:43.660 |
folks. And then you can dividend some of that out to investors. 00:30:47.140 |
So investors, you know, get back a, you know, kind of a one 00:30:49.780 |
X, and then kind of an ongoing position. So, you know, as 00:30:54.740 |
investors, it's great to have a kind of optionality on its B2B 00:30:57.500 |
business in order to play that out. And this is a structure 00:31:01.060 |
that works for everybody in this pivot to B2B. And that's 00:31:06.860 |
I see. Great pivot, Chamath, into Lina Khan. I think one of the 00:31:12.180 |
things that is quite paradoxical about your relationship with 00:31:16.780 |
David Sachs, as you both agree on something in politics, which 00:31:19.660 |
is Lina Khan, and her concepts around future competition, and 00:31:25.100 |
maybe how she's running this issue for the United States is 00:31:29.420 |
leading to basically a freeze on the market, we're seeing weird 00:31:33.780 |
deal structures, like some of the ones we're talking about 00:31:36.500 |
here that could have just been acquisitions. And I'm curious 00:31:41.300 |
your thoughts on what she you know, this sort of breakup of 00:31:45.420 |
Google. Now we're seeing that emerge at the same time that 00:31:50.300 |
they're facing the biggest existential crisis of their 00:31:52.740 |
career, which is language models competing with them. And then I 00:31:56.660 |
would say half of my Google searches have already moved to, 00:32:00.380 |
you know, you know, ChatGPT-like services. So what's your 00:32:04.340 |
take on Lina Khan's approach to M&A? And what impact, if it's 00:32:09.020 |
continued and sustained? Will that have on capital allocation? 00:32:12.940 |
Because I don't know what happened to the single and 00:32:15.140 |
double M&A market, but it seems to be completely gone. 00:32:19.340 |
Everything from Adobe and Figma, to other mergers that could be 00:32:24.220 |
happening are essentially frozen. So what's your take? 00:32:26.900 |
So it was funny, because I kind of made an off the cuff, you 00:32:31.700 |
know, kind of remark about Lina Khan, which turned into a whole news cycle. 00:32:34.740 |
I saw you on CNN, where they were like, Are you telling 00:32:40.780 |
I'm like, no, because I don't believe in that kind of 00:32:43.180 |
corruption of, of politics. The only way she's gonna learn about 00:32:46.580 |
it is she asked me or she watched this television show. 00:32:49.100 |
And, and so, so she's done a good job on the price curtail. 00:32:54.820 |
She did a good job on the non-competes, both of which I 00:32:56.580 |
think are very good for, you know, competitive markets. The 00:32:59.660 |
problem is, I think she has a misunderstanding of these large 00:33:02.980 |
tech companies. And, for example, on the M&A thing, you 00:33:07.620 |
know, her theory is, you got to prevent the aggregation of 00:33:10.220 |
power. So you got to, you got to fight every acquisition of 00:33:13.260 |
note. And the problem, of course, is that actually quells 00:33:16.660 |
venture capital investment, because it's like, okay, part of 00:33:19.980 |
the returns is, if I'm going to invest in something that might 00:33:24.220 |
be competing with, you know, one or more of the large tech 00:33:27.940 |
companies, I need to have acquisition exits as part of 00:33:32.300 |
being able to fund enough capital to really make that 00:33:35.380 |
acquisition, you know, that, that, that, that investment 00:33:37.700 |
possible, because if it doesn't work, I want to be able to at 00:33:39.620 |
least return, recover my capital by an acquisition. So the right 00:35:43.100 |
way to look at it is, is there competition amongst the top tech 00:33:47.060 |
companies? Because, you know, if one of them is like squashing 00:33:49.980 |
all the other ones, that's a problem. If we're, if we're five 00:33:53.500 |
large tech companies heading to three, then I'm much more 00:33:55.660 |
sympathetic to her point of view. But we're actually five 00:33:58.300 |
heading to 10, right, or five to seven heading to 12. Because 00:34:01.860 |
like NVIDIA is now in the mix, and others, I think, are coming 00:34:05.180 |
Tesla is, you know, now over 500 billion. Yeah. 00:34:07.660 |
Yes. And so you have this ability. So the thing is, is 00:34:10.900 |
they're competing on the acquisitions, just like they're 00:34:13.620 |
competing in the market, in the marketplace. And if you're 00:34:16.020 |
trying to quell the whole thing, because your theory is, like, 00:34:20.060 |
like, they should just, you know, the startup should just be 00:34:22.180 |
able to grow up to compete. That actually means that those will 00:34:24.980 |
never get the capital that they need in order to do that, which 00:34:27.700 |
means you're actually having the opposite of your intent, right? 00:34:31.620 |
What you're doing is you're actually making there to be less 00:34:34.340 |
competition. Because, you know, capitalists can't say if I'm 00:34:37.740 |
going to put 100 million, 500 million, a billion dollars into 00:34:41.220 |
this company, I at least have a chance of getting my capital 00:34:43.940 |
back, or I can possibly create a competitor. And that's, that's 00:34:47.580 |
the reason I was speaking out against it as an expert. 00:34:49.980 |
It was like, interesting. I saw you, I think it was Jake 00:34:54.540 |
Taper, Tapper, who kind of grilled you, and I thought you 00:34:58.140 |
did an exceptional job of just saying, Listen, I made a 00:35:00.180 |
donation. This is how I feel about it. But obviously, she's 00:35:02.700 |
going to do what she wants to do. And that's just how politics 00:35:06.620 |
works. So I thought that was actually pretty well done. And I 00:35:09.260 |
actually appreciate you fighting for more M&A, because it'd be 00:35:12.300 |
great for the industry. It's actually you want to throw up a 00:35:17.940 |
Just, just to stick on Lina Khan for a second. So I agree 00:35:22.100 |
that her approach has been overly broad and has had a 00:35:26.020 |
chilling effect on M&A. And so Jake, like you said, we've lost 00:35:30.180 |
those base-hit acquisitions that I think are important to the 00:35:33.940 |
venture capital market that help new startups get funded. I mean, 00:35:37.020 |
if the returns on risk capital go down, there's going to be 00:35:40.540 |
Yeah, so I agree with Reid on that. The area where I'm not sure 00:35:46.300 |
we agree is, and where I do agree with Lina Khan, is I do 00:35:49.740 |
think the big tech companies have too much power. I do think 00:35:52.860 |
that they are monopolies or have monopolies. And I do think they 00:35:56.740 |
need to be controlled. I just think that, you know, I 00:36:00.100 |
wouldn't prevent them from doing any M&A whatsoever. I'm curious 00:36:03.060 |
if Reid, like, agrees with that, that the big tech companies have 00:36:06.180 |
too much power, or agrees with Lina Khan on that. And I guess 00:36:10.260 |
specifically, do you think any big tech companies should be 00:36:12.260 |
broken up? If so, which ones? I mean, I, I would actually 00:36:15.620 |
entertain that idea of, of, of deconglomerating or breaking up 00:36:19.940 |
some of these big tech companies. Do you do you think 00:36:21.940 |
do you agree that big tech is too much power or not? 00:36:26.620 |
I think it's TBD. But the reason I'd say let's take the opposite 00:36:32.100 |
point of view would say no. The thesis for no, is that they are 00:36:38.020 |
very strong American companies that get, you know, in the most 00:36:42.180 |
cases, over half the revenue from overseas. They create 00:36:46.900 |
technology platforms that beneficially differentiate the 00:36:50.900 |
US versus, you know, many other countries and kind of global 00:36:56.420 |
circumstances, like the internet and other kinds of things. I 00:36:59.700 |
think that they are competing ferociously with each other. I 00:37:02.100 |
mean, you know, Jason just mentioned that, you know, it's 00:37:06.260 |
kind of like, look, we've already got like ChatGPT 00:37:08.740 |
competing with Google search, and other kinds of things. And I 00:37:11.140 |
think it's competitive pressure, right? This is, I think what 00:37:13.860 |
capitalism is about is competitive pressure, that 00:37:16.500 |
essentially creates the thing. And that's the reason why, like, 00:37:18.740 |
if it's, if we were shrinking, like it was Google, Uber, Alice, 00:37:23.300 |
or actually, frankly, think that the, you know, everyone likes to 00:37:26.100 |
talk about, you know, Google, you like, I think that the prime 00:37:29.540 |
candidate is likely to be and I'm speaking as an individual, 00:37:32.340 |
and as a venture capitalist here is Apple with the App Store. 00:37:35.460 |
Right. Okay, so wait, so that brings up an interesting point. 00:37:38.500 |
One of the things we've talked about in this pod is that we 00:37:40.660 |
shouldn't shut down M&A, but the FTC should limit 00:37:45.460 |
anti-competitive tactics by these big tech companies. 00:37:48.900 |
Apple, really good example, because they drive everything 00:37:53.300 |
through the App Store, you're not allowed to do side loading, 00:37:55.780 |
they want to take what is a 30% piece of any sales, you're not 00:38:01.060 |
even allowed to have a link inside an application to drive 00:38:03.940 |
users to your website. Yeah, well, now you can. So would 00:38:08.900 |
you at least want to crack down on those anti-competitive 00:38:11.860 |
tactics? Yeah, no, for sure. And look, especially when, you 00:38:15.620 |
know, we all know it's nonsense. It's like, look, you could just 00:38:18.340 |
give the consumers the option to, to, to allow side loading, 00:38:22.340 |
you could just say, it's technically very simple to do. 00:38:25.220 |
And you can say, look, we, we don't want you to side load, 00:38:29.460 |
because we view it to be safety and security. But we're giving 00:38:33.380 |
you the option. Right? Fine. Give people the option. Right? 00:38:38.580 |
Reid, were you surprised that then the first target where 00:38:41.940 |
there was like some successful antitrust pushback was against 00:38:45.060 |
Google versus Apple? And then second, do you think that 00:38:48.260 |
there's a chance like a meaningful chance that the 00:38:52.100 |
government tries to break Google up? Or do you think it 00:38:54.980 |
looks something maybe more similar to what happened to 00:38:57.300 |
Microsoft? So, I think on mandating breakups, you know, 00:39:03.620 |
like, look, I think we should operate through 00:39:07.780 |
competitive networks and competitive ecosystems. I think 00:39:09.860 |
it's part of what's smart about capitalism. And I think 00:39:12.180 |
mandating breakups is only when essentially capitalism is 00:39:15.140 |
failing on specific things, you want to do the least, the least 00:39:19.700 |
you can, to get back to competitive networks, in terms 00:39:23.540 |
of how you're operating. And so, you know, you say, hey, look, 00:39:26.820 |
iOS has this kind of monopoly, and you say, there's no side 00:39:29.380 |
loading, you have to use App Store, you have to use the 00:39:30.900 |
payment mechanism, you know, etc, etc. It's like, well, that 00:39:33.620 |
quells a ton of startup innovation. We all know this as 00:39:36.900 |
investors, because we look at anyone who's prospectively doing 00:39:40.020 |
a business like this and say, no chance, it's you know, you're 00:39:43.220 |
not going to succeed. And so so then you say, well, what's the 00:39:46.500 |
least thing that we can do? Right? And you know, a classic 00:39:50.180 |
and you're like, well, let's break off the App Store from 00:39:52.740 |
Apple as well. Right? unclear that that would really fully 00:39:56.260 |
work. You know, that's like socialism mandating how the 00:39:59.460 |
thing should work. Let's try to get it so that we allow 00:40:01.700 |
competition to determine these things. And like, for example, 00:40:05.620 |
saying, hey, like, you have to allow consumers the option of 00:40:09.060 |
side loading, you have to allow consumers the option of 00:40:11.140 |
installing an alternative App Store, right? Like that kind of 00:40:14.740 |
stuff. I think, you know, what's the minimal set? I think 00:40:16.900 |
that's the kind of intervention we want to have. Because I think 00:40:21.700 |
But why do you think that the case against Apple has made 00:40:27.460 |
I think it's kind of it's less politically easy, right? Like, 00:40:32.100 |
it's kind of like, everybody loves our iOS phone. And, you 00:40:35.700 |
know, there's, there's, there's less of a blue and red, you 00:40:39.940 |
know, kind of combo tackle, where, you know, the, the blue 00:40:44.420 |
team people are like, big companies, less offensive, 00:40:48.340 |
basically. Yes. Yes. More stylish. They're, they're 00:40:51.380 |
prettier. I kind of like your approach, though, with the App 00:40:54.820 |
Store, if you were to think of least harm to the ecosystem, 00:40:58.980 |
Epic Games has their own App Store for games, they charge 00:41:02.260 |
88%. They give, I'm sorry, they give developers 88%. They 00:41:06.020 |
only take 12. And forcing Apple to allow, you know, a startup to 00:41:10.980 |
do an App Store would solve the entire problem. And it seems 00:41:14.740 |
like that's where it's going to go. And all five of us would 00:41:16.980 |
invest instantly in an App Store that would say 0% take rate, 00:41:22.100 |
and all advertising based. What a great idea that would be. 00:41:25.380 |
Yeah, I have a question. A few weeks ago, you said something to 00:41:30.900 |
the effect very publicly that you had had a one hour or 00:41:33.540 |
multi hour lunch with Biden. And he just seemed like super on 00:41:37.620 |
his game. And then he was kind of dumped. Was that just a 00:41:43.220 |
moment in time where he was really great with you? Or how do 00:41:47.940 |
you reconcile that with Pelosi and all of these other folks? 00:41:51.300 |
And what happened to Biden? Well, like most of us, I was 00:41:56.980 |
pretty dismayed by the debate performance. Because when I 00:42:01.940 |
talked to him, like detailed, thoughtful analysis with no 00:42:05.300 |
notes on Gaza, questions about AI, you know, and what kinds of 00:42:09.540 |
things, you know, what did I think about what the progress, 00:42:12.180 |
you know, the thing they were doing with the voluntary 00:42:14.340 |
commitments, the executive order, and, you know, what kinds 00:42:16.740 |
of things should happen in the future, and all that kind of 00:42:18.980 |
stuff. Being on his game, a little slower, right, than, you 00:42:22.420 |
know, a 50-year-old would be, but, you know, like, cogent and 00:42:26.660 |
totally with it. And then you kind of looked at the debate. 00:42:31.140 |
Oh, my gosh, this is, this is, this is a disaster. And so it 00:42:37.940 |
was like, look, is the debate a one off thing? Is it? Did you 00:42:40.420 |
were you ill, you know, like trying to reconcile the two and, 00:42:44.020 |
you know, spend a little bit of time trying to figure that out 00:42:47.380 |
to to what was going on, because it was it was the first time I'd 00:42:50.820 |
seen something like that. And, you know, you know, I don't, you 00:42:56.100 |
know, I'm not enough of a DC insider to know exactly what the 00:43:00.260 |
set of conclusions were other than, I, you know, I applaud, you 00:43:03.540 |
know, Biden for having the kind of integrity to go look, I'm 00:43:06.340 |
maybe I'm ill, maybe I'm old, maybe I'm slower, but, you know, 00:43:09.140 |
it's about the country more than it's about me, because I'm, 00:43:12.660 |
you know, not, you know, it's, it's, it's important to be about 00:43:16.100 |
the country, not about yourself. I'll, I'll step aside. And 00:43:18.900 |
ultimately, his decision, there's nothing that anyone can 00:43:21.460 |
force. Pelosi couldn't force it, nor anyone else. It's ultimately his 00:43:24.020 |
decision. He came to that decision. So do you want that? 00:43:27.700 |
Do you think that they should have run an open primary after 00:43:30.660 |
that? And would Kamala have won an open primary? 00:43:34.580 |
Well, it's hard to know. I mean, I think they were definitely 00:43:37.940 |
leaning towards an open primary, and then all the people who 00:43:40.180 |
would be the most natural contenders all endorsed Kamala. 00:43:42.820 |
So and by the way, you say, well, kind of democratic process 00:43:46.660 |
was like, well, there was a democratic process that picked 00:43:48.580 |
the Biden-Harris ticket, which turned into the Harris-Walz 00:43:50.660 |
ticket. And so that's not anti democratic. But I think if you 00:43:54.580 |
look at the sequence of events, it was kind of like, well, you 00:43:57.540 |
know, we're, we're going to sort out, you know, what we're 00:44:00.020 |
going to do. And then, you know, all of the key folks, you know, 00:44:03.780 |
Shapiro, and, and Whitmer, and everyone else all endorsed 00:44:07.140 |
Kamala, it was like, okay, let's just, let's get back to, you 00:44:10.180 |
know, kind of the, the choice of two candidates. And so I, you 00:44:13.220 |
know, do you feel the voters felt? Do you think the voters 00:44:17.220 |
felt left out? Um, the democratic voters? Well, I mean, 00:44:24.900 |
ex post facto, it seems not, right? With the level of kind of 00:44:28.020 |
energy and all the rest, it seems that, um, 00:44:33.620 |
you know, like, with the pure polling and kind 00:44:38.420 |
of level of energy and kind of what's going on, they're happy 00:44:41.060 |
with what they got. Yeah, they're happy with what they 00:44:42.580 |
got. I would have liked to have seen that speed run. Do you think it 00:44:44.900 |
sets a bad precedent that there were these back room 00:44:49.700 |
conversations, obviously, the staffs of Whitmer or Shapiro, 00:45:56.100 |
their office speaks with Democratic Party leadership 00:44:58.500 |
speak with big donors. And there was effectively a 00:45:01.140 |
coalescing that took place over a period of time that said, we 00:45:04.580 |
should all stand behind and endorse one person instead of 00:45:07.220 |
infighting and creating a split in the party. And does that not 00:45:11.380 |
set a bad precedent that there is a small group of people in 00:45:14.740 |
either party that in a primary process effectively get to 00:45:18.660 |
nominate their candidate, get their candidate to become the 00:45:21.620 |
nominee. And therefore, there's only two people for the country 00:45:24.500 |
to choose from. And as we have seen recently with RFK Jr. And 00:45:27.940 |
the lawsuits against him in being on the ballot in different 00:45:31.220 |
states, it makes it very difficult, maybe for the people 00:45:34.180 |
to have their choice. And is that a bad way for democracy to 00:45:37.940 |
work? And I just love your philosophical view on this. I'm 00:45:40.580 |
like, what's the best way for democracy in the United States 00:45:42.820 |
to work? So for the president, for the president, yes, we do 00:45:47.860 |
live in a republic, right? And there is various, like, you 00:45:51.780 |
know, some people have much more influence than others, 00:45:54.020 |
whether it's media platforms, whether it's, you know, 00:45:58.340 |
economics and ability to spend whether it's, you know, history 00:46:01.940 |
and a brand and, and, and other things. And so, you know, this 00:46:06.100 |
melee and, and, and kind of, you know, whole integration set 00:46:10.180 |
of things. Now, ultimately, you know, voters are going to 00:46:13.060 |
decide in November, right? So, you know, people do have a, you 00:46:16.500 |
know, and I think that staying true to our democratic 00:46:19.780 |
process is what's really key, like, you know, people going to 00:46:23.460 |
the polls, you know, I think we should want to live in a 00:46:25.860 |
country where everyone does, you know, everyone who is who is 00:46:28.820 |
legally allowed to vote does vote. And I think that that's, 00:46:32.740 |
you know, ultimately a good thing. Now, you know, are there 00:46:35.220 |
things that I would like to change? Sure. I'd like to change. 00:46:38.180 |
I'd like to have rank choice voting. You know, I'd like to 00:46:40.660 |
have open primaries. There's a set of things like, like, 00:46:43.540 |
actually, my principal frustration and all this stuff 00:46:46.100 |
is, you know, what's, what's one of the fundamental things that 00:46:48.740 |
the two parties agree on, that shouldn't be, is 00:46:52.900 |
that there should be only two parties, right? And I think 00:46:56.020 |
that's, I think that's something you you need to fix. And you 00:46:59.380 |
can't fix it. Unfortunately, I think with independent 00:47:01.700 |
candidates, because because the whole system is really set up 00:47:04.580 |
for, you know, kind of two parties and independent 00:47:07.620 |
candidates are almost always spoilers one way or the other. 00:47:10.740 |
I mean, like on the RFK stuff, I understand it was a bunch of 00:47:13.300 |
Democrats who were trying to, you know, prevent him from 00:47:16.740 |
getting on the ballot. I actually prefer him on the 00:47:18.260 |
ballot because I actually think his anti-vax stance, 00:48:22.180 |
well, you know, really fit very well with Trump. And so I 00:48:25.860 |
think he was more. Trying to address that, Reid, because I 00:48:28.500 |
think there was a rumor that you were funding some of these 00:47:31.860 |
lawsuits to keep him off the ballot or whatever. Like, have 00:47:34.340 |
you spent any money to try to impact RFK one way or the other? 00:47:38.020 |
I wouldn't be surprised if we look at all the money that goes 00:47:42.500 |
to all the different organizations of organization x 00:47:44.820 |
kind of had some kind of ballot thing. My, my voice, my 00:47:49.700 |
instruction was always like, No, no, no, don't do that. That's 00:47:52.660 |
anti democratic. But you know, you can't control everything, 00:47:55.860 |
just like you invest in a company and CEO, sometimes the 00:48:00.980 |
Because you give money, you give money to folks that then 00:48:04.180 |
execute their own strategy. So you can't control on the ground 00:48:09.140 |
So there's that happens that you're like, No, don't do that. 00:48:12.180 |
Well, okay, that's a good, that's a good segue. Let's talk 00:48:16.660 |
about the five cases against Trump. There are five lawsuits. 00:48:20.420 |
No, hold on, Jake. Can we just stay on this topic for a second? 00:48:23.540 |
I think this is important. Okay, so in in Michigan and 00:48:26.820 |
Wisconsin, you had Democratic groups, they fought RFK Jr.'s 00:48:32.340 |
bid to get on the ballot. Okay, they failed. Now he wants to get 00:48:36.180 |
off the ballot, but they won't take him off. Now that they 00:48:39.140 |
think that his presence hurts Trump. And at the same time, 00:48:43.300 |
Michigan's trying to remove Cornel West, and Wisconsin 00:48:46.900 |
trying to remove Jill Stein. So I'm curious, do you think 00:48:49.700 |
there's any principle on display here besides naked partisan 00:48:53.940 |
hackery? I mean, basically, the Democrats fought having third 00:48:59.300 |
parties on the ballot when they thought it would hurt Biden. 00:49:02.820 |
And now they want to keep them on the ballot when they think 00:49:06.420 |
it's going to hurt Trump, except for those third party 00:49:09.300 |
candidates who they still think will hurt Harris. So what is 00:49:12.900 |
there any principle here? Or is this just partisan hackery? 00:49:15.860 |
I think it's, you know, I frankly, you know, think that 00:49:19.780 |
everyone who follows the legal process to get on the ballot 00:49:22.260 |
should be on the ballot. And, you know, we should follow the 00:49:24.420 |
legal process. I'm very much of a legal process kind of person. 00:49:28.580 |
What I'm opposed to is like, you know, calling Raffensperger 00:49:31.060 |
and asking for 11,000 votes, right, which is not legal. 00:49:35.060 |
Right. So, so like, yeah, sure. That is that is that bad? And 00:49:39.140 |
do I advocate against that? The answer is absolutely yes. But 00:49:44.660 |
But if the Secretary of State of Colorado throws Trump off the 00:49:47.860 |
ballot, for example, is that legal process if it's then 00:49:50.980 |
overruled by the Supreme Court? Or can we just say, 00:49:53.300 |
substantively, that states shouldn't be removing candidates 00:49:58.340 |
Well, but you want them to remove RFK from the ballot? 00:50:06.980 |
No, the rule is that, well, first of all, I don't think that 00:50:10.740 |
Democratic groups should be suing RFK to keep him off the 00:50:14.340 |
ballot. And that's what he said is that Democratic groups were 00:50:17.380 |
suing to keep him off the ballot. And they were trying to 00:50:19.380 |
exhaust his resources. So he couldn't mount an effective 00:50:22.180 |
campaign. And some of those groups you funded, right? So 00:50:25.460 |
maybe you don't know what they were doing. But in any event, I 00:50:28.340 |
consider that to be anti democratic. RFK is now trying 00:50:30.980 |
to remove his name from the ballot. I think as a candidate, 00:50:33.060 |
you're allowed to do that. And those same groups that once 00:50:36.660 |
fought to keep him off the ballot are trying to keep his 00:50:39.700 |
name on the ballot. Because now they perceive, yeah, because 00:50:42.660 |
now they perceive the political calculation to be a little 00:50:45.060 |
different. So I don't see any of this as being democratic. This 00:50:51.460 |
Yes, fundamentally, from a viewpoint of like, for example, 00:50:57.860 |
my direct actions, and yes, there was, you fund a whole 00:51:00.980 |
bunch of different groups, and you have different groups doing 00:51:03.380 |
different things, but you funded them to do this thing that you 00:51:05.460 |
were thinking of, and things happen, just like companies. You 00:51:08.980 |
know, my thing was actually, in fact, making people aware of, of 00:51:14.100 |
RFK's anti-vax, you know, statements and anti-science stuff, 00:51:18.100 |
because I thought that would be relevant in the polls in 00:51:20.820 |
November, that that was the actual strategy that that I 00:51:24.340 |
believe, and I think that would differentially, you know, hit 00:51:28.260 |
Trump more. And so therefore would be a spoiler, right, as 00:51:32.180 |
these are, I have no problem with drawing attention to 00:51:35.220 |
issues. But I do think fundamentally, it's anti 00:51:37.460 |
democratic to sue third party candidates to the point where 00:51:40.660 |
they can't be on the ballot. Okay, let me ask you directly, 00:51:43.940 |
Cornel West. There's an effort right now to remove Cornel West 00:51:48.820 |
from the ballot in Michigan. Do you support that? Or would you 00:51:54.020 |
By default, I would oppose it. I don't know any of the details. 00:51:58.980 |
Reid, I have a I have a question for you. It's more of a 00:52:03.140 |
statement, actually, maybe I just love to get your reaction. 00:52:05.300 |
One of the most divisive issues that we have right now is 00:52:10.500 |
people's position on October 7, Israel, Palestine. There is a 00:52:17.700 |
sense that there's a growing kind of like virulent strain of 00:52:22.180 |
anti semitism in America. A lot of people point to the extreme 00:52:25.620 |
left as where that's really gestating. There was thoughts 00:52:30.100 |
that Josh Shapiro would have been an exceptional candidate. 00:52:32.980 |
But one of the large reasons why he was not really meaningfully 00:52:36.980 |
considered was his religion. I just want you to comment on the 00:52:41.140 |
broad issue. And whether you see it in the Democratic Party, 00:52:44.180 |
whether you see it in the Republican Party, whether you 00:52:46.100 |
see it at all, just give us a sense of where we stand 00:52:50.340 |
Well, so like I know, Josh Shapiro, I think he's great. 00:52:54.580 |
You know, I've had I've broken bread with them. And, you know, 00:52:59.540 |
he was meaningfully considered. You know, I think that the, you 00:53:04.900 |
know, I think we should be so lucky that he would, you know, 00:53:07.620 |
run for presidency someday, some year. You know, I actually 00:53:12.500 |
didn't know Walz at all. And, and, and, you know, was 00:53:16.740 |
initially kind of surprised because, you know, I was like, 00:53:19.700 |
Oh, I thought it was probably gonna be Shapiro. And I was 00:53:21.940 |
like, Well, you know, I think it was, you know, probably a 00:53:24.020 |
close call down to those two. And, and it looks like, you 00:53:27.060 |
know, you know, in making decisions, I think, you know, 00:53:30.660 |
Harris, you know, made a good decision with Walz. So, you 00:53:34.500 |
know, I think it's a, you know, now on the, on the 00:53:38.020 |
antisemitism topic, I do worry that, you know, broadly, we're 00:53:42.980 |
seeing, you know, kind of more rise of antisemitism. And 00:53:48.900 |
that's extremely important to fight. You know, because I 00:53:52.980 |
think, and I think there are people on in the, you know, 00:53:56.180 |
it's, it's a weirdly like, like, there's some lefties are 00:53:59.620 |
doing it, and there's some righties doing it; it's both a 00:54:01.940 |
blue and a red issue in different, different shape. And 00:54:05.300 |
I think it's very important that we, you know, we stand 00:54:08.660 |
against that as a country. And so, you know, I've been, you 00:54:14.100 |
know, kind of mostly just trying to say, Hey, look, we 00:54:16.340 |
gotta, we gotta be anti, anti racism, antisemitism, and also 00:54:20.580 |
anti genocide. And we got to figure that out. 00:54:22.820 |
What do you think of Kamala's handling of that issue in her 00:54:26.100 |
speech? She basically seemed to, I don't know, say both sides 00:54:30.420 |
it, but she said, Hey, you can believe that the people of 00:54:33.700 |
Gaza should be treated more humanely. And that, you know, 00:54:36.900 |
Israel has a right to defend herself. What do you think of 00:54:39.940 |
I think that's rational, right? Like, you should be anti 00:54:42.340 |
genocide, both of Palestinians and of Jews, right? And, and 00:54:47.620 |
like, like, it's obviously a very, very thorny topic. Yes. 00:54:51.540 |
Right. So. So I think, you know, saying that I'm going to 00:54:54.820 |
try to protect civilians on both sides, anti genocide, I 00:54:57.780 |
think that's a human, caring place to be looking out for 00:55:02.900 |
Reed, do you think that, generally speaking, Marxist 00:55:06.260 |
socialist principles are taking a firmer hold on the Democratic 00:55:11.220 |
Party, and kind of those principles are starting to 00:55:14.900 |
showcase not just in the cultural phenomena that that 00:55:18.180 |
Chamath is referencing, but also in some of the policymaking 00:55:20.740 |
that's going on, and concepts of equity, rooted in concepts 00:55:27.780 |
of social justice, ultimately rooted in Marxist principles 00:55:34.340 |
So as an example, the price gouging, you know, price caps 00:55:37.380 |
on food proposal, the concept of a wealth tax, not necessarily 00:55:42.580 |
the unrealized capital gains tax, but separately attacks on 00:55:45.140 |
wealth, all of these concepts of the degradation of power 00:55:48.820 |
structure through policy. And in part, some have argued that 00:55:54.420 |
the anti semitism arises from these principles and that the 00:55:57.380 |
Jews are considered a privileged and powerful cultural 00:56:00.900 |
class. Is that is that not being observed? Do you not do 00:56:04.500 |
you not think that there's some tendencies that are emerging 00:56:06.580 |
in the Democratic Party and may be influenced by a louder far 00:56:10.020 |
left and that far left is becoming more loud and better 00:56:14.180 |
Look, I think we should speak out against both the far left 00:56:18.500 |
and the far right. I think it's important to do both. And so, 00:56:21.940 |
you know, since, you know, I'm playing the Democrat here on 00:56:27.220 |
this conversation, I'll ask you guys to play the or especially 00:56:31.940 |
Sax play the Republican and speak out against the far right 00:56:36.020 |
too. But the short answer is yes, there are amongst the 00:56:41.060 |
extreme left, that's not everybody in the Democratic 00:56:43.060 |
Party, but the extreme left, there is some like, you know, 00:56:47.460 |
misunderstandings about, you know, why it's important to do 00:56:50.340 |
defend, you know, kind of anti genocide, like from the river 00:56:53.460 |
to the sea. It's like, yeah, that's a genocidal statement. 00:56:55.940 |
Don't use that one. Right. You understand what language 00:56:58.820 |
you're using. And and to be like, look, you know, we've had 00:57:03.140 |
a great genocidal moment with, you know, World War Two, and 00:57:06.180 |
we're still trying to recover from it to, you know, questions 00:57:09.540 |
around like, like what I think is a foolish wealth tax, even 00:57:14.260 |
though it's, by the way, narrowed to like 80%. And then 00:57:16.660 |
on like the, the price gouging stuff, you know, one of the 00:57:20.900 |
things is I started scratching at it, you know, it was 00:57:22.900 |
interesting, I think this week, Kroger said, yes, we did 00:57:25.380 |
actually artificially raise prices to profit from the 00:57:29.140 |
pandemic. And, you know, and yeah, you should stop price 00:57:31.780 |
gouging. It's not quite the same thing as price capping. And 00:57:34.100 |
apparently there's laws that affect even in Florida, right, 00:57:37.700 |
or in Texas, where some of, you know, you guys are living. So 00:57:42.420 |
like, it's kind of, you know, it's like, okay, I need to 00:57:45.860 |
understand this issue in more depth, but I don't think it's 00:57:47.940 |
as simplistic as the political headlines are having it. 00:57:51.060 |
- Well, but the reason why Kamala Harris proposed the 00:57:54.500 |
price fixing proposal, price gouging, whatever you want to 00:57:56.740 |
call it, was in response to inflation. In other words, 00:57:59.700 |
we've had 20% erosion in purchasing power over the last 00:58:02.900 |
four years, Harris needs a response to that. So she came 00:58:05.940 |
forward with this new economic proposal. So it's in that 00:58:08.980 |
context, this came up, and this wasn't some proposal by the far 00:58:11.940 |
left of the party, unless you consider Kamala Harris to be 00:58:15.380 |
far left, I actually do. But okay, fair enough. But my point 00:58:18.580 |
is just, this is her proposal. And it's in response to 00:58:22.180 |
inflation. I mean, you don't, you understand what causes 00:58:25.540 |
inflation, right? It's like the government printing too much 00:58:27.540 |
money. It's not, it's not greedy corporations raising their 00:58:30.580 |
prices too much. I mean, do you agree with that? 00:58:32.500 |
- Look, I agree that you have to have good monetary policy. And 00:58:35.860 |
so I think we probably agree on that. And I think some printing 00:58:39.220 |
of money is part of the normal functioning economy, but too 00:58:41.780 |
much is bad. And I don't think, look, I think price it, look, 00:58:46.820 |
part of the reason why we just talked about antitrust stuff 00:58:48.740 |
earlier, you do have to look at places where there's a 00:58:53.220 |
possibility of kind of commanding stuff from your 00:58:55.860 |
privileged position. And like, you know, the, like, I want to 00:58:59.620 |
- We all agree, we all agree that monopolies have to be 00:59:01.620 |
controlled. No, no, no debate there. But that's not what's 00:59:04.500 |
caused the inflation, right? Because we've had inflation of 00:59:06.500 |
commodities, not just monopoly products, but commodities, like 00:59:09.860 |
just food staples, eggs, you know, chicken, stuff like that. 00:59:14.180 |
- Driven by fuel and labor and all the other inflationary, you 00:59:18.420 |
know, underpinnings of those markets. And I think we tried to 00:59:22.260 |
highlight that. I don't know if you saw Elizabeth Warren's 00:59:23.940 |
interview on CNBC where she got taken apart because she made 00:59:26.820 |
some claims about profiteering by Kraft Heinz and the CNBC 00:59:30.420 |
anchors pointed out you were actually incorrect. Kraft Heinz 00:59:33.220 |
has seen a reduction in profit over this period of time. And so 00:59:36.980 |
like there were factual inaccuracies in these belief 00:59:39.220 |
systems. But, you know, for me, it feels a lot like the 00:59:42.180 |
government setting prices in free markets is one of those 00:59:45.700 |
steps towards socialist principles that worry me the 00:59:48.740 |
- Yeah. And look, I, generally speaking, as I was saying 00:59:52.260 |
earlier, I'm like, like, make sure the network sets, sorts it 00:59:59.780 |
- So, so it's kind of like, you have to look at, is there a 01:00:02.660 |
place where you're like going, okay, that's the reason I like 01:00:04.900 |
focused, her words were price gouging. And if you're focused 01:00:08.020 |
on the kind of gouging side of it is like, oh, there might be a 01:00:10.580 |
market inefficiency that you're essentially correcting, then 01:00:14.180 |
that's, I think the same kind of thing we were talking about 01:00:16.500 |
with like the FTC and the Apple app store and so forth. If 01:00:19.940 |
it's like the, I'm just going to set a fixed price on eggs, 01:00:23.220 |
right? That's a bad idea. And by the way, there's, there's bad 01:00:26.420 |
ideas, like the wealth tax thing that I, that I disagree 01:00:29.300 |
with. Her, her economic thing also had housing, which I think 01:00:32.100 |
is a, you know, a good, you know, kind of thing to kind of 01:00:34.740 |
lower costs for Americans and, you know, kind of make that kind 01:00:38.820 |
of stable work. Like, I think she's, she's been good on 01:00:42.660 |
immigration. I think that's the, the Lankford-Sinema bill, 01:00:46.260 |
which was from, you know, the, the, the Republican side was 01:00:49.940 |
something they were fully prepared to endorse. And, you 01:00:52.980 |
know, Trump killed it because he wanted to campaign on it. 01:00:55.060 |
It's like, look, we care about the actual running in the 01:00:57.300 |
country. And so you look, I think there's a bunch of good 01:01:00.260 |
things, but if you said, do I defend price capping? The 01:01:02.900 |
answer is not as an independent principle by itself. And by 01:01:06.500 |
the way, are there people lefties, like, you know, a lot 01:01:09.220 |
of what Elizabeth Warren says about capitalism, I disagree 01:01:11.860 |
with, right. I mean, I could disagree with you on the 01:01:14.020 |
border. I think, you know, Kamala Harris used to be 01:01:16.580 |
considered the border czar, that's gotten scrubbed. I don't 01:01:18.580 |
think she's done a great job on that, but whatever that I want 01:01:21.060 |
to go back to issues that affect Silicon Valley, 25% 01:01:24.340 |
unrealized gains tax. It seems like most of Silicon Valley, 01:01:27.940 |
almost all of it is either disagrees with this or is up in 01:01:30.820 |
arms about this. I think J Cal, you would you said that this 01:01:34.020 |
is disqualifying and disqualifying for me for sure. 01:01:36.820 |
Yeah. So I mean, do you agree that a large unrealized gains 01:01:41.060 |
tax 25% would be a disaster for Silicon Valley and the whole 01:01:45.540 |
startup ecosystem? Or I mean, how do you come down on that? 01:01:48.420 |
Well, as I understand it, on that taxes is proposed is you 01:01:53.300 |
have to have 80% of your net worth. That's right. Liquid. 01:02:02.100 |
no, you get to defer the tax, but there's a penalty. 01:02:04.980 |
Yeah, you get to defer the tax as a penalty. That's right. 01:02:07.940 |
Look, I think it's definitely a quelling impact. And it's 01:02:10.500 |
definitely stupid and definitely shouldn't happen. 01:02:12.340 |
You know, so is it? Yeah, I think we got your position on 01:02:17.860 |
Why isn't it? Why isn't it disqualifying? The way that J 01:02:24.580 |
Cal says, are we just supposed to hope that she doesn't do what 01:01:28.660 |
she is going to do? I'll tell you why I think that both the 01:02:32.500 |
republicans and the democrats have realized that there's 01:02:36.420 |
actually very little difference on a lot of the major things 01:02:40.180 |
that they actually talk about. So what they're both being 01:02:44.420 |
forced to do is realize that because the centrality of a 01:02:47.860 |
bunch of the things they say are the same, they each have to go 01:02:50.820 |
to their flanks to get the n-plus-one vote. And so Kamala 01:02:55.220 |
goes to the left and spouts all this stuff that seems so 01:02:58.580 |
socialist or socialist or communist because she has to get 01:03:02.420 |
those people to vote for her. Ultimately, I think what ends 01:03:06.980 |
up happening is most of the stuff in the middle has a decent 01:03:10.260 |
chance of happening. The stuff at the fringes, I think they 01:03:13.380 |
get put up sacks almost as like a sacrificial lamb. A good 01:03:16.980 |
example, I think, is like all of the stuff that's happening 01:03:19.540 |
with the student loan reform, a half a trillion dollar plan, it 01:03:23.460 |
gets shot down by the Supreme Court, this new plan, another 01:03:26.660 |
$100 billion, not even being heard yet by the Supreme Court. 01:03:30.580 |
So I think they know this. I mean, it's not like the Biden 01:03:33.780 |
administration is dumb. The Trump administration is not 01:03:36.580 |
them either. So I think what they're doing is 10 million 01:03:38.820 |
people would be an example on the right and taking away a 01:03:41.300 |
woman's right to choose would be the other one. Yeah. And by 01:03:44.180 |
the way, one of the things you keep bringing that up, but 01:03:46.420 |
Trump has said that he would veto he would not support a 01:03:49.380 |
national ban. I'm talking about already doing he already he 01:03:52.260 |
already overturned. I'm talking about that. Yeah. And by the 01:03:54.980 |
way, just returning issue to the states. It's not outlawing 01:03:57.460 |
abortion. And by the way, as the people in Austin, Texas, 01:04:02.340 |
well, but that's a valid initiative. They have not had a 01:04:07.540 |
valid initiative. They're just about everywhere. There's been 01:04:09.460 |
a valid initiative. The pro choice forces have won. And 01:04:12.660 |
besides, that's a state issue. Now, J Cal, not federal. Yeah, 01:04:15.220 |
no, it's a state issue. And Trump succeeded in taking away 01:04:17.220 |
a woman's right to choose in Texas. But one thing, by the 01:04:19.940 |
way, look, in the spirit of the all in podcast, I wanted to be 01:04:22.580 |
clear about like, there's, there's this stuff on the on 01:04:25.140 |
the Dems, and some of their economic policy for the far 01:04:28.420 |
left people that, you know, kind of, you know, they're 01:04:30.580 |
advocating for that I'm opposed to, you know, sax, I'd love to 01:04:33.300 |
hear from you, what parts of Trump's thing you're opposed to? 01:04:37.060 |
There we go. Well, I mean, I have been consistent on this pod 01:04:41.140 |
for years that I thought that the, let's call it the like, 01:04:45.860 |
extreme pro life side was not good for the Republican Party, 01:04:49.380 |
and I've been opposed to it. I don't think it's what J. Cal 01:04:52.580 |
says, I think that overturning Roe v. Wade did not abolish 01:04:55.780 |
abortion, it basically returned the issue to the states. And if 01:04:58.500 |
you look at the referenda that have happened, they've pretty 01:05:00.580 |
much all gone the pro choice direction. So I think that the 01:05:04.260 |
overturning of Roe v. Wade has actually allowed the country to 01:05:06.660 |
sort of sort out that issue, although it's not completely 01:05:09.860 |
sorted out. But look, I would not support a national 01:05:13.780 |
abortion ban, I would not support refederalizing the issue. I 01:05:17.940 |
think there's a lot of issues about, you know, war and peace 01:05:21.620 |
where I do not support the, you could say the establishment 01:05:25.460 |
neocon strand within the party. I do not support all these 01:05:28.900 |
interventions, I do not support these forever wars. And there 01:05:32.420 |
is a big debate in the party about that. Now, one of the 01:05:34.900 |
reasons why at the end of the day, I support Trump, is I know 01:05:38.820 |
this will strike some people as counterintuitive, but I think 01:05:45.060 |
he is the moderate within the Republican Party. He's a 01:05:47.700 |
moderate on abortion. I know, J. Cal, you're still bitter about 01:05:50.420 |
that Supreme Court case. However, he's been very, very 01:05:53.860 |
clear that he will not support a national abortion ban. Moreover, 01:05:56.980 |
he took the abortion language out of the Republican platform. 01:06:00.260 |
I think he's the moderate on issues of war. He was the first 01:06:06.100 |
Republican candidate to run opposing Bush's forever wars. 01:06:10.580 |
So I give him credit on those things. On style, he may not 01:06:14.260 |
come across as a moderate, but those are style points. I think 01:06:17.300 |
on issues, he is the moderate. The issue I have with Kamala 01:06:21.060 |
Harris is I don't think she's a moderate, you know? So like, 01:06:23.940 |
just to take this 25% unrealized gains tax first, when this issue 01:06:29.060 |
came up, we were assured, well, she doesn't really believe that, 01:06:31.780 |
even though it was in the Democratic platform, and it was 01:06:34.340 |
in the Biden-Harris budget. Then people said, well, maybe it's 01:06:38.740 |
part of her platform, but it's not a priority for her. And we 01:06:42.420 |
just had one of her, like, top economic advisors come out on, 01:06:45.620 |
I think it was CNBC, defending it, and her campaign confirmed 01:06:50.260 |
that she supports it, okay? So now the argument has become, 01:06:54.260 |
well, she supports it. It is really part of the platform. She 01:06:57.700 |
would do it if she could, but she's not going to be able to do 01:06:59.860 |
it. I just don't think that's a ringing endorsement of a 01:07:03.300 |
candidate. I don't think you want to support a candidate, 01:07:05.540 |
because they're not going to be able to do what they really 01:07:08.180 |
Do you think she's a moderate? Or do you think she's a socialist, 01:07:12.420 |
you know, going to take the country very far left? 01:07:15.380 |
By the way, yeah, but what Sachs didn't address is Trump's 01:07:19.620 |
tariff policy, which is also inflationary, almost equivalent 01:07:23.220 |
to the price gouging, you know, food price caps. I think that 01:07:25.940 |
they're both inflationary, and they're both bad policy. That's 01:07:29.780 |
Tariffs is where I thought it was going to go. 01:07:32.900 |
Honestly, I'm not sure what I think of that proposal. You 01:07:39.380 |
I'm not endorsing it, but I'm not opposing it. But just back 01:07:41.940 |
to this point that should we support Kamala Harris, even 01:07:45.620 |
though we oppose all the policies that her campaign says 01:07:49.140 |
she supports? Because it seems like that's the argument now is 01:07:53.140 |
that Silicon Valley is expected to support Harris, even though 01:07:56.420 |
she wants, and her campaign has confirmed, she wants a 44% 01:08:00.500 |
capital gains tax. She wants a 25% unrealized gains tax. 01:08:04.740 |
These are things that I think the vast majority of Silicon 01:08:07.300 |
Valley considers to be disastrous for the startup 01:08:10.180 |
ecosystem. Should we support her in spite of those things? 01:08:13.540 |
Well, look, the information did an actual data poll as opposed 01:08:18.900 |
to us being talking heads saying, we say that Silicon 01:08:21.700 |
Valley does X or Y. And, you know, the information's poll 01:08:24.980 |
showed that there was, you know, much broader support for the 01:08:28.500 |
Democratic ticket than the Republican ticket. 01:08:30.420 |
Is that the thing that Ron Conway just tweeted? 01:08:34.740 |
No, no, no, no, that's different. That's a subset. 01:08:39.140 |
That's a different group. That's a group to counteract you and 01:08:45.700 |
But the information, a news source that ran a poll, you 01:08:53.540 |
know, did it objectively ran the whole thing to try to answer 01:08:55.940 |
the question, came out with more folks in favor of, you know, 01:09:04.740 |
Well, because, because look, taxes is an important issue. 01:09:07.220 |
And I think if you ask any, any Silicon Valley business person 01:09:09.540 |
to say, look, lower capital gains, promote long-term 01:09:12.260 |
investment, ask me, that's what I would say too. 01:09:14.340 |
But, you know, you kind of go, well, what actually, in fact, 01:09:17.220 |
you most need for business is stability, rule of law, not 01:09:22.180 |
grifter capitalism, where it's like, you know, give me an 01:09:24.740 |
ability to launch my own NFT, you know, et cetera, et cetera. 01:09:31.700 |
And by the way, we can navigate a higher tax rate. 01:09:34.420 |
It'll be less fast on growth and everything else, but we can 01:09:36.980 |
still invest, create businesses, you know, et cetera, et cetera. 01:09:40.660 |
But we can't do it with, you know, kind of a corroding the 01:09:46.020 |
Like, you know, I think both David, both you and Chamath 01:09:55.300 |
That's the reason why the kind of the rule of law thing is my 01:09:59.380 |
Well, let me ask you about that formally here. 01:10:15.540 |
You funded, like Peter Thiel funded the Gawker case, the E. Jean Carroll case. 01:10:21.540 |
And just to ask you, why did you choose to fund that? 01:10:26.180 |
And do you believe Trump sexually assaulted E. Jean Carroll? 01:10:29.860 |
Well, it's kind of not relevant whether or not I did or not. 01:10:33.700 |
What I funded was an ability to have, you know, kind of a woman 01:10:38.340 |
who doesn't have power, who's being threatened by a rich man 01:10:40.980 |
with a lot of money and power to try to silence her, to have 01:10:43.940 |
her day in court where 12 everyday Americans, right, can 01:10:49.220 |
And their judgment was that there was an assault and there 01:10:59.060 |
And, you know, I think that that's important. 01:11:01.540 |
We, you know, the laws apply more importantly to rich and 01:11:07.620 |
That's the important about, like, one thing I love about 01:11:16.740 |
That's my red line relative to the kind of lines in the sand 01:11:20.340 |
And, you know, that's, you know, that's the reason why in 01:11:25.300 |
the various kind of lawsuits where that seemed to be that 01:11:32.660 |
I don't see how it's rule of law when you have a district 01:11:38.500 |
attorney, Alvin Bragg, who's elected on a promise to get 01:11:41.380 |
Trump, he then takes what are at most a bookkeeping misdemeanor 01:11:48.500 |
that's past the statute of limitations that's expired. 01:11:51.300 |
And he turns into 34 felony charges on a legal theory that 01:11:57.300 |
And then basically Trump is convicted in a sham trial by a 01:12:03.140 |
hyper partisan New York jury system so that Democrats can 01:12:08.420 |
On the branding, on the branding that he's a, quote, 01:12:20.580 |
I don't think it's rule of law when Trump is prosecuted on a 01:12:24.260 |
documents charge that Biden himself is guilty of. 01:12:26.660 |
He's got all these documents in his garage for decades, which 01:12:32.900 |
And we've seen a bunch of these lawfare cases where Trump has 01:12:38.020 |
The judge has thrown it out or he's won it on appeal. 01:12:40.100 |
So that seems to me like abuse of the legal system for a 01:12:56.580 |
Jake, how what's the other one besides the three days? 01:13:02.100 |
And then the Trump organization with the CFO committing tax 01:13:08.180 |
He was convicted in that one as well, or the Trump organization 01:13:11.540 |
And people say that's lawfare by Letitia James. 01:13:14.260 |
So guilty, guilty, guilty in those three of five. 01:13:17.220 |
So what's your take on the four that we haven't discussed yet 01:13:20.420 |
So look, I think it's, you know, it's definitely possible to 01:13:26.580 |
have some versions of lawfare, although I think most people 01:13:29.940 |
use the term when it's the legal process and the law 01:13:35.300 |
You know, I think that in the Bragg case, you had, you know, 01:13:42.740 |
I think, as I recall, one of the jurors said that that 01:13:46.580 |
juror got their principal news from Truth Social. 01:13:50.420 |
I think that, you know, you have Vice President Pence, you 01:13:56.900 |
know, comes out and says, you know, Trump asked me to 01:14:04.580 |
So I don't think that that kind of suggests that there's 01:14:07.540 |
this just rampant political persecution, that there's a 01:14:10.420 |
lot of fire where there's all this smoke doesn't mean that 01:14:12.980 |
every single thing, you know, kind of Democrats are trying 01:14:23.060 |
I think if he broke laws that says he should go to jail, I 01:14:27.380 |
think the laws apply to powerful people as much as they 01:14:33.940 |
Why did they wait for two years on these cases so they could 01:14:38.500 |
Actually, I don't think if you look at the like, look, 01:14:41.780 |
speaking factually, Trump's lawyers are always trying to 01:14:45.540 |
I think they were trying to follow every legal process and 01:14:47.700 |
Trump lawyers keep asking for a campaign this year instead 01:14:51.540 |
Look, after this was this was last year and the year before 01:14:54.660 |
asking for deferrals, setting out trial time, like all of 01:14:57.940 |
the stuff was from his side trying to delay it. 01:15:01.380 |
If it got delayed into this year, that's a bad judgment on 01:15:05.220 |
Jack Smith just filed new charges, new charges, and all 01:15:11.140 |
In the wake of January six, Merrick Garland's Justice 01:15:13.140 |
Department did an analysis of whether Trump could be 01:15:16.260 |
prosecuted for incitement, whether he incited that mob. 01:15:19.620 |
And the legal memo came back and they said, no, we don't 01:15:22.500 |
It does not meet the legal bar for incitement. 01:15:24.580 |
Then it was reported by The New York Times that Biden 01:15:27.620 |
thought that Merrick Garland was basically being a wimp and 01:15:32.180 |
So the hyper-partisan DA or prosecutor Jack Smith was 01:15:35.620 |
hired and he came up with a novel legal theory that somehow 01:15:39.220 |
Trump had perpetrated a fraud on the American people, never 01:15:42.900 |
And since then, he's been prosecuting Trump and seeking 01:15:45.940 |
And when the Supreme Court just kicked the legs out from 01:15:48.580 |
under his case with a recent decision, he just refiled 01:15:52.660 |
I don't understand how anyone can look at this and say, 01:15:54.420 |
yeah, look, what happened on January six wasn't great, but 01:15:56.580 |
the DOJ looked at it, it wasn't criminal, but yet they've 01:15:59.940 |
been pursuing this guy, seeking to put him away for the rest 01:16:02.500 |
of his life, seeking to interfere with this election, 01:16:06.020 |
seeking to deprive the American people of a choice. 01:16:07.860 |
On a separate track, you've got Democrats in states like 01:16:12.180 |
Colorado literally removing Trump from the ballot. 01:16:16.420 |
- So look, the first thing is January 6th, I think is a 01:16:24.020 |
I think it's, you did incite a riot, whether or not the 01:16:33.620 |
So I think it was the, you know, there was an incitement 01:16:38.260 |
I think that the rioters went in and, you know, killed 01:16:42.900 |
police officers, were looking to kill Vice President Pence, 01:16:46.180 |
you know, from the court testimony, courts are the best 01:16:48.420 |
proxy that we have for finding truth in this stuff. 01:16:51.060 |
It's one of the reasons why, you know, by the way, and when, 01:16:53.780 |
for example, the Supreme Court says, no, that's great, 01:16:56.900 |
- I just have to fact check that no police officers 01:17:01.700 |
- I think there were, there was the one died from his 01:17:05.780 |
injuries and, you know, very soon after, and then-- 01:17:08.020 |
- No, no, no, no, there was one cop who had a seizure later. 01:17:13.220 |
No police officers were killed as a part of the riot. 01:17:17.700 |
- Well, and then there's the one who committed suicide too, 01:17:23.940 |
- Yeah, so anyway, so you got, you know, the storming 01:17:29.700 |
of the Capitol, you know, he says these people 01:17:36.100 |
He's going to hire them into his administration, right? 01:17:39.060 |
And if that's not encouragement for other people 01:17:42.500 |
- Wait, he's going to hire January 6th rioters? 01:17:45.460 |
- Yeah, yeah, well, we'll get you the Trump speech. 01:17:50.660 |
There's all kinds of wonderful things in Trump's speeches. 01:17:56.180 |
So, Reid, I think this has been an amazingly robust 01:18:16.340 |
I think that there is a bunch of misinformation. 01:18:19.860 |
I think it's important to hear maybe from Bobby, 01:18:29.940 |
'Cause I think it would be an important thing to do. 01:18:33.860 |
- We're rolling into an interview with RFK Jr. 01:18:36.900 |
- It wasn't designed this way at the last minute, RFK. 01:18:39.380 |
- It's just the last minute RFK Jr., who's on vacation, 01:18:42.180 |
said that he would talk to us about what it was like 01:18:44.660 |
to kind of withdraw and all this sort of stuff. 01:18:51.380 |
I mean, it's- - Oh, okay, totally your choice. 01:18:53.540 |
- It's one of the things I like about your all-in podcast 01:19:25.220 |
I really wanted to see a third-party candidate 01:19:50.180 |
or we're trying to take ourselves off the ballot 01:19:58.740 |
And all red, all blue states will be on the ballot 01:20:10.340 |
with a polling show that we're getting off there, 01:21:27.700 |
and pejoratives and mischaracterizations, et cetera. 01:22:13.620 |
They also, I was very, very popular with them 01:22:25.300 |
but I was never able to communicate with them 01:22:29.380 |
they get their news from the mainstream media. 01:22:32.340 |
And if you're living in that information ecosystem, 01:22:36.260 |
you're going to have a very, very low opinion of me. 01:22:59.140 |
About three hours after the shooting in Butler, 01:45:27.460 |
And that's the incentive inside of that company. 01:45:54.900 |
The shareholders are like, "Where's the money?" 01:53:13.220 |
They are the most precious things in our country. 01:53:31.460 |
something that we should all be concerned about? 01:53:48.500 |
It costs $120 billion of federal money per year. 01:53:59.540 |
And there was an important debate a few years ago 01:54:08.020 |
It's definitely a lot of things we don't agree on, 01:54:18.900 |
or whether it should be fresh fruits and vegetables 01:54:22.580 |
And ultimately there was a food lobbying effort made 01:54:27.060 |
that kept canned soda on the food stamp program. 01:54:33.620 |
with the biggest line item going to canned soda 01:54:59.220 |
when people talk about revamping the food supply, 01:55:15.060 |
that preferring that is almost racist in some way. 01:55:17.540 |
Like, can you just comment on that whole vein of thinking 01:55:20.900 |
- Yeah, I mean, I think feeding people poisonous food 01:55:26.420 |
And by the way, the NAACP gets huge amounts of money 01:55:34.420 |
I think Coca-Cola is the biggest supporter of NAACP. 01:55:39.300 |
So a lot of the NGOs that are supposed to be concerned 01:55:57.060 |
What's really racist is poisoning black Americans 01:56:02.180 |
because these are communities that are food deserts. 01:56:14.580 |
Many of these communities have no grocery stores. 01:56:24.900 |
They don't have access to those kinds of foods. 01:56:34.900 |
And, you know, they use of course market dynamics 01:56:47.140 |
to addict farmers to growing commodity agriculture 01:56:52.740 |
It's low in nutrients, it's high in chemicals, 01:56:56.740 |
And, you know, we need to change these perverse incentives 01:57:12.020 |
that we should be giving people food that is hurting them? 01:57:26.980 |
I know you've got some strong feelings on it. 01:57:30.980 |
I'm sure a lot of it's confirming for you what he's saying. 01:57:36.340 |
I think what the Democratic Party did to you, Mr. Kennedy, 01:57:47.700 |
And I'm glad that we got to have you on early 01:57:50.740 |
on our podcast, at least to let some of your ideas 01:58:01.140 |
- Well, let me pick up on those themes, J Cal. 01:58:20.020 |
You've spoken about the issue of chronic health, 01:58:24.980 |
But I think you've put it now on the political radar screen 01:58:28.980 |
So I think you ran a very noble and effective campaign. 01:58:34.980 |
And I think, like you said, it was a campaign 01:58:46.660 |
these would have been Democratic Party issues. 01:58:54.180 |
They conducted lawfare to keep you off the ballot. 01:59:00.100 |
that they even tried to infiltrate your campaign. 01:59:11.060 |
I think that if they had given you the opportunity to debate, 01:59:14.660 |
I think we now know what would have happened. 01:59:20.740 |
is there's a complete implosion of Biden's campaign. 01:59:25.060 |
We discovered that, indeed, the Democratic Party 01:59:27.780 |
had been hiding his condition for a long time. 01:59:44.900 |
Kamala Harris has never received one primary vote. 01:59:47.780 |
It was done through a process that was opaque. 01:59:59.060 |
I mean, he basically said publicly over and over again, 02:00:04.820 |
He said, only God Almighty could get me out of the race. 02:00:07.220 |
And then it was reported that Nancy Pelosi went to him 02:00:09.380 |
and said, we can do this the easy way or the hard way. 02:00:19.380 |
And now we have a new Democratic Party nominee 02:00:36.420 |
I find it just almost maddening or galling again 02:00:45.620 |
It's just not the way that democracy is supposed to work. 02:00:52.420 |
engaging these tactics can be cloaking themselves 02:00:55.380 |
in all this highfalutin rhetoric of democracy. 02:00:59.140 |
And so I feel like I'm in the place that you are, Bobby. 02:01:04.820 |
I did not start off supporting Trump in the primaries. 02:01:09.620 |
and I did fundraisers for DeSantis and Vivek. 02:01:14.740 |
because I think of the job that DeSantis did as governor. 02:01:27.620 |
to resist this hypocritical elite authoritarianism 02:01:34.580 |
that wants to engage in censorship over debate, 02:01:40.420 |
this surveillance state over anything it wants to do, 02:01:53.860 |
that you've kind of come around to this opinion too. 02:01:55.380 |
I know you have your reservations about Trump. 02:02:01.860 |
But at the end of the day, he is the choice that represents, 02:02:06.180 |
again, these populist forces resisting authoritarianism. 02:02:09.380 |
Sorry, this is more of a statement than a question, 02:02:19.940 |
that people are gonna see a very different President Trump 02:02:40.500 |
But I think he is, he's focused on his legacy. 02:02:57.700 |
and people descended on him the day that he got elected. 02:03:00.900 |
And said, you gotta appoint this guy, appoint this guy. 02:03:03.780 |
And he said, you know, I appointed a lot of people 02:03:14.420 |
one of the big sort of fulcrums of their terror of Trump 02:03:18.180 |
is that he's gonna implement this Heritage Foundation, 02:03:22.340 |
you know, blueprint, which is called Project 2025. 02:03:26.340 |
And he brought this issue up to me and he said, 02:03:29.380 |
you know, they're always telling me I'm for Project 2025. 02:03:36.340 |
He said, it was written by a right-wing asshole. 02:03:44.500 |
and there was, there are right-wing assholes. 02:03:46.500 |
And it was a right-wing asshole who wrote that thing. 02:03:54.740 |
And I think he's interested in his legacy now. 02:03:57.780 |
He wants to leave behind some accomplishments 02:04:13.700 |
and, you know, I'm gonna be on the transition committee, 02:04:21.460 |
There's gonna be a wide diversity of stakeholders, 02:04:31.300 |
because, you know, he really did present well 02:04:33.860 |
on this podcast and had like a very good moment 02:04:56.180 |
They want, you know, post-assassination attempt Trump. 02:05:02.180 |
- Well, a lot of people feel that way, J Cal. 02:05:02.180 |
'Cause I think that the things you're talking about, 02:05:31.380 |
you know, that has the superficiality that you like, 02:05:36.420 |
And when we do learn something underneath it, 02:05:43.780 |
have to say, oh, well, she's not really gonna do that. 02:05:47.220 |
So the best thing you can say about her campaign 02:05:49.380 |
is that she's not gonna be able to accomplish 02:05:51.700 |
the things that she says she wants to accomplish. 02:06:01.060 |
as a protest vote and it doesn't make a difference. 02:06:11.460 |
the most troubling thing about what's happening now, 02:06:15.540 |
who have not been able to give unscripted interviews, 02:06:20.500 |
I mean, my father and uncle were so proud of, 02:06:26.660 |
to engage in debate, to defend who we were in the world, 02:06:34.740 |
and have a command of the facts and of knowledge 02:06:44.660 |
What does the rest of the world think of us right now? 02:06:51.700 |
who are not able to explain themselves in an interview. 02:06:59.060 |
the other day something really, I think, poignant, 02:07:03.060 |
which is if you want the job of handling the nuclear code, 02:07:19.460 |
to explain who you are to the American people? 02:07:24.260 |
and you can get past the anger and past the vitriol 02:07:24.260 |
to any new knowledge coming in or any contrary facts, 02:07:58.660 |
Has the destruction of the American middle class, 02:08:00.980 |
the highest inflation rate in a generation, 02:08:10.020 |
for the United States that you're so proud of 02:08:24.820 |
So we'll see, maybe she'll miraculously do 10 podcasts 02:08:33.700 |
and gave him only most favored nation interviews. 02:08:37.300 |
that we've had so many great candidates come on this pod 02:08:52.980 |
Wish you great success with Make America Healthy Again. 02:09:00.820 |
I respect the fact that you wanna make America healthy again 02:09:03.940 |
and I wish you great continued success with that. 02:09:20.120 |
♪ And instead we open source it to the fans ♪ 02:09:35.880 |
♪ That is my dog taking a notice in your driveway, Sacks ♪ 02:10:11.320 |
The All In Summit is taking place in Los Angeles, 02:10:16.360 |
You can apply for tickets, summit.allinpodcast.co. 02:10:21.320 |
And you can subscribe to this show on YouTube. 02:10:24.840 |
Our YouTube channel has passed 500,000 subscribers. 02:10:43.480 |
What I read this week at chamath.substack.com. 02:10:46.520 |
And sign up for a developer account at console.groq.com 02:11:01.560 |
Click on the careers page at ohalogenetics.com. 02:11:05.080 |
I am the world's greatest moderator, Jason Calacanis. 02:11:14.600 |
to apply for funding from your boy J Cal for your startup. 02:11:19.560 |
This is the company I am most excited about at the moment. 02:11:28.120 |
Thanks for tuning in to the world's number one podcast.