Peter Thiel | All-In Summit 2024
Chapters
0:00 The Besties intro Peter Thiel
1:03 Why he's not financially participating in the 2024 election
6:53 US relationship with China, is defending Taiwan worth risking WW3?
16:38 State of AI: Similar to the internet in 1999
24:03 Innovation stagnation in the US
29:42 Thoughts on the current state of the US economy
32:50 The higher education bubble
39:09 Who will win AI, Nvidia's monopoly position going forward
Peter was the person who told me this really pithy quote. 00:00:08.560 |
the biggest risk you can take is not taking any risk. 00:00:12.720 |
This guy is a tough nut to try to sort of explain. 00:00:21.280 |
which is, I believe they helped find Osama Bin Laden. 00:00:31.240 |
I think what matters is a question of agency. 00:00:42.480 |
Tell me something that's true that nobody agrees with you on. 00:00:52.600 |
You don't do this too often, so we do appreciate it. 00:00:57.640 |
But when you do do it, you're always super candid, 00:01:02.060 |
You're sitting this year's political cycle out. 00:01:08.680 |
- Well, no, I mean, I think this is a question we all have, 00:01:22.960 |
It's very confounding to us because these are your guys. 00:01:32.280 |
Look, I have a lot of conflicted thoughts on it. 00:01:35.120 |
I am still very strongly pro-Trump, pro-J.D. 00:01:40.120 |
I've decided not to donate any money politically, 00:01:44.840 |
but I'm supporting them in every other way possible. 00:01:57.140 |
my pessimistic thought is that Trump is going to win 00:02:09.840 |
because the elections are always a relative choice, 00:02:13.000 |
and then once someone's president, it's an absolute, 00:02:22.640 |
that one would be more anti-Harris than anti-Trump. 00:02:33.720 |
there will be a lot of buyer's remorse and disappointment. 00:02:35.880 |
That's sort of the arc that I see of what's gonna happen, 00:02:44.080 |
I think the odds are slightly in favor of Trump, 00:03:03.220 |
and you can't always line things up and figure it out. 00:03:09.060 |
or maybe the Trump voters get really demotivated 00:03:13.740 |
is simply gonna collapse in the next two months, 00:03:20.360 |
with all the headaches that come with being involved, 00:03:32.120 |
if it's gonna be a razor-thin close election, 00:03:37.320 |
because they will cheat, they will fortify it, 00:03:45.700 |
if we can answer them in the event that it's close, 00:03:55.180 |
and so that's sort of a straightforward analysis right there. 00:04:01.220 |
on a percentage basis, do you think happens every year? 00:04:04.900 |
How much, and do you think Trump actually won? 00:04:07.600 |
With the verbs, so you know, cheating, stealing, 00:04:11.180 |
that implies something happened in the dark of night. 00:04:16.660 |
- Okay, yeah, we don't wanna get canceled on YouTube. 00:04:19.180 |
- Ballot harvesting, I mean, it's all sort of, 00:04:35.780 |
'Cause we all want everybody's votes to count, 00:04:44.220 |
you'd try to run elections the same way you do it 00:05:03.360 |
That's the way you do it in every other country. 00:05:18.920 |
and it used to be much more like that in the US. 00:05:21.860 |
I mean, it's meaningfully decayed over the last 20, 30 years. 00:05:25.660 |
You know, 20, 30 years ago, 30, 40 years ago, 00:05:30.340 |
and that sort of stopped happening a while ago. 00:05:54.180 |
I think there are some extremely difficult problems 00:05:56.860 |
that it's really hard to know how to solve. 00:06:12.300 |
to meaningfully reduce the deficit with no tax hikes. 00:06:18.220 |
- Well, you would do it if you got a lot of GDP growth, 00:06:21.580 |
maybe, but if you could meaningfully reduce the deficit 00:06:25.300 |
with no tax hikes, that would be very impressive. 00:06:36.180 |
and the conflict in Gaza, just sort of the warmups 00:06:42.580 |
And so if Trump can find a way to head that off, 00:06:50.500 |
that will be better than I would expect, possibly. 00:06:53.460 |
- In relation to Taiwan, if Trump called you and asked, 00:06:56.600 |
should we defend it or not in this acute case, 00:07:00.220 |
would you advise to let Taiwan be taken by China 00:07:04.740 |
in order to avoid a nuclear holocaust and World War III? 00:07:09.820 |
Or would you believe that we should defend it 00:07:14.100 |
- Well, I think you're probably not supposed to say. 00:07:19.780 |
- No, no, no, if you, look, I think there's so many ways 00:07:24.780 |
our policies are messed up, but probably, you know, 00:07:27.920 |
the one thing that's roughly correct on the Taiwan policy 00:07:32.080 |
is that we don't tell China what we're gonna do. 00:07:35.360 |
And what we tell them is we don't know what we'll do 00:07:39.820 |
which probably has the virtue of being correct. 00:07:42.820 |
And then I think if you had a red line at Quemoy and Matsu, 00:07:46.860 |
the islands, you know, five miles off the coast of China, 00:07:58.300 |
So I think the policy of not saying what our policy is 00:08:03.300 |
and maybe not even having a policy, you know, 00:08:16.040 |
Worth defending or not worth stirring the conflict, 00:08:22.100 |
worth going to war over or not, according to Peter Thiel? 00:08:36.260 |
- How does the world-- - Those can both be true. 00:08:49.460 |
Think out the next decade in kind of your base case 00:09:01.300 |
maybe it escalates all the way to nuclear war. 00:09:05.420 |
Probably it's some very messy in-between thing, 00:09:12.460 |
What I think happens economically is very straightforward. 00:09:24.620 |
between the US and China, and they all blow up. 00:09:35.260 |
but what I told him and what I felt was very honest advice 00:09:49.900 |
I would get the computers out, the people out. 00:10:00.180 |
And if you think there's a 50/50 chance this happens 00:10:08.900 |
- He said that they had done a lot of simulations 00:10:30.140 |
So he seemed perfectly bright to me, even though-- 00:10:35.500 |
- A lot of bright people are wrong about this. 00:10:38.500 |
- I saw you give a talk last summer with Barry Weiss 00:10:40.820 |
and you talked about this decoupling should be happening. 00:10:45.380 |
You were recommending that every industry leader 00:10:50.100 |
I think your comment was it's like picking up nickels 00:10:54.740 |
- Well, I think there are a lot of different ways 00:11:06.140 |
There are people who try to compete within China. 00:11:08.820 |
There are people who built factories in China for export. 00:11:17.340 |
But yeah, I certainly would not try to invest 00:11:22.340 |
in a company that competed domestically inside China. 00:11:45.020 |
of building factories in China for export to the West. 00:11:54.900 |
I mean, I visited the Foxconn factory nine years ago 00:11:58.020 |
and it's, you have people getting paid a dollar and a half, 00:12:02.460 |
And they live in a dorm room with two bunk beds 00:12:10.820 |
And you sort of realize they're really far behind us 00:12:15.380 |
And either way, it's not that straightforward 00:12:18.700 |
to just shift the iPhone factories to the United States. 00:12:23.340 |
So I sort of understand why a lot of businesses 00:12:28.300 |
ended up there and why this is the arrangement that we have. 00:12:32.180 |
But yeah, my intuition for what is going to happen 00:12:37.180 |
without making any normative judgments at all 00:12:43.860 |
- It presumably is, it's presumably pretty inflationary. 00:12:49.980 |
Yeah, that's probably the, you know, I don't know. 00:12:53.260 |
It's, you'd have to sort of look at, you know, 00:12:56.180 |
what the inelasticities of all these goods are. 00:12:58.540 |
- So if that's true, what's the policy reaction? 00:13:00.860 |
- It's probably not that, it may not be as inflationary 00:13:03.500 |
as people think because people always model trade 00:13:07.420 |
in terms of pairwise, in terms of two countries. 00:13:10.300 |
So if you literally have to move the people back to the US, 00:13:15.420 |
I don't, you know, I don't know how much it would cost 00:13:20.460 |
- Well, I think India is sort of too messed up, 00:13:24.460 |
there are, you know, there are 5 billion people 00:13:27.020 |
living in countries where the incomes are lower than China. 00:13:30.020 |
And so, you know, probably the negative sum trade policy 00:13:47.460 |
And that's kind of the negative sum policy 00:13:52.020 |
that's going to manifest as this sort of decoupling happens. 00:13:58.340 |
- Let's talk about avoiding it for a second here. 00:14:06.340 |
I mean that like as a compliment, as a superpower, right? 00:14:08.980 |
Like he doesn't have a problem talking to them. 00:14:11.700 |
He connects with them and they seem to like him. 00:14:14.740 |
So what would be the path to him working with Xi 00:14:21.300 |
Because we were sitting here last year talking about this 00:14:23.940 |
and it just seems mind boggling that if everybody agrees 00:14:29.740 |
that we can't figure out a way to make it not happen. 00:14:43.780 |
We don't exactly, I feel we just have no clue 00:14:51.620 |
But I think it's sort of the sense of history 00:14:58.940 |
that you have a rising power against an existing power 00:15:28.980 |
to avoiding the conflict would be we have to start 00:15:32.020 |
by admitting that China believes the conflict's happening. 00:15:36.540 |
- And then if people like you are constantly saying, 00:15:43.100 |
- That is a recipe, that's a recipe for World War III. 00:15:59.780 |
to the North Korean dictator, but yeah, in general, 00:16:02.400 |
it's probably a good idea to try to talk to people 00:16:06.060 |
even if they're really bad people most of the time. 00:16:25.420 |
I don't think Tucker Carlson counts as an emissary 00:16:29.860 |
And if anybody gets Tuckered, or I don't know what the verb is, 00:16:32.940 |
who talks, that seems worse than the alternative. 00:16:46.140 |
some of the misguided things we've done in the past 00:16:48.400 |
in the name of technology and use like big data 00:17:10.640 |
and machine learning, big data, cloud computing. 00:17:15.640 |
I'm gonna build a mobile app, bring the cloud to, 00:17:18.760 |
if you have sort of a concatenation of buzzwords, 00:17:22.980 |
my first instinct is just to run away as fast as possible. 00:17:28.560 |
And for many years, my bias is probably that AI 00:17:37.540 |
the last generation of computers, anything in between. 00:17:40.900 |
So it's meant all these very different things. 00:17:47.300 |
probably the AI, to the extent you concretize it, 00:17:54.900 |
by the two books, the two canonical books that framed it. 00:17:58.660 |
There was the Bostrom book, "Superintelligence" (2014), 00:18:16.200 |
where basically AI was going to be surveillance tech, 00:18:21.820 |
because they had no qualms about applying this technology. 00:18:25.540 |
And then if we now think about what actually happened, 00:18:37.380 |
which was actually what people would have defined AI as 00:18:46.380 |
It's a computer that can pretend to be a human 00:18:48.860 |
or that can fool you into thinking it's a human. 00:18:57.600 |
you could say that pre-ChatGPT it wasn't passed 00:19:05.860 |
And then obviously it leads to all these questions. 00:19:29.660 |
And it's probably, certainly the big picture questions, 00:19:33.860 |
which I think Silicon Valley's always very bad 00:19:39.060 |
Sort of the, I don't know, the stupid 2022 answer 00:19:42.420 |
would be that humans differ from all the other animals 00:19:46.740 |
If you're a three-year-old or an 80-year-old, 00:19:48.420 |
you speak, you communicate, we tell each other stories. 00:19:53.960 |
And so, yeah, I think there's something about it 00:19:56.220 |
that's incredibly important and very disorienting. 00:20:02.140 |
I don't know, the narrower question I have as an investor 00:20:03.780 |
is sort of how do you make money with this stuff? 00:20:13.820 |
this is always where I'm anchored on the late '90s 00:20:47.820 |
who's gonna have pricing power is super unclear. 00:20:53.500 |
Probably, you know, one layer deeper of analysis, 00:20:59.780 |
you need to pay attention to who's making money. 00:21:01.980 |
And in AI, it's basically one company that's making, 00:21:09.900 |
And so there's sort of a, you have to do some sort, 00:21:19.820 |
You know, is it, you know, my monopoly question, 00:21:24.900 |
You know, and then I, it's hard for me to know 00:21:27.260 |
because I'm in Silicon Valley and I haven't done anything, 00:21:29.420 |
we haven't done anything in semiconductors for a long time. 00:21:32.580 |
- Let's de-buzzword the word AI 00:21:45.460 |
What do you, have you thought about what it means 00:21:47.740 |
for, you know, 8 billion people in the world, 00:21:50.300 |
if there's an extra billion that necessarily couldn't work 00:21:53.900 |
or like whether that in political or economic terms? 00:21:57.180 |
- I don't know if this is the same, 00:22:04.300 |
but this is, you know, the history of 250 years, 00:22:07.540 |
the industrial revolution, what was it, you know, 00:22:15.900 |
You know, there was, yeah, 00:22:18.620 |
I don't know, a Luddite critique 00:22:25.820 |
because the machines would replace the people. 00:22:27.820 |
You know, maybe the Luddites are right this time around. 00:22:30.740 |
I'm probably pretty skeptical of it, 00:22:34.540 |
but yeah, it's extremely confusing, 00:22:38.020 |
you know, where the gains and losses are. 00:22:46.900 |
You can always just use it on your hobby horses. 00:22:49.220 |
So I don't know, you know, my anti-Hollywood 00:22:51.460 |
or anti-university hobby horse is that it seems to me 00:22:54.820 |
that, you know, the AI is quite good at the woke stuff. 00:22:59.140 |
And so, you know, if you want to, 00:23:05.100 |
or a little bit sexist, or just really funny. 00:23:08.260 |
And you won't have any risk of the AI replacing you. 00:23:12.700 |
Everybody else will get, everybody else will get replaced. 00:23:15.460 |
And then probably, I don't know, 00:23:20.340 |
Claudine Gay, the plagiarizing Harvard University president, 00:23:26.820 |
the AI will produce endless amounts of these sort of, 00:23:31.740 |
I don't even know what to call them, woke papers. 00:23:40.540 |
They were always saying the same thing 00:23:48.860 |
obviously they've been able to do it for a long time 00:23:50.420 |
and no one's noticed, but I think at this point, 00:23:53.340 |
it doesn't seem promising from a competitive point of view. 00:24:00.740 |
So maybe that's just wishful thinking on my part. 00:24:03.580 |
- What are the areas of technology that you're curious about 00:24:15.900 |
you want to instantiate it more in companies than 00:24:20.420 |
things, or, you know, if you ask sort of like, 00:24:25.900 |
You know, in our society, and it doesn't have to be this way 00:24:35.900 |
in a certain subset of relatively small companies. 00:24:38.740 |
We have these relatively small teams of people 00:24:45.860 |
that's sort of what I find, you know, inspiring about, 00:24:50.460 |
And then obviously you don't just want innovation. 00:25:00.060 |
It's somehow, it doesn't happen in universities. 00:25:06.340 |
I mean, you know, somehow in this very, very weird, 00:25:09.700 |
different country that was the United States in the 1940s, 00:25:12.580 |
you had, you know, somehow the army organized the scientists 00:25:15.700 |
and got them to produce a nuclear bomb in Los Alamos 00:25:22.300 |
after that was, you know, 00:25:27.820 |
obviously maybe if you'd left the prima donna scientists 00:25:30.020 |
to their own, it would have taken them 50 years 00:25:32.980 |
And the army could just tell them what to do. 00:25:34.900 |
And this will silence anybody who doesn't believe 00:25:44.220 |
But yeah, I think that's sort of, 00:25:56.100 |
we've episodically tried to do all this investing. 00:25:59.100 |
We've probably tried to do too much investing in Europe 00:26:03.580 |
it's a nice place to go on vacation as an investor. 00:26:13.220 |
but it's a very strange thing that so much of it is still, 00:26:23.140 |
social evolutionary problem in the United States? 00:26:26.660 |
What is the root cause of the failure to innovate 00:26:29.940 |
in the United States relative to the expectation 00:26:40.860 |
there's always one of the big picture claims I have 00:26:44.020 |
that we've been in an era of relative tech stagnation 00:26:55.260 |
which is not an anti-Twitter, anti-X commentary, 00:26:58.100 |
even though the way I used to always qualify it 00:27:00.660 |
was that it was at least a good company. 00:27:21.020 |
'Cause you point out that it's not a technology trend tracker 00:27:23.780 |
that you think about, it's about people and teams 00:27:30.100 |
And what's gone wrong with our view of the world 00:27:46.500 |
and again, it's not that there's been no innovation. 00:27:56.780 |
So there are sort of all these world-of-bits places 00:28:06.380 |
But it was everything having to do with atoms 00:28:10.500 |
when I was an undergraduate at Stanford in the late '80s. 00:28:12.620 |
In retrospect, any applied engineering field was a bad idea. 00:28:16.420 |
It was a bad idea to become a chemical engineer, 00:28:49.220 |
for covering up for the failures of that generation. 00:28:53.380 |
And then I think, but I think maybe a very big picture 00:28:58.380 |
part of it was that at some point in the 20th century, 00:29:35.340 |
- Yeah, Woodstock was three weeks after that. 00:29:39.460 |
- Progress stopped and the hippies took over. 00:29:42.100 |
- Can we shift gears just to the domestic economy? 00:29:44.300 |
What do you think's happening in the domestic economy? 00:29:51.940 |
The revisions are supposed to be completely random, 00:29:58.740 |
There's also what's happening with the yield curve, 00:30:01.580 |
What's your take on what's happening in the economy? 00:30:04.140 |
- You know, it's, man, it's always hard to know exactly. 00:30:18.620 |
It's being stopped by really big government spending. 00:30:27.100 |
the projection for the deficit in fiscal year '24, 00:30:38.300 |
The deficit's gonna come in about 400 billion higher. 00:30:41.380 |
And so, you know, it was sort of a crazy deficit 00:30:55.140 |
this crazy deficit at the top of the economic cycle, 00:30:58.900 |
you know, you're supposed to increase deficits 00:31:04.700 |
You know, things would be probably very shaky. 00:31:07.900 |
There's some way where, yeah, we have too much debt, 00:31:16.060 |
Again, I always think it comes back to, you know, 00:31:21.100 |
There probably are other ways to grow an economy 00:31:32.540 |
And then that's where it's very deeply stuck. 00:31:37.020 |
there's always a question, why did people not realize 00:31:39.380 |
that this tech stagnation had happened sooner? 00:31:43.980 |
people could do economically that had nothing to do 00:31:51.380 |
which was to massively cut taxes, deregulate, 00:31:54.620 |
allow lots of companies to merge and combine. 00:31:57.460 |
And it was sort of a one-time way to make the economy 00:32:10.900 |
and that was sort of the right-wing capitalist move. 00:32:13.580 |
And then in the '90s, there was sort of a Clinton-Blair 00:32:21.060 |
And there was a giant global arbitrage you could do, 00:32:24.180 |
which also had a lot of negative externalities 00:32:26.340 |
that came with it, but it sort of was a one-time move. 00:32:36.540 |
I don't think you should raise taxes like crazy, 00:32:44.820 |
And so I think you have to somehow get back to the future. 00:32:50.180 |
You, I think, saw that maybe those Ivy League institutions 00:32:55.180 |
weren't producing the best and brightest 00:33:03.900 |
And I meet them all because they all have crazy ideas 00:33:07.700 |
What have you learned getting people to quit school, 00:33:10.740 |
giving them $100,000, and then how many parents call you 00:33:13.660 |
and get really upset that their kids are quitting school? 00:33:33.340 |
than I even thought when I started this thing. 00:33:37.140 |
I think, yeah, I did this debate at Yale last week, 00:33:37.140 |
And you should go through all the different numbers. 00:33:53.740 |
And again, I was careful to word it 00:33:53.740 |
and then people kept saying, well, what's your alternative? 00:34:03.220 |
And I said, nope, that was not the debate. 00:34:03.220 |
I'm not, you know, I'm not your guidance counselor. 00:34:12.860 |
the first thing you should do is probably not, you know, 00:34:18.900 |
And, you know, the student debt was 300 billion in 2000. 00:34:22.500 |
It's basically close to 2 trillion at this point. 00:34:33.260 |
if you graduated from college in 1997, 12 years later, 00:34:39.200 |
but most of the people had sort of paid it down. 00:34:53.620 |
If you take the people who graduated from college in 2009, 00:34:53.620 |
the median person had more student debt 12 years later 00:35:08.100 |
Because it's actually just compounding faster. 00:35:08.100 |
And, you know, again, I think it's on some level, 00:35:30.980 |
there are sort of a lot of debates in our society 00:35:33.780 |
that are probably dominated by sort of a boomer narrative. 00:35:37.020 |
And maybe the baby boomers were the last generation 00:35:59.980 |
It would be good if we figured something out before then. 00:36:02.820 |
- Does the government need to stop underwriting the loans? 00:36:16.340 |
And there's, if you're an accredited university, 00:36:20.100 |
And accreditation, in a rigid kind of free market system, 00:36:34.220 |
would figure out what the rate should be, and so on. 00:36:36.260 |
But in this case, the government simply provides capital 00:36:47.140 |
relative to the earning potential over time is gone. 00:36:54.740 |
I know, in some ways I'm right wing, 00:36:54.740 |
I do think a lot of the students got ripped off. 00:37:03.220 |
And so I think there should be some kind of broad-- 00:37:12.500 |
It's the universities and it's the bondholders. 00:37:18.140 |
out of those endowments. - And the universities. 00:37:20.580 |
And then obviously, if you just make it the taxpayers, 00:37:25.180 |
then the universities can just charge 00:37:25.180 |
more and more every year-- - There's no incentive 00:37:36.580 |
that the bankruptcy laws got rewritten in the US 00:37:42.940 |
And if you haven't paid it off by the time you're 65, 00:37:45.420 |
your social security wage checks will be garnished. 00:37:56.620 |
- Well, if we say that, if we start with my place 00:38:01.620 |
where a lot of the student debt should be forgiven, 00:38:10.380 |
how many of the colleges can pay for all the students-- 00:38:27.260 |
You might not have to shut them down 00:38:27.260 |
because a lot of them have gotten extremely bloated. 00:38:31.180 |
So there's sort of all these sort of parasitic people 00:38:56.020 |
there would be a lot of rational ways to dial this back, 00:39:02.500 |
- If the only way to lose weight is to cut off your thumb, 00:39:05.940 |
that's kind of a difficult way to go on a diet. 00:39:18.060 |
are arguably the three leaders in AI language models. 00:39:18.060 |
Rank them in order and tell us a little bit about each. 00:39:41.380 |
you said today you felt extremely honest and candid, 00:39:45.580 |
but I've already been extremely honest and candid, 00:39:59.580 |
Well, maybe talk a little bit about each individual. 00:40:07.980 |
that Sam Altman was getting away with turning OpenAI 00:40:07.980 |
It has to be totally illegal, what Sam's doing, 00:40:24.020 |
and that seemed really, really convincing in the moment, 00:40:31.020 |
actually, man, it's been such a horrifically mismanaged 00:40:36.020 |
place at OpenAI with this preposterous nonprofit board 00:40:36.020 |
and so there actually isn't much of a moral hazard from it, 00:40:52.980 |
I mean, do you see a path to a monopoly there? 00:40:55.220 |
- Well, again, this is, again, where, you know, 00:40:57.420 |
you should, you know, attention is all you need. 00:41:00.300 |
You need to pay attention to who's making money. 00:41:02.420 |
It's Nvidia, it's the hardware, the chip layer, 00:41:11.260 |
you know, it's not what we've done in tech for 30 years. 00:41:21.860 |
Yeah, everyone else is just spending money on the compute, 00:41:30.820 |
but yeah, I think everyone else is collectively losing money. 00:41:34.260 |
- What do you think of Zuckerberg's approach to say, 00:41:36.300 |
I'm so far behind, this isn't core to my business, 00:41:45.140 |
- Again, I would say my big qualification is, 00:41:58.800 |
it does feel uncomfortably close to the bubble of 1999. 00:42:07.940 |
And I want to have more clarity in investing, 00:42:21.260 |
in retrospect, Nvidia would have been a good buy. 00:42:28.180 |
Everyone's going to try to copy them on the chip side. 00:42:31.340 |
Maybe that's straightforward to do, maybe it's not, 00:42:40.900 |
you should not be asking this question about, you know, 00:42:47.260 |
You should really be focusing on the Nvidia question, 00:42:49.940 |
And the fact that we're not able to focus on that, 00:42:52.380 |
that tells us something about how we've all been trained. 00:42:54.660 |
You know, I think Nvidia got started in 1993. 00:43:05.860 |
And then, yeah, it's probably a really bad idea 00:43:07.940 |
to start a semiconductor company even in '93, 00:43:14.520 |
No talented people started semiconductor companies 00:43:17.060 |
after 1993 because they all went into software. 00:43:38.480 |
And then, you know, I think the risk, 00:43:43.200 |
it's always, you know, attention is all that you need. 00:43:48.940 |
when you get started as an actress, as a startup, 00:43:58.480 |
attention becomes the worst thing in the world. 00:44:03.380 |
had the largest market cap in the world earlier this year. 00:44:07.300 |
And I do think that represented a phase transition. 00:44:11.060 |
they probably had more attention than was good. 00:44:23.340 |
than a lot of the folks that we get to talk to. 00:44:28.100 |
With all of this, are you optimistic for the future? 00:44:31.380 |
- I always, man, I always push back on that question. 00:44:36.540 |
I think extreme optimism and extreme pessimism 00:44:48.500 |
You know, extreme pessimism, nothing you can do. 00:44:51.400 |
Extreme optimism, the future will take care of itself. 00:44:56.220 |
it's probably you wanna be somewhere in between, 00:45:06.940 |
it's not some sort of winning a lottery ticket 00:45:09.140 |
or some astrological chart that's gonna decide things. 00:45:14.860 |
And that's sort of an axis that's very different 00:45:19.920 |
extreme pessimism, they're both excuses for laziness. 00:45:24.860 |
Ladies and gentlemen, give it up for Peter Thiel.