
Peter Thiel | All-In Summit 2024


Chapters

0:00 The Besties intro Peter Thiel
1:03 Why he's not financially participating in the 2024 election
6:53 US relationship with China, is defending Taiwan worth risking WW3?
16:38 State of AI: Similar to the internet in 1999
24:03 Innovation stagnation in the US
29:42 Thoughts on the current state of the US economy
32:50 The higher education bubble
39:09 Who will win AI, Nvidia's monopoly position going forward

Whisper Transcript

00:00:00.000 | Peter was the person who told me this really pithy quote.
00:00:03.960 | In a world that's changing so quickly,
00:00:08.560 | the biggest risk you can take is not taking any risk.
00:00:12.720 | This guy is a tough nut to try to sort of explain.
00:00:16.440 | Changed money with PayPal.
00:00:18.040 | Was the first outside investor in Facebook.
00:00:20.440 | Backed Palantir,
00:00:21.280 | which is, I believe they helped find Osama Bin Laden.
00:00:24.400 | Almost certainly the most successful
00:00:26.280 | technology investor in the world.
00:00:29.600 | I don't think the future is fixed.
00:00:31.240 | I think what matters is a question of agency.
00:00:33.960 | What I think works really well
00:00:36.260 | are sort of one of a kind companies.
00:00:39.160 | How do you get from zero to one?
00:00:40.720 | What great business is nobody building?
00:00:42.480 | Tell me something that's true that nobody agrees with you on.
00:00:45.480 | All right, Peter, welcome back.
00:00:48.760 | Good to see you.
00:00:49.600 | (audience applauds)
00:00:52.600 | You don't do this too often, so we do appreciate it.
00:00:57.640 | But when you do do it, you're always super candid,
00:00:59.880 | and we appreciate that as well.
00:01:00.960 | You fit right in here.
00:01:02.060 | You're sitting this year's political cycle out.
00:01:06.400 | - Right into politics.
00:01:08.680 | - Well, no, I mean, I think this is a question we all have,
00:01:11.220 | which is, you were very active.
00:01:13.480 | You bet on J.D. in a major way.
00:01:16.180 | He delivered today.
00:01:17.920 | It was a very impressive discussion.
00:01:20.440 | Why aren't you involved this cycle?
00:01:22.960 | It's very confounding to us because these are your guys.
00:01:25.400 | - Man, how much time do we have?
00:01:26.840 | I mean, this was, let's talk about this
00:01:28.000 | for two hours or something.
00:01:29.380 | I don't know.
00:01:32.280 | Look, I have a lot of conflicted thoughts on it.
00:01:35.120 | I am still very strongly pro-Trump, pro-J.D.
00:01:40.120 | I've decided not to donate any money politically,
00:01:44.840 | but I'm supporting them in every other way possible.
00:01:52.440 | Obviously, I think there's,
00:01:57.140 | my pessimistic thought is that Trump is going to win
00:02:02.760 | and probably will win by a big margin.
00:02:05.880 | He'll do better than the last time,
00:02:07.920 | and it'll still be really disappointing
00:02:09.840 | because the elections are always a relative choice,
00:02:13.000 | and then once someone's president, it's an absolute,
00:02:15.700 | and you get evaluated.
00:02:17.560 | Do you like Trump or Harris better?
00:02:19.780 | And there seem to be a lot of reasons
00:02:22.640 | that one would be more anti-Harris than anti-Trump.
00:02:26.320 | Again, no one's pro any of these people.
00:02:28.240 | It's all negative, right?
00:02:29.740 | But then after they win,
00:02:33.720 | there will be a lot of buyer's remorse and disappointment.
00:02:35.880 | That's sort of the arc that I see of what's gonna happen,
00:02:39.160 | and it's somewhat under-motivating.
00:02:41.940 | I don't know, just to describe it,
00:02:44.080 | I think the odds are slightly in favor of Trump,
00:02:48.240 | but it's basically 50/50.
00:02:50.720 | My one contrarian view on the election
00:02:52.600 | is that it's not gonna be close.
00:02:54.560 | Most presidential elections aren't,
00:02:56.480 | and one side just breaks.
00:02:58.640 | 2016 and 2020 were super close,
00:03:01.520 | but 2/3 of the elections aren't,
00:03:03.220 | and you can't always line things up and figure it out.
00:03:06.080 | I think either the Kamala bubble will burst,
00:03:09.060 | or maybe the Trump voters get really demotivated
00:03:11.900 | and don't show up, but I think one side
00:03:13.740 | is simply gonna collapse in the next two months,
00:03:17.460 | and then if you wanna get involved
00:03:20.360 | with all the headaches that come with being involved,
00:03:22.640 | if it makes a difference counterfactually,
00:03:24.480 | and if it's a really close election,
00:03:25.840 | everything makes a difference.
00:03:26.880 | If it's not even close,
00:03:28.840 | I don't think it makes much of a difference.
00:03:30.200 | If it is gonna be close, by the way,
00:03:32.120 | if it's gonna be a razor-thin close election,
00:03:35.160 | then I'm pretty sure Kamala will win,
00:03:37.320 | because they will cheat, they will fortify it,
00:03:40.560 | they will steal the ballots,
00:03:42.080 | and so if we can answer that question,
00:03:45.700 | if we can answer them in the event that it's close,
00:03:50.660 | I don't wanna be involved.
00:03:52.020 | In the event that it's not close,
00:03:53.820 | I don't need to be involved,
00:03:55.180 | and so that's sort of a straightforward analysis right there.
00:03:58.340 | - My jumping off point, how much cheating
00:04:01.220 | on a percentage basis do you think happens every year?
00:04:04.900 | How much, and do you think Trump actually won
00:04:06.780 | the last time? - You need to be careful
00:04:07.600 | with the verbs, so you know, cheating, stealing,
00:04:11.180 | that implies something happened, the dark of night.
00:04:13.740 | - Massaging. - I think the verb
00:04:14.780 | you're allowed to use is fortify.
00:04:16.660 | - Okay, yeah, we don't wanna get canceled on YouTube.
00:04:19.180 | - Ballot harvesting, I mean, it's all sort of,
00:04:21.940 | there were all these rule changes,
00:04:23.100 | it was sort of done in plain daylight,
00:04:25.060 | but yeah, I think our elections are not,
00:04:29.000 | they're not perfectly clean,
00:04:30.820 | otherwise we could examine it,
00:04:32.380 | we could have a vigorous debate about it.
00:04:33.980 | - Well, what would you change then?
00:04:34.940 | What should change?
00:04:35.780 | 'Cause we all want everybody's votes to count,
00:04:38.180 | we want it to be clean.
00:04:40.340 | I'm talking about the audience here.
00:04:41.500 | - I don't know, at a minimum you'd run them,
00:04:44.220 | you'd try to run elections the same way you do it
00:04:46.220 | in every other Western democracy,
00:04:47.740 | you have one day voting,
00:04:49.180 | you have practically no absentee ballots.
00:04:52.420 | You have, and, (audience applauding)
00:04:55.420 | and it's one day where everything happens,
00:04:59.740 | it's not this two-month elongated process.
00:05:03.360 | That's the way you do it in every other country.
00:05:05.140 | You'd have some somewhat stronger voter ID
00:05:08.580 | and make sure that the people who are voting
00:05:12.180 | have a right to vote.
00:05:13.920 | - Make it a national holiday.
00:05:15.320 | - That's basically what you do
00:05:17.500 | in every other Western democracy,
00:05:18.920 | and it used to be much more like that in the US.
00:05:21.860 | I mean, it's meaningfully decayed over the last 20, 30 years.
00:05:25.660 | You know, 20, 30 years ago, 30, 40 years ago,
00:05:28.100 | you got the results on the day of the vote,
00:05:30.340 | and that sort of stopped happening a while ago.
00:05:32.980 | - What would make you not disappointed?
00:05:36.060 | So Trump gets elected, how do you,
00:05:38.760 | what's your counter-narrative on, you know,
00:05:40.900 | we're a year or two years past the election,
00:05:43.860 | Trump is president, what makes you say
00:05:45.700 | I'm surprisingly not disappointed?
00:05:48.060 | What takes place?
00:05:50.380 | - Man, it's, you know, it's,
00:05:54.180 | I think there are some extremely difficult problems
00:05:56.860 | that it's really hard to know how to solve them.
00:06:01.580 | I wouldn't know what to do,
00:06:02.680 | but we have an incredibly big deficit,
00:06:07.380 | and yeah, if you can find some way
00:06:12.300 | to meaningfully reduce the deficit with no tax hikes.
00:06:16.220 | - And without GDP contraction.
00:06:18.220 | - Well, you would do it if you got a lot of GDP growth,
00:06:21.580 | maybe, but if you could meaningfully reduce the deficit
00:06:25.300 | with no tax hikes, that would be very impressive.
00:06:30.300 | You know, I think we're sort of sleepwalking
00:06:33.700 | into Armageddon with, you know, Ukraine
00:06:36.180 | and the conflict in Gaza, just sort of the warmups
00:06:40.060 | to the China-Taiwan war.
00:06:42.580 | And so if Trump can find a way to head that off,
00:06:47.580 | that would be incredible.
00:06:48.900 | If they don't go to war in four years,
00:06:50.500 | that will be better than I would expect, possibly.
00:06:53.460 | - In relation to Taiwan, if Trump called you and asked,
00:06:56.600 | should we defend it or not in this acute case,
00:07:00.220 | would you advise to let Taiwan be taken by China
00:07:04.740 | in order to avoid a nuclear holocaust and World War III?
00:07:09.820 | Or would you believe that we should defend it
00:07:11.860 | and defend free countries like that?
00:07:14.100 | - Well, I think you're probably not supposed to say.
00:07:18.460 | - Wait, but you're Peter Thiel.
00:07:19.780 | - No, no, no, if you, look, I think there's so many ways
00:07:24.780 | our policies are messed up, but probably, you know,
00:07:27.920 | the one thing that's roughly correct on the Taiwan policy
00:07:32.080 | is that we don't tell China what we're gonna do.
00:07:35.360 | And what we tell them is we don't know what we'll do
00:07:38.260 | and we'll figure it out when you do it,
00:07:39.820 | which probably has the virtue of being correct.
00:07:42.820 | And then I think if you had a red line at Quemoy and Matsu,
00:07:46.860 | the islands, you know, five miles off the coast of China,
00:07:49.920 | that's unbelievable.
00:07:51.280 | If you say we want some guardrails
00:07:55.420 | and we won't defend Taiwan,
00:07:56.700 | then they'd get invaded right away.
00:07:58.300 | So I think the policy of not saying what our policy is
00:08:03.300 | and maybe not even having a policy, you know,
00:08:07.340 | in some ways is relatively the best.
00:08:09.620 | I think anything precise you say,
00:08:12.260 | that's gonna just lead to war right away.
00:08:14.740 | - But what do you believe?
00:08:16.040 | Worth defending or not worth stirring the conflict,
00:08:20.140 | democracy in this tiny island,
00:08:22.100 | worth going to war over or not, according to Peter Thiel?
00:08:25.420 | - It's not worth World War III.
00:08:29.560 | And I still think it's quite catastrophic
00:08:34.020 | if it gets taken over by the communists.
00:08:36.260 | - How does the world-- - Those can both be true.
00:08:37.660 | - How does the world divide
00:08:39.820 | if we end up in a heightened escalation?
00:08:43.740 | Is China, Russia, Iran friends?
00:08:46.820 | Is that an axis that forms?
00:08:49.460 | Think out the next decade in kind of your base case
00:08:52.540 | and estimate how the world divides.
00:08:55.060 | - I don't know what happens militarily
00:08:56.960 | if there's a China-Taiwan invasion.
00:08:59.340 | I mean, maybe we roll over,
00:09:01.300 | maybe it escalates all the way to nuclear war.
00:09:05.420 | Probably it's some very messy in-between thing,
00:09:09.900 | sort of like what you have in the Ukraine.
00:09:12.460 | What I think happens economically is very straightforward.
00:09:15.820 | I think basically with Russia and Germany,
00:09:20.460 | you had one Nord Stream pipeline,
00:09:22.580 | and we have the equivalent of 100 pipelines
00:09:24.620 | between the US and China, and they all blow up.
00:09:27.900 | I met the TikTok CEO about a year ago,
00:09:31.140 | and maybe I wouldn't have said this now,
00:09:35.260 | but what I told him and what I felt was very honest advice
00:09:38.220 | was you don't need to worry about the US.
00:09:41.060 | We're never gonna do anything about TikTok.
00:09:42.820 | We're too incompetent.
00:09:44.300 | (audience laughing)
00:09:46.260 | But if I were in your place,
00:09:47.220 | I would still get the business out of China.
00:09:49.900 | I would get the computers out, the people out.
00:09:51.900 | I'd completely decouple it from ByteDance
00:09:54.100 | because TikTok will be banned
00:09:57.380 | 24 hours after the Taiwan invasion.
00:10:00.180 | And if you think there's a 50/50 chance this happens
00:10:03.140 | and that will destroy 100% of the value
00:10:06.580 | of the TikTok franchise--
00:10:07.700 | - What was his reaction?
00:10:08.900 | - He said that they had done a lot of simulations
00:10:14.700 | and there were a bunch of companies
00:10:16.020 | in World War I and World War II
00:10:17.340 | that managed to sell things to both sides.
00:10:19.620 | - He doesn't seem so bright to me.
00:10:21.140 | Do you think he's--
00:10:22.300 | (audience laughing)
00:10:23.140 | No, he gotta see.
00:10:23.980 | What was your take on him?
00:10:24.820 | - He didn't disagree with my frame,
00:10:26.460 | and so I always find that flattering
00:10:28.300 | if someone basically agrees with my framing.
00:10:30.140 | So he seemed perfectly bright to me, even though--
00:10:32.980 | - Game theory?
00:10:33.820 | - Maybe not, not correct.
00:10:34.660 | - But game theory.
00:10:35.500 | - A lot of bright people are wrong about this.
00:10:38.500 | - I saw you give a talk last summer with Barry Weiss
00:10:40.820 | and you talked about this decoupling should be happening.
00:10:44.500 | You weren't saying should.
00:10:45.380 | You were recommending that every industry leader
00:10:48.340 | consider decoupling from China.
00:10:50.100 | I think your comment was it's like picking up nickels
00:10:52.060 | in front of a freight train.
00:10:53.340 | Do you remember saying that?
00:10:54.740 | - Well, I think there are a lot of different ways
00:11:00.980 | in which businesses are coupled to China.
00:11:03.980 | There were investors that tried investing.
00:11:06.140 | There are people who try to compete within China.
00:11:08.820 | There are people who built factories in China for export.
00:11:12.380 | And there are different parts of that
00:11:14.260 | that worked to varying degrees.
00:11:17.340 | But yeah, I certainly would not try to invest
00:11:22.340 | in a company that competed domestically inside China.
00:11:29.700 | I think that's virtually impossible.
00:11:32.220 | I think it's probably quite tricky
00:11:36.940 | even to invest in Chinese businesses.
00:11:39.900 | And then there is sort of this model
00:11:45.020 | of building factories in China for export to the West.
00:11:50.020 | And it was a very big arbitrage.
00:11:53.540 | These things do work.
00:11:54.900 | I mean, I visited the Foxconn factory nine years ago
00:11:58.020 | and it's, you have people get paid a dollar and a half,
00:12:00.340 | $2 an hour, and they work 12 hours a day.
00:12:02.460 | And they live in a dorm room with two bunk beds
00:12:05.740 | where you get eight people in the dorm room.
00:12:07.860 | Someone's sleeping in your bed
00:12:08.980 | while you're working and vice versa.
00:12:10.820 | And you sort of realize they're really far behind us
00:12:14.140 | or they're really far ahead of us.
00:12:15.380 | And either way, it's not that straightforward
00:12:18.700 | to just shift the iPhone factories to the United States.
00:12:23.340 | So I sort of understand why a lot of businesses
00:12:28.300 | ended up there and why this is the arrangement that we have.
00:12:32.180 | But yeah, my intuition for what is going to happen
00:12:37.180 | without making any normative judgments at all
00:12:40.420 | is it is going to decouple.
00:12:42.260 | - How inflationary will that be?
00:12:43.860 | - It presumably is, it's presumably pretty inflationary.
00:12:49.980 | Yeah, that's probably the, you know, I don't know.
00:12:53.260 | It's, you'd have to sort of look at, you know,
00:12:56.180 | what the inelasticities of all these goods are.
00:12:58.540 | - So if that's true, what's the policy reaction?
00:13:00.860 | - It's probably not that, it may not be as inflationary
00:13:03.500 | as people think because people always model trade
00:13:07.420 | in terms of pairwise, in terms of two countries.
00:13:10.300 | So if you literally have to move the people back to the US,
00:13:13.620 | that's insanely expensive.
00:13:15.420 | I don't, you know, I don't know how much it would cost
00:13:16.700 | people to build an iPhone.
00:13:18.140 | - Does India become China?
00:13:20.460 | - Well, I think India is sort of too messed up,
00:13:22.060 | but you shift it to like Vietnam, Mexico,
00:13:24.460 | there are, you know, there are 5 billion people
00:13:27.020 | living in countries where the incomes are lower than China.
00:13:30.020 | And so, you know, probably the negative sum trade policy
00:13:35.020 | we should have with China is, you know,
00:13:38.980 | we should just shift it to other countries,
00:13:41.100 | which is a little bit bad for the US,
00:13:43.540 | extremely bad for China,
00:13:45.300 | and let's say really good for Vietnam.
00:13:47.460 | That's kind of, and that's kind of the negative sum policy
00:13:52.020 | that's going to manifest as this sort of decoupling happens.
00:13:58.340 | - Let's talk about avoiding it for a second here.
00:14:00.620 | Trump seems to be extremely good
00:14:01.980 | with dictators and authoritarians.
00:14:04.580 | Kim Jong-un seems like a big fan.
00:14:06.340 | I mean that in like as a compliment as a superpower, right?
00:14:08.980 | Like he doesn't have a problem talking to them.
00:14:11.700 | He connects with them and they seem to like them.
00:14:14.740 | So what would be the path to him working with Xi
00:14:18.300 | to avoid this?
00:14:19.660 | Is there a path to avoid this?
00:14:21.300 | Because we were sitting here last year talking about this
00:14:23.940 | and it just seems mind boggling that if everybody agrees
00:14:28.020 | that this is going to happen,
00:14:29.740 | that we can't figure out a way to make it not happen.
00:14:32.860 | - Well, it's not just up to us.
00:14:37.220 | So yeah, there is, and so I don't know,
00:14:41.700 | it's obviously somewhat of a black box.
00:14:43.780 | We don't exactly, I feel we just have no clue
00:14:48.580 | what people in China think.
00:14:51.620 | But I think it's sort of the sense of history
00:14:55.820 | is strongly the sort of Thucydides trap idea
00:14:58.940 | that you have a rising power against an existing power
00:15:02.260 | and it tends to, it's Wilhelmine Germany
00:15:05.940 | versus Britain before World War I.
00:15:07.820 | And it's Athens against Sparta,
00:15:11.660 | the rising power against the existing power,
00:15:14.060 | you tend to get conflict.
00:15:16.140 | That's probably what deep down I think
00:15:20.420 | is really, really far in the China DNA.
00:15:23.460 | So I'd say maybe the first, I don't know,
00:15:26.780 | the meta version would be the first step
00:15:28.980 | to avoiding the conflict would be we have to start
00:15:32.020 | by admitting that China believes the conflict's happening.
00:15:35.700 | - Right.
00:15:36.540 | - And then if people like you are constantly saying,
00:15:40.260 | well we just need to have some happy talk.
00:15:42.260 | - Right.
00:15:43.100 | - That is a recipe, that's a recipe for World War III.
00:15:46.100 | - I'm not advocating happy talk necessarily.
00:15:48.300 | (audience applauding)
00:15:50.380 | I get accused of being a bit more hawkish.
00:15:53.380 | - Obviously, in general, I don't know.
00:15:57.900 | I'm not sure Trump should have talked
00:15:59.780 | to the North Korean dictator, but yeah, in general,
00:16:02.400 | it's probably a good idea to try to talk to people
00:16:06.060 | even if they're really bad people most of the time.
00:16:08.980 | And it's certainly a very odd dynamic
00:16:13.460 | with the U.S. and Russia at this point
00:16:16.540 | where I think it is impossible for anybody
00:16:20.700 | in the Biden administration even to have
00:16:22.380 | a back channel communication with people.
00:16:25.420 | I don't think Tucker Carlson counts as an emissary
00:16:28.300 | from the Biden administration.
00:16:29.860 | And if anybody who talks gets Tuckered, or I don't know what the verb is,
00:16:32.940 | that seems worse than the alternative.
00:16:38.600 | - Can we talk about technology?
00:16:40.380 | You have a speech where you talk about
00:16:46.140 | some of the misguided things we've done in the past
00:16:48.400 | in the name of technology and use like big data
00:16:50.180 | as an example of that.
00:16:51.520 | What is AI?
00:16:53.560 | - Oh man, that's sort of a big question.
00:17:00.460 | Yeah, I always had this riff
00:17:08.380 | where I don't like the buzzwords
00:17:10.640 | and machine learning, big data, cloud computing.
00:17:15.640 | I'm gonna build a mobile app, bring the cloud to,
00:17:18.760 | if you have sort of a concatenation of buzzwords,
00:17:22.980 | my first instinct is just to run away as fast as possible.
00:17:26.260 | It's some really bad group think.
00:17:28.560 | And for many years, my bias is probably that AI
00:17:32.500 | was one of the worst of all these buzzwords.
00:17:34.500 | It meant the next generation of computers,
00:17:37.540 | the last generation of computers, anything in between.
00:17:40.900 | So it's meant all these very different things.
00:17:43.700 | If we roll the clock back to the 2010s,
00:17:47.300 | probably the AI, to the extent you concretize it,
00:17:52.260 | I would say the AI debate was maybe framed
00:17:54.900 | by the two books, the two canonical books that framed it.
00:17:58.660 | There was the Bostrom book, "Superintelligence" (2014),
00:18:01.420 | where AI was gonna be this superhuman,
00:18:04.100 | super duper intelligent thing.
00:18:07.380 | And then the anti-Bostrom book
00:18:10.700 | was Kai-Fu Lee's 2018 "AI Superpowers",
00:18:13.260 | which we can think of as the CCP rebuttal to Bostrom,
00:18:16.200 | where basically AI was going to be surveillance tech,
00:18:19.580 | face recognition, and China was going to win
00:18:21.820 | because they had no qualms about applying this technology.
00:18:25.540 | And then if we now think about what actually happened,
00:18:29.180 | let's say with the LLMs and ChatGPT,
00:18:32.260 | it was really neither of those two.
00:18:35.300 | And it was this in-between thing,
00:18:37.380 | which was actually what people would have defined AI as
00:18:40.040 | for the previous 60 or 70 years,
00:18:42.280 | which is passing the Turing test,
00:18:44.180 | which is the somewhat fuzzy line.
00:18:46.380 | It's a computer that can pretend to be a human
00:18:48.860 | or that can fool you into thinking it's a human.
00:18:52.740 | Even with the fuzziness of that line,
00:18:57.600 | you could say that pre-ChatGPT it wasn't passed
00:19:00.600 | and then ChatGPT passed it.
00:19:02.620 | And that seems very, very significant.
00:19:05.860 | And then obviously it leads to all these questions.
00:19:10.100 | What does it mean?
00:19:11.220 | Is it gonna complement people?
00:19:13.820 | Is it gonna substitute for people?
00:19:16.140 | What does it do to the labor market?
00:19:17.620 | Do you get paid more, do you get paid less?
00:19:19.660 | So there are all these questions,
00:19:21.580 | but it seems extremely important.
00:19:29.660 | And it's probably, certainly the big picture questions,
00:19:33.860 | which I think Silicon Valley's always very bad
00:19:35.540 | at talking about is like,
00:19:36.820 | what does it mean to be a human being?
00:19:39.060 | Sort of the, I don't know, the stupid 2022 answer
00:19:42.420 | would be that humans differ from all the other animals
00:19:44.960 | because we're good at languages.
00:19:46.740 | If you're a three-year-old or an 80-year-old,
00:19:48.420 | you speak, you communicate, we tell each other stories.
00:19:51.260 | This is what makes us different.
00:19:53.960 | And so, yeah, I think there's something about it
00:19:56.220 | that's incredibly important and very disorienting.
00:20:00.180 | You know, the question I always have as a,
00:20:02.140 | I don't know, the narrower question I have as an investor
00:20:03.780 | is sort of how do you make money with this stuff?
00:20:06.300 | And- - How do you make money?
00:20:08.940 | - It's pretty confusing.
00:20:11.900 | And I think, I don't know,
00:20:13.820 | this is always where I'm anchored on the late '90s
00:20:15.860 | is sort of the formative period for me.
00:20:18.100 | But I keep thinking that AI in 2023, 2024
00:20:24.460 | is like the internet in 1999.
00:20:27.020 | It's really big.
00:20:29.620 | It's going to be very important.
00:20:31.420 | It's going to transform the world,
00:20:33.260 | not in six months, but in 20 years.
00:20:36.420 | And then there are probably all kinds
00:20:38.860 | of incredibly catastrophic approximations
00:20:42.620 | where what businesses are gonna make money,
00:20:46.780 | who's gonna have monopoly,
00:20:47.820 | who's gonna have pricing power is super unclear.
00:20:53.500 | Probably, you know, one layer deeper of analysis,
00:20:56.220 | you know, if attention is all you need,
00:20:58.420 | and if you're not post-economic,
00:20:59.780 | you need to pay attention to who's making money.
00:21:01.980 | And in AI, it's basically one company is making,
00:21:04.860 | NVIDIA is making over 100% of the profits.
00:21:07.740 | Everybody else is collectively losing money.
00:21:09.900 | And so there's sort of a, you have to do some sort,
00:21:14.020 | you should try to do some sort of analysis.
00:21:17.620 | Do you go long, NVIDIA?
00:21:18.580 | Do you go short?
00:21:19.820 | You know, is it, you know, my monopoly question,
00:21:22.220 | is it a really durable monopoly?
00:21:24.900 | You know, and then I, it's hard for me to know
00:21:27.260 | because I'm in Silicon Valley and I haven't done anything,
00:21:29.420 | we haven't done anything in semiconductors for a long time.
00:21:31.740 | So I have no clue.
00:21:32.580 | - Do you, if you, let's de-buzzword the word AI
00:21:35.060 | and say it's a bunch of process automation.
00:21:37.340 | Let's just say that's version 0.1,
00:21:39.900 | where brains that are roughly the equivalent
00:21:42.020 | of a teenager can do a lot of manual stuff.
00:21:45.460 | What do you, have you thought about what it means
00:21:47.740 | for, you know, 8 billion people in the world,
00:21:50.300 | if there's an extra billion that necessarily couldn't work
00:21:53.900 | or like whether that in political or economic terms?
00:21:57.180 | - I don't know, the, I don't know if this is the same,
00:22:04.300 | but this is, you know, the history of 250 years,
00:22:07.540 | the industrial revolution, what was it, you know,
00:22:11.060 | it adds to GDP, it frees people up
00:22:13.660 | to do more, more productive things.
00:22:15.900 | You know, maybe there's, you know, there was, yeah,
00:22:18.620 | there was a, I don't know, there was a Luddite critique
00:22:20.500 | in the 19th century of the factories
00:22:23.020 | that people were going to be unemployed
00:22:24.660 | and wouldn't have anything to do
00:22:25.820 | because the machines would replace the people.
00:22:27.820 | You know, maybe the Luddites are right this time around.
00:22:30.740 | I'm probably, I'm probably pretty, pretty skeptical of it,
00:22:34.540 | but, but yeah, it's, it's, it's extremely confusing,
00:22:38.020 | you know, where, where the gains and, and losses are.
00:22:41.980 | There probably are, you know, there,
00:22:45.700 | there's always sort of a hobby.
00:22:46.900 | You can always just use it on your hobby horses.
00:22:49.220 | So I don't know the, you know, my anti-Hollywood
00:22:51.460 | or anti-university hobby horse is that it seems to me
00:22:54.820 | that, you know, the, the AI is quite good at the woke stuff.
00:22:59.140 | And it'll, and, and so, you know, if you want to,
00:23:01.620 | if you want to be a successful actor,
00:23:03.820 | you should be maybe a little bit racist
00:23:05.100 | or a little bit sexist, or just really funny.
00:23:08.260 | And you won't have any risk of the AI replacing you.
00:23:11.380 | (audience laughing)
00:23:12.700 | Everybody else will get, everybody else will get replaced.
00:23:15.460 | And then probably, I don't know, I don't know,
00:23:20.340 | Claudine Gay, the plagiarizing Harvard University president,
00:23:24.780 | you know, the AI is going to, you know,
00:23:26.820 | the AI will produce endless amounts of, of these sort of,
00:23:31.740 | I don't even know what to call them, woke papers.
00:23:36.340 | And they, they were all already
00:23:38.660 | sort of plagiarizing one another.
00:23:40.540 | They were, they were always saying the same thing
00:23:41.940 | over and over again.
00:23:43.180 | - They were using their own version.
00:23:44.380 | - The AI is just going to flood the zone
00:23:46.100 | with even more of that.
00:23:47.180 | And that, you know, I don't know,
00:23:48.860 | obviously they've been able to do it for a long time
00:23:50.420 | and no one's noticed, but I think at this point,
00:23:53.340 | it doesn't seem promising from a competitive point of view.
00:23:58.900 | - What are the areas?
00:23:59.740 | - But it's obviously my hobby horses.
00:24:00.740 | So I'm, I'm just, maybe just wishful thinking on my part.
00:24:03.580 | - What are the areas of technology that you're curious about
00:24:06.860 | that your mind is like, wow, this is really,
00:24:08.980 | I have to learn more, pay attention?
00:24:12.380 | - You know, I'm always, I always think you,
00:24:15.900 | you want to instantiate it more in companies than,
00:24:20.420 | than things, or, you know, if you ask sort of like,
00:24:22.700 | where is, where is innovation happening?
00:24:25.900 | You know, in our society, and it doesn't have to be this way
00:24:31.100 | but it's, it's, it's mostly in, you know,
00:24:35.900 | in a certain subset of relatively small companies.
00:24:38.740 | We have these relatively small teams of people
00:24:41.460 | that are really pushing the envelope.
00:24:43.420 | And that's, that's sort of, you know,
00:24:45.860 | that's sort of what I find, you know, inspiring about,
00:24:48.940 | about venture capital.
00:24:50.460 | And then, and then obviously you don't just want innovation.
00:24:53.380 | You also want it to, it to, it to translate
00:24:57.340 | into, into good businesses.
00:24:58.500 | But that's, that's where it happens.
00:25:00.060 | It's somehow, it doesn't happen in universities.
00:25:02.980 | It doesn't happen in government.
00:25:05.180 | You know, there was a time it did.
00:25:06.340 | I mean, you know, somehow in this very, very weird,
00:25:09.700 | different country that was the United States in the 1940s,
00:25:12.580 | you had, you know, somehow the army organized the scientists
00:25:15.700 | and got them to produce a nuclear bomb in Los Alamos
00:25:18.140 | in three and a half years.
00:25:19.220 | And the way the New York Times editorialized
00:25:22.300 | after that was, you know, it's, it's, you know,
00:25:24.540 | it was sort of an anti-libertarian write-up.
00:25:26.460 | It was, you know, there were, you know,
00:25:27.820 | obviously maybe if you'd left the prima donna scientists
00:25:30.020 | to their own, it would have taken them 50 years
00:25:32.140 | to build a bomb.
00:25:32.980 | And the army could just tell them what to do.
00:25:34.900 | And this will silence anybody who doesn't believe
00:25:37.700 | the government can do things.
00:25:39.260 | And they don't write editorials like that
00:25:41.980 | in the New York Times anymore.
00:25:44.220 | But I think, yeah, but I think that's sort of,
00:25:47.020 | that's sort of where one should look.
00:25:49.740 | I think a crazy amount of it still happens
00:25:52.820 | in the United States.
00:25:54.780 | You know, and there's sort of, you know,
00:25:56.100 | we've episodically tried to do all this investing.
00:25:59.100 | We've probably tried to do too much investing in Europe
00:26:01.300 | over the years.
00:26:02.140 | It's always sort of a junket, sort of,
00:26:03.580 | it's a nice place to go on vacation as an investor.
00:26:06.860 | And it is very, it's very,
00:26:11.780 | I don't have a great explanation,
00:26:13.220 | but it's a very strange thing that so much of it is still,
00:26:17.740 | the US is somehow still the country
00:26:19.180 | where people do new things.
00:26:20.900 | - Peter, is that a team organizational,
00:26:23.140 | social evolutionary problem in the United States?
00:26:26.660 | What is the root cause of the failure to innovate
00:26:29.940 | in the United States relative to the expectation
00:26:33.020 | going back 70 years, 50 years, et cetera?
00:26:37.060 | From, you know, the rocket ships
00:26:38.700 | and we're all gonna live in.
00:26:39.540 | - Yeah, well, this is always,
00:26:40.860 | there's always one of the big picture claims I have
00:26:44.020 | that we've been in an era of relative tech stagnation
00:26:46.780 | the last 40 or 50 years,
00:26:48.300 | or the tagline that we hadn't.
00:26:50.940 | - And techno negativism.
00:26:52.620 | - They promised us flying cars,
00:26:53.900 | all we got was 140 characters,
00:26:55.260 | which is not an anti-Twitter, anti-X commentary,
00:26:58.100 | even though the way I used to always qualify it
00:27:00.660 | was that at least, it was at least a good company.
00:27:04.180 | You had, you know, 10,000 people
00:27:06.260 | who didn't have to do very much work
00:27:07.540 | and could just smoke marijuana all day.
00:27:09.660 | - Very similar to Europe.
00:27:10.660 | - And so I think that actually,
00:27:11.980 | that part actually did get corrected, but.
00:27:14.100 | - Very subtle.
00:27:18.660 | - But I think.
00:27:20.180 | - Like what went wrong?
00:27:21.020 | 'Cause you point out that it's not a technology trend tracker
00:27:23.780 | that you think about, it's about people and teams
00:27:26.140 | that innovate and drive to outcomes
00:27:28.100 | based on their view of the world.
00:27:30.100 | And what's gone wrong with our view of the world
00:27:32.380 | and our ability to organize,
00:27:33.500 | to achieve the seemingly unachievable
00:27:36.620 | with very rare exceptions,
00:27:37.940 | obviously Elon's here later, but yeah.
00:27:39.340 | - You know, it's over-determined.
00:27:42.180 | The rough frame I always have,
00:27:46.500 | and again, it's not that there's been no innovation.
00:27:48.420 | There's been a decent amount of innovation
00:27:50.860 | in the world of bits, computers, internet,
00:27:53.820 | mobile internet, you know, crypto, AI.
00:27:56.780 | So there are sort of all these world of bits places
00:28:01.460 | where there was a significant,
00:28:04.460 | but sort of somehow narrow cone of progress.
00:28:06.380 | But it was everything having to do with atoms
00:28:08.820 | that was slow.
00:28:09.660 | And this was already the case
00:28:10.500 | when I was an undergraduate at Stanford in the late '80s.
00:28:12.620 | In retrospect, any applied engineering field was a bad idea.
00:28:16.420 | It was a bad idea to become a chemical engineer,
00:28:18.380 | you know, a mechanical engineer.
00:28:20.180 | AeroAstro was terrible.
00:28:21.940 | Nuclear engineering, everyone knew.
00:28:23.540 | I mean, no one did that, you know.
00:28:26.380 | And there's something about, you know,
00:28:29.500 | the world of atoms that, you know,
00:28:31.460 | from a libertarian point of view,
00:28:32.500 | you'd say it got regulated to death.
00:28:34.780 | There were probably, you know,
00:28:37.900 | there's some set of arguments
00:28:40.380 | where the low-hanging fruit got picked
00:28:43.180 | and got harder to find new things to do.
00:28:45.540 | Although I always think that was just
00:28:47.620 | a sort of baby boomer excuse
00:28:49.220 | for covering up for the failures of that generation.
00:28:53.380 | And then I think, but I think maybe a very big picture
00:28:58.380 | part of it was that at some point in the 20th century,
00:29:05.700 | the idea took hold that not all forms
00:29:08.820 | of technological progress were simply good
00:29:12.180 | and simply for the better.
00:29:13.300 | And there's, you know,
00:29:14.220 | there's something about the two world wars
00:29:16.820 | and the development of nuclear weapons
00:29:18.900 | that gradually pushed people
00:29:21.420 | into this more risk-averse society.
00:29:23.780 | And it didn't happen overnight,
00:29:25.780 | but, you know, maybe a quarter century,
00:29:28.820 | you know, after the nuclear bomb,
00:29:29.900 | it's like- - By Woodstock it happened.
00:29:31.540 | - By Woodstock it happened.
00:29:32.500 | - Yeah, 'cause that was the same summer
00:29:33.980 | we landed on the moon.
00:29:35.340 | - Yeah, Woodstock was three weeks after that.
00:29:37.260 | - Yeah, that was the tipping point.
00:29:39.460 | - Progress stopped and the hippies took over.
00:29:41.260 | - Sacks.
00:29:42.100 | - Can we shift gears just to the domestic economy?
00:29:44.300 | What do you think's happening in the domestic economy?
00:29:46.180 | And just as backdrop,
00:29:47.660 | we've had something like 14 straight months
00:29:49.540 | of downward revisions to jobs.
00:29:51.940 | The revisions are supposed to be completely random,
00:29:53.700 | but somehow they've all been down.
00:29:55.340 | Probably doesn't mean anything.
00:29:58.740 | There's also what's happening with the yield curve,
00:30:00.740 | but I'll stop there.
00:30:01.580 | What's your take on what's happening in the economy?
00:30:04.140 | - You know, it's, man, it's always hard to know exactly.
00:30:12.780 | Yeah, I suspect we're close to a recession.
00:30:15.540 | I've probably thought this for a while.
00:30:18.620 | It's being stopped by really big government spending.
00:30:23.340 | So, you know, in May of 2023,
00:30:27.100 | the projection for the deficit in fiscal year '24,
00:30:31.460 | which is October of '23 to September '24,
00:30:34.780 | was something like 1.5, 1.6 trillion.
00:30:38.300 | The deficit's gonna come in about 400 billion higher.
00:30:41.380 | And so, you know, it was sort of a crazy deficit
00:30:44.540 | was projected and it was way off.
00:30:46.660 | And then somehow, and so if we had not found
00:30:51.260 | another 400 billion to add to, you know,
00:30:55.140 | this crazy deficit at the top of the economic cycle,
00:30:58.900 | you know, you're supposed to increase deficits
00:31:01.660 | in a recession, not at the top of the cycle.
00:31:04.700 | You know, things would be probably very shaky.
00:31:07.900 | There's some way where, yeah, we have too much debt,
00:31:14.380 | not enough sustainable growth.
00:31:16.060 | Again, I always think it comes back to, you know,
00:31:19.860 | tech innovation.
00:31:21.100 | There probably are other ways to grow an economy
00:31:23.020 | without tech or intensive progress.
00:31:27.140 | But I think those don't seem to be on offer.
00:31:32.540 | And then that's where it's very deeply stuck.
00:31:34.660 | If you wind back over the last 50 years,
00:31:37.020 | there's always a question, why did people not realize
00:31:39.380 | that this tech stagnation had happened sooner?
00:31:41.820 | And I think there were two one-time things
00:31:43.980 | people could do economically that had nothing to do
00:31:46.460 | with science or tech.
00:31:47.940 | There was a 1980s Reagan-Thatcher move,
00:31:51.380 | which was to massively cut taxes, deregulate,
00:31:54.620 | allow lots of companies to merge and combine.
00:31:57.460 | And it was sort of a one-time way to make the economy
00:32:01.660 | a lot bigger, even though it had,
00:32:03.820 | it was not something that really had
00:32:05.260 | the sort of compounding effect.
00:32:06.900 | So it led to one great decade.
00:32:09.300 | And then there was, you know,
00:32:10.900 | and that was sort of the right-wing capitalist move.
00:32:13.580 | And then in the '90s, there was sort of a Clinton-Blair
00:32:17.180 | center-left thing, which was sort of leaning
00:32:20.020 | into globalization.
00:32:21.060 | And there was a giant global arbitrage you could do,
00:32:24.180 | which also had a lot of negative externalities
00:32:26.340 | that came with it, but it sort of was a one-time move.
00:32:29.700 | I think both of those are not on offer.
00:32:32.620 | You know, I don't necessarily think
00:32:35.100 | you should undo globalization.
00:32:36.540 | I don't think you should raise taxes like crazy,
00:32:39.060 | but you can't do more globalization
00:32:42.420 | or more tax cuts here.
00:32:43.700 | That's not gonna be the win.
00:32:44.820 | And so I think you have to somehow get back to the future.
00:32:48.300 | - We have time for a couple more questions.
00:32:50.180 | You, I think, saw that maybe those Ivy League institutions
00:32:55.180 | maybe weren't producing the best and brightest
00:32:57.420 | or weren't exactly hitting their mandate,
00:33:01.060 | and you created the Thiel Fellows.
00:33:02.780 | And you've been doing that for a while.
00:33:03.900 | And I meet them all because they all have crazy ideas
00:33:05.980 | and they pitch me for angel investment.
00:33:07.700 | What have you learned getting people to quit school,
00:33:10.740 | giving them $100,000, and then how many parents call you
00:33:13.660 | and get really upset that their kids are quitting school?
00:33:16.340 | - Well, I've learned a lot.
00:33:25.140 | I mean, it's, I don't know.
00:33:30.140 | I think the universities are far worse
00:33:33.340 | than I even thought when I started this thing.
00:33:35.660 | (audience laughing)
00:33:37.140 | I think, yeah, it's, I did this debate at Yale last week,
00:33:42.140 | resolved the higher education's a bubble.
00:33:48.740 | And you should go through all the different numbers.
00:33:53.740 | And then, and again, I was careful to word it
00:33:58.540 | in such a way that I didn't have to,
00:34:00.660 | and then people kept saying, well, what's your alternative?
00:34:02.300 | What should people do instead?
00:34:03.220 | And I said, nope, that's not, was not the debate.
00:34:05.220 | I'm not, you know, I'm not your guidance counselor.
00:34:07.180 | I'm not your career counselor.
00:34:08.540 | I don't know how to solve your problems.
00:34:10.620 | But if something's a bubble, you know,
00:34:12.860 | the first thing you should do is probably not, you know,
00:34:16.300 | lean into it in too crazy a way.
00:34:18.900 | And, you know, the student debt was 300 billion in 2000.
00:34:22.500 | It's basically close to 2 trillion at this point.
00:34:26.060 | So it's just been this sort of runaway,
00:34:28.260 | this runaway process.
00:34:29.980 | And then if you look at it by cohort,
00:34:33.260 | if you graduated from college in 1997, 12 years later,
00:34:37.980 | people still had student debt,
00:34:39.200 | but most of the people had sort of paid it down.
00:34:42.460 | But the first, by 2009,
00:34:44.860 | we started the Thiel Fellowship in 2010,
00:34:46.940 | and it felt like 2009 was the first cohort
00:34:51.940 | where this really stopped.
00:34:53.620 | If you take the people graduated from college in 2009,
00:34:56.740 | and you fast forward 12 years to 2021,
00:35:01.100 | the median person had more student debt 12 years later
00:35:06.100 | than they graduated with.
00:35:08.100 | Because it's actually just, it's just compounding faster.
00:35:11.380 | And it was, you know, partially,
00:35:12.660 | partially the global financial crisis,
00:35:14.420 | the people had less well-paying jobs,
00:35:16.380 | they stayed in college longer,
00:35:17.900 | and the colleges, it's just sort of been
00:35:21.140 | this background thing where it's decayed
00:35:24.660 | in these really significant ways.
00:35:27.060 | And, you know, again, I think it's on some level,
00:35:30.980 | there are sort of a lot of debates in our society
00:35:33.780 | that are probably dominated by sort of a boomer narrative.
00:35:37.020 | And maybe the baby boomers were the last generation
00:35:39.700 | where college really worked.
00:35:41.260 | And, you know, they think, well, you know,
00:35:43.420 | I worked my way through college,
00:35:44.820 | and why can't an 18-year-old
00:35:49.020 | going to college do that today?
00:35:50.780 | And so I think the bubble will be done
00:35:55.780 | once the boomers have exited stage left.
00:35:58.660 | - But does the government need to stop?
00:35:59.980 | It would be good if we figured something out before then.
00:36:02.820 | - Does the government need to stop underwriting the loans?
00:36:06.300 | 'Cause it's the lending.
00:36:07.460 | I think the 90 plus percent of the capital
00:36:10.300 | in the student loan programs
00:36:11.620 | is funded by the federal government.
00:36:16.340 | And there's, if you're an accredited university,
00:36:18.460 | you can take out a loan and go to it.
00:36:20.100 | And accreditation, in a rigid kind of free market system,
00:36:25.100 | you would have an underwriter that says,
00:36:26.980 | are you gonna be able to graduate,
00:36:28.020 | make enough money to pay your loan off?
00:36:29.740 | Is this a good school?
00:36:30.820 | Are you gonna get a good job?
00:36:31.980 | And then the market would figure out
00:36:33.140 | whether or not to give you a loan,
00:36:34.220 | would figure out what the rate should be, and so on.
00:36:36.260 | But in this case, the government simply provides capital
00:36:38.420 | to support all of this, and as a result,
00:36:40.060 | everything's gotten more expensive.
00:36:42.580 | And the rigidity in the system
00:36:44.220 | that basically qualifies schools
00:36:45.860 | and the quality of those schools
00:36:47.140 | relative to the earning potential over time is gone.
00:36:49.700 | So we need the government to get out
00:36:50.980 | of student loan business.
00:36:52.820 | - Yeah, but look, the place where I'm,
00:36:54.740 | I know, I'm sort of, some ways I'm right wing,
00:36:56.980 | some ways I'm left wing on this.
00:36:58.180 | So the place where I'm left wing is,
00:37:00.700 | I do think a lot of the students got ripped off.
00:37:03.220 | And so I think there should be some kind of broad--
00:37:07.060 | - Relief. - Student debt forgiveness
00:37:08.140 | at this point. - Wow.
00:37:09.620 | Who should pick up the tab?
00:37:10.460 | - But it's not just the taxpayers.
00:37:12.500 | It's the universities and it's the bondholders.
00:37:16.140 | - Got it. (audience applauding)
00:37:17.300 | - The bondholders-- - Take a little bit
00:37:18.140 | out of those endowments. - And the universities.
00:37:20.580 | And then obviously, if you just make it the taxpayers,
00:37:25.180 | then you'll just, then the universities can just charge
00:37:28.100 | more and more every year-- - There's no incentive
00:37:28.940 | to change it. - There's no incentive
00:37:30.020 | to reform whatsoever.
00:37:31.380 | But-- - We've had--
00:37:32.340 | - Well, also, I mean, if you--
00:37:33.180 | - But it's in 2005, it was under Bush 43
00:37:36.580 | that the bankruptcy laws got rewritten in the US
00:37:39.740 | where you cannot discharge student debt
00:37:41.660 | even if you go bankrupt.
00:37:42.940 | And if you haven't paid it off by the time you're 65,
00:37:45.420 | your social security wage checks will be garnished.
00:37:49.020 | - It's crazy. - So I do think--
00:37:52.660 | - But should we stop lending?
00:37:53.780 | Should the federal government get out
00:37:55.180 | of the student lending business?
00:37:56.620 | - Well, if we say that, if we start with my place
00:38:01.620 | where a lot of the student debt should be forgiven,
00:38:05.540 | and then-- - And then reform
00:38:07.580 | the whole program. - And then we'll see
00:38:08.580 | how many people are willing to lend,
00:38:10.380 | how many of the colleges can pay for all the students--
00:38:14.540 | - What's your sense?
00:38:15.380 | If it was a totally free market system,
00:38:17.460 | how many colleges would shut down?
00:38:19.940 | 'Cause they wouldn't be able to,
00:38:21.180 | there's no tuition support.
00:38:23.420 | - It probably would be a lot smaller.
00:38:27.260 | It might, you might not have to shut them down
00:38:31.180 | because a lot of them have gotten extremely bloated.
00:38:34.780 | It's like Baumol's cost disease where,
00:38:36.900 | I don't know, I have no idea,
00:38:38.540 | but a place like UCLA, it probably has twice
00:38:42.700 | or three times as many bureaucrats
00:38:44.220 | as they had 30, 40 years ago.
00:38:46.660 | So there's sort of all these sort of parasitic people
00:38:50.820 | that have sort of gradually accrued,
00:38:52.940 | and so there's probably a lot of,
00:38:56.020 | there would be a lot of rational ways to dial this back,
00:38:58.580 | but yeah, maybe-- - We're gonna need
00:39:00.740 | a new location for next year.
00:39:02.500 | - If the only way to lose weight is to cut off your thumb,
00:39:05.940 | that's kind of a difficult way to go on a diet.
00:39:08.180 | - Peter, three of your collaborators,
00:39:10.540 | long-time collaborators, Elon Musk,
00:39:12.740 | Mark Zuckerberg, and Sam Altman
00:39:18.060 | are arguably the three leading AI language model leaders.
00:39:23.060 | Which one is gonna win?
00:39:27.220 | Rank them in order and tell us a little bit about each.
00:39:29.500 | (audience laughing)
00:39:32.500 | Peter said he would answer any question.
00:39:35.860 | - I said I would take any question.
00:39:37.860 | I didn't say to answer any questions.
00:39:40.220 | - You said you would honestly,
00:39:41.380 | you said today you felt extremely honest and candid,
00:39:43.540 | so let's-- - I, yeah,
00:39:45.580 | but I've already been extremely honest and candid,
00:39:47.660 | so I think I'm gonna-- (audience laughing)
00:39:49.620 | - Quartermaster.
00:39:50.460 | - It's whoever I talked to last.
00:39:53.420 | - Okay, well, okay, why don't you--
00:39:55.060 | - They're all very, very convincing people,
00:39:56.860 | so, you know, I-- - Great answer.
00:39:59.580 | Well, maybe talk a little bit about each individual.
00:40:01.340 | - I talked to Elon a while ago,
00:40:03.140 | and it was just how ridiculous it was
00:40:07.980 | that Sam Altman was getting away with turning open AI
00:40:11.060 | from a nonprofit into a for-profit.
00:40:13.780 | That was such a scam.
00:40:15.300 | If everybody was allowed to do this,
00:40:17.300 | everybody would do this.
00:40:19.140 | It has to be totally illegal, what Sam's doing,
00:40:21.820 | and it shouldn't be allowed at all,
00:40:24.020 | and that seemed really, really convincing in the moment,
00:40:27.180 | and then sort of half an hour later,
00:40:28.900 | I thought to myself that, you know,
00:40:31.020 | actually, man, it's been such a horrifically mismanaged
00:40:36.020 | place at open AI with this preposterous nonprofit board
00:40:40.780 | they had, nobody would do this again,
00:40:43.140 | and so there actually isn't much of a moral hazard from it,
00:40:45.780 | so, but yeah, whoever I talked to,
00:40:48.060 | I find very convincing in the moment.
00:40:50.620 | - Will that space just get commoditized?
00:40:52.980 | I mean, do you see a path to a monopoly there?
00:40:55.220 | - Well, again, this is, again, where, you know,
00:40:57.420 | you should, you know, attention is all you need.
00:41:00.300 | You need to pay attention to who's making money.
00:41:02.420 | It's NVIDIA, it's the hardware, the chip layer,
00:41:06.060 | and then that's just, it's just what we,
00:41:11.260 | you know, it's not what we've done in tech for 30 years.
00:41:15.220 | - Are they making 120% of the profits?
00:41:18.220 | - They're making, I think everybody else
00:41:20.620 | is losing money collectively.
00:41:21.860 | Yeah, everyone else is just spending money on the computer,
00:41:24.460 | so it's one company that's making,
00:41:26.340 | I mean, maybe there's a few other people
00:41:27.400 | that are making some money.
00:41:28.240 | I mean, I assume TSMC and ASML,
00:41:30.820 | but yeah, I think everyone else is collectively losing money.
00:41:34.260 | - What do you think of Zuckerberg's approach to say,
00:41:36.300 | I'm so far behind, this isn't core to my business,
00:41:39.260 | I'm going to open source it.
00:41:41.580 | Is that going to be the winning strategy?
00:41:43.980 | Handicap that for us.
00:41:45.140 | - Again, I would say my, again, my big qualification is,
00:41:53.500 | you know, my model is AI feels like,
00:41:58.800 | it does feel uncomfortably close to the bubble of 1999.
00:42:03.260 | So I'm, we haven't invested that much in it.
00:42:07.940 | And I want to have more clarity in investing,
00:42:12.100 | but the simplistic question is, you know,
00:42:17.100 | who's going to make money?
00:42:18.580 | You know, I think a year ago, two years ago,
00:42:21.260 | in retrospect, Nvidia would have been a good buy.
00:42:23.500 | You know, I think at this point, everybody,
00:42:25.140 | it's sort of too obvious
00:42:26.620 | that they're making too much money.
00:42:28.180 | Everyone's going to try to copy them on the chip side.
00:42:31.340 | Maybe that's straightforward to do, maybe it's not,
00:42:34.140 | but that's, no, I'd say probably you should,
00:42:38.780 | if you want to figure out the AI thing,
00:42:40.900 | you should not be asking this question about, you know,
00:42:43.700 | Meta or OpenAI or any of these things.
00:42:47.260 | You should really be focusing on the Nvidia question,
00:42:49.100 | the chips question.
00:42:49.940 | And the fact that we're not able to focus on that,
00:42:52.380 | that tells us something about how we've all been trained.
00:42:54.660 | You know, I think Nvidia got started in 1993.
00:42:57.340 | I believe that was the last year
00:42:59.640 | where anybody in their right mind
00:43:01.140 | would have studied electrical engineering
00:43:02.600 | over computer science, right?
00:43:03.940 | '94, Netscape takes off.
00:43:05.860 | And then, yeah, it's probably a really bad idea
00:43:07.940 | to start a semiconductor company even in '93,
00:43:10.420 | but the benefit is there was going to be,
00:43:12.780 | no one would come after you.
00:43:14.520 | No talented people started semiconductor companies
00:43:17.060 | after 1993 because they all went into software.
00:43:21.860 | - Score their monopoly power.
00:43:23.520 | - It's,
00:43:27.100 | I think it's quite strong
00:43:33.580 | because this history that I just gave you
00:43:35.700 | where none of us know anything about chips.
00:43:38.480 | And then I think the, you know, I think the risk,
00:43:43.200 | it's always, you know, attention is all that you need.
00:43:46.260 | The qualifier to that is, you know,
00:43:48.940 | when you get started as an actress, as a startup,
00:43:52.720 | as a company, you need attention.
00:43:55.820 | Then it's desirable to get more.
00:43:57.420 | And at some point,
00:43:58.480 | attention becomes the worst thing in the world.
00:44:00.500 | And there was the one day where Nvidia
00:44:03.380 | had the largest market cap in the world earlier this year.
00:44:07.300 | And I do think that represented a phase transition.
00:44:09.700 | Once that happened,
00:44:11.060 | they probably had more attention than was good.
00:44:14.660 | - Hey, Peter, as we wrap here,
00:44:16.160 | your brain works in a unique way.
00:44:19.260 | You're an incredible strategist.
00:44:20.620 | You think, you know, very differently
00:44:23.340 | than a lot of the folks that we get to talk to.
00:44:28.100 | With all of this, are you optimistic for the future?
00:44:31.380 | - I always, man, I always push back on that question.
00:44:36.540 | I think extreme optimism and extreme pessimism
00:44:43.460 | are both really bad attitudes.
00:44:46.340 | And they're somehow the same thing.
00:44:48.500 | You know, extreme pessimism, nothing you can do.
00:44:51.400 | Extreme optimism, the future will take care of itself.
00:44:54.980 | So if you have to have one,
00:44:56.220 | it's probably you wanna be somewhere in between,
00:44:58.380 | maybe mildly optimistic, mildly pessimistic.
00:45:01.300 | But, you know, I believe in human agency
00:45:04.540 | and that it's up to us.
00:45:05.780 | And it's not, you know,
00:45:06.940 | it's not some sort of winning a lottery ticket
00:45:09.140 | or some astrological chart that's gonna decide things.
00:45:12.860 | And I believe in human agency.
00:45:14.860 | And that's sort of an axis that's very different
00:45:17.180 | from optimism or pessimism.
00:45:18.780 | - What a great place. - Extreme optimism,
00:45:19.920 | extreme pessimism, they're both excuses for laziness.
00:45:23.020 | - What an amazing place to end.
00:45:24.860 | Ladies and gentlemen, give it up for Peter Thiel.
00:45:27.420 | - Thank you, thank you, Peter.
00:45:28.680 | - Come on now. - Thanks, man.
00:45:30.660 | - Wow. - All right.
00:45:31.860 | - Peter Thiel.