
All-In Summit: In conversation with Vinod Khosla


Chapters

0:00 Besties welcome Vinod Khosla to All-In Summit ‘23!
2:28 “Most VCs add negative value”
5:22 AI & OpenAI
10:37 “The need to work will disappear”
14:03 “Capitalism is by permission of democracy”
15:26 Universal basic income
17:04 AI value creation beyond LLMs
21:09 Incumbents don’t innovate
23:52 Fusion
27:07 Autonomous public transit
30:10 Asset bubbles and the state of venture
33:13 Timing the market

Whisper Transcript

00:00:00.000 | Please join me welcoming Vinod Khosla to stage.
00:00:03.880 | Vinod?
00:00:06.120 | (audience cheers)
00:00:08.080 | How are you?
00:00:08.920 | - How are you doing?
00:00:09.740 | - Legend.
00:00:10.580 | Legend.
00:00:11.420 | (upbeat music)
00:00:14.000 | (upbeat music)
00:00:32.360 | Legend, how are you?
00:00:33.600 | - Legend.
00:00:34.440 | - Vinod, good to see you.
00:00:35.640 | Just by way of background, I think we all know Vinod.
00:00:38.280 | Vinod is a legend and a friend.
00:00:43.240 | His Indian Army officer dad wanted his kid to join the army
00:00:46.760 | and Vinod decided instead to go into business
00:00:49.760 | after reading about Intel and getting inspired
00:00:52.120 | by co-founder Andy Grove, an immigrant who got funding
00:00:55.680 | for his startup in Silicon Valley.
00:00:57.380 | Vinod later got a master's in biomedical engineering
00:01:00.960 | from Carnegie Mellon on a full scholarship.
00:01:03.240 | Stanford GSB rejected him twice.
00:01:05.600 | He finally got accepted and received his MBA in 1980
00:01:08.920 | and he worked at several startups, some of which failed
00:01:11.880 | and then one of them worked out pretty good,
00:01:14.360 | Sun Microsystems.
00:01:15.840 | Started in 1982 with his Stanford classmates,
00:01:19.160 | Scott McNealy, Andy Bechtolsheim, and Bill Joy.
00:01:22.560 | Vinod raised $300,000 in seed capital from venture firm
00:01:27.600 | Kleiner Perkins Caufield & Byers,
00:01:29.920 | and within five years, Sun made a billion dollars
00:01:32.560 | in annual revenue.
00:01:33.960 | He became a GP at Kleiner where he helped create NexGen,
00:01:37.160 | which he sold to AMD for 28% of its market cap,
00:01:39.760 | the first successful Intel microprocessor clone company
00:01:43.200 | and his largest return to date,
00:01:45.640 | maybe we'll talk about this today, was Juniper Networks
00:01:48.020 | where the firm's $3 million investment in the 90s
00:01:50.580 | earned $7 billion.
00:01:51.620 | In 2004, he launched Khosla Ventures.
00:01:56.980 | 2006, I pitched him and he rejected me
00:02:00.220 | as an investor in my company.
00:02:01.620 | I came back in 2011 and he led my Series B
00:02:06.060 | and I had three competing offers and I picked Vinod.
00:02:08.560 | So Vinod's been a great partner.
00:02:11.480 | By the way, I will also give you guys,
00:02:13.680 | I had offers from, I guess I'll say it,
00:02:16.360 | Founders Fund and Andreessen Horowitz for my Series B.
00:02:19.200 | Vinod gave me the lowest valuation
00:02:21.520 | and I still took his offer.
00:02:23.040 | - Entry price matters.
00:02:23.880 | - Did he add more value?
00:02:25.400 | - Vinod's whole point was VCs typically add negative value.
00:02:29.760 | Other VCs say, "We're gonna put a senator on your board
00:02:32.260 | that we know, that knows this and that firm."
00:02:34.600 | But the board members they brought
00:02:36.180 | and the partners they brought to the table for me
00:02:38.200 | were incredible.
00:02:39.280 | And he's been a great mentor and advisor.
00:02:42.600 | I'll also say I always gave people advice on Vinod
00:02:44.720 | 'cause I get a lot of reference calls like,
00:02:45.960 | should I take money from Vinod?
00:02:47.000 | I heard he's a pain in the ass sometimes.
00:02:48.920 | And I'm like, let me tell you something about Vinod.
00:02:51.360 | I've never told you this in public, but I said,
00:02:53.520 | the thing about Vinod is you go into a meeting with Vinod
00:02:56.960 | and he's like the Oracle in The Matrix with Neo.
00:03:00.160 | You go in the kitchen and she says,
00:03:02.200 | "Know thyself."
00:03:03.540 | She pulls out the chocolate chip cookies.
00:03:05.680 | He says, "Am I the one?"
00:03:07.320 | She says, "You're not the one."
00:03:08.960 | Kicks him out.
00:03:09.920 | He has to ultimately decide what the right decision is.
00:03:12.840 | And Vinod will cast for you a vision of all the things
00:03:16.120 | that you can become and push you
00:03:18.800 | and try and make you extend your vision for your business
00:03:21.360 | and what is possible.
00:03:22.900 | And through that thought, through that conversation,
00:03:25.160 | through the visioneering that he can help you do
00:03:26.960 | as an entrepreneur, sometimes you need to listen
00:03:29.320 | and sometimes you need to ignore
00:03:30.380 | the bullshit advice he gives you.
00:03:32.240 | But I will say that the role he plays is really important
00:03:36.360 | and I've seen it work across some of the most
00:03:38.440 | important companies in Silicon Valley and the world today.
00:03:40.960 | So Vinod, thanks so much for being with us.
00:03:41.800 | - Can I just interrupt with one comment?
00:03:44.400 | - Yes.
00:03:45.240 | - You know, advice to all the entrepreneurs.
00:03:47.300 | Most VCs wanna be your friend.
00:03:49.820 | And so they are hypocritically polite
00:03:53.960 | instead of brutally honest in a way
00:03:56.040 | that can actually help you make a better decision.
00:03:59.160 | And that's sort of religion for me.
00:04:01.520 | I prefer brutal honesty to hypocritical politeness.
00:04:05.520 | And when people don't do that,
00:04:07.040 | they are actually harming the entrepreneur
00:04:10.040 | and having the entrepreneur like them more.
00:04:12.200 | That's not my game.
00:04:13.280 | - And is that what you mean when you say
00:04:14.840 | most VCs add negative value?
00:04:16.740 | - Well, they take a very short-term approach.
00:04:20.000 | Just going through a situation.
00:04:24.800 | A company's being recapped; they asked us to participate.
00:04:27.880 | I said, "You are hanging on to revenue
00:04:31.240 | "that is bad revenue.
00:04:33.160 | "If you get rid of it and restart the company
00:04:37.220 | "with positive revenue in the right technology approach,
00:04:42.060 | "we'll participate and even lead."
00:04:45.320 | But they hung on to revenue since the beginning of this year.
00:04:48.320 | By the way, three years ago,
00:04:49.620 | I asked them to take a technology approach
00:04:51.900 | as opposed to a people approach to the business.
00:04:54.920 | They did not because revenue came fast,
00:04:57.600 | the technology took time to develop.
00:05:00.720 | And they didn't invest in it.
00:05:02.280 | Now they're going back.
00:05:04.280 | I just got an email today saying
00:05:06.200 | they are happy to take our approach
00:05:08.680 | of really investing in the technology, lower burn rate,
00:05:12.160 | not try and show higher revenue so they can be acquired,
00:05:16.200 | instead build a company.
00:05:18.000 | That's fundamentally the kind of thing
00:05:19.920 | I run into constantly.
00:05:22.360 | - So I wanna start our conversation with AI.
00:05:25.680 | I'm gonna kick this off.
00:05:27.280 | You invested in OpenAI.
00:05:29.120 | You've been investing in AI technology
00:05:31.080 | as people are framing it today for some time.
00:05:33.480 | Can you share with us a little bit
00:05:35.400 | about how you became an investor in OpenAI
00:05:37.760 | in the context of the AI trajectory
00:05:40.520 | that you've been seeing for some time?
00:05:42.640 | And also maybe give us a little color
00:05:44.120 | 'cause we haven't had this question answered on our show
00:05:46.280 | about how OpenAI went from non-profit to for-profit
00:05:48.960 | and what the future is for OpenAI.
00:05:50.800 | - Well, let me tell you a more general story first,
00:05:53.760 | starting with the internet.
00:05:55.280 | In 1982, when we started Sun,
00:05:58.680 | we bet on the internet.
00:06:00.440 | Only company to bet on the internet.
00:06:04.360 | 1996, since you brought up Juniper,
00:06:07.640 | the internet was growing, but not yet clear.
00:06:11.920 | The senior management of every major telco
00:06:15.920 | in the United States told me
00:06:17.480 | they would never adopt TCP/IP for the internet.
00:06:21.240 | And I said, "Fine, we'll build Juniper to build TCP/IP."
00:06:26.760 | Cisco told me they would never do TCP/IP
00:06:29.720 | above what is probably your home service today, OC12.
00:06:34.240 | Never.
00:06:35.080 | And so we built TCP/IP.
00:06:39.680 | Because we've seen the internet on an exponential
00:06:42.880 | and the flat part of the exponential
00:06:45.120 | looked very uninteresting and nobody believed in it.
00:06:50.000 | We invested in Juniper for the 2,500x return for us
00:06:54.200 | at Kleiner, one of the largest returns in venture,
00:06:57.680 | at least when Wall Street Journal did a piece
00:07:00.480 | on the Snap IPO of largest venture returns ever.
00:07:04.440 | But we believed in something and did that.
00:07:07.560 | How does it relate to OpenAI?
00:07:11.080 | Probably year 2000, I gave an interview
00:07:16.080 | with the New York Times.
00:07:18.400 | My son just sent it to me, so I remember it.
00:07:21.440 | And I said, "AI will be powerful enough
00:07:24.120 | "we don't know when, but we'll have to redefine
00:07:29.120 | "what it means to be human."
00:07:32.120 | And I understand Stephen Wolfram
00:07:35.200 | had some interesting comments in that area,
00:07:37.520 | so happy to talk about it.
00:07:38.960 | 10 years ago, I wrote two blogs.
00:07:42.880 | The first one was called Do We Need Doctors?
00:07:46.200 | With the premise that AI would replace
00:07:48.480 | the expertise of doctors and let them do
00:07:51.480 | the things humans do really well.
00:07:54.520 | So 20% doctor, 80% AI is the thing I defined.
00:07:59.040 | And I wrote another blog called Do We Need Teachers?
00:08:03.080 | Because the only way to scale education
00:08:06.080 | at an equal level for every kid was with AI.
00:08:09.560 | That was January of 2012, so about 12 years ago.
00:08:14.560 | So five years ago, late 2018 is when I started
00:08:19.200 | talking to Sam, the opportunity came up
00:08:22.160 | to do something with OpenAI.
00:08:25.120 | I said, "This is a clear bet.
00:08:27.280 | "I couldn't predict when things will emerge,
00:08:29.920 | "how much capability will happen, how fast."
00:08:32.760 | But you didn't need to know that.
00:08:35.360 | Whether it was in three years, five years, or 10 years,
00:08:39.360 | I was pretty convinced it would be pretty phenomenal
00:08:43.320 | and there'd be practical applications along the way
00:08:46.920 | where it would have a large impact
00:08:48.440 | long before we got to AGI.
00:08:50.840 | So we placed a bet, we placed a long-term bet.
00:08:53.840 | You know, this audience is gonna hear from Bob Mumgaard.
00:08:57.840 | You know, five years ago I placed a bet on fusion
00:09:01.120 | for the same reason.
00:09:03.120 | If we get it to work, these are humongous returns,
00:09:06.920 | big bets, but reasonably high probability
00:09:10.440 | of success initially, but very high payback.
00:09:13.640 | And I was so convinced in OpenAI,
00:09:16.320 | I placed twice, our initial investment
00:09:20.280 | was twice the largest previous investment I'd made
00:09:23.960 | in 40 years in venture capital.
00:09:26.800 | Two X, the largest bet I'd ever placed
00:09:29.240 | initially in a company.
00:09:31.040 | - How did that go from non-profit to for-profit
00:09:33.400 | and raise venture money?
00:09:34.760 | - Well, you know, those are details.
00:09:38.760 | (audience laughing)
00:09:41.280 | But I would say the following.
00:09:43.440 | There are plenty of non-profits in Europe
00:09:48.440 | that have for-profit arms, right?
00:09:52.920 | It's a very good model to sustain and generate profit
00:09:57.480 | for a non-profit to scale.
00:09:59.560 | Like I think the European model,
00:10:01.200 | some of these examples are really, really great.
00:10:04.400 | IKEA, Bosch, they have very large non-profit ownership.
00:10:08.640 | And I think that's a very good model to scale.
00:10:12.120 | I mean, I just mentioned, do we need doctors?
00:10:14.240 | My son's building a primary care doctor AI.
00:10:18.240 | My wife's doing a tutor AI as a non-profit.
00:10:23.240 | It really doesn't matter how you do it.
00:10:26.520 | The question is, how can you gather the resources
00:10:29.680 | to scale the social good you're trying to do,
00:10:32.480 | whether with a doctor or with a tutor
00:10:35.600 | for every kid on the planet?
00:10:37.240 | - There's a lot of discussion about
00:10:39.760 | how jobs might be displaced because of AI.
00:10:43.160 | You've seen, I guess, three or four
00:10:45.720 | of these amazing efficiency cycles.
00:10:50.540 | What's your take on that?
00:10:51.520 | Because this one does feel qualitatively different,
00:10:55.360 | and it does seem to be moving at a faster pace.
00:10:57.480 | So the speed at which AI is moving,
00:11:00.120 | and just the leaps and bounds
00:11:02.080 | of how it's making people augmented,
00:11:04.000 | you're saying 80% AI, 20% doctor.
00:11:07.640 | What do you think this does to jobs,
00:11:09.080 | and do you have those concerns of displacement
00:11:11.200 | that a lot of people seem to be concerned about?
00:11:13.800 | - I think it's a genuine concern.
00:11:15.560 | - Okay.
00:11:16.400 | - The path to getting to that point.
00:11:20.520 | So let me start with that 20, 25 years from now.
00:11:24.840 | I think the need to work will disappear.
00:11:27.240 | People will work because they want to work,
00:11:30.640 | not because they need to work.
00:11:32.520 | And nobody wants a job doing the same thing
00:11:35.680 | for 30 years on an assembly line at GM.
00:11:38.680 | There's a set of jobs nobody wants to do
00:11:41.960 | that we shouldn't really call jobs.
00:11:45.560 | They're absolute--
00:11:47.240 | - Drudgery.
00:11:49.640 | - Yeah.
00:11:50.880 | So hopefully those disappear.
00:11:53.160 | I love my job, so 25 years from now, health permitting,
00:11:57.040 | I'll still be doing the same thing, likely.
00:11:59.960 | I'm 68, so, you know,
00:12:02.520 | and Warren Buffett's gone a lot longer than I have.
00:12:06.160 | I'm very passionate about what I do.
00:12:08.180 | I enjoy it so much, I do it for free,
00:12:10.440 | and I'll keep doing it, and other people will.
00:12:12.960 | There will be new jobs.
00:12:15.800 | What you call jobs are what people get paid for.
00:12:20.000 | So is X Games a sport, is it a job?
00:12:23.480 | I don't know, I don't call it a job,
00:12:25.520 | but it is a passion that can pay.
00:12:27.440 | So I do think the nature of jobs will disappear.
00:12:33.440 | Now, societies' transition from today
00:12:38.440 | to that vision of not needing to work if you don't want to
00:12:42.020 | will be very messy, very political,
00:12:46.700 | very, I would say, controversial.
00:12:51.580 | And I think some countries
00:12:54.660 | will move much faster than others.
00:12:58.760 | I suspect, let's take something I'm very hawkish on, China.
00:13:03.760 | They will take a Tiananmen Square tactic
00:13:06.880 | to enforcing what they think is right for their society
00:13:10.440 | over a 25-year period.
00:13:12.680 | We will have a political process,
00:13:14.560 | which I prefer to the Chinese style.
00:13:17.720 | - Like the debate over the tanks.
00:13:20.080 | - Well, debate always improves things,
00:13:22.280 | but it also does slow down.
00:13:24.000 | And certain societies may choose not to adopt these.
00:13:29.000 | And I think the societies that do that will fall behind.
00:13:33.260 | And I think that's a massive disadvantage
00:13:38.060 | for our political system.
00:13:39.700 | We may choose not to do it.
00:13:41.180 | We've done it before.
00:13:43.820 | But we've also seen massive transitions before.
00:13:46.700 | Agriculture was 50% of US employment in 1900.
00:13:50.820 | By 1970, it was 4%.
00:13:53.700 | So most jobs in that area,
00:13:55.900 | the biggest area of employment, got displaced.
00:13:59.060 | We did manage the transition.
00:14:01.020 | - Well, cost went down and productivity went up.
00:14:03.140 | - Yeah.
00:14:04.500 | So I do think we will have to rethink.
00:14:08.220 | So the one thing I would say is
00:14:09.940 | capitalism is by permission of democracy,
00:14:14.620 | and it can be revoked, that permission.
00:14:17.640 | And I worry that
00:14:22.660 | this messy transition to the vision I'm very excited about
00:14:27.100 | may cause us to change systems.
00:14:30.300 | I do think the nature of capitalism will have to change.
00:14:33.700 | I'm a huge fan of capitalism.
00:14:36.060 | I'm a capitalist.
00:14:37.560 | But I do think the set of things you optimize for,
00:14:41.740 | when economic efficiency isn't the only criterion,
00:14:46.500 | will have to be increased.
00:14:48.860 | So disparity is one that will be increased.
00:14:51.220 | I first wrote a piece in 2014 in Forbes.
00:14:56.220 | It was a long piece that said AI will cause great abundance,
00:15:01.200 | great GDP growth, great productivity growth,
00:15:04.180 | all the things economists measure,
00:15:06.380 | and increasing income disparity.
00:15:09.060 | That was 2014.
00:15:10.740 | And I think I ended that to close this answer off
00:15:14.560 | with we will have to consider UBI in some form.
00:15:18.980 | And I do think that will be a solution,
00:15:22.260 | but meaning for human beings
00:15:24.060 | will be the largest problem we face.
00:15:26.100 | - Which was my follow-up, which is if you give people UBI,
00:15:28.980 | do you have concerns of the idle mind and what happens?
00:15:33.980 | We've seen in other states when 25% of young men
00:15:39.160 | maybe don't have jobs,
00:15:40.560 | their minds wander and bad things happen.
00:15:42.380 | And so what do you think the societal issues will be
00:15:46.300 | if we do offer UBI?
00:15:49.040 | - I think we will have issues,
00:15:51.800 | but we will eliminate other issues.
00:15:55.640 | There's pros and cons to everything.
00:15:58.080 | It's very hard to predict how complex systems behave.
00:16:01.860 | And it's very hard for me to sit here and say,
00:16:05.440 | I have a clear vision of how that happens.
00:16:08.240 | But I, for one, am addicted to learning.
00:16:12.080 | I know other people addicted to producing music.
00:16:16.400 | All of those are reasonable things
00:16:18.640 | for people to find meaning in, or personal meaning.
00:16:21.820 | Yeah, there's others, like I said,
00:16:25.200 | just wanna perfect their participation in a certain sport.
00:16:30.200 | I love X Games because it wasn't even remotely
00:16:33.720 | considered a sport not that long ago.
00:16:36.140 | So things, passions will emerge.
00:16:39.560 | And I think if people have very early in their life
00:16:43.000 | the ability to pursue passions,
00:16:44.900 | I hope that adds the meaning human beings will need.
00:16:48.920 | And in some sense, that's what I was talking about in 2000
00:16:51.920 | when I said we'll have to redefine
00:16:54.560 | what it means to be human.
00:16:57.280 | Your assembly line job won't define you for your whole life.
00:17:01.960 | That'd be a terrible thing.
00:17:04.760 | Yeah.
00:17:05.600 | - In the AI landscape,
00:17:07.840 | just going back to that for a second,
00:17:10.440 | you have a lot of these big companies
00:17:12.200 | who seem to have been caught flat-footed,
00:17:13.960 | so they're acting pretty aggressively.
00:17:15.380 | I think even yesterday there was a leak of something
00:17:18.340 | where Meta's trying to really superpower Llama 2
00:17:21.000 | and then create some high-performance cloud
00:17:23.260 | that they effectively wanna give away.
00:17:25.740 | We talked about this a little bit on the pod,
00:17:27.520 | but the idea seems like the big companies
00:17:29.560 | wanna scorch the earth on the foundational model side.
00:17:33.040 | Then you have some other big companies
00:17:34.380 | that are working on silicon.
00:17:35.680 | So can you just walk us through in your mind
00:17:38.180 | how you've organized, where the value's getting created,
00:17:41.540 | where there'll be product value,
00:17:43.260 | but no economic value, et cetera, et cetera?
00:17:45.340 | - Yeah, look, going back to OpenAI,
00:17:48.120 | part of my interest in funding OpenAI
00:17:51.400 | was when we looked at it, when I looked at it,
00:17:54.780 | there were two major centers of excellence in AI.
00:17:57.900 | There was Google.
00:17:58.900 | - DeepMind.
00:17:59.740 | - They had DeepMind and Google Brain, but Google.
00:18:02.700 | And then there was an effort at Baidu.
00:18:05.380 | And I was very worried the Chinese might win that.
00:18:09.820 | I'm hawkish on China.
00:18:11.660 | I have no trust there.
00:18:14.660 | And I love Graham Allison's book, "Destined for War."
00:18:18.660 | I'm a complete believer in that thesis.
00:18:20.900 | But I thought it'd be valuable to create a third center,
00:18:27.700 | and that was part of the motivation in funding OpenAI.
00:18:32.180 | I'm really glad Meta's doing it,
00:18:34.900 | and others are trying to do it.
00:18:36.820 | And you know, there's efforts in every country,
00:18:39.700 | every country wants their national AI.
00:18:42.180 | I think more competition,
00:18:43.700 | especially in the set of countries
00:18:46.420 | subscribing to Western values, is good.
00:18:51.260 | And I think that will sort out.
00:18:53.200 | But I would say the following.
00:18:55.220 | Almost all the efforts I have seen are very limited.
00:19:00.220 | They're all trying to scale LLMs.
00:19:03.820 | And you know, I spend my time looking at
00:19:06.780 | what besides LLM will play an important role.
00:19:11.780 | I haven't seen one effort among the majors
00:19:15.460 | that isn't following the same model.
00:19:17.980 | More NVIDIA GPUs, scale the parameters, more data,
00:19:22.980 | instead of saying, are there other methods?
00:19:28.140 | So I'll mention, you know,
00:19:29.660 | we made an investment in a symbolic logic idea.
00:19:32.780 | Would that be dramatically additive?
00:19:35.220 | And, you know, just like OpenAI
00:19:37.980 | didn't look that interesting five years ago,
00:19:40.580 | 2018, late '18 is when we really made our decision to invest.
00:19:45.580 | It took a while to get organized.
00:19:48.940 | But I'm saying, what else would look like that in 2028?
00:19:54.780 | - Yeah.
00:19:55.740 | - Yeah, would it be symbolic logic?
00:19:57.740 | I'll invest in anybody doing that.
00:20:00.140 | - Are you worried about-- - Is it probabilistic
00:20:01.580 | programming, is it completely other ways,
00:20:05.300 | like multi-agent systems?
00:20:07.420 | Are there going to be other axes
00:20:11.420 | on which we will see progress?
00:20:12.900 | - Said differently, do you need to make sure
00:20:14.540 | there are other axes so that we're not looking back
00:20:16.820 | a few years from now and there's just massive CUDA lock-in
00:20:19.900 | and nothing can happen without one gatekeeper?
00:20:22.980 | Is that-- - Well, the great thing
00:20:24.060 | about venture and venture capitalists
00:20:26.140 | is they like to place long bets.
00:20:28.260 | So I'll place all those long bets on alternative ways
00:20:32.180 | to learn and not replace, but add to LLMs,
00:20:36.100 | I think is the more likely scenario, though, who knows?
00:20:39.140 | - Yeah.
00:20:39.980 | - But even among just, imagine LLMs do it all.
00:20:46.180 | It's good that there's multiple players.
00:20:50.020 | And so more big companies pop up, great.
00:20:53.020 | I also do think governments will have their own efforts.
00:20:57.900 | China, of course, has one, but plenty of other rumblings
00:21:02.500 | of governments trying to put together
00:21:05.020 | an effort in my country.
00:21:07.140 | - Tell us the-- - And that's all good.
00:21:09.100 | - Tell us, up-leveling away from just AI,
00:21:11.260 | but broadly speaking venture, give us the state of the union
00:21:14.580 | of venture capital in September '23.
00:21:17.020 | - So let me give you a perspective.
00:21:20.540 | I've been doing venture for some 40 years.
00:21:25.140 | In that time, I have not seen,
00:21:29.140 | and people get surprised at this,
00:21:30.900 | one example of a large innovation
00:21:34.580 | that came from a large company or institution.
00:21:37.260 | So the only thing you can do, not one.
00:21:42.300 | (audience applauding)
00:21:44.580 | The only two examples I could find were in the early '70s,
00:21:49.100 | which was, if you go back 50 years,
00:21:51.300 | Bank of America put debit and credit
00:21:56.260 | on a plastic card.
00:21:57.420 | Like, that wasn't even a major innovation,
00:21:59.660 | but I'll give them that.
00:22:01.700 | But I couldn't think of examples like that.
00:22:04.300 | - Apple with the iPhone, maybe?
00:22:06.060 | - That was Steve Jobs.
00:22:08.340 | I think it was a founder-led,
00:22:09.980 | I should have said founder-led innovation.
00:22:12.340 | - Okay, got it.
00:22:13.180 | - Is a more accurate description.
00:22:15.460 | - I like that one.
00:22:16.300 | - You know, did General Motors do driverless cars,
00:22:19.380 | or Waymo?
00:22:21.020 | Did, would somebody at Avis do Uber?
00:22:26.020 | Would somebody at Hilton do Airbnb?
00:22:31.860 | Would somebody, and Elon's speaking next,
00:22:35.940 | but somebody at Lockheed Martin do SpaceX or Rocket Lab?
00:22:40.900 | We're investors in Rocket Lab, so I like it a little more.
00:22:43.860 | They do have a superior strategy,
00:22:48.140 | but SpaceX has done a great job.
00:22:51.020 | But think, think of every large social change.
00:22:57.460 | Is there a large company that played?
00:23:02.540 | You know, the only thing somebody came up with
00:23:04.420 | was debt structuring that the bankers did.
00:23:09.420 | And I said, they didn't take any risk, right?
00:23:14.100 | - No innovation.
00:23:14.940 | - They took a risk with somebody else's money.
00:23:16.780 | Other people's money, easy to play with,
00:23:19.020 | whether you're in finance or in government.
00:23:22.740 | Other people's money is easy.
00:23:24.380 | But my idea is innovation comes from groups like this.
00:23:29.500 | And I'm completely convinced, I lost the original question,
00:23:33.180 | but this is where innovation will come from.
00:23:35.020 | - State of venture capital.
00:23:35.860 | - What's the state of venture capital?
00:23:36.700 | - I can ask Bob on that.
00:23:37.540 | - Yeah, so any large area,
00:23:39.660 | you'll be hearing from Bob Mumgaard,
00:23:41.420 | you know, four or five years ago we invested in fusion.
00:23:44.700 | Now we may fail, I think we are much more likely
00:23:47.940 | to succeed sitting here today than fail, but it's possible.
00:23:52.660 | - Can you talk about that just while you're on Fusion?
00:23:56.020 | We've got both Bob and David from Helion tomorrow morning.
00:23:59.340 | And there's broadly, call it six or some number
00:24:02.900 | of general architectural approaches to solving the--
00:24:06.180 | - Yeah, there's six approaches, a dozen companies.
00:24:08.900 | I think it's--
00:24:10.020 | - Well, now there's 70, but yeah.
00:24:11.860 | I mean, it's just ballooning.
00:24:12.820 | But how do you think about the role that a Commonwealth
00:24:16.100 | or even an open AI, I guess,
00:24:17.740 | how big of a role does capital play in getting there?
00:24:22.460 | 'Cause what makes Commonwealth different
00:24:24.580 | is how much capital they've raised,
00:24:26.180 | billion dollar plus at this point, open AI similar.
00:24:29.820 | The decision as an investor,
00:24:34.260 | how much of it is predicated on this team
00:24:36.620 | or this architecture getting it right?
00:24:37.460 | - So let me tell you the Commonwealth story
00:24:39.140 | before Bob has a chance to tell you.
00:24:41.060 | - Is he here by the way?
00:24:41.900 | - Yeah, yeah.
00:24:42.900 | - Yeah, you got it.
00:24:45.140 | - I'm sure he's here somewhere
00:24:46.500 | because I'm meeting him in half an hour.
00:24:48.300 | - Oh, yeah, you must be.
00:24:49.140 | (audience laughing)
00:24:49.980 | - Backstage.
00:24:50.820 | But here's the thing.
00:24:54.540 | They may have raised a lot of money.
00:24:56.900 | It took double digit millions to handle
00:25:00.460 | the single biggest risk in building the architecture, right?
00:25:04.740 | Which was, can you build a 20 Tesla magnet?
00:25:07.100 | - Yeah, the magnet.
00:25:07.940 | - Right?
00:25:08.780 | And Bob and I had this discussion.
00:25:11.140 | I said, if fusion doesn't work,
00:25:15.100 | but you can build a 20 Tesla magnet,
00:25:17.140 | we got lots of interesting applications in nuclear medicine.
00:25:20.900 | - Yeah.
00:25:21.740 | - You know, a fusion reactor is a particle accelerator.
00:25:24.940 | - That technical breakthrough, we can pivot somewhere else.
00:25:26.940 | - Yeah, nuclear medicine, MRI machines, lots of areas.
00:25:30.900 | So let's go build with double digit millions,
00:25:33.740 | which is not a huge amount in the context of the upside
00:25:37.780 | of maybe a bigger upside than the 2,500x Juniper had.
00:25:42.500 | If you solve the fusion problem,
00:25:44.540 | there's still a 10 X upside
00:25:46.020 | if you just solve the MRI machine problem.
00:25:48.820 | - So you're thinking of probabilities.
00:25:50.460 | - Well, today we are not, but when we started,
00:25:54.780 | that was clearly to look at off ramps, right?
00:25:58.060 | Like where else could we go
00:26:00.220 | if we only had double digit millions?
00:26:02.180 | We built the magnet.
00:26:04.080 | If we didn't build the magnet, we failed.
00:26:06.100 | That's venture capital.
00:26:07.320 | - So how do you know that that bet, that investment,
00:26:10.140 | at this point, is still a good investment,
00:26:12.220 | or you keep putting more capital in,
00:26:13.820 | if there's now six competing architectures,
00:26:15.720 | any one of which could have their own breakthrough,
00:26:17.540 | that could yield better economics?
00:26:19.620 | - So we've continued to invest in it,
00:26:22.300 | mostly because I think Bob's just a phenomenal CEO.
00:26:26.140 | And if anybody can make this happen,
00:26:29.140 | and the world needs it to happen, Bob can.
00:26:33.460 | But I'm also optimistic, and as, you know,
00:26:35.980 | we invest in Sam too, he invested in Helion.
00:26:39.480 | I think there's more than one approach
00:26:41.120 | that could be successful.
00:26:42.440 | In fact, I guess 15 years from now,
00:26:45.440 | we'll be looking at one,
00:26:46.600 | more than one company that's successful.
00:26:48.600 | - Sax, you had a question I wanted to make sure
00:26:50.040 | you got to ask.
00:26:50.880 | - Let's let Vinod finish that.
00:26:53.000 | - Okay, I thought it was finished, yeah.
00:26:55.400 | So my view is large problems like that,
00:27:00.040 | you'd think GE would solve, not a chance.
00:27:03.000 | - Not a chance.
00:27:03.840 | - Right.
00:27:04.660 | - Not a chance, not a chance.
00:27:05.820 | - Yeah.
00:27:06.660 | - You know, you take any of these,
00:27:09.340 | one of my favorite, you take large problems.
00:27:12.860 | One of my biggest beefs is the traffic in cities.
00:27:16.640 | I actually think we are very much on track
00:27:19.420 | to replace most cars in most cities in 25 years.
00:27:24.380 | By building a different kind of public transit system.
00:27:27.540 | Anybody wanna invest in it?
00:27:29.000 | I'm really excited about it.
00:27:32.620 | You know, Waymo went the way of autonomous cars.
00:27:37.060 | For consumers or taxis,
00:27:40.980 | autonomous cars in public transit
00:27:45.980 | will increase throughput.
00:27:48.180 | The only thing fixed in cities is lanes, right?
00:27:50.660 | You have a road of a certain size,
00:27:52.100 | you don't destroy buildings on both sides.
00:27:55.040 | So if you can increase passengers per hour through it,
00:27:58.900 | and by the way, bicycles don't.
00:28:01.100 | We love bicycles, but they don't.
00:28:03.660 | Scooters don't.
00:28:05.060 | Public transit autonomously driven in small parts
00:28:11.020 | will make public transit on demand,
00:28:13.420 | so unscheduled, on demand, any time of the day or night,
00:28:17.420 | and point to point.
00:28:19.740 | You don't stop for anybody else to get on.
00:28:22.140 | It's a beautiful way to change cities.
00:28:25.980 | And I think it'll happen.
00:28:26.980 | There's no question about it.
00:28:28.580 | - So like Uber Pool, but with a larger mini bus type format.
00:28:32.220 | - No, no, no.
00:28:33.060 | You don't wanna go to mini bus
00:28:34.220 | because then you'll have to stop
00:28:35.660 | for other people to get on.
00:28:37.180 | - So just autonomous.
00:28:38.020 | - If it's two or four person vehicles,
00:28:40.580 | it's faster than your private car.
00:28:43.060 | It doesn't stop at traffic lights.
00:28:45.420 | It's on demand.
00:28:46.620 | You don't have to park it.
00:28:48.020 | I could go on and on.
00:28:50.420 | I don't wanna spend too much time on one particular idea,
00:28:53.060 | but all I'm saying is,
00:28:54.740 | whether it's fusion or public transit,
00:28:57.660 | and we're doing 3D printed buildings.
00:29:00.180 | Another one of my favorites is a new,
00:29:03.620 | and you were asking about this,
00:29:06.140 | a music model to produce music
00:29:08.740 | that is not trained on any YouTube or public music.
00:29:13.460 | So completely IP-free of any constraints.
00:29:17.780 | Like I'm really excited about it.
00:29:19.220 | Little outfit in Australia,
00:29:21.540 | and they've just released a product
00:29:26.220 | that would be the Midjourney for music.
00:29:28.380 | Like really exciting.
00:29:30.740 | - What's your favorite genre of music?
00:29:32.980 | What do you like to listen to?
00:29:33.820 | - I'm not a huge music fan, to be honest.
00:29:35.860 | I hate to admit it.
00:29:37.460 | But I love a great challenge
00:29:40.620 | when this entrepreneur came to me and said,
00:29:43.180 | "Hey, in some period of time, say 10 years,"
00:29:46.340 | and this was four or five years ago,
00:29:48.780 | and long before AI was a big thing.
00:29:50.860 | He said, "I wanna produce a music, a song,
00:29:55.740 | a top 10 music song untouched by a human."
00:29:59.380 | I said, "I love a challenge like that, let's go."
00:30:02.340 | It literally took me half an hour to say,
00:30:04.380 | "I'm in, no diligence, didn't need all that."
00:30:08.460 | (audience laughing)
00:30:09.460 | - David?
00:30:10.300 | - Yeah, I think where Chamath was gonna go a minute ago
00:30:12.540 | about the State of the Union and VC
00:30:14.740 | is that we talk a lot on the pod about the fact
00:30:17.100 | that we've just been through an asset bubble
00:30:18.700 | that popped last year.
00:30:20.980 | And it really inflated during COVID
00:30:23.860 | with all the money printing that went on,
00:30:25.140 | and it may have been inflating before that
00:30:27.060 | with these ZIRP policies going back, I don't know, 15 years.
00:30:30.220 | I guess my question to you is,
00:30:33.500 | how much fakeness do you think that created
00:30:36.700 | in the ecosystem?
00:30:38.420 | Or is there any way to even quantify it?
00:30:41.420 | I mean, I think we see now that the size of venture funds
00:30:44.500 | is contracting, that new funds are gonna be smaller.
00:30:48.340 | Probably there's a lot of VCs
00:30:49.460 | who shouldn't even be in business.
00:30:50.580 | I think you'll be happy about that.
00:30:52.980 | We've estimated on the pod
00:30:55.060 | there's 1,400 unicorns right now.
00:30:56.620 | We've tried to sort of get, well, are half of them fake?
00:30:59.340 | I guess my question to you is,
00:31:02.420 | how much did this asset bubble sort of inflate
00:31:05.420 | everything in VC?
00:31:07.060 | And where do you think it stands right now?
00:31:09.220 | - You know, it inflated things a lot in certain areas,
00:31:12.980 | and it deflated things a lot in certain areas.
00:31:16.620 | But let me give you two analogies to understand this.
00:31:20.140 | You know, investors,
00:31:21.380 | and there's a fair number of investors I hear here,
00:31:23.980 | only care about two emotions, fear and greed.
00:31:26.900 | And they bounce between the two,
00:31:29.060 | and there's nothing in the middle.
00:31:31.060 | And so the key to being a good investor
00:31:33.500 | is staying focused on the long-term in the middle.
00:31:37.420 | So any time you get a hype cycle,
00:31:40.940 | you'll get inflation, and then you will get deflation.
00:31:44.740 | But let me ask you the following question.
00:31:47.380 | 1998, there was a dot-com bubble, then a bust.
00:31:52.900 | Tell me, and there was clearly a change
00:31:55.340 | in Wall Street prices and valuations and all that.
00:31:58.660 | There was absolutely no bust
00:32:01.500 | in the amount of internet traffic.
00:32:03.860 | So if you look at internet traffic,
00:32:06.820 | you can't tell when the dot-com bust happened.
00:32:10.180 | So I'm saying the reality of using the internet
00:32:13.260 | didn't change just because people had inflated values
00:32:16.340 | and then deflated them.
00:32:17.980 | That's just a fear reaction and a greed reaction.
00:32:21.060 | Same thing happened with the original bubble
00:32:23.020 | in the 1830s in England.
00:32:25.740 | If you got the right to build a railroad between two cities,
00:32:29.260 | you could offer scrips on the market, big bubble.
00:32:32.780 | Then a big collapse.
00:32:34.460 | But more railroad was built in the 10 years
00:32:38.060 | after the bubble collapsed in England in the 1830s
00:32:42.100 | than before.
00:32:43.580 | My point is the reality of the underlying business,
00:32:46.260 | if you're adding value, does not change that much.
00:32:50.060 | It can accelerate a little bit or slow down
00:32:52.260 | because less resources available.
00:32:54.700 | But fundamentals is a good way to invest,
00:32:58.260 | sort of the Warren Buffett method,
00:33:00.620 | not the short-term, hey, he's valuing it twice,
00:33:04.300 | so let me pay up, and then you'll pay twice as much as that
00:33:08.860 | or SoftBank will later.
00:33:11.020 | I think that--
00:33:14.860 | - So I think it's a question of you can't time the market.
00:33:17.940 | - You can't time the market.
00:33:18.780 | - As long as you know that this is an investable technology,
00:33:22.060 | an investable trend, timing what price you're coming in at,
00:33:26.020 | I mean, this is an old adage in markets.
00:33:27.980 | - You know, it doesn't really matter
00:33:29.780 | what price you're coming in at.
00:33:31.500 | You know, in inflationary times, you pay a little bit more,
00:33:35.700 | but you get less dilution later
00:33:38.500 | because other people are investing,
00:33:40.220 | and then when it goes into a deflationary cycle,
00:33:44.140 | you just have to learn to respond quickly.
00:33:47.300 | And good investors will help guide entrepreneurs correctly
00:33:50.860 | through all those phases.
00:33:52.020 | We've seen a lot of that in climate investing.
00:33:54.380 | - Yeah, for sure.
00:33:55.660 | - You know, very low valuations.
00:33:58.300 | Look, when we invested in Impossible Foods,
00:34:00.900 | it was a $3 million valuation, pre-money.
00:34:04.340 | - Crazy idea.
00:34:05.300 | - Crazy idea, nobody thought,
00:34:08.180 | plant proteins wasn't a word.
00:34:10.220 | Another one of my favorite projects
00:34:13.580 | is replace soy as a protein source on this planet,
00:34:16.460 | and I think we are well on our way to do that.
00:34:19.140 | I'm very excited.
00:34:20.620 | But at that level, it doesn't matter
00:34:23.980 | whether a valuation in the end goes up or down
00:34:26.940 | by a factor of two.
00:34:28.500 | It doesn't really matter if you're adding fundamental value
00:34:31.820 | and long-term things.
00:34:33.620 | - I know that your long-term view
00:34:35.820 | is critical to making Silicon Valley work.
00:34:38.660 | I think that we all kind of appreciate your leadership,
00:34:41.940 | your support, your mentorship,
00:34:43.260 | and the fact that you're willing
00:34:45.300 | to make those big bets in very low probability outcomes,
00:34:50.220 | but if they work, the return is so much more
00:34:52.740 | than that multiple, that ratio that you're getting.
00:34:56.940 | And it matters so much.
00:34:58.740 | - I just wanted to add, it's just inspiring
00:35:00.540 | how much you share with all of us who are behind you
00:35:03.300 | and how much we've all learned from you.
00:35:05.220 | I didn't mention it, but you've taken a lot of time
00:35:07.220 | for Chamath, you've taken time for me and my career,
00:35:09.620 | and I just think it's like a great gift to us.
00:35:12.380 | - I can end with a quick story.
00:35:13.940 | - We really do have to jump.
00:35:16.340 | - I emailed, I mailed, paper mailed,
00:35:19.820 | 40 or 50 people to try to get a job in VC in 2000.
00:35:23.220 | The only single person who sent me
00:35:25.500 | a handwritten rejection letter: VK.
00:35:28.340 | - Hey!
00:35:29.180 | (audience applauding)
00:35:30.300 | Give it up.
00:35:31.140 | - Thank you, everybody.
00:35:32.820 | (audience applauding)
00:35:35.900 | - Another standing ovation, wow.
00:35:37.580 | - Thank you.
00:35:38.420 | - Thank you, sir.
00:35:39.260 | - That was great.
00:35:40.100 | - Thank you.
00:35:40.940 | - That was great.
00:35:41.780 | (upbeat music)
00:35:44.380 | ♪ Let your beat, let your beat ♪
00:35:49.380 | ♪ Let your winners ride ♪
00:35:55.940 | ♪ And I said ♪
00:35:58.900 | ♪ We open sourced it to the fans ♪
00:36:00.220 | ♪ And they've just gone crazy ♪
00:36:02.140 | ♪ Love you besties ♪
00:36:02.980 | ♪ I'm the queen of quinoa ♪
00:36:07.980 | ♪ Besties are back ♪
00:36:13.780 | - I'm gonna have my dog take a piss in your driveway.
00:36:15.780 | (laughing)
00:36:18.020 | - I am, I am.
00:36:20.300 | - The natural VVF voice.
00:36:21.140 | - We should all just get a room
00:36:22.380 | and just have one big huge orgy
00:36:23.940 | 'cause they're all just useless.
00:36:25.180 | It's like sexual tension
00:36:26.660 | that we just need to release somehow.
00:36:28.460 | - Let your beat, let your beat.
00:36:31.060 | - Let your beat, let your beat.
00:36:33.660 | - We need to get merch.
00:36:35.500 | ♪ Besties are back ♪
00:36:36.340 | ♪ I'm going all in ♪
00:36:39.340 | (upbeat music)
00:36:41.940 | ♪ I'm going all in ♪
00:36:46.980 | Thanks for watching!