Chamath Palihapitiya: Money, Success, Startups, Energy, Poker & Happiness | Lex Fridman Podcast #338
Chapters
0:00 Introduction
1:05 Childhood and forgiveness
14:49 Money and happiness
21:40 Poker
25:07 Mistakes
35:58 Early jobs
37:35 Facebook
55:21 Energy
1:03:01 Cloud computation
1:07:17 Fixing social media
1:17:08 Trump's Twitter ban
1:22:13 Kanye West
1:33:25 All-In Podcast
1:42:41 Nuclear war
1:54:17 Startups
2:02:48 Work-life balance
2:13:57 Teamwork
2:25:18 Energy transition
2:35:51 Silicon Valley culture
2:39:10 Activism culture
2:43:32 Advice for young people
2:50:07 Meaning of life
00:00:00.000 |
In terms of your mistakes, society tells you don't make them because we will judge you and 00:00:06.160 |
we will look down on you. And I think the really successful people realize that actually, no, 00:00:11.200 |
it's the cycle time of mistakes that gets you to success. Because your error rate will diminish 00:00:18.400 |
the more mistakes that you make. You observe them, you figure out where it's coming from. 00:00:23.120 |
Is it a psychological thing? Is it a, you know, cognitive thing? And then you fix it. 00:00:27.440 |
The following is a conversation with Chamath Palihapitiya, a venture capitalist and engineer, 00:00:35.680 |
founder and CEO of Social Capital, previously an early senior executive at Facebook, and 00:00:43.520 |
co-host of the All-In Podcast, a podcast that I highly recommend for the wisdom and the camaraderie 00:00:50.960 |
of the four co-hosts, also known as besties. This is the Lex Fridman Podcast. To support it, 00:00:58.400 |
please check out our sponsors in the description. And now, dear friends, here's Chamath Palihapitiya. 00:01:04.560 |
You grew up in a dysfunctional household on welfare. You've talked about this before. 00:01:11.280 |
What were, for you personally, psychologically, some difficult moments in your childhood? 00:01:16.480 |
I'll answer that question in a slightly different way, which is that I think when you grow up 00:01:21.440 |
in a household that's defined by physical abuse and psychological abuse, 00:01:28.480 |
you're hypervigilant all the time. And so it's actually easier for me to point to moments where 00:01:35.760 |
I was happy or I felt compassion or I felt safe. Otherwise, every moment, I'll give you a couple 00:01:43.840 |
of examples. I was thinking about this a while ago. There was a tree outside of my apartment 00:01:51.840 |
where we lived when I was growing up, and my father sometimes would make me go outside 00:01:58.160 |
to take the tree branch that he would hit me with. And so you can imagine if you're a 10, 00:02:04.560 |
11-year-old kid and you have to deal with that, what do you do? Well, a hypervigilant child 00:02:09.920 |
learns how to basically estimate the strength of these branches. How far can he go before it breaks? 00:02:19.680 |
You have to estimate his anger and estimate the effective strength of branches and bring back 00:02:27.120 |
something. Because I remember these moments where he would look at it and then he would 00:02:31.760 |
make me go out again and get a different one. Or there was a certain belt that he wore 00:02:39.600 |
that had this kind of belt buckle that stuck out. And you just wanted to make sure if that was the 00:02:47.440 |
thing that you were going to get hit by, that it wasn't the buckle facing out because that really 00:02:52.960 |
hurt. And so you became hyperaware of which part of the buckle was facing out versus facing in in 00:02:59.840 |
those moments. And there are hundreds of these little examples, which essentially I would say 00:03:06.080 |
the through line is that you're just so on edge. And you walk into this house and you're just 00:03:12.080 |
basically trying to get to the point where you leave the house. And so in that microcosm of 00:03:19.520 |
growing up, any moment that's not like that is seared in my memory in a way that I just can't 00:03:26.400 |
describe to a person. I'll give you an example. I volunteered when I was in grade five or six, 00:03:33.920 |
I can't remember which it was, in the kindergarten of my school. And I would just go and the teacher 00:03:40.000 |
would ask you to clean things up. And at the end of that grade five year, she took me and two other 00:03:47.920 |
kids to Dairy Queen. And I'd never gone to a restaurant, literally, because we just didn't 00:03:56.720 |
have the money. And I remember the first time I tasted this Dairy Queen meal, it was like a 00:04:02.160 |
hamburger, fries, a Coke, and a Blizzard. And I was like, "What is this?" And I felt 00:04:09.040 |
so special, because you're getting something that most people would take for granted. Oh, 00:04:15.280 |
it's a Sunday or I'm really busy, let me go take my kid to fast food. I think that until I left 00:04:22.560 |
high school, I think, and this is not just specific to me, but a lot of other people, 00:04:30.000 |
you're in this hypervigilant loop, punctuated with these incredibly visceral moments of compassion 00:04:37.120 |
by other people. A different example, we had such a strict budget and we didn't have a car. 00:04:45.680 |
And so I was responsible with my mom to always go shopping. And so I learned very early on how to 00:04:54.320 |
look for coupons, how to buy things that were on sale or special. And we had a very basic diet, 00:05:00.240 |
because you have to budget this thing really precisely. But at the end of every year where I 00:05:07.920 |
lived, there was a large grocery chain called Loblaws. And Loblaws would discount a cheesecake 00:05:16.800 |
from $7.99 to $4.99. And my parents would buy that once a year. And we probably did that six or seven 00:05:26.480 |
times. And you can't imagine how special we felt, myself, my two sisters. We would sit there, 00:05:33.760 |
we would watch the New Year's Eve celebration on TV, we would cut this cheesecake into five pieces. 00:05:41.360 |
It felt like everything. So that's sort of how my existence was when I was at that age; 00:05:50.720 |
for better or for worse, that's how I remember it. 00:05:53.520 |
The hypervigilant loop, is that still with you today? What are the echoes of that still with 00:05:59.920 |
you today, the good and the bad? If you put yourself 00:06:05.840 |
in the mind of a young child, the thing that that does to you is at a very core basic level, 00:06:18.000 |
it says you're worthless. Because if you can step outside of that and you think about any child in 00:06:25.200 |
the world, they don't deserve to go through that. And at some point, by the way, I should tell you, 00:06:32.080 |
I don't blame my parents anymore. It was a process to get there, but I feel like they did the best 00:06:37.440 |
they could. And they suffered their own issues and enormous pressures and stresses. And so, 00:06:45.120 |
I've really, for the most part, forgiven them. >> How did you, sorry to interrupt, 00:06:50.960 |
let go of that blame? >> That was a really long process where, 00:06:56.800 |
I would say the first 35 years of my life, I compartmentalized and I avoided all of those 00:07:03.360 |
memories. And I sought external validation. Going back to this self-worth idea, if you're taught as 00:07:12.320 |
a child that you're worthless, because why would somebody do these things to you? It's not because 00:07:16.640 |
you're worth something, you think to yourself, very viscerally, you're worth nothing. And so 00:07:22.800 |
then you go out and you seek external validation. Maybe you try to go and get into a great college, 00:07:27.600 |
you try to get a good job, you try to make a lot of money, you try to demonstrate in superficial 00:07:34.800 |
ways with the car you drive or the clothes you wear that you deserve people to care about you, 00:07:40.320 |
to try to make up for that really deep hole. But at some point, it doesn't get filled in. 00:07:47.360 |
And so you have a choice. And so for me, what happened was in the course of a six-month period, 00:07:52.880 |
I lost my best friend and I lost my father. And it was really like the dam broke loose, 00:07:59.040 |
because the compartmentalization stopped working because the reminder 00:08:04.160 |
of why I was compartmentalizing was gone. And so I had to go through this period of disharmony 00:08:12.640 |
to really understand and steel man his perspective. And can you imagine trying to do that, 00:08:22.400 |
to go through all of the things where you have to now look at it from his perspective 00:08:27.520 |
and find compassion and empathy for what he went through? And then I shifted the focus to my mom and 00:08:36.000 |
I said, "Well, you were not the victim, actually. You were somewhat complicit as well, because 00:08:41.920 |
you were of sound mind and body and you were in the room when it happened." 00:08:45.200 |
So then I had to go through that process with her and steel man her perspective. 00:08:50.240 |
At the end of it, I never justified what they did, but I've been able to forgive what they did. 00:08:56.720 |
I think they did the best they could. And at the end of the day, they did the most important thing, 00:09:03.920 |
which is they gave me and my sisters a shot by emigrating, by giving up everything, 00:09:10.160 |
by staying in Canada and doing whatever it took between the two of them 00:09:14.240 |
to sort of claw and scrape together enough money to live so that my sisters and I could have a 00:09:21.120 |
shot. And I'm very thankful for them. Could they have done better? Obviously. But I'm okay with 00:09:26.400 |
what has taken place. But it's been a long process of that steel manning so that you 00:09:31.840 |
can develop some empathy and compassion and forgive. >> Do you think, if you talked to 00:09:36.720 |
your dad, shortly after he died and you went through that process, or today, you'd be able to 00:09:43.680 |
have the same strength to forgive him? >> I think it would be a very 00:09:53.360 |
complicated journey. I think I've learned to be incredibly open about what has happened. 00:10:01.920 |
And all of the mistakes I've made, I think it would require him to be pretty radically honest 00:10:11.440 |
about confirming what I think he went through, because otherwise it just wouldn't work. Otherwise, 00:10:18.960 |
I would say, let's keep things where they are, which is I did the work with people that have 00:10:24.800 |
helped me, obviously. But it's better for him to just... hopefully he's looking 00:10:30.960 |
from some place and he's thinking it was worth it. I think he deserves to think that about all of this, 00:10:36.800 |
because I think the immigrant challenge, or not even the immigrant challenge, the 00:10:42.000 |
lower middle class challenge, anybody who really wants better for their kids and doesn't have a 00:10:46.560 |
good toolkit to give it to them, some of them just choke up on the bat. They just get so 00:10:54.160 |
agitated about this idea that all this sacrifice will not be worth it, 00:10:58.880 |
that it spills out in really unproductive ways. And I would put him in that category. 00:11:04.080 |
Yeah. And without self-evaluation, introspection, they have tunnel vision, so they're not 00:11:10.960 |
able to often see the damage that they did. I mean, I know, like yourself, a few successful people that 00:11:19.440 |
had very difficult relationships with their dad. And when you take the perspective of the dad, 00:11:27.040 |
they're completely in denial about any of it. So if you actually have a conversation, 00:11:31.840 |
there would not be a deep honesty there. And I think that's maybe in part the way of life. 00:11:38.160 |
Yeah. And I remember pretty distinctly after I left and in my middle 30s, where 00:11:46.720 |
by all measure, I had roughly become reasonably successful. And my dad didn't particularly care 00:11:54.560 |
about that, which was so odd, because I had to confront the fact that whether it was a title or 00:12:01.520 |
money or press clippings, he never really cared. He moved on to a different set of goals, which was 00:12:07.520 |
more about my character and being a good person to my family and really preparing me to lead our 00:12:16.000 |
family when he wasn't there. And that bothered me, because I thought I got to the finish line, 00:12:22.080 |
and I thought there was going to be a medal, meaning I can tell you, Lex, he never told me 00:12:27.840 |
that he loved me. I'm not sure if that's normal or not. It was my normality. And I thought there's 00:12:33.840 |
going to be something, some gold star, which never appeared. And so that's a hard thing 00:12:40.000 |
to confront, because you're like, well, now what is this all about? Was this all just a ruse? 00:12:49.200 |
But then I realized, well, hold on a second. There were these moments where in his way, again, 00:12:54.960 |
putting yourself in his shoes, I think he was trying to say he was sorry. He would hold my hand, 00:13:01.200 |
and he would interlock the fingers, which I felt that's a really intimate way of holding 00:13:06.000 |
somebody's hand, I think. So I remember those things. So these are the things that are just 00:13:11.920 |
etched at least in my mind. And at the end of it, I think I've done a decent job in repairing 00:13:20.240 |
my relationship with him, even though it was posthumous. >> It does make me wonder in which way 00:13:26.720 |
you and I, we might be broken and not see it. We might be hurting others and not see it. 00:13:33.200 |
>> Well, I think that when you grow up in those kinds of environments, and they're all different 00:13:38.400 |
kinds of this kind of dysfunction, but if what you get from that is that you're not worthwhile, 00:13:45.200 |
you're less than many, many other people, when you enter adulthood or semi-adulthood in your 00:13:53.920 |
early 20s, you will be in a cycle where you are hurting other people. You may not know it. 00:14:00.640 |
Hopefully, you find somebody who holds you accountable and tells you and loves you enough 00:14:05.920 |
through that. But you are going to take all of that disharmony in your childhood, and you're 00:14:11.920 |
going to inject that disharmony into whether it's your professional relationships or your 00:14:16.640 |
personal relationships or both until you get to some form of rock bottom and you start to repair. 00:14:23.280 |
And I think there's a lot of people that resonate with that because they have each suffered their 00:14:29.760 |
own things that at some point in their lives have told them that they're less than. And then they go 00:14:37.120 |
and cope. And when you cope, eventually those coping mechanisms escalate, and at some point, 00:14:44.560 |
it'll be unhealthy, either for you or, oftentimes, for the people around you. 00:14:48.800 |
>> Well, from those humble beginnings, you are now a billionaire. How has money changed your life, 00:14:57.920 |
or maybe the landscape of experience in your life? Does it buy happiness? 00:15:04.560 |
>> It doesn't buy happiness, but it buys you a level of comfort for you to really amplify what 00:15:12.240 |
happiness is. I kind of think about it in the following way. Let's just say that there's a 00:15:18.320 |
hundred things on a table, and the table says, "Find happiness here." And there are different 00:15:25.120 |
prices. The way that the world works is that many of these experiences are cordoned off a little bit 00:15:33.520 |
behind a velvet rope where you think that there's more happiness as the prices of things escalate. 00:15:41.760 |
If you live in an apartment, you admire the person with the house. If you live in a house, 00:15:47.360 |
you admire the person with the bigger house. That person admires the person with an island. 00:15:55.120 |
Right? Some person drives their car, admires the person who flies, who admires the person who flies 00:16:01.440 |
business class, who admires the person who flies first class, then private. There's all of these escalations 00:16:07.680 |
on this table, and most people get to the first five or six. And so they just naturally assume 00:16:14.720 |
that items seven through a hundred is really where happiness is found. And just to tell you 00:16:24.240 |
about the finish line, I've tried all hundred and back, and I've tried two hundred more beyond it, 00:16:29.120 |
and happiness isn't there. But it does give you a level of comfort. I read a study, 00:16:36.720 |
and I don't know if it's true or not, but it said that the absolute maximal link between 00:16:45.840 |
money and happiness is around $50 million. And it was just like a social studies kind of thing, 00:16:54.240 |
that I think one of the Ivy Leagues put out. And underneath it, the way that they explained it was 00:16:58.560 |
because you could have a home, you could have all kinds of the creature comforts, you could take 00:17:03.120 |
care of your family, and then you were left to ponder what it is that you really want. 00:17:08.960 |
I think the challenge for most people is to realize that this escalating arms race of 00:17:16.640 |
more things will solve your problems is not true. More and better is not the solution. 00:17:24.880 |
It's this idea that you are on a very precise journey that's unique to yourself. You are 00:17:33.920 |
playing a game of which only you are the player. Everybody else is an interloper, 00:17:41.840 |
and you have a responsibility to design the gameplay. And I think a lot of people don't 00:17:47.440 |
realize that, because if they did, I think they would make a lot of different decisions about how 00:17:51.840 |
they live their life. And I still do the same thing. I mean, I revert to basically running around 00:17:59.120 |
asking other people, "What will make you like me more? What will make me more popular in your eyes?" 00:18:06.080 |
And I try to do it, and it never works. It is just a complete dead end. 00:18:12.880 |
Are there negative aspects to money? Like, for example, it becoming harder to find people you 00:18:12.880 |
can trust? I think the most negative aspect is that it amplifies a 360-degree view of your 00:18:26.240 |
personality, because there are a lot of people, and society tells you that more money is actually 00:18:34.480 |
better. You are a better person somehow, and you're factually more worthwhile than some other 00:18:39.760 |
people that have less money. That's also a lie. But when you're given that kind of attention, 00:18:46.160 |
it's very easy for you to become a caricature of yourself. That's probably the single worst thing 00:18:52.240 |
that happens to you. But I say it in the opposite way. I think all I've ever seen in Silicon Valley, 00:18:57.360 |
as an example, is that when somebody gets a hold of a lot of money, it tends to cause them to become 00:19:04.960 |
exactly who they were meant to be. They're either a kind person, they're either a curious person, 00:19:10.800 |
they're either a jerk, they're either cheap. They can use all kinds of masks, but now that there's 00:19:16.800 |
no expectations and society gives you a get-out-of-jail-free card, you start to behave 00:19:21.120 |
the way that's most comfortable to you. So you see somebody's innate personality. 00:19:25.680 |
And that's a really interesting thing to observe, because then you can very quickly bucket sort 00:19:29.680 |
where do you want to spend time and who is really additive to your gameplay, 00:19:34.560 |
and who is really a negative detractor to your gameplay. 00:19:38.640 |
You're an investor, but you're also a kind of philosopher. You analyze the world 00:19:45.360 |
from all its different perspectives on the All-In Podcast, on Twitter, everywhere. 00:19:52.320 |
Do you worry that money puts you out of touch from being able to truly empathize with the experience 00:20:02.480 |
of the general population, which in part, first of all, on a human level, that could be limiting, 00:20:08.320 |
but also as an analyst of human civilization, that could be limiting? 00:20:12.080 |
I think it definitely can for a lot of people, because it's an abstraction for you to stop 00:20:18.640 |
caring. I also think the other thing is that you can very quickly, especially in today's world, 00:20:27.520 |
become the scapegoat. Just to use a Girardian lens, as in René Girard, if you think about mimetic 00:20:33.440 |
theory in a nutshell, we're all competing for these very scarce resources that we are told 00:20:40.880 |
are worthwhile. And if you view the world through that Girardian lens, what are we really doing? 00:20:47.920 |
We are all fighting for scarce resources, whether that's Twitter followers, money, 00:20:53.760 |
acclaim, notoriety, and we all compete with each other. And in that competition, 00:20:58.960 |
Girard writes, "The only way you escape that loop is by scapegoating something or somebody." 00:21:06.640 |
And I think we are in that loop right now where just the fact of being successful is a thing that 00:21:13.200 |
one should scapegoat to end all of this tension that we have in the world. I think that it's a 00:21:20.480 |
little misguided, because I don't think it solves the fundamental problem. And we can talk about 00:21:25.920 |
what the solution to some of these problems are, but that's, I think, the loop that we're all 00:21:31.040 |
living. And so if you become a caricature and you feed yourself into it, I mean, you're not doing 00:21:37.040 |
anything to really advance things. >> Your nickname is The Dictator. How'd 00:21:41.840 |
you get the nickname? We're talking about the corrupting nature of money. 00:21:45.440 |
>> That came from poker. In a poker game, when you sit down, it's chaos, especially like in our 00:21:51.840 |
home game. There's a ton of big egos. There's people always watching, rail birding the game, 00:21:57.840 |
all kinds of interesting folks. And in that, somebody needs to establish hygiene and rules. 00:22:04.640 |
And I really care about the integrity of the game. And it would just require somebody to just say, 00:22:10.000 |
"Okay, enough." And then people were just like, "Okay, stop dictating." And that's where that 00:22:17.920 |
speaking of which, is the greatest poker player of all time, and why is it Phil Hellmuth? 00:22:22.800 |
>> Exactly. You know, Muth probably knew this question was coming. Here's what I'll say. I 00:22:28.000 |
think Hellmuth is the antidote to computers, more than any other player playing today. 00:22:35.120 |
And when you see him in a heads up situation, so I think he's played nine or 10 heads up 00:22:42.160 |
tournaments in a row. And he's played, basically call it 10 of the top 20 people so far. 00:22:47.440 |
And he's beaten all but one of them. When you're playing heads up, one V one, that is the most 00:22:58.960 |
GTO understandable spot, meaning game theory optimal position. That's where computers can 00:23:05.680 |
give you an enormous edge. The minute you add even a third player, the value of computers and the 00:23:11.440 |
value of their recommendations basically falls off a cliff. So one way to think about it is Hellmuth 00:23:18.720 |
is forced to play against people that are essentially trained like AIs. And so to be 00:23:24.240 |
able to beat, you know, eight out of nine of them means that you are playing so orthogonally 00:23:29.840 |
to what is considered game theory optimal, and you're overlaying human reasoning. 00:23:35.360 |
The judgment to say, well, in this spot, I should do x, but I'm going to do y. It's not 00:23:40.560 |
dissimilar in chess, like what makes, you know, Magnus Carlsen so good. You know, sometimes he 00:23:45.280 |
takes these weird lines, he'll sacrifice positions, you know, he'll overplay certain positions for 00:23:50.560 |
certain, you know, bishops versus knights and all of these spots that are very confusing. 00:23:54.800 |
And what it does is it throws people off their game. I think he just won a recent 00:23:59.360 |
online tournament and it's like by move six, there is no GTO move for his opponent to make, 00:24:07.280 |
because it's like out of the rule book. Maybe he read some game. You know, I read the quote, 00:24:11.280 |
it was like he probably read some game in some bar in Russia in 1954, memorized it. 00:24:16.560 |
And all of a sudden by six moves in, the computer AI is worthless. So that's what makes Hellmuth 00:24:21.440 |
great. There is one person that I think is superior. And I think it's what Daniel also 00:24:31.520 |
said, and I would echo that because I played Phil as well, but Phil Ivey is the most well-rounded, 00:24:41.280 |
cold-blooded, bloodthirsty animal. He's just, and he sees into your soul, Lex, in a way where 00:24:49.920 |
you're just like, oh my God, stop looking at me. Have you ever played him? Yeah, yeah, we've played. 00:24:55.520 |
We've played and you know, he crushes the games, crushes the games. So what does feeling crushed 00:25:01.440 |
mean and feel like in poker? Is it like that you just can't read at all? You're being constantly 00:25:07.440 |
pressured. You feel off balance. You try to bluff and the person reads you perfectly, 00:25:13.040 |
that kind of stuff. - This is a really, really excellent question because I think this has 00:25:17.600 |
parallels to a bunch of other things. Okay, let's just use poker as a microcosm to explain a bunch 00:25:23.280 |
of other systems or games. Maybe it's running a company or investing. Okay, so let's use those 00:25:31.520 |
three examples, but we use poker to explain it. What does success look like? Well, success looks 00:25:36.880 |
like you have positive expected value, right? In poker, the simple way to summarize that is 00:25:42.320 |
your opponent, let's just say you and I are playing, is going to make a bunch of mistakes. 00:25:48.560 |
There's a bunch of it that's going to be absolutely perfect. And then there's a few 00:25:52.400 |
spots where you make mistakes. And then there's a bunch of places in the poker game where I play 00:25:57.840 |
perfectly and I make a few mistakes. Basically, your mistakes minus my mistakes is the edge, 00:26:07.440 |
right? That's how poker works. If I make fewer mistakes than you make, I will make money and I 00:26:14.880 |
will win. That is the objective of the game. Translate that into business. You're running a 00:26:20.240 |
company. You have a team of employees. You have a pool of human capital that's capable of being 00:26:26.160 |
productive in the world and creating something. But you are going to make mistakes in making that. 00:26:33.280 |
Maybe it doesn't completely fit the market. Maybe it's mispriced. Maybe it actually doesn't require 00:26:40.160 |
all of the people that you need, so the margins are wrong. And then there's the competitive set 00:26:45.040 |
of all the other alternatives that customer has. Their mistakes minus your mistakes is the expected 00:26:52.880 |
value of Google, Facebook, Apple, et cetera. Now take investing. Every time you buy something, 00:27:00.640 |
somebody else on the other side is selling it to you. 00:27:03.440 |
Is that their mistake? We don't know yet. But their mistakes minus your mistakes is how you 00:27:12.480 |
make a lot of money over long periods of time as an investor. Somebody sold you Google at $40 a 00:27:18.080 |
share. You bought it and you kept it. Huge mistake on their part, minimum mistakes on your part. 00:27:24.000 |
The difference of that is the money that you made. So life can be summarized in many ways in that 00:27:30.480 |
way. So the question is, what can you do about other people's mistakes? And the answer is nothing. 00:27:35.520 |
That is somebody else's game. You can try to influence them. You could try to subvert them. 00:27:41.760 |
Maybe you plant a spy inside of that other person's company to sabotage them. I guess 00:27:46.720 |
there are things at the edges that you can do. But my firm belief is that life success really 00:27:52.720 |
boils down to how do you control your mistakes? Now this is a bit counterintuitive. The way you 00:28:00.240 |
control your mistakes is by making a lot of mistakes. So taking risks is somehow a way to 00:28:08.800 |
minimize the number of mistakes. Let's just say you want to find love. You want to find somebody 00:28:13.760 |
you're deeply connected with. Do you do that by not going out on dates? Yes. Sorry. You're the 00:28:24.080 |
only person that thinks that's the answer to that question. No, I'm joking. I'm joking. No, 00:28:27.120 |
but you know what I mean? You have to date people. You have to open yourself up. You have to be 00:28:31.280 |
authentic. You give yourself a chance to get hurt. But you're a good person. So you know what happens 00:28:38.560 |
when you get hurt? That is actually their mistake. And if you are inauthentic, that's your mistake. 00:28:44.800 |
That's a controllable thing in you. You can tell them the truth, who you are, and say, 00:28:50.320 |
here's my pluses and minuses. My point is there are very few things in life 00:28:55.360 |
that you can't break down, I think, into that very simple idea. 00:28:59.520 |
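A minimal Python sketch of the "their mistakes minus your mistakes" framing described above; the mistake rates, the unit stake, and the payoff rule are made-up assumptions for illustration, not anything stated in the conversation.

```python
import random

def simulate_edge(opponent_mistake_rate, your_mistake_rate, rounds=100_000, stake=1.0, seed=0):
    """Toy model: a round only moves money when exactly one side makes a mistake."""
    rng = random.Random(seed)
    profit = 0.0
    for _ in range(rounds):
        they_err = rng.random() < opponent_mistake_rate
        you_err = rng.random() < your_mistake_rate
        if they_err and not you_err:
            profit += stake   # their mistake, your gain
        elif you_err and not they_err:
            profit -= stake   # your mistake, their gain
    return profit / rounds    # average profit per round

# Expected edge per round is simply (their rate - your rate) * stake = 0.04 here.
print(f"observed edge per round: {simulate_edge(0.12, 0.08):.3f}")
```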
And in terms of your mistakes, society tells you don't make them because we will judge you, 00:29:07.840 |
and we will look down on you. And I think the really successful people realize that actually, 00:29:12.800 |
no, it's the cycle time of mistakes that gets you to success. Because your error rate will diminish 00:29:20.320 |
the more mistakes that you make. You observe them. You figure out where it's coming from. 00:29:25.040 |
Is it a psychological thing? Is it a cognitive thing? And then you fix it. 00:29:30.000 |
So the implied thing there is that in business and investing, in poker, in dating, 00:29:38.000 |
in life, there's this platonic GTO, game theory optimal thing out there. And so when you 00:29:44.640 |
say mistakes, you're always comparing to that optimal path you could have taken. 00:29:49.520 |
I think about it slightly differently. I would say mistake is maybe a bad proxy, but it's the best proxy I 00:29:55.040 |
have for learning. But I'm using the language of what society tells you. 00:30:01.440 |
Society tells you that when you try something and it doesn't work, it's a mistake. So I just 00:30:07.360 |
use that word because it's the word that resonates most with most people. 00:30:14.000 |
Yeah, it's like in neural networks, it's loss. 00:30:17.760 |
Yeah, right. So you're using the mistake that is most, the word that is most understandable, 00:30:23.680 |
especially by the way people experience it. I guess most of life is a sequence of mistakes. 00:30:29.520 |
The problem is when you use the word mistake and you think about mistakes, it actually has a 00:30:34.320 |
counterproductive effect of you becoming conservative in just being risk averse. 00:30:40.800 |
So if you flip it and say, try to maximize the number of successes, 00:30:47.760 |
somehow that leads you to take more risk. Mistake scares people. 00:30:54.400 |
I think mistakes scare people because society likes these very simplified boundaries of who 00:31:04.000 |
is winning and who is losing. And they want to reward people who make traditional choices 00:31:12.400 |
and succeed. But the thing is, what's so corrosive about that is that they're actually not even 00:31:23.280 |
being put in a position to actually make a quote unquote mistake and fail. 00:31:27.680 |
So I'll give you an example: if you look at getting into an elite school, 00:31:31.200 |
society rewards you for being in the Ivy Leagues in a way that, in my opinion incorrectly, 00:31:38.800 |
doesn't reward you for being in a non-Ivy League school. There's a certain level of status 00:31:43.680 |
and presumption of intellect and capability that comes with being there. 00:31:47.520 |
But that system doesn't really have a counterfactual because it's not as if you can both 00:31:54.480 |
go to MIT and Ohio State, and then we can see two versions of Lex Fridman so that we can figure out 00:32:00.960 |
that the jig is up and there was no difference. And so instead it reinforces this idea that there 00:32:08.080 |
is no truth seeking function. There is no way to actually make this thing whole. And so it tells 00:32:15.040 |
you, you have to get in here. And if you don't, your life is over. You've made a huge mistake 00:32:19.360 |
or you've failed completely. And so you have to find different unique ways of dismantling this. 00:32:26.720 |
This is why part of what I've realized where I got very lucky is I had no friends in high school. 00:32:35.200 |
I had a small cohort of acquaintances. But part of being so hypervigilant when I grew up 00:32:41.040 |
was I was so ashamed of that world that I had to live in. I didn't want to bring anyone into it. 00:32:48.640 |
I could not see myself that anybody would accept me. But the thing with that is that I had no 00:32:56.320 |
definition of what expectations should be. So they were not guided by the people around me. 00:33:02.960 |
And so I would escape to define my expectations. - It's interesting, but didn't you feel 00:33:07.520 |
like your dad put you in a prison of expectation? 'Cause if you don't 00:33:16.400 |
have a friend, the flip side of that is you don't have any other signals. It's very easy to believe 00:33:21.440 |
like when you're in a cult that-- - Well, he was angry. He pushed me. 00:33:29.280 |
He used me as a mechanism to alleviate his own frustration. And this may sound very crazy, 00:33:34.960 |
but he also believed in me. And so that's what created this weird duality where you were just, 00:33:41.280 |
I was always confused about-- - You could be somebody great. 00:33:44.400 |
He believed that you could be somebody truly special. - I didn't believe him because I 00:33:50.160 |
couldn't reconcile then the other half of the day, those behaviors. But what it allowed me to do was 00:33:57.760 |
I escaped in my mind and I found these archetypes around me that were saviors to me. 00:34:06.320 |
So I grew up in Ottawa, Ontario, Canada. I grew up right at the point where the telecom boom was 00:34:14.160 |
happening. Companies like Nortel and Newbridge Networks and Mitel, Bell Northern Research, 00:34:19.680 |
these were all built in the suburbs of Ottawa. And so there were these larger than life figures, 00:34:26.080 |
entrepreneurs, Terry Matthews, Michael Cowpland. And so I thought I'm gonna be like them. I would 00:34:32.720 |
read Forbes Magazine. I would read Fortune Magazine. I would look at the rich people on 00:34:36.640 |
that list and say, I would be like them. Not knowing that maybe that's not who you wanted to 00:34:43.040 |
be, but it was a lifeline. And it kept my mind relatively whole because I could direct my 00:34:50.480 |
ambition in a direction. And so why that's so important, just circling back to this is 00:34:57.040 |
I didn't have a group of friends who were like, I'm gonna go to community college. 00:35:00.160 |
I didn't have a group of friends that said, well, the goal is just to go to university, 00:35:05.200 |
get a simple job and join the public service, have a good life. And so because I had no 00:35:11.440 |
expectations and I was so afraid to venture out of my own house, I never saw what middle-class 00:35:16.480 |
life was like. And so I never aspired to it. Now, if I was close to it, I probably would have 00:35:21.440 |
aspired to it because my parents in their best year made 32,000 Canadian together. 00:35:27.280 |
And if you're trying to raise a family of five people on $32,000, it's a complicated job. 00:35:32.960 |
And most of the time they were probably making 20 something thousand. And I was working since I was 00:35:37.600 |
14. So I knew that our station in life was not the destination. We had to get out. But because I 00:35:46.000 |
didn't have an obvious place, it's not like I had a best friend whose house I was going to. And 00:35:50.000 |
I saw some normal functional home. If I had had that in this weird way, I would have aspired to 00:35:56.560 |
that. >> What was the worst job you had to do? 00:35:59.520 |
>> The best job, but the worst job was I worked at Burger King when I was 14 years old and I would 00:36:07.600 |
do the closing shift. And that was from like 6 PM till about 2 in the morning. And in Ontario, 00:36:15.520 |
where I lived, Ottawa borders Quebec. In Ontario, the drinking age is 19. You can see where I'm 00:36:20.880 |
going with this. The drinking age in Quebec is 18. And that year made all the difference to all 00:36:26.320 |
these kids. And so they would go get completely drunk. They would come back. They would come to 00:36:31.840 |
the Burger King. You would see all these kids you went to high school with. Can you imagine how 00:36:35.520 |
mortifying it is? You're working there in this getup. And they would light that place on fire, 00:36:43.280 |
vomit everywhere, puking, pooing, peeing. And when the thing shuts down at one o'clock, 00:36:49.360 |
you got to clean that all up, all of it, changing the garbage, taking it out. It was a grind. 00:36:59.600 |
And it really teaches you, okay, I do not want this job. I don't want to. 00:37:07.600 |
>> But it's funny that that didn't push you towards the stability and the security of the 00:37:11.280 |
middle class like life. >> I didn't have any good examples of that. 00:37:15.760 |
I didn't have those around me. I was so ashamed. I could have never built a relationship where I 00:37:21.600 |
could have seen those interactions to want that. And so my desires were framed by these two random 00:37:26.960 |
rich people that lived in my town who I'd never met, and what I read in magazines about people 00:37:31.520 |
like Bill Gates and Warren Buffett. >> You were an early senior executive 00:37:38.240 |
at Facebook during a period of a lot of scaling in the company history. I mean, 00:37:43.120 |
it's actually a fascinating period of human history in terms of technology. 00:37:46.400 |
Well, in terms of human civilization, honestly. What did you learn from that 00:37:53.440 |
time about what it takes to build and scale a successful tech company, 00:37:58.880 |
a company that has almost immeasurable impact on the world? 00:38:05.320 |
>> That was an incredible moment in time because everything was so new. To your point, 00:38:11.160 |
like even how the standards of Web 2.0 at that time were being defined, we were defining them. 00:38:16.200 |
I mean, I think if you search in the patent library, there's a bunch of these patents that 00:38:24.600 |
me and Zuck have for random things like cookies or cross-site JavaScript, all these crazy things 00:38:31.560 |
that are just like these duh kind of ideas in 2023. We had to invent our way around how do 00:38:38.920 |
websites communicate with each other? How do we build in the cloud versus in a data center? How 00:38:43.640 |
do we actually have high-performance systems? >> You mentioned data science, the term and the idea. 00:38:49.000 |
>> I invented this thing called data scientist because we had a PhD from Google that refused 00:38:54.440 |
to join because he got a job offer that said data analyst. And so we said, "Call him a scientist," 00:39:01.000 |
because he was a PhD in particle physics. So he was a scientist. I said, "Great, you're a scientist." 00:39:07.320 |
>> That launched a discipline. >> I mean, a term, what's a rose by 00:39:11.560 |
any other name? But yeah, sometimes words like this can launch entire fields. And it did in that 00:39:18.760 |
case. I mean, I guess at that time you didn't anticipate the impact of machine learning on 00:39:24.760 |
the entirety of this whole process because you need machine learning to have both ads and 00:39:30.040 |
recommender systems to have the feed for the social networks. >> Exactly right. The first 00:39:35.000 |
real scaled version of machine learning, not AI, but machine learning was this thing that Facebook 00:39:40.120 |
introduced called PYMK, which is people you may know. And the simple idea was that can we initiate 00:39:45.960 |
a viral mechanic inside the application where you log in, we grab your credentials, we go to your 00:39:52.920 |
email inbox, we harvest your address book, we do a compare, we make some guesses, and we start to 00:40:00.440 |
present to you other people that you may actually know that may not be in your address book. 00:40:04.120 |
Really simple, a couple of joins of some tables, whatever. And it started to just go crazy, 00:40:09.800 |
the number of people that you were connecting, creating this density and entropy inside this social graph 00:40:16.280 |
with what was some really simple, basic math. And that was eye opening for us. And what it led us 00:40:23.480 |
down was this path of really understanding the power of all this machine learning. And so that 00:40:28.200 |
infused itself into News Feed and how the content that you saw could be tailored to who you were 00:40:34.040 |
and the type of person that you were. So there was a moment in time that all of this stuff was so new. 00:40:39.880 |
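A toy sketch of the kind of address-book comparison he describes, with made-up contacts and a simple overlap count standing in for the "couple of joins"; this is only an illustration of the idea, not Facebook's actual PYMK implementation.

```python
from collections import Counter

# Hypothetical data: email -> contacts found in that user's uploaded address book.
address_books = {
    "alice@example.com": {"bob@example.com", "carol@example.com", "dave@example.com"},
    "bob@example.com": {"alice@example.com", "carol@example.com"},
    "erin@example.com": {"alice@example.com", "dave@example.com"},
}

# Existing friendships on the site, stored as unordered pairs.
friends = {frozenset({"alice@example.com", "bob@example.com"})}

def people_you_may_know(user, k=3):
    """Rank candidates by how many address books tie them to `user`."""
    scores = Counter()
    for owner, contacts in address_books.items():
        if owner == user or user in contacts:
            for person in contacts | {owner}:
                if person != user and frozenset({user, person}) not in friends:
                    scores[person] += 1
    return [person for person, _ in scores.most_common(k)]

print(people_you_may_know("alice@example.com"))
# e.g. ['carol@example.com', 'dave@example.com', 'erin@example.com']
```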
How did you translate the app to multiple languages? How do you launch the company in 00:40:44.600 |
all of these countries? - How much of it is just kind of 00:40:49.000 |
stumbling into things using your best first principles gut thinking? And how much is it 00:40:54.440 |
like five, 10, 15, 20 year vision? How much was thinking about the future of the internet 00:41:02.280 |
and the metaverse and humanity and all that kind of stuff? Because the News Feed, 00:41:09.160 |
- But that's, like, changes everything. - Well, you have to remember, News Feed 00:41:14.760 |
was named and we had this thing where we would just name things what they were. And at the time, 00:41:23.720 |
all of these other companies, and if you go back into the Wayback Machine, you can see this, 00:41:28.600 |
people would invent an MP3 player and they would come up with some crazy name. Or they would invent 00:41:37.160 |
a software product and come up with a crazy name. And it sounded like the pharma industry, 00:41:43.880 |
"Blowcasimab: tag your best friends." And you think, what is this? This makes no sense. 00:41:50.200 |
And this was Zuck's thing. He was like, well, this is a feed of news, so we're going to call it 00:41:54.440 |
News Feed. This is where you tag your photos, so we're going to call that photo tagging. 00:41:59.160 |
I mean, literally, pretty obvious stuff. So the way that those things came about though, 00:42:07.480 |
was very experimentally. And this is where I think it's really important for people 00:42:11.400 |
to understand. I think Bezos explains this the best. There is a tendency after things work 00:42:18.280 |
to create a narrative fallacy because it feeds your ego. And you want to have been the person 00:42:25.640 |
that saw it coming. And I think it's much more honest to say, we were very good probabilistic 00:42:35.480 |
thinkers that tried to learn as quickly as possible, meaning to make as many mistakes as 00:42:42.040 |
possible. I mean, if you look at this very famous placard that Facebook had from back in the day, 00:42:47.000 |
what did it say? It said, move fast and break things. In societal language, that's saying, 00:42:53.400 |
make mistakes as quickly as you can. Because the minute you break something, you don't do that by 00:42:58.120 |
design. It's not a feature. Theoretically, it's a bug. But he understood that, and we embraced that 00:43:04.440 |
idea. I used to run this meeting once a week where the whole goal was, I want to see that there were 00:43:10.040 |
a thousand experiments that were run, and show me them all, from the dumbest to the most impactful. 00:43:15.880 |
And we would go through that loop and what did it train people? Not that you got celebrated 00:43:21.400 |
for the right answer, but you got celebrated for trying. I ran 12 experiments, 12 failed, 00:43:28.680 |
and we'd be like, you're the best. >> Can I just take a small tangent on 00:43:33.080 |
that? Move fast and break things has become like a catchphrase of the thing that 00:43:41.480 |
embodies the toxic culture of Silicon Valley in today's discourse, which confuses me. Of course, 00:43:49.720 |
words and phrases get sort of captured and so on. 00:43:52.760 |
>> It becomes very reductive. That's a very loaded set of words that together, 00:43:56.680 |
many years later, people can view very reductively. >> Can you steel man each side of that? 00:44:02.680 |
So pro move fast and break things and against move fast and break things. 00:44:08.760 |
>> So I think the pro of move fast and break things is saying the following. 00:44:12.840 |
There's a space of things we know and a massive space of things we don't know. 00:44:18.600 |
And there's a rate of growth of the things we know, but the rate of growth of the things we 00:44:25.640 |
don't know is, we have to assume, growing faster. So the most important thing is to move 00:44:32.520 |
into the space of the things we don't know as quickly as possible. And so in order to acquire 00:44:38.920 |
knowledge, we're going to assume that the failure mode is the nominal state. 00:44:44.120 |
And so we just need to move as quickly as we can, break as many things as possible, 00:44:50.120 |
which means things are breaking in code. Do the root cause analysis, figure out how to make things 00:44:57.080 |
better and then rapidly move into the space. And he or she who moves fastest into that space will win. 00:45:05.080 |
>> But that doesn't imply carelessness, right? It doesn't imply moving fast without also aggressively picking up the lessons 00:45:15.160 |
from the mistakes you make. >> Well, again, that's steel 00:45:18.040 |
manning the pro, which is that it's a thoughtful movement around velocity and acquisition of 00:45:27.400 |
knowledge. Now let's steel man the con case. When these systems become big enough, 00:45:34.840 |
there is no more room to experiment in an open-ended way because the implications have 00:45:42.680 |
broad societal impacts that are not clear upfront. So let's take a different, less controversial 00:45:49.880 |
example. If we said Lipitor worked well for all people except South Asians, and there's a specific 00:45:59.320 |
immune response that we can iterate to. And if we move quickly enough, we can run 10,000 experiments 00:46:05.560 |
and we think the answer is in that space. Well, the problem is that those 10,000 experiments may 00:46:11.800 |
kill 10 million people. So you have to move methodically. When that drug was experimental 00:46:18.440 |
and it wasn't being given to 500 million people in the world, moving fast made sense because you 00:46:25.160 |
could have a pig model, a mouse model, a monkey model. You could figure out toxicity, but we 00:46:30.120 |
picked all that low hanging fruit. And so now these small iterations have huge impacts that need to be 00:46:37.640 |
measured and implemented. Different example is like, if you work at Boeing and you have an 00:46:44.360 |
implementation that gives you a 2% efficiency gain by reshaping the wing or adding winglets, 00:46:49.080 |
there needs to be a methodical move slow, be right process because mistakes when they compound, 00:46:57.960 |
when it's already implemented and at scale have huge externalities that are impossible to measure 00:47:03.560 |
until after the fact. And you see this in the 737 MAX. So that's how one would steel man the 00:47:09.720 |
con case, which is that when an industry becomes critical, you got to slow down. 00:47:14.200 |
This makes me sad because some industries like Twitter and Facebook are a good example. 00:47:21.880 |
They achieve scale very quickly before really exploring the big area of things to learn. 00:47:31.720 |
So you basically picked one low-hanging fruit and that became your huge success. And now you're 00:47:38.920 |
sitting there with that stupid fruit. >> Completely. Well, so as an example, 00:47:43.080 |
if I was running Facebook for a day, the big opportunity in my opinion was really not the 00:47:53.240 |
metaverse, but it was actually getting the closest that anybody could get to AGI. 00:48:01.320 |
And if I had to steel man that product case, here's how I would have pitched it to the board 00:48:06.200 |
and to Zuck. I would have said, "Listen, there are three and a half billion people monthly 00:48:10.840 |
using this thing." If we think about human intelligence very reductively, 00:48:15.000 |
we would say that there's a large portion of it, which is cognitive. And then there's a large 00:48:20.600 |
portion of it, which is emotional. We have the best ability to build a multimodal model 00:48:26.840 |
that basically takes all of these massive inputs together to try to intuit how a system would react 00:48:33.000 |
to all kinds of stimuli. That to me would have been a profound leap forward for humanity. 00:48:39.880 |
Can you dig into that a little bit more? So in terms of, now this is a board meeting, 00:48:45.880 |
how would that make Facebook money? I think that you have all of these systems over time 00:48:54.280 |
that we don't know could benefit from some layer of reasoning to make them better. 00:49:02.280 |
What does Spotify look like when instead of just a very simple recommendation engine, 00:49:10.760 |
it actually understands sort of your emotional context and your mood and can move you to a body 00:49:16.840 |
of music that you would like? What does it look like if your television, instead of having to go 00:49:23.480 |
and channel surf 50,000 shows on a horrible UI, instead just has a sense of what you're into 00:49:31.240 |
and shows it to you? What does it mean when you get in your car and it actually drives you to a 00:49:38.920 |
place because you should actually eat there even though you don't know it? These are all random 00:49:43.720 |
things that make no sense a priori, but it starts to make the person or the provider of that service 00:49:52.360 |
the critical reasoning layer for all these everyday products that today would look very 00:49:57.240 |
flat without that reasoning. I think you license that and you make a lot of money. 00:50:01.640 |
In many ways, instead of becoming more of the pixels that you see, you become more of the 00:50:07.240 |
bare metal that actually creates that experience. If you look at the companies that are multi-decade 00:50:14.600 |
legacy kinds of businesses, the thing that they have done is quietly and surreptitiously move 00:50:20.760 |
down the stack. You never move up the stack to survive. You need to move down the stack. 00:50:25.720 |
So if you take that OSI reference stack, these layers of how you build an app from the physical 00:50:30.600 |
layer to the transport layer, all the way up to the app layer, you can map from the 1980s, 00:50:36.600 |
all the big companies that have been created, all the way from Fairchild Semiconductor and NatSemi 00:50:41.800 |
to Intel to Cisco to 3Com, Oracle, Netscape at one point, all the way up to the Googles and 00:50:50.120 |
the Facebooks of the world. But if you look at where all the lock-in happened, 00:50:54.120 |
it's by companies like Apple, who used to make software saying, "I'm going to go one layer closer. 00:50:59.640 |
I'm going to make the bare metal and I'm going to become the platform." Or Google, same thing. 00:51:04.520 |
I'm going to create this dominant platform and I'm going to create a substrate that organizes 00:51:09.160 |
all this information that's just omnipresent and everywhere. So the key is, if you are lucky 00:51:15.320 |
enough to be one of these apps that are in front of people, you better start digging quickly 00:51:21.880 |
and moving your way down and get out of the way and disappear. But by disappearing, 00:51:28.040 |
you will become much, much bigger and it's impossible to usurp you. 00:51:33.400 |
Yeah, I 100% agree with you. That's why you're so smart. This is the depersonalization and 00:51:42.520 |
the algorithms that enable depersonalization, almost like an operating system layer. So pushing 00:51:48.360 |
away from the interface and the actual system that does the personalization. I think the challenges 00:51:53.800 |
there, there's obviously technical challenges, but there's also societal challenges. 00:51:59.320 |
It's like in a relationship. If you have an intimate algorithmic connection with 00:52:06.200 |
individual humans, you can do both good and bad. And so there's risks that you're taking. 00:52:12.440 |
So if you're making a lot of money now as Twitter and Facebook with ads, surface layer ads, 00:52:19.080 |
what is the incentive to take the risk of guiding people more? Because you can hurt people, you can 00:52:27.880 |
piss off people. I mean, there's a cost to forming a more intimate relationship with the users. 00:52:36.040 |
In the short term, I think. You said a really, really key thing, which was a really great 00:52:41.880 |
emotional, instinctive reaction, which is when I said the AGI thing, you said, well, 00:52:47.160 |
how would you ever make money from that? That is the key. The presumption is that this thing would 00:52:52.600 |
not be an important thing at the beginning. And I think what that allows you to do if you were 00:52:57.160 |
Twitter or Google or Apple or Facebook, anybody, Microsoft, embarking on building something like 00:53:02.600 |
this, is that you can actually have it off the critical path. And you can experiment with this 00:53:09.720 |
for years if that's what it takes to find a version one that is special enough where it's 00:53:15.640 |
worth showcasing. And so in many ways, you get the free option. You're going to be spending, 00:53:21.320 |
any of these companies will be spending tens of billions of dollars in OpEx and CapEx every year 00:53:26.840 |
on all kinds of stuff. It is not a thing that money actually makes more likely to succeed. 00:53:34.440 |
In fact, you actually don't need to give these kinds of things a lot of money at all, because 00:53:40.520 |
starting in 2023, right now, you have the two most important tectonic shifts that have ever 00:53:47.480 |
happened in our lifetime in technology. They're not talked about, but these things allow AGI, 00:53:53.320 |
I think, to emerge over the next 10 or 15 years where it wasn't possible before. 00:53:56.840 |
The first thing is that the marginal cost of energy is zero. You're not going to pay for 00:54:01.800 |
anything anymore. And we can double click into why that is. And the second is the marginal cost 00:54:06.760 |
of compute is zero. And so when you take the multiplication or, if you want to get really 00:54:12.680 |
fancy mathematically, the convolution of these two things together, it's going to change everything. 00:54:19.640 |
So think about what a billion dollars gets today. And we can use OpenAI as an example. 00:54:25.480 |
A billion dollars gets OpenAI a handful of functional models and a pretty fast iterative 00:54:32.040 |
loop, right? But imagine what OpenAI had to overcome. They had to overcome a compute challenge. 00:54:41.000 |
They had to string together a whole bunch of GPUs. They had to build all kinds of scaffolding 00:54:45.320 |
software. They had to find data center support. That consumes all kinds of money. So that billion 00:54:50.360 |
dollars didn't go that far. So it's a testament to how clever that OpenAI team is. But in four 00:54:57.160 |
years from now, when energy costs zero and basically GPUs, they're falling off a truck 00:55:02.360 |
and you can use them effectively for free, now all of a sudden a billion dollars gives you some 00:55:08.840 |
amount of teraflops of compute that is probably the total number of teraflops available today in 00:55:14.920 |
the world. Like that's how gargantuan this move is when you take these two variables to zero. 00:55:21.080 |
There's like a million things to ask. I almost don't want to get distracted by 00:55:25.480 |
the marginal cost of energy going to zero because I have no idea what you're talking about there. 00:55:32.120 |
Okay. So if you look inside of the three most progressive 00:55:37.160 |
states, New York, California, and Massachusetts, a lot of left-leaning folks, a lot of people who 00:55:42.120 |
believe in climate science and climate change, the energy costs in those three states are the 00:55:46.600 |
worst they are in the entire country. And energy costs are compounding at 3% to 4% per annum. So every 00:55:53.240 |
decade to 15 years, energy costs in these states double. In some cases and in some months, our 00:55:59.640 |
energy costs are increasing by 11% a month. But the cost to actually generate energy 00:56:09.480 |
is now effectively zero. The cost per kilowatt hour to put a solar panel on your roof and a 00:56:15.000 |
battery wall inside your garage, it's the cheapest it's ever been. These things are the most efficient 00:56:20.840 |
they've ever been. And so to acquire energy from the sun and store it for your use later on 00:56:28.840 |
So how do you explain the gap between the cost going up? 00:56:31.480 |
Great question. So this is the other side of regulatory capture, right? We all fight to build 00:56:37.800 |
monopolies. Well, there are monopolies hiding in plain sight; the utilities are a perfect example. 00:56:43.240 |
There are 100 million homes in America. There are about 1,700 utilities in America. So they have 00:56:49.960 |
captive markets. But in return for that captive market, the law says you need to invest a certain 00:56:56.920 |
amount per year in upgrading that power line, in changing out that turbine, in making sure you 00:57:02.520 |
transition from coal to wind or whatever. Just as an example, upgrading power lines in the United 00:57:10.120 |
States over the next decade is a $2 trillion proposition. These 1,700 organizations have to 00:57:16.440 |
spend, I think it's a quarter of a trillion dollars a year just to change the power lines. 00:57:23.800 |
That is why, even though it costs nothing to make energy, you are paying double every seven or eight 00:57:29.960 |
years. It's CapEx and OpEx of a very brittle old infrastructure. It's like you trying to build an 00:57:36.440 |
app and being forced to build your own data center. And you say, "But wait, I just want to write to 00:57:40.600 |
AWS. I just want to use GCP. I just want to move on. All that complexity is solved for me." And 00:57:47.320 |
some law says, "No, you can't. You got to use it." So that's what consumers are dealing with, 00:57:51.320 |
but it's also what industrial and manufacturing organizations deal with; it's what we all deal with. 00:57:56.680 |
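As a rough back-of-the-envelope check on the figures quoted above (the $2 trillion over a decade, the roughly 1,700 utilities, and the 100 million homes are the numbers from the conversation; the even split across utilities and homes is purely an illustrative assumption):

$$\frac{\$2\ \text{trillion}}{10\ \text{years}} = \$200\ \text{billion/year}, \qquad \frac{\$200\ \text{billion/year}}{1{,}700\ \text{utilities}} \approx \$120\ \text{million per utility per year}, \qquad \frac{\$2\ \text{trillion}}{100\ \text{million homes}} = \$20{,}000\ \text{per home}$$

That $200 billion a year is in the same ballpark as the quarter-trillion figure mentioned, and the roughly $20,000 of grid spend per home over the decade is on the same order of magnitude as what a residential solar-plus-battery installation costs today, which is the comparison the argument implicitly leans on.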
So how do we rid ourselves of this old infrastructure that we're paying for? 00:58:00.920 |
So the thing that's happening today, which I think is... This is why I think it's the most 00:58:06.520 |
important trend right now in the world, is that 100 million homeowners are each going to become 00:58:13.800 |
their own little power plant and compete with these 1,700 utilities. And that is a great... 00:58:21.160 |
No, just deal with the United States for a second, because I think it's easier to see here. 00:58:25.480 |
100 million homes, solar panel on the roof. And by the way, just to make it clear, the sun doesn't 00:58:30.520 |
need to shine. These panels now work where you have these UV bands that can actually extrapolate 00:58:36.040 |
beyond the visible spectrum. So they're usable in all weather conditions. And a simple system can 00:58:42.520 |
support you collecting enough power to not just run your functional day-to-day life, 00:58:48.040 |
but then to contribute what's left over back into the grid for Google's data center or Facebook's 00:58:54.600 |
data center, where you get a small check. The cost is going to zero. 00:59:00.360 |
How obvious is this to people, what you're laying out? 00:59:03.400 |
Okay. So because this is a pretty profound prediction, if the cost is indeed going to zero, 00:59:09.320 |
that... I mean, the compute... The cost of compute going to zero, I can... 00:59:13.880 |
So the cost of compute going to zero is simpler to see. 00:59:16.040 |
I can kind of understand, but the energy seems like a radical prediction of yours. 00:59:20.440 |
Well, it's just naturally what's happening, right? Now, let me give you a different way 00:59:25.320 |
of explaining this. If you look at any system, there's a really important thing that happens. 00:59:31.400 |
It's what Clay Christensen calls crossing the chasm. If you explained it numerically, 00:59:36.120 |
here's how I would explain it to you, Lex. If you introduce a disruptive product, 00:59:39.880 |
typically what happens is the first 3% to 5% of people are these zealous believers. 00:59:47.480 |
And they ignore all the logical reasons why this product doesn't make any sense, 00:59:52.520 |
because they believe in the proposition of the future and they buy it. 00:59:56.040 |
The problem is at 5%, if you want a product to get to mass market, you have one of two choices, 01:00:03.400 |
which is you either bring the cost down low enough, or the feature set becomes so compelling 01:00:09.960 |
that people will pay even at a high price point. An example of the latter is the iPhone. The iPhone today, 01:00:15.880 |
the iPhone 14, costs more than the original iPhone. It's probably doubled in price over 01:00:19.880 |
the last 14 or 15 years, but we view it as an essential element of what we need in our daily 01:00:25.240 |
lives. It turns out that battery EVs and solar panels are an example of the former, 01:00:31.320 |
because people like President Biden, with all of these subsidies, have now introduced so much 01:00:39.320 |
money for people to just do this, where it is a money-making proposition for 100 million homes. 01:00:46.360 |
And what you're seeing as a result are all of these companies who want to get in front of that 01:00:53.000 |
trend. Why? Because they want to own the relationship with 100 million homeowners. 01:00:57.160 |
They want to manage the power infrastructure, Amazon, Home Depot, Lowe's. Just name the company. 01:01:05.000 |
So if you do that and you control that relationship, they're going to show you... 01:01:10.440 |
For example, Amazon will probably say, "If you're a member of Prime, 01:01:13.480 |
we'll stick the panels on your house for free. We'll do all the work for you for free. And it's 01:01:20.600 |
just a feature of being a member of Prime, and we'll manage all that energy for you." It makes 01:01:25.240 |
so much sense, and it is mathematically accretive for Amazon to do that. It's not accretive 01:01:32.440 |
for the existing energy industry, because they get blown up. It's extremely accretive for peace 01:01:38.200 |
and prosperity. If you think the number of wars we fight over natural resources, take them all 01:01:43.640 |
off the table if we don't need energy from abroad. There's no reason to fight. You'd have to find a 01:01:50.280 |
reason to fight. Meaning, sorry, there'd be a moral reason to fight, but the last number of wars that 01:01:55.720 |
we fought were not as much rooted in morality as they were rooted in... >> Yeah, it feels like 01:02:01.000 |
they were very much rooted in conflict over resources, energy specifically. 01:02:06.920 |
>> And then, sorry, just the last thing I want to say. I keep interrupting, apologies. But 01:02:09.720 |
the chips, what people want to say is that now that we're at two and three nanometer scale 01:02:17.480 |
for typical kind of like transistor fab, we're done. And forget about transistor density, 01:02:23.720 |
forget about Moore's Law, it's over. And I would just say, "No." Look at teraflops, 01:02:27.800 |
and really teraflops is the combination of CPUs, though those are much, much less important, 01:02:33.160 |
and really the combination of ASICs, so application-specific ICs, and GPUs. 01:02:37.800 |
And so you put the two together. I mean, if I gave you a billion dollars five years from now, 01:02:43.640 |
the amount of damage you could do, damage in a good way, in terms of building racks and racks of GPUs, 01:02:50.840 |
the kind of models that you could build, the training sets and the data that you could consume 01:02:54.680 |
to solve a problem, it's enough to do something really powerful. Whereas today, it's not yet there. 01:03:02.920 |
There's an interesting idea that you talk about in terms of Facebook and Twitter that's connected to this, 01:03:08.680 |
that if you were running sort of Twitter or Facebook, that you would move them all to like AWS. 01:03:15.080 |
So you would have somebody else handle the compute infrastructure. It probably, if you could explain 01:03:23.000 |
that reasoning, means that you believe in this idea of energy going to zero, compute going to 01:03:28.680 |
zero. So let people that are optimized in that do the best job. >> And I think that's, you know, 01:03:35.640 |
initially in the early 2000s and the beginning of the 2010s, if you were big enough scale, 01:03:43.320 |
oh, sorry, everybody was building their own stuff. Then between 2010 through 2020, 01:03:49.800 |
really the idea was everybody should be on AWS except the biggest of the biggest folks. 01:03:54.680 |
I think in the 2020s and 30s, I think the answer is actually everybody should be in these public 01:04:02.600 |
clouds. And the reason is the engineering velocity of the guts. So, you know, take a simple example, 01:04:10.200 |
which is, you know, we have not seen a massive iteration in database design until Snowflake, 01:04:15.880 |
right? I think maybe Postgres was like the last big turn of the dial. 01:04:18.920 |
Why is that? I don't exactly know, except that everybody that's on AWS and everybody that's on 01:04:26.600 |
GCP and Azure gets to now benefit from a hundred plus billion dollars of aggregate market cap, 01:04:34.600 |
rapidly iterating, making mistakes, fixing, solving, learning. And that is a best in class 01:04:43.400 |
industry now, right? Then there's going to be all these AI layers around analytics so that app 01:04:51.160 |
companies can make better decisions. All of these things will allow you to build more nimble 01:04:56.920 |
organizations because you'll have this federated model of development. I'll take these things off 01:05:02.680 |
the shelf. Maybe I'll roll my own stitching over here because the thing where you make money 01:05:09.080 |
is still, for most people, how the apps provision an experience to a user. And everybody else can 01:05:15.640 |
make a lot of money just servicing that. So they work in a really, they play well together in the 01:05:21.720 |
sandbox. So in the future, everybody just should be there. It doesn't make sense for anybody, 01:05:27.880 |
I don't think, because, you know, if you were to roll your own data centers, you know, for example, 01:05:32.120 |
like Google for a long time had these massive leaps where they had GFS and Bigtable. Those are 01:05:37.400 |
really good in the 2000s and 2010s. And this is not just to throw shade at Google. It's very hard 01:05:44.280 |
for whatever exists that is the progeny of GFS and Bigtable to be anywhere near as good as a 01:05:49.880 |
hundred billion dollar industry's attempt to build that stack. And you're putting your organization 01:05:56.200 |
under enormous pressure to be that good. >> I guess the implied risk taken there is that you 01:06:01.640 |
could become the next AWS, like Tesla doing some of the compute in-house. I guess the bet there is 01:06:10.520 |
that you can become the next AWS for the new wave of computation if that level, if that kind of 01:06:20.120 |
computation is different. So if it's machine learning, I don't know if anyone's won that 01:06:25.080 |
battle yet, which is machine learning centric compute. >> Well, I think that software has a 01:06:30.200 |
very powerful property in that there's a lot of things that can happen asynchronously so that 01:06:37.240 |
real-time inference can be actually really lightweight code deployment. And that's why 01:06:42.600 |
I think you can have a very federated ecosystem inside of all of these places. Tesla is very 01:06:48.840 |
different because in order to build the best car, it's kind of like trying to build the best iPhone, 01:06:55.080 |
which is that you need to control it all the way down to the bare metal in order to do it well. 01:06:59.240 |
And that's just not possible if you're trying to be a systems integrator, which is what everybody 01:07:05.320 |
other than this modern generation of car companies have been. And they've done a very good job of 01:07:11.320 |
that, but it won't be the experience that allows you to win in the next 20 years. 01:07:16.040 |
>> So let's linger on the social media thing. So you said if you ran Facebook for a day, 01:07:24.520 |
let's extend that. If you were to build a new social network today, how would you fix 01:07:32.280 |
Twitter? How would you fix social media? If you want to answer a different question, 01:07:38.760 |
is if you were Elon Musk, somebody you know, and you were taking over Twitter, what would you fix? 01:07:44.600 |
>> I've thought about this a little bit. First of all, let me give you a backdrop. I wouldn't 01:07:51.000 |
actually build a social media company at all. And the answer is the reasoning is the following. 01:07:56.920 |
I really tend to believe, as you've probably gotten a sense of, in sorts of patterns and probabilities. 01:08:03.400 |
And if you said to me, "Chamath, probabilistically answer where are we going in apps and social 01:08:11.640 |
experiences?" What I would say is, "Lex, we spent the first decade building platforms and getting 01:08:18.120 |
them to scale." And if you want to think about it again, back to sort of this poker analogy, 01:08:23.320 |
others' mistakes minus your mistakes is the value. Well, the value that was captured 01:08:28.920 |
was trillions of dollars, essentially to Apple and to Google. And they did that by basically 01:08:35.800 |
attracting billions of monthly active users to their platform. 01:08:42.840 |
Then this next wave were the apps, Facebook, QQ, Tencent, TikTok, Twitter, Snapchat, 01:08:50.280 |
that whole panoply of apps. And interestingly, they were in many ways an atomized version 01:08:58.120 |
of the platforms, right? They sat on top of them. They were an ecosystem participant. 01:09:03.480 |
But the value they created was the same. Trillions of dollars of enterprise value, 01:09:10.680 |
billions of monthly active users. Well, there's an interesting phenomenon that's kind of 01:09:16.360 |
hiding in plain sight, which is that the next most obvious atomic unit are content creators. 01:09:24.040 |
Now, let me give you two examples. Lex Friedman, this random crazy guy, Mr. Beast, 01:09:30.280 |
Jimmy Donaldson, just the two of you alone, add it up, okay? And you guys are going to approach 01:09:37.240 |
in the next five years a billion people. The only thing that you guys haven't figured out yet is how 01:09:41.800 |
to capture trillions of dollars of value. Now, maybe you don't want to, and maybe that's not 01:09:45.160 |
your stated mission. >> Right, right. But let's just look at Mr. 01:09:47.800 |
Beast alone because he is trying to do exactly that probably. 01:09:50.680 |
>> Yeah, and I think Jimmy is going to build an enormous business. But if you take Jimmy and all 01:09:54.680 |
of the other content creators, right, you guys are atomizing what the apps have done. 01:10:01.960 |
You're providing your own curated news feeds. You're providing your own curated communities. 01:10:07.640 |
You're allowed, you let people move in and out of these things in a very lightweight way, 01:10:12.520 |
and value is accruing to you. So the honest answer to your question is I would focus on 01:10:16.920 |
the content creator side of things because I believe that's where the puck is going. 01:10:21.080 |
That's a much more important shift in how we all consume information and content and are 01:10:27.480 |
entertained. It's through brands like you, individual people that we can humanize and 01:10:31.720 |
understand are the filter. >> But aren't you just arguing against 01:10:37.400 |
the point you made earlier, which is that what you would recommend is to invest in the AGI, 01:10:41.320 |
the depersonalization? >> Because they could still be a 01:10:46.280 |
participant. In that end state, if that happens, you have the option value of being an enabler of 01:10:51.800 |
that, right? You can help improve what they do. Again, you can be this bare metal service provider 01:10:57.960 |
where you can be a tax, right? You can participate in everything that you do, 01:11:04.040 |
every question that's asked, every comment that's curated. If you could have more intelligence as 01:11:09.240 |
you provide a service to your fans and your audience, you would probably pay a small 01:11:13.400 |
percentage of that revenue. I suspect all content creators would. And so it's that stack of services 01:11:20.200 |
that is like a smart human being. It's like, how do you help produce this information? You would pay 01:11:24.440 |
a producer for that. I mean, maybe you would. But so back to your question. So what would I do? 01:11:28.680 |
I think that you have to move into that world pretty aggressively. I think that right now you 01:11:35.880 |
first have to solve what is broken inside of these social networks. And I don't think it's 01:11:41.320 |
a technical problem. So just to put it out there, I don't think it's one where there are these 01:11:47.960 |
nefarious organizations. That happens. Brigading XYZ, that happens. But the real problem is a 01:11:55.560 |
psychological one that we're dealing with, which is people through a whole set of situations 01:12:03.720 |
have lost belief in themselves. And I think that that 01:12:15.640 |
comes up as this very virulent form of rejection that they tried to put into these social networks. 01:12:20.840 |
So if you look inside of comments on anything, you could have a person that says on Twitter, 01:12:25.800 |
I saved this dog from a fiery building. And there would be negative commenters. 01:12:32.040 |
And you're like, well, again, put yourself in their shoes. How do I steel man their case? I do 01:12:38.920 |
this all the time. I get people throw shade at me. I'm like, OK, let me steel man their point of view. 01:12:44.440 |
And the best that I can come up with is I'm working really hard over here. I'm trying. I played by all 01:12:50.440 |
the rules that were told to me. I've played well. I've played fairly. And I am not being rewarded 01:12:56.680 |
in a system of value that you recognize. And that is making me mad. And now I need to cope and I 01:13:03.880 |
need to vent. So back in the day, my dad used to drink. He would make me go get things to hit me 01:13:09.640 |
with. Today, you go to Twitter, you spot off, you try to deal with the latent anger that you feel. 01:13:15.480 |
So a social network has to be designed, in my opinion, to solve that psychological corner case, 01:13:21.960 |
because it is what makes a network unusable. To get real density, you have to find a way 01:13:28.360 |
of moving away from that toxicity because it ruins a product experience. You could have the 01:13:33.800 |
best pixels in the world. But if people are virulently spitting into their keyboards, 01:13:39.160 |
other people are just going to say, you know what, I'm done with this. It doesn't make me feel good. 01:13:44.200 |
So the social network has to have a social cost. You can do it in a couple of ways. One is where 01:13:51.480 |
you have real world identity. So then there's a cost to being virulent and there's a cost to being 01:13:57.080 |
caustic. A second way is to actually just overlay an economic framework so that there's a more 01:14:04.760 |
pertinent economic value that you assign to basically spouting off. And the more you want 01:14:09.960 |
to spend, the more you can say. And I think both have a lot of value. I don't know what the right 01:14:15.640 |
answer is. I tend to like the latter. I think real world identity shuts down a lot of debate 01:14:21.960 |
because there's still too much, you know, there's a sensation that there'll be some retribution. 01:14:27.800 |
So I think there's more free speech over there, but it cannot be costless because in that there's 01:14:34.680 |
a level of toxicity that just makes these products unusable. - Third option, and by the way, all these 01:14:41.240 |
work together. If we look at this, what you call the corner case, which is hilarious, 01:14:47.000 |
what I would call the human condition, which is, you know, that anger is rooted in the challenges 01:14:57.880 |
of life. And what about having an algorithm that shows you what you see that's personalized to you 01:15:10.600 |
and helps you maximize your personal growth in the long term such that you're challenging yourself, 01:15:18.600 |
you're improving, you're learning. There's just enough of criticism to keep you on your toes, 01:15:26.600 |
but just enough of like the dopamine rush to keep you entertained and finding that balance for each 01:15:32.520 |
individual person. - You just described an AGI of a very empathetic, well-rounded friend. - Yes, 01:15:38.440 |
exactly. And then you can throw that person, even anonymous, into a pool of discourse. - 100%. 01:15:45.240 |
- And they would be better. - I think you're absolutely right. 01:15:47.800 |
That is a very, very, very elegant way of stating it. You're absolutely right. - But like you said, 01:15:51.720 |
the AGI might be a few years away, so that's a huge investment. My concern, my gut feeling is 01:15:59.000 |
this thing we're calling AGI is actually not that difficult to build technically, but it requires a 01:16:06.120 |
certain culture and it requires certain risks to be taken. - I think you could reductively boil down 01:16:13.320 |
the human intellect into cognition and emotion. And depending on who you are and depending on the 01:16:23.720 |
moment, they're weighted very differently, obviously. Cognition is so easily done by computers 01:16:31.160 |
that we should assume that that's a solved problem. So our differentiation is the reasoning 01:16:36.440 |
part. It's the emotional overlay. It's the empathy. It's the ability to steelman the 01:16:40.840 |
opposite person's case and feel why that person, you can forgive them without excusing what they 01:16:47.240 |
did, as an example. That is a very difficult thing, I think, to capture in software, but 01:16:54.840 |
I think it's a matter of when, not if. - If done crudely, it takes a form of censorship, 01:17:03.160 |
just banning people off the platform. Let me ask you some tricky questions. Do you think 01:17:11.480 |
Trump should have been removed from Twitter? - No. - What's the pro case? I'm having fun here. 01:17:18.440 |
- Do you steelman each side? - Yeah. Let's steelman the "get him off the platform" case. 01:17:25.400 |
Here we have a guy who is virulent in all ways. He promotes confrontation. He lacks decorum. He 01:17:40.120 |
incites the fervent believers of his cause to act up and push the boundaries bordering on 01:17:48.120 |
and potentially even including breaking the law. He does not observe the social norms of a society 01:17:56.200 |
that keep us well-functioning, including an orderly transition of power. If he is left in 01:18:01.560 |
a moment where he feels trapped and cornered, he could behave in ways that will confuse the people 01:18:08.440 |
that believe in him to act in ways that they so regret that it could bring our democracy to an end 01:18:17.720 |
or create so much damage or create a wound that's so deep it will take years of conflict and years 01:18:24.360 |
of confrontation to heal it. We need to remove him and we need to do it now. It's been too long. 01:18:31.320 |
We've let it go on too long. The other side of the argument would be he was a duly elected person 01:18:39.240 |
whose views have been run over for way too long. He uses the ability to say extreme things in order 01:18:49.560 |
to showcase how corrupt these systems have become and how insular these organizations are in 01:18:57.880 |
protecting their own class. If you really want to prevent class warfare and if you really want to 01:19:03.720 |
keep the American dream alive for everybody, we need to show that the First Amendment, the 01:19:09.240 |
Constitution, the Second Amendment, all of this infrastructure is actually bigger than any partisan 01:19:15.720 |
view no matter how bad it is and that people will make their own decisions. There are a lot of people 01:19:24.280 |
that can see past the words he uses and focus on the substance of what he's trying to get across 01:19:32.120 |
and more generally agree than disagree. When you silence that voice, what you're effectively saying 01:19:38.840 |
is this is a rigged game and all of those things that we were told were not true are actually true. 01:19:45.080 |
If you were to look at the crude algorithms of Twitter, of course, I don't have any insider 01:19:50.920 |
knowledge, but I could imagine that they saw the... Let's say there's a metric that measures 01:19:59.560 |
how negative the experience of the platform is, and they probably saw it in several ways. You could 01:20:06.920 |
look at this, but the presence of Donald Trump on the platform was consistently increasing how 01:20:13.640 |
shitty people are feeling short-term and long-term because they're probably yelling at each other, 01:20:19.240 |
having worse and worse and worse experience. If you even do a survey of how do you feel about 01:20:23.240 |
using this platform over the last week, they would say horrible relative to maybe a year ago when 01:20:30.120 |
Donald Trump was not actively tweeting or so on. So here you're sitting at Twitter and saying, 01:20:35.320 |
"Okay, I know everyone's talking about speech and all that kind of stuff, but I kind of want 01:20:41.400 |
to build a platform where the users are happy and they're becoming more and more unhappy. How do I 01:20:47.400 |
solve this happiness problem?" Well, let's ban the sources of the unhappiness. Now, we can't just say 01:20:58.200 |
you're a source of unhappiness, we'll ban you. Let's wait until that source says something 01:21:03.720 |
that we can claim breaks our rules, like incites violence or so on. 01:21:09.800 |
That would work if you could measure your construct of happiness properly. The problem is, 01:21:15.240 |
I think what Twitter looked at were active commenters and got it confused for overall 01:21:19.160 |
system happiness. Because for every piece of content that's created on the internet, 01:21:23.000 |
of the 100 people that consume it, maybe one or two people comment on it. 01:21:27.480 |
And so by overamplifying that signal and assuming that it was the plurality of people, that's where 01:21:34.840 |
they actually made a huge blunder. Because there was no scientific method, I think, to get to the 01:21:40.280 |
answer of deplatforming him. And it did expose this idea that it's a bit of a rigged game, 01:21:46.600 |
and that there are these deep biases that some of these organizations have 01:21:52.920 |
to opinions that are counter to theirs and to their orthodox view of the world. 01:21:57.160 |
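To make that sampling-bias point concrete, here is a minimal illustrative simulation. The one-to-two commenters per hundred consumers is the figure from the conversation; the specific sentiment probabilities are invented purely for illustration and are not anything measured at Twitter.

```python
import random

random.seed(0)

NUM_CONSUMERS = 100_000        # people who see a given piece of content
COMMENT_RATE = 0.02            # ~2 of every 100 consumers comment (figure quoted above)
P_UNHAPPY_IF_SILENT = 0.20     # assumption: silent consumers are mostly fine
P_UNHAPPY_IF_COMMENTER = 0.70  # assumption: people who bother to comment skew angry

total_unhappy = 0
commenters = 0
unhappy_commenters = 0

for _ in range(NUM_CONSUMERS):
    comments = random.random() < COMMENT_RATE
    unhappy = random.random() < (P_UNHAPPY_IF_COMMENTER if comments else P_UNHAPPY_IF_SILENT)
    total_unhappy += unhappy
    if comments:
        commenters += 1
        unhappy_commenters += unhappy

print(f"True share unhappy across all consumers:   {total_unhappy / NUM_CONSUMERS:.1%}")
print(f"Share unhappy measured from comments only: {unhappy_commenters / commenters:.1%}")
```

Under these assumed numbers the comment-only estimate reads roughly 70% unhappy while the whole-population figure is closer to 20%, which is the kind of overamplified signal being described.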
So in general, you lean towards keeping, first of all, presidents on the platform, but also 01:22:08.520 |
controversial voices. All the time. I think it's really important to keep them there. 01:22:13.720 |
Let me ask you a tricky one in the recent news that's become especially relevant for me. What 01:22:20.200 |
do you think about if you've been paying attention to Kanye West, a recent controversial outburst on 01:22:26.760 |
social media about Jews, black people, racism in general, slavery, Holocaust, all of these topics 01:22:40.600 |
that he touched on in different ways on different platforms, including Twitter. What do you do with 01:22:48.200 |
that? And what do you do with that from a platform perspective, and what do you do from a humanity 01:22:54.760 |
perspective of how to add love to the world? >> Should we take both sides of that? 01:23:04.280 |
Option one is he's completely out of line, and option two is he's not. Just to simplify. 01:23:09.240 |
>> Sure, right. >> So the path one is he's an incredibly 01:23:17.800 |
important tastemaker in the world that defines the belief system for a lot of people. 01:23:24.120 |
And there just is no room for any form of racism or bias or antisemitism in today's day and age, 01:23:33.000 |
particularly by people whose words and comments will be amplified around the world. We've already 01:23:39.800 |
paid a large price for that. And then the expectation of success is some amount of 01:23:46.600 |
societal decorum that keeps moving the ball forward. The other side would say, 01:23:52.200 |
"Life, I think, goes from harmony to disharmony to repair." And anybody who has gone through a 01:24:02.440 |
very complicated divorce will tell you that in that moment, your life is extremely disharmonious, 01:24:11.240 |
and you are struggling to cope. And because he is famous, we are seeing a person really 01:24:21.160 |
struggling in a moment that may need help. And we owe it to him, not for what he said, 01:24:28.760 |
because that stuff isn't excusable, but we owe it to him to help him in a way, 01:24:35.080 |
and particularly his friends. And if he has real friends, hopefully what they see is that. 01:24:41.320 |
What I see on the outside looking in is a person that is clearly struggling. 01:24:47.560 |
Can I ask you a human question? And I know it's outside looking in, but there's several questions 01:24:55.480 |
I want to ask. So one is about the pain of going through a divorce and having kids and all that 01:25:01.560 |
kind of stuff. And two, when you're rich and powerful and famous, I don't know, maybe you can 01:25:08.360 |
enlighten me to which is the most corruptive, but how do you know who are the friends to trust? 01:25:16.840 |
So a lot of the world is calling Kanye insane, or has mental illness, all that kind of stuff. 01:25:26.440 |
And so how do you have friends close to you that say something like that message, but from a place 01:25:33.560 |
of love and where they actually care for you, as opposed to trying to get you to shut up? 01:25:40.520 |
The reason I ask all those questions, I think, if you care about the guy, how do you help him? 01:25:47.160 |
Right. I've been through a divorce. It's gut-wrenching. The most horrible part 01:25:54.600 |
is having to tell your kids. I can't even describe to you how proud I am of and how 01:26:01.960 |
resilient these three beautiful little creatures were when my ex-wife and I had to sit them down 01:26:06.520 |
and talk through it. And for that thing, I'll be just so protective of them and so proud of them. 01:26:18.360 |
It's hard. Now, I don't know that that's what he went through, but it doesn't matter. In that 01:26:24.440 |
moment, there's no fame, there's no money, there's nothing. There's just the raw intimacy of a 01:26:28.040 |
nuclear family breaking up. And that, there is a death, and it's the death of that idea. 01:26:34.120 |
And that is extremely, extremely profound in its impact, especially in your children. 01:26:46.040 |
Could you have seen yourself in the way you see the world being clouded during, 01:26:51.160 |
especially at first, to where you would make poor decisions outside of that nuclear family? So, 01:26:58.840 |
like, poor business decisions, poor tweeting decisions, poor writing decisions. 01:27:05.320 |
If I had to boil down a lot of those, what I would say is that there are moments in my life, 01:27:11.640 |
Lex, where I have felt meaningfully less than. And in those moments, the loop that I would fall 01:27:19.240 |
into is I would look to cope and be seen by other people. So, I would throw away all of the work I 01:27:26.040 |
was doing around my own internal validation, and I would try to say something or do something that 01:27:32.440 |
would get the attention of others. And oftentimes, when that loop was unproductive, it's because 01:27:39.960 |
those things had really crappy consequences. So, yeah, I went through that as well. So, 01:27:48.280 |
I had to go through this disharmonious phase in my life and then to repair it. 01:27:53.320 |
I had the benefit of meeting someone and building a relationship block by block 01:28:02.840 |
where there was just enormous accountability, where my partner Nat has just incredible empathy 01:28:12.840 |
but accountability. And so, she can put herself in my shoes sometimes when I'm a really tough 01:28:20.680 |
person to be around, but then she doesn't let me off the hook. She can forgive me, 01:28:25.640 |
but it doesn't make what I may have said or whatever excusable. And that's been really 01:28:32.280 |
healthy for me, and it's helped me repair my relationships, be a better parent, be a better 01:28:38.840 |
friend to my ex-wife, who's a beautiful woman who I love deeply and will always love her. 01:28:44.920 |
And it took me a few years to see that, that it was just a chapter that had come to an end, 01:28:49.640 |
but she's an incredible mother and an incredible businesswoman. And I'm so thankful that I've had 01:28:55.080 |
two incredible women in my life. That's like a blessing. 01:28:58.200 |
- But it's hard. So, with Nat, it's hard to find a person that has that. I mean, 01:29:03.960 |
a lot of stuff you said is pretty profound, but having that person who has empathy and 01:29:08.120 |
accountability. So, basically, that's ultimately what great friendship is, which is people that 01:29:15.880 |
love you, have empathy for you, but can also call you out on your bullshit. 01:29:18.760 |
- She's a LeBron James-like figure. And the reason I say that is I've seen and met so many people. 01:29:25.240 |
I've seen the distribution on the scale of friendship and empathy. 01:29:32.360 |
- She's a GOAT. Well, what's so funny is we have a dinner around poker, 01:29:37.160 |
and it's taken on a life of its own, mostly because of her, because these guys look to her. 01:29:44.200 |
And I'm like, "Whoa, whoa, whoa. She's taken, like... Her registers are already full. She's 01:29:51.000 |
thinking of all kinds of crap with me." But it's a very innate skill, and it's paired with... 01:29:59.400 |
But it's not just an emotional thing, meaning she's the person that I make all my decisions with. 01:30:06.280 |
These decisions we're making together as a team, I've never understood that. You know, 01:30:10.760 |
there's that African proverb, like, "Go fast, go alone. Go far, go together." And, 01:30:18.040 |
like, since I was born, I was by myself, and I had to cope, and I didn't have a good toolkit 01:30:23.400 |
to use into the world. And in these last five or six years, she's helped me. And at first, 01:30:28.520 |
my toolkit was literally like sticks, you know? And then I found a way to... You know, 01:30:34.440 |
she helped me sharpen a little rock, and that became a little knife, but even that was crap. 01:30:38.840 |
And then she showed me fire, and then I forged a knife. And that's what it feels like, where now 01:30:43.960 |
this toolkit is like most average people. And I feel humbled to be average, because I was here, 01:30:52.360 |
down here on the ground. So it's made all these things more reasonable. So I see what comes from 01:31:00.680 |
having deep, profound friendships and love to help you through these critical moments. 01:31:05.880 |
I have another friend who I would say just completely unabashedly loves me, 01:31:12.120 |
this guy Rob Goldberg. He doesn't hold me accountable that much, which I love. Like, 01:31:16.440 |
I could say I killed a homeless person. He's like, "Ah, they probably deserved it." 01:31:19.400 |
You know? Whereas Nat would be like, "That was not good, what you just did." 01:31:23.640 |
But I have both. I mean, I have Nat every day. You know, Rob, I don't talk to that often. But 01:31:29.720 |
to have two people, I had zero. I think most people, unfortunately, have zero. 01:31:37.320 |
So I think what he needs is somebody to just listen. You don't have to put a label on these 01:31:45.720 |
things. And you just have to try to guide in these very unique moments where you can just 01:31:51.080 |
like de-escalate what is going on in your mind. And I suspect what's going on in his mind, again, 01:31:58.680 |
to play armchair quarterback, I don't know, is that he is in a moment where he just feels lower 01:32:04.280 |
than low. And we all do it. We've all had these moments where we don't know how to get attention. 01:32:11.640 |
And if you didn't grow up in a healthy environment, you may go through a negative 01:32:16.680 |
way to get attention. And it's not to excuse it, but it's to understand it. 01:32:22.600 |
That's so profound, the feeling less than and at those low points going externally to find it. 01:32:33.640 |
And maybe creating conflict and scandal to get that attention. 01:32:40.760 |
The way that my doctor explained it to me is you have to think about your self-worth 01:32:47.160 |
like a knot. It's inside of a very complicated set of knots. So it's like some people don't 01:32:54.520 |
have these knots. It's just presented to you on a platter. But for some of us, 01:32:58.680 |
because of the way we grow up, it's covered in all these knots. So the whole goal is to 01:33:05.320 |
loosen those knots. And it happens slowly. It happens unpredictably. And it takes a long time. 01:33:11.960 |
And so while you're doing that, you are going to have moments where when you feel less than, 01:33:16.680 |
you're not prepared to look inside and say, "Actually, here's how I feel about myself. 01:33:21.000 |
It's pretty cool. I'm happy with where I'm at." 01:33:26.120 |
I have to ask on the topic of friendship. You do an amazing podcast called All In Podcast. 01:33:32.840 |
People should stop listening to this and go listen to that. You just did your 100th episode. 01:33:37.560 |
I mean, it's one of my favorite podcasts. It's incredible. For the technical and the human 01:33:46.920 |
psychological wisdom that you guys constantly give in the way you analyze the world, but also 01:33:53.160 |
just the chemistry between you. You're clearly—there's a tension, and there's a camaraderie 01:34:01.400 |
that's all laid out on the table. So I don't know the two Davids that well, but I have met Jason. 01:34:10.200 |
I mean, I'll give you a little psychological breakdown of all three of these guys. 01:34:18.360 |
Would they agree with your psychological breakdown? 01:34:22.520 |
I don't know. I think that what I would say about J. Cal is he is unbelievably loyal to no end. 01:34:33.000 |
And he's like any of those movies which are about the mafia or whatever, where something bad's going 01:34:43.160 |
wrong and you need somebody to show up. That's J. Cal. 01:34:46.840 |
So if you killed said proverbial homeless person, he would be right there to help you. 01:34:52.440 |
But he's the one that he'll defend you in every way, shape, or form, even if it doesn't make 01:34:57.320 |
sense in that moment. He doesn't see that as an action of whether it'll solve the problem. He sees 01:35:03.480 |
that as an act of devotion to you, your friend. And that's an incredible gift that he gives us. 01:35:08.200 |
The other side of it is that J. Cal needs to learn how to trust that other people love him 01:35:16.760 |
back as much as he loves us. And that's where he makes mistakes, because he assumes that he's not 01:35:23.000 |
as lovable as the rest of us. But he's infinitely more lovable than he understands. You have to see, 01:35:29.640 |
Lex. He is unbelievably funny. I cannot tell you how funny this guy is. Next level funny. 01:35:38.840 |
Timing, charm, the care he takes. So he is as lovable, but he doesn't believe himself to be 01:35:46.520 |
and that manifests itself in areas that drive us all crazy from time to time. 01:35:50.600 |
Which makes for a very pleasant listening experience. Okay, so what about the two Davids, 01:35:57.560 |
David Sachs is the one that I would say I have the most emotional connection with. 01:36:01.320 |
He and I can go a year without talking, and then we'll talk for four hours straight. 01:36:07.080 |
And then we know where we are, and we have this ability to pick up and have a level of intimacy 01:36:12.680 |
with each other. And I think that's just because I've known David for so long now. 01:36:16.760 |
That I find really comforting. And then Friedberg is this person who I think similar to me, 01:36:24.040 |
had a very turbulent upbringing, has fought through it to build an incredible life for himself. 01:36:30.200 |
And I have this enormous respect for his journey. I don't particularly care about his outcomes, 01:36:36.040 |
to be honest, but I just have, I look at that guy and I think, "He did it." And so if I didn't do it, 01:36:43.160 |
I would be glad that he did it, if it makes any sense. And you can see that he 01:36:52.600 |
feels like his entire responsibility is really around his kids. And to give a better counterfactual 01:37:05.240 |
and sometimes I think he gets that right and wrong, but he's a very special human being. 01:37:11.640 |
On that show, the two of you have a very kind of, like from a geopolitics perspective, 01:37:18.680 |
I don't know, there's just a very effective way to think deeply about the world, 01:37:27.080 |
He's a very systems level thinker, which I really, really like. 01:37:32.360 |
Very systems level. So looking at everything. 01:37:34.360 |
And he's very rooted in a broad body of knowledge, which I have a tremendous respect for. He brings 01:37:42.120 |
all these things in. Sachs is incredible because he has this unbelievable understanding of things, 01:37:48.920 |
but it has a core nucleus. So Friedberg can just basically abstract a whole bunch of systems and 01:37:54.040 |
talk about it. I tend to be more like that where I try to kind of, I find it to be more of a puzzle. 01:37:59.880 |
Sachs is more like anchored in a philosophical and historical context as the answer. 01:38:05.160 |
And he starts there, but he gets to these profound understandings of systems as well. 01:38:10.360 |
On the podcast, in life, you guys hold to your opinion pretty strong. 01:38:15.480 |
What's the secret to being able to argue passionately with friends? 01:38:21.480 |
So hold your position, but also not murder each other, which you guys seem to come close to. 01:38:28.120 |
I think it's like strong opinions, weakly held. 01:38:34.840 |
Wait, is that a haiku or is that, can you explain that please? 01:38:39.880 |
Yeah, like look today, you and I, we steel man like the two sides of three different things. 01:38:46.360 |
Now you could be confused and think I believe in those things. I believe that it's important to be 01:38:54.040 |
able to intellectually traverse there, whether I believe in it or not. And like steel man, not 01:38:59.800 |
But we intro those things by saying, let us steel man this position. Sometimes you guys skip the- 01:39:05.160 |
You're right. We edit those things out and sometimes we'll sit on either side and we'll 01:39:09.800 |
just kind of bat things back and forth just to see what the other person thinks. 01:39:14.680 |
So that's how, like as fans, we should listen to that sometimes. 01:39:19.400 |
So sometimes, 'cause you hold a strong opinion sometimes. Like for example, the cost of energy 01:39:25.560 |
going to zero, is that, like what's the degree of certainty on that? Is this kind of like you 01:39:33.240 |
really taking a prediction of how the world will unroll and if it does, this will benefit a huge 01:39:42.360 |
amount of companies and people that will believe that idea. So you really, you spend a few days, 01:39:51.160 |
I've been spending two years with that idea. And that idea has manifested into 01:39:56.600 |
many pages and pages of more and more branches of a tree. But it started with that idea. So if 01:40:06.600 |
you think about this tree, this logical tree that I built, I would consider it more of a mosaic. 01:40:11.080 |
And at the base or root, however you want to talk about it, is this idea, the incremental cost of 01:40:16.520 |
energy goes to zero. How does it manifest? And so I talked about one traversal, which is the 01:40:22.280 |
competition of households versus utilities. But if even some of that comes to pass, we're going 01:40:29.320 |
to see a bunch of other implications from a regulatory and technology perspective. If some 01:40:33.800 |
of those come to pass, so I've tried to think sort of this six, seven, eight hops forward. 01:40:41.480 |
And I have some, to use a chess analogy, I have a bunch of short lines, which I think can work. 01:40:47.560 |
And I've started to test those by making investments, tens of millions over here to 01:40:53.880 |
a hundred millions over there. But it's a distribution based on how probabilistic I 01:40:59.240 |
think these outcomes are and how downside protected I can be and how much I will learn, 01:41:04.440 |
how many mistakes I can make, et cetera. And then very quickly over the next two years, 01:41:09.960 |
some of those things will happen or not happen, and I will rapidly re-underwrite. 01:41:13.400 |
And I'll rewrite that tree. And then I'll get some more data. I'll make some more investments. 01:41:19.480 |
And I'll rapidly re-underwrite. So in order for me to get to this tree, maybe you can ask, 01:41:25.560 |
how did I get there? It was a complete accident. The way that it happened was I have a friend of 01:41:31.240 |
mine who works at a great organization called Fortress. His name is Drew McKnight. And he 01:41:34.840 |
called me one day and he said, "Hey, I'm doing a deal. Will you anchor it? We're going public. 01:41:38.120 |
And it's a rare earth mining company." And I said, "Drew, I'm going to get tarred and feathered in 01:41:45.080 |
Silicon Valley for backing a mining company." And he said, "Chamath, just talk to the guy and learn." 01:41:49.640 |
And the guy, Jim Litinsky, blew me away. He's like, "Here's what it means for energy. And 01:41:55.000 |
here's what it means for the supply chain. And here's what it means for the United States versus 01:41:58.040 |
China." But Lex, I did that deal. And then I did seven others. And that deal made money. 01:42:05.080 |
The seven others did not. But I learned, I made enough mistakes where the net of it was I got to 01:42:12.840 |
a thesis that I believed in. I could see it. And I was like, "Okay, I paid the price. I acquired 01:42:18.440 |
the learning. I made my mistakes. I know where I am at. And this is step one." And then I learned a 01:42:24.440 |
little bit more. I made some more investments. And that's how I do the job. The minute that you 01:42:30.200 |
try to wait for perfection in order to make a bet either on yourself or a company, 01:42:35.400 |
a girlfriend, whatever, it's too late. So if we just linger on that tree, 01:42:41.320 |
it seems like a lot of geopolitics, a lot of international military even conflict is around 01:42:48.520 |
energy. So how does your thinking about energy connect to what you see happening in the next 01:42:55.560 |
10, 20 years? Maybe you can look at the war in Ukraine or relationship with China and other 01:43:03.240 |
places through this lens of energy. What's the hopeful, what's the cynical trajectory that the 01:43:09.560 |
world might take through with this drive towards zero energy, zero cost energy? 01:43:14.760 |
So the United States has been in a period of energy surplus over the last few years, some number of 01:43:20.440 |
years under Trump and I think some amount of time now under the current administration with President Biden. 01:43:24.280 |
But we know what it means to basically have more than enough energy to fund our own domestic 01:43:33.400 |
manufacturing and living standards. And I think that by being able to generate this energy from 01:43:40.280 |
the sun that is very capex efficient, that is very climate efficient, gives us a huge tailwind. 01:43:47.320 |
The second thing is that we are now in a world, in a regime for many years to come of non-zero 01:43:54.760 |
interest rates. And it may interest you to know that really the last time that we had long 01:44:04.840 |
dated wars supported at low interest rates was World War II, where I think the average interest 01:44:10.600 |
rate was like 1.07% on the 10-year. And every other war tends to have these very quick openings and 01:44:17.400 |
closings because these long protracted fights get very difficult to finance when rates are non-zero. 01:44:24.360 |
So just as an example, even starting in 2023, so the practical example today in the United States 01:44:30.360 |
is President Biden's budget is about 1.5 trillion for next year. That's not including the entitlement 01:44:37.560 |
spending, meaning Medicare, Social Security. So the stuff that he wants to spend that he has 01:44:43.000 |
discretion over is about 1.582 trillion, to be exact. Next year, our interest payments 01:44:49.720 |
are going to be $455 billion. That means 29% of every budget dollar is going to pay interest. 01:44:57.240 |
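Using the two figures just quoted, the 29% works out directly (treating the $1.582 trillion discretionary request as the denominator, as the conversation does):

$$\frac{\$455\ \text{billion of interest}}{\$1{,}582\ \text{billion of discretionary budget}} \approx 0.288 \approx 29\%$$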
So you have these two worlds coming together, right, Lex? If you have us hurtling forward 01:45:05.080 |
to being able to generate our own energy and the economic peril that comes with trying to underwrite 01:45:11.640 |
several trillion dollars for war, which we can't afford to pay when rates are at 5%, 01:45:16.920 |
means that despite all the bluster, the probabilistic distribution of us engaging 01:45:23.480 |
in war with Russia and Ukraine seems relatively low. The override would obviously be a moral 01:45:31.880 |
reason to do it. That may or may not come if there is some nuclear proliferation. 01:45:38.840 |
But now you have to steel man the other side of the equation, which is, well, what were to happen 01:45:44.200 |
if you were sitting there and you were Putin? Let's steel man setting off a tactical nuke 01:45:48.920 |
someplace. OK, I'm getting calls every other day from my two largest energy buyers, India and China, 01:45:57.720 |
telling me, slow my roll. I have the entire world looking to find the final excuse to turn me off 01:46:07.080 |
and unplug me from the entire world economy. The only morally reprehensible thing that's left in 01:46:13.640 |
my arsenal that could do all of these things together would be to set off a tactical nuke. 01:46:18.280 |
I would be the only person since World War II to have done that. 01:46:21.640 |
I mean, it seems like it's a really, really, really big step to take. And so I think that 01:46:33.000 |
setting aside the clamoring for war that the military industrial complex wants us to buy into, 01:46:39.080 |
the financial requirements to do it and the natural resources needed to do it are making it very 01:46:47.480 |
unlikely. That is not just true for us. I think it's also true for Europe. I think the European 01:46:54.440 |
economy is going to roll over. I think it's going-- I see a very hard landing for them, 01:47:00.760 |
which means that if the economy slows down, there's going to be less need for energy. 01:47:05.160 |
And so it starts to become a thing where a negotiated settlement is actually the win-win 01:47:12.280 |
for everybody. But none of this would be possible without zero interest rates. In a world of zero 01:47:19.480 |
interest rates, we would be in war. >> So you believe in the financial forces 01:47:26.360 |
and pressures overpowering the human ones? >> I believe in the invisible hand. I really 01:47:31.320 |
do believe in the invisible hand. >> Even in international war? 01:47:34.200 |
>> More so there. I think the invisible hand-- and by the invisible hand for the audience, 01:47:39.320 |
I think really what it means is the financial complex and really the central bank complex and 01:47:46.120 |
the interplay between fiscal and monetary policy is a very convoluted and complicated set of things. 01:47:53.240 |
But if we had zero interest rates, we would be probably in the middle of it now. 01:48:01.000 |
>> See, there's a complexity to this game at the international level where some nations 01:48:09.000 |
are authoritarian and there's significant corruption. And so that adds from a game 01:48:17.720 |
theoretic optimal perspective, the invisible hand is operating in the mud. 01:48:23.720 |
>> Preventing war. The person that is the most important figure in the world right now is Jerome 01:48:30.280 |
Powell. He is probably doing more to prevent war than anybody else. He keeps ratcheting rates. 01:48:35.400 |
It's just impossible. It's a mathematical impossibility for the United States unless 01:48:39.640 |
there is such a cataclysmic moral transgression by Russia. So there is tail risk that it is possible 01:48:44.840 |
where we say, "Forget it. All bets are off. We're going back to zero rates. Issue a 100-year bond. 01:48:50.920 |
We're going to finance a war machine." There is a small risk of that. But I think the propensity of 01:48:56.120 |
the majority of outcomes is more of a negotiated settlement. 01:48:58.680 |
>> So what about, I mean, what's the motivation of Putin to invade Ukraine in the first place? 01:49:04.760 |
Financial forces are the most powerful forces. Why did it happen? Because it seems like there's 01:49:15.720 |
other forces at play of maintaining superpower status on the world stage. It seems like geopolitics 01:49:25.080 |
doesn't happen just with the invisible hand in consideration. 01:49:29.560 |
>> I agree with that. I can't begin to know, to be honest. I don't know. But he did it. 01:49:35.560 |
And I think it's easier for me to guess the outcome from here. It would have been impossible 01:49:42.760 |
for me to really understand what it is that got him to this place. But it seems like there's an end 01:49:49.000 |
game here, and there's not much playability. >> Yeah. I feel like I'm on unsteady ground 01:49:57.000 |
because there's been so many experts at every stage of this that have been wrong. 01:50:01.720 |
>> Well, there are no experts. >> Well, on this... 01:50:04.440 |
>> There are no experts, Lex. >> I understand this. Well, 01:50:10.440 |
let's dig into that because we just said Phil Hellmuth is the greatest poker player of all time. 01:50:18.760 |
>> He doesn't... They would be mistaken... >> Phil, Ivey is an expert at poker. 01:50:22.840 |
>> Phil has an opinion. Ivey has an opinion as well on how to play all these games, 01:50:26.600 |
meaning an opinion means here's the lines I take, here are the decisions I make. I live and die by 01:50:32.200 |
those, and if I'm right, I win. If I'm wrong, I lose. I've made more mistakes than my opponent. 01:50:37.400 |
>> I thought you said there's an optimal. So aren't there people that have a deeper 01:50:43.640 |
understanding, a higher likelihood of being able to describe and know the optimal set of actions 01:50:50.520 |
here at every layer? >> Well, there may be a theoretically 01:50:55.800 |
set of optimal decisions, but you can't play your life against a computer, meaning the minute that 01:51:04.200 |
you face an opponent and that person takes you off that optimal path, you have to adjust. 01:51:08.840 |
>> Yeah. Like what happens if a tactical nuke... >> It would be really bad. 01:51:17.400 |
I think the world is resilient enough. I think the Ukrainians are resilient enough to overcome it. 01:51:22.120 |
It would be really bad. It's just an incredibly sad moment in human history. 01:51:26.040 |
>> But do you wonder what the US does? Is there any understanding? Do you think people inside 01:51:26.040 |
the United States understand? Not the regular citizens, but people in the military. Do you 01:51:37.880 |
think Joe Biden understands? Do you think... >> I think Joe Biden does understand. I think that... 01:51:41.800 |
>> You think they have a clear plan? >> I think that there are few reasons to let 01:51:47.960 |
the gerontocracy rule, but this is one of the reasons where I think they are more adept than 01:51:52.280 |
other people. Folks that were around during the Bay of Pigs, folks that hopefully have studied 01:51:58.680 |
that and studied nuclear de-escalation will have a better playbook than I do. My suspicion 01:52:07.640 |
is that there is an "in case of emergency, break glass" plan. I think before military intervention or 01:52:15.960 |
anything else, I think that there are an enormous number of financial sanctions 01:52:23.240 |
that you can do to just completely cripple Russia that they haven't undertaken yet. 01:52:29.640 |
If you couple that with an economic system in Europe that is less and less in need of energy, 01:52:37.400 |
because it is going into a recession, it makes it easier for them to be able to walk away, 01:52:44.280 |
while the US ships a bunch of LNG over there. I don't know the game theory on all of this, but... 01:52:50.680 |
>> Does it make you nervous that... Or we're just being temperamental? It feels like the world hangs 01:52:58.760 |
in a balance. It feels like, at least from my naive perspective, I thought we were getting to 01:53:09.160 |
a place where surely human civilization can't destroy itself. And here's a presentation of 01:53:13.960 |
what looks like a hot war where, with multiple parties involved, escalation towards a world war is not 01:53:21.320 |
entirely out of the realm of possibility. >> It's not. I would really, really hope that 01:53:28.120 |
he is spending time with his two young twins. >> Well, this is part of what... 01:53:37.880 |
>> I really hope he's spending time with his kids. >> Agreed, but not kids, not just kids, 01:53:44.680 |
but friends. >> I'm not sure. He may not have friends, but it's very hard for anybody 01:53:51.000 |
to look at their kids and not think about protecting the future. >> 01:53:55.120 |
Well, there's partially because of the pandemic, but partially because of the nature of power, 01:54:02.600 |
it feels like you're surrounded by people you can't trust more and more. I do think the pandemic 01:54:07.880 |
had an effect on that too, the isolating effect. A lot of people were not their best selves during 01:54:13.880 |
the pandemic. From a super heavy topic, let me go back to the space where you're one of the most 01:54:19.880 |
successful people in the world. How to build companies, how to find good companies, what it 01:54:26.280 |
takes to find good companies, what it takes to build good companies, what advice do you have 01:54:31.560 |
for someone who wants to build the next super successful startup in the tech space and have a 01:54:37.880 |
chance to be impactful like Facebook, Apple? >> I think that's the key word. If your 01:54:44.280 |
precondition is to start something successful, you've already failed because now you're playing 01:54:48.760 |
somebody else's game. What success means is not clear. You're walking into the woods. It's murky, 01:54:55.080 |
it's dark, it's wet, it's raining. There's all these animals about. There's no comfort there. 01:55:01.480 |
So you better really like hiking. And there's no short way to shortcut that. 01:55:09.000 |
>> Isn't it obvious what success is? Like success is scale, so it's not what are the... 01:55:15.240 |
>> No. I think that there's a very brittle basic definition of success that's outside in, 01:55:21.400 |
but that's not what it is. I know people that are much, much, much richer than I am, 01:55:32.360 |
and they are just so completely broken. And I think to myself, the only difference between you 01:55:42.680 |
and me is outsider's perception of your wealth versus mine. But the happiness and the joy that 01:55:50.840 |
I have in the simple basic routines of my life give me enormous joy. And so I feel successful, 01:56:00.200 |
no matter what anybody says about my success or lack of success. There are people that live normal 01:56:07.720 |
lives, that have good jobs, that have good families. I have this idyllic sense. I see it on 01:56:15.400 |
TikTok all the time, so I know it exists. These neighborhoods where there's a cul-de-sac and these 01:56:20.600 |
beautiful homes and these kids are biking around. And every time I see that, Lex, I immediately 01:56:26.600 |
flash back to what I didn't have. And I think that's success. Look at how happy those kids are. 01:56:34.360 |
So no, there is no one definition. And so if people are starting out to try to make 01:56:39.320 |
a million dollars, a hundred million dollars, a billion dollars, you're going to fail. 01:56:43.000 |
There's a definition of personal success, and that's different from person to person, 01:56:49.000 |
but is there also some level of 01:56:54.120 |
the responsibility you have if there's a mission to have a positive impact on the world. 01:57:01.400 |
So I'm not sure that Elon is happy. >> No. In fact, I think if you focus on 01:57:06.600 |
trying to have an impact on the world, I think you're going to end up deeply unhappy. 01:57:09.800 |
>> But does that matter? Like why does your own personal happiness matter? 01:57:14.440 |
>> It may happen as a byproduct, but I think that you should strive to find your own personal 01:57:20.280 |
happiness and then measure how that manifests as it relates to society and to other people. 01:57:26.760 |
But if the answer to those questions is zero, that doesn't make you less of a person. 01:57:31.560 |
>> No, a hundred percent. But then the other way, are there times when you need to sacrifice your 01:57:36.360 |
own personal happiness for a bigger thing that you've created? 01:57:41.640 |
>> Yeah. If you're in a position to do it, I think some folks are tested. Elon is probably 01:57:46.920 |
the best example. And it must be really, really hard to be him. Really hard. I have 01:57:57.160 |
enormous levels of empathy and care for him. I really love him as a person, because I just see 01:58:07.320 |
that it's not that fun. And he has these ways of being human that in his position, 01:58:16.040 |
I just think are so dear that I just hope he never loses them. Just a simple example, 01:58:21.800 |
two days ago, I don't know why, but I went on Twitter and I saw the perfume thing. So I'm like, 01:58:29.000 |
"Ah, fuck it. I'm just going to go buy some perfume." So I bought his perfume, the burnt 01:58:32.760 |
hair thing. And I emailed him the receipt and I'm like, "All right, you got me for a bottle." 01:58:37.640 |
And he responded in eight seconds and it was just a smiley face or whatever. 01:58:43.160 |
Just deeply normal things that you do amongst people that are just... So nobody sees that. 01:58:48.600 |
You know what I mean? But it would be... He deserves for that stuff to be seen because 01:58:53.960 |
the rest of his life is so brutally hard. He's just a normal guy that is just caught in this 01:58:59.880 |
ultra mega vortex. >> Why do you think there's so few Elons? 01:59:06.280 |
>> It's an extremely lonely set of trade-offs. Because to your point, if you get tested... 01:59:15.000 |
So if you think about it again, probabilistically, there's 8 billion people in the world, 01:59:20.680 |
maybe 50 of them get put in a position where they are building something of such 01:59:25.160 |
colossal importance that they even have this choice. And then of that 50, maybe 10 of them 01:59:32.120 |
are put in a moment where they actually have to make a trade-off. You're not going to be able to 01:59:37.480 |
see your family. I'm making this up. You're not going to be able to see your family. You're going 01:59:41.160 |
to have to basically move into your factory. You're going to have to sleep on the floor. 01:59:44.520 |
But here's the outcome, energy independence and resource abundance and a massive peace dividend. 01:59:51.880 |
And then he says to himself, I don't know that he did because I've never had this... 01:59:57.160 |
Yeah, you know what, that's worth it. And then you look at your kids and you're like, 02:00:01.720 |
I'm making this decision. I don't know how to explain that to you. 02:00:07.880 |
There's no amount of money where I would want to be in that position. So that takes an enormous 02:00:12.200 |
fortitude and a moral compass that he has. And that's what I think people need to appreciate 02:00:18.040 |
about that guy. >> It's also on the first number you said, it's confusing that there's 50 people 02:00:23.560 |
or 10 people that are put in the position to have that level of impact. It's unclear that that has 02:00:30.200 |
to be that way. It seems like there could be much more. >> There should be. There's definitely people 02:00:35.160 |
with the potential. But think about his journey. His mom had to leave a very complicated environment, 02:00:43.480 |
move to Canada, move to Toronto, a small apartment just north of Bay and Bloor, if you've ever been 02:00:50.440 |
to Toronto. I remember talking to her about this apartment. It's so crazy because I used to live 02:00:54.760 |
like around the corner from that place. And she raised these three kids and just had to... So how many 02:01:00.600 |
people are going to start with those boundary conditions and really grind it out? It's just 02:01:07.880 |
very few people in the end that will have the resiliency to stick it through where you don't 02:01:15.480 |
give into the self-doubt. And so it's just a really hard set of boundary conditions where you 02:01:23.160 |
can have 50 or 100 of these people. That's why they need to be really appreciated. >> Yeah. Well, 02:01:29.320 |
that's true for all humans that follow the thread of their passion and do something beautiful in 02:01:38.360 |
this world. That could be on a small scale or a big scale. Appreciation is a gift you give to 02:01:45.160 |
the other person, but also a gift to yourself. Somehow it becomes like this contagious thing. 02:01:49.800 |
>> I went to this. You are so right. My brain just lit up because yesterday I went to 02:01:56.200 |
an investor day of a friend of mine, Brad Gerstner. And in one very reductive view of the world, 02:02:05.640 |
Brad and I are theoretically competitors, but we're not. He makes his own set of decisions. 02:02:10.680 |
I make my own set of decisions. We're both trying to do our own view of what is good work in the 02:02:15.240 |
world, but he's been profoundly successful. And it was really the first moment of my adult life 02:02:23.800 |
where I could sit in a moment like that and really be appreciative of his success and not feel less 02:02:29.240 |
than. And so, a little selfishly for me, but mostly for him as well, I was so proud to be in the room. 02:02:36.920 |
That's my friend. That guy plays poker with me every Thursday. He is crushing it. It's awesome. 02:02:41.880 |
And it's a really amazing feeling. >> I mean, to linger on the trade-offs, 02:02:51.480 |
the complicated trade-offs with all of this, what's your take on work-life balance 02:02:55.880 |
in a company that's trying to do big things? >> I think that you have to have 02:03:06.360 |
some very, very strict boundaries, but otherwise I think balance is kind of dumb. 02:03:13.320 |
It will make you limited. I think you need to immerse yourself in the problem, 02:03:18.760 |
but you need to define that immersion with boundaries. So, if you ask me, "What does my 02:03:25.240 |
process look like?" It's monotonous and regimented, but it's all the time, except when it's not. 02:03:33.400 |
And that's also monotonous and regimented. And I think that makes me very good at my craft 02:03:41.000 |
because it gives me what I need to stay connected to the problem without feeling resentful about the 02:03:49.320 |
problem. >> Which part? The monotonous, all-in nature of it? Or when you say hard boundaries, 02:03:57.640 |
essentially go all out until you stop, and you don't stop often? >> I'm in a little bit of a 02:04:05.240 |
quandary right now because I'm trying to redefine my goals. And you're catching me in a moment where 02:04:12.440 |
I have, even in these last few years of evolution, I think I've made some good progress, 02:04:19.800 |
but in one very specific way, I'm still very reptilian. And I'm trying to let go, 02:04:27.240 |
>> Which was that exactly? If you can-- >> In my business, it really gets reduced to 02:04:32.600 |
what is your annual rate of compounding? That's my demarcation. Steph Curry and LeBron James, 02:04:38.280 |
Michael Jordan, it's how many points did you average, not just in a season, but over your 02:04:44.040 |
career? And in their case, to really be the greatest of all time, it's points, rebounds, 02:04:50.520 |
assists, steals. There's all kinds of measures to be in that pantheon of being really, really 02:04:58.280 |
good at your craft. And in my business, it's very reductive. It's how well have you compounded? 02:05:05.560 |
And if you look at all the heroes that I have put on a pedestal in my mind, 02:05:15.640 |
they've compounded at above 30% for a very long time, as have I. 02:05:22.760 |
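A quick illustrative sketch of what that compounding benchmark implies; the starting capital and horizons below are assumptions for illustration only, not figures from the conversation:

```python
# Back-of-envelope: what compounding above 30% a year implies over long horizons.
# The starting capital and horizons are illustrative assumptions, not his figures.

def compound(principal: float, annual_rate: float, years: int) -> float:
    """Future value of principal growing at annual_rate for a number of years."""
    return principal * (1 + annual_rate) ** years

start = 1_000_000  # assume $1M of starting capital
for years in (10, 20):
    value = compound(start, 0.30, years)
    print(f"30% for {years} years: ${value:,.0f} ({value / start:.1f}x)")

# 30% for 10 years is roughly a 13.8x multiple; for 20 years, roughly 190x,
# which is why sustaining a 30%+ long-run compounding rate is such a rare benchmark.
```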
But now I feel like I really need to let go because I think I know how to do the basics of my job. 02:05:31.000 |
And if I had to summarize an investing challenge or investing, I think really it's when you first 02:05:38.280 |
start out investing, you're a momentum person. You saw it in GameStop, just a bunch of people 02:05:43.400 |
aping each other. And then it goes from momentum to you start to think about cash flows, how much 02:05:51.080 |
profit is this person going to make, whatever. So that's like the evolution. This is the basic 02:05:56.840 |
thing to this is a reasonably sophisticated way. Then a much smaller group of people think about 02:06:03.240 |
it in terms of macro geopolitics. But then a very finite few crack this special code, 02:06:09.400 |
which is there's a philosophy and it's the philosophy that creates the system. 02:06:13.560 |
And I'm scratching at that furiously, but I cannot break through and I haven't broken through. 02:06:20.520 |
And I know that in order to break through, I got to let go. So this is the journey that I'm in 02:06:25.880 |
as in my professional life. So it is an all consuming thing, but I'm always home for dinner. 02:06:34.200 |
We have very prescribed moments where we take vacation, the weekends. I can tell you about my 02:06:39.640 |
week if you're curious, but it's like- I would love to know your week, since it's regimented 02:06:44.120 |
and monotonous. I wake up at 6:45, get the kids, go downstairs. We all have some form of 02:06:57.720 |
not super healthy breakfast. I make a latte. I've gotten into it, and for the latte, I have a machine. 02:07:04.200 |
I measure the beans. I make sure that the timer is such where I have to pull it for a certain 02:07:09.640 |
specific ratio. Just so you know, 20 grams, I got to pull 30 grams with the water and I got to do it 02:07:15.480 |
in 30 seconds, et cetera. So you're a coffee snob. 02:07:19.240 |
It helps me stay in rhythm. Sure. Before I used to have another machine, I just pushed a button. 02:07:25.640 |
But then I would push the button religiously in the exact same way. You know what I mean? 02:07:29.320 |
Can I say actually on that topic, the morning with kids can be a pretty stressful thing. 02:07:38.040 |
Are you able to find sort of happiness? Is that morning also a source of happiness? 02:07:44.040 |
It's great. My kids are lovely. They're maniacs. 02:07:51.640 |
I just see, and maybe I've never asked Friedberg this, but I'll just put it in my words. I see all of 02:07:59.080 |
the things in moments where there was no compassion given to me. And so I just give them a ton of 02:08:08.440 |
love and compassion. I have an infinite patience for my children, not for other kids. 02:08:13.640 |
Yes, of course. But for kids. So anyway, so we have a breakfast thing. 02:08:20.520 |
And then I go upstairs and I change and I work out from eight to nine. 02:08:26.520 |
And that's like the first 15 minutes I walk up on a steep incline, 12 to 14%, 02:08:34.360 |
three and a half to four miles per hour walk. And then Monday's a push day, Tuesday's front 02:08:44.120 |
of the legs, Wednesday's pull, Thursday's back of the legs, eight to nine. Monday I always start, 02:08:52.360 |
I talk to my therapist from nine to 10. So as soon as I finished working out, I get on the phone 02:08:56.920 |
and I talk to him. And it helps me lock in for the week. And I'm just talking about the past. 02:09:10.520 |
Usually, sometimes the recent past, but usually it's about the past past. Something that I 02:09:16.280 |
remember when I was a kid. Because that's the work about just loosening those knots. 02:09:21.400 |
So I put in that hour of work, respect that hour. Then I'm in the office. And then it's like, 02:09:30.280 |
I go until 12:15, 12:30, go home, have lunch, like a proper lunch: go home, sit down, have lunch with 02:09:39.640 |
Nat, talk, she leaves her work and we talk, how are we doing? Just check in. Our youngest daughter 02:09:45.960 |
will be there 'cause she's one and she's making a mess. And then I'll have another coffee. That's 02:09:53.240 |
it, my limit for the day. Oh, no more caffeine. That's it. And then I go back to the office and 02:09:59.320 |
I'll be there till six, seven sometimes. And I do that Monday, Tuesday, Wednesday, Thursday. 02:10:04.920 |
Monday, Tuesday, Thursday, Friday, I'm allowed to have meetings. Wednesday, nothing, it's all 02:10:09.880 |
reading, must be, unless it's a complete emergency. It has to be kind of a full reading. And reading 02:10:16.520 |
is a bunch of blogs, YouTube videos. So no, try not to do any talking. 02:10:22.520 |
No talking. It's like being in silence, being present, thinking about things. 02:10:27.400 |
By the way, how do you take notes? Do you have a- 02:10:29.720 |
I sketch. I have a pad and I write stuff down. Sometimes I go to my phone. I'm a little all 02:10:35.160 |
over the place. Sometimes I do Google Docs. I don't have a, this is one thing I need to 02:10:40.040 |
get better at actually. But typically what happens is I actually do a lot of thinking in my mind 02:10:45.880 |
and I'm sort of filing a lot of stuff away. And then it all spills out and then I have to write. 02:10:51.880 |
And then that gives me a body of work that I can evaluate and think about. And then I usually put 02:10:57.560 |
it away. And a lot of the time it goes nowhere, but every now and then I come back to it and it 02:11:03.560 |
just unlocks two or three things. And I have a sense of how else I'm thinking about things. 02:11:07.240 |
And then Friday at the end of the day, Nat and I talk to a couples therapist 02:11:11.800 |
and that's about checking out properly. So it's like, okay, now it's like focusing. 02:11:19.400 |
The weekend is family, being present, being aware. And if there's email, obviously, 02:11:27.560 |
if I have to do meetings from time to time, no problem, but there's boundaries. 02:11:32.600 |
Checking out properly. Oh man, that is so powerful. Just like officially transitioning. 02:11:40.680 |
Yeah. So these are really important boundaries so that I can be immersed. And what that means 02:11:49.080 |
is like, look, on a Saturday afternoon, on a random day, she'll be like, "Where's Chamath?" 02:11:53.960 |
And I'll be up in my room. And I've found a podcast talking about like, 02:11:59.000 |
DCIS, which is like ductal carcinoma in situ, because I've been fascinated about breast cancer 02:12:04.760 |
surgeries for a while and learning about that. And she's like, "What are you doing?" I'm like, 02:12:09.400 |
"Listening to a podcast about DSIS." And she's like, "What's that?" And I'm like, 02:12:13.720 |
"Ductal cancer in situ." She's like, "Okay." And so I have time to continue to just constantly 02:12:21.320 |
learning, learning, putting stuff in my memory banks to organize into something. And that's a 02:12:28.040 |
week. But then in these fixed moments of time, phone down, everything down, we go on vacation, 02:12:34.520 |
we go on a boat, we go to whatever, where it's just us and the kids. 02:12:39.960 |
Is there a structure when you're at work? Is there a structure to your day in terms of meetings, 02:12:44.360 |
in terms of, outside of Wednesday? Because you're... 02:12:47.880 |
Have to keep meetings to less than 30 minutes. Have to. And oftentimes meetings can be as short 02:12:54.680 |
as like 10 or 15 minutes, because then I'm just like, "Okay." Because I'm trying to reinforce 02:12:59.800 |
that it's very rare that we all have something really important to say. And so the ritual that 02:13:08.040 |
becomes really valuable to get scale is not the ritual of meetings, but the ritual of respecting 02:13:13.880 |
the collective time of the unit. And so it's like, "You know what, folks? I'm going to assume that 02:13:19.640 |
you guys are also tackling really important projects. You also want to have good boundaries 02:13:24.200 |
in this immersion. Go back to your kids and have dinner with them every night. It's not just for 02:13:28.440 |
me, it's for you. So how about this? Why don't you go and do your work? This meeting didn't need to 02:13:32.520 |
be 30 minutes, it could be five. And the rest of the time is yours." And it's weird because when 02:13:38.120 |
people join that system at Social Capital, they just... It's like face time and it's like, "Let me 02:13:43.800 |
make sure and let me talk a lot." It's like, "I don't say anything." I respect the person that 02:13:48.760 |
says nothing for two years and the first thing that they say is not obvious. That person is 02:13:53.000 |
immensely more valuable than the person that tries to talk all the time. What have you learned from 02:13:57.320 |
your... So after Facebook, you started Social Capital or what is now called Social Capital. 02:14:02.280 |
What have you learned from all the successful investing you've done there? About investing 02:14:10.280 |
or about life or about running a team? I'm very loath to give advice because I think 02:14:16.520 |
so much of it is situational. But my observation is that starting a business is really hard, 02:14:21.960 |
any kind of business. And most people don't know what they're doing. And as a result, we make 02:14:28.360 |
enormous mistakes. But I would summarize this and this may be a little heterodoxical. I think 02:14:35.000 |
there are only three kinds of mistakes. Because if we go back to what we said before, in the business, 02:14:39.240 |
it's just learning. You're exploring the dark space to get to the answer faster than other people. 02:14:45.960 |
And the mistakes that you make are of three kinds, or the three kinds of decisions, let's say. 02:14:52.200 |
You'll hire somebody and they're really, really, really average, but they're a really good person. 02:15:01.720 |
Oh, yeah. You'll hire somebody and they really weren't candid with who they are. 02:15:09.480 |
And their real personality and their morality and their ethics only expose them 02:15:15.640 |
over a long period of time. And then you hire somebody and they're not that good 02:15:24.120 |
morally, but they're highly performant. What do you do with those three things? 02:15:31.480 |
And I think successful companies have figured out how to answer those three things because those 02:15:41.080 |
are the things that, in my opinion, determine success and failure. 02:15:45.880 |
So it's basically hiring and you just identified three failure cases for hiring. 02:15:50.760 |
But very different failure cases and very complicated ones, right? Like the highly 02:15:54.280 |
performant person who's not that great as a human being, do you keep them around? Well, 02:16:00.840 |
a lot of people would err towards keeping that person around. What is the right answer? I don't 02:16:04.680 |
know. It's the context of the situation. And the second one is also very tricky. What about if they 02:16:11.480 |
really turned out that they were just not candid with who they are and it took you a long time to 02:16:14.840 |
figure out who they were? These are all mistakes of the senior person that's running this organization. 02:16:18.920 |
I think if you can learn to manage those situations well, those are the real edge 02:16:26.520 |
cases where you can make mistakes that are fatal to a company. That's what I've learned over 11 02:16:31.800 |
and a half years, honestly. Otherwise, the business of investing, I feel that it's a secret. 02:16:39.720 |
And if you are willing to just keep chipping away, you'll peel back enough of these layers. They will come 02:16:47.960 |
off and you'll see it. The scales will come off and you'll eventually see it. 02:16:51.560 |
I really struggle with, maybe you can be my therapist for a little bit, 02:16:55.480 |
with that first case which you originally mentioned. Because I love people, I see the 02:17:00.040 |
good in people. I really struggle with just a mediocre performing person who's a good human 02:17:07.240 |
being. That's a tough one. I'll let you off the hook. I think that those are incredibly 02:17:13.800 |
important and useful people. I think that if a company is like a body, they are like cartilage. 02:17:20.520 |
Can you replace cartilage? Yeah. But would you if you didn't have to? No. Okay, can I play devil's 02:17:27.880 |
advocate? Yeah. So those folks, because of their goodness, make it okay to be mediocre. 02:17:36.760 |
They create a culture where, well, what's important in life, which is something I agree with in my personal 02:17:45.880 |
life, is to be good to each other, to be friendly, to be good vibes, all that kind of stuff. When I 02:17:51.960 |
was at Google, just the good atmosphere, everyone's playing, it's fun. Fun, right? But to me, 02:18:00.520 |
when I put on my hat of having a mission and a goal, what I love to see is the superstars that 02:18:10.840 |
shine in some way, do something incredible. I want everyone to also admire those superstars, 02:18:22.840 |
perhaps not just for the productivity's sake or performing or successful company's sake, 02:18:28.040 |
but because that too is an incredible thing that humans are able to accomplish, which is shine. 02:18:34.440 |
I hear you, but that's not a decision you make, meaning you get lucky when you have those people 02:18:39.320 |
in your company. That's not the hard part for you. The hard part is figuring out what to do with one, 02:18:44.840 |
two, and three. Keep, demote, promote, fire. What do you do? This is why it's all about those three 02:18:52.040 |
buckets. I personally believe that folks in that bucket one, as long as those folks aren't more 02:18:57.960 |
than 50% to 60% of a company, are good. They can be managed as long as they are one to two degrees 02:19:05.400 |
away from one of those people that you just mentioned. Because it's easy then to drag the 02:19:11.000 |
entire company down if they're too far away from the LeBron James, because you don't know what 02:19:16.280 |
LeBron James looks and feels and smells like. You need that tactile sense of what excellence looks like 02:19:22.920 |
in front of you. A great example is if you just go on YouTube and you search these clips of how 02:19:27.960 |
Kobe Bryant's teammates described, not Kobe, but their own behavior, not performance, 02:19:36.520 |
because there were a bunch of average people that Kobe played with his whole career, 02:19:39.400 |
and how their behavior changed by being somewhat closer to him. I think that's an important 02:19:47.400 |
psychological thing to note for how you can do reasonably good team construction. If you're 02:19:53.240 |
lucky enough to find those generational talents, you have to find a composition of a team that 02:19:58.360 |
keeps them roughly close to enough of the org. That way, that group of people can continue to 02:20:06.520 |
add value, and then you'll have courage to fire these next two groups of people. I think the 02:20:11.800 |
answer is to fire those two groups of people. Because no matter how good you are, that stuff 02:20:16.440 |
just injects poison into a living organism, and that living organism will die when exposed to 02:20:22.520 |
poison. - So you invest in a lot of companies, you've looked at a lot of companies. What do you 02:20:28.040 |
think makes for a good leader? So we talked about building a team, but a good leader for a company, 02:20:34.040 |
what are the qualities? - When I first meet people, I never ask to see a resume. 02:20:41.320 |
And when I'm meeting a company CEO for the first time, I couldn't care less about the business, 02:20:49.240 |
in fact. And I try to take the time to let them reveal themselves. Now in this environment, 02:20:57.560 |
I'm doing most of the talking. But if this were the other way around, and you were ever raising 02:21:01.560 |
capital, and you said, "Smath, I'd be interested in you looking at this business," I'd probably say 02:21:05.800 |
eight to 10 words for hours. - You just listen. - Prod. I throw things out, prod, and I let you 02:21:12.600 |
meander. And in you meandering, I'm trying to build a sense of who this person is. Once I have 02:21:19.560 |
a rough sense of that, which is not necessarily right, but it's a starting point, then I can go 02:21:26.680 |
and understand why this idea makes sense in this moment. And what I'm really trying to do is just 02:21:33.560 |
kind of like unpack where are the biases that may make you fail. And then we go back to you. 02:21:41.320 |
- The thing that Silicon Valley has the benefit of though, is that they don't have to do any of 02:21:47.240 |
this stuff if there's momentum. Because then the rule book goes out the window and people 02:21:53.320 |
clamor to invest. So one of the things that I do, and this is again, back to this pugilism that I 02:22:00.120 |
inflict on myself, is I have these two things that I look at. Thing number one is I have a table 02:22:07.480 |
that says how much did we make from all of our best investments? How much did we lose from all 02:22:13.160 |
of our worst investments? What is the ratio of winners to losers over 11 years? And in our case, 02:22:19.560 |
it's 23 to one on billions of dollars. So you can kind of like, you can see a lot of signal. 02:22:26.200 |
But what that allows me to do is really like say, wait a minute, we cannot violate these rules 02:22:36.520 |
around how much money we're willing to commit in an errant personality. The second is I ask myself 02:22:43.960 |
of all the other top VCs in Silicon Valley, name them all, what's our correlation? Meaning 02:22:50.760 |
when I do a deal, how often does anybody from Sequoia, Accel, Benchmark, Kleiner, you name it, 02:22:57.000 |
do it at the same time or after and vice versa. And then I look at the data to see how much they 02:23:04.680 |
do it amongst themselves. >> What's a good sign? 02:23:07.800 |
>> I'm at zero, virtually as close to zero as possible. 02:23:11.000 |
>> And that's a good thing. >> Well, it's not a good thing when 02:23:14.680 |
the markets are way, way up because it creates an enormous amount of momentum. So I have to make 02:23:21.800 |
money the hard way. I have to, you know, cause I'm trafficking in things that are highly uncorrelated 02:23:26.680 |
to the gestalt of Silicon Valley, which can be a lonely business, 02:23:33.240 |
but it's really valuable in moments where markets get crushed because correlation is the first thing 02:23:38.680 |
that causes massive destruction of capital, massive. Because one person all of a sudden 02:23:44.760 |
with one blow up in one company, boom, the contagion hits everybody except the person that 02:23:50.360 |
was not. And so now those are like more sophisticated elements of risk management, 02:23:56.600 |
which is, again, this pugilism that I inflict on myself; nobody asks me to do that. Nobody actually, 02:24:01.880 |
at some level, when the markets are up, really cares. But when markets are sideways or when markets 02:24:06.200 |
are down, I think that that allows me to feel proud of our process. >> But that requires you to be 02:24:14.760 |
outside of the box. It's lonely because you're taking risks. Also, you're a public personality. So 02:24:20.520 |
you say stuff that if it's wrong, you get yelled at for being, I mean, your mistakes aren't private. 02:24:29.000 |
>> No. And that's something that has been a really, really healthy moment of growth. 02:24:35.800 |
It's like an athlete. If you really want to be a winner, you got to hit the shot 02:24:42.360 |
in front of the fans. And if you miss it, you have to be willing to take the responsibility 02:24:49.400 |
of the fact that you bricked it. And over time, hopefully there's a body of work that says you've 02:24:55.640 |
generally hit more than you've missed. But if you look at even the best shooters, what are they? 52%. 02:25:01.000 |
So these are razor thin margins at the end of the day, which is really, so then what can you 02:25:07.400 |
control? I can't control the defense. I can't control what they throw at me. I can just control 02:25:13.560 |
my preparation and whether I'm in the best position to launch a reasonable shot. >> 02:25:18.720 |
You said in the past that the world's first trillionaire will be somebody in climate change. 02:25:24.520 |
Let's update that. What's today, as we stand here today, what sector will the world's first 02:25:29.880 |
trillionaire come from? >> Yeah, I think it's energy transition. 02:25:32.760 |
>> So energy, so the things we've been talking about. 02:25:37.160 |
>> Well, I think the way that I think about- >> So this is a single individual, 02:25:40.600 |
sorry to interrupt. You see their ability to actually build a company that makes a huge 02:25:46.200 |
amount of money as opposed to this distributed idea that you've been talking about. 02:25:50.040 |
>> Yeah, I'll give you my philosophy on wealth. Most of it is not you. 02:25:56.840 |
An enormous amount of it is the genetic distribution of being born in the right 02:26:03.400 |
place and blah, blah, blah, irrespective of the boundary conditions of how you were born 02:26:07.400 |
or where you were raised, right? So at the end of the day, you and I ended up in the United States. 02:26:13.160 |
It's a huge benefit to us. Second is the benefit of our age. It's much better and much more likely 02:26:21.320 |
to be successful as a 46-year-old in 2023 than a 26-year-old in 2023. Because in my case, I have 02:26:29.320 |
demographics working for me. For the 26-year-old, he or she has demographics working slightly 02:26:35.560 |
against them. >> Can you explain that a little bit? 02:26:37.400 |
What are the demographics here? >> In the case of me, 02:26:42.760 |
the distribution of population in America looks like a pyramid. In that pyramid, I'm wedged in 02:26:51.160 |
between these two massive population cohorts, the boomers and then these Gen Z and millennials. 02:26:56.920 |
That's a very advantageous position. It's not dissimilar to the position that Buffett was in, 02:27:02.680 |
where he was packaged in between boomers beneath him and the silent generation above him. 02:27:09.800 |
Being in between two massive population cohorts turns out to be extremely advantageous because 02:27:14.840 |
when the cohort above you transitions power and capital and all of this stuff, 02:27:19.960 |
you're the next person that likely gets handed it. We have a disproportionate likelihood to be... 02:27:25.240 |
We are lucky to be older than younger. That's an advantage. Then the other advantage that has 02:27:32.760 |
nothing to do with me is that I stumbled into technology. I got a degree in electrical 02:27:37.560 |
engineering and I ended up coming to Silicon Valley. It turned out that in that moment, 02:27:42.280 |
it was such a transformational wind of change that was at my back. The wealth that one creates 02:27:50.280 |
is a huge part of those variables. Then the last variable is your direct contributions 02:27:59.400 |
in that moment. The reason why that can create extreme wealth is because when those things come 02:28:09.080 |
together at the right moment, it's like a chemical reaction. It's just crazy. 02:28:15.400 |
That was part number one of what I wanted to say. The second thing is when you look then inside of 02:28:23.560 |
these systems where you have all these tailwinds. In tech, I think I benefit from these three big 02:28:29.240 |
tailwinds. If you build a company or are part of a company or a part of a movement, 02:28:34.440 |
your economic participation tends to be a direct byproduct of the actual value 02:28:44.120 |
that that thing creates in the world. The thing that that creates in the world 02:28:52.360 |
will be bigger if it is not just an economic system, but it's like a philosophical system. 02:28:59.720 |
It changes the way that governance happens. It changes the way that people think about all kinds 02:29:04.680 |
of other things about their lives. There's a reason, I think, why database companies are worth 02:29:11.800 |
X, social companies are worth Y, but the military industrial complex is worth as much. I think there 02:29:19.640 |
is a reason why that if you, for example, were to go off and build some newfangled source of energy 02:29:26.360 |
that's clean and hyperabundant and safe, that what you're really going to displace or reshape 02:29:34.440 |
is trillions and trillions of dollars of worldwide GDP. The global GDP is, call it, $85 trillion. 02:29:41.240 |
It's growing at 2% to 3% a year. In the next 10 years, we'll be dealing with $100 trillion of GDP. 02:29:48.840 |
Somebody who develops clean energy in 2035 will probably shift 10% of that around, $10 trillion. 02:29:57.000 |
A company can easily capture 30% of a market, $3 trillion. A human being can typically own 02:30:07.960 |
a third of one of these companies, $1 trillion. You can get to this answer where it's like, 02:30:14.280 |
"It's going to happen in our lifetime." But you have to, I think, find these systems that are so 02:30:20.280 |
gargantuan and they exist today. In something new, it's more bounded because price discovery takes longer. 02:30:26.520 |
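A rough sketch, in Python, of the back-of-envelope chain he walks through above; every input is one of the approximate figures mentioned (about $85 trillion of GDP, 2-3% growth, a 10% shift, 30% market capture, a roughly one-third founder stake), treated purely as illustrative assumptions rather than forecasts:

```python
# Back-of-envelope for the "first trillionaire comes from the energy transition" arithmetic.
# All inputs are rough, illustrative figures from the conversation, not forecasts.

gdp_today = 85e12      # ~$85 trillion of global GDP today
growth = 0.025         # ~2-3% annual growth, taking the midpoint
years = 10

gdp_future = gdp_today * (1 + growth) ** years  # ~$109T, i.e. "$100 trillion of GDP"
displaced = 0.10 * gdp_future                   # clean energy reshapes ~10% of GDP
company_share = 0.30 * displaced                # one company captures ~30% of that
founder_stake = company_share / 3               # a founder owns roughly a third

print(f"GDP in {years} years: ~${gdp_future / 1e12:.0f}T")
print(f"Reshaped by clean energy: ~${displaced / 1e12:.1f}T")
print(f"One company's slice: ~${company_share / 1e12:.1f}T")
print(f"A founder's stake: ~${founder_stake / 1e12:.2f}T")  # on the order of $1 trillion
```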
In an existing thing, it's more unbounded because you know what it is. You know the tentacles that 02:30:30.920 |
energy reaches. Of that $80 trillion of worldwide GDP, I bet you if you added up all the energy 02:30:37.080 |
companies, but then you added up all of manufacturing, if you added up all of transport, 02:30:42.760 |
you'd probably get to like 60 of the 80. >> Do you have an idea of which 02:30:47.480 |
alternate energy, sustainable energy is the most promising? 02:30:54.280 |
>> Well, I think that we have to do a better job of exploring what I call the suburbs of 02:31:02.680 |
the periodic table. We're really good in Seattle, the upper Northwest. We're kind of good in Portland, 02:31:12.120 |
but we're non-existent in San Diego, and we have zero plan for North Carolina through Florida. 02:31:17.800 |
>> Yeah. Is that a fancy way of saying nuclear should be part of the discussion? 02:31:24.360 |
>> I think nuclear, I mean room temperature superconductors. I'm not convinced right now 02:31:24.360 |
that the existing set of nuclear solutions will do a good job of scaling beyond bench scale. 02:31:35.640 |
I think there is a lot of complicated technical problems that make it work at a bench scale level, 02:31:40.040 |
even partially, but the energy equation is going to be very difficult to overcome in the absence 02:31:45.560 |
of some leaps in material science. >> Have you seen any leaps? Is there 02:31:50.120 |
promising stuff? You're seeing the cutting edge from a company perspective. 02:31:53.720 |
>> Yeah. I would say not yet, but the precursor, yes. I have been spending a fair amount of time, 02:32:01.240 |
so, talking about a new framework that's in my mind, it's around these room temp superconductors. 02:32:09.240 |
And so I've been kind of bumbling around in that forest for about a year. I haven't really put 02:32:16.840 |
together any meaningful perspectives, but again, talking about trafficking in companies and 02:32:23.320 |
investments that are very lonely, but they allow me to generate returns that are relatively unique 02:32:29.560 |
and independent. That's an area where I don't see anybody else when I'm there. 02:32:33.480 |
I'll give you another area. I think, in a world of zero energy and 02:32:40.840 |
zero compute costs that we are about to unleash, computational biology will replace wet chemistry. And when you do that, 02:32:50.200 |
you will be able to iterate on tools that will be able to solve a lot of human disease. 02:32:57.480 |
I think if you look at the top 400 most recurring rare diseases, 02:33:03.320 |
I think half of them, 200, are a specific point mutation, just the mismethylation between C 02:33:11.640 |
and T. I mean, that's like, "Whoa, wait, you're telling me in billions of lines of code, 02:33:17.240 |
I forgot a semicolon right there? That's causing this whole thing to miscompile? So I just got to 02:33:21.960 |
go in there and boop, and it's all done?" That's a crazy idea. That was a C++/C throwback for people 02:33:28.040 |
that don't know what I said. - There's two people who are clapping. 02:33:29.880 |
- Two people there, everybody else is like, "What? This is not a pipe, what are you talking about?" 02:33:35.880 |
- So couldn't that truly be a source of a... if computational biology unlocks, 02:33:41.160 |
I mean, obviously medicine is begging for... - The thing with energy though is that the 02:33:46.680 |
groundwork is well laid. And, talking about sort of like the upper bound, there it is well defined; 02:33:53.320 |
the upper bound in medicine is not well defined because it is not the sum total of the market 02:33:59.400 |
cap of the pharma industries. It is actually the sum total of the value of human life. 02:34:03.400 |
And that's an extremely ethical and moral question. - Aren't there special interests that are 02:34:10.360 |
resisting moving, making progress on the energy side? So like governments and how do you break 02:34:18.600 |
through that? I mean, you have to acknowledge the reality of that, right? 02:34:20.840 |
- I think it's less governments. In fact, like I said, I think President Biden has done 02:34:24.600 |
a really incredible job. Well, Chuck Schumer really has done a really incredible job because... 02:34:27.960 |
So just to give you the math on this, right? Like back to this, so 3% of every market 02:34:34.760 |
is zealots. But when you get past 5%, things tend to just go nuclear to 50, 60%. The way that they 02:34:43.800 |
wrote this last bill, the cost, I'll just use the cars as an example. The cost of an average car 02:34:50.600 |
is 22 and a half thousand. The cost of the cheapest battery car is 30,000. And lo and behold, 02:34:57.000 |
there's a $7,500 credit. And it's like to think the invisible hand didn't know that that math was 02:35:02.920 |
right, I think is kind of a little bit malarkey. And so the battery EV car is going to be the same 02:35:08.680 |
price as the thing, and it's going to go to 40, 50%. So we're already at this tipping point. So 02:35:14.680 |
we're kind of ready to go. In these other markets, it's a little bit more complicated because there's 02:35:20.920 |
a lot of infrastructure that needs to get built. So the gene editing thing, as an example, 02:35:26.600 |
we have to build a tool chain that looks more like code that you can write to. 02:35:33.960 |
Facebook was written in, I think, PHP, originally. 02:35:37.320 |
Which I'm still a big fan of. Sometimes you have to use the ugly solution and make it look 02:35:44.440 |
good versus trying to come up with a good solution, which will be too late. Let me ask you, 02:35:51.720 |
you considered a run for governor of California, then decided against it. What went into each 02:35:57.480 |
of these decisions? And broadly, I just have maybe a selfish question about Silicon Valley. 02:36:03.640 |
Is it over as a world leader for new tech companies? As this beacon of promise of 02:36:14.360 |
young minds stepping in and creating something that changes the world? 02:36:19.720 |
I don't know if those two questions are connected. 02:36:21.480 |
So it's not over, but I think it's definitely, we're in a challenging moment because... 02:36:30.120 |
So back to that analogy of the demographics, if you think about the, like if you bucketed, 02:36:38.920 |
forget like our relative successes, but there's a bunch of us in this mid-50s to mid-30s cohort 02:36:47.640 |
of people that have now been around for 20 years, 15 years to 25 years that have done stuff, right? 02:36:54.040 |
From Andreessen to Zuck to Jack Dorsey, et cetera, Elon, whatever, maybe you throw me in the mix, 02:37:02.040 |
David Sacks, whatever, okay? None of us have done a really good job of becoming a statesman 02:37:10.920 |
or a stateswoman, and really showing a broad empathy and awareness for the broader systems. 02:37:19.480 |
So for Silicon Valley to survive as a system, we need to know that we've transitioned from move 02:37:26.120 |
fast and break things to get to the right answer and take your time if that's what it means. 02:37:30.840 |
And so we have to be a participant of the system. And I believe that. And I think that it's 02:37:36.840 |
important to not be a dilettante and not be thumbing your nose at Washington or not push 02:37:43.880 |
the boundaries and say, we'll deal with it after the fact, but to work with folks that are trying 02:37:49.400 |
to do the best, again, steelman their point of view. Work with them, potentially run for office. 02:37:56.520 |
So potentially understand the system. Be a part of their system. 02:38:01.560 |
It makes me sad that there's no tech people or not many tech people in Congress and certainly 02:38:08.120 |
not in the presidential level, not many governors or senators. 02:38:12.440 |
Well, I think that we also have roughly, our rules will never allow some of the best and 02:38:17.960 |
brightest folks to run for president because of just the rules against it. But if- 02:38:23.800 |
I mean, look, I think David Sacks would be an incredible presidential candidate. Now, 02:38:27.720 |
I also think he'd be a great governor. No, he was born in South Africa. 02:38:32.040 |
You know, I think he'd be a great governor. I think he'd be a great 02:38:34.360 |
secretary of state. I mean, he'd be great at whatever he wanted to do. 02:38:38.200 |
Friedberg wasn't born here. So there's a lot of people that could contribute at different levels. 02:38:47.080 |
And I hope that... By the way, the other thing I like about the pod is I also think it helps 02:38:52.200 |
normalize tech a little bit because you just see normal people dealing with normal situations. 02:38:59.240 |
And I think that that's good. You know, it is a really normative place. It's not the caricature 02:39:04.360 |
that it's made out to be, but there is a small virulent strain of people that make it a caricature. 02:39:09.820 |
Well, that's in one direction. What do you think about the whole 02:39:12.680 |
culture of, I don't know a better term, but woke activism? So sort of activism, 02:39:21.000 |
which in some contexts is a powerful and important thing, but infiltrating companies. 02:39:26.120 |
I'll answer this in the context of Rene Girard. So like he says that people tend to copy each other. 02:39:30.840 |
And then when they're copying each other, what they're really doing 02:39:34.920 |
is fighting over some scarce resource. And then you find a way to 02:39:40.440 |
organize, the group of you, against the person or a thing that you think is the actual 02:39:46.440 |
cause of all of this conflict and you try to expel them. The thing that wokeism doesn't understand 02:39:52.920 |
is that unless that person is truly to blame, the cycle just continues. And, you know, 02:40:00.360 |
that was a framework that he developed that, you know, he's really conclusively proven to be true 02:40:06.200 |
and it's observable in humanity and life. So these movements, I think the extreme left and the 02:40:13.800 |
extreme right are trying to interpret a way to allow people to compete for some scarce resource. 02:40:21.240 |
But I also think that in all of that, what they don't realize is that they can scapegoat whoever 02:40:28.120 |
they want, but it's not going to work because the bulwark of people in the middle realize that it's 02:40:34.600 |
just not true. Yeah, they realize it, but still, because in leadership positions, there's 02:40:40.760 |
still momentum and they still scapegoat and they continue and it seems to hurt the actual ability 02:40:47.880 |
of those companies to be successful. In fairness though, if you had to graph the effectiveness of 02:40:52.440 |
that function, it's decaying rapidly. It's the least effective it's ever been. You're absolutely 02:40:57.960 |
right. Being canceled five years ago was a huge deal. Today, I think it was Jordan Peterson on 02:41:04.200 |
your podcast. He said, "I've been canceled and it was amazing." He said 38 times or 40. He said some 02:41:09.240 |
number, which was a ginormous number: A, that he kept count of it, and B, was able to classify it. 02:41:14.680 |
I'm like, "What classifier is going on in his mind?" where he's like, "Ah, that's an attempt 02:41:18.520 |
to cancel me, but this one is not." But my point is, well, it's clearly not working. 02:41:22.360 |
So the guy is still there and the guy is putting his view out into the world. 02:41:27.160 |
And so it's not to judge whether what he says is right or wrong. It's just to observe that this 02:41:33.400 |
mechanism of action is now weakened, but it's weakened because it's not the thing that people 02:41:39.400 |
think is really to blame. >> Yeah. You've been canceled on a small scale a few times. 02:41:44.360 |
So that's not small, I'm sure it didn't feel small. Actually, it wasn't small. I'm trying to minimize. 02:41:49.000 |
Did that psychologically hurt you? >> Yeah. 02:41:54.280 |
>> It was tough? >> Yeah. In the moment, 02:41:56.840 |
you don't know what's going on, but I would like to thank a certain CEO of a certain well-known 02:42:02.840 |
company. And he sent me basically like a step-by-step manual. 02:42:11.160 |
>> Does it involve mushrooms? >> No. And he was right. 02:42:17.800 |
The storm passed and life went on. >> I don't know if you can share 02:42:23.720 |
the list of steps, but is the fundamental core idea just that life goes on? 02:42:29.480 |
>> The core fundamental idea is like, you need to be willing and able to apologize 02:42:38.120 |
for what is in your control, but not for other people's mistakes. 02:42:41.720 |
Your mistakes, yes. And if you feel like there's something, then you should take accountability of 02:42:48.440 |
that. But to apologize for somebody else, for something that they want to hear, 02:42:54.440 |
isn't going to solve anything. >> Yeah. There's something about 02:42:58.840 |
apologies. If you do them, they should be authentic to what you actually want to say 02:43:04.200 |
versus what somebody else wants to hear. >> Otherwise it doesn't ring true. 02:43:09.240 |
>> Yeah. And people can see through that. >> And people can see through it. And also, 02:43:13.240 |
what people see through is not just the fact that your apology was somewhat hollow, 02:43:17.880 |
but also that this entire majority of people now walked away. The mob was like, "Okay, thanks." 02:43:21.560 |
And then people are like, "Oh, so you didn't care at all?" 02:43:25.320 |
And so then it reflects more probably on them. >> Yeah. I know you said you don't like to 02:43:33.880 |
give advice, but what advice would you give to a young person? You've lived an incredible life 02:43:40.200 |
from very humble beginnings, difficult childhood, and you're one of the most successful people in 02:43:44.920 |
the world. So what advice? I mean, a lot of people look to you for inspiration. 02:43:49.000 |
Kids in high school or early college that are not doing good or are trying to figure out 02:43:59.080 |
basically what to do when they have complete doubt in themselves. 02:44:12.320 |
It is really important that if somebody that you respect, and I'm going to just for the purpose of 02:44:18.520 |
this put myself in that bucket, and if you're listening to this, I wish somebody had told this 02:44:24.360 |
to me. We are all equal, and you will fight this demon inside you that says you are less than a lot 02:44:36.680 |
of other people for reasons that will be hard to see until you're much, much older. And so you have 02:44:45.400 |
to find either a set of people far, far away, like what I did, or one or two people really, 02:44:53.640 |
really close to you, or maybe it's both that will remind you in key moments of your life 02:45:01.000 |
that that is true. Otherwise, you will give in to that beast. And it's not the end of the world, 02:45:10.920 |
and you'll recover from it. I've made a lot of mistakes, but it requires a lot of energy, 02:45:18.200 |
and sometimes it's just easier to just stop and give up. So I think that if you're starting out 02:45:24.600 |
in the world, if you've been lucky to have a wonderful life and you had wonderful parents, 02:45:28.360 |
man, you should go and give them a huge hug because they did you such a service that 02:45:33.240 |
most folks don't do to most kids, unfortunately. And it's not the fault of these parents, 02:45:38.520 |
but it's just tough. Life is tough. So give them a kiss and then figure out a way where you can just 02:45:45.720 |
do work that validates you and where you feel like you're developing some kind of mastery. 02:45:51.080 |
Who cares what anybody else thinks about it? Just do it because it feels good. Do it because 02:45:55.400 |
you like to get good at something. But if you're not one of those lucky people, 02:45:59.480 |
you can believe in your friends or you can just believe in me. 02:46:04.680 |
I'm telling you, preserve optionality. How you do that is by regulating your reactions to things. 02:46:13.880 |
And your reactions are going to be largely guided in moments where you think that you are not the 02:46:20.120 |
same as everybody else, and specifically that you are less than those people and you're not. 02:46:24.600 |
So just save this part of this podcast and just play it on a loop if you need to. But that is my 02:46:33.800 |
biggest learning is I am equal. I'm the same as all these other people. And you can imagine what 02:46:40.680 |
that means to me to go out in the world, to see people and think, "Okay, I'm the same as this 02:46:44.440 |
person. I'm as good as them." And you could imagine what you're probably thinking of what 02:46:49.320 |
I'm thinking is not that thing. You're probably thinking, "Man, this guy, yeah, 02:46:53.640 |
this guy, I'm so much better." No, I am fighting this thing all the time. 02:47:00.280 |
Well, I've also met a bunch of folks who have what I think is a counter reaction to that. Once they become 02:47:07.880 |
successful, they start developing a philosophy that they are better or even some people are 02:47:14.680 |
better than others, which I understand. There's LeBron James versus other people and so on. 02:47:21.080 |
But I always really resisted that thought because I feel like it's a slippery slope. 02:47:26.200 |
They're not better. They have mastery in a thing that they've fallen in love with. 02:47:32.200 |
I'm trying to develop mastery in a thing that I love. I love investing. It's like solving puzzles. 02:47:38.840 |
And I love that. I love trying to develop mastery in poker. I really love that. I'm learning how 02:47:46.520 |
to be a parent to a teenager because I finally have one. It's all new stuff to me and I'm learning. 02:47:55.960 |
Yeah. So you don't want to think you're lesser than and you don't want to think you're better than. 02:48:04.360 |
I've never thought I was better than. I manifested better than 02:48:07.960 |
because I was trying to compensate for feeling less than. My goal is just to feel like everybody 02:48:15.000 |
else feels on the presumption that everybody had a normal life. 02:48:18.120 |
Given your nickname is the dictator, do you trust yourself with power? 02:48:22.440 |
If the world gave you absolute power for a month... 02:48:27.480 |
No. No, because I think that I'm still riddled with bias. I don't deserve that position. And 02:48:35.480 |
I would not want that weight on my shoulders. I had a spot actually where it was a very 02:48:42.760 |
important and big poker game. And it was a spot where I was in the pot. And it was a really large 02:48:51.560 |
pot. It was like a million dollar pot. And I had to make a ruling and the ruling was in my favor. 02:48:58.600 |
And I was just beside myself. I play for the challenge. I like to get pushed to the limit 02:49:08.360 |
of my capabilities. I want to see can I think at the same level of these folks because these 02:49:15.320 |
guys are all experts. They're all pros. And I get enormous joy from that challenge. 02:49:20.920 |
And I like to win, but I like to win just a small amount. You know what I mean? And then I never 02:49:28.520 |
wanted to win in that way. But because it was my game, I had to make this call on a million dollar 02:49:35.240 |
pot and I wanted to just shoot myself. I just was like, "This is gross and disgusting." And 02:49:41.480 |
he was a complete gentleman, which made it even worse. So I do not want absolute power. 02:49:48.920 |
Well, those are the people you do want to have power is the ones that don't want it, 02:49:52.920 |
which is a weird system to have. Because then you, in that kind of system, don't get the leaders 02:49:59.640 |
that you should have. Because the ones that want power aren't the ones that should have power. 02:50:05.400 |
That's a weird, weird system. What do you think? Let me sneak this question in there. 02:50:14.920 |
Why are we here? Give a look up at the stars and think about the big "why" question. 02:50:21.720 |
I think that it's a chance to just enjoy the ride. I don't think it really... I don't believe 02:50:32.840 |
in this idea of legacy that much. I think it's a real trap. 02:50:37.720 |
So do you think you'll be forgotten by history? 02:50:41.560 |
I hope so. I really, really hope so. Because if you think about it, there are two or three people 02:50:46.680 |
that are remembered for positive things, and everybody else, it's all negative things. And 02:50:51.720 |
the likelihood that you'll be remembered for a positive thing is harder and harder and harder. 02:50:56.600 |
And so the surface area of being remembered is negative. And then the second, what will it 02:51:02.040 |
matter? I'll be gone. I really just want to have fun, do my thing, learn, get better. But I want 02:51:13.640 |
to reward myself by feeling like, "Wow, that was awesome." I've told this story many times, and 02:51:21.240 |
I have put, again, my own narrative fallacy on top of this. But Steve Jobs's sister wrote this 02:51:29.480 |
obit in the New York Times when he died, and she ends it by saying his last words were, "Oh, wow, 02:51:34.360 |
oh, wow, oh, wow." That seems like an awesome way to die. You're surrounded by your friends and 02:51:39.960 |
family, not the fact that he died, obviously, but what I read into it was that in that moment your 02:51:45.400 |
family was there. Maybe you thought about all the cool shit that you were able to do. 02:51:52.440 |
And then you just started the simulation all over again. 02:51:57.960 |
And so just on the off chance that that's true, I don't want to take this thing too seriously. 02:52:12.120 |
So every day you can go and you're happy. You're happy with the things you've done. 02:52:17.560 |
Yeah. There are obviously things I want to do that I haven't done, 02:52:24.200 |
but there are no gaping things. I've really, really, really been in love. 02:52:29.400 |
Total gift. There have been moments where I've really, really felt like everybody else. 02:52:38.600 |
There have been moments where I had deep, deep, deep joy and connection with my children. 02:52:45.640 |
There are moments where I've had incredible giggling fun with my friends. 02:52:51.160 |
There's moments where I've been able to enjoy really incredible experiences, wine, food, 02:52:55.400 |
all that stuff. Great. I mean, what more do you want? I could keep asking for more, 02:53:02.040 |
but I would just be a really crappy human being at some point. You know what I mean? It's enough. 02:53:13.560 |
This life is pretty beautiful if you allow yourself to see it. 02:53:17.720 |
It's really great and it's better than it's ever been for most of us, actually. 02:53:25.960 |
And all of the millennials and Gen Zs, you're about to get a boatload of money from your parents. 02:53:33.800 |
And you better figure out how to be happy before you get that money... 02:53:47.640 |
Get a lot of Dairy Queen. No, that only worked the first time. 02:53:53.880 |
It worked two times in grade five and grade six. My God, that next year, 02:53:57.640 |
Lex, I worked my ass off, but I could never bring myself to ask her. 02:54:02.440 |
And then she did it. And I was like, man, this woman, Miss Bruni, this woman's a gem. 02:54:07.080 |
Yeah. But the third time it faded. Isn't that the sad thing about life? 02:54:11.880 |
You know, the finiteness of it, the scarcity of it. 02:54:15.960 |
Without that, ice cream wouldn't be so damn delicious. 02:54:21.320 |
Chamath, you're an incredible human. I definitely recommend that people listen to you on all 02:54:25.800 |
platforms. We're very lucky to be able to get your wisdom. 02:54:30.360 |
I've talked a lot about you with Andrej Karpathy, who's somebody I really respect. 02:54:36.760 |
And he just loves the shit out of you and how deeply you understand the world. 02:54:44.200 |
On a different note, yeah, speaking of semicolons, there are some human beings that understand 02:54:49.480 |
everything at the very low level and at the very high level. And those people are also 02:54:54.680 |
very rare. So it's a huge honor and also a huge honor that you would be so kind to me, 02:55:02.440 |
just like in subtle ways offline, that you would make me feel like I'm worthwhile. 02:55:09.560 |
Well, can I just say something as just a layman listener? 02:55:12.520 |
What you do, just so I could give you my version, is that you take things and people, 02:55:22.920 |
so ideas and people, that are mostly behind a rope and you demystify it. 02:55:31.160 |
And what that does for all of us is it makes me feel like I can be a part of that. 02:55:39.960 |
And that's a really inspiring thing because you're not giving advice, you're not telling us how to 02:55:44.920 |
solve the problem, but you're allowing it to be understood in a way that's really accessible. 02:55:52.680 |
And then you're intellectually curious in ways that some of us would never expect that we could be, 02:55:59.240 |
and then you end up in this rabbit hole. And then you have the courage to go and talk to people that 02:56:04.280 |
are really all over the map. For example, when I saw your Jordan Peterson episode, you went there, 02:56:13.640 |
you talked about Nazism, and I was just like, "Man, this is a complicated argument these guys 02:56:18.920 |
are going to tackle." It's really impressive. So I have an enormous amount of respect for what you 02:56:26.440 |
do. I think it's very hard to do what you do so consistently. And so I look at you as somebody I 02:56:33.240 |
respect because it just shows somebody who's immersed in something and who's very special. 02:56:41.160 |
So thank you for including me in this. I'm going to play that clip to myself 02:56:44.360 |
privately over and over, just when I feel low and self-critical about myself. Thank you so much, 02:56:50.920 |
brother. This is incredible. Thanks, man. Thank you for listening to this conversation with 02:56:55.640 |
Chamath Palihapitiya. To support this podcast, please check out our sponsors in the description. 02:57:01.640 |
And now let me leave you with some words from Jonathan Swift. 02:57:04.840 |
"A wise person should have money in their head, but not in their heart." 02:57:10.440 |
Thank you for listening and hope to see you next time.