Richard Craib: WallStreetBets, Numerai, and the Future of Stock Trading | Lex Fridman Podcast #159
Chapters
0:00 Introduction
2:28 WallStreetBets and GameStop saga
16:41 Evil shorting and chill shorting
18:47 Hedge funds
24:20 Vlad
31:16 Numerai
58:32 Future of AI in stock trading
64:11 Numerai data
67:53 Is stock trading gambling or investing?
71:48 What is money?
75:05 Cryptocurrency
78:22 Dogecoin
82:52 Advice for startups
98:43 Book recommendations
100:45 Advice for young people
104:46 Meaning of life
00:00:00.000 |
The following is a conversation with Richard Craib, 00:00:02.600 |
founder of Numerai, which is a crowdsourced hedge fund, 00:00:10.000 |
but where the trading is done not directly by humans, 00:00:22.780 |
where the incentives of everybody are aligned, 00:00:26.280 |
the code is kept and owned by the people who develop it, 00:00:29.560 |
the data, anonymized data, is very well organized 00:00:42.640 |
by empowering people who are interested in trading stocks 00:00:54.160 |
Audible Audiobooks, Trial Labs, Machine Learning Company, 00:01:00.920 |
and Athletic Greens, all-in-one nutrition drink. 00:01:08.700 |
As a side note, let me say that this whole set of events 00:01:14.440 |
has been really inspiring to me as a demonstration 00:01:18.760 |
that a distributed system, a large number of regular people 00:01:26.560 |
in taking on the elite centralized power structures, 00:01:31.560 |
especially when those elites are misbehaving. 00:01:35.200 |
I believe that power in as many cases as possible 00:01:50.720 |
in its distributed nature represents is freedom. 00:01:55.960 |
is it enables chaos or progress, or sometimes both. 00:02:04.400 |
Freedom is empowering, but ultimately unpredictable. 00:02:12.500 |
If you enjoy this podcast, subscribe on YouTube, 00:02:15.560 |
review it on Apple Podcasts, follow on Spotify, 00:02:18.680 |
support on Patreon, or connect with me on Twitter 00:02:23.320 |
And now here's my conversation with Richard Craib. 00:02:29.840 |
the important events around this amazing saga 00:02:32.680 |
that we've been living through of Wall Street Bets, 00:03:08.240 |
to prove that they bought the stock and they're holding. 00:03:11.120 |
And then also to see how big of an impact 00:03:21.240 |
- Were you impressed by the distributed coordination, 00:03:34.440 |
from a large scale distributed system perspective 00:03:45.760 |
that such incredible level of coordination could happen 00:03:49.920 |
where a lot of people come together in a distributed sense, 00:03:52.760 |
there's an emergent behavior that happens after that? 00:03:57.400 |
And one of the reasons is the lack of kind of like, 00:04:06.600 |
So if you have a username on r/WallStreetBets, 00:04:11.600 |
like some of them are, like Deep Fucking Value 00:04:28.240 |
and then can just talk on forum like style boards. 00:04:32.320 |
If you don't know what Reddit is, check it out. 00:04:56.240 |
and there's no contracts, like you were saying. 00:05:04.440 |
if you've posted a really good investment idea in the past, 00:05:12.640 |
And that's also kind of like giving credibility 00:05:16.960 |
And then they are also putting up screenshots. 00:05:22.480 |
here's the trades I've made and here's a screenshot. 00:05:28.120 |
but still it seems like if you've got a lot of karma 00:05:32.160 |
and you've had a good performance on the community, 00:05:39.760 |
He actually probably did put a million dollars into this. 00:05:47.680 |
So you're kind of integrating all that information 00:05:53.560 |
And then you jump onto this little boat of like behavior, 00:05:57.480 |
like we should buy the stock or sell the stock. 00:06:04.560 |
And all of a sudden you have just a huge number of people 00:06:15.320 |
We love Elon, we love the Tesla, let's all buy Tesla. 00:06:33.440 |
And that's really weird because that's a little bit 00:06:36.240 |
too galaxy brain for a decentralized community. 00:06:44.800 |
And the reason they liked it is because it had 00:06:49.880 |
It had been shorted more than its own float, I believe. 00:06:53.920 |
And so they figured out that if they all bought 00:06:57.520 |
this bad stock, they could short squeeze some hedge funds. 00:07:02.520 |
And those hedge funds would have to capitulate 00:07:05.240 |
and buy the stock at really, really high prices. 00:07:10.560 |
these are a bunch of people, when you short a stock, 00:07:13.160 |
you're betting on the, you're predicting that the stock 00:07:16.480 |
is going to go down and then you will make money if it does. 00:07:24.560 |
and you take a big short position in a company, 00:07:27.900 |
there's a certain level at which you can't sustain 00:07:37.320 |
but there is a limit to how low it can go, right? 00:07:43.200 |
And if the stock doubles overnight, like GameStop did, 00:07:47.080 |
you're putting a lot of stress on that hedge fund. 00:07:51.920 |
And that hedge fund manager might have to say, 00:07:56.280 |
"And the only way to get out is to buy the bad stock 00:07:59.520 |
"that they don't want, like they believe will go down." 00:08:13.640 |
You buy a bunch of watermelons, the price goes up, 00:08:25.900 |
So yes, some Redditors will make a lot of money, 00:08:29.760 |
but actually the whole group will make money. 00:08:32.520 |
And that's really why it was such a clever thing 00:08:42.760 |
but to me always from an outsider's perspective, 00:08:45.880 |
seemed, I hope I'm not using too strong of a word, 00:08:51.240 |
Maybe not unethical, maybe it's just an asshole thing to do. 00:09:03.560 |
I love celebrating the success of a lot of people. 00:09:07.040 |
And this is like the stock market equivalent of like haters. 00:09:15.800 |
you wanna have an economically efficient mechanism 00:09:18.280 |
for punishing sort of overhyped, overvalued things. 00:09:23.280 |
That's what shorting I guess is designed for, 00:09:26.160 |
but it just always felt like these people are just, 00:09:29.600 |
because they're not just betting on the loss of the company. 00:09:33.540 |
It feels like they're also using their leverage and power 00:09:37.380 |
to manipulate media or just to write articles 00:09:43.460 |
Then you get to see that with Elon Musk and so on. 00:09:50.540 |
so people like hedge funds that were shorting 00:10:00.300 |
the overpowerful that's misusing their power. 00:10:04.180 |
the people that are standing up and rising up. 00:10:06.420 |
So it's not just that they were able to collaborate 00:10:19.460 |
and fighting the centralized elites, the powerful. 00:10:23.420 |
And that, I don't know what your thoughts are 00:10:28.620 |
At this stage, it feels like that's really exciting 00:10:44.340 |
people can be manipulated by charismatic leaders. 00:10:48.060 |
And so just like Elon right now is manipulating, 00:10:54.260 |
encouraging people to buy Dogecoin or whatever, 00:11:04.620 |
It's a little bit scary how much power a subreddit can have 00:11:13.020 |
they might be attacking or destroying somebody 00:11:32.660 |
But at this stage, it feels like, I don't know, 00:11:41.100 |
Is there something negative about what happened? 00:11:45.180 |
the Reddit people weren't doing anything exotic. 00:11:50.180 |
It was a creative trade, but it wasn't exotic. 00:12:00.340 |
But it was the hedge fund that was doing the exotic thing. 00:12:13.020 |
and now there's a company out there that's worth more. 00:12:21.100 |
which is, and then we destroyed some hedge funds 00:12:25.100 |
- Is there something to be said about the humor 00:12:36.540 |
but it feels like people can be quite aggressive on there. 00:12:52.460 |
one of the things that people have compared it to 00:12:56.820 |
which is, let's say, some very sincere liberals, 00:13:19.420 |
because who's scared of the Occupy Wall Street people 00:13:31.940 |
- Yeah, the anonymity is a bit terrifying and exciting. 00:13:37.940 |
- I mean, yeah, I don't know what to do with it. 00:13:40.580 |
I've been following events in Russia, for example. 00:13:43.580 |
It's like there's a struggle between centralized power 00:14:08.820 |
how much chaos can I create by writing some bots? 00:14:13.420 |
- And I'm sure I'm not the only one thinking that. 00:14:19.820 |
of good developers out there listening to this, 00:14:23.900 |
And then as that develops further and further 00:14:28.340 |
what impact does that have on financial markets, 00:14:44.060 |
being manipulated by, people talk about Russian bots 00:14:49.900 |
We're probably in the very early stage of that, right? 00:14:56.580 |
So do you have a sense that most of Wall Street Bets folks 00:15:04.020 |
is there's just individual, maybe young investors 00:15:12.740 |
I've known about Wall Street Bets for a while, 00:15:14.060 |
but the reason I found out about GameStop was this, 00:15:16.620 |
I met somebody at a party who told me about it, 00:15:26.060 |
- This was probably, no, this was, yeah, a few days ago. 00:15:29.260 |
Yeah, it was like maybe two weeks ago or something. 00:15:37.460 |
but it was just strange to me that there was someone 00:16:04.580 |
- And they're gonna together search for the next thing. 00:16:15.700 |
in government, in finance, in any kind of position, 00:16:22.900 |
and honestly, that's probably a little bit good. 00:16:28.060 |
- Yeah, it certainly makes you think twice about shorting. 00:16:34.980 |
Like these funds put a lot into one or two names, 00:16:52.140 |
- I do think that there are two kinds of shorting. 00:16:59.380 |
Evil shorting is what Melvin Capital was doing. 00:18:01.820 |
we don't even want them to go down necessarily. 00:18:04.260 |
That doesn't sound a bit strange that I say that, 00:18:37.340 |
because that means we can short the oil companies 00:18:41.020 |
and go long Tesla and make the future come forward faster. 00:18:46.100 |
- So we talked about this incredible distributed system 00:18:52.580 |
And then there's a platform, which is Robinhood, 00:19:03.180 |
and there's Numerai that allows you to make it accessible 00:19:09.540 |
But that said, Robinhood was, in a centralized way, 00:19:23.900 |
I don't know how much you were paying attention 00:19:26.540 |
to sort of the shadiness around the whole thing. 00:19:36.020 |
- Well, I think I wanna see the alternate history. 00:19:44.940 |
- How bad would it have gotten for hedge funds? 00:19:50.140 |
if the momentum of these short squeezes could continue? 00:20:01.860 |
they affect kind of all the other shorts too. 00:20:09.940 |
It wasn't GameStop, luckily, it was BlackBerry. 00:20:16.900 |
It was one of these meme stocks, super bad company. 00:20:23.820 |
- A meme stock is kind of a new term for these stocks 00:20:33.380 |
And so the meme stocks were GameStop, the biggest one, 00:20:46.460 |
So these are high short interest stocks as well. 00:21:03.700 |
It's like, no, it's not why they're doing it. 00:21:09.460 |
And so they were high chance of short squeeze. 00:21:12.660 |
- In saying, I would love to see an alternate history, 00:21:20.900 |
- Well, you wouldn't have needed very many more days 00:21:29.340 |
I think it's underrated how damaging it could have been. 00:21:46.940 |
you need to put up like $40 per $100 of short exposure. 00:22:05.660 |
because they're all kind of holding the same things. 00:22:08.420 |
If that happens, not only do you have to cover the short, 00:22:24.780 |
all the ones that the hedge funds like are coming down 00:22:27.380 |
and all the ones that the hedge funds hate are going up 00:22:35.420 |
So I believe that if you could have had a few more days 00:22:42.500 |
you would have had more and more hedge fund deleveraging. 00:22:45.500 |
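To make the margin mechanics described above concrete, here is a toy sketch in Python. The roughly $40 of collateral per $100 of short exposure comes from the conversation; the doubling stock and the resulting numbers are a hypothetical illustration, not a model of any real fund.

```python
# Illustrative sketch of the margin math described above (numbers are
# hypothetical except the ~$40 of collateral per $100 of short exposure
# mentioned in the conversation).

short_exposure = 100.0   # dollars of stock sold short
collateral     = 40.0    # initial margin posted against that exposure

# If the shorted stock doubles, the position now costs 200 to buy back.
new_price_multiple = 2.0
liability = short_exposure * new_price_multiple

# Loss on the short = what it costs to cover minus what was received.
loss = liability - short_exposure          # 100
equity_remaining = collateral - loss       # -60 -> margin call, forced buying

print(f"loss on short: {loss:.0f}")
print(f"equity remaining: {equity_remaining:.0f} (negative => forced to cover)")
```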
- But so hedge funds, I mean, they get a lot of shit, 00:22:48.420 |
but do you have a sense that they do some good 00:22:55.980 |
Wall Street Bets itself is a kind of distributed hedge fund, 00:23:04.060 |
I mean, like if some of those were destroyed, 00:23:08.940 |
Or would there be coupled with the destroying 00:23:13.940 |
the evil shorting, would there be just a lot of pain 00:23:23.700 |
I don't think you wanna rerun American economic history 00:23:36.060 |
because hedge funds are kind of like picking up, 00:23:45.580 |
they're investing in new technology, it's so good, 00:23:50.580 |
because they're the reason venture capitalists exist, 00:23:54.060 |
because their companies can have a liquidity event 00:24:14.020 |
billions type hedge funds that make the media 00:24:30.740 |
sort of had to make decisions really quickly, 00:24:32.700 |
probably had to wake up in the middle of the night 00:24:34.420 |
kind of thing, and he also had a conversation 00:24:38.220 |
with Elon Musk on Clubhouse, which I just signed up for. 00:24:44.660 |
journalistic performances of our time with Elon Musk. 00:24:51.100 |
- How hilarious would it be if he gets a Pulitzer Prize? 00:25:03.900 |
I don't know if you can comment on any aspects of that, 00:25:06.740 |
but if you were Vlad, how would you do things differently? 00:25:10.100 |
What are your thoughts about his interaction with Elon, 00:25:16.980 |
- I guess there's a lot of aspects to this interaction. 00:25:19.380 |
One is about transparency, like how much do you want 00:25:38.140 |
which I'm sure he was getting some kind of phone calls 00:25:42.660 |
like it's not contracts that are forcing him, 00:25:48.000 |
like pressured to behave in certain kinds of ways 00:25:52.200 |
Like what do you take from this whole situation? 00:25:58.920 |
I mean, it was pretty cool to have him talk to Elon. 00:26:03.400 |
in the first few seconds of Vlad speaking was, 00:26:13.960 |
He seemed like a 55 year old man talking to a 20 year old. 00:26:22.520 |
You can see why Citadel and him are buddies, right? 00:26:27.880 |
It's like, this is a nice, it's not a bad thing. 00:26:30.240 |
It's like he's got a respectable, professional attitude. 00:26:35.240 |
- Well, he also tried to do like a jokey thing, 00:26:38.200 |
like, no, we're not being ageist here, boomer, 00:26:42.700 |
but like a 60 year old CEO of Bank of America 00:26:58.120 |
- But I think, and maybe that's also what I like 00:27:00.560 |
about Elon's kind of influence on American business. 00:27:03.560 |
It's like, he's super like anti the professional. 00:27:07.320 |
Like why say, you know, a hundred words about nothing? 00:27:13.140 |
And so I liked how he was cutting in and saying, 00:27:26.580 |
The problem of course is, you know, for Elon, 00:27:33.300 |
Tens of millions of dollars, his tweeting like that. 00:27:45.820 |
and just going off the cuff and making mistakes 00:27:56.460 |
Do you think we'll ever find out what really went down 00:28:02.380 |
if there was something shady underneath it all? 00:28:05.220 |
- Yeah, I mean, it would be sad if nothing shady happened. 00:28:12.160 |
- Sometimes I feel like that with Mark Zuckerberg, 00:28:19.080 |
there's a lot of shitty things that Facebook is doing, 00:28:25.160 |
by the way he presents himself about those things. 00:28:28.500 |
Like, I honestly think that a large amount of people 00:28:31.440 |
at Facebook just have a huge, unstable, chaotic system 00:28:36.840 |
and they're all, not all, but a mass are trying to do good 00:28:43.820 |
it sounds like there's a lot of back room conversations 00:28:49.940 |
And there's something about the realness that Elon has 00:28:57.260 |
- I think Mark Zuckerberg had that too when he was younger. 00:29:00.740 |
- And somebody said, you gotta be more professional, man. 00:29:02.860 |
You can't say, you know, lol to an interview. 00:29:06.960 |
And then suddenly he became like this distant person 00:29:11.640 |
Like, you'd rather have him make mistakes, but be honest, 00:29:14.960 |
than be like professional and never make mistakes. 00:29:31.040 |
Or like, you know, take risks as opposed to what the PR, 00:29:51.440 |
It's constantly like, ooh, like, I don't know. 00:29:53.800 |
And there's this nervous energy that builds up over time 00:29:56.680 |
with larger, larger teams where the whole thing, 00:30:03.560 |
incredible engineering and incredible system, 00:30:07.680 |
Like, let's be honest about this like madness 00:30:13.140 |
that we have going on of huge amounts of video 00:30:21.560 |
bunch of conspiracy theories, some of which might be true. 00:30:24.320 |
And then just like this mess that we're dealing with 00:30:30.200 |
It's a place where like democratizes education, 00:30:34.240 |
And instead they're all like sitting in like, 00:30:38.760 |
well, we just want to improve the health of our platform. 00:30:46.120 |
Let's both advertise how amazing this fricking thing is, 00:30:50.560 |
but also to say like, we don't know what we're doing. 00:30:53.280 |
We have all these Nazis posting videos on YouTube. 00:31:07.000 |
it does seem like Zuckerberg and others get worn down. 00:31:17.800 |
which is an incredible company system idea, I think. 00:31:41.940 |
But the reason we do it is because we're looking for people 00:31:49.060 |
And the way we do it is by obfuscating the data. 00:32:05.280 |
But you do know that if you're good at machine learning 00:32:11.520 |
you know that I can still find patterns in this data, 00:32:14.640 |
even though I don't know what the features mean. 00:32:25.940 |
Like approximately, what are we talking about? 00:32:27.940 |
- So we are buying data from lots of different data vendors. 00:32:32.380 |
And they would also never want us to share that data. 00:32:38.660 |
But that's the kind of data you could never buy yourself 00:32:43.500 |
unless you had maybe a million dollars a year 00:32:48.740 |
So what's happened with the hedge fund industry 00:32:54.600 |
who used to be able to trade and still can trade, 00:33:02.180 |
it would never make sense for them to trade themselves. 00:33:07.100 |
But Numerai, by giving away this obfuscated data, 00:33:09.620 |
we can give them a really, really high quality data set 00:33:14.820 |
And they can use whatever new machine learning technique 00:33:24.020 |
- And so how much variety is there in underlying data? 00:33:27.060 |
We're talking about, I apologize if I'm using the wrong terms 00:33:34.380 |
The other, there's like options and all that kind of stuff, 00:33:36.820 |
like the, what are they called, order books or whatever. 00:33:47.140 |
Like natural language as well, all that kind of stuff. 00:34:12.580 |
But we try to get lots and lots of those factors 00:34:19.560 |
And then the setup of the problem is commonly in quant 00:34:31.960 |
You're trying to say the like relative position 00:34:46.020 |
the dynamics captured by the data of that period of time 00:34:53.180 |
- Yeah, so our predictions are also not that short. 00:34:55.700 |
We're not really caring about things like order books 00:35:02.340 |
We're actually holding things for quite a bit longer. 00:35:05.060 |
So our prediction time horizon is about one month. 00:35:07.800 |
We end up holding stocks for maybe like three or four months. 00:35:10.740 |
So I kind of believe that's a little bit more like 00:35:17.820 |
Like to go long a stock that's mispriced on one exchange 00:35:21.860 |
and short on another exchange, that's just arbitrage. 00:35:26.140 |
But what we're trying to do is really know something more 00:35:36.980 |
you're trying to understand something fundamental 00:35:43.260 |
about like it's big in the context of the market, 00:35:46.300 |
is it underpriced, overpriced, all that kind of stuff. 00:35:50.780 |
It's not about just like you said, high frequency trading, 00:36:05.260 |
And then now anyone can try to build algorithms 00:36:11.980 |
that make investing decisions on top of that data 00:36:26.340 |
we could obviously model that data in house, right? 00:36:36.200 |
But what we're trying to do is by opening it up 00:36:42.160 |
we can do quite a lot better than if we modeled it ourselves 00:36:58.560 |
because the whole, usually you're charging 2% fees. 00:37:16.760 |
Like if there's a great paper on supervised learning, 00:37:23.500 |
- And is there an ensemble of models going on 00:37:28.500 |
or is it more towards kind of like one or two or three, 00:37:40.380 |
is by how much the users are staking on them. 00:37:51.340 |
You can trust me because I'm going to put skin in the game. 00:37:54.300 |
And so we can take the stake weighted predictions 00:38:02.280 |
and that's a much better model than any one model 00:38:05.980 |
in the sum because ensembling a lot of models together 00:38:08.840 |
is kind of the key thing you need to do in investing too. 00:38:16.800 |
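A minimal sketch of the stake-weighted meta-model idea described here. The user predictions and stake amounts are made up, and Numerai's actual aggregation is more involved than a simple weighted average.

```python
import numpy as np

# Hypothetical predictions from three users for the same five stocks,
# each a ranking-style score in [0, 1].
predictions = np.array([
    [0.1, 0.4, 0.9, 0.5, 0.2],   # user A
    [0.2, 0.3, 0.8, 0.6, 0.1],   # user B
    [0.9, 0.1, 0.2, 0.5, 0.7],   # user C (disagrees with the others)
])

# Hypothetical NMR stakes: how much each user has put at risk on their model.
stakes = np.array([100.0, 250.0, 10.0])

# Stake-weighted meta-model: weight each user's predictions by their stake.
weights = stakes / stakes.sum()
meta_prediction = weights @ predictions

print(meta_prediction)   # users with more skin in the game move the result more
```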
from the perspective of a machine learning engineer, 00:38:29.760 |
So like, and, but the way to invest algorithmically 00:38:49.500 |
So it's like, everything just works nicely together. 00:39:12.160 |
is there something you can say about which kind, 00:39:16.520 |
looking at the big broad view of machine learning today, 00:39:19.300 |
or AI, what kind of algorithms seem to do good 00:39:36.200 |
like old school, sort of more basic kind of classifiers 00:39:40.760 |
Is there some kind of conclusions so far that you can say? 00:39:44.120 |
- There is, there's definitely something pretty nice 00:39:54.960 |
So out of the box, if you're trying to come 100th 00:40:03.080 |
But what's particularly interesting about the problem 00:40:14.780 |
this typically will surprise you when you model our data. 00:40:18.520 |
So one of the things that you look at in finance 00:40:23.520 |
is you don't want to be too exposed to any one risk. 00:40:33.360 |
to invest in over the last 10 years was tech, 00:40:36.700 |
does not mean you should put all of your money into tech. 00:40:43.440 |
put all your money into tech, it's super good. 00:40:45.760 |
But what you want to do is actually be very careful 00:40:49.160 |
of how much of this exposure you have to certain features. 00:40:53.440 |
So on Numerai, what a lot of people figure out 00:40:57.800 |
is actually if you train a model on this kind of data, 00:41:02.160 |
you want to somehow neutralize or minimize your exposure 00:41:20.720 |
and you have an auto encoder and it's figuring out, 00:41:22.560 |
okay, it's going to be red and it's going to be white. 00:41:30.400 |
Why would you reduce your exposure to the thing 00:41:34.640 |
And that's actually this counterintuitive thing 00:41:36.480 |
you have to do with machine learning on financial data. 00:41:41.880 |
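A rough sketch of the feature-neutralization idea being described: fit the predictions on the features and keep only the residual, so the signal is less exposed to any single factor. The data here is synthetic, and this is not Numerai's exact tooling.

```python
import numpy as np

def neutralize(predictions: np.ndarray, features: np.ndarray, proportion: float = 1.0) -> np.ndarray:
    """Remove (part of) the component of `predictions` that is linearly
    explained by `features`. A rough sketch of the idea discussed above."""
    # Least-squares fit of predictions on the features, then subtract it.
    exposure, *_ = np.linalg.lstsq(features, predictions, rcond=None)
    neutral = predictions - proportion * features @ exposure
    # Re-standardize so the output is comparable to the input scale.
    return (neutral - neutral.mean()) / (neutral.std() + 1e-12)

# Hypothetical example: 1000 stocks, 20 obfuscated features.
rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 20))
raw_preds = features[:, 0] * 0.8 + rng.normal(scale=0.2, size=1000)  # heavily exposed to feature 0

neutral_preds = neutralize(raw_preds, features)
print("exposure before:", np.corrcoef(raw_preds, features[:, 0])[0, 1].round(2))
print("exposure after: ", np.corrcoef(neutral_preds, features[:, 0])[0, 1].round(2))
```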
would help you generalize the things that are, 00:41:44.160 |
so basically financial data has a large amount of patterns 00:41:49.160 |
that appeared in the past and also a large amount 00:41:52.920 |
of patterns that have not appeared in the past. 00:41:55.180 |
And so like in that sense, you have to reduce the exposure 00:42:02.560 |
That's interesting, but how much of this is art 00:42:05.180 |
and how much of it is science from your perspective so far 00:42:08.020 |
in terms of as you start to climb from the 100th position 00:42:16.100 |
- Yeah, well, if you do make yourself super exposed 00:42:20.460 |
to one or two features, you can have a lot of volatility 00:42:41.100 |
and you might make 100%, but it doesn't in the long run 00:42:47.740 |
And so the best users are trying to stay high 00:42:55.980 |
not necessarily try to be first for a little bit. 00:43:00.100 |
- So to me, a developer, machine learning researcher, 00:43:04.880 |
how do I, Lex Fridman, participate in this competition 00:43:07.980 |
and how do others, which I'm sure there'll be a lot 00:43:10.440 |
of others interested in participating in this competition, 00:43:12.980 |
what are, let's see, there's like a million questions, 00:43:23.540 |
download the data, and the data is pretty small. 00:43:32.020 |
there's like an example script, Python script, 00:43:34.980 |
that just builds an XGBoost model very quickly from the data. 00:43:47.660 |
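Something like the starter script being described might look as follows. The file names and column names are assumptions for illustration; the real Numerai example script and dataset layout may differ.

```python
# A rough sketch of the kind of starter script described above.
# File names and column names here are assumptions for illustration.
import pandas as pd
import xgboost as xgb

train = pd.read_csv("numerai_training_data.csv")          # hypothetical filename
feature_cols = [c for c in train.columns if c.startswith("feature")]

model = xgb.XGBRegressor(
    n_estimators=2000,
    max_depth=5,
    learning_rate=0.01,
    colsample_bytree=0.1,
)
model.fit(train[feature_cols], train["target"])

live = pd.read_csv("numerai_live_data.csv")               # hypothetical filename
live["prediction"] = model.predict(live[feature_cols])
live[["id", "prediction"]].to_csv("predictions.csv", index=False)
```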
Like what, is this model then submitted somewhere? 00:43:54.700 |
Like how does the whole, how does your model, 00:44:02.660 |
- Okay, well, we want you to keep your baby Frankenstein 00:44:14.620 |
So we never see the code that wrote your model, 00:44:18.100 |
That our whole hedge fund is built from models 00:44:22.380 |
But it's important for the users because it's their IP, 00:44:40.360 |
- What some users do is they set up a compute server 00:44:51.620 |
we need more predictions now, and then you send it to us. 00:45:05.580 |
I mean, is there sort of specific technical things 00:45:25.780 |
and then you would set up a little server on AWS 00:45:34.160 |
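A bare-bones sketch of the "we ping your server, it sends back predictions" pattern, using Flask. The endpoint, payload, and file names are hypothetical, and this is not Numerai's actual Compute API.

```python
# Minimal sketch (not Numerai's actual Compute API) of a server that returns
# fresh predictions when pinged. Endpoint name and files are assumptions.
from flask import Flask, jsonify
import pandas as pd
import joblib

app = Flask(__name__)
model = joblib.load("my_trained_model.pkl")        # hypothetical saved model

@app.route("/predict", methods=["POST"])
def predict():
    # In a real setup you would fetch the current live data here.
    live = pd.read_csv("numerai_live_data.csv")     # hypothetical filename
    feature_cols = [c for c in live.columns if c.startswith("feature")]
    preds = model.predict(live[feature_cols])
    return jsonify({"predictions": preds.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```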
And so how does your own money actually come into play 00:45:50.580 |
to see if their model works on the real life data, right? 00:45:56.280 |
But then you can get Numeraire many different ways. 00:46:01.960 |
You can buy it on, you can buy some on Coinbase, 00:46:08.220 |
- So what did you say this is, how do you pronounce it? 00:46:27.580 |
So, and you could buy it, you know, basically anywhere. 00:46:36.940 |
- And it's like, yeah, you need to put some money down 00:46:45.180 |
like you can't buy the cryptocurrency from us. 00:46:49.100 |
- It's like, it's also, we never, if you do badly, 00:47:00.020 |
But what's good about it is it's also not coming to us. 00:47:04.380 |
- It's not like we win when you lose or something, 00:47:27.140 |
So, since I have you here, and you said a hundredth, 00:47:47.300 |
if this is something you want to take on seriously 00:47:58.900 |
We give out really interesting scripts and ideas 00:48:03.620 |
But you're also going to need a lot of compute probably. 00:48:12.380 |
actually the very first time someone won on Numerai, 00:48:21.580 |
And then they said, "I spent way more on the compute." 00:48:26.900 |
- So this is fundamentally a machine learning problem first, 00:48:29.180 |
I think, is this is one of the exciting things, 00:48:38.540 |
no offense, but like finance people, finance-minded people. 00:48:45.380 |
But it feels like from the community that I've experienced, 00:48:52.100 |
as a fascinating problem space, source of data, 00:48:57.100 |
but ultimately they're machine learning people 00:49:17.660 |
give me some hints how I can do well at this thing. 00:49:21.300 |
'Cause this Boomer, I'm not sure I still got it, 00:49:24.180 |
but 'cause some of it is, it's like Kaggle competitions, 00:49:33.300 |
like research ideas, like fundamental innovation, 00:49:37.420 |
but I'm sure some of it is like deeply understanding, 00:49:42.700 |
And then like a lot of it will be like figuring out 00:49:47.780 |
I mean, you could argue most of deep learning research 00:49:57.740 |
in a really difficult machine learning problem. 00:50:04.740 |
where they'll set up this kind of toy problem, 00:50:08.340 |
and then there will be an out of sample test, 00:50:16.620 |
the out of sample is the real life stock market. 00:50:23.420 |
like we don't know the answer to the problem. 00:50:25.660 |
We don't, like, you'll have to find out live. 00:50:28.300 |
And so we've had users who've like submitted every week 00:50:41.780 |
And it maybe sounds like a bit too much 00:50:45.580 |
but it's the hardest because it's the stock market. 00:50:48.820 |
It's like, literally there are like billions of dollars 00:50:51.620 |
at stake and like no one's like letting it be inefficient 00:50:56.660 |
So if you can find something that works on Numera, 00:50:58.740 |
you really have something that is like working 00:51:12.260 |
But the fundamental statement here is, which I like, 00:51:16.900 |
is this really the hardest data science problem? 00:51:21.100 |
but ultimately it also boils down to a problem 00:51:26.660 |
It's made accessible, made really easy and efficient 00:51:35.500 |
it's not about the data being out there, like the weather. 00:51:46.020 |
- Like, it's not just, there's always images. 00:51:51.220 |
you give it a little title, this is community, 00:51:53.580 |
and that was one of the hardest, right, for a time, 00:51:58.580 |
and most important data science problems in the world 00:52:12.300 |
and mechanisms by which to judge your performance, 00:52:14.860 |
And Numerai is actually a step up from that. 00:52:17.340 |
Is there something more you can say about why, 00:52:19.700 |
from your perspective, it's the hardest problem 00:52:24.860 |
I mean, you said it's connected to the market. 00:52:40.460 |
We've tried to regularize the data so you can find patterns 00:52:45.300 |
by doing certain things to the features and the target. 00:52:48.620 |
But ultimately, you're in a space where you don't, 00:52:51.860 |
there's no guarantees that the out-of-sample distributions 00:52:59.660 |
And every single era, which we call on the website, 00:53:07.020 |
which is like sort of showing you the order of the time, 00:53:09.820 |
even the training data has the same dislocations. 00:53:21.300 |
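One way to respect that era structure is to evaluate a model era by era rather than on pooled rows. A small sketch with hypothetical column names; Numerai's official scoring utilities are more involved.

```python
# Sketch of evaluating a model era by era, in the spirit of the "era"
# structure described above (column names are assumptions).
import pandas as pd

def per_era_correlation(df: pd.DataFrame) -> pd.Series:
    """Spearman correlation between predictions and target within each era."""
    return df.groupby("era").apply(
        lambda era: era["prediction"].corr(era["target"], method="spearman")
    )

# Hypothetical validation frame with columns: era, prediction, target.
validation = pd.DataFrame({
    "era":        ["era1"] * 3 + ["era2"] * 3,
    "prediction": [0.1, 0.5, 0.9, 0.2, 0.8, 0.4],
    "target":     [0.0, 0.5, 1.0, 0.5, 1.0, 0.0],
})
scores = per_era_correlation(validation)
print(scores)                       # one score per era
print("mean:", scores.mean())       # consistency across eras matters as much as the mean
```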
yeah, there's so many things that you might wanna try. 00:53:26.340 |
There's unlimited possible number of models, right? 00:53:42.420 |
you said that Numerai is very much like Wall Street Bets. 00:53:46.180 |
Is there, I think it'd be interesting to dig in 00:53:52.420 |
I think you're speaking to the distributed nature 00:53:55.100 |
of the two and the power of the people nature of the two. 00:54:06.020 |
In which way is Wall Street Bets more powerful? 00:54:09.340 |
- Yeah, this is why the Wall Street Bets story 00:54:14.180 |
And looking at how, just looking at the forum 00:54:17.860 |
of Wall Street Bets, it's, I was talking earlier about how, 00:54:31.180 |
And those kinds of things make this emerging thing possible. 00:54:35.540 |
Numerai, it didn't work at all when we started. 00:54:56.700 |
we could know which model people believed in the most. 00:55:00.100 |
And we could weight models that had high stake more 00:55:04.500 |
and effectively coordinate this group of people 00:55:07.740 |
to be like, well, actually there's no incentive 00:55:20.740 |
that having a negative incentive and a positive incentive 00:55:28.140 |
is this really nice like key thing about blockchain. 00:55:44.300 |
so that you can have coordination to solve one problem. 00:55:52.060 |
because I have less money in my own hedge fund 00:56:01.620 |
In some sense, from a human psychology perspective, 00:56:04.180 |
it's fascinating that the Wall Street bets worked at all. 00:56:16.900 |
if Numerai-style staking could then be transferred 00:56:23.900 |
and not necessarily for financial investments, 00:56:26.980 |
but I wish sometimes people would have to stake something 00:56:42.180 |
that you don't have to, you can speak your mind, 00:57:02.260 |
Of course, it currently only works with Numeraire 00:57:13.500 |
In fact, we've open sourced like the code we use 00:57:18.980 |
And if Reddit wanted to, they could even use that code 00:57:23.540 |
to have enabled staking on r/WallStreetBets. 00:57:32.380 |
on how could they have more crypto stuff in there, 00:57:37.140 |
in Ethereum, because wouldn't that be interesting? 00:57:40.260 |
Like imagine you could, instead of seeing a screenshot, 00:57:43.820 |
like guys, I promise I will not sell my GameStop. 00:57:56.940 |
which no one in the world, including me, can undo. 00:58:01.060 |
That says, I have staked millions against this claim. 00:58:11.180 |
- And of course, it doesn't have to be millions. 00:58:26.740 |
they would still be short squeezing one day after the next, 00:58:33.660 |
do you think it's possible that Numerai-style infrastructure 00:58:38.660 |
where AI systems backed by humans are doing the trading 00:58:57.300 |
- Yeah, the thing is that some of them could be bad actors. 00:59:17.620 |
where we know we can make money, but it will mess it up. 00:59:22.740 |
We know we can make money, but it will mess things up. 00:59:31.900 |
Of course you can't do this, but maybe he can. 00:59:35.180 |
And maybe he really isn't doing things he knows he could do, 00:59:43.500 |
Would the Reddit army have that kind of morality 00:59:57.580 |
There'll be like one person that says, hey, maybe, 01:00:08.900 |
is somehow that like little voice, that's human morality, 01:00:13.900 |
gets silenced when we get into groups and start chanting. 01:00:22.380 |
I thought that you're saying AI systems can be dangerous, 01:00:25.940 |
but you just described how humans can be dangerous. 01:00:42.220 |
and then come up with the idea from the model, 01:00:51.740 |
But it is possible for like kind of a bunch of humans. 01:00:57.140 |
Numerai could get very powerful without it being dangerous, 01:01:01.920 |
but Wall Street bets needs to get a little bit more powerful 01:01:15.620 |
Numerai Signals and what that looks like in 10, 20, 30, 01:01:21.120 |
Like right now, I guess maybe you can correct me, 01:01:24.580 |
but the data that we're working with is like a window. 01:01:32.420 |
into a particular aspect, time period of the market. 01:01:53.180 |
for like to buy GameStop, for example, on Wall Street bets. 01:02:03.340 |
and what are the different data sets that are involved? 01:02:18.100 |
So Numerai, it's all, you have to model our data 01:02:22.780 |
Numerai Signals is whatever data you can find out there, 01:02:26.620 |
you can turn it into a signal and give it to us. 01:02:42.240 |
you're only rewarded for signals that are orthogonal 01:02:47.500 |
So you have to be doing something uncorrelated. 01:02:50.100 |
And so strange alternative data tends to have that property. 01:02:54.980 |
There isn't too many other signals that are correlated 01:03:11.620 |
there was a user that created, I think he's in India. 01:03:14.620 |
He created a signal that is scraped from Wall Street bets. 01:03:19.620 |
And now we have that signal as one of our signals 01:03:28.820 |
- And the structure of the signal is similar. 01:03:30.820 |
So is this just numbers and time series data? 01:03:35.460 |
it's kind of a, you're providing a ranking of stocks. 01:03:37.660 |
So you just say, give a one means you like the stock, 01:03:43.420 |
and you provide that for 5,000 stocks in the world. 01:03:46.780 |
- And they somehow converted the natural language 01:03:57.340 |
And so, yeah, it's taking that, making a sentiment score 01:04:04.140 |
- Like this stock sucks or this stock is awesome. 01:04:15.740 |
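A toy sketch of turning that kind of text into a ranked per-ticker signal. The tickers, posts, and keyword scoring are invented for illustration; a real Signals submission follows Numerai's own specification.

```python
# Toy sketch: text sentiment -> ranked stock signal. All data is hypothetical.
import pandas as pd

posts = pd.DataFrame({
    "ticker": ["GME", "GME", "BB", "TSLA", "AMC"],
    "text": [
        "this stock is awesome, diamond hands",
        "to the moon",
        "this stock sucks",
        "great company, holding long term",
        "not sure about this one",
    ],
})

def toy_sentiment(text: str) -> float:
    """Crude keyword-based sentiment, purely for illustration."""
    positive = sum(w in text for w in ["awesome", "moon", "great", "holding"])
    negative = sum(w in text for w in ["sucks", "not sure"])
    return float(positive - negative)

posts["score"] = posts["text"].map(toy_sentiment)
signal = posts.groupby("ticker")["score"].mean()

# Convert scores to a 0-1 rank per ticker, matching the ranking-style signal described above.
signal = signal.rank(pct=True)
print(signal.sort_values(ascending=False))
```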
This long-term, what do you see that becoming? 01:04:18.300 |
- So we wanna manage all the money in the world. 01:04:23.820 |
And to get that, we need to have all the data 01:04:34.140 |
and really good data that you would lose, right? 01:04:41.460 |
So Numerai already has some really nice data 01:04:48.420 |
And I actually think we'll 10X the amount of data 01:04:56.140 |
- So it's gonna get very big, the data we give out. 01:05:06.460 |
can turn that into a signal and give it to us. 01:05:18.620 |
that just can enter through this back door of the process. 01:05:22.180 |
- Exactly, and it's also a little bit shorter term. 01:05:26.340 |
So the signals are about a seven day time horizon, 01:05:42.180 |
in a similar sort of style of big style thinking, 01:05:46.540 |
you would like Numerai to manage all of the world's money. 01:06:11.820 |
- Exactly, I mean, the important thing to note there is, 01:06:16.140 |
And I think the biggest thing it means is like, 01:06:27.700 |
Like it's super weird how the industry works. 01:06:33.900 |
and another hedge fund has to buy the same data source 01:06:46.060 |
it's all about freeing up people who work at hedge funds 01:07:11.740 |
And it's like, he's got like Numerai posters, 01:07:25.980 |
He's working on getting us to Jupiter's moon. 01:07:29.900 |
That's his mission, the Europa Clipper mission. 01:07:34.380 |
He's smart enough that we really want his intelligence 01:07:52.260 |
- You mentioned briefly that stock markets are good. 01:08:02.000 |
do you think trading stocks is closer to gambling, 01:08:12.840 |
as opposed to betting on companies to succeed. 01:08:15.560 |
And this is maybe connected to our discussion 01:08:18.220 |
But from your sense, the way you think about it, 01:08:35.720 |
And it's like, but all the trades are speculative. 01:08:47.040 |
there's certainly a lot of aspects of gambling math 01:09:04.400 |
And small alterations to your bankroll management 01:09:07.680 |
might be better than improvements to your skill. 01:09:10.940 |
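A toy illustration of that bankroll-management point: the same edge, sized differently, produces very different outcomes. The win probability and bet sizes are hypothetical, and this is not how the fund sizes positions.

```python
# Toy illustration: skill (the edge) stays fixed, only bet sizing changes.
import random

def simulate(bet_fraction: float, win_prob: float = 0.55, n_bets: int = 1000, seed: int = 0) -> float:
    """Even-money bets with a 55% edge; returns final bankroll starting from 1.0."""
    rng = random.Random(seed)
    bankroll = 1.0
    for _ in range(n_bets):
        stake = bankroll * bet_fraction
        bankroll += stake if rng.random() < win_prob else -stake
    return bankroll

for fraction in [0.05, 0.10, 0.50]:
    print(f"bet {fraction:.0%} of bankroll each time -> final bankroll {simulate(fraction):.4f}")
# The edge never changes, but over-betting it can still grind the bankroll toward zero.
```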
So there, and then there are things we care about 01:09:15.120 |
in our fund, like we wanna make a lot of independent bets. 01:09:19.760 |
like we wanna make a lot of independent bets, 01:09:24.600 |
than if you have a lot of bets that depend on each other, 01:09:27.840 |
But yeah, I mean, the point is that you want the prices 01:09:41.600 |
- Yeah, like you shouldn't have there be like a hedge fund 01:09:46.320 |
that's able to say, well, I've looked at some data 01:10:00.000 |
do you see that the market often like drifts away 01:10:12.760 |
options and shorting and all that kind of stuff, 01:10:19.820 |
it's like layers of game on top of the actual, 01:10:29.720 |
- Yeah, there are a lot of games that people play 01:10:37.000 |
And I think a lot of the stuff people dislike 01:10:39.840 |
when they look at the history of what's happened, 01:10:48.160 |
Like these are the kind of like enemies of 2008. 01:10:52.200 |
And then the long-term capital management thing, 01:10:54.840 |
it was like they had 30 times leverage or something. 01:10:58.240 |
Just that no one, like you could just go to a gas station 01:11:09.640 |
It's like common sense just like went out the window. 01:11:12.240 |
So yeah, I don't respect long-term capital management. 01:11:20.680 |
- Okay, but Numerai doesn't actually use any derivatives 01:11:29.360 |
We, and that does help the companies we're investing in. 01:11:37.640 |
And we were not, we played some role in its success. 01:11:59.760 |
'cause let's talk to you about cryptocurrency, 01:12:07.680 |
You said Numerai, the vision, the goal is to run, 01:12:47.320 |
that's trading that represents what they're doing, 01:13:00.520 |
and all these other things and different crypto hedge funds 01:13:09.440 |
it feels like how I used to think about money stuff 01:13:35.240 |
but it's also a mechanism of exchanging wealth. 01:13:38.700 |
But what wealth means becomes a totally different thing, 01:13:44.520 |
to where it's almost like these little contracts, 01:13:54.000 |
than just like cash being exchanged at 7-Eleven, 01:14:04.480 |
when you have the ability to take out a loan? 01:14:08.780 |
You can bring a whole new future into being with finance. 01:14:13.780 |
If you couldn't get a student loan to get a college degree, 01:14:27.500 |
and like, yeah, all you have is this like loan, 01:14:30.700 |
which is like, so now you can bring a different future 01:15:06.100 |
You said you were early in on cryptocurrency. 01:15:23.740 |
And I was like, well, it has no net present value. 01:15:42.060 |
so I didn't feel like I was early in cryptocurrency. 01:15:44.380 |
In fact, 'cause I was like, it was like 2014, 01:15:48.620 |
And then, but then I really liked some of the things 01:16:10.220 |
- Like smart contracts and all that kind of stuff. 01:16:18.860 |
- Yeah, that's a cool dance between Bitcoin and Ethereum 01:16:29.860 |
Like if Bitcoin or Ethereum 2.0 or some version 01:16:37.100 |
or if there's even a concept of winning out at all. 01:16:39.900 |
Is there a cryptocurrency that you're especially 01:16:55.300 |
I'm not looking to like invest in cryptocurrencies anymore, 01:17:05.020 |
I mean, there's not, there wasn't too much difference 01:17:08.620 |
between even Ethereum and Bitcoin in some ways, right? 01:17:13.540 |
But there are some that I like the privacy ones. 01:17:15.860 |
I mean, I was like, I like Zcash for its, like, coolness. 01:17:19.060 |
It's actually, it's like a different kind of invention 01:17:25.500 |
- Okay, can you speak to just briefly to privacy? 01:17:31.140 |
some privacy of the, so I guess everything is public. 01:17:36.300 |
- Yeah, none of the transactions are private. 01:17:44.700 |
I have some Numeraire and you can just see it. 01:17:48.020 |
In fact, you can go to a website and says like, 01:17:54.540 |
And I'm like, how the hell do you guys know this? 01:17:57.700 |
- So they can reverse engineer, whatever that's called. 01:18:09.540 |
then they also, when you can make private transactions, 01:18:15.580 |
- And it's unclear, it's like, what's quite cool 01:18:18.780 |
about Zcash is I wonder what games are being played there. 01:18:27.180 |
can you describe why Dogecoin is going to win? 01:18:32.380 |
Like it very likely will take over the world. 01:18:38.740 |
Or on a more serious note, like what are your thoughts 01:18:45.100 |
where you've spoken to sort of the meme stocks, 01:18:51.260 |
It feels like the joke can become the reality 01:18:57.260 |
like the meme, the joke has power in this world. 01:19:14.980 |
It's like Elon tweeting and that becomes a catalyst 01:19:18.780 |
for everybody on the internet kind of like spreading 01:19:41.820 |
at captivating the discourse on the internet. 01:20:04.180 |
there's a little probability that I might buy it. 01:20:08.820 |
- Right, and so I imagine you just have millions of people 01:20:11.140 |
who have had all these great jokes told to them. 01:20:22.020 |
- Wouldn't that be interesting if like the entirety, 01:20:24.060 |
if we travel in time, like multiple centuries, 01:20:35.540 |
Like we're high on probably some really advanced drugs. 01:20:42.540 |
It's a weird, like dystopian future of just humor. 01:20:50.940 |
how like good it feels to just not take shit seriously 01:20:55.580 |
and just relieve like the pressure of the world. 01:20:58.700 |
At the same time, the reason I don't always like 01:21:26.020 |
Because I think both are possible directions. 01:21:30.460 |
I mean, America, I always felt that about America, 01:21:33.340 |
a lot of people are telling jokes kind of all the time 01:21:42.100 |
you're like, I really want to have a sincere conversation. 01:21:46.780 |
because everything is so, there's so much levity. 01:21:51.340 |
I like how sincere actually like your Twitter can be. 01:21:54.940 |
You're like, I am in love with the world today. 01:22:00.340 |
I'm never gonna stop because I realized like, 01:22:03.140 |
you have to be able to sometimes just be real 01:22:05.260 |
and be positive and just be, say the cliche things, 01:22:09.220 |
which ultimately those things actually capture 01:22:20.260 |
Now from an engineering perspective of being able to joke, 01:22:23.020 |
but everyone's mostly to pull back and be like, 01:22:26.860 |
here's real problems, let's solve them and so on. 01:22:37.180 |
But I guess your advice is to invest everything 01:22:55.180 |
It's just interesting 'cause my mind has been in that space 01:23:03.420 |
What's your advice on how to build a successful startup? 01:23:13.820 |
and it might be a particular thing about America, 01:23:20.660 |
tell people what you really want to happen in the world. 01:23:27.900 |
like if you're asking someone to invest in your company, 01:23:38.580 |
you actually believe, you're actually more optimistic, 01:23:54.540 |
because no one wants to work in a mediocre company, 01:23:57.540 |
no one wants to invest in a mediocre company. 01:24:02.620 |
And obviously this doesn't apply to all businesses, 01:24:04.460 |
but if you play a venture-backed startup kind of game, 01:24:28.020 |
And you should push back and play the real game. 01:24:33.860 |
So both, I mean, there's an interesting duality there. 01:24:41.100 |
about like your plans and what you are like privately, 01:24:48.100 |
- And maybe it's connected with what you were saying about, 01:24:51.940 |
Like if you appear to be sincerely optimistic 01:24:59.700 |
it's putting yourself up to be kind of like ridiculed 01:25:03.060 |
And so if you say, my mission is to, yeah, go to Mars. 01:25:20.220 |
then actually amazing people come and work with you. 01:25:30.460 |
like that fire that you have when you're optimistic 01:25:36.180 |
something totally cool, something totally new, 01:25:38.940 |
that when those people get in a room together, 01:25:50.820 |
So all of that together, ultimately, I don't know. 01:25:55.420 |
That's what makes this crazy ride of a startup 01:25:59.620 |
And Elon is an example of a person who succeeded at that. 01:26:02.500 |
There's not many other inspiring figures, which is sad. 01:26:06.100 |
I used to be at Google and there's something that happens 01:26:11.100 |
that sometimes when the company grows bigger and bigger 01:26:21.900 |
of making the world's information accessible to everyone. 01:26:24.940 |
And I remember, I don't know, that's beautiful. 01:26:27.980 |
I still love that dream of, they used to scan books, 01:26:40.500 |
I'm just awe-inspired by how awesome humans are, man. 01:26:48.620 |
I don't know what the meanings are over there, 01:26:55.260 |
And I'd love to be able to be part of something like that. 01:26:58.900 |
And you're right, for that you have to be bold. 01:27:06.820 |
Something I always talk about, especially in FinTech, 01:27:16.400 |
Even if you succeed, this is gonna be terrible. 01:27:20.240 |
And it's just strange how people can kind of get 01:27:41.460 |
So like I worked on autonomous vehicles quite a bit, 01:27:44.540 |
and there's so many ways in which you can present 01:27:48.120 |
that idea to yourself, to the team you work with, 01:27:50.700 |
to just, yeah, like to yourself when you're quietly 01:27:58.340 |
Like if you're really ambitious with autonomous vehicles, 01:28:01.240 |
it changes the nature of like human robot interaction, 01:28:10.000 |
It changes like everything about robotics and AI, 01:28:13.320 |
machine learning, it changes everything about manufacturing. 01:28:16.520 |
I mean, the cars, the transportation is so fundamentally 01:28:26.660 |
And if you go bold and take risks and be willing 01:28:33.900 |
as opposed to cautiously, you could really change the world. 01:28:37.640 |
And it's so sad for me to see all these autonomous 01:28:41.660 |
they're like really more focused about fundraising 01:28:46.520 |
They're really afraid, the entirety of their marketing 01:28:49.660 |
is grounded in fear and presenting enough smoke 01:28:53.540 |
to where they keep raising funds so they can cautiously 01:28:56.800 |
use technology of a previous decade or previous two decades 01:29:03.320 |
as opposed to do crazy things and bold and go huge 01:29:12.840 |
Like the idea can be big, but if you don't allow yourself 01:29:16.160 |
to take that idea and think really big with it, 01:29:31.360 |
How much interaction do you have with investors? 01:29:34.120 |
I get that whole process is an entire mystery to me. 01:29:36.880 |
Is there some people that just have influence 01:29:38.640 |
on the trajectory of your thinking completely? 01:29:46.960 |
Yeah, I mean, I came here and I was amazed how, 01:29:52.200 |
yeah, people would, I was only here for a few months 01:30:08.320 |
like an email update every like three minutes. 01:30:15.560 |
Like, so for some, I like it when it's just like, 01:30:31.280 |
There's a lot for them to add on like business development. 01:30:39.020 |
research into how to make a great hedge fund. 01:30:47.160 |
So they're basically is like, I believe in you. 01:30:49.080 |
There's something, I like the cut of your jib. 01:30:52.440 |
There's something in your idea, in your ambition, 01:31:12.460 |
who's actually a co-founder of Renaissance Technologies 01:31:16.820 |
in the like 1980s and worked with Jim Simons. 01:31:25.880 |
and I was meeting some other guy and he was in the room. 01:31:28.240 |
And I was explaining how quantitative finance works. 01:31:33.040 |
I was like, so, you know, they use mathematical models. 01:31:36.640 |
And then he was like, yeah, I started Renaissance. 01:31:46.720 |
So yeah, and then I think he kind of said, well, 01:31:52.280 |
he was working at First Round Capital as a partner. 01:31:55.940 |
And they kind of said they didn't want to invest. 01:31:59.200 |
And then I wrote a blog post describing the idea. 01:32:01.920 |
And I was like, I really think you guys should invest. 01:32:08.900 |
And I was like, you don't see like what I'm doing here. 01:32:11.280 |
This is a tech company, not a hedge fund, right? 01:32:22.720 |
obviously it's a risky thing you're taking on, 01:32:30.420 |
He comes up in another world of mine really often. 01:32:33.560 |
He's just a brilliant guy on the mathematics side 01:32:38.800 |
but he's also a brilliant finance, hedge fund manager guy. 01:32:44.160 |
Have you gotten a chance to interact with him at all? 01:32:47.080 |
Have you learned anything from him on the math, 01:32:51.120 |
on the finance, on the philosophy life side of things? 01:32:57.200 |
It was like, actually in the show "Billions", 01:33:05.640 |
And they have a lot of like World Series of Poker bracelet, 01:33:14.420 |
And I met him there and yeah, it was kind of brief, 01:33:19.220 |
but I was just like, he's like, "Oh, how do you, 01:33:39.140 |
who's the current CEO of Renaissance, Peter Brown. 01:33:43.460 |
And Peter Muller, who's a hedge fund manager at PDT. 01:34:02.420 |
- He's the guy who donated the most money to Trump. 01:34:13.500 |
So it's quite cool how, yeah, like the, it was really fun. 01:34:17.660 |
And then I managed to knock out Peter Muller. 01:34:19.620 |
I have a, I got a little trophy for knocking him out 01:34:28.660 |
But I will say Jim outlasted me in the tournament. 01:34:41.720 |
But they're also, so it was a $10,000 buy-in. 01:34:49.020 |
but it all goes to charity, Jim's math charity. 01:34:53.580 |
But then the way they play, they have like rebuys. 01:35:06.260 |
and I'm like, man, like, so I ended up, you know, 01:35:12.840 |
It's like you couldn't play at all without doing that. 01:35:35.640 |
and they also don't speak about things, right? 01:35:39.920 |
So it's not like going to meet a famous rocket engineer 01:35:53.040 |
And they were also kind of making fun of me a little bit. 01:35:57.720 |
Like they would say, like they'd call me like, 01:36:08.920 |
I don't think AI is going to have a big role in finance. 01:36:13.000 |
And I was like, hearing this from the CEO of Renaissance 01:36:21.160 |
he's like, I can see it having a really big impact 01:36:28.320 |
And so I don't think it's like the perfect application. 01:36:30.680 |
And I was like, that was interesting to hear. 01:36:32.560 |
'Cause it's like, and I think there was that same day 01:36:35.580 |
that Libra, I think it is, the poker playing AI 01:36:42.860 |
So it was kind of funny hearing them like say, 01:36:44.540 |
oh, I'm not sure AI could ever attack that problem. 01:36:47.580 |
And then that very day it's attacking the problem 01:36:51.100 |
- Well, there's a kind of a magic to somebody 01:37:08.200 |
but I tend to believe from my interactions with people 01:37:11.220 |
that it's a kind of prod to say, like prove me wrong. 01:37:15.720 |
- That's ultimately, that's how those guys talk. 01:37:42.260 |
about artificial intelligence and the nature of money, 01:37:48.900 |
in terms of patterns from all of these millions 01:37:52.320 |
of humans interacting using this methodology of money? 01:37:59.660 |
In that sense, you converting it to a dataset 01:38:02.300 |
is one of the biggest gifts to the research community, 01:38:07.060 |
to the whole, anyone who loves data science and AI, 01:38:13.820 |
And I'd love to see where this goes actually. 01:38:15.620 |
- Thing I say sometimes, long before AGI destroys the world, 01:38:26.380 |
- And I don't know if I'm gonna be the one who invents that. 01:38:43.300 |
People love it when I ask this kind of question 01:38:45.580 |
about books, about ideas and philosophers and so on. 01:38:58.180 |
that had an influence on your life when you were growing up 01:39:04.540 |
that people check out, blog posts, podcasts, videos, 01:39:09.740 |
Is there something that just kind of had an impact on you 01:39:22.220 |
It's like, I was like halfway through the book. 01:39:24.860 |
And I really do like a lot of the ideas there. 01:39:34.340 |
It's like, why, like there's a little picture 01:39:40.460 |
So I like, there's kind of like some depth there. 01:39:44.220 |
if you're thinking of doing a startup, it's a good book. 01:39:47.300 |
- A book I like a lot is maybe my favorite book 01:40:03.820 |
it like makes the world feel like kind of colder. 01:40:10.660 |
and coming from nothing should be this way or whatever. 01:40:19.260 |
And the way David Deutsch sees things and argues, 01:40:29.100 |
That's, you know, some of the statements of things like, 01:40:32.820 |
you know, anything that doesn't violate the laws of physics 01:40:45.740 |
- You've mentioned kind of advice for startups. 01:40:47.660 |
Is there in general, whether you do a startup or not, 01:40:56.260 |
and were, I would say, exceptionally successful. 01:40:59.260 |
Is there advice, somebody who's like 20 today, 18, 01:41:05.700 |
or in college and so on that you would give them? 01:41:09.540 |
- I think I often tell young people don't start companies. 01:41:16.060 |
unless you're prepared to make it your life's work. 01:41:28.580 |
I'm gonna start a company and sell it or whatever. 01:41:31.260 |
And it's just like, what are you talking about? 01:41:37.260 |
so into it that you wanna spend a really long time on it. 01:41:44.100 |
Like, is it literally the fact that you need to give 100% 01:41:49.860 |
Or is it more about just the mindset that's required? 01:41:53.620 |
- Yeah, I mean, I think, well, any, I think, yeah, 01:41:56.660 |
certainly don't wanna have a plan to sell the company 01:42:02.180 |
What's like, or it's like a company that has a very, 01:42:14.980 |
And then I also think something I've thought about recently 01:42:25.940 |
And part of me thinks if I had spent another two years there 01:42:38.460 |
to basically accelerate how long Numerai took. 01:42:41.140 |
So the idea that you can sit in an air conditioned room 01:42:44.700 |
and get free food, or even sit at home now in your underwear 01:42:48.860 |
and make a huge amount of money and learn whatever you want 01:42:57.900 |
That's the case for, I was terrified of that. 01:43:00.340 |
Like at Google, I thought I would become really comfortable 01:43:06.140 |
And that I was afraid the quant situation is, 01:43:13.260 |
that it's exceptionally valuable, the lessons you learn 01:43:17.580 |
'cause you get to get paid while you learn from others. 01:43:34.180 |
you have to be really careful on that to get comfortable. 01:43:40.820 |
And then you get, and then you convince yourself like, 01:43:48.220 |
And then there's momentum and all of a sudden 01:43:50.300 |
you're on your deathbed and there's grandchildren 01:43:56.940 |
So I'm afraid of that momentum, but you're right. 01:44:00.780 |
Like there's something special about the education you get 01:44:08.160 |
I had like a bunch of papers on quant finance, 01:44:21.440 |
and you can learn about, so that, I also thought, 01:44:28.420 |
I don't think there are too many people that know like 01:44:40.640 |
And that was because I had like free time in my job. 01:44:43.340 |
- Okay, let me ask the perfectly impractical, 01:44:52.440 |
you're trying to do so many amazing things, why? 01:44:56.240 |
What's the meaning of this life of yours or ours? 01:45:07.820 |
is like asking the wrong question or something. 01:45:12.020 |
- No, usually people get too nervous to be able to say that 01:45:29.820 |
There's a useful like a palate cleanser aspect to it 01:45:38.280 |
all the little busy, hurried day-to-day activities, 01:45:42.000 |
all the meetings, all the things you're like a part of. 01:45:52.120 |
allows you to kind of zoom out and think about it. 01:45:53.920 |
But there's ultimately, I think it's an impossible thing 01:46:03.040 |
who's exceptionally successful, exceptionally busy now, 01:46:09.800 |
to ask these kinds of questions about like death. 01:46:14.140 |
Do you consider your own mortality kind of thing? 01:46:24.900 |
- Yeah, it's amazing how many things you can like 01:46:28.100 |
that are trivial that could like occupy a lot of your mind 01:46:31.280 |
until something bad happens or something flips you. 01:46:36.280 |
- And then you start thinking about the people you love 01:46:55.720 |
So I had like almost like bronchitis or whatever. 01:47:01.980 |
but I started, and then you're forced to be isolated. 01:47:12.500 |
And then I've heard stories of, I think it's Sean Parker. 01:47:18.220 |
and he had to like just stay in bed for years. 01:47:24.860 |
So yeah, I had about 15 days of this recently, 01:47:34.620 |
- Were there moments when you were just like terrified 01:47:59.220 |
And then there are even new strains in South Africa, 01:48:04.020 |
And maybe the new strain had some interaction 01:48:17.680 |
You get confusion and kind of a lot of fatigue 01:48:24.820 |
So, you know, sometimes it's like, oh man, I feel tired. 01:48:27.440 |
Okay, I'm just gonna go have coffee and then I'll be fine. 01:48:31.480 |
I don't even wanna get out of bed to get coffee 01:48:37.120 |
there's no like quick fix cure and you're trapped at home. 01:48:40.760 |
- So now you have this little thing that happened to you 01:48:48.200 |
in trying to create something special in this world, right? 01:48:54.960 |
Listen, this was like one of my favorite conversations 01:48:58.320 |
'cause the way you think about this world of money 01:49:14.800 |
with Richard Craib, and thank you to our sponsors. 01:49:17.800 |
Audible Audiobooks, Trial Labs, Machine Learning Company, 01:49:24.240 |
and Athletic Greens, all-in-one nutrition drink. 01:49:32.080 |
And now let me leave you with some words from Warren Buffett. 01:49:36.560 |
"Games are won by players who focus on the playing field, 01:49:40.600 |
not by those whose eyes are glued to the scoreboard." 01:49:44.600 |
Thank you for listening and hope to see you next time.