
Richard Craib: WallStreetBets, Numerai, and the Future of Stock Trading | Lex Fridman Podcast #159


Chapters

0:00 Introduction
2:28 WallStreetBets and GameStop saga
16:41 Evil shorting and chill shorting
18:47 Hedge funds
24:20 Vlad
31:16 Numerai
58:32 Future of AI in stock trading
64:11 Numerai data
67:53 Is stock trading gambling or investing?
71:48 What is money?
75:05 Cryptocurrency
78:22 Dogecoin
82:52 Advice for startups
98:43 Book recommendations
100:45 Advice for young people
104:46 Meaning of life


00:00:00.000 | The following is a conversation with Richard Craib,
00:00:02.600 | founder of Numerai, which is a crowdsourced hedge fund,
00:00:07.600 | very much in the spirit of Wall Street Bets,
00:00:10.000 | but where the trading is done not directly by humans,
00:00:13.600 | but by artificial intelligence systems
00:00:15.920 | submitted by those humans.
00:00:18.300 | It's a fascinating and extremely difficult
00:00:21.080 | machine learning competition,
00:00:22.780 | where the incentives of everybody are aligned,
00:00:26.280 | the code is kept and owned by the people who develop it,
00:00:29.560 | the data, anonymized data, is very well organized
00:00:33.760 | and made freely available.
00:00:35.640 | I think this kind of idea has a chance
00:00:37.960 | to change the nature of stock trading
00:00:40.440 | and even just money management in general
00:00:42.640 | by empowering people who are interested in trading stocks
00:00:47.480 | with the modern and quickly advancing tools
00:00:50.320 | of machine learning.
00:00:51.820 | Quick mention of our sponsors.
00:00:54.160 | Audible audiobooks, Trial Labs, a machine learning company,
00:00:57.840 | Blinkist app that summarizes books,
00:01:00.920 | and Athletic Greens, all-in-one nutrition drink.
00:01:04.760 | Click the sponsor links to get a discount
00:01:06.720 | and to support this podcast.
00:01:08.700 | As a side note, let me say that this whole set of events
00:01:11.300 | around GameStop and Wall Street Bets
00:01:14.440 | has been really inspiring to me as a demonstration
00:01:18.760 | that a distributed system, a large number of regular people
00:01:23.760 | are able to coordinate and collaborate
00:01:26.560 | in taking on the elite centralized power structures,
00:01:31.560 | especially when those elites are misbehaving.
00:01:35.200 | I believe that power in as many cases as possible
00:01:39.360 | should be distributed, and in this case,
00:01:42.040 | the internet as it is for many cases
00:01:44.860 | is the fundamental enabler of that power.
00:01:48.640 | And at the core, what the internet
00:01:50.720 | in its distributed nature represents is freedom.
00:01:53.960 | Of course, the thing about freedom
00:01:55.960 | is it enables chaos or progress, or sometimes both.
00:02:00.960 | And that's kind of the point of the thing.
00:02:04.400 | Freedom is empowering, but ultimately unpredictable.
00:02:09.160 | And I think in the end, freedom wins.
00:02:12.500 | If you enjoy this podcast, subscribe on YouTube,
00:02:15.560 | review it on Apple Podcasts, follow on Spotify,
00:02:18.680 | support on Patreon, or connect with me on Twitter
00:02:21.800 | at Lex Fridman.
00:02:23.320 | And now here's my conversation with Richard Craib.
00:02:27.140 | From your perspective, can you summarize
00:02:29.840 | the important events around this amazing saga
00:02:32.680 | that we've been living through of Wall Street Bets,
00:02:34.920 | the subreddit and GameStop, and in general,
00:02:38.120 | just what are your thoughts about it
00:02:39.680 | from a technical to the philosophical level?
00:02:42.080 | - I think it's amazing.
00:02:42.920 | It's like my favorite story ever.
00:02:44.920 | Like when I was reading about it,
00:02:47.720 | I was like, this is the best.
00:02:50.400 | And it's also connected with my company,
00:02:53.840 | which we can talk about.
00:02:54.800 | But what I liked about it is like,
00:02:57.080 | I like decentralized coordination
00:02:59.480 | and looking at the mechanisms
00:03:01.560 | that these r/WallStreetBets users use
00:03:04.900 | to hype each other up, to get excited,
00:03:08.240 | to prove that they bought the stock and they're holding.
00:03:11.120 | And then also to see that how big of an impact
00:03:16.000 | that that decentralized coordination had.
00:03:19.120 | It really was a big deal.
00:03:21.240 | - Were you impressed by the distributed coordination,
00:03:24.440 | the collaboration amongst like,
00:03:26.400 | I don't know what the numbers are.
00:03:27.480 | I know in Numerai, looking at the data,
00:03:30.480 | after all of this is over and done,
00:03:32.640 | it'd be interesting to see like
00:03:34.440 | from a large scale distributed system perspective
00:03:38.200 | to see how everything played out.
00:03:40.340 | But just from your current perspective,
00:03:42.640 | what we know, is it obvious to you
00:03:45.760 | that such an incredible level of coordination could happen
00:03:49.920 | where a lot of people come together in a distributed sense,
00:03:52.760 | there's an emergent behavior that happens after that?
00:03:54.960 | - No, it's not at all obvious.
00:03:57.400 | And one of the reasons is the lack of kind of like,
00:04:00.560 | credibility.
00:04:01.780 | To coordinate with someone,
00:04:02.760 | you need to kind of make credible contracts
00:04:04.560 | or credible claims.
00:04:06.600 | So if you have a username on r/WallStreetBets,
00:04:11.600 | like some of them are, like Deep Fucking Value
00:04:14.720 | is one of them.
00:04:15.540 | The actual username, by the way,
00:04:16.960 | we're talking about,
00:04:18.240 | there's a website called Reddit
00:04:19.600 | and there's subreddits on it.
00:04:21.120 | And a lot of people, mostly anonymous,
00:04:24.120 | I think for the most part anonymous,
00:04:27.080 | can create user accounts
00:04:28.240 | and then can just talk on forum-style boards.
00:04:31.240 | You should know what Reddit is.
00:04:32.320 | If you don't know what Reddit is, check it out.
00:04:34.920 | If you don't know what Reddit is,
00:04:36.040 | maybe go to the aww subreddit first,
00:04:40.520 | A-W-W with cute pictures of cats and dogs.
00:04:43.120 | That's my recommendation.
00:04:44.080 | Anyway.
00:04:44.920 | That'd be a good start to Reddit.
00:04:46.480 | When you get into it more,
00:04:47.320 | go to r/WallStreetBets.
00:04:48.960 | - It gets dark quickly.
00:04:50.680 | Oh, we'll probably talk about that too.
00:04:53.040 | So yeah.
00:04:54.660 | So there's these users
00:04:56.240 | and there's no contracts, like you were saying.
00:04:58.160 | - There's no contracts.
00:04:59.000 | The users are anonymous,
00:05:01.440 | but there are little things that do help.
00:05:03.540 | So for example,
00:05:04.440 | if you've posted a really good investment idea in the past,
00:05:07.680 | that exists on Reddit as well.
00:05:10.160 | And it might have lots of upvotes.
00:05:12.640 | And that's also kind of like giving credibility
00:05:14.800 | to your next thing.
00:05:16.960 | And then they are also putting up screenshots.
00:05:20.800 | Like this is the,
00:05:22.480 | here's the trades I've made and here's a screenshot.
00:05:26.120 | Now you could fake the screenshot,
00:05:28.120 | but still it seems like if you've got a lot of karma
00:05:32.160 | and you've had a good performance on the community,
00:05:35.240 | it somehow becomes credible enough
00:05:37.640 | for other people to be like, you know what?
00:05:39.760 | He actually probably did put a million dollars into this.
00:05:43.220 | And you know what?
00:05:44.160 | I can follow that trade easily.
00:05:46.360 | - And there's a bunch of people like that.
00:05:47.680 | So you're kind of integrating all that information
00:05:50.280 | together yourself to see like,
00:05:52.280 | huh, there's something happening here.
00:05:53.560 | And then you jump onto this little boat of like behavior,
00:05:57.480 | like we should buy the stock or sell the stock.
00:06:00.440 | And then another person jumps on,
00:06:02.440 | another person jumps on.
00:06:04.560 | And all of a sudden you have just a huge number of people
00:06:07.300 | behaving in the same direction.
00:06:08.640 | It's like flock of whatever birds.
00:06:10.600 | - Exactly.
00:06:11.440 | What was strange with this one,
00:06:12.500 | it wasn't just let's all buy Tesla.
00:06:15.320 | We love Elon, we love the Tesla, let's all buy Tesla.
00:06:18.800 | Because that we've heard before, right?
00:06:21.040 | Everybody likes Tesla.
00:06:23.960 | Well, now they do.
00:06:25.060 | So what they did with this, in this case,
00:06:29.560 | they're buying a stock that was bad.
00:06:31.360 | They're buying it because it was bad.
00:06:33.440 | And that's really weird because that's a little bit
00:06:36.240 | too galaxy brain for a decentralized community.
00:06:42.240 | How did they come up with it?
00:06:43.240 | How did they know that was the right one?
00:06:44.800 | And the reason they liked it is because it had
00:06:47.260 | really, really high short interest.
00:06:49.880 | It had been shorted more than its own float, I believe.
00:06:53.920 | And so they figured out that if they all bought
00:06:57.520 | this bad stock, they could short squeeze some hedge funds.
00:07:02.520 | And those hedge funds would have to capitulate
00:07:05.240 | and buy the stock at really, really high prices.
00:07:08.400 | - And we should say that shorted means that
00:07:10.560 | these are a bunch of people, when you short a stock,
00:07:13.160 | you're betting on the, you're predicting that the stock
00:07:16.480 | is going to go down and then you will make money if it does.
00:07:19.560 | And then what's a short squeeze?
00:07:22.320 | - It's really that if you are a hedge fund
00:07:24.560 | and you take a big short position in a company,
00:07:27.900 | there's a certain level at which you can't sustain
00:07:32.900 | holding that position.
00:07:34.800 | There's no limit to how high a stock can go,
00:07:37.320 | but there is a limit to how low it can go, right?
00:07:39.720 | So if you short something,
00:07:40.840 | you have infinite loss potential.
00:07:43.200 | And if the stock doubles overnight, like GameStop did,
00:07:47.080 | you're putting a lot of stress on that hedge fund.
00:07:51.920 | And that hedge fund manager might have to say,
00:07:53.720 | "You know what?
00:07:54.560 | "I have to get out of the trade.
00:07:56.280 | "And the only way to get out is to buy the bad stock
00:07:59.520 | "that they don't want, like they believe will go down."
00:08:02.600 | So it's an interesting situation,
00:08:05.400 | particularly because it's not zero sum.
00:08:09.160 | If you say, "Let's all get together
00:08:11.400 | "and make a bubble in watermelons."
00:08:13.640 | You buy a bunch of watermelons, the price goes up,
00:08:16.160 | it comes down again.
00:08:17.240 | It's a zero sum game.
00:08:20.880 | If someone's already shorted a stock
00:08:22.480 | and you can make them short squeeze,
00:08:24.040 | it's actually a positive sum game.
00:08:25.900 | So yes, some Redditors will make a lot of money,
00:08:28.160 | some will lose a lot,
00:08:29.760 | but actually the whole group will make money.
00:08:32.520 | And that's really why it was such a clever thing
00:08:37.000 | for them to do.
00:08:38.160 | - And coupled with the fact that shorting,
00:08:40.680 | I mean, maybe you can push back,
00:08:42.760 | but to me always from an outsider's perspective,
00:08:45.880 | seemed, I hope I'm not using too strong of a word,
00:08:48.920 | but it seemed almost unethical.
00:08:51.240 | Maybe not unethical, maybe it's just an asshole thing to do.
00:08:54.640 | Okay, I'm speaking not from an economics
00:08:57.800 | or financial perspective,
00:08:58.840 | I'm speaking from just somebody who loves,
00:09:02.400 | I'm a fan of a lot of people,
00:09:03.560 | I love celebrating the success of a lot of people.
00:09:07.040 | And this is like the stock market equivalent of like haters.
00:09:12.040 | I know that's not what it is.
00:09:14.460 | I know that there's efficient,
00:09:15.800 | you wanna have an economically efficient mechanism
00:09:18.280 | for punishing sort of overhyped, overvalued things.
00:09:23.280 | That's what shorting I guess is designed for,
00:09:26.160 | but it just always felt like these people are just,
00:09:29.600 | because they're not just betting on the loss of the company.
00:09:33.540 | It feels like they're also using their leverage and power
00:09:37.380 | to manipulate media or just to write articles
00:09:40.820 | or just to hate on you on social media.
00:09:43.460 | Then you get to see that with Elon Musk and so on.
00:09:46.540 | So this is like the man,
00:09:50.540 | so people like hedge funds that were shorting
00:09:53.820 | are like the sort of embodiment of the evil
00:09:58.780 | or just the bad guy,
00:10:00.300 | the overpowerful that's misusing their power.
00:10:02.900 | And here's the crowd,
00:10:04.180 | the people that are standing up and rising up.
00:10:06.420 | So it's not just that they were able to collaborate
00:10:10.540 | on Wall Street bets to sort of effectively
00:10:13.380 | make money for themselves.
00:10:14.860 | It's also that this is like a symbol
00:10:17.420 | of the people getting together
00:10:19.460 | and fighting the centralized elites, the powerful.
00:10:23.420 | And that, I don't know what your thoughts are
00:10:27.180 | about that in general.
00:10:28.620 | At this stage, it feels like that's really exciting
00:10:32.180 | that people have power,
00:10:35.340 | just like regular people have power.
00:10:38.220 | At the same time, it's scary a little bit
00:10:40.820 | because just studying history,
00:10:44.340 | people can be manipulated by charismatic leaders.
00:10:48.060 | And so just like Elon right now is manipulating,
00:10:54.260 | encouraging people to buy Dogecoin or whatever,
00:10:56.740 | there can be good charismatic leaders
00:11:00.820 | and there can be bad charismatic leaders.
00:11:02.620 | And so it's nerve wracking.
00:11:04.620 | It's a little bit scary how much power a subreddit can have
00:11:08.620 | to destroy somebody.
00:11:11.260 | Because right now we're celebrating
00:11:13.020 | they might be attacking or destroying somebody
00:11:15.060 | that everybody doesn't like.
00:11:17.180 | But what if they attack somebody
00:11:19.180 | that is actually good for this world?
00:11:21.060 | So that, and that's kind of the awesomeness
00:11:25.980 | and the price of freedom.
00:11:28.580 | It's like it could destroy the world
00:11:31.220 | or it can save the world.
00:11:32.660 | But at this stage, it feels like, I don't know,
00:11:35.220 | overall, when you sit back,
00:11:36.740 | do you think this was just a positive wave
00:11:39.660 | of emergent behavior?
00:11:41.100 | Is there something negative about what happened?
00:11:43.300 | - Well, yeah, the cool thing is
00:11:45.180 | the Reddit people weren't doing anything exotic.
00:11:50.180 | It was a creative trade, but it wasn't exotic.
00:11:55.140 | It was just buying the stock.
00:11:57.740 | Okay, maybe they bought some options too.
00:12:00.340 | But it was the hedge fund that was doing the exotic thing.
00:12:03.940 | So I like that.
00:12:06.860 | It's hard to say, well, we've got together
00:12:10.620 | and we've pulled all our money together
00:12:13.020 | and now there's a company out there that's worth more.
00:12:16.540 | What's wrong with that?
00:12:17.700 | - Yeah. - Right?
00:12:18.660 | But it doesn't talk about the motivations,
00:12:21.100 | which is, and then we destroyed some hedge funds
00:12:23.380 | in the process.
00:12:25.100 | - Is there something to be said about the humor
00:12:28.260 | and the, I don't know, the edginess,
00:12:31.900 | sometimes viciousness of that subreddit?
00:12:34.300 | I haven't looked at it too much,
00:12:36.540 | but it feels like people can be quite aggressive on there.
00:12:40.980 | So is there, what is that?
00:12:45.260 | Is that what freedom looks like?
00:12:48.180 | - I think it does, yeah.
00:12:50.180 | You definitely need to let people,
00:12:52.460 | one of the things that people have compared it to
00:12:54.820 | is the Occupy Wall Street,
00:12:56.820 | which is, let's say, some very sincere liberals,
00:13:01.460 | like 23 years old, whatever,
00:13:03.580 | and they go out with signs
00:13:05.060 | and they have some kind of case to make.
00:13:07.460 | But this isn't sincere, really.
00:13:11.820 | It's like a little bit more nihilistic,
00:13:14.780 | a little bit more YOLO,
00:13:16.140 | and therefore a little bit more scary
00:13:19.420 | because who's scared of the Occupy Wall Street people
00:13:22.820 | with the signs?
00:13:23.700 | Nobody.
00:13:24.580 | But these hedge funds really are scared.
00:13:26.060 | I was scared of the Wall Street Bets people.
00:13:29.860 | I'm still scared of them.
00:13:31.940 | - Yeah, the anonymity is a bit terrifying and exciting.
00:13:36.940 | - Yeah.
00:13:37.940 | - I mean, yeah, I don't know what to do with it.
00:13:40.580 | I've been following events in Russia, for example.
00:13:43.580 | It's like there's a struggle between centralized power
00:13:46.220 | and the distributed.
00:13:47.540 | I mean, that's the struggle of the history
00:13:50.340 | of human civilization, right?
00:13:51.820 | But this, on the internet,
00:13:54.100 | just that you can multiply people.
00:13:58.100 | Some of them don't have to be real.
00:13:59.980 | You can probably create bots.
00:14:01.620 | It starts getting me, as a programmer,
00:14:05.540 | I start to think, hmm, me as one person,
00:14:08.820 | how much chaos can I create by writing some bots?
00:14:12.300 | - Yeah.
00:14:13.420 | - And I'm sure I'm not the only one thinking that.
00:14:17.180 | I'm sure there's hundreds, thousands
00:14:19.820 | of good developers out there listening to this,
00:14:22.460 | thinking the same thing.
00:14:23.900 | And then as that develops further and further
00:14:26.580 | in the next decade or two,
00:14:28.340 | what impact does that have on financial markets,
00:14:30.580 | on just destruction of reputations,
00:14:34.860 | or politics, the bickering of left and right
00:14:41.500 | political discourse, the dynamics of that
00:14:44.060 | being manipulated by, people talk about Russian bots
00:14:48.500 | or whatever.
00:14:49.900 | We're probably in the very early stage of that, right?
00:14:52.700 | - Exactly.
00:14:54.020 | - And this is a good example.
00:14:56.580 | So do you have a sense that most of Wall Street Bets folks
00:15:00.580 | are actually individual people?
00:15:02.620 | Right, that's the feeling I have,
00:15:04.020 | is there's just individual, maybe young investors
00:15:06.780 | just doing a little bit of an investment,
00:15:08.740 | but just on a large scale.
00:15:10.140 | - Yeah, exactly.
00:15:11.220 | The reason I found out,
00:15:12.740 | I've known about Wall Street Bets for a while,
00:15:14.060 | but the reason I found out about GameStop was this,
00:15:16.620 | I met somebody at a party who told me about it,
00:15:18.940 | and he was like 21 years old,
00:15:20.300 | and he's like, it's gonna go up 100%,
00:15:22.260 | in the next one day.
00:15:23.900 | - Are we talking about last year?
00:15:26.060 | - This was probably, no, this was, yeah, a few days ago.
00:15:29.260 | Yeah, it was like maybe two weeks ago or something.
00:15:33.340 | So it was already high, GameStop,
00:15:37.460 | but it was just strange to me that there was someone
00:15:39.340 | telling me at a party how to trade stocks,
00:15:42.780 | who was like 21 years old,
00:15:44.380 | and I started to look into it,
00:15:49.100 | and yeah, and he did make,
00:15:51.540 | he made 140% in one day, he was right,
00:15:55.540 | and now he's supercharged.
00:15:57.900 | He's a little bit wealthier,
00:15:58.980 | and now he's gonna wait for the next thing,
00:16:01.180 | and this decentralized entity
00:16:03.100 | is just gonna get bigger and bigger.
00:16:04.580 | - And they're gonna together search for the next thing.
00:16:06.580 | So there's thousands of folks like him,
00:16:08.820 | and they're going to probably search
00:16:10.020 | for the next thing to attack.
00:16:11.580 | People that have power in this world,
00:16:13.420 | that sit there with power right now,
00:16:15.700 | in government, in finance, in any kind of position,
00:16:20.220 | are probably a little bit scared right now,
00:16:22.900 | and honestly, that's probably a little bit good.
00:16:26.260 | It's dangerous, but it's good.
00:16:28.060 | - Yeah, it certainly makes you think twice about shorting.
00:16:31.380 | It certainly makes you think twice
00:16:32.700 | about putting a lot of money into a short.
00:16:34.980 | Like these funds put a lot into one or two names,
00:16:38.620 | and so it was very, very badly risk managed.
00:16:41.700 | - Do you think shorting is,
00:16:44.500 | can you speak at a high level,
00:16:45.980 | just for your own as a person,
00:16:47.900 | is it good for the world?
00:16:50.060 | Is it good for markets?
00:16:52.140 | - I do think that there are two kinds of shorting.
00:16:55.140 | Evil shorting.
00:16:56.140 | (laughing)
00:16:58.300 | And chill shorting.
00:16:59.380 | Evil shorting is what Melvin Capital was doing.
00:17:04.460 | And it's like you put a huge position down,
00:17:08.780 | you get all your buddies to also short it,
00:17:10.780 | and you start making press,
00:17:12.420 | and trying to bring this company down.
00:17:16.660 | And I don't think, in some cases,
00:17:19.340 | you go out after fraudulent companies,
00:17:22.300 | say this company's a fraud,
00:17:24.100 | maybe that's okay.
00:17:25.620 | But they weren't even saying it,
00:17:27.620 | they're just saying it's a bad company,
00:17:29.180 | and we're gonna bring it to the ground,
00:17:30.980 | bring it to its knees.
00:17:32.100 | A quant fund, like Numerai,
00:17:37.060 | we always have lots of positions,
00:17:39.100 | and we never have a position
00:17:40.220 | that's more than 1% of our fund.
00:17:43.380 | So we actually have, right now, 250 shorts.
00:17:48.220 | I don't know any of them, except for one,
00:17:52.020 | because it was one of the meme stocks.
00:17:53.860 | (laughing)
00:17:56.100 | But we're shorting them, not to make them go,
00:18:01.820 | we don't even want them to go down necessarily.
00:18:04.260 | That does sound a bit strange, that I say that,
00:18:05.900 | but we just want them to not go up
00:18:09.660 | as much as our longs.
00:18:11.380 | - Right.
00:18:12.220 | - So by shorting a little bit,
00:18:16.220 | we can actually go long more
00:18:17.860 | in the things we do believe in.
00:18:19.540 | So when we were going long in Tesla,
00:18:22.340 | we could do it with more money than we had,
00:18:24.860 | because we'd borrow from banks,
00:18:27.740 | who would lend us money,
00:18:29.580 | because we had longs and shorts,
00:18:32.180 | because we didn't have market exposure,
00:18:34.580 | we didn't have market risk.
00:18:35.740 | And so I think that's a good thing,
00:18:37.340 | because that means we can short the oil companies
00:18:41.020 | and go long Tesla and make the future come forward faster.
00:18:44.140 | And I do think that's not a bad thing.
00:18:46.100 | - So we talked about this incredible distributed system
00:18:50.980 | created by Wall Street Bets.
00:18:52.580 | And then there's a platform, which is Robinhood,
00:18:56.900 | which allows investors to efficiently,
00:18:58.980 | as far as you can correct me if I'm wrong,
00:19:00.460 | but there's those and there's others
00:19:03.180 | and there's Numerai that allow you to make it accessible
00:19:07.180 | for people to invest.
00:19:09.540 | But that said, Robinhood was, in a centralized way,
00:19:14.540 | applied its power to restrict trading
00:19:17.700 | on the stock that we're referring to.
00:19:19.860 | Do you have a thought on actually,
00:19:21.820 | like the things that happened?
00:19:23.900 | I don't know how much you were paying attention
00:19:26.540 | to sort of the shadiness around the whole thing.
00:19:30.340 | Do you think it was forced to do it?
00:19:32.460 | Or was there something shady going on?
00:19:34.380 | What are your thoughts in general?
00:19:36.020 | - Well, I think I wanna see the alternate history.
00:19:39.820 | Like I wanna see the counterfactual history
00:19:42.940 | of them not doing that.
00:19:44.100 | - Not doing it.
00:19:44.940 | - How bad would it have gotten for hedge funds?
00:19:47.940 | How much more damage could have been done
00:19:50.140 | if the momentum of these short squeezes could continue?
00:19:53.340 | What happens when there are short squeezes,
00:19:58.700 | even if they're in a few stocks,
00:20:01.860 | they affect kind of all the other shorts too.
00:20:04.340 | And suddenly brokers are saying things like,
00:20:07.100 | you need to put up more collateral.
00:20:08.700 | So we had a short.
00:20:09.940 | It wasn't GameStop, luckily, it was BlackBerry.
00:20:15.020 | And it went up like 100% in a day.
00:20:16.900 | It was one of these meme stocks, super bad company.
00:20:19.020 | The AIs don't like it, okay?
00:20:21.100 | The AIs think it's going down.
00:20:22.580 | - What's a meme stock?
00:20:23.820 | - A meme stock is kind of a new term for these stocks
00:20:27.580 | that catch memetic momentum on Reddit.
00:20:33.380 | And so the meme stocks were GameStop, the biggest one,
00:20:37.460 | GameStonk, as Elon calls it, AMC.
00:20:40.860 | And BlackBerry was one, Nokia was one.
00:20:46.460 | So these are high short interest stocks as well.
00:20:49.420 | So these are targeted stocks.
00:20:51.380 | Some people say, isn't it adorable
00:20:54.300 | that these people are investing money
00:20:57.380 | in these companies that are nostalgic?
00:21:00.540 | It's like you go into the AMC movie theater,
00:21:02.780 | it's like nostalgic.
00:21:03.700 | It's like, no, it's not why they're doing it.
00:21:06.220 | It's that they had a lot of short interest.
00:21:08.100 | That was the main thing.
00:21:09.460 | And so they were high chance of short squeeze.
00:21:12.660 | - In saying, I would love to see an alternate history,
00:21:15.180 | do you have a sense that that,
00:21:17.100 | what is your prediction
00:21:19.300 | of what that history would have looked like?
00:21:20.900 | - Well, you wouldn't have needed very many more days
00:21:23.980 | of that kind of chaos to hurt hedge funds.
00:21:29.340 | I think it's underrated how damaging it could have been.
00:21:32.820 | Because when your shorts go up,
00:21:38.300 | your collateral requirements for them go up.
00:21:41.500 | Similar to Robinhood.
00:21:42.780 | Like we have a prime broker that said to us,
00:21:46.940 | you need to put up like $40 per $100 of short exposure.
00:21:51.940 | And then the next day they said,
00:21:53.100 | actually you have to put up all of it, 100%.
00:21:56.940 | And we were like, what?
00:21:59.060 | But if that happens to all the short,
00:22:02.540 | all the commonly held hedge fund shorts,
00:22:05.660 | because they're all kind of holding the same things.
00:22:08.420 | If that happens, not only do you have to cover the short,
00:22:13.420 | which means you're buying the bad companies,
00:22:17.300 | you need to sell your good companies
00:22:19.860 | in order to cover the short.
00:22:22.140 | So suddenly like all the good companies,
00:22:24.780 | all the ones that the hedge funds like are coming down
00:22:27.380 | and all the ones that the hedge funds hate are going up
00:22:30.860 | in a cascading way.
00:22:35.420 | So I believe that if you could have had a few more days
00:22:38.740 | of GameStop doubling, AMC doubling,
00:22:42.500 | you would have had more and more hedge fund deleveraging.
00:22:45.500 | - But so hedge funds, I mean, they get a lot of shit,
00:22:48.420 | but do you have a sense that they do some good
00:22:52.540 | for the world?
00:22:53.500 | I mean, ultimately, so, okay, first of all,
00:22:55.980 | Wall Street Bets itself is a kind of distributed hedge fund,
00:22:59.340 | Numerai is a kind of hedge fund.
00:23:01.580 | So hedge fund is a very broad category.
00:23:04.060 | I mean, like if some of those were destroyed,
00:23:07.260 | would that be good for the world?
00:23:08.940 | Or would there be coupled with the destroying
00:23:13.940 | the evil shorting, would there be just a lot of pain
00:23:16.700 | in terms of investment in good companies?
00:23:18.980 | - Yeah, a thing I like to tell people
00:23:21.380 | if they hate hedge funds is,
00:23:23.700 | I don't think you wanna rerun American economic history
00:23:27.980 | without hedge funds.
00:23:29.700 | - So en masse, they're good.
00:23:34.020 | - Yeah, you really wouldn't want to,
00:23:36.060 | because hedge funds are kind of like picking up,
00:23:38.660 | they're making liquidity, right, in stocks.
00:23:41.220 | And so if you love venture capitalists,
00:23:45.580 | they're investing in new technology, it's so good,
00:23:48.380 | you have to also kind of like hedge funds,
00:23:50.580 | because they're the reason venture capitalists exist,
00:23:54.060 | because their companies can have a liquidity event
00:23:56.540 | when they go to the public markets.
00:23:58.980 | So it's kind of essential that we have them.
00:24:01.980 | There are many different kinds of them.
00:24:04.460 | I believe we could maybe get away
00:24:06.260 | with only having an AI hedge fund,
00:24:09.620 | but we don't necessarily need these evil
00:24:14.020 | billions type hedge funds that make the media
00:24:16.820 | and try to kill companies,
00:24:18.660 | but we definitely need hedge funds.
00:24:20.180 | - Maybe from your perspective,
00:24:21.740 | because you run such an organization,
00:24:26.740 | and Vlad, the CEO of Robinhood,
00:24:30.740 | sort of had to make decisions really quickly,
00:24:32.700 | probably had to wake up in the middle of the night
00:24:34.420 | kind of thing, and he also had a conversation
00:24:38.220 | with Elon Musk on Clubhouse, which I just signed up for.
00:24:42.340 | It was a fascinating, one of the great
00:24:44.660 | journalistic performances of our time with Elon Musk.
00:24:49.180 | - Pulitzer Prize for Elon.
00:24:51.100 | - How hilarious would it be if he gets a Pulitzer Prize?
00:24:53.900 | (laughing)
00:24:55.900 | And then his Wikipedia would be like,
00:24:57.340 | journalist and part-time entrepreneur.
00:25:00.460 | - Business magnate.
00:25:01.300 | - And business magnate.
00:25:03.900 | I don't know if you can comment on any aspects of that,
00:25:06.740 | but if you were Vlad, how would you do things differently?
00:25:10.100 | What are your thoughts about his interaction with Elon,
00:25:13.340 | how he should have played it differently?
00:25:16.980 | - I guess there's a lot of aspects to this interaction.
00:25:19.380 | One is about transparency, like how much do you want
00:25:22.340 | to tell people about really what went down,
00:25:24.860 | there's NDAs potentially involved,
00:25:27.280 | how much in private do you want to push back
00:25:33.260 | and say no, fuck you, to centralized power?
00:25:36.860 | Whatever the phone calls you're getting,
00:25:38.140 | which I'm sure he was getting some kind of phone calls
00:25:40.820 | that might not be contractual,
00:25:42.660 | like it's not contracts that are forcing him,
00:25:44.800 | but he was being, what do you call it,
00:25:48.000 | like pressured to behave in certain kinds of ways
00:25:50.560 | from all kinds of directions.
00:25:52.200 | Like what do you take from this whole situation?
00:25:56.360 | - I was very excited to see Vlad's response.
00:25:58.920 | I mean, it was pretty cool to have him talk to Elon.
00:26:01.480 | And one of the things that struck me
00:26:03.400 | in the first few seconds of Vlad speaking was,
00:26:06.840 | I was like, is Vlad like a boomer?
00:26:10.920 | (laughing)
00:26:12.680 | Like, but here we are.
00:26:13.960 | He seemed like a 55 year old man talking to a 20 year old.
00:26:18.680 | Elon was like the 20 year old,
00:26:20.520 | and he's like the 55 year old man.
00:26:22.520 | You can see why Citadel and him are buddies, right?
00:26:26.080 | Like you can, you can see why.
00:26:27.880 | It's like, this is a nice, it's not a bad thing.
00:26:30.240 | It's like he's got a respectable, professional attitude.
00:26:35.240 | - Well, he also tried to do like a jokey thing,
00:26:38.200 | like, no, we're not being ageist here, boomer,
00:26:42.700 | but like a 60 year old CEO of Bank of America
00:26:46.920 | would try to make a joke for the kids.
00:26:48.480 | That's what Vlad sounded like.
00:26:49.920 | - Yeah, I was like, what is this?
00:26:51.840 | This guy's like, what is he, 30?
00:26:53.760 | - Yeah.
00:26:55.080 | - And I'm like, this is weird.
00:26:56.920 | - Yeah.
00:26:58.120 | - But I think, and maybe that's also what I like
00:27:00.560 | about Elon's kind of influence on American business.
00:27:03.560 | It's like, he's super like anti the professional.
00:27:07.320 | Like why say, you know, a hundred words about nothing?
00:27:13.140 | And so I liked how he was cutting in and saying,
00:27:15.340 | Vlad, what do you mean?
00:27:16.180 | Spill the beans, bro.
00:27:17.460 | - Yeah, so you don't have to be courteous.
00:27:19.540 | It's like the first principles thinking,
00:27:21.100 | it's like, what the hell happened?
00:27:23.260 | - Yes.
00:27:24.100 | - And let's just talk like normal people.
00:27:26.580 | The problem of course is, you know, for Elon,
00:27:31.580 | it's cost him, what is it?
00:27:33.300 | Tens of millions of dollars, his tweeting like that.
00:27:36.540 | But perhaps it's a worthy price to pay
00:27:39.660 | because ultimately there's something magical
00:27:42.160 | about just being real and honest
00:27:45.820 | and just going off the cuff and making mistakes
00:27:48.340 | and paying for them, but just being real.
00:27:50.800 | And then moments like this,
00:27:52.900 | that was an opportunity for Vlad to be that.
00:27:55.060 | And it felt like he wasn't.
00:27:56.460 | Do you think we'll ever find out what really went down
00:28:02.380 | if there was something shady underneath it all?
00:28:05.220 | - Yeah, I mean, it would be sad if nothing shady happened.
00:28:09.540 | - Right.
00:28:10.380 | - And then his presence made it shady.
00:28:12.160 | - Sometimes I feel like that with Mark Zuckerberg,
00:28:15.280 | the CEO of Facebook.
00:28:18.040 | Sometimes I feel like, yeah,
00:28:19.080 | there's a lot of shitty things that Facebook is doing,
00:28:22.320 | but sometimes I think he makes it look worse
00:28:25.160 | by the way he presents himself about those things.
00:28:28.500 | Like, I honestly think that a large amount of people
00:28:31.440 | at Facebook just have a huge, unstable, chaotic system
00:28:36.840 | and they're all, not all, but a mass are trying to do good
00:28:40.780 | with this chaotic system.
00:28:42.340 | But the presentation is like,
00:28:43.820 | it sounds like there's a lot of back room conversations
00:28:47.700 | that are trying to manipulate people.
00:28:49.940 | And there's something about the realness that Elon has
00:28:53.440 | that it feels like CEO should have
00:28:55.780 | and Vlad had that opportunity.
00:28:57.260 | - I think Mark Zuckerberg had that too when he was younger.
00:28:59.900 | - Younger.
00:29:00.740 | - And somebody said, you gotta be more professional, man.
00:29:02.860 | You can't say, you know, lol to an interview.
00:29:06.960 | And then suddenly he became like this distant person
00:29:10.800 | that was hot.
00:29:11.640 | Like, you'd rather have him make mistakes, but be honest,
00:29:14.960 | than be like professional and never make mistakes.
00:29:18.080 | - Yeah, one of the difficult hires, I think,
00:29:22.080 | is like marketing people or like PR people
00:29:25.760 | is you have to hire people that get the fact
00:29:28.360 | that you can say lol on an interview.
00:29:31.040 | Or like, you know, take risks as opposed to what the PR,
00:29:36.040 | I've talked to quite a few big CEOs
00:29:39.840 | and the people around them are trying
00:29:44.480 | to constantly minimize risk of like,
00:29:46.960 | what if he says the wrong thing?
00:29:48.080 | What if she says the wrong thing?
00:29:49.560 | It's like, what?
00:29:50.400 | Like, be careful.
00:29:51.440 | It's constantly like, ooh, like, I don't know.
00:29:53.800 | And there's this nervous energy that builds up over time
00:29:56.680 | with larger, larger teams where the whole thing,
00:29:59.840 | like I visited YouTube, for example,
00:30:01.560 | everybody I talked to at YouTube,
00:30:03.560 | incredible engineering and incredible system,
00:30:06.400 | but everybody's scared.
00:30:07.680 | Like, let's be honest about this like madness
00:30:13.140 | that we have going on of huge amounts of video
00:30:15.900 | that we can't possibly ever handle.
00:30:17.700 | There's a bunch of hate on YouTube.
00:30:19.760 | There's this chaos of comments,
00:30:21.560 | bunch of conspiracy theories, some of which might be true.
00:30:24.320 | And then just like this mess that we're dealing with
00:30:27.440 | and it's exciting, it's beautiful.
00:30:30.200 | It's a place where like democratizes education,
00:30:32.880 | all that kind of stuff.
00:30:34.240 | And instead they're all like sitting in like,
00:30:36.960 | trying to be very polite and saying like,
00:30:38.760 | well, we just want to improve the health of our platform.
00:30:41.680 | Like, it's like this discussion like,
00:30:44.800 | all right, man, let's just be real.
00:30:46.120 | Let's both advertise how amazing this fricking thing is,
00:30:50.560 | but also to say like, we don't know what we're doing.
00:30:53.280 | We have all these Nazis posting videos on YouTube.
00:30:56.100 | We don't know how to like handle it.
00:30:58.040 | And just being real like that,
00:31:00.080 | 'cause I suppose that's just a skill.
00:31:02.240 | Maybe it can't be taught, but over time,
00:31:05.120 | the whatever the dynamics of the company is,
00:31:07.000 | it does seem like Zuckerberg and others get worn down.
00:31:09.840 | They just get tired.
00:31:11.000 | - Yeah.
00:31:11.960 | - They get tired of--
00:31:12.800 | - Not being real.
00:31:13.760 | - Of not being real, which is sad.
00:31:16.400 | So let's talk about Numerai,
00:31:17.800 | which is an incredible company, system, and idea, I think.
00:31:24.720 | But good place to start.
00:31:26.540 | What is Numerai and how does it work?
00:31:29.480 | - So Numerai is the first hedge fund
00:31:33.020 | that gives away all of its data.
00:31:35.200 | So this is like probably the last thing
00:31:37.700 | a hedge fund would do, right?
00:31:39.180 | Why would we give away our data?
00:31:40.300 | It's like giving away your edge.
00:31:41.940 | But the reason we do it is because we're looking for people
00:31:46.820 | to model our data.
00:31:49.060 | And the way we do it is by obfuscating the data.
00:31:52.860 | So when you look at Numerai's data
00:31:55.360 | that you can download for free,
00:31:57.520 | it just looks like a million rows
00:32:00.000 | of numbers between zero and one.
00:32:02.920 | And you have no idea what the columns mean.
00:32:05.280 | But you do know that if you're good at machine learning
00:32:09.100 | or have done regressions before,
00:32:11.520 | you know that I can still find patterns in this data,
00:32:14.640 | even though I don't know what the features mean.
00:32:17.560 | - And the data itself is a time series data.
00:32:20.480 | And even though it's obfuscated, anonymized,
00:32:24.460 | what is the source data?
00:32:25.940 | Like approximately, what are we talking about?
00:32:27.940 | - So we are buying data from lots of different data vendors.
00:32:32.380 | And they would also never want us to share that data.
00:32:35.700 | So we have strict contracts with them.
00:32:38.660 | But that's the kind of data you could never buy yourself
00:32:43.500 | unless you had maybe a million dollars a year
00:32:46.760 | of budget to buy data.
00:32:48.740 | So what's happened with the hedge fund industry
00:32:50.600 | is you have a lot of talented people
00:32:54.600 | who used to be able to trade and still can trade,
00:32:59.600 | but now they have such a data disadvantage,
00:33:02.180 | it would never make sense for them to trade themselves.
00:33:07.100 | But Numerai, by giving away this obfuscated data,
00:33:09.620 | we can give them a really, really high quality data set
00:33:12.220 | that would otherwise be very expensive.
00:33:14.820 | And they can use whatever new machine learning technique
00:33:18.660 | they want to find patterns in that data
00:33:22.060 | that we can use in our hedge fund.
00:33:24.020 | - And so how much variety is there in underlying data?
00:33:27.060 | We're talking about, I apologize if I'm using the wrong terms
00:33:31.100 | but one is just like the stock price.
00:33:34.380 | The other, there's like options and all that kind of stuff,
00:33:36.820 | like the, what are they called, order books or whatever.
00:33:39.780 | Is there maybe other totally unrelated
00:33:45.180 | to directly to the stock market data?
00:33:47.140 | Like natural language as well, all that kind of stuff.
00:33:51.340 | - Yeah, we were really focused on stock data
00:33:55.180 | that's specific to stocks.
00:33:56.620 | So things like you can have like a P,
00:33:59.900 | every stock has like a PE ratio.
00:34:01.940 | For some stocks, it's not as meaningful,
00:34:03.820 | but every stock has that.
00:34:05.600 | Every stock has one year momentum,
00:34:07.620 | how much they went up in the last year.
00:34:09.800 | But those are very common factors.
00:34:12.580 | But we try to get lots and lots of those factors
00:34:15.460 | that we have for many, many years,
00:34:17.060 | like 15, 20 years history.
00:34:19.560 | And then the setup of the problem is commonly in quant
00:34:25.740 | called like cross-sectional global equity.
00:34:28.180 | You're not really trying to say,
00:34:29.140 | I want, I believe the stock will go up.
00:34:31.960 | You're trying to say the like relative position
00:34:35.500 | of this stock in feature space
00:34:37.940 | makes it not a bad buy in a portfolio.
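A minimal sketch of what "cross-sectional" means in practice: at each date, stocks are scored against each other, and the portfolio is built from those relative ranks rather than from absolute up-or-down calls. The column names and numbers below are hypothetical, not Numerai's actual schema.

```python
import pandas as pd

# Hypothetical data: one row per (era, stock) with a model score.
df = pd.DataFrame({
    "era":   ["2021-01"] * 3 + ["2021-02"] * 3,
    "stock": ["AAA", "BBB", "CCC"] * 2,
    "score": [0.7, 0.2, 0.5, 0.1, 0.9, 0.4],
})

# Rank each stock relative to its peers within the same era (0 to 1).
df["rank"] = df.groupby("era")["score"].rank(pct=True)

# A simple cross-sectional book: long above-median names, short below-median.
df["position"] = df["rank"] - 0.5
print(df)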
00:34:41.740 | - So it captures some period of time
00:34:44.540 | and you're trying to find the patterns,
00:34:46.020 | the dynamics captured by the data of that period of time
00:34:49.940 | in order to make short-term predictions
00:34:51.740 | about what's going to happen.
00:34:53.180 | - Yeah, so our predictions are also not that short.
00:34:55.700 | We're not really caring about things like order books
00:34:58.740 | and tick data, not high frequency at all.
00:35:02.340 | We're actually holding things for quite a bit longer.
00:35:05.060 | So our prediction time horizon is about one month.
00:35:07.800 | We end up holding stocks for maybe like three or four months.
00:35:10.740 | So I kind of believe that's a little bit more like
00:35:13.380 | investing than kind of plumbing.
00:35:17.820 | Like to go long a stock that's mispriced on one exchange
00:35:21.860 | and short on another exchange, that's just arbitrage.
00:35:26.140 | But what we're trying to do is really know something more
00:35:30.020 | about the longer term future of the stock.
00:35:31.900 | - Yeah, so from the patterns,
00:35:33.300 | from these like periods of time series data,
00:35:36.980 | you're trying to understand something fundamental
00:35:39.460 | about the stock, not like about deep value,
00:35:43.260 | about like it's big in the context of the market,
00:35:46.300 | is it underpriced, overpriced, all that kind of stuff.
00:35:48.860 | So like, this is about investing.
00:35:50.780 | It's not about just like you said, high frequency trading,
00:35:54.620 | which I think is a fascinating open question
00:35:57.040 | from a machine learning perspective.
00:35:58.460 | But just to like sort of build on that.
00:36:00.880 | So you've anonymized the data
00:36:02.820 | and now you're giving away the data.
00:36:05.260 | And then now anyone can try to build algorithms
00:36:11.980 | that make investing decisions on top of that data
00:36:14.740 | or predictions on the top of that data.
00:36:16.460 | - Exactly.
00:36:17.300 | - And so that's, what does that look like?
00:36:21.380 | What's the goal of that?
00:36:22.220 | What are the underlying principles of that?
00:36:24.440 | - So the first thing is,
00:36:26.340 | we could obviously model that data in house, right?
00:36:29.060 | We can make an XGBoost model on the data
00:36:32.220 | and that would be quite good too.
00:36:36.200 | But what we're trying to do is by opening it up
00:36:40.260 | and letting anybody participate,
00:36:42.160 | we can do quite a lot better than if we modeled it ourselves
00:36:47.580 | and "a lot better" on the stock market
00:36:50.140 | doesn't need to be very much.
00:36:52.100 | Like it really matters the difference
00:36:53.980 | between whether you can make 10 or 12%
00:36:56.540 | in an equity market neutral hedge fund,
00:36:58.560 | because the whole, usually you're charging 2% fees.
00:37:02.780 | So if you can do 2% better,
00:37:05.060 | that's like all your fees, it's worth it.
00:37:07.580 | So we're trying to make sure
00:37:09.460 | that we always have the best possible model
00:37:11.340 | as new machine learning libraries come out,
00:37:13.300 | new techniques come out,
00:37:14.940 | they get automatically synthesized.
00:37:16.760 | Like if there's a great paper on supervised learning,
00:37:19.840 | someone on Numeri will figure out
00:37:23.500 | how to use it on Numerai's data.
00:37:23.500 | - And is there an ensemble of models going on
00:37:28.500 | or is it more towards kind of like one or two or three,
00:37:33.460 | like best performing models?
00:37:35.380 | - So the way we decide on how to weight
00:37:37.800 | all of the predictions together
00:37:40.380 | is by how much the users are staking on them.
00:37:44.740 | How much of the cryptocurrency
00:37:46.300 | that they're putting behind their models.
00:37:48.340 | So they're saying, I believe in my model.
00:37:51.340 | You can trust me because I'm going to put skin in the game.
00:37:54.300 | And so we can take the stake weighted predictions
00:37:57.980 | from all our users, add those together,
00:38:01.060 | average those together,
00:38:02.280 | and that's a much better model than any one model
00:38:05.980 | in the sum because ensembling a lot of models together
00:38:08.840 | is kind of the key thing you need to do in investing too.
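The stake-weighted averaging being described can be pictured with a short sketch. The users, stakes, and predictions below are invented for illustration.

```python
import numpy as np

# Three hypothetical users predict the same five stocks (values in [0, 1]),
# and each has staked some amount of NMR on their model.
predictions = np.array([
    [0.9, 0.1, 0.5, 0.7, 0.3],   # user A
    [0.8, 0.2, 0.6, 0.6, 0.4],   # user B
    [0.4, 0.9, 0.1, 0.5, 0.8],   # user C
])
stakes = np.array([100.0, 50.0, 10.0])  # NMR staked per user

# Weight each user's predictions by their share of the total stake.
weights = stakes / stakes.sum()
meta_model = weights @ predictions  # the ensembled prediction per stock
print(meta_model)
```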
00:38:12.440 | - Yeah, so you're putting,
00:38:14.520 | so there's a kind of duality from the user,
00:38:16.800 | from the perspective of a machine learning engineer,
00:38:19.400 | where you're, it's both a competition,
00:38:21.280 | just a really interesting,
00:38:22.520 | difficult machine learning problem,
00:38:24.760 | and it's a way to invest algorithmically.
00:38:29.760 | So like, and, but the way to invest algorithmically
00:38:33.840 | also is a way to put skin in the game
00:38:37.400 | that communicates to you that you're,
00:38:40.380 | the quality of the algorithm,
00:38:42.880 | and also forces you to really be serious
00:38:46.980 | about the models that you build.
00:38:49.500 | So it's like, everything just works nicely together.
00:38:52.340 | Like, I guess one way to say that
00:38:54.700 | is the interests are aligned.
00:38:57.180 | - Exactly.
00:38:58.020 | - Okay, so it's just like poker is not fun
00:39:02.380 | when it's like for very low stakes.
00:39:04.840 | The higher the stakes,
00:39:05.920 | the more the dynamics of the system
00:39:07.520 | starts playing out correctly.
00:39:09.240 | Like as a small side note,
00:39:12.160 | is there something you can say about which kind,
00:39:16.520 | looking at the big broad view of machine learning today,
00:39:19.300 | or AI, what kind of algorithms seem to do good
00:39:24.300 | in these kinds of competitions at this time?
00:39:27.200 | Is there some universal thing you can say,
00:39:30.040 | like neural networks suck,
00:39:32.120 | recurrent neural networks suck,
00:39:33.280 | transformers suck, or they're awesome,
00:39:36.200 | like old school, sort of more basic kind of classifiers
00:39:39.920 | are better, all that.
00:39:40.760 | Is there some kind of conclusions so far that you can say?
00:39:44.120 | - There is, there's definitely something pretty nice
00:39:46.280 | about tree models, like XGBoost.
00:39:50.240 | And they just seem to work pretty nicely
00:39:53.200 | on this type of data.
00:39:54.960 | So out of the box, if you're trying to come 100th
00:39:59.560 | in the competition, in the tournament,
00:40:01.480 | maybe you would try to use that.
00:40:03.080 | But what's particularly interesting about the problem
00:40:09.160 | that not many people understand,
00:40:11.880 | if you're familiar with machine learning,
00:40:14.780 | this typically will surprise you when you model our data.
00:40:18.520 | So one of the things that you look at in finance
00:40:23.520 | is you don't want to be too exposed to any one risk.
00:40:28.360 | Like even if the best sector in the world
00:40:33.360 | to invest in over the last 10 years was tech,
00:40:36.700 | does not mean you should put all of your money into tech.
00:40:40.840 | So if you train a model, it would say,
00:40:43.440 | put all your money into tech, it's super good.
00:40:45.760 | But what you want to do is actually be very careful
00:40:49.160 | of how much of this exposure you have to certain features.
00:40:53.440 | So on Numerai, what a lot of people figure out
00:40:57.800 | is actually if you train a model on this kind of data,
00:41:02.160 | you want to somehow neutralize or minimize your exposure
00:41:05.480 | to these certain features, which is unusual
00:41:08.280 | because if you did train a stoplight
00:41:11.920 | or stop street detection in computer vision,
00:41:16.000 | your favorite feature, let's say you could,
00:41:20.720 | and you have an auto encoder and it's figuring out,
00:41:22.560 | okay, it's going to be red and it's going to be white.
00:41:25.140 | That's the last thing you want to be,
00:41:28.000 | you want to reduce your exposure to.
00:41:30.400 | Why would you reduce your exposure to the thing
00:41:31.800 | that's helping you, your model the most?
00:41:34.640 | And that's actually this counterintuitive thing
00:41:36.480 | you have to do with machine learning on financial data.
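One common way to implement the neutralization he's describing is a linear least-squares projection: subtract from the predictions whatever part of them a linear model on the features can explain. This is a sketch of the general idea, not Numerai's exact procedure.

```python
import numpy as np

def neutralize(predictions: np.ndarray, features: np.ndarray,
               proportion: float = 1.0) -> np.ndarray:
    """Subtract the part of `predictions` that is linearly explainable
    by `features`, reducing exposure to those features."""
    exposure = features @ (np.linalg.pinv(features) @ predictions)
    return predictions - proportion * exposure

# Hypothetical: 1,000 stocks, 20 features, predictions heavily exposed to feature 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
preds = 0.8 * X[:, 0] + 0.2 * rng.normal(size=1000)

neutral = neutralize(preds, X)
print(np.corrcoef(preds, X[:, 0])[0, 1])    # large exposure before
print(np.corrcoef(neutral, X[:, 0])[0, 1])  # roughly zero after
```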
00:41:38.680 | - So reducing, it's reducing your exposure
00:41:41.880 | would help you generalize the things that are,
00:41:44.160 | so basically financial data has a large amount of patterns
00:41:49.160 | that appeared in the past and also a large amount
00:41:52.920 | of patterns that have not appeared in the past.
00:41:55.180 | And so like in that sense, you have to reduce the exposure
00:41:58.180 | to red lights, to the color red.
00:42:02.560 | That's interesting, but how much of this is art
00:42:05.180 | and how much of it is science from your perspective so far
00:42:08.020 | in terms of as you start to climb from the 100th position
00:42:12.380 | to the 95th in the competition?
00:42:16.100 | - Yeah, well, if you do make yourself super exposed
00:42:20.460 | to one or two features, you can have a lot of volatility
00:42:26.980 | when you're playing Numerai.
00:42:26.980 | You could maybe very rapidly rise to be high
00:42:31.240 | if you were getting lucky.
00:42:33.060 | - Yes.
00:42:34.300 | - And that's a bit like the stock market.
00:42:36.200 | Sure, take on massive risk exposure,
00:42:38.860 | put all your money into one stock
00:42:41.100 | and you might make 100%, but it doesn't in the long run
00:42:46.100 | work out very well.
00:42:47.740 | And so the best users are trying to stay high
00:42:52.740 | for as long as possible,
00:42:55.980 | not necessarily try to be first for a little bit.
00:43:00.100 | - So to me, a developer, machine learning researcher,
00:43:07.980 | how do I, Lex Fridman, participate in this competition
00:43:07.980 | and how do others, which I'm sure there'll be a lot
00:43:10.440 | of others interested in participating in this competition,
00:43:12.980 | what are, let's see, there's like a million questions,
00:43:15.780 | but like first one is how do I get started?
00:43:23.540 | - Well, you can go to numer.ai, sign up,
00:43:28.540 | download the data, and the data is pretty small.
00:43:28.540 | In the data pack you download,
00:43:32.020 | there's like an example script, Python script,
00:43:34.980 | that just builds an XGBoost model very quickly from the data.
00:43:39.360 | And so in a very short time,
00:43:44.000 | you can have an example model.
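For reference, the getting-started flow he describes looks roughly like this, assuming the official numerapi client (pip install numerapi pandas xgboost). The dataset file name below is a placeholder, since the exact names change between data releases.

```python
import numerapi
import pandas as pd
import xgboost as xgb

napi = numerapi.NumerAPI()

# Download the training data (file name is a placeholder; check the
# current data release for the real one).
napi.download_dataset("v4/train.parquet", "train.parquet")

train = pd.read_parquet("train.parquet")
feature_cols = [c for c in train.columns if c.startswith("feature")]

# The example-script approach: a gradient-boosted tree model on the features.
model = xgb.XGBRegressor(n_estimators=200, max_depth=5, learning_rate=0.01)
model.fit(train[feature_cols], train["target"])
```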
00:43:46.100 | - Is it a particular structure?
00:43:47.660 | Like what, is this model then submitted somewhere?
00:43:50.660 | So there needs to be some kind of structure
00:43:52.500 | that communicates with some kind of API.
00:43:54.700 | Like how does the whole, how does your model,
00:43:57.940 | once you build it, once you create
00:43:59.340 | a little baby Frankenstein,
00:44:01.340 | how does it then live in its--
00:44:02.660 | - Okay, well, we want you to keep your baby Frankenstein
00:44:05.060 | at home and take care of it.
00:44:06.820 | We don't want it.
00:44:07.860 | So you never upload your model to us.
00:44:10.720 | You're always only giving us predictions.
00:44:14.620 | So we never see the code that wrote your model,
00:44:17.120 | which is pretty cool.
00:44:18.100 | That our whole hedge fund is built from models
00:44:20.540 | where we've never ever seen the code.
00:44:22.380 | But it's important for the users because it's their IP,
00:44:27.020 | why would they wanna give it to us?
00:44:27.980 | - That's brilliant.
00:44:28.860 | - So they've got it themselves,
00:44:31.060 | but they can basically almost like license
00:44:34.180 | the predictions from that model to us.
00:44:36.660 | - License the prediction, yeah, yeah.
00:44:38.700 | - So--
00:44:39.540 | - Think about it.
00:44:40.360 | - What some users do is they set up a compute server
00:44:45.140 | and we call it Numerai Compute.
00:44:45.140 | It's like a little AWS kind of image
00:44:47.300 | and you can automate this process.
00:44:50.140 | So we can ping you, we can be like,
00:44:51.620 | we need more predictions now, and then you send it to us.
00:44:54.660 | - Okay, cool.
00:44:56.360 | So that's, is that described somewhere,
00:45:02.140 | like what the preferred setup is, the AWS,
00:45:02.140 | or whether another cloud platform, is there,
00:45:05.580 | I mean, is there sort of specific technical things
00:45:07.660 | you wanna say that comes to mind
00:45:09.620 | that is a good path for getting started?
00:45:12.820 | So download the data, maybe play around,
00:45:16.100 | see if you can modify the basic algorithm
00:45:20.100 | provided in the example,
00:45:25.780 | and then you would set up a little server on AWS
00:45:28.620 | that then runs this model and takes pings
00:45:31.860 | and then makes predictions.
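The ping-for-predictions setup mentioned above can be pictured as a small web service. This is a hand-rolled Flask sketch of the idea; the endpoint name, file names, and payload are illustrative only, and the real Numerai Compute tooling automates this for you.

```python
from flask import Flask, jsonify
import joblib
import pandas as pd

app = Flask(__name__)
model = joblib.load("model.pkl")  # your trained model stays on your own server

@app.route("/predict", methods=["POST"])
def predict():
    # In the real setup you'd fetch the latest live data for the round here.
    live = pd.read_parquet("live.parquet")  # placeholder for live round data
    feature_cols = [c for c in live.columns if c.startswith("feature")]
    preds = model.predict(live[feature_cols])
    return jsonify({"predictions": preds.tolist()})

if __name__ == "__main__":
    app.run(port=8080)
```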
00:45:34.160 | And so how does your own money actually come into play
00:45:37.860 | through the staking of cryptocurrency?
00:45:41.260 | - Yeah, so you don't have to stake.
00:45:44.100 | You can start without staking,
00:45:45.460 | and many users might try for months
00:45:49.020 | without staking anything at all
00:45:50.580 | to see if their model works on the real life data, right?
00:45:54.860 | And is not overfit.
00:45:56.280 | But then you can get Numeraire many different ways.
00:46:01.960 | You can buy it on, you can buy some on Coinbase,
00:46:04.940 | you can buy some on Uniswap,
00:46:06.780 | you can buy some on Binance.
00:46:08.220 | - So what did you say this is, how do you pronounce it?
00:46:12.100 | So this is the Numeraire cryptocurrency.
00:46:15.060 | - Yeah, NMR.
00:46:16.420 | - NMR, what's, did you just say NMR?
00:46:19.260 | - It is technically called Numeraire.
00:46:21.740 | - Numeraire, I like it.
00:46:23.540 | - Yeah, but NMR is simple.
00:46:26.220 | - NMR, Numeraire, okay.
00:46:27.580 | So, and you could buy it, you know, basically anywhere.
00:46:31.900 | - Yeah, so it's a bit strange
00:46:33.060 | 'cause sometimes people be like,
00:46:34.380 | is this like pay to play?
00:46:36.100 | - Right.
00:46:36.940 | - And it's like, yeah, you need to put some money down
00:46:40.740 | to show us you believe in your model.
00:46:42.580 | But weirdly, we're not selling you the,
00:46:45.180 | like you can't buy the cryptocurrency from us.
00:46:48.260 | - Right.
00:46:49.100 | - It's also, if you do badly,
00:46:54.100 | we destroy your cryptocurrency.
00:46:56.420 | Okay, that's not good, right?
00:46:58.540 | You don't want it to be destroyed.
00:47:00.020 | But what's good about it is it's also not coming to us.
00:47:03.540 | - Right.
00:47:04.380 | - It's not like we win when you lose or something,
00:47:06.980 | like we're the house.
00:47:08.580 | Like we're definitely on the same team.
00:47:10.420 | - Yes.
00:47:11.260 | - You're helping us make a hedge fund
00:47:12.500 | that's never been done before.
00:47:14.180 | - Yeah, so again, interests are aligned.
00:47:15.740 | There's no tension there at all,
00:47:18.340 | which is really fascinating.
00:47:19.940 | You're giving away everything,
00:47:21.020 | and then the IP is owned by the users,
00:47:23.020 | and the code, they never share the code.
00:47:25.700 | That's fascinating.
00:47:27.140 | So, since I have you here, and you said a hundredth,
00:47:31.020 | I didn't ask out of how many, so we'll just.
00:47:33.860 | (laughing)
00:47:34.940 | But if I then, once you get started
00:47:37.940 | and you find this interesting,
00:47:39.860 | how do you then win or do well,
00:47:44.580 | but also how do you potentially try to win
00:47:47.300 | if this is something you want to take on seriously
00:47:49.900 | from the machine learning perspective,
00:47:51.180 | not from a financial perspective?
00:47:53.300 | - Yeah, I think that, first of all,
00:47:55.780 | you want to talk to the community.
00:47:57.260 | People are pretty open.
00:47:58.900 | We give out really interesting scripts and ideas
00:48:01.940 | for things you might want to try.
00:48:03.620 | But you're also going to need a lot of compute probably.
00:48:09.500 | And so some of the best users are,
00:48:12.380 | actually the very first time someone won on Numero,
00:48:15.780 | I wrote them a personal email.
00:48:17.060 | I was like, "You've won some money.
00:48:18.820 | "We're so excited to give you $300."
00:48:21.580 | And then they said, "I spent way more on the compute."
00:48:23.900 | (laughing)
00:48:26.060 | But--
00:48:26.900 | - So this is fundamentally a machine learning problem first,
00:48:29.180 | I think, and this is one of the exciting things,
00:48:31.620 | I don't know
00:48:32.700 | in how many ways we can approach this,
00:48:34.420 | but really this is less about kind of,
00:48:38.540 | no offense, but like finance people, finance-minded people.
00:48:43.300 | They're also, I'm sure, great people.
00:48:45.380 | But it feels like from the community that I've experienced,
00:48:49.620 | these are people who see finance
00:48:52.100 | as a fascinating problem space, source of data,
00:48:57.100 | but ultimately they're machine learning people
00:48:59.980 | or AI people, which is a very different
00:49:02.340 | kind of flavor of community.
00:49:03.580 | And I mean, I should say to that,
00:49:05.740 | I'd love to participate in this,
00:49:08.940 | and I will participate in this.
00:49:10.660 | And I'd love to hear from other people,
00:49:12.860 | if you're listening to this,
00:49:13.740 | if you're a machine learning person,
00:49:15.500 | you should participate in it and tell me,
00:49:17.660 | give me some hints how I can do well at this thing.
00:49:21.300 | 'Cause this Boomer, I'm not sure I still got it,
00:49:24.180 | but 'cause some of it is, it's like Kaggle competitions,
00:49:28.380 | like some of it is certainly a set of ideas,
00:49:33.300 | like research ideas, like fundamental innovation,
00:49:37.420 | but I'm sure some of it is like deeply understanding,
00:49:40.020 | getting like an intuition about the data.
00:49:42.700 | And then like a lot of it will be like figuring out
00:49:45.780 | like what works, like tricks.
00:49:47.780 | I mean, you could argue most of deep learning research
00:49:50.060 | is just tricks on top of tricks,
00:49:51.540 | but there's some of it is just the art
00:49:55.900 | of getting to know how to work
00:49:57.740 | in a really difficult machine learning problem.
00:50:00.540 | - And I think what's important,
00:50:02.020 | the important difference with something
00:50:03.500 | like a Kaggle competition,
00:50:04.740 | where they'll set up this kind of toy problem,
00:50:08.340 | and then there will be an out of sample test,
00:50:10.660 | like, hey, you did well out of sample.
00:50:12.340 | And this is like, okay, cool.
00:50:13.860 | But what's cool with Numerai is,
00:50:16.620 | the out of sample is the real life stock market.
00:50:20.940 | We don't even know,
00:50:23.420 | like we don't know the answer to the problem.
00:50:25.660 | We don't, like, you'll have to find out live.
00:50:28.300 | And so we've had users who've like submitted every week
00:50:31.620 | for like four years,
00:50:33.660 | because it's kind of,
00:50:36.660 | we say it's the hardest data science problem
00:50:40.620 | on the planet, right?
00:50:41.780 | And it maybe sounds like a bit too much,
00:50:44.620 | like a marketing thing,
00:50:45.580 | but it's the hardest because it's the stock market.
00:50:48.820 | It's like, literally there are like billions of dollars
00:50:51.620 | at stake and like no one's like letting it be inefficient
00:50:55.460 | on purpose.
00:50:56.660 | So if you can find something that works on Numerai,
00:50:58.740 | you really have something that is like working
00:51:01.740 | on the real stock market.
00:51:03.660 | - Yeah, because there's like humans involved
00:51:05.700 | in the stock market.
00:51:06.540 | I mean, it's, you know, you could argue
00:51:08.460 | there might be harder data sets,
00:51:09.860 | like maybe predicting the weather,
00:51:11.300 | all those kinds of things.
00:51:12.260 | But the fundamental statement here is, which I like,
00:51:16.060 | I was thinking like,
00:51:16.900 | is this really the hardest data science problem?
00:51:19.500 | And you start thinking about that,
00:51:21.100 | but ultimately it also boils down to a problem
00:51:24.820 | where the data is accessible.
00:51:26.660 | It's made accessible, made really easy and efficient
00:51:31.260 | at like submitting algorithms.
00:51:33.900 | So it's not just, you know,
00:51:35.500 | it's not about the data being out there, like the weather.
00:51:38.060 | It's about making the data super accessible,
00:51:40.460 | making, building a community around it.
00:51:42.980 | Like this is what ImageNet did.
00:51:45.180 | - Exactly.
00:51:46.020 | - Like, it's not just, there's always images.
00:51:49.020 | The point is you aggregate them together,
00:51:51.220 | you give it a little title, this is community,
00:51:53.580 | and that was one of the hardest, right, for a time,
00:51:58.580 | and most important data science problems in the world
00:52:03.940 | because it was accessible,
00:52:05.260 | because it was made sort of like,
00:52:09.500 | there were mechanisms, like standards,
00:52:12.300 | and mechanisms by which to judge your performance,
00:52:14.020 | all those kinds of things.
00:52:14.860 | And Numerai is actually a step up from that.
00:52:17.340 | Is there something more you can say about why,
00:52:19.700 | from your perspective, it's the hardest problem
00:52:24.020 | in the world?
00:52:24.860 | I mean, you said it's connected to the market.
00:52:26.540 | So if you can find a pattern in the market,
00:52:29.340 | that's a really difficult thing to do
00:52:31.220 | because a lot of people are trying to do it.
00:52:33.340 | - Exactly.
00:52:34.180 | But there's also the biggest one is
00:52:36.780 | it's non-stationary time series.
00:52:40.460 | We've tried to regularize the data so you can find patterns
00:52:45.300 | by doing certain things to the features and the target.
00:52:48.620 | But ultimately, you're in a space where you don't,
00:52:51.860 | there's no guarantees that the out-of-sample distributions
00:52:55.700 | will conform to any of the training data.
00:52:59.660 | And every single era, as we call it on the website,
00:53:04.660 | every single era in the data,
00:53:07.020 | which sort of shows you the ordering in time,
00:53:09.820 | even the training data has the same dislocations.
00:53:16.300 | And so, yeah, and then there's,
00:53:21.300 | yeah, there's so many things that you might wanna try.
00:53:26.340 | There's an unlimited number of possible models, right?
00:53:29.300 | And so by having it be open,
00:53:35.300 | we can at least search that space.
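A minimal sketch of what era-wise evaluation can look like on data organized this way; the column names ("era", "prediction", "target") are illustrative stand-ins, not a claim about the exact schema:

```python
# Score a model era by era instead of pooling everything: on
# non-stationary data, the spread across eras matters as much as
# the mean. Column names here are illustrative stand-ins.
import pandas as pd

def per_era_scores(df: pd.DataFrame) -> pd.Series:
    """Spearman correlation of predictions vs. targets within each era."""
    return df.groupby("era").apply(
        lambda g: g["prediction"].corr(g["target"], method="spearman")
    )

# Toy data: two eras with very different prediction-target relationships.
df = pd.DataFrame({
    "era":        ["era1"] * 4 + ["era2"] * 4,
    "prediction": [0.1, 0.4, 0.6, 0.9, 0.2, 0.3, 0.7, 0.8],
    "target":     [0.0, 0.5, 0.5, 1.0, 1.0, 0.5, 0.5, 0.0],
})
scores = per_era_scores(df)
print(scores.mean(), scores.std())  # mean skill vs. era-to-era instability
```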
00:53:40.380 | - Zooming back out to the philosophical,
00:53:42.420 | you said that Numerai is very much like Wall Street Bets.
00:53:46.180 | Is there, I think it'd be interesting to dig in
00:53:51.580 | why you think so.
00:53:52.420 | I think you're speaking to the distributed nature
00:53:55.100 | of the two and the power of the people nature of the two.
00:53:59.620 | So maybe can you speak to the similarities
00:54:02.220 | and the differences and in which way
00:54:04.340 | is Numeri more powerful?
00:54:06.020 | In which way is Wall Street Bets more powerful?
00:54:09.340 | - Yeah, this is why the Wall Street Bets story
00:54:10.940 | is so interesting to me because it's like,
00:54:12.740 | feels like we're connected.
00:54:14.180 | And looking at how, just looking at the forum
00:54:17.860 | of Wall Street Bets, it's, I was talking earlier about how,
00:54:21.380 | how can you make credible claims?
00:54:23.420 | You're anonymous.
00:54:24.260 | Okay, well, maybe you can take a screenshot.
00:54:26.900 | Or maybe you can upvote someone.
00:54:29.020 | Maybe you can have karma on Reddit.
00:54:31.180 | And those kinds of things make this emerging thing possible.
00:54:35.540 | Numerai, it didn't work at all when we started.
00:54:40.020 | It didn't work at all.
00:54:42.460 | People made multiple accounts.
00:54:43.780 | They made really random models
00:54:45.620 | and hoped they would get lucky.
00:54:47.020 | And some of them did.
00:54:48.420 | - Yes.
00:54:49.380 | - Staking was our like solution to,
00:54:53.140 | could we make it so that we could trust,
00:54:56.700 | we could know which model people believed in the most.
00:55:00.100 | And we could weight models that had high stake more
00:55:04.500 | and effectively coordinate this group of people
00:55:07.740 | to be like, well, actually there's no incentive
00:55:10.060 | to creating bot accounts anymore.
00:55:12.460 | Either I stake my accounts,
00:55:14.540 | in which case I should believe in them
00:55:15.980 | 'cause I could lose my stake, or I don't.
00:55:18.620 | And that's a very powerful thing
00:55:20.740 | that having a negative incentive and a positive incentive
00:55:24.620 | can make things a lot better.
00:55:26.780 | And staking is this
00:55:28.140 | really nice, key thing about blockchain.
00:55:31.460 | It's like something special you can do
00:55:33.660 | where they're not even trusting us
00:55:35.780 | with their stake in some ways.
00:55:37.460 | They're trusting the blockchain, right?
00:55:39.980 | So the incentives, like you say,
00:55:42.180 | it's about making these perfect incentives
00:55:44.300 | so that you can have coordination to solve one problem.
00:55:47.820 | And nowadays I sleep easy
00:55:52.060 | because I have less money in my own hedge fund
00:55:55.620 | than our users are staking on their models.
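As a toy illustration of the two mechanics described above, weighting models by stake and then burning or growing stake by performance (plain Python, not Numerai's actual payout code; all numbers invented):

```python
# Toy stake-weighted meta-model with burns; values are invented.
import numpy as np

predictions = {                 # model -> predicted ranking of 3 stocks
    "alice": np.array([0.9, 0.2, 0.5]),
    "bob":   np.array([0.1, 0.8, 0.4]),
}
stakes = {"alice": 100.0, "bob": 25.0}   # NMR staked on each model

# Models with more skin in the game get more weight in the meta-model.
total = sum(stakes.values())
meta = sum(stakes[m] * p for m, p in predictions.items()) / total

def settle(stake: float, score: float, payout_rate: float = 0.25) -> float:
    """Grow the stake for good live performance, burn it for bad.
    The burned amount goes to no one -- the house doesn't win."""
    return stake * (1.0 + payout_rate * float(np.clip(score, -1.0, 1.0)))

print(meta)                                 # the combined prediction
print(settle(stakes["alice"], score=0.04))  # small reward
print(settle(stakes["bob"], score=-0.10))   # small burn
```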
00:56:00.580 | - That's powerful.
00:56:01.620 | In some sense, from a human psychology perspective,
00:56:04.180 | it's fascinating that the Wall Street bets worked at all.
00:56:07.380 | Amidst that chaos emerging behavior,
00:56:12.860 | like behavior that made sense emerged.
00:56:15.620 | It would be fascinating to think
00:56:16.900 | if Numerai-style staking could then be transferred
00:56:21.900 | to places like Reddit,
00:56:23.900 | and not necessarily for financial investments,
00:56:26.980 | but I wish sometimes people would have to stake something
00:56:32.980 | in the comments they make on the internet.
00:56:36.580 | That's the problem with anonymity,
00:56:39.740 | is like anonymity is freedom and power
00:56:42.180 | that you don't have to, you can speak your mind,
00:56:44.940 | but it's too easy to just be shitty.
00:56:47.580 | - Exactly.
00:56:49.860 | - So this, I mean, you're making me realize
00:56:52.340 | from like a profoundly philosophical aspect,
00:56:56.380 | Numerai staking is a really clean way
00:56:59.580 | to solve this problem.
00:57:01.220 | It's a really beautiful way.
00:57:02.260 | Of course, with Numerai it currently only works
00:57:05.340 | for a very particular problem, right?
00:57:07.740 | Not for human interaction on the internet,
00:57:10.260 | but that's fascinating.
00:57:11.540 | - Yeah, there's nothing to stop people.
00:57:13.500 | In fact, we've open sourced like the code we use
00:57:16.300 | for staking in a protocol we call Erasure.
00:57:18.980 | And if Reddit wanted to, they could even use that code
00:57:23.540 | to enable staking on r/WallStreetBets.
00:57:28.540 | And they're actually researching now,
00:57:30.540 | they've had some Ethereum grants
00:57:32.380 | on how could they have more crypto stuff in there,
00:57:37.140 | in Ethereum, because wouldn't that be interesting?
00:57:40.260 | Like imagine you could, instead of seeing a screenshot,
00:57:43.820 | like guys, I promise I will not sell my GameStop.
00:57:48.820 | We're just gonna go huge.
00:57:50.660 | We're not gonna sell at all.
00:57:52.060 | And here is a smart contract,
00:57:56.940 | which no one in the world, including me, can undo.
00:58:01.060 | That says, I have staked millions against this claim.
00:58:06.060 | - That's powerful.
00:58:09.140 | - And then what could you do?
00:58:11.180 | - And of course, it doesn't have to be millions.
00:58:12.660 | It could be just very small amount,
00:58:14.540 | but then just a huge number of users
00:58:16.300 | doing that kind of stake.
00:58:17.420 | - Exactly.
00:58:18.260 | - That could change the internet.
00:58:21.820 | - It would change, and then Wall Street.
00:58:23.820 | - It would change Wall Street.
00:58:25.220 | They would never have been able to,
00:58:26.740 | they would still be short squeezing one day after the next,
00:58:30.100 | every single hedge fund collapsing.
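A toy, plain-Python sketch of that staked-claim idea (not the actual Erasure contracts), where the key property is that a broken claim destroys the stake rather than paying it to anyone:

```python
# Hypothetical staked claim: the escrow is burned, not transferred,
# so nobody profits from punishing the author.
from dataclasses import dataclass

@dataclass
class StakedClaim:
    author: str
    text: str
    stake: float          # amount escrowed behind the claim
    burned: bool = False

    def punish(self) -> float:
        """Destroy the stake if the author breaks the claim."""
        if self.burned:
            return 0.0
        self.burned = True
        lost, self.stake = self.stake, 0.0
        return lost

claim = StakedClaim("diamond_hands", "I will not sell my GameStop", 1_000_000)
print(claim.punish())  # stake gone: credibly costly to break your word
```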
00:58:32.620 | - If we look into the future,
00:58:33.660 | do you think it's possible that Numerai-style infrastructure
00:58:38.660 | where AI systems backed by humans are doing the trading
00:58:44.020 | is what the entirety of the stock market is,
00:58:48.140 | or the entirety of the economy,
00:58:49.980 | is run by basically this army of AI systems
00:58:54.180 | with high level human supervision?
00:58:57.300 | - Yeah, the thing is that some of them could be bad actors.
00:59:02.300 | - Some of the humans?
00:59:03.420 | - No, well, these systems could be tricky.
00:59:06.020 | So actually I once met a hedge fund manager,
00:59:08.980 | this is kind of interesting.
00:59:09.860 | a very famous one, and he said,
00:59:13.020 | sometimes we can see things in the market
00:59:17.620 | where we know we can make money,
00:59:22.740 | but it will mess things up.
00:59:25.300 | And we choose not to do those things.
00:59:28.180 | And on the one hand, maybe this is like,
00:59:29.900 | oh, you're being super arrogant.
00:59:31.900 | Of course you can't do this, but maybe he can.
00:59:35.180 | And maybe he really isn't doing things he knows he could do,
00:59:39.820 | but would change, be pretty bad.
00:59:43.500 | Would the Reddit army have that kind of morality
00:59:49.580 | or concern for what they're doing?
00:59:53.620 | Probably not based on what we've seen.
00:59:55.540 | - The madness of crowds.
00:59:57.580 | There'll be like one person that says, hey, maybe,
01:00:00.780 | and then they get trampled over.
01:00:02.580 | That's the terrifying thing, actually.
01:00:05.800 | A lot of people have written about this,
01:00:08.900 | is somehow that like little voice, that's human morality,
01:00:13.900 | gets silenced when we get into groups and start chanting.
01:00:16.780 | And that's terrifying.
01:00:18.780 | But I think maybe I misunderstood.
01:00:22.380 | I thought that you're saying AI systems can be dangerous,
01:00:25.940 | but you just described how humans can be dangerous.
01:00:28.460 | So which is safer?
01:00:30.540 | - So one thing is,
01:00:32.020 | so Wall Street Bets, these kinds of attacks,
01:00:37.780 | it's not possible to model Numerai's data
01:00:42.220 | and then come up with the idea from the model,
01:00:45.060 | let's short squeeze GameStop.
01:00:47.020 | It's not even framed in that way.
01:00:49.460 | It's not like possible to have that idea.
01:00:51.740 | But it is possible for like kind of a bunch of humans.
01:00:54.980 | So I think this,
01:00:57.140 | Numerai could get very powerful without being dangerous,
01:01:01.920 | but Wall Street bets needs to get a little bit more powerful
01:01:05.580 | and it'll be pretty dangerous.
01:01:07.100 | - Yeah, well, I mean,
01:01:09.380 | so this is a good place to kind of think
01:01:12.900 | about Numerai's data today,
01:01:15.620 | Numerai Signals, and what that looks like in 10, 20, 30,
01:01:19.780 | 50, 100 years.
01:01:21.120 | Like right now, I guess maybe you can correct me,
01:01:24.580 | but the data that we're working with is like a window.
01:01:28.120 | It's an anonymized, obfuscated window
01:01:32.420 | into a particular aspect, time period of the market.
01:01:36.740 | And you can expand that more and more
01:01:39.860 | and more and more potentially.
01:01:41.220 | You can imagine in different dimensions
01:01:43.620 | to where it encapsulates all the things
01:01:45.940 | where you could include the kind of human-
01:01:50.460 | to-human communication that was available,
01:01:53.180 | like to buy GameStop, for example, on Wall Street Bets.
01:01:57.660 | So maybe the step back,
01:01:59.220 | can you speak to what is Numerai Signals
01:02:03.340 | and what are the different data sets that are involved?
01:02:07.540 | - So with Numerai Signals,
01:02:08.980 | you're still providing predictions to us
01:02:13.380 | but you can do that from your own data sets.
01:02:18.100 | So with Numerai, you have to model our data
01:02:21.020 | to come up with predictions.
01:02:22.780 | Numerai Signals is whatever data you can find out there:
01:02:26.620 | you can turn it into a signal and give it to us.
01:02:29.780 | So it's a way for us to import signals
01:02:32.620 | on data we don't yet have.
01:02:35.460 | And that's why it's particularly valuable
01:02:38.720 | because with Signals,
01:02:42.240 | you're only rewarded for signals that are orthogonal
01:02:45.020 | to our core signal.
01:02:47.500 | So you have to be doing something uncorrelated.
01:02:50.100 | And so strange alternative data tends to have that property.
01:02:54.980 | There aren't too many other signals that are correlated
01:02:58.260 | with what's happening on Wall Street bets.
01:03:02.940 | That's not gonna be like correlated
01:03:04.500 | with the price to earnings ratio, right?
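A minimal sketch of what "orthogonal to our core signal" can mean operationally: residualize a candidate signal against the core signal so only the uncorrelated part remains. The data here is synthetic:

```python
# Residualize a signal against a core signal via least squares,
# so only the component the core signal can't explain remains.
import numpy as np

def neutralize(signal: np.ndarray, core: np.ndarray) -> np.ndarray:
    """Remove the part of `signal` linearly explained by `core`."""
    design = np.column_stack([core, np.ones_like(core)])  # core + intercept
    beta, *_ = np.linalg.lstsq(design, signal, rcond=None)
    return signal - design @ beta

rng = np.random.default_rng(0)
core = rng.normal(size=1000)
signal = 0.8 * core + 0.2 * rng.normal(size=1000)  # mostly redundant signal

residual = neutralize(signal, core)
print(np.corrcoef(residual, core)[0, 1])  # ~0: only the novel part is left
```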
01:03:06.780 | And we have some users as of recently,
01:03:10.100 | as of like a week ago,
01:03:11.620 | there was a user that created, I think he's in India.
01:03:14.620 | He created a signal that is scraped from Wall Street bets.
01:03:19.620 | And now we have that signal as one of our signals
01:03:26.460 | among thousands that we use at Numerai.
01:03:28.820 | - And the structure of the signal is similar.
01:03:30.820 | So is this just numbers and time series data?
01:03:33.260 | - It's exactly, and it's just like,
01:03:35.460 | it's kind of a, you're providing a ranking of stocks.
01:03:37.660 | So you just say, a one means you like the stock,
01:03:41.500 | zero means you don't like the stock
01:03:43.420 | and you provide that for 5,000 stocks in the world.
01:03:46.780 | - And they somehow converted the natural language
01:03:49.620 | that's in Wall Street Bets.
01:03:51.860 | - Exactly, so there's, and they made,
01:03:53.300 | they open sourced this Colab notebook.
01:03:55.500 | You can go and see it and look at it.
01:03:57.340 | And so, yeah, it's taking that, making a sentiment score
01:04:00.380 | and then turning it into a rank of stocks.
01:04:02.140 | - A sentiment score.
01:04:03.220 | - Yeah.
01:04:04.140 | - Like this stock sucks or this stock is awesome.
01:04:07.220 | And then converting that.
01:04:08.820 | Just even looking at that data
01:04:10.060 | would be fascinating.
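As a minimal sketch of that last conversion step, with made-up tickers and sentiment scores turned into the zero-to-one ranking a Signals-style submission expects:

```python
# Turn per-ticker sentiment into a 0-to-1 ranking; values invented.
import pandas as pd

sentiment = pd.Series({
    "GME":  0.92,   # "this stock is awesome"
    "TSLA": 0.40,
    "ABC":  0.05,
    "XYZ": -0.75,   # "this stock sucks"
})

# Percentile rank squashes arbitrary scores into (0, 1]:
# highest sentiment near 1, lowest near 0.
signal = sentiment.rank(pct=True)
print(signal.sort_values(ascending=False))
```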
01:04:11.500 | So on the signal side, what's the vision?
01:04:15.740 | This long-term, what do you see that becoming?
01:04:18.300 | - So we wanna manage all the money in the world.
01:04:21.740 | That's Numerai's mission.
01:04:23.820 | And to get that, we need to have all the data
01:04:27.460 | and have all of the talent.
01:04:29.380 | Like there's no way, from first principles,
01:04:32.420 | if you had really good modeling
01:04:34.140 | and really good data, that you would lose, right?
01:04:37.020 | It's just a question of how much do you need
01:04:39.540 | to get really good?
01:04:41.460 | So Numerai already has some really nice data
01:04:43.980 | that we give out.
01:04:45.220 | This year, we are 10X-ing that.
01:04:48.420 | And I actually think we'll 10X the amount of data
01:04:50.620 | we have on Numerai every year
01:04:52.740 | for at least the next 10 years.
01:04:55.300 | - Wow.
01:04:56.140 | - So it's gonna get very big, the data we give out.
01:04:59.140 | And signals is more data.
01:05:03.700 | People with any other random dataset
01:05:06.460 | can turn that into a signal and give it to us.
01:05:09.180 | - And in some sense, that kind of data
01:05:10.580 | is the edge cases, the weirdness,
01:05:12.900 | so you're focused on like the bulk,
01:05:14.660 | like the main data,
01:05:16.380 | and then there's just like weirdness
01:05:17.700 | from all over the place
01:05:18.620 | that just can enter through this back door of the process.
01:05:22.180 | - Exactly, and it's also a little bit shorter term.
01:05:26.340 | So the signals are about a seven day time horizon,
01:05:31.220 | and on Numerai, it's like a 30-day.
01:05:33.940 | So it's often for faster situations.
01:05:38.580 | - You've written about a master plan,
01:05:40.380 | and you've mentioned, which I love,
01:05:42.180 | in a similar sort of style of big style thinking,
01:05:46.540 | you would like Numerai to manage all of the world's money.
01:05:51.080 | So how do we get there from yesterday
01:05:56.420 | to several years from now?
01:05:59.500 | Like what is the plan?
01:06:01.540 | You've already started to allude to it,
01:06:05.220 | to get all the data and get it--
01:06:07.060 | - Yeah.
01:06:07.900 | - And get the talent, humans, models.
01:06:11.820 | - Exactly, I mean, the important thing to note there is,
01:06:14.820 | what would that mean, right?
01:06:16.140 | And I think the biggest thing it means is like,
01:06:19.380 | if there was one hedge fund,
01:06:21.300 | you would have not so much talent wasted
01:06:26.140 | on all the other hedge funds.
01:06:27.700 | Like it's super weird how the industry works.
01:06:30.320 | It's like one hedge fund gets a data source
01:06:32.700 | and hires a PhD,
01:06:33.900 | and another hedge fund has to buy the same data source
01:06:36.340 | and hire a PhD,
01:06:37.500 | and suddenly a third of American PhDs
01:06:39.980 | are working at hedge funds,
01:06:41.260 | and we're not even on Mars.
01:06:43.660 | And like, so in some ways, Numerai,
01:06:46.060 | it's all about freeing up people who work at hedge funds
01:06:49.740 | to go work for Elon.
01:06:51.000 | - Yeah, and also the people who are working
01:06:55.740 | on the Numerai problem,
01:06:58.200 | it feels like a lot of the knowledge there
01:07:00.380 | is also transferable to other domains.
01:07:02.380 | - Exactly.
01:07:03.940 | One of our top users is,
01:07:05.260 | he works at NASA Jet Propulsion Lab.
01:07:08.140 | And he's like amazing.
01:07:09.980 | I went to go visit him there.
01:07:11.740 | And it's like, he's got like Numerai posters,
01:07:13.800 | and it looks like the movies,
01:07:16.340 | like it looks like Apollo 11 or whatever.
01:07:19.380 | Yeah, the point is he didn't quit his job
01:07:23.060 | to join full time.
01:07:25.980 | He's working on getting us to Jupiter's moon.
01:07:29.900 | That's his mission, the Europa Clipper mission.
01:07:31.940 | - Actually, literally what you're saying.
01:07:33.540 | - Literally.
01:07:34.380 | He's smart enough that we really want his intelligence
01:07:37.420 | to reach the stock market,
01:07:38.580 | 'cause the stock market's a good thing,
01:07:39.660 | hedge funds are a good thing,
01:07:40.940 | all kinds of hedge funds, especially.
01:07:43.140 | But we don't want him to quit his job.
01:07:45.620 | So he can just do Numerai on the weekends.
01:07:47.220 | And that's what he does.
01:07:48.060 | He just made a model,
01:07:48.900 | and it just automatically submits to us.
01:07:50.460 | And he's like one of our best users.
01:07:52.260 | - You mentioned briefly that stock markets are good.
01:07:56.580 | From my sort of outsider perspective,
01:07:59.580 | is there a sense,
01:08:02.000 | do you think trading stocks is closer to gambling,
01:08:06.920 | or is it closer to investing?
01:08:09.480 | Sometimes it feels like it's gambling,
01:08:12.840 | as opposed to betting on companies to succeed.
01:08:15.560 | And this is maybe connected to our discussion
01:08:17.320 | of shorting in general.
01:08:18.220 | But from your sense, the way you think about it,
01:08:21.040 | is it fundamentally still investing?
01:08:22.960 | - I do think, I mean, it's a good question.
01:08:29.720 | I've also seen lately people say,
01:08:31.960 | this is like speculation.
01:08:33.520 | Is there too much speculation in the market?
01:08:35.720 | And it's like, but all the trades are speculative.
01:08:38.520 | Like all the trades have a horizon.
01:08:40.520 | Like people want them to work.
01:08:42.520 | So I would say that,
01:08:47.040 | there's certainly a lot of aspects of gambling math
01:08:52.360 | that apply to investing.
01:08:55.000 | Like one thing you don't do in gambling
01:08:57.400 | is put all your money in one bet.
01:09:00.480 | You have bankroll management,
01:09:02.400 | and it's a key part of it.
01:09:04.400 | And small alterations to your bankroll management
01:09:07.680 | might be better than improvements to your skill.
01:09:10.940 | So there, and then there are things we care about
01:09:15.120 | in our fund, like we wanna make a lot of independent bets.
01:09:18.680 | We talk about it,
01:09:19.760 | like we wanna make a lot of independent bets,
01:09:21.840 | because that's gonna be a higher Sharpe
01:09:24.600 | than if you have a lot of bets that depend on each other,
01:09:26.480 | like all in one sector.
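A toy simulation of that point: with N uncorrelated bets of equal quality, portfolio Sharpe grows roughly like the square root of N, while correlated bets (all one sector) barely help. All numbers are invented:

```python
# Why independent bets beat correlated ones: simulate and compare.
import numpy as np

rng = np.random.default_rng(42)

def portfolio_sharpe(n_bets: int, corr: float, n_days: int = 100_000) -> float:
    """Daily Sharpe of an equal-weight portfolio of n_bets strategies,
    each with the same edge and pairwise correlation `corr`."""
    common = rng.normal(size=n_days)  # shared (e.g. sector) component
    bets = [
        0.01 + np.sqrt(corr) * common
        + np.sqrt(1.0 - corr) * rng.normal(size=n_days)
        for _ in range(n_bets)
    ]
    port = np.mean(bets, axis=0)
    return port.mean() / port.std()

print(portfolio_sharpe(1, 0.0))    # one bet: the baseline
print(portfolio_sharpe(25, 0.0))   # 25 independent bets: ~5x (sqrt of 25)
print(portfolio_sharpe(25, 0.8))   # 25 correlated bets: barely better
```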
01:09:27.840 | But yeah, I mean, the point is that you want the prices
01:09:34.360 | of the stocks to be reflective of how,
01:09:38.520 | of their value.
01:09:40.760 | - Of the underlying value.
01:09:41.600 | - Yeah, like you shouldn't have there be like a hedge fund
01:09:46.320 | that's able to say, well, I've looked at some data
01:09:49.520 | and all of this stuff's super mispriced.
01:09:52.480 | Like that's super bad for society
01:09:54.800 | if it looks like that to someone.
01:09:56.800 | - I guess the underlying question then is,
01:10:00.000 | do you see that the market often like drifts away
01:10:03.200 | from the underlying value of companies
01:10:05.840 | and it becomes a game in itself?
01:10:07.880 | Like would these, whatever they're called,
01:10:11.000 | like derivatives, like the option,
01:10:12.760 | options and shorting and all that kind of stuff,
01:10:19.820 | it's like layers of game on top of the actual,
01:10:24.240 | what you said, which is like the basic thing
01:10:26.120 | that the Wall Street Bets was doing,
01:10:27.440 | which is like just buying stocks.
01:10:29.720 | - Yeah, there are a lot of games that people play
01:10:33.280 | that are in the derivatives market.
01:10:37.000 | And I think a lot of the stuff people dislike
01:10:39.840 | when they look at the history of what's happened,
01:10:42.600 | they hate like credit default swaps
01:10:45.520 | or collateralized debt obligations.
01:10:48.160 | Like these are the kind of like enemies of 2008.
01:10:52.200 | And then the Long-Term Capital Management thing,
01:10:54.840 | it was like they had 30 times leverage or something.
01:10:58.240 | Just that no one, like you could just go to a gas station
01:11:03.240 | and ask anybody at the gas station,
01:11:05.840 | is it a good idea to have 30 times leverage?
01:11:08.240 | And they just say no.
01:11:09.640 | It's like common sense just like went out the window.
01:11:12.240 | So yeah, I don't respect Long-Term Capital Management.
01:11:17.240 | (laughing)
01:11:20.680 | - Okay, but Numerai doesn't actually use any derivatives,
01:11:24.480 | unless you call shorting a derivative.
01:11:26.920 | We do put money into companies.
01:11:29.360 | We, and that does help the companies we're investing in.
01:11:32.320 | It's just in little ways.
01:11:33.600 | We really did buy Tesla, and it did succeed.
01:11:37.640 | And we played some role in its success.
01:11:42.640 | Super small, make no mistake.
01:11:46.620 | But still, I think that's important.
01:11:48.200 | - Can I ask you a pothead question,
01:11:51.240 | which is what is money, man?
01:11:54.240 | So if we just kind of zoom out and look at,
01:11:59.760 | 'cause let's talk to you about cryptocurrency,
01:12:02.080 | which perhaps could be the future of money.
01:12:04.460 | In general, how do you think about money?
01:12:07.680 | You said Numerai's vision, the goal, is to run,
01:12:12.680 | to manage the world's money.
01:12:15.480 | What is money in your view?
01:12:18.480 | - I don't have a good answer to that,
01:12:22.320 | but it's definitely in my personal life,
01:12:25.280 | it's become more and more warped.
01:12:28.020 | And you start to care about the real thing,
01:12:33.360 | like what's really going on here.
01:12:35.700 | Elon talks about things like this,
01:12:39.120 | like what is a company really?
01:12:40.920 | And it's like, it's a bunch of people
01:12:42.080 | who like kind of show up to work together
01:12:43.480 | and like they solve a problem.
01:12:45.560 | And there might not be a stock out there
01:12:47.320 | that's trading that represents what they're doing,
01:12:49.880 | but it's not the real thing.
01:12:52.000 | And being involved in crypto,
01:12:55.040 | like I put money in the crowdsale of Ethereum
01:13:00.520 | and all these other things and different crypto hedge funds
01:13:05.520 | and things that I've invested in.
01:13:07.080 | And it's just kind of like,
01:13:09.440 | it feels like how I used to think about money stuff
01:13:13.040 | is just like totally warped.
01:13:14.840 | Because you stop caring about the price
01:13:20.960 | and you just care about the product.
01:13:25.180 | - So by the product,
01:13:27.600 | you mean like the different mechanisms
01:13:29.200 | that money is exchanged.
01:13:30.520 | I mean, money is ultimately a kind of,
01:13:32.560 | on one is a store of wealth,
01:13:35.240 | but it's also a mechanism of exchanging wealth.
01:13:38.700 | But what wealth means becomes a totally different thing,
01:13:42.600 | especially with cryptocurrency
01:13:44.520 | to where it's almost like these little contracts,
01:13:47.040 | these little agreements,
01:13:48.400 | these transactions between human beings
01:13:50.440 | that represent something that's bigger
01:13:54.000 | than just like cash being exchanged at 7-Eleven,
01:13:57.200 | it feels like.
01:13:58.120 | - Yeah, maybe I'll answer what is finance?
01:14:00.840 | It's what are you doing
01:14:04.480 | when you have the ability to take out a loan?
01:14:08.780 | You can bring a whole new future into being with finance.
01:14:13.780 | If you couldn't get a student loan to get a college degree,
01:14:20.280 | you couldn't get a college degree
01:14:22.780 | if you didn't have the money.
01:14:23.980 | But now like weirdly you can get it with,
01:14:27.500 | and like, yeah, all you have is this like loan,
01:14:30.700 | which is like, so now you can bring a different future
01:14:33.420 | into the world.
01:14:34.260 | And that's how when I was saying earlier
01:14:36.140 | about if you rerun American history,
01:14:38.100 | economic history without these things,
01:14:40.740 | like you're not allowed to take out loans,
01:14:42.540 | you're not allowed to have derivatives,
01:14:44.860 | you're not allowed to have money,
01:14:47.100 | it just doesn't really work.
01:14:48.940 | And it's a really magic thing,
01:14:50.420 | how much you can do with finance
01:14:53.380 | by kind of bringing the future forward.
01:14:56.020 | - Finance is empowering.
01:14:57.420 | It's, we sometimes forget this,
01:14:59.300 | but it enables innovation,
01:15:01.100 | it enables big risk takers and bold builders
01:15:03.980 | that ultimately make this world better.
01:15:06.100 | You said you were early in on cryptocurrency.
01:15:09.820 | Can you give your high level overview
01:15:12.100 | of just your thoughts about the past,
01:15:14.420 | present and future of cryptocurrency?
01:15:17.740 | - Yeah, so my friends told me about Bitcoin
01:15:19.820 | and I was interested in equities a lot.
01:15:23.740 | And I was like, well, it has no net present value.
01:15:27.700 | It has no future cash flows.
01:15:29.940 | Bitcoin pays no dividends.
01:15:32.140 | So I really couldn't get my head around it,
01:15:35.340 | around the idea that this could be valuable.
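For reference, the discounted-cash-flow logic being applied there is the textbook one: value is the sum of expected future cash flows discounted at some rate r, so an asset with no cash flows and no dividends prices to zero under this lens.

```latex
\mathrm{NPV} \;=\; \sum_{t=1}^{T} \frac{\mathbb{E}[CF_t]}{(1+r)^{t}},
\qquad
CF_t = 0 \ \forall t \;\Longrightarrow\; \mathrm{NPV} = 0
```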
01:15:37.860 | And then I, but I did,
01:15:42.060 | so I didn't feel like I was early in cryptocurrency.
01:15:44.380 | In fact, 'cause I was like, it was like 2014,
01:15:46.500 | it felt like a long time after Bitcoin.
01:15:48.620 | And then, but then I really liked some of the things
01:15:52.260 | that Ethereum was doing.
01:15:54.140 | It seemed like a super visionary thing.
01:15:56.900 | Like I was reading something that was,
01:15:59.460 | that was just gonna change the world
01:16:00.700 | when I was reading the white paper.
01:16:03.300 | And I liked the different constructs
01:16:06.060 | you could have inside of Ethereum
01:16:07.940 | that you couldn't have on Bitcoin.
01:16:10.220 | - Like smart contracts and all that kind of stuff.
01:16:11.780 | - Exactly, yeah.
01:16:12.620 | And yeah, they even spoke about
01:16:17.540 | different constructions you could have.
01:16:18.860 | - Yeah, that's a cool dance between Bitcoin and Ethereum
01:16:21.540 | of it's in the space of ideas.
01:16:23.620 | Feels so young.
01:16:25.260 | Like I wonder what cryptocurrencies
01:16:27.500 | will look like in the future.
01:16:29.860 | Like if Bitcoin or Ethereum 2.0 or some version
01:16:33.620 | will stick around or any of those,
01:16:35.700 | like who's gonna win out
01:16:37.100 | or if there's even a concept of winning out at all.
01:16:39.900 | Is there a cryptocurrency that you're especially
01:16:43.620 | find interesting that technically,
01:16:48.300 | financially, philosophically you think
01:16:50.940 | is something you're keeping your eye on?
01:16:54.060 | - Well, I don't really,
01:16:55.300 | I'm not looking to like invest in cryptocurrencies anymore,
01:16:59.180 | but I, they are, I mean,
01:17:02.260 | and many are almost identical.
01:17:05.020 | I mean, there's not, there wasn't too much difference
01:17:08.620 | between even Ethereum and Bitcoin in some ways, right?
01:17:13.540 | But there are some that I like, the privacy ones.
01:17:15.860 | I mean, I like Zcash for its, like, coolness.
01:17:19.060 | It's actually, it's like a different kind of invention
01:17:23.580 | compared to some of the other things.
01:17:25.500 | - Okay, can you speak to just briefly to privacy?
01:17:29.140 | What is there some mechanisms of preserving
01:17:31.140 | some privacy of the, so I guess everything is public.
01:17:34.660 | - Yeah. - Is that the problem?
01:17:36.300 | - Yeah, none of the transactions are private.
01:17:38.980 | And so, even like I have some of my,
01:17:44.700 | I have some Numeraire and you can just see it.
01:17:48.020 | In fact, you can go to a website and it says, like,
01:17:49.940 | you can go to, like, Etherscan
01:17:51.420 | and it'll say, like, Numeraire founder.
01:17:54.540 | And I'm like, how the hell do you guys know this?
01:17:57.700 | - So they can reverse engineer, whatever that's called.
01:18:00.180 | - Yeah, and so they can see me move it too.
01:18:02.140 | They can see me, oh, why is he moving it?
01:18:04.180 | So, but yeah, Zcash,
01:18:09.540 | then they also, when you can make private transactions,
01:18:12.620 | you can also play different games.
01:18:14.420 | - Yes.
01:18:15.580 | - And it's unclear, it's like, what's quite cool
01:18:18.780 | about Zcash is I wonder what games are being played there.
01:18:21.620 | No one will know.
01:18:23.020 | - So from a deeply analytical perspective,
01:18:27.180 | can you describe why Dogecoin is going to win?
01:18:31.100 | Which it surely will.
01:18:32.380 | Like it very likely will take over the world.
01:18:34.940 | And once we expand out into the universe,
01:18:37.300 | we'll take over the universe.
01:18:38.740 | Or on a more serious note, like what are your thoughts
01:18:43.060 | on the recent success of Dogecoin
01:18:45.100 | where you've spoken to sort of the meme stocks,
01:18:49.140 | the memetics of the whole thing.
01:18:51.260 | It feels like the joke can become the reality
01:18:57.260 | like the meme, the joke has power in this world.
01:19:02.260 | - Yeah.
01:19:03.260 | - It's fascinating.
01:19:04.100 | - Exactly.
01:19:05.500 | It's like, why is it correlated
01:19:09.940 | with Elon tweeting about it?
01:19:12.220 | - It's not just Elon alone tweeting, right?
01:19:14.980 | It's like Elon tweeting and that becomes a catalyst
01:19:18.780 | for everybody on the internet kind of like spreading
01:19:22.100 | the joke, right?
01:19:22.940 | - Exactly.
01:19:24.100 | - The joke of it.
01:19:24.940 | So it's the initial spark of the fire
01:19:27.620 | for Wall Street bets type of situation.
01:19:30.620 | - Yeah.
01:19:31.460 | - And that's fascinating because jokes seem
01:19:34.220 | to spread faster than other mechanisms.
01:19:37.940 | - Yeah.
01:19:38.780 | - Like funny shit is very effective
01:19:41.820 | at captivating the discourse on the internet.
01:19:46.220 | - Yeah, and I think you can have,
01:19:49.460 | like I like the one meme, like Doge,
01:19:52.220 | I haven't heard that name in a long time.
01:19:54.220 | (laughing)
01:19:56.460 | - Yeah.
01:19:57.820 | - Like I think back to that meme often.
01:20:00.780 | That's like funny.
01:20:01.740 | And every time I think back to it,
01:20:04.180 | there's a little probability that I might buy it.
01:20:07.980 | - Buy some Dogecoin.
01:20:08.820 | - Right, and so I imagine you just have millions of people
01:20:11.140 | who have had all these great jokes told to them.
01:20:14.460 | And every now and then they reminisce,
01:20:16.340 | oh, that was really funny.
01:20:17.860 | And then they're like, let me buy some.
01:20:22.020 | - Wouldn't that be interesting if like the entirety,
01:20:24.060 | if we travel in time, like multiple centuries,
01:20:27.300 | where the entirety of the communication
01:20:29.620 | of the human species is like humor.
01:20:32.500 | Like it's all just jokes.
01:20:35.540 | Like we're high on probably some really advanced drugs.
01:20:38.780 | And we're all just laughing nonstop.
01:20:42.540 | It's a weird, like dystopian future of just humor.
01:20:47.420 | Elon has made me realize
01:20:50.940 | how like good it feels to just not take shit seriously
01:20:54.740 | every once in a while
01:20:55.580 | and just relieve like the pressure of the world.
01:20:58.700 | At the same time, the reason I don't always like
01:21:03.340 | when people finish their sentences with lol
01:21:06.460 | is like,
01:21:08.820 | when you don't take anything seriously.
01:21:11.540 | When everything becomes a joke,
01:21:14.020 | then it feels like that way of thinking
01:21:20.420 | feels like it will destroy the world.
01:21:22.460 | It's like, I often think,
01:21:24.180 | will memes save the world or destroy it?
01:21:26.020 | Because I think both are possible directions.
01:21:28.740 | - Yeah, I think this is a big problem.
01:21:30.460 | I mean, America, I always felt that about America,
01:21:33.340 | a lot of people are telling jokes kind of all the time
01:21:36.460 | and they're kind of good at it.
01:21:38.020 | And you take someone aside, an American,
01:21:42.100 | you're like, I really want to have a sincere conversation.
01:21:44.260 | It's like hard to even keep a straight face
01:21:46.780 | because everything is so, there's so much levity.
01:21:50.300 | So it's complicated.
01:21:51.340 | I like how sincere actually like your Twitter can be.
01:21:54.940 | You're like, I am in love with the world today.
01:21:57.900 | - I get so much shit for it.
01:21:59.500 | It's hilarious.
01:22:00.340 | I'm never gonna stop because I realized like,
01:22:03.140 | you have to be able to sometimes just be real
01:22:05.260 | and be positive and just be, say the cliche things,
01:22:09.220 | which ultimately those things actually capture
01:22:12.020 | some fundamental truths about life.
01:22:14.940 | But it's a dance.
01:22:16.620 | And I think Elon does a good job of that.
01:22:20.260 | Now from an engineering perspective of being able to joke,
01:22:23.020 | but everyone's mostly to pull back and be like,
01:22:26.860 | here's real problems, let's solve them and so on.
01:22:29.740 | And then be able to jump back to a joke.
01:22:31.860 | So it's ultimately, I think,
01:22:33.780 | I guess a skill that we have to learn.
01:22:37.180 | But I guess your advice is to invest everything
01:22:41.820 | anyone listening owns into Dogecoin.
01:22:44.700 | That's what I heard from this interaction.
01:22:46.180 | - Yeah, no, exactly.
01:22:47.020 | Yeah.
01:22:48.260 | My hedge fund is unavailable.
01:22:50.460 | Just go straight to Dogecoin.
01:22:52.420 | - You're running a successful company.
01:22:55.180 | It's just interesting 'cause my mind has been in that space
01:22:58.700 | of potentially being one of the millions
01:23:01.620 | of other entrepreneurs.
01:23:03.420 | What's your advice on how to build a successful startup?
01:23:08.420 | How to build a successful company?
01:23:10.540 | - I think that one thing I do like,
01:23:13.820 | and it might be a particular thing about America,
01:23:16.940 | but there is something about playing,
01:23:20.660 | tell people what you really want to happen in the world.
01:23:24.020 | Don't stop.
01:23:25.460 | It's not gonna make it,
01:23:27.900 | like if you're asking someone to invest in your company,
01:23:31.100 | don't say, I think maybe one day
01:23:33.180 | we might make a million dollars.
01:23:34.780 | When you actually believe something else,
01:23:38.580 | you actually believe, you're actually more optimistic,
01:23:41.500 | but you're toning down your optimism
01:23:44.020 | because you want to appear low risk.
01:23:49.020 | But actually it's super high risk
01:23:52.580 | if your company becomes mediocre
01:23:54.540 | because no one wants to work in a mediocre company,
01:23:57.540 | no one wants to invest in a mediocre company.
01:24:00.020 | So you should play the real game.
01:24:02.620 | And obviously this doesn't apply to all businesses,
01:24:04.460 | but if you play a venture-backed startup kind of game,
01:24:07.700 | like play for keeps, play to win, go big.
01:24:10.540 | And it's very hard to do that.
01:24:13.220 | I always feel like,
01:24:15.620 | you can start narrowing your focus
01:24:20.540 | because 10 people are telling you,
01:24:23.380 | you got to care about this boring thing
01:24:26.220 | that won't matter five years from now.
01:24:28.020 | And you should push back and play the real game.
01:24:32.420 | - So be bold.
01:24:33.860 | So both, I mean, there's an interesting duality there.
01:24:37.860 | So there's the way you speak to other people
01:24:41.100 | about like your plans and what you are like privately,
01:24:45.340 | just in your own mind.
01:24:48.100 | - And maybe it's connected with what you were saying about,
01:24:50.140 | yeah, sincerity as well.
01:24:51.940 | Like if you appear to be sincerely optimistic
01:24:55.820 | about something that's big or crazy,
01:24:59.700 | it's putting yourself up to be kind of like ridiculed
01:25:02.220 | or something.
01:25:03.060 | And so if you say, my mission is to, yeah, go to Mars.
01:25:07.980 | It's just so bonkers that it's hard to say.
01:25:12.660 | - It is.
01:25:13.500 | But one powerful thing, just like you said,
01:25:16.500 | is if you say it and you believe it,
01:25:20.220 | then actually amazing people come and work with you.
01:25:25.620 | - Exactly.
01:25:26.460 | - It's not just skill, but the dreams.
01:25:28.780 | There's something about optimism,
01:25:30.460 | like that fire that you have when you're optimistic
01:25:33.900 | of actually having the hope of building
01:25:36.180 | something totally cool, something totally new,
01:25:38.940 | that when those people get in a room together,
01:25:41.100 | like they can actually do it.
01:25:42.740 | - Yeah.
01:25:43.580 | - And also it makes life really fun
01:25:49.340 | when you're in that room.
01:25:50.820 | So all of that together, ultimately, I don't know.
01:25:55.420 | That's what makes this crazy ride of a startup
01:25:57.740 | really look fun.
01:25:59.620 | And Elon is an example of a person who succeeded at that.
01:26:02.500 | There's not many other inspiring figures, which is sad.
01:26:06.100 | I used to be at Google and there's something that happens
01:26:11.100 | that sometimes when the company grows bigger and bigger
01:26:13.900 | and bigger, where that kind of ambition
01:26:16.820 | kind of quiets down a little bit.
01:26:18.820 | - Yeah.
01:26:19.660 | - Google had this ambition, still does,
01:26:21.900 | of making the world's information accessible to everyone.
01:26:24.940 | And I remember, I don't know, that's beautiful.
01:26:27.980 | I still love that dream of, they used to scan books,
01:26:33.240 | but just in every way possible,
01:26:34.820 | make the world's information accessible.
01:26:37.540 | Same with Wikipedia.
01:26:38.660 | Every time I open up Wikipedia,
01:26:40.500 | I'm just awe-inspired by how awesome humans are, man.
01:26:47.180 | At creating this together,
01:26:48.620 | I don't know what the meetings are like over there,
01:26:50.300 | but it's just beautiful.
01:26:52.740 | Like what they've created is incredible.
01:26:55.260 | And I'd love to be able to be part of something like that.
01:26:58.900 | And you're right, for that you have to be bold.
01:27:01.820 | - And it's strange to me also,
01:27:03.520 | I think you're right that there's
01:27:04.640 | how many boring companies there are.
01:27:06.820 | Something I always talk about, especially in FinTech,
01:27:08.920 | it's like, why am I excited about this?
01:27:11.240 | This is so lame.
01:27:13.440 | Like what is, this isn't even important.
01:27:16.400 | Even if you succeed, this is gonna be terrible.
01:27:19.320 | This is not good.
01:27:20.240 | And it's just strange how people can kind of get
01:27:23.560 | fake enthusiastic about boring ideas
01:27:27.800 | when there's so many bigger ideas
01:27:31.120 | that, yeah, I mean, you read these things,
01:27:33.740 | like this company raises money,
01:27:35.060 | and it's just like, that's a lot of money
01:27:36.460 | for the worst idea I've ever heard.
01:27:38.580 | - Some ideas are really big.
01:27:41.460 | So like I worked on autonomous vehicles quite a bit,
01:27:44.540 | and there's so many ways in which you can present
01:27:48.120 | that idea to yourself, to the team you work with,
01:27:50.700 | to just, yeah, like to yourself when you're quietly
01:27:53.140 | looking in the mirror in the morning,
01:27:55.260 | that's really boring or really exciting.
01:27:58.340 | Like if you're really ambitious with autonomous vehicles,
01:28:01.240 | it changes the nature of like human robot interaction,
01:28:06.360 | it changes the nature of how we move.
01:28:08.200 | Forget money, forget all that stuff.
01:28:10.000 | It changes like everything about robotics and AI,
01:28:13.320 | machine learning, it changes everything about manufacturing.
01:28:16.520 | I mean, transportation is so fundamentally
01:28:19.760 | connected to cars, and if that changes,
01:28:22.360 | it's changing the fabric of society,
01:28:24.640 | of movies, of everything.
01:28:26.660 | And if you go bold and take risks and be willing
01:28:30.760 | to go bankrupt with your company,
01:28:33.900 | as opposed to cautiously, you could really change the world.
01:28:37.640 | And it's so sad for me to see all these autonomous
01:28:39.720 | companies, autonomous vehicle companies,
01:28:41.660 | they're like really more focused about fundraising
01:28:45.280 | and kind of like smoke and mirrors.
01:28:46.520 | They're really afraid, the entirety of their marketing
01:28:49.660 | is grounded in fear and presenting enough smoke
01:28:53.540 | to where they keep raising funds so they can cautiously
01:28:56.800 | use technology of a previous decade or previous two decades
01:29:00.720 | to kind of test vehicles here and there,
01:29:03.320 | as opposed to do crazy things and bold and go huge
01:29:07.320 | at scale to huge data collection.
01:29:09.880 | So that's just an example.
01:29:12.840 | Like the idea can be big, but if you don't allow yourself
01:29:16.160 | to take that idea and think really big with it,
01:29:20.240 | then you're not gonna make anything happen.
01:29:22.400 | - Yeah, you're absolutely right in that.
01:29:24.280 | - So you've been connected in your work
01:29:28.960 | with a bunch of amazing people.
01:29:31.360 | How much interaction do you have with investors?
01:29:34.120 | I get that whole process is an entire mystery to me.
01:29:36.880 | Is there some people that just have influence
01:29:38.640 | on the trajectory of your thinking completely?
01:29:42.980 | Or is it just this collective energy
01:29:44.840 | that they put behind the company?
01:29:46.960 | - Yeah, I mean, I came here and I was amazed how,
01:29:52.200 | yeah, people would, I was only here for a few months
01:29:54.880 | and I met some incredible investors
01:29:56.720 | and I almost run out of money.
01:29:59.760 | And once they invested, I was like,
01:30:04.760 | I am not gonna let you down.
01:30:06.840 | And I was like, okay, I'm gonna send them
01:30:08.320 | like an email update every like three minutes.
01:30:11.640 | And then they don't care at all.
01:30:14.220 | So they kind of wanna, I don't know.
01:30:15.560 | Like, so for some, I like it when it's just like,
01:30:18.240 | they're always available to talk,
01:30:20.060 | but a lot of building a business,
01:30:23.400 | especially a high tech business,
01:30:25.440 | there's little for them to add, right?
01:30:29.280 | There's little for them to add on product.
01:30:31.280 | There's a lot for them to add on like business development.
01:30:34.340 | And if we are doing product research,
01:30:36.440 | which is for us research into the market,
01:30:39.020 | research into how to make a great hedge fund.
01:30:41.820 | And we do that for years,
01:30:43.400 | there's not much to tell the investors.
01:30:47.160 | - So basically they're like, I believe in you.
01:30:49.080 | There's something, I like the cut of your jib.
01:30:52.440 | There's something in your idea, in your ambition,
01:30:55.800 | in your plans that I like.
01:30:57.920 | And it's almost like a pat on the back.
01:30:59.800 | It's like, go get them kid.
01:31:01.600 | - Yeah, it is a bit like that.
01:31:02.800 | And that's cool.
01:31:04.400 | That's a good way to do it.
01:31:05.320 | I'm glad they do it that way.
01:31:07.160 | Like the one in meeting I had,
01:31:08.860 | which was like really good with this
01:31:10.280 | was meeting Howard Morgan,
01:31:12.460 | who's actually a co-founder of Renaissance Technologies
01:31:16.820 | in the like 1980s and worked with Jim Simons.
01:31:21.040 | And he was in the room
01:31:25.880 | and I was meeting some other guy and he was in the room.
01:31:28.240 | And I was explaining how quantitative finance works.
01:31:33.040 | I was like, so, you know, they use mathematical models.
01:31:36.640 | And then he was like, yeah, I started Renaissance.
01:31:40.500 | I know a bit about this.
01:31:41.940 | And then I was like, oh my God.
01:31:46.720 | So yeah, and then, I think,
01:31:51.440 | 'cause I was talking to him,
01:31:52.280 | he was working at First Round Capital as a partner.
01:31:55.940 | And they kind of said they didn't want to invest.
01:31:59.200 | And then I wrote a blog post describing the idea.
01:32:01.920 | And I was like, I really think you guys should invest.
01:32:03.640 | And then they end up.
01:32:04.580 | - Oh, interesting.
01:32:05.780 | You convinced them.
01:32:06.620 | - Yeah, 'cause they were like,
01:32:07.620 | we don't really invest in hedge funds.
01:32:08.900 | And I was like, you don't see like what I'm doing here.
01:32:11.280 | This is a tech company, not a hedge fund, right?
01:32:14.400 | - Yeah, and Numerai is brilliant.
01:32:15.800 | It's, when it caught my eye,
01:32:18.280 | there's something special there.
01:32:19.280 | So I really do hope you succeed in the,
01:32:22.720 | obviously it's a risky thing you're taking on,
01:32:24.920 | the ambition of it, the size of it.
01:32:26.600 | But I do hope you succeed.
01:32:28.360 | You mentioned Jim Simons.
01:32:30.420 | He comes up in another world of mine really often.
01:32:33.560 | He's just a brilliant guy on the mathematics side
01:32:37.760 | as a mathematician,
01:32:38.800 | but he's also a brilliant finance, hedge fund manager guy.
01:32:44.160 | Have you gotten a chance to interact with him at all?
01:32:47.080 | Have you learned anything from him on the math,
01:32:51.120 | on the finance, on the philosophy life side of things?
01:32:55.040 | - I've played poker with him.
01:32:56.360 | It was pretty cool.
01:32:57.200 | It was like, actually in the show "Billions",
01:32:59.720 | they kind of do a little thing
01:33:00.920 | about this poker tournament thing
01:33:03.040 | with all the hedge fund managers.
01:33:04.280 | And that's real life thing.
01:33:05.640 | And they have a lot of like World Series of Bracelet,
01:33:09.800 | World Series of Poker Bracelet holders.
01:33:11.880 | But it's kind of Jim's thing.
01:33:14.420 | And I met him there and yeah, it was kind of brief,
01:33:19.220 | but I was just like, he's like, "Oh, how do you,
01:33:21.500 | why are you here?"
01:33:22.340 | And I was like, "Oh, Howard sent me."
01:33:23.940 | He's like, "Go play this tournament,
01:33:26.000 | meet some of the other players."
01:33:27.820 | And then-
01:33:29.020 | - Was it Texas Hold'em?
01:33:30.180 | - Yeah, Texas Hold'em tournament.
01:33:31.740 | Yeah.
01:33:32.580 | - Do you play poker yourself or was it-
01:33:33.860 | - Yeah, I do.
01:33:34.900 | I mean, it was crazy.
01:33:36.460 | On my right was the CEO,
01:33:39.140 | who's the current CEO of Renaissance, Peter Brown.
01:33:43.460 | And Peter Muller, who's a hedge fund manager at PDT.
01:33:47.320 | And yeah, I mean, it was just like,
01:33:50.500 | and then just everyone.
01:33:51.820 | And then all these bracelet World Series,
01:33:53.300 | like people that I know from like TV.
01:33:55.640 | And Robert Mercer, who's fucking crazy.
01:34:00.480 | - Who's that?
01:34:02.420 | - He's the guy who donated the most money to Trump.
01:34:05.720 | And he's just like-
01:34:09.180 | - It's a lot of personality.
01:34:10.340 | - Character, yeah.
01:34:11.460 | Jeez, it's crazy.
01:34:13.500 | So it's quite cool, yeah, it was really fun.
01:34:17.660 | And then I managed to knock out Peter Muller.
01:34:19.620 | I got a little trophy for knocking him out
01:34:22.700 | because he was a previous champion.
01:34:24.500 | In fact, I think he's won the most.
01:34:25.900 | I think he's won three times.
01:34:27.620 | Super smart guy.
01:34:28.660 | But I will say Jim outlasted me in the tournament.
01:34:35.580 | And they're all extremely good at poker.
01:34:41.720 | But also, it was a $10,000 buy-in.
01:34:44.620 | And I was like, this is kind of expensive,
01:34:49.020 | but it all goes to charity, Jim's math charity.
01:34:53.580 | But then the way they play, they have like rebuys.
01:34:58.420 | And like, they all do a shit ton of rebuys.
01:35:01.140 | This is for charity.
01:35:02.760 | So immediately they're like going all in
01:35:06.260 | and I'm like, man, like, so I ended up, you know,
01:35:09.900 | adding more as well.
01:35:12.000 | So the stakes-
01:35:12.840 | - It's like you couldn't play at all without doing that.
01:35:15.400 | - Yeah, the stakes are high.
01:35:16.680 | - But you're connected to a lot of these folks
01:35:18.780 | that are kind of titans of just,
01:35:22.500 | of economics and tech in general.
01:35:27.240 | Do you feel a burden from this?
01:35:28.760 | You're a young guy.
01:35:30.240 | - I did feel a bit out of place there.
01:35:32.440 | Like the company was quite new
01:35:35.640 | and they also don't speak about things, right?
01:35:39.920 | So it's not like going to meet a famous rocket engineer
01:35:44.320 | who will tell you how to make a rocket.
01:35:46.220 | They do not want to tell you anything
01:35:48.060 | about how to make a hedge fund.
01:35:49.600 | It's like all secretive.
01:35:51.360 | And that part I didn't like.
01:35:53.040 | And they were also kind of making fun of me a little bit.
01:35:57.720 | Like they'd call me,
01:36:00.580 | I don't know, the Bitcoin kid.
01:36:02.000 | Yeah.
01:36:02.840 | And then they would say even things like,
01:36:05.260 | I remember Peter said to me something like,
01:36:08.920 | I don't think AI is going to have a big role in finance.
01:36:13.000 | And I was like, hearing this from the CEO of Renaissance
01:36:16.080 | was like weird to hear because I was like,
01:36:18.000 | of course it will.
01:36:19.120 | But he's like,
01:36:21.160 | I can see it having a really big impact
01:36:23.440 | on things like self-driving cars.
01:36:25.640 | But finance, it's too noisy and whatever.
01:36:28.320 | And so I don't think it's like the perfect application.
01:36:30.680 | And I was like, that was interesting to hear.
01:36:32.560 | 'Cause I think it was that same day
01:36:35.580 | that Libratus, the poker-playing AI,
01:36:40.580 | started to beat the human players.
01:36:42.860 | So it was kind of funny hearing them say,
01:36:44.540 | oh, I'm not sure AI could ever attack that problem.
01:36:47.580 | And then that very day it's attacking the problem
01:36:49.380 | of the game we're playing.
01:36:51.100 | - Well, there's a kind of a magic to somebody
01:36:55.740 | who's exceptionally successful,
01:36:57.520 | looking at you, giving you respect,
01:37:01.140 | but also saying that what you're doing
01:37:03.560 | is not going to succeed in a sense.
01:37:06.280 | Like they're not really saying it,
01:37:08.200 | but I tend to believe from my interactions with people
01:37:11.220 | that it's a kind of prod to say, like prove me wrong.
01:37:14.880 | - Yeah.
01:37:15.720 | - That's ultimately, that's how those guys talk.
01:37:18.060 | They see good talent and they're like.
01:37:20.440 | - Yeah.
01:37:21.280 | And I think they're also saying
01:37:22.400 | it's not gonna succeed quickly in some way.
01:37:25.840 | They're like, this is gonna take a long time
01:37:29.080 | and maybe that's good to know.
01:37:32.760 | - And certainly AI in trading,
01:37:36.320 | that's one of the most
01:37:37.720 | philosophically interesting questions
01:37:42.260 | about artificial intelligence and the nature of money,
01:37:45.940 | because it's like, how much can you extract
01:37:48.900 | in terms of patterns from all of these millions
01:37:52.320 | of humans interacting using this methodology of money?
01:37:57.180 | It's like one of the open questions
01:37:58.620 | in artificial intelligence.
01:37:59.660 | In that sense, you converting it to a dataset
01:38:02.300 | is one of the biggest gifts to the research community,
01:38:07.060 | to anyone who loves data science and AI.
01:38:11.020 | This is kind of fascinating.
01:38:13.820 | And I'd love to see where this goes actually.
01:38:15.620 | - A thing I say sometimes: long before AGI destroys the world,
01:38:19.820 | a narrow intelligence will win all the money
01:38:21.780 | in the stock market.
01:38:23.260 | Way before. Like, just a narrow AI.
01:38:25.420 | - Yeah.
01:38:26.380 | - And I don't know if I'm gonna be the one who invents that.
01:38:29.900 | So I'm building Numerai to make sure
01:38:31.980 | that that narrow AI uses our data.
01:38:34.500 | (laughing)
01:38:35.820 | - So you're giving a platform
01:38:36.980 | where millions of people can participate
01:38:38.820 | and build that narrow AI themselves.
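(For readers who want to see what that kind of participation could look like in practice, here is a minimal sketch of a crowdsourced modeling pipeline in Python. The file names, column names, and model choice are illustrative assumptions, not the platform's actual data layout.)

    # A minimal sketch of a Numerai-style pipeline, assuming hypothetical
    # file and column names for illustration only.
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor

    # Load anonymized training data: obfuscated feature columns plus a
    # target describing relative future stock performance.
    train = pd.read_parquet("numerai_training_data.parquet")
    features = [c for c in train.columns if c.startswith("feature")]

    # Fit a simple model on the anonymized features.
    model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
    model.fit(train[features], train["target"])

    # Score the live data and write predictions for submission.
    live = pd.read_parquet("numerai_live_data.parquet")
    live["prediction"] = model.predict(live[features])
    live[["prediction"]].to_csv("predictions.csv")

(The point of the sketch is the division of labor described above: the participant never sees raw market data, only anonymized features, and submits only predictions, keeping the code and the model on their own machine.)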
01:38:43.300 | People love it when I ask this kind of question
01:38:45.580 | about books, about ideas and philosophers and so on.
01:38:50.580 | I was wondering if you had books
01:38:54.660 | or ideas, philosophers, thinkers
01:38:58.180 | that had an influence on your life when you were growing up
01:39:01.460 | or just today that you would recommend
01:39:04.540 | that people check out, blog posts, podcasts, videos,
01:39:08.900 | all that kind of stuff.
01:39:09.740 | Is there something that just kind of had an impact on you
01:39:12.140 | that you can recommend?
01:39:13.780 | - A super kind of obvious one.
01:39:16.040 | I really was reading "Zero to One"
01:39:21.020 | while coming up with Numerai.
01:39:22.220 | It's like, I was like halfway through the book.
01:39:24.860 | And I really do like a lot of the ideas there.
01:39:27.140 | And it's also about kind of thinking big
01:39:29.340 | and also it's a peculiar little book.
01:39:34.340 | Like, there's a little picture
01:39:36.340 | of the hipster versus the Unabomber.
01:39:38.780 | It's a weird little book.
01:39:40.460 | So I like, there's kind of like some depth there.
01:39:42.700 | - In terms of a book,
01:39:44.220 | if you're thinking of doing a startup, it's a good one.
01:39:47.300 | - A book I like a lot is maybe my favorite book
01:39:52.300 | is David Deutsch's "The Beginning of Infinity."
01:39:55.080 | I just found that so optimistic.
01:40:00.380 | Everything you read in science
01:40:03.820 | kind of makes the world feel colder.
01:40:06.500 | 'Cause it's like, you know,
01:40:08.140 | we're just coming from evolution,
01:40:10.660 | and nothing should be this way or whatever.
01:40:15.260 | And humans are not very powerful.
01:40:16.860 | We're just like scum on the earth.
01:40:19.260 | And the way David Deutsch sees things and argues,
01:40:21.660 | he argues them with the same rigor
01:40:23.880 | that the cynics often use
01:40:26.220 | and then has a much better conclusion.
01:40:29.100 | There are statements like, you know,
01:40:32.820 | anything that doesn't violate the laws of physics
01:40:35.820 | can be solved.
01:40:39.300 | - So ultimately arriving at a hopeful,
01:40:41.860 | like a hopeful path forward.
01:40:42.700 | - Yeah, without being like a hippie.
01:40:45.740 | - You've mentioned kind of advice for startups.
01:40:47.660 | Is there in general, whether you do a startup or not,
01:40:50.620 | do you have advice for young people today?
01:40:52.740 | You're like an example of somebody
01:40:54.180 | who's paved their own path
01:40:56.260 | and were, I would say, exceptionally successful.
01:40:59.260 | Is there advice, somebody who's like 20 today, 18,
01:41:02.860 | undergrad or thinking about going to college
01:41:05.700 | or in college and so on that you would give them?
01:41:09.540 | - I often tell young people don't start companies.
01:41:13.580 | Or rather, don't start a company
01:41:16.060 | unless you're prepared to make it your life's work.
01:41:19.180 | I think that's a really good way of putting it.
01:41:22.500 | And a lot of people think, well, you know,
01:41:24.980 | this semester I'm gonna take a semester off
01:41:27.340 | and in that one semester,
01:41:28.580 | I'm gonna start a company and sell it or whatever.
01:41:31.260 | And it's just like, what are you talking about?
01:41:33.420 | It doesn't really work that way.
01:41:35.020 | You should be like super into the idea,
01:41:37.260 | so into it that you wanna spend a really long time on it.
01:41:40.680 | - Is that more about psychology
01:41:42.820 | or actually time allocation?
01:41:44.100 | Like, is it literally the fact that you need to give 100%
01:41:47.620 | for potentially years for it to succeed?
01:41:49.860 | Or is it more about just the mindset that's required?
01:41:53.620 | - Yeah, I mean, I think
01:41:55.820 | you don't wanna have,
01:41:56.660 | certainly don't wanna have a plan to sell the company
01:41:59.260 | like quickly or something.
01:42:02.180 | Or it's like a company that has
01:42:05.540 | a big fashion component,
01:42:07.380 | like it'll only work now,
01:42:08.700 | like an app or something.
01:42:12.660 | So yeah, that's a big one.
01:42:14.980 | And then I also think something I've thought about recently
01:42:18.900 | is I had a job as a quant at a fund
01:42:22.340 | for about two and a half years.
01:42:25.940 | And part of me thinks if I had spent another two years there,
01:42:30.940 | I would have learned a lot more
01:42:33.700 | and had even more knowledge
01:42:38.460 | to basically accelerate how long Numerai took.
01:42:41.140 | So the idea that you can sit in an air conditioned room
01:42:44.700 | and get free food, or even sit at home now in your underwear
01:42:48.860 | and make a huge amount of money and learn whatever you want,
01:42:53.380 | it's just crazy.
01:42:55.060 | It's such a good deal.
01:42:56.460 | - Yeah. Oh, that's interesting.
01:42:57.900 | That's a case for it. I was terrified of that.
01:43:00.340 | Like at Google, I thought I would become really comfortable
01:43:03.980 | in that air-conditioned room,
01:43:06.140 | and that's what I was afraid of. But the quant situation,
01:43:10.580 | I mean, what you present is really brilliant:
01:43:13.260 | it's exceptionally valuable, the lessons you learn,
01:43:17.580 | 'cause you get paid while you learn from others.
01:43:21.080 | If you see jobs
01:43:24.100 | in the space of your passion that way,
01:43:27.740 | it's just an education.
01:43:29.300 | It's like the best kind of education.
01:43:31.380 | But of course, from my perspective,
01:43:34.180 | you have to be really careful not to get comfortable.
01:43:37.300 | You gain a relationship, then you buy a house
01:43:39.720 | or whatever the hell it is.
01:43:40.820 | And then you convince yourself like,
01:43:44.180 | well, I have to pay these fees for the car,
01:43:46.620 | for the house, blah, blah, blah.
01:43:48.220 | And then there's momentum and all of a sudden
01:43:50.300 | you're on your deathbed and there's grandchildren
01:43:53.020 | and you're drinking whiskey
01:43:55.100 | and complaining about kids these days.
01:43:56.940 | So I'm afraid of that momentum, but you're right.
01:44:00.780 | Like there's something special about the education you get
01:44:05.020 | working at these companies.
01:44:06.300 | - Yeah, and I remember on my desk,
01:44:08.160 | I had like a bunch of papers on quant finance,
01:44:11.300 | a bunch of papers on optimization
01:44:13.260 | and then the paper on Ethereum,
01:44:15.500 | just on my desk as well.
01:44:16.660 | The white paper. And it's amazing
01:44:19.220 | what you can learn that way.
01:44:21.440 | I also think about
01:44:24.700 | this idea of learning about
01:44:26.980 | intersections of things.
01:44:28.420 | I don't think there are too many people that know like
01:44:30.660 | as much about crypto and quant finance
01:44:33.620 | and machine learning as I do.
01:44:36.260 | And that's a really nice set of three things
01:44:39.280 | to know stuff about.
01:44:40.640 | And that was because I had like free time in my job.
01:44:43.340 | - Okay, let me ask the perfectly impractical,
01:44:48.640 | but the most important question.
01:44:49.860 | What's the meaning of it all?
01:44:52.440 | You're trying to do so many amazing things, why?
01:44:56.240 | What's the meaning of this life of yours or ours?
01:45:00.840 | - I don't know.
01:45:01.880 | - Humans.
01:45:03.460 | - Yeah. So have you had people say
01:45:06.340 | that asking what the meaning of life is,
01:45:07.820 | is like asking the wrong question or something?
01:45:10.220 | - The question is wrong.
01:45:11.180 | - Yeah.
01:45:12.020 | - No, usually people get too nervous to be able to say that
01:45:14.660 | 'cause it's like, your question sucks.
01:45:16.620 | I don't think there's an answer.
01:45:18.820 | It's the searching for it.
01:45:21.140 | Sometimes asking it
01:45:22.940 | is like sitting back
01:45:24.540 | and looking up at the stars and being like,
01:45:26.780 | huh, I wonder if there's aliens up there.
01:45:29.820 | There's a useful, palate-cleanser aspect to it,
01:45:34.820 | 'cause it kind of wakes you up from
01:45:38.280 | all the little busy, hurried day-to-day activities,
01:45:42.000 | all the meetings, all the things you're a part of.
01:45:45.240 | We're just like ants, a part of a system,
01:45:47.640 | a part of another system.
01:45:48.920 | And then asking this bigger question
01:45:52.120 | allows you to kind of zoom out and think about it.
01:45:53.920 | But ultimately, I think it's an impossible thing
01:45:57.680 | for a limited cognitive capacity to capture.
01:46:01.520 | But it's fun to listen to somebody
01:46:03.040 | who's exceptionally successful, exceptionally busy now,
01:46:07.240 | who's also young like you,
01:46:09.800 | to ask these kinds of questions about like death.
01:46:14.140 | Do you consider your own mortality kind of thing?
01:46:18.280 | And life, whether that enters your mind
01:46:21.760 | because it often doesn't.
01:46:23.280 | It kind of almost gets in the way.
01:46:24.900 | - Yeah, it's amazing how many things
01:46:28.100 | that are trivial can occupy a lot of your mind
01:46:31.280 | until something bad happens or something flips you.
01:46:36.280 | - And then you start thinking about the people you love
01:46:38.460 | that are in your life.
01:46:39.780 | Then you start thinking about like,
01:46:41.080 | holy shit, this ride ends.
01:46:42.820 | - Exactly, yeah.
01:46:44.500 | I just had COVID and I had it quite bad.
01:46:49.080 | Well, not really bad,
01:46:50.820 | it was just that I also got a simultaneous
01:46:53.620 | lung infection.
01:46:55.720 | So I had like almost like bronchitis or whatever.
01:46:59.280 | I don't even, I don't understand that stuff,
01:47:01.980 | but I started, and then you're forced to be isolated.
01:47:06.300 | - Right.
01:47:07.140 | - And so it's actually kind of nice,
01:47:08.860 | even though it's very depressing.
01:47:12.500 | And then I've heard stories of, I think it's Sean Parker.
01:47:15.780 | He had like all these diseases as a child
01:47:18.220 | and he had to like just stay in bed for years.
01:47:20.660 | And then he like made Napster.
01:47:23.300 | It's like pretty cool.
01:47:24.860 | So yeah, I had about 15 days of this recently,
01:47:27.700 | just last month.
01:47:28.540 | And it feels like it did shock me
01:47:30.420 | into a new kind of energy and ambition.
01:47:34.620 | - Were there moments when you were just like terrified
01:47:37.980 | at the combination of loneliness?
01:47:39.700 | And like, you know, the thing about COVID
01:47:42.660 | is like there's some degree of uncertainty.
01:47:45.460 | Like it feels like it's a new thing,
01:47:47.420 | a new monster that's arrived on this earth.
01:47:50.920 | And so, you know, dealing with it alone,
01:47:54.300 | a lot of people are dying.
01:47:55.700 | It's like wondering like--
01:47:57.260 | - Yeah, you do wonder, I mean, for sure.
01:47:59.220 | And then there are even new strains in South Africa,
01:48:03.180 | which is where I was.
01:48:04.020 | And maybe the new strain had some interaction
01:48:06.940 | with my genes and I'm just gonna die.
01:48:09.540 | - But ultimately it was liberating somehow.
01:48:11.460 | - I loved it.
01:48:12.620 | Oh, I love that I got out of it.
01:48:15.460 | - Okay.
01:48:16.300 | - 'Cause it also affects your mind.
01:48:17.680 | You get confusion and kind of a lot of fatigue
01:48:21.400 | and you can't do your usual tricks
01:48:23.360 | of psyching yourself out of it.
01:48:24.820 | So, you know, sometimes it's like, oh man, I feel tired.
01:48:27.440 | Okay, I'm just gonna go have coffee and then I'll be fine.
01:48:29.720 | It's like, now it's like, I feel tired,
01:48:31.480 | I don't even wanna get out of bed to get coffee
01:48:33.480 | 'cause I feel so tired.
01:48:34.720 | And then you have to confront,
01:48:37.120 | there's no like quick fix cure and you're trapped at home.
01:48:40.760 | - So now you have this little thing that happened to you
01:48:44.120 | that was a reminder that you're mortal
01:48:46.480 | and you get to carry that flag
01:48:48.200 | in trying to create something special in this world, right?
01:48:53.200 | With Numerai.
01:48:54.960 | Listen, this was like one of my favorite conversations
01:48:58.320 | 'cause the way you think about this world of money
01:49:03.280 | and just this world in general is so clear
01:49:05.200 | and you're able to explain it so eloquently.
01:49:08.720 | Richard, it was really fun.
01:49:09.960 | Really appreciate you talking to me.
01:49:11.000 | - Thank you, thank you.
01:49:12.200 | - Thanks for listening to this conversation
01:49:14.800 | with Richard Craib, and thank you to our sponsors.
01:49:17.800 | Audible Audiobooks, Trial Labs, Machine Learning Company,
01:49:21.880 | Blinkist app that summarizes books
01:49:24.240 | and Athletic Greens, all-in-one nutrition drink.
01:49:27.560 | Click the sponsor links to get a discount
01:49:29.840 | and to support this podcast.
01:49:32.080 | And now let me leave you with some words from Warren Buffett.
01:49:36.560 | "Games are won by players who focus on the playing field,
01:49:40.600 | not by those whose eyes are glued to the scoreboard."
01:49:44.600 | Thank you for listening and hope to see you next time.
01:49:47.200 | (upbeat music)