
Coffeezilla: SBF, FTX, Fraud, Scams, Fake Gurus, Money, Fame, and Power | Lex Fridman Podcast #345


Chapters

0:00 Introduction
0:38 Coffee
2:48 SBF and FTX
17:28 8 billion
26:36 Evil vs incompetence
36:32 Key lessons from FTX collapse
49:50 Should SBF go to jail?
57:36 Role of influencers and celebrities
62:04 How FTX covered up fraud
66:00 Interview with SBF
81:03 SafeMoon fraud
87:16 Bitcoin
97:54 Psychology of investigating fraud
107:01 Investigating politics and corruption
114:13 Coffeezilla origin story
118:53 MLM marketing scams
126:32 Andrew Tate and Hustlers University
144:15 Save the Kids crypto scandal
151:34 Money and fame
158:38 MrBeast
165:00 Fake gurus and Get-Rich-Quick schemes
184:20 Process of investigation
197:05 Twitter Files release
206:11 Time management and productivity
218:11 Advice for young people

Whisper Transcript

00:00:00.000 | Do you think he is incompetent, insane, or evil?
00:00:03.900 | The following is a conversation with Coffeezilla,
00:00:09.280 | an investigator and journalist exposing frauds,
00:00:12.020 | scams, and fake gurus.
00:00:14.160 | He's one of the most important journalistic voices
00:00:16.640 | we have working today,
00:00:18.260 | both in terms of his integrity and fearlessness
00:00:21.080 | in the pursuit of truth.
00:00:23.320 | Please follow, watch, and support his work
00:00:25.480 | at youtube.com/coffeezilla.
00:00:28.640 | This is the Lex Fridman Podcast.
00:00:30.640 | To support it, please check out our sponsors
00:00:32.840 | in the description.
00:00:34.320 | And now, dear friends, here's Coffeezilla.
00:00:37.540 | How do you like your coffee?
00:00:40.260 | Dark and soul-crushing?
00:00:42.100 | That was the number one question on the internet.
00:00:44.660 | Do you like your coffee to reverberate deeply
00:00:47.480 | through the worst of human nature?
00:00:49.080 | Is that how you drink your coffee?
00:00:50.420 | I've gone through a lot of phases on coffee.
00:00:52.600 | I used to, in college, I would go super deep
00:00:56.600 | into grinding fresh beans, all of that kind of stuff,
00:01:00.920 | water temperature exactly right.
00:01:03.600 | And then I hit a phase where I was just,
00:01:05.600 | it was the maintenance dose.
00:01:07.640 | Then I went to espresso because I could get a lot more in.
00:01:11.400 | And now I go through phases of sometimes I like it
00:01:14.200 | with a little oat milk,
00:01:15.600 | sometimes a little half and half and sugar.
00:01:17.480 | Oh, you've gotten soft in your old age.
00:01:19.080 | I've gone a little, I have.
00:01:21.320 | But hey, if I'm doing a SBF interview,
00:01:23.300 | it's black that day, nothing less.
00:01:25.480 | It's black, no sugar.
00:01:26.320 | - The lights go down.
00:01:27.360 | What do you actually do in those situations,
00:01:30.040 | like leading up to a show, do you get hyped up?
00:01:32.640 | Like how do you put yourself in the right mind space
00:01:34.840 | to explore some of these really difficult topics?
00:01:37.280 | - I think a lot of it's preparation.
00:01:41.280 | And then once it happens,
00:01:43.120 | it's mostly fueled by sort of adrenaline, I would say.
00:01:46.280 | I really deeply care about getting to the root cause
00:01:51.640 | of some of these issues because I think so often people
00:01:55.280 | in positions of power are let off the hook.
00:01:57.560 | So I really care about holding their feet to the fire
00:02:00.520 | and it translates into like a lot of energy the day of.
00:02:04.360 | So I never find myself, like funny enough,
00:02:07.180 | I usually drink a lot of caffeine
00:02:09.400 | leading up to the interview.
00:02:10.920 | And then I try to drink like minimum the day of
00:02:13.600 | because I have so much adrenaline,
00:02:15.000 | I don't wanna be like hyper stimulated.
00:02:17.240 | - I have to say of all the recent guests I've had,
00:02:20.960 | the energy you had when you walked into the door
00:02:25.160 | was pretty intense. - I'm excited!
00:02:26.000 | Are people not excited to be here?
00:02:27.960 | - I don't know.
00:02:30.040 | I think they're scared. - And you're a big deal, Lex.
00:02:31.040 | I mean, I don't know if you know.
00:02:31.880 | - I think they're scared.
00:02:32.700 | I feel like you were gonna knock down the door or something.
00:02:34.440 | You were very excited.
00:02:35.560 | Like that just energy, it was terrifying
00:02:39.160 | because I'm terrified of social interaction.
00:02:41.040 | Anyway, speaking of terrifying--
00:02:43.360 | - You chose a good living of interviewing people.
00:02:45.800 | - Face your fears, my friend.
00:02:47.460 | So let's talk about SBF and FTX.
00:02:50.920 | Who is Sam Bankman-Fried?
00:02:53.760 | Can you tell the story from the beginning
00:02:56.200 | as you understand it?
00:02:57.560 | - Yeah, so Sam Bankman-Fried is a kid who grew up
00:03:02.040 | sort of from a position of huge privilege.
00:03:04.520 | Both his parents are lawyers.
00:03:06.520 | I believe both of them were from Harvard.
00:03:08.600 | He went to MIT, went to like,
00:03:10.600 | or sorry, backing up a bit more,
00:03:11.940 | he went to like this top prep high school,
00:03:14.280 | then to MIT, then he went to Jane Street.
00:03:17.720 | And after that, he started a trading firm
00:03:20.160 | in I think 2017 called Alameda Research
00:03:22.920 | with a few friends.
00:03:24.120 | Some of them were from Jane Street,
00:03:25.720 | some of them worked at Google.
00:03:27.360 | And they were sort of the smartest kids on the block
00:03:31.120 | or that's what everyone thought.
00:03:32.540 | They made a lot of their money
00:03:33.480 | on something called the Kimchi Premium,
00:03:35.400 | or at least the story goes,
00:03:36.840 | which just to explain that,
00:03:38.580 | the price of Bitcoin in Korea was substantially higher
00:03:42.200 | than in the rest of the world.
00:03:43.760 | And so you could arbitrage that
00:03:45.320 | by buying Bitcoin elsewhere
00:03:47.080 | and selling it on a Korean exchange.
00:03:48.640 | So they made their money early
00:03:51.120 | doing kind of smart trades like that.
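The kimchi-premium trade described above is simple cross-exchange arbitrage; a minimal sketch, with illustrative prices and fees (not Alameda's actual numbers):

```python
# Cross-exchange arbitrage sketch: buy Bitcoin where it is cheap and sell
# where a regional premium exists. All prices and fees are illustrative.

def arbitrage_profit(qty_btc, price_us, price_kr, fee_rate):
    """Gross profit from buying qty_btc at price_us and selling it at
    price_kr, paying a proportional fee on each leg."""
    cost = qty_btc * price_us * (1 + fee_rate)      # buy leg plus fee
    proceeds = qty_btc * price_kr * (1 - fee_rate)  # sell leg minus fee
    return proceeds - cost

# A hypothetical 10% premium with 0.1% fees per leg
profit = arbitrage_profit(qty_btc=5, price_us=10_000, price_kr=11_000,
                          fee_rate=0.001)
print(f"profit: ${profit:,.2f}")
```

With those made-up numbers, the round trip nets about $4,895 on 5 BTC; the real trade was harder in practice because of capital controls and transfer delays.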
00:03:53.120 | They flipped that into market making,
00:03:55.320 | which they were pretty early on that,
00:03:57.360 | just providing liquidity to an exchange.
00:04:00.640 | And it's a strategy that is considered delta neutral,
00:04:04.360 | which means basically,
00:04:06.220 | if you take kind of both sides of the trade
00:04:08.360 | and you're making a spread, like a fee on that,
00:04:11.040 | you make money, whether it goes up or down.
00:04:13.440 | So in theory, there shouldn't be
00:04:14.540 | that much risk associated with it.
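The delta-neutral idea, taking both sides and earning the spread regardless of direction, can be sketched like this (the quotes are hypothetical):

```python
# Delta-neutral market-making sketch: quote a bid and an ask around a mid
# price and earn the spread when both sides fill. The net position is zero,
# so in this idealized case the P&L does not depend on price direction.

def spread_pnl(mid, half_spread, qty):
    """P&L from buying qty at our bid and selling qty at our ask."""
    bid = mid - half_spread   # price at which we buy from sellers
    ask = mid + half_spread   # price at which we sell to buyers
    return (ask - bid) * qty  # zero net inventory: delta neutral

# Earn $0.50 of spread on 100 units around a $20,000 mid
print(spread_pnl(mid=20_000, half_spread=0.25, qty=100))  # 50.0
```

Real market making also has to manage inventory risk and adverse selection, which is why "in theory low risk" carries weight in the sentence above.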
00:04:16.480 | So Alameda kind of blew up
00:04:18.840 | because they would offer these people,
00:04:21.400 | people who are giving out loans,
00:04:22.480 | they'd say, "Hey, we'll give you
00:04:23.520 | this really attractive rate of return.
00:04:25.520 | And we're doing strategies that seemingly are low risk.
00:04:28.960 | So we're a low risk bet.
00:04:30.740 | We're the smart kids from Jane Street,
00:04:32.440 | and you can kind of trust us to be
00:04:34.480 | this smarter than everyone else kind of thing."
00:04:37.600 | Around 2019, Sam started FTX,
00:04:42.060 | which is an exchange.
00:04:44.520 | Specifically, it specialized in derivatives.
00:04:47.160 | So like margins, kind of more sophisticated crypto products.
00:04:52.080 | And it got in with Binance early on.
00:04:55.440 | So Binance actually has a prior relationship to FTX,
00:04:58.360 | which we'll explore in a second,
00:04:59.680 | 'cause they're gonna play a role in FTX's collapse actually.
00:05:02.520 | Binance is the number one crypto exchange,
00:05:05.320 | and they're led by, he's called CZ on Twitter.
00:05:09.000 | I don't wanna butcher his full name.
00:05:10.840 | But really smart guy has played his hand really well
00:05:15.160 | and built up a quite large exchange.
00:05:18.640 | And Binance was funding a bunch of different startups.
00:05:21.680 | So they funded, they helped invest into FTX.
00:05:24.720 | Early on, they invested 100 million.
00:05:26.460 | So these guys were kind of like teammates early on,
00:05:28.680 | SBF and CZ.
00:05:30.760 | And FTX quickly grew.
00:05:33.080 | They got like, especially in 2020, 2021,
00:05:36.040 | they got a lot of endorsements.
00:05:38.000 | They got a lot of credibility in the space.
00:05:40.720 | And eventually, FTX actually bought out Binance's stake.
00:05:43.840 | They gave them $2 billion.
00:05:45.280 | So pretty good investment for CZ in a couple of years.
00:05:48.480 | And now lead that up to 2022, what happens?
00:05:53.720 | Luna collapse, Three Arrows Capital collapse,
00:05:56.760 | which if you don't know,
00:05:57.580 | there's just these kind of cataclysmic events in crypto
00:06:00.840 | led by some pretty risky behavior,
00:06:02.760 | whether Luna was a token
00:06:04.560 | that promised really attractive returns
00:06:06.440 | that were unsustainable ultimately,
00:06:08.400 | and it just kind of spiraled.
00:06:09.880 | It did what's called a stable coin death spiral,
00:06:12.720 | which we can talk about if we need to.
00:06:14.440 | - A stable coin death spiral.
00:06:15.720 | I can't wait till that like actually enters the lexicon,
00:06:18.720 | like a Wikipedia page on it,
00:06:20.360 | like economic students are learning in school.
00:06:24.000 | That's like a chapter in a book.
00:06:25.320 | Anyway, I mean, this is the reality of our world.
00:06:28.160 | This is a really big part of the economic system
00:06:31.520 | is cryptocurrency and stable coin is part of that.
00:06:34.080 | - Yeah, it's weird because on the one hand,
00:06:36.040 | cryptocurrency is supposed to somewhat simplify
00:06:39.000 | or add transparency to the financial markets.
00:06:41.800 | The idea is for the first time,
00:06:43.000 | you don't have to wait for an SEC filing
00:06:44.560 | from some corporate business.
00:06:46.320 | You can look at what they're doing on chain, right?
00:06:48.480 | So that's good because a lot of big financial problems
00:06:51.640 | are caused by lack of transparency
00:06:53.760 | and lack of understanding risk.
00:06:55.680 | But ironically, you get some people
00:06:58.680 | creating these arbitrarily complicated financial products
00:07:02.040 | like algorithmic stable coins,
00:07:03.600 | which then introduce more risk and blow everything up.
00:07:05.640 | So anyways, Three Arrows Capital blew up
00:07:09.200 | and all of a sudden this crypto industry,
00:07:10.720 | which everyone thought was going to the moon,
00:07:12.520 | Bitcoin to 100,000 is in some trouble.
00:07:15.760 | And FTX seems like the only people who,
00:07:19.600 | besides like Binance,
00:07:20.720 | who's also really big and stable and Coinbase,
00:07:23.600 | they seemed like they were doing fine.
00:07:25.080 | In fact, they were bailing out companies in the summer.
00:07:27.920 | I don't know if you remember that SBF
00:07:29.720 | was likened to like Jamie Diamond,
00:07:32.040 | who's the CEO of Chase,
00:07:34.360 | who like kind of was like the buck stops here.
00:07:36.680 | I'm like the backstop, right?
00:07:40.240 | So SBF was supporting the industry.
00:07:42.520 | He was like the stable guy.
00:07:44.360 | So come to like around October and November,
00:07:47.480 | there's all this talk about regulation.
00:07:49.080 | Everything's been blowing up.
00:07:50.680 | SBF's leading the charge on regulation.
00:07:53.440 | And CZ, the CEO of Binance gets word
00:07:57.240 | that maybe SBF is kind of like cutting them out
00:08:00.200 | or making regulation that would maybe impact his business.
00:08:04.000 | And he doesn't like that too much.
00:08:06.800 | They start kind of feuding a little bit on Twitter.
00:08:09.160 | So when it comes out, a CoinDesk report came out
00:08:12.480 | that FTX's balance sheet wasn't looking that good.
00:08:16.240 | Like it looked pretty weak.
00:08:17.440 | They had a lot of coins that in theory had value
00:08:21.160 | if you looked at their market price,
00:08:23.120 | but for a variety of reasons,
00:08:25.880 | if you tried to sell them, they'd collapse in value.
00:08:27.960 | So it was sort of like this thing, a house built on sand.
00:08:31.600 | And a friend of mine on Twitter,
00:08:35.000 | he goes by Dirty Bubble Media.
00:08:36.560 | He released a report and he basically said,
00:08:38.960 | "I think these guys are insolvent."
00:08:41.920 | Well, CZ saw that and he retweeted it
00:08:44.360 | and started adding fuel to the fire of like the speculation.
00:08:47.320 | 'Cause up to this point, everyone thought FTX is super safe,
00:08:49.760 | super secure.
00:08:50.720 | There's no reason to not keep your money there.
00:08:52.520 | Tom Brady keeps his money there, whatever.
00:08:54.880 | And CZ kind of adds fuel to the fire by saying,
00:08:58.200 | "Not only am I retweeting this,
00:09:00.360 | adding kind of like validity to this speculation,
00:09:03.600 | but also I'm going to take the FTT that I got,"
00:09:07.080 | which part of their balance sheet was this FTT token,
00:09:09.800 | which is FTX's like proprietary token.
00:09:13.320 | And Alameda and FTX control a lot of it.
00:09:18.120 | They were using this token to basically be
00:09:21.240 | a large amount of collateral for their whole balance sheet.
00:09:24.960 | So it accounted for this huge amount of their value.
00:09:28.480 | And the CEO of Binance had a huge chunk of it as well.
00:09:31.840 | And he said, "I'm going to sell all of it."
00:09:34.120 | And the fuel that that introduced to the market is,
00:09:36.920 | if he sells all this FTT,
00:09:38.920 | and this FTT is underwriting a lot of the value of FTX.
00:09:43.000 | - Does FTT almost approximate like similar things
00:09:47.080 | if you were to buy a stock in a public company?
00:09:49.760 | Is that kind of like a stock in FTX?
00:09:51.800 | - It sort of was this proxy
00:09:54.080 | because what FTX was committed to doing
00:09:57.200 | was sort of like buying back FTT tokens.
00:10:00.600 | They would do this thing called the buy and burn.
00:10:02.240 | I think there was some amount of sharing
00:10:04.400 | in the revenue fees of FTX.
00:10:07.200 | It was kind of this convoluted thing.
00:10:09.200 | In my opinion, the exact value of FTT
00:10:11.560 | was speculated from the beginning.
00:10:13.280 | And it was clear that it was very tied
00:10:16.520 | to the performance of FTX,
00:10:19.000 | which is important because we'll get to later,
00:10:22.680 | FTX sort of built their whole scaffold on FTT,
00:10:26.520 | which meant that this scaffold was very wobbly
00:10:28.840 | because if FTX loses a little bit of confidence,
00:10:32.080 | then your value goes down.
00:10:33.520 | When your value goes down,
00:10:34.880 | you lose more confidence and this goes down.
00:10:37.200 | So it was kind of like this thing that,
00:10:39.320 | this flywheel that when it was going well,
00:10:42.080 | you got huge amounts of growth.
00:10:43.760 | When it's going bad, you get an exchange death spiral,
00:10:48.160 | so to speak.
00:10:49.320 | - Actually, this structure,
00:10:50.920 | it's a pretty non-standard structure.
00:10:53.200 | Did the architects of its initial design
00:10:57.560 | anticipate the wobbliness of the whole system?
00:11:00.560 | So putting fraud and all those things
00:11:02.600 | that happen later aside,
00:11:03.960 | do you think it was difficult to anticipate
00:11:06.760 | this kind of FTT, FTX,
00:11:08.600 | Alameda Research, weird dynamic?
00:11:11.640 | - No, because I think sophisticated traders
00:11:15.360 | always think in terms of diversification and correlation.
00:11:19.120 | So if you're trying to,
00:11:20.640 | the way to think about risk in investing is like,
00:11:23.800 | if I invest in you, Lex Friedman,
00:11:27.000 | and then I also invest in some product you produce,
00:11:31.080 | the performance of those two things
00:11:32.120 | will be pretty correlated.
00:11:33.480 | So whether I invest in you or I invest in this product
00:11:35.880 | that you are completely behind,
00:11:38.840 | I'm not de-risking.
00:11:40.320 | I'm basically counting all on you doing well, right?
00:11:42.640 | And if you do bad, my investments do very bad.
00:11:44.680 | So if I'm trying to build a stable thing,
00:11:46.800 | I shouldn't put all my eggs in the Lex Fridman basket
00:11:50.680 | unless I'm positive that you're gonna do well, right?
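The correlation argument can be illustrated with a toy calculation: two perfectly correlated bets leave the risk unchanged, while two independent bets reduce it. The outcomes below are hypothetical:

```python
import statistics

# Diversification sketch: splitting money across two bets only reduces risk
# if their outcomes are not perfectly correlated. Hypothetical outcomes:
# each bet either returns +50% or -50% of the stake.
good, bad = 1.5, 0.5

# Perfectly correlated: both halves of the portfolio win or lose together
# (e.g. a person and a product entirely dependent on that person).
correlated = [(good + good) / 2, (bad + bad) / 2]

# Independent: the four equally likely win/lose combinations.
independent = [(a + b) / 2 for a in (good, bad) for b in (bad, good)]

print(statistics.pstdev(correlated))   # 0.5: no risk reduction at all
print(statistics.pstdev(independent))  # ~0.35: diversification helps
```

The standard deviation of the correlated pair is identical to holding a single bet; only the independent pair actually de-risks the portfolio.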
00:11:52.640 | And these people--
00:11:53.800 | - As your financial advisor,
00:11:54.960 | I would definitely recommend
00:11:56.480 | you do not put all your eggs in this basket.
00:11:58.720 | - Right, and so you can think about it like,
00:12:00.440 | if I know that,
00:12:01.680 | these people were trained to think like this.
00:12:04.440 | And so the idea that you could start this exchange,
00:12:08.080 | you're worth billions of dollars,
00:12:10.240 | and you underwrite your whole system by betting,
00:12:13.520 | putting most of it on your own token is insane.
00:12:16.400 | And what's crazy is we'll later find out
00:12:19.400 | that they were basically taking customer assets,
00:12:22.200 | which were real things,
00:12:23.480 | like Bitcoin and Ethereum,
00:12:25.000 | with risks that were not so correlated to FTX,
00:12:28.960 | and they were swapping them out.
00:12:30.240 | They were using them to go basically gamble those
00:12:33.040 | and putting FTT in its place as quote value.
00:12:37.560 | So they were increasing the risk of the system
00:12:39.760 | in order to bet big,
00:12:41.520 | with the idea that if they bet big and won,
00:12:43.880 | we'd all be singing their praises.
00:12:45.640 | If they bet big and lost,
00:12:48.760 | I don't know if they had a plan,
00:12:50.040 | but I think they were being extremely risky,
00:12:52.360 | and there's no way to avoid their knowledge of that.
00:12:55.240 | - So when you say customer assets,
00:12:57.120 | I come to this crypto exchange,
00:12:58.960 | I have Bitcoin or some other cryptocurrency,
00:13:01.320 | and this is a thing that has pretty stable value over time.
00:13:06.320 | I mean, as crypto, relatively so,
00:13:10.400 | and I'm going to store it on this crypto exchange.
00:13:13.720 | And that's the whole point.
00:13:15.160 | So this thing, to the degree that crypto holds value,
00:13:21.280 | is supposed to continue holding value.
00:13:23.520 | There's not supposed to be an extra risk
00:13:26.400 | inserted into the thing.
00:13:27.760 | - Right, and FTX was pretty clear from the beginning
00:13:30.760 | that they wouldn't invest your assets in anything else.
00:13:33.720 | They wouldn't do anything else with it.
00:13:35.240 | They wouldn't trade it.
00:13:37.200 | That's what made FTX such like a horror story
00:13:41.680 | for investor confidence,
00:13:42.960 | is basically they made every signal
00:13:46.440 | that they would not do anything nefarious with your tokens.
00:13:49.640 | They would just put them to the side,
00:13:51.720 | put them in a separate account
00:13:52.960 | that they don't have access to,
00:13:54.520 | and they just kind of wait there
00:13:56.240 | until the day that you're ready to withdraw them.
00:13:59.320 | That's explicitly what they told their customers.
00:14:03.080 | So going back to the story a little bit,
00:14:05.480 | CZ then says, "Hey, I'm selling this token
00:14:07.840 | "that underwrites the value of FTX."
00:14:09.920 | There's a total panic.
00:14:11.960 | SBF during this time says several lies,
00:14:15.600 | such as, "FTX assets are fine.
00:14:18.200 | "We have enough money to cover all withdrawals."
00:14:20.800 | And a day later, he basically admits
00:14:23.680 | that that wasn't the case.
00:14:25.160 | They don't have the money, they're shutting down.
00:14:27.120 | And then a few days later after that,
00:14:29.040 | they declared bankruptcy.
00:14:30.360 | I should be clear, there's Alameda Research,
00:14:34.520 | FTX International, and FTX US,
00:14:37.280 | which is the US side of things.
00:14:39.120 | These are three different entities.
00:14:41.680 | All of them are in bankruptcy,
00:14:43.320 | and it's not clear to the extent
00:14:45.880 | that they were commingling funds,
00:14:47.960 | but it's clear that they were commingling funds
00:14:49.760 | to some extent.
00:14:50.600 | So they kind of were taking from each other.
00:14:54.080 | And that is where the fraud happens, right?
00:14:56.440 | Because if going back to our earlier analogy,
00:14:59.280 | if you're supposed to set funds aside,
00:15:01.280 | and I find out you were using it
00:15:02.840 | to go make all these arbitrage trades,
00:15:05.400 | or do market making, or all these activities
00:15:07.640 | you were known for, for your hedge fund trading firm thing,
00:15:10.320 | that's a huge problem, because he basically lied about this.
00:15:14.080 | And especially when he's saying explicitly
00:15:16.760 | that we have these things, we have these funds,
00:15:19.200 | and these things turn out to be lies,
00:15:21.240 | well, again, the question of fraud comes in,
00:15:23.520 | and it's just like, there's no way he didn't know.
00:15:25.880 | So the obvious question might be,
00:15:27.880 | well, why isn't he locked up?
00:15:29.760 | Why is he running around?
00:15:31.760 | And it's because really his story
00:15:35.640 | is that he didn't know any of it.
00:15:37.080 | He found out that they had, to steel man his position,
00:15:40.800 | he would say he was totally disconnected
00:15:44.240 | from what Alameda was doing.
00:15:45.840 | He had no idea that they had such a large margin position
00:15:50.320 | that they'd had an accounting quirk,
00:15:52.680 | and that accounting quirk hid $8 billion from his view.
00:15:57.680 | And so when he was saying that they had money to cover it,
00:16:01.840 | he was saying that truthfully to the best of his ability,
00:16:05.080 | and he just was so distracted at the time
00:16:07.480 | that he made a series of increasingly embarrassing mistakes.
00:16:11.400 | And now he owes it to the people to right those wrongs
00:16:16.240 | by publicly making this huge apology to us.
00:16:18.720 | You might've seen him on,
00:16:19.960 | I mean, he's been talking to nearly everyone,
00:16:23.200 | about basically how he just didn't know what he was doing.
00:16:26.600 | He's the stupidest man alive.
00:16:28.440 | It's basically it.
00:16:29.280 | - So what are some interesting things
00:16:30.480 | you've learned from those interviews?
00:16:32.320 | - I think I've appreciated
00:16:36.640 | why you don't talk in that position.
00:16:39.200 | Most people wouldn't talk.
00:16:40.600 | Most people would listen to their legal counsel and not talk.
00:16:43.440 | I do not think any lawyer worth their salt
00:16:46.240 | would tell him to talk, because right now,
00:16:48.160 | I think the danger of what he's doing
00:16:49.520 | is he's locking himself into a story
00:16:52.000 | of how things happened,
00:16:53.920 | and I don't think that story is gonna hold up
00:16:55.600 | in the coming months,
00:16:56.760 | because I think it's impossible,
00:16:58.800 | from the insider conversations I've had
00:17:00.680 | with Alameda Research employees, with FTX employees,
00:17:04.400 | it's impossible to square what they are telling me
00:17:08.120 | with no incentive to lie
00:17:11.880 | with what SBF is telling me with every incentive to lie,
00:17:15.760 | which is fundamentally that he didn't know
00:17:18.280 | they were commingling funds.
00:17:19.400 | He didn't know they were gambling with customer money,
00:17:22.280 | and it was basically this huge mistake,
00:17:24.440 | and it's Alameda's fault,
00:17:26.080 | but he wasn't involved in Alameda, a company he owned.
00:17:28.520 | - So like a compassionate,
00:17:30.240 | but hard-hitting gangster that you are,
00:17:33.000 | very recently, you interacted with SBF on,
00:17:37.520 | I like how you adjust the suspenders as you're saying this.
00:17:41.120 | You interact on Twitter Spaces with SBF,
00:17:47.400 | and really put his feet to the fire
00:17:51.240 | with some hard-hitting questions.
00:17:53.400 | What did you ask,
00:17:54.480 | and what did you learn from that interaction?
00:17:56.640 | - Sure, I should say first,
00:17:58.160 | this was not a willing interaction.
00:17:59.960 | I mean, I thought that was kind of the funny part of it,
00:18:01.880 | 'cause I've been asking him for an interview for a while.
00:18:03.560 | He's been giving interviews to nearly everyone who wants one,
00:18:06.320 | big channels, small channels.
00:18:07.840 | He didn't give me one,
00:18:09.880 | but I managed to get some by sneaking up
00:18:12.360 | on some Twitter Spaces and DMing the people,
00:18:14.560 | and being like, "Hey, can I come up?"
00:18:16.360 | So I didn't get him to ask everything that I wanted,
00:18:19.400 | because he would leave sometimes,
00:18:22.640 | after I asked some of the questions.
00:18:25.120 | But really what I asked was about this eight billion,
00:18:28.800 | and zooming in on the improbability
00:18:32.440 | of his lack of knowledge.
00:18:34.120 | It's sort of like if you run a company,
00:18:38.120 | and you know the insides and outs,
00:18:40.440 | and you're the top of your field, top in class,
00:18:44.680 | and all of a sudden it all goes bust,
00:18:45.960 | and you say, "I had no idea how any of this worked."
00:18:48.360 | I didn't know, it's like the guy who runs Whataburger
00:18:51.120 | saying, "I didn't know where we sourced our beef."
00:18:53.520 | I didn't know where we, that's a Texas example, actually.
00:18:56.040 | - Thank you, I appreciate that.
00:18:57.200 | - Let's take it worldwide, Walmart.
00:19:00.400 | I didn't know we used Chinese manufacturers.
00:19:03.480 | That's impossible.
00:19:04.720 | To become Walmart, you have to know
00:19:07.760 | how your supply chain works.
00:19:09.360 | You have to know, even if you're at a high level,
00:19:11.200 | you know how this stuff works.
00:19:12.400 | - Can I do a hard turn on that,
00:19:14.080 | and go as one must to Hitler?
00:19:16.640 | Hitler's writing is not on any of the documents
00:19:20.480 | around, as far as I know, on the final solution.
00:19:23.920 | So in some crazy world, he could theoretically say,
00:19:28.440 | "I knew, I didn't know anything about the death camps."
00:19:32.320 | So there's this plausible deniability, in theory.
00:19:36.480 | But that, most people would look at that and say,
00:19:40.120 | "It's very unlikely, you don't know."
00:19:43.720 | Especially if all the insiders are coming out
00:19:45.480 | and saying, "No, no, no, of course he knew.
00:19:46.720 | "He was directing us from the top."
00:19:48.680 | I mean, what was clear, what's interesting
00:19:51.400 | about the structure, and I love the nitty-gritty--
00:19:53.640 | - Sorry, we're back to SBF.
00:19:55.120 | - Yeah, sorry. - We went to Hitler,
00:19:55.960 | now we're back to SBF. - I wanted to turn us
00:19:58.200 | as fast as possible away from Hitler.
00:20:00.200 | (laughing)
00:20:02.440 | - So the insiders in what, Alameda Research?
00:20:05.600 | - Alameda Research.
00:20:06.680 | What was interesting is that there was this
00:20:08.960 | sort of one-way window going on,
00:20:11.040 | between Alameda and FTX, where FTX employees
00:20:14.800 | didn't know a lot of what was going on.
00:20:16.760 | Alameda insiders, and I would say by design,
00:20:20.840 | knew almost everything that was going on at FTX.
00:20:23.800 | And so I think that was really interesting
00:20:28.040 | from the perspective of a lot of the so-called,
00:20:31.760 | like what you could, what he's trying to ascribe to
00:20:34.320 | as like failures or mistakes or ignorance and negligence.
00:20:39.200 | When looked at closely, are much more designed,
00:20:43.400 | and they sort of don't arise spontaneously.
00:20:46.600 | 'Cause like let's say, so there's this thing in banking
00:20:49.040 | and like trading, where if I run a bank
00:20:51.360 | and you run like a trading firm,
00:20:52.920 | we need to have informational walls between us,
00:20:55.160 | 'cause there's huge conflicts of interest that can arise.
00:20:58.400 | So the negligent argument might be that like,
00:21:00.800 | oh, we just didn't know, we're sort of these dumb kids
00:21:03.080 | in the Bahamas, so we shared information equally.
00:21:06.720 | But when you see a one-way wall,
00:21:08.440 | that starts to look a lot different, right?
00:21:10.720 | If I have a backend source of looking at,
00:21:13.120 | or sorry, you're the trading firm.
00:21:14.240 | So you have a backend way to look at all my accounts,
00:21:17.440 | and I have no idea that you're doing that,
00:21:19.720 | that all of a sudden looks like a much more designed thing.
00:21:22.600 | When it would be plausible, let's say going to use
00:21:26.720 | another analogy too, if you're saying, look,
00:21:30.000 | I commingled funds 'cause I was so bad
00:21:31.800 | at corporate structures, you would expect those companies
00:21:35.440 | to have a very simple corporate structure
00:21:37.760 | 'cause you didn't know what you're doing, right?
00:21:39.440 | But what we see with FTX and Alameda,
00:21:42.360 | is they had something like 50 companies and subsidiaries,
00:21:45.720 | and this impossibly complicated web of corporate activity.
00:21:50.720 | You don't get there by accident, you don't wake up and go,
00:21:55.640 | oh, I designed like this watch that ticked
00:21:58.360 | a very specific way, but it was all accidental.
00:22:01.200 | If you really didn't know what you were doing,
00:22:02.720 | you'd end up with a simple structure.
00:22:04.640 | So even just like from a fundamental perspective,
00:22:08.960 | what SBF was doing and like the activities
00:22:11.560 | they were engaging in were so complicated
00:22:13.920 | and purposely designed to obfuscate what they were doing,
00:22:17.200 | it's impossible to subscribe to the negligence argument.
00:22:21.760 | And I wanna quickly say too, like,
00:22:23.800 | I don't think a lot of people have honed in on this.
00:22:25.720 | There was insider trading going on
00:22:28.360 | from Alameda's perspective, where they would know
00:22:32.520 | what coins FTX was going to list on their exchange.
00:22:36.000 | There's a famous effect where,
00:22:37.800 | let's say you're this legitimate exchange,
00:22:39.360 | you list a coin, the price spikes.
00:22:41.400 | Insiders told me it was a regular practice
00:22:45.080 | for Alameda insiders to know that FTX
00:22:48.400 | was going to list a coin and as a company buy up that coin,
00:22:52.040 | so they could sell it after it listed.
00:22:54.000 | And they made millions of dollars.
00:22:55.000 | How do you do that accidentally?
00:22:56.600 | - Yeah, and that's illegal.
00:22:59.240 | - Totally illegal.
00:23:00.360 | So that's illegal from like,
00:23:02.480 | and if an individual does it, it's illegal, it's fraud.
00:23:05.240 | What if a company is systematically doing it?
00:23:08.400 | And you can't tell me that FTX or someone at FTX
00:23:13.000 | wasn't feeding that information to Alameda
00:23:16.200 | or somehow giving them keys to know that.
00:23:19.440 | And that would happen at the highest level.
00:23:21.080 | It would happen at SBF's level.
00:23:22.760 | And this is why his arguments of, I was dumb, I was naive,
00:23:26.760 | I was sort of ignorant are so preposterous
00:23:29.360 | because he's dumb and ignorant
00:23:31.080 | the second it becomes criminal to be smart
00:23:34.000 | and sophisticated, right?
00:23:36.000 | - But then also coming out and talking about it,
00:23:38.080 | which is a--
00:23:39.680 | - It's a bizarre move.
00:23:40.600 | - It's a bizarre and almost a dark move.
00:23:43.480 | Can you tell the story of the 8 billion?
00:23:45.880 | You mentioned 8 billion.
00:23:46.760 | What's the 8 billion?
00:23:47.600 | What's the missing 8 billion?
00:23:48.480 | - Yeah, so it's really interesting
00:23:50.440 | because it's sort of like wire fraud.
00:23:52.680 | You're sort of, he's sort of copping to like smaller crimes
00:23:56.600 | to avoid the big crime.
00:23:57.680 | The big crime is you know everything
00:23:59.240 | and you were behind it, right?
00:24:01.120 | The smaller crimes are like a little wire fraud here,
00:24:04.840 | little wire fraud there.
00:24:06.040 | So what the 8 billion is,
00:24:07.640 | is that FTX didn't always have banking.
00:24:09.600 | It's hard to get banking as a crypto exchange.
00:24:11.720 | There's all these questions of like,
00:24:12.720 | where's the money coming from?
00:24:13.800 | Is it coming from money launderers?
00:24:15.000 | So for a variety of reasons,
00:24:17.080 | it's always been hard for exchanges to get bank accounts.
00:24:19.280 | So before when FTX was just getting started,
00:24:22.400 | they didn't have a bank account.
00:24:23.440 | So how do you put money on FTX?
00:24:24.760 | Well, they would have you wire your money
00:24:27.360 | to their trading firm.
00:24:28.760 | Their wiring instructions would go to their trading firm.
00:24:30.760 | It's easier to get banking as a trading firm.
00:24:33.320 | So you'd put your money with the trading firm
00:24:35.040 | and then they'd credit you the money on FTX.
00:24:38.800 | Okay, first of all, this is a whole circumvention
00:24:40.800 | of all these banking guidelines and regulations.
00:24:43.960 | That's the first like thing that I think is illegal.
00:24:46.600 | But basically what SBF argued
00:24:50.720 | is that there was an accounting glitch error problem
00:24:55.440 | where when you'd send money to Alameda,
00:24:58.000 | even though they'd credit you on FTX,
00:25:00.240 | they wouldn't safeguard your deposits.
00:25:02.240 | Like your deposits would go into
00:25:03.680 | what he called a stub account,
00:25:05.400 | which is just like some account that's not very well labeled
00:25:09.560 | kind of like a placeholder account.
00:25:11.560 | And he didn't realize that those were Alameda's funds
00:25:16.120 | or they were playing with those funds
00:25:17.760 | and that they basically should have safeguarded that
00:25:19.680 | for customers.
00:25:20.520 | That's his explanation.
00:25:22.200 | It's preposterous 'cause it's $8 billion, but anyways.
00:25:25.800 | - Just poor labeling of accounts.
00:25:27.760 | - Of an $8 billion account.
00:25:29.880 | I mean, it's like-- - What's a billion?
00:25:31.640 | - A billion, this is like--
00:25:32.960 | - I do this all the time in programming.
00:25:33.800 | - This is the craziest thing.
00:25:35.720 | Like he was talking to me
00:25:37.400 | and at one point in the conversation, he's like,
00:25:39.440 | "Yeah, I didn't have precise knowledge."
00:25:41.520 | 'Cause he said, "I didn't have knowledge
00:25:42.800 | "of Alameda's accounts."
00:25:44.280 | And I said, "Well, Forbes a month ago
00:25:45.800 | "was getting detailed accounting of Alameda."
00:25:48.680 | And he goes, "Oh, that wasn't detailed accounting.
00:25:51.360 | "I just knew I was right within 10 billion or so."
00:25:53.920 | What is that error margin?
00:25:57.040 | $10 billion for a company that is arguably
00:26:01.200 | never worth more than 100 billion.
00:26:03.840 | Probably never even worth more than 50 billion.
00:26:06.760 | Your error margin is $10 billion.
00:26:08.920 | You have to be, this is a guy who is sending around
00:26:12.800 | statements that like there was no risk involved.
00:26:15.520 | And you're telling me he had an error margin
00:26:17.720 | of $10 billion.
00:26:18.720 | That is the difference between like a healthy company
00:26:21.680 | and a company so deep underwater you're going to jail.
00:26:24.640 | So you have to believe that he is impossibly stupid
00:26:29.640 | and square that with the sophistication
00:26:32.520 | that he brought to the table.
00:26:33.760 | I think it's an impossible argument.
00:26:35.000 | I don't even think it's.
00:26:36.120 | - Do you think he is incompetent, insane or evil?
00:26:40.160 | If you were to bet money on each of those.
00:26:45.720 | - Incompetent, insane or evil.
00:26:48.600 | - Insane meaning he's lying to everyone,
00:26:51.440 | but also to himself, which is a little bit different
00:26:56.440 | than incompetence.
00:26:57.680 | - He's not incompetent.
00:26:59.040 | So I think he's some combination of insane and evil,
00:27:05.200 | but it's impossible to know unless we know
00:27:07.280 | deep inside his heart how self-deluded he is
00:27:12.280 | versus a calculated strategy.
00:27:15.160 | And I think if you look at SBF,
00:27:18.560 | I think he's a fascinating individual.
00:27:20.880 | Just, I mean, you know, he's a horrible human being.
00:27:23.680 | Let's start there.
00:27:24.720 | But he's also somewhat interesting
00:27:28.920 | from a psychology perspective
00:27:30.480 | because he's very open about the fact
00:27:32.760 | that he understands image
00:27:35.520 | and he understands how to cultivate image,
00:27:38.360 | the importance of image so well
00:27:41.280 | that I think a lot of people,
00:27:42.760 | even though they've talked about it,
00:27:44.080 | aren't emphasizing that enough when interacting with him.
00:27:48.200 | This is a man whose entire history
00:27:50.680 | is about cultivating the right opinions at the right time
00:27:54.400 | to achieve the right effect.
00:27:56.120 | Why do you think he would suddenly change that approach
00:27:59.040 | when he has all the more reason to cultivate an image?
00:28:01.760 | - So he is extremely good naturally or--
00:28:05.800 | - I don't know if he's good,
00:28:07.600 | but he's like, he's hyper aware.
00:28:10.160 | - So he's deliberate in cultivating a public image
00:28:12.880 | and controlling the public image.
00:28:14.920 | - You know about the like Democrat donations.
00:28:17.680 | Like he knew to donate to the right people, $40 million.
00:28:21.600 | He says on a call that we released with Tiffany Fong,
00:28:25.800 | he says on a call
00:28:26.640 | that he donated the same amount to Republicans.
00:28:29.640 | There's speculation on whether this is true
00:28:31.360 | 'cause he's a liar or whatever.
00:28:32.720 | So caveat there.
00:28:33.880 | But he said he donated to Republicans the same amount,
00:28:36.160 | but he donated dark
00:28:37.440 | because he knew that most journalists are liberal
00:28:40.920 | and they would kind of hold that against him.
00:28:43.160 | So he wanted all the sides to be in his favor, in his pocket
00:28:48.160 | while simultaneously understanding the entire media landscape
00:28:53.720 | and playing them like a fiddle
00:28:55.520 | by cultivating this image of,
00:28:57.080 | I'm this progressive woke billionaire
00:28:59.800 | who wants to give it all away, do everything for charity.
00:29:02.920 | I drive a Corolla
00:29:04.120 | while living in a million dollar penthouse, multimillion.
00:29:07.200 | But that was sort of the angle.
00:29:09.040 | He understood so well how to play the media.
00:29:11.840 | And I think he underestimated when he did this,
00:29:16.840 | how much people would put him under a deeper microscope.
00:29:21.080 | And I don't think he has achieved the same level of success
00:29:25.600 | in cultivating this new image.
00:29:26.920 | 'Cause I think people are so skeptical now,
00:29:28.480 | no one's buying it.
00:29:29.880 | But I think he's trying it.
00:29:31.680 | He's doing it to the best of his ability.
00:29:32.520 | - But it has worked leading up
00:29:33.840 | to this particular wobbly situation.
00:29:37.520 | So before that,
00:29:38.680 | wasn't there a public perception of him being
00:29:41.000 | a force for good, a financial force for good?
00:29:44.040 | - 100%, yeah.
00:29:45.400 | Somebody from Sequoia Capital,
00:29:48.120 | wrote this glowing review
00:29:49.360 | that he's gonna be the world's first trillionaire.
00:29:51.040 | There's so many pieces done
00:29:51.960 | on he's the most generous billionaire in the world
00:29:54.440 | that he was sort of like the steel man of,
00:29:59.440 | it's possible to make tons of money.
00:30:02.440 | This is like the effective altruism movement,
00:30:03.960 | make as much money as you can, as fast as you can,
00:30:06.080 | and then give it all away.
00:30:07.400 | And he was sort of like the poster child for that.
00:30:10.800 | And he did give some of it away
00:30:12.560 | and got a lot of press for it.
00:30:15.280 | And I think that was kind of by design.
00:30:16.960 | I wanna address a real quick point.
00:30:18.800 | A lot of people have said that like Binance played a role.
00:30:22.080 | And while they catalyze this,
00:30:24.160 | insolvency is a problem
00:30:28.480 | that will eventually manifest either way.
00:30:30.880 | So I don't put any blame on CZ
00:30:33.280 | for basically causing this meltdown.
00:30:36.840 | The underlying foundation was unstable
00:30:39.680 | and it was gonna fall apart at the next push.
00:30:41.560 | I mean, he just happened to be the final kind of like,
00:30:44.400 | I don't know, the straw that broke the camel's back.
00:30:48.480 | - Yeah, the catalyst that revealed the fraud.
00:30:50.280 | - Yeah, but it's like, I don't think he's culpable
00:30:53.040 | for FTX's malfeasance in how they handled accounts,
00:30:57.800 | if that makes sense.
00:30:58.640 | - So what role did they play?
00:31:01.200 | Could they have helped alleviate some of the pain
00:31:04.400 | of investors, of people that held funds there?
00:31:09.400 | - Yeah, I mean, probably, I don't know.
00:31:14.680 | I would see some kind of weird obligation
00:31:19.400 | like with the 2 billion they made on FTX.
00:31:21.960 | Remember they got 2 billion, some of it in cash,
00:31:24.640 | some of it in FTT tokens.
00:31:26.640 | I don't know how much actual cash they have from that deal,
00:31:29.040 | but they have billions from the success
00:31:30.840 | of what seems to be a fraud.
00:31:32.400 | It seems to have been a fraud from early on.
00:31:34.080 | They had the backdoor as early as 2020,
00:31:38.240 | from what I can tell.
00:31:39.080 | - So the backdoor between FTX and Alameda?
00:31:41.440 | - Alameda, yes.
00:31:42.280 | - Alameda, I see.
00:31:43.120 | Do you think CZ saw through who SBF is?
00:31:46.640 | What he's doing?
00:31:50.080 | - No, I think CZ is like, he's a shark.
00:31:54.720 | I think he's good at building a big business.
00:31:57.200 | - Like a good shark or a bad shark?
00:31:59.720 | I don't know.
00:32:00.560 | - I don't know, I think sharks just eat.
00:32:01.920 | I mean, I don't know.
00:32:03.280 | I think--
00:32:04.160 | - My relationship with shark has like a finding Nemo,
00:32:06.480 | there's a shark in that.
00:32:07.880 | - Sure. - I don't know.
00:32:08.720 | - I think like Jeff Bezos is a shark.
00:32:10.920 | So whether people have loaded connotations
00:32:13.000 | of like how they feel about Jeff Bezos.
00:32:15.120 | I mean, I would say like,
00:32:16.240 | I think CZ is a ruthless businessman.
00:32:18.960 | I think he's cold, he's calculated, he's very deliberate.
00:32:23.560 | And I think what he should do in this position
00:32:27.080 | is forfeit the funds that he profited from that investment
00:32:30.720 | because largely I think it was,
00:32:33.320 | it's owed to the customers.
00:32:35.320 | There's so much hurting out there.
00:32:36.680 | So I think they could do a lot of good around that.
00:32:39.760 | I don't know if they will,
00:32:41.080 | 'cause I don't know if he sees it in his best interest.
00:32:43.720 | I think that's probably how he's thinking.
00:32:45.760 | But yeah, I think they could have helped
00:32:49.360 | or they could still help there.
00:32:51.200 | - Who do you think suffered the most from this so far?
00:32:54.240 | - The little account holders, this is always true.
00:32:58.240 | So one of the big temptations with fraud,
00:33:01.120 | I've covered a lot of scams, frauds,
00:33:03.480 | is everyone looks at the big number.
00:33:05.280 | Everyone, that's the headline, billions of dollars,
00:33:08.080 | the top 50 creditors, what everyone thinks at first.
00:33:11.440 | But quickly when you dig down,
00:33:12.840 | you realize that the people who lost $10 million,
00:33:15.280 | I mean, I'm sure that's terrible for them.
00:33:16.680 | I wish them to get their money back,
00:33:18.360 | but it's usually the people with like 50,000 or less
00:33:22.120 | that are most impacted.
00:33:23.320 | Usually they do not have the money to spare.
00:33:25.640 | Usually they're not diversified in a sophisticated way.
00:33:28.840 | So I think it's those people.
00:33:30.920 | I think it's the small account holders
00:33:32.480 | that I feel the worst for.
00:33:33.400 | And unfortunately they often get the least
00:33:35.400 | press time or air time.
00:33:37.120 | - That's the really difficult truth of this
00:33:39.120 | is that especially in the culture of cryptocurrency,
00:33:41.520 | there's a lot of young people who are not diversified.
00:33:45.560 | They're basically all in on a particular crypto.
00:33:49.440 | And it just breaks my heart to think
00:33:51.360 | that there's somebody with a 50,000 or 30,000 or 20,000.
00:33:54.800 | But the point is that money is everything they own.
00:33:58.680 | And now their life is basically destroyed.
00:34:04.000 | Like, you know, imagine you're 18, 19, 20, 21 years old.
00:34:09.360 | You saved up, you've been working, you saved up.
00:34:12.640 | And this is it.
00:34:13.840 | This is basically destroying a life.
00:34:16.160 | - What's so brutal about this is that
00:34:18.720 | this all comes on the back.
00:34:20.120 | The entire crypto market comes on the back of,
00:34:24.640 | comes from the deep distrust of traditional finance.
00:34:28.600 | Right, yeah, 2008,
00:34:29.880 | everybody lost trust in the banking systems.
00:34:33.000 | And they lost trust that if those banking systems
00:34:35.880 | acted in a corrupt way,
00:34:37.600 | that they would receive the justice.
00:34:39.280 | It turned out that the banks received favorable treatment,
00:34:41.680 | people didn't.
00:34:42.920 | So people lost faith in the structure
00:34:47.920 | of our financial system in a way that is,
00:34:50.240 | we're still feeling the reverberations of it.
00:34:52.760 | And so when crypto came along,
00:34:53.720 | it was like kind of this way to reinvent the wheel,
00:34:55.960 | reinvent the world for the like sort of lowly
00:34:59.840 | and the like less powerful
00:35:01.600 | and kind of level the playing field.
00:35:03.680 | So what's so sad about events like this
00:35:06.200 | is you see that fundamentally
00:35:08.560 | a lot of the power structures are the same
00:35:10.960 | where the people at the top
00:35:13.160 | face little repercussions for what they do.
00:35:15.080 | The people at the bottom are still getting screwed.
00:35:17.560 | The people at the bottom are still getting lied to.
00:35:19.640 | And law enforcement is way behind the ball.
00:35:23.280 | - Do you think this really damaged people's trust
00:35:27.720 | in cryptocurrency?
00:35:28.720 | - For sure, way bigger than Luna,
00:35:31.040 | way bigger than Three Arrows Capital.
00:35:33.040 | It's because of who SBF was.
00:35:35.320 | It's not just the dollar figure behind it.
00:35:37.960 | It's because he wooed so many of our media elites
00:35:42.120 | who should have been calling him out
00:35:44.560 | or at least investigating him and not rubber stamping him.
00:35:48.100 | It's an indictment of our financial system,
00:35:51.840 | even our most sophisticated people in BlackRock,
00:35:54.360 | in Sequoia who didn't see this coming,
00:35:56.920 | who also rubber stamped him.
00:35:58.960 | And you just wonder like if you can't trust
00:36:03.440 | the top people in crypto who are supposedly the good guys,
00:36:07.320 | the guys saving crypto,
00:36:09.020 | just a month ago or two months ago,
00:36:12.400 | he was the guy on Capitol Hill
00:36:14.400 | that was talking to Gary Gensler,
00:36:15.720 | to all the top people in Washington.
00:36:18.040 | He was orchestrating the regulation of crypto.
00:36:21.080 | If that guy is a complete fraudster, liar, psychopath,
00:36:26.080 | and nobody knew it until it was too late,
00:36:28.800 | what does that mean about the system itself
00:36:31.340 | that we're building?
00:36:32.400 | - So you are one of, if not the best,
00:36:35.360 | like fraudster investigators in the world.
00:36:39.040 | Did you sense, was this on your radar at all,
00:36:42.400 | SBF over the past couple of years?
00:36:46.040 | Were any red flags going off for you?
00:36:49.040 | - Yes, so funny enough,
00:36:51.240 | one of my videos from six months ago or so blew up
00:36:54.080 | because I gotta give a lot of credit to Matt Levine
00:36:57.400 | of Bloomberg, great journalist, great finance journalist.
00:37:00.400 | And I wanna say when I talk about media elite,
00:37:03.360 | there are people doing great work
00:37:05.640 | in these mainstream institutions.
00:37:07.040 | It's not a monolith.
00:37:08.200 | Just like independent media isn't all doing great work
00:37:10.760 | and all the corporate media is bad or whatever.
00:37:12.760 | There's like these overarching narratives
00:37:14.680 | that I don't subscribe to.
00:37:16.120 | So Matt Levine's a great journalist.
00:37:17.440 | He did an interview with SBF where he got Sam
00:37:21.960 | to basically describe a lot of what was going on in DeFi,
00:37:25.560 | but it kind of a larger philosophy around crypto.
00:37:28.120 | And he described a Ponzi scheme
00:37:29.860 | where he just described this black box.
00:37:31.600 | It does nothing, but if we ascribe it value,
00:37:34.160 | then we can create more value and more value and more value.
00:37:36.320 | And it kind of was this ridiculous description
00:37:38.680 | of a Ponzi scheme, but there was no moral judgment on it.
00:37:40.800 | It was like, oh yeah, this is great.
00:37:41.840 | We can make a lot of money from this.
00:37:43.480 | And Matt is like, well, it sounds like you're
00:37:45.800 | in the Ponzi business and business is good.
00:37:48.520 | I made a video about that.
00:37:49.700 | I said, this is ridiculous.
00:37:50.960 | This is absurd, whatever.
00:37:52.160 | It's obscene.
00:37:53.680 | But I didn't explicitly call SBF a fraud there.
00:37:58.680 | And I think if I'm being, I think I saw some of it,
00:38:04.320 | but like many people, I think a lot of us were kind of like,
00:38:14.560 | I think a lot of us missed how wrong everyone could be
00:38:18.600 | at the same time.
00:38:19.680 | I did notice leading up to the crash, what was happening.
00:38:23.720 | And I called it out a day or a day and a half
00:38:26.920 | before it happened.
00:38:28.040 | 'Cause I saw my friends post a Dirty Bubble Media.
00:38:31.080 | And this was the first real look
00:38:32.720 | into the heart of their finances.
00:38:35.080 | 'Cause they're this black box.
00:38:36.480 | You just kind of had to evaluate them without knowing much.
00:38:41.880 | And once we got a peek under the hood
00:38:44.640 | of what their finances were, I realized, oh my gosh,
00:38:47.840 | these guys might be completely insolvent.
00:38:49.240 | So I made a tweet about it.
00:38:50.880 | I hope some people saw it and got their money out.
00:38:53.380 | But pretty quickly after that, I caught the narrative
00:38:58.120 | of what was really going on at Alameda,
00:39:00.780 | that it was basically this Ponzi scheme that they had built.
00:39:04.020 | - Do you ever sit like Batman in the dark
00:39:06.600 | since you fight crime and wonder, like sad,
00:39:10.480 | just staring into the darkness and thinking,
00:39:12.240 | I should have caught this earlier?
00:39:13.940 | - Yeah, I think--
00:39:18.120 | - In your $10 billion studio.
00:39:20.840 | - $10 million studio. - $10 million studio.
00:39:22.560 | - We're working our way there.
00:39:24.640 | - With a bunch of cocaine on the table.
00:39:26.160 | - It's never enough.
00:39:27.240 | It's never enough.
00:39:28.800 | You always could be catching stuff sooner.
00:39:31.320 | You always could be doing more.
00:39:33.520 | - I mean, the fascinating thing you said
00:39:35.480 | is that one of the lessons here
00:39:38.180 | is that a large number of people,
00:39:40.800 | influential, smart people,
00:39:42.360 | could all be wrong at the same time
00:39:44.520 | in terms of their evaluation of SBF.
00:39:47.880 | - This is one thing that I don't understand too,
00:39:50.920 | is like, I think it's one thing to not see something.
00:39:54.120 | I think it's another thing to like rubber stamp
00:39:56.680 | or explicitly endorse.
00:39:59.120 | I feel like a lot of people didn't look too close at SBF
00:40:04.120 | because I think a lot of the warning signs were there.
00:40:07.320 | But my feeling is, if you're a Sequoia,
00:40:09.620 | if you're a BlackRock, wouldn't you do that due diligence?
00:40:13.760 | I mean, like before just endorsing something,
00:40:17.320 | especially in the crypto space,
00:40:18.580 | this is just why I don't do any deals
00:40:20.300 | in the crypto space ever,
00:40:21.620 | 'cause it's impossible to know which ones
00:40:23.660 | are gonna be the big hits or the big frauds or whatever.
00:40:27.100 | But if you're gonna make that bet,
00:40:29.300 | if you're gonna make that play,
00:40:30.820 | you would think that you would do all the research
00:40:34.300 | in the world and you would get sophisticated looks
00:40:37.220 | at their liabilities, at how they were structured,
00:40:39.920 | all that stuff.
00:40:41.040 | And that is the most shocking part,
00:40:43.080 | is not that people missed it because you can miss fraud,
00:40:47.160 | but that there were so many glowing endorsements,
00:40:50.080 | like this guy is not gonna be that thing.
00:40:52.560 | We explicitly endorse him.
00:40:54.280 | I saw a Fortune magazine.
00:40:55.720 | He was called the next Warren Buffett.
00:40:58.240 | It's just crazy.
00:40:59.200 | - Do you think it's possible to have enough
00:41:00.600 | like Tom Brady endorsements
00:41:03.720 | that you don't really investigate?
00:41:05.480 | So like, that there's a kind of momentum,
00:41:07.740 | like societal social momentum.
00:41:09.420 | - I think that's the problem.
00:41:11.060 | I actually think that's hugely a blind spot of our society
00:41:16.060 | is we have all these heuristics
00:41:19.740 | that can be points of failure,
00:41:22.540 | where like a rule of thumb is,
00:41:24.980 | if you go to an Ivy League, well, you must be smart.
00:41:27.540 | A rule of thumb is,
00:41:28.620 | if both your parents are Harvard lawyers,
00:41:30.220 | you must know the law.
00:41:31.960 | You must like kind of be sophisticated.
00:41:34.280 | The rule of thumb is, if you're running
00:41:35.620 | a billion dollar exchange,
00:41:37.120 | you must be somehow somewhat ethical, right?
00:41:39.980 | And all of these heuristics can lead to giant blind spots
00:41:43.300 | where you kind of just go, oh, we'll check.
00:41:45.900 | Like I don't really,
00:41:46.740 | it's a lot of energy to look into people.
00:41:48.840 | And if enough of those rules of thumb are met,
00:41:51.020 | we just kind of check them off
00:41:52.100 | and put them through the system.
00:41:53.180 | So yeah, it's been hugely exposing
00:41:55.660 | for sort of like our blind spots.
00:41:57.860 | - And you don't know maybe how to look in.
00:42:00.340 | For example, there's a few assumptions.
00:42:02.280 | Now there's a lot of people are very skeptical
00:42:04.260 | of institutions of government and so on,
00:42:05.820 | but perhaps too much so.
00:42:08.780 | - I agree.
00:42:09.620 | - But for some part of history,
00:42:11.260 | there was too much faith in government.
00:42:13.980 | And so right now I think there's faith
00:42:17.180 | in certain large companies.
00:42:19.020 | There's distrust in certain ones and trust in others.
00:42:22.140 | Like people seem to distrust Facebook,
00:42:24.340 | extremely skeptical of Facebook,
00:42:26.740 | but they trust, I think Google with their data.
00:42:29.620 | I think they trust Apple with their data.
00:42:31.180 | - Sure. - Much more so.
00:42:32.560 | Like with search, people don't seem to be wary of Google search.
00:42:36.080 | Like, I'm just gonna, I'm gonna put it in there.
00:42:38.920 | Have you ever looked at your Google search history?
00:42:41.040 | Your Google search history has gotta be
00:42:42.880 | some of the darkest things.
00:42:44.680 | - Oh, I don't think I've ever looked
00:42:46.720 | at my Google search history.
00:42:47.920 | - You should. - I'm pretty careful
00:42:49.020 | with like browser hygiene and stuff like that,
00:42:54.020 | because I think it's--
00:42:56.360 | - Well, Google search history,
00:42:57.320 | unless you explicitly delete is there.
00:42:58.960 | I recommend you look at it.
00:43:00.280 | It's fascinating.
00:43:01.120 | Look, because it goes back to the first days
00:43:03.960 | of you using Google search history.
00:43:05.640 | - Fun fact actually about that.
00:43:07.080 | No, no, no, I am aware of that.
00:43:10.080 | I just mean for like certain sensitive topics
00:43:12.640 | where like I'm investigating some fraud
00:43:14.320 | and I go sign into their website, right?
00:43:16.440 | Log in.
00:43:17.360 | I won't use like a traditional browser.
00:43:19.180 | I'll use a VPN and I'll like put it on like Brave
00:43:21.560 | or something like that.
00:43:22.480 | - You log in, you create an account as Lex Fridman.
00:43:25.000 | - Yeah, exactly.
00:43:26.400 | Exactly, exactly.
00:43:27.800 | - You mentioned effective altruism.
00:43:29.440 | - Yes.
00:43:30.280 | - So SBF has been associated with this effective altruism,
00:43:32.960 | which made me look twice at EA and see like, wait,
00:43:37.460 | what's going on here?
00:43:42.540 | Was this used by SBF to give himself a good public image
00:43:48.320 | or is there something about effective altruism
00:43:51.520 | that makes it easy to misuse by bad people?
00:43:56.040 | What do you think?
00:43:56.880 | - Yeah, it's interesting.
00:43:57.700 | He could have endorsed a wide range of philosophies
00:44:00.680 | and I guess you just have to wonder,
00:44:02.520 | would those philosophies also be tainted
00:44:05.800 | if he had gone bad?
00:44:09.120 | I guess effective altruism is sort of unique
00:44:11.360 | because he used it as part of his brand.
00:44:13.880 | It wasn't like he described himself as a consequentialist
00:44:17.280 | and like that ended up mattering.
00:44:19.660 | It was like he described himself as an effective altruist
00:44:22.880 | and he used that part of the brand to lift himself up.
00:44:25.600 | I guess that's why it's getting so much scrutiny.
00:44:28.500 | I think the merits of it should speak for themselves.
00:44:31.600 | I mean, I don't personally,
00:44:32.760 | I'm not personally an effective altruist.
00:44:35.360 | I personally am motivated by giving in part emotionally
00:44:40.200 | and for some reason that I can't exactly describe,
00:44:43.860 | I think that's somewhat important.
00:44:45.760 | I don't think you should detach giving
00:44:47.900 | from some personal connection.
00:44:49.860 | I find trouble with that and like I said,
00:44:53.780 | it's for reasons I can't describe
00:44:54.880 | because effective altruism is sort of the most logical
00:44:58.160 | ivory tower position you could possibly take.
00:45:00.640 | It's like strip all humanity away from giving,
00:45:03.320 | let's treat it like a business
00:45:04.760 | and how many people can we serve
00:45:06.280 | through the McDonald's line of charity
00:45:08.460 | for like the dollar, right?
00:45:09.940 | I just personally don't resonate with that,
00:45:13.340 | but I don't think the entire movement
00:45:15.940 | is like indicted because of it.
00:45:18.720 | Typically most people who care about giving and charity
00:45:24.440 | on the whole are nice people.
00:45:26.760 | So I can't speak for the whole movement.
00:45:29.120 | I certainly don't think SBF indicts the whole movement,
00:45:31.940 | even though I personally don't subscribe to it.
00:45:33.840 | - Yeah, it made me pause, reflect and step back
00:45:37.800 | about the movement and about anything
00:45:41.480 | that has a strong ideology.
00:45:43.000 | So if there's anything in your life
00:45:45.360 | that has a strong set of ideas behind it, be careful.
00:45:50.160 | - Yeah, I mean, look, I kind of feel like
00:45:53.140 | what it teaches me and what I kind of think about
00:45:57.440 | when I think about systems
00:45:59.320 | is that no system saves you from the individual.
00:46:02.480 | No system saves you from the individual,
00:46:05.160 | their intentions, their lust for power or greed.
00:46:10.120 | I mean, I think one of the great ideas
00:46:13.640 | is the decentralization of power.
00:46:16.760 | And like, this is why I think democracies are so great
00:46:20.560 | is 'cause they decentralize power
00:46:23.540 | across a wide range of like interests and groups
00:46:28.140 | and that being an effective way
00:46:29.740 | to kind of try as best as you can
00:46:32.600 | to spread out the impact of one individual
00:46:35.500 | because one bad individual can do a lot of harm,
00:46:39.340 | as I mean, clearly as seen here.
00:46:41.980 | But no, I don't think it has anything to do with ideology
00:46:44.900 | because it's not like being an effective altruist
00:46:47.460 | made Sam Bankman-Fried a fraud.
00:46:49.720 | He was a fraud who happened to be an effective altruist.
00:46:52.500 | That makes sense.
00:46:53.540 | - So there is something about, yes,
00:46:54.980 | no system protects you from an individual,
00:46:56.780 | but some systems enable or serve as better catalysts
00:47:01.260 | than others for the worst aspects of human nature.
00:47:06.020 | So for example, communist ideology,
00:47:08.500 | I don't know if it's the ideology
00:47:10.000 | or its implementation in the 20th century.
00:47:13.960 | It seemed like such a sexy and powerful and viral ideology
00:47:18.540 | that it somehow allows the evil bad people
00:47:21.400 | to like sneak into the very top.
00:47:23.760 | And so like, that's what I mean about
00:47:25.760 | certain ideas sound so nice that allow you,
00:47:30.120 | like the lower classes, the workers,
00:47:32.560 | the people that do all the work, they should have power.
00:47:35.280 | They have been screwed over for far too long.
00:47:37.820 | They need to take power back.
00:47:39.320 | That sounds like a really powerful idea.
00:47:41.740 | And then it just seems like with those powerful ideas,
00:47:45.140 | evil people sneak in to the top
00:47:47.320 | and start to abuse that power.
00:47:48.600 | - Yeah, I think, I mean, I don't have a lot of
00:47:52.020 | probably big brain political takes,
00:47:54.320 | but what I can say is that you can never get away
00:47:56.900 | from both the system and the individual mattering.
00:47:59.960 | For sure, some systems incentivize some behaviors
00:48:03.280 | in certain ways, but some people will take that and go,
00:48:06.000 | "Okay, all we need to do is design the perfect system
00:48:08.120 | "and then these individuals will act completely rationally
00:48:11.260 | "or responsibly in accordance to what our incentives say."
00:48:14.220 | That's not true.
00:48:15.440 | You could also say, "All we have to do
00:48:17.400 | "is focus on the individual,
00:48:18.860 | "and all we have to do is just create a society
00:48:21.020 | "which raises very well-adjusted people,
00:48:23.240 | "and then we can throw them into any system
00:48:24.900 | "with any incentives, and they will act
00:48:27.320 | "responsibly, ethically, morally."
00:48:28.880 | And I also don't think that's true.
00:48:30.400 | So incentives are real, but also the individual
00:48:34.360 | ultimately plays a large role too.
00:48:35.920 | So yeah, I don't know.
00:48:37.200 | I come down sort of in the middle there.
00:48:39.080 | - And some of that is just accidents of history too.
00:48:43.040 | Which individual finds which system, you know--
00:48:46.280 | - You become the face of that.
00:48:47.920 | - Yeah, with FTX versus Coinbase versus Binance,
00:48:52.280 | which individual, which kinds of ideas
00:48:55.800 | and life story come to power.
00:48:59.800 | That matters.
00:49:00.800 | It's kind of fascinating that history turns
00:49:02.920 | on these small little events
00:49:04.880 | done by small little individuals.
00:49:07.820 | That Hitler's a failed artist, or you have FDR,
00:49:12.040 | or you have all these different characters
00:49:14.640 | that do good or do evil onto the world.
00:49:16.800 | And it's single individuals, and they have a life story,
00:49:19.760 | and it could have turned out completely different.
00:49:22.240 | I mean, it's the flap of the butterfly wings.
00:49:24.600 | So yeah, you're right.
00:49:25.840 | We should be skeptical of attributing too much
00:49:29.400 | to the system or the individual.
00:49:31.260 | It's all like a beautiful mess.
00:49:32.860 | - The Lex Friedman Podcast.
00:49:36.320 | That was like a Lex line.
00:49:39.200 | I've heard quite a few episodes,
00:49:41.120 | and that's like such a Lex line.
00:49:42.360 | It's a beautiful mess.
00:49:43.960 | Beautifully said.
00:49:44.800 | - All right.
00:49:45.620 | - Sorry, I'm a fan of the show!
00:49:46.560 | - Okay, all right, I love you too.
00:49:48.880 | - All right.
00:49:50.240 | - Can you think of possible trajectories
00:49:52.000 | how this FTX, SBF saga ends?
00:49:57.000 | And which one do you hope for?
00:50:00.360 | Do you hope that SBF goes to jail?
00:50:03.240 | That's the individual.
00:50:04.520 | And in terms of the investors and the customers,
00:50:07.920 | what do you hope happens?
00:50:08.920 | And what do you think are the possible things
00:50:10.320 | that can happen?
00:50:11.160 | - So A, yeah, I definitely think SBF should go to jail.
00:50:15.700 | For nothing else, for a semblance of justice,
00:50:20.240 | the facsimile of justice to occur for all the investors.
00:50:23.460 | I also think there are people probably
00:50:26.880 | several steps down the chain that probably knew,
00:50:30.080 | at least Caroline Ellison.
00:50:32.300 | You can have questions about sort of their,
00:50:34.240 | you know, Dan Friedberg,
00:50:35.600 | who I'd love to talk about as well.
00:50:38.040 | There are a lot of people in that room
00:50:39.800 | who I think knew.
00:50:40.640 | I think we do so much of like,
00:50:43.160 | the one guy is all to blame,
00:50:44.840 | let's throw everything at him.
00:50:46.880 | When clearly this was a company wide issue.
00:50:51.020 | So everyone who knew, I think should face
00:50:54.740 | the same punishment, which I think should be jail
00:50:56.720 | for all of those people.
00:50:58.160 | - In part to send a signal to anybody
00:51:01.800 | that tries this kind of stuff in the future.
00:51:04.040 | - Yeah, absolutely.
00:51:05.120 | I mean, one of the big things that you saw,
00:51:07.640 | like, okay, take a microcosm of all of this action
00:51:11.840 | and just look at like the influencer space, right?
00:51:14.760 | There's a ton of deals that were done
00:51:16.640 | that I've covered ad nauseum about influencer finds out
00:51:21.320 | they can make a lot of money selling a crypto coin.
00:51:24.140 | The first thing they wonder is, am I gonna get caught?
00:51:27.640 | If I do this, is there a consequence?
00:51:30.640 | And if the answer is no,
00:51:32.360 | then it's a pretty easy decision
00:51:34.300 | as long as you don't like have any moral scruples about it,
00:51:36.520 | which apparently none of them did,
00:51:37.760 | or a lot of them didn't, I should say.
00:51:40.280 | So as soon as somebody steps in and regulates,
00:51:44.060 | that math changes.
00:51:45.960 | And all of a sudden there's a self-interest reason
00:51:47.580 | to not go do the bad thing.
00:51:49.400 | So for example, and I can give a concrete example of this.
00:51:52.240 | There was a NFT, the first ever NFT
00:51:56.320 | sort of like official indictment
00:52:00.040 | or the DOJ released this press release
00:52:02.440 | that they're charging these guys who ran a NFT project
00:52:05.720 | that they didn't follow through on their promises.
00:52:07.320 | They made all these promises, lied,
00:52:08.720 | and then ran away with the money.
00:52:10.400 | First ever consequence for anyone in the NFT space.
00:52:13.420 | That day that that press release came out,
00:52:15.480 | I saw several NFT projects come back to life from the dead.
00:52:21.620 | Because all those founders are freaking out
00:52:24.200 | and they realized we scammed people.
00:52:26.080 | We have to go at least make it look like
00:52:27.920 | we're doing the right thing, right?
00:52:29.000 | Even just, so that's on the optic side,
00:52:30.720 | but there's also tons of people who now go,
00:52:32.960 | oh, basically law enforcement is on the scene.
00:52:36.600 | We can't do the same thing.
00:52:37.560 | So there is a very pragmatic reason for this punishment.
00:52:42.560 | It's very much just because people work it into their math
00:52:45.480 | of should I commit fraud?
00:52:47.420 | And the last several years have been very,
00:52:49.920 | sort of has been like a little bit of a nihilistic landscape
00:52:55.800 | where no one was getting punished.
00:52:57.360 | And so there's this question of you're almost an idiot
00:52:59.680 | if you didn't take the deals.
00:53:01.680 | And so I think it's really important,
00:53:03.040 | extremely important for kind of law enforcement
00:53:07.400 | to play a role, regulation to play a role,
00:53:09.920 | to make it harder to commit those crimes.
00:53:11.360 | And if you commit those crimes,
00:53:12.880 | there's actual real world punishment for it.
00:53:15.400 | To your point about like what's gonna happen
00:53:17.800 | to the investors, I think that was kind of your question.
00:53:21.120 | It's tough because if the money's not there,
00:53:23.560 | the money's not there.
00:53:24.400 | I mean, there's gonna be the guy,
00:53:26.000 | they got the best in class guy.
00:53:27.760 | It's the guy who ran the dissolving of Enron.
00:53:30.360 | So I mean, I can't imagine someone better equipped
00:53:33.000 | to run a complicated corporate fraud like dissolution.
00:53:36.760 | But yeah, it's tough 'cause everyone's gonna get probably,
00:53:41.040 | I don't know, 10 cents on the dollar, maybe less.
00:53:44.160 | - I wonder if there's a way
00:53:45.000 | to do a progressive redistribution of funds.
00:53:47.920 | So I'm just really worried about the pain
00:53:51.080 | that small investors feel.
00:53:52.560 | - Yeah, I think there's a lot of thought around that.
00:53:57.960 | I forget if they actually do do this.
00:54:02.040 | I mean, I know there's a lot of law
00:54:03.160 | about like you can't treat creditors differently.
00:54:06.680 | You have to treat them all the same.
00:54:08.240 | So I think it'll be some kind of proportional payback.
00:54:12.360 | It's certainly not gonna be that the guys at the top
00:54:14.800 | get a significant amount of their money back
00:54:17.160 | and the rest get nothing.
00:54:18.880 | Unfortunately, I think there's such a small amount
00:54:22.760 | of assets that back this whole thing in the end.
00:54:25.160 | And that value is actually declining every day
00:54:28.040 | because it was inextricably tied to SBF.
00:54:31.400 | It was like the FTT tokens, which now what are those worth?
00:54:35.320 | The serum tokens, that was his project
00:54:37.300 | or the project they made.
00:54:38.640 | What is that worth?
00:54:39.480 | Basically nothing.
00:54:40.320 | So it's a hard situation.
00:54:44.280 | And there's a bigger ethical concern, which is FTX US,
00:54:49.280 | it's unclear how backed it was,
00:54:51.200 | but it was clearly more backed than FTX International.
00:54:55.080 | Do you take all that money and throw it into a big pot
00:54:57.960 | and give people money back?
00:54:59.520 | Or do you give the US people back their amount of money,
00:55:03.840 | which is probably gonna be significantly more
00:55:05.720 | and leave everyone internationally out in the cold.
00:55:08.400 | And to add to that ethical issue,
00:55:10.660 | let's say you're a liquidator and you're US based.
00:55:14.720 | There's a tremendous question, like legal questions
00:55:18.360 | about how do you ethically do that?
00:55:20.480 | It's not clear.
00:55:22.000 | There's a tremendous incentive
00:55:23.520 | to just favor the US people over everyone else
00:55:26.280 | 'cause it's our country, America, whatever.
00:55:28.400 | But I don't know if that's necessarily fair.
00:55:30.820 | It's really hard.
00:55:31.860 | It's like, it's impossible.
00:55:32.960 | - And some, I forget where you said this,
00:55:35.100 | but one of the, I mean, it probably permeates
00:55:39.120 | a lot of the investigations you do,
00:55:41.680 | which is this idea that it's really sad
00:55:44.200 | that the middle class in most situations like this
00:55:47.080 | get fucked over.
00:55:48.400 | So the IRS go after the middle class,
00:55:52.880 | they don't go after the rich.
00:55:54.040 | It's basically everyone who doesn't have a lot of leverage
00:55:57.760 | in terms of lawyers, money, get fucked over.
00:56:02.520 | - Yes, and then they're the ones,
00:56:05.840 | like it's always the rich and powerful
00:56:07.440 | who get the favorable treatment.
00:56:08.800 | As like a microcosm of this, it's a funny story.
00:56:11.200 | So one of the big criticisms of crypto,
00:56:14.960 | and I think rightly so,
00:56:16.400 | is the irreversibility of the transaction.
00:56:18.520 | So if I accidentally send a transaction somewhere,
00:56:21.880 | it's gone, right?
00:56:23.240 | So crypto.com accidentally sent a lady $10 million,
00:56:26.880 | and now they want the money back, and they're suing her.
00:56:30.760 | But the funny thing is, is if you are on crypto.com
00:56:33.880 | and you send, let's say I accidentally send you money
00:56:37.120 | and I come knocking on your door,
00:56:38.280 | Lex, I didn't mean to send you like $1,000,
00:56:42.480 | I need my money back.
00:56:44.440 | Or if I go to crypto.com and I said,
00:56:45.800 | hey, I sent that to the wrong person, can you reverse it?
00:56:47.480 | They'll say, screw off, no way.
00:56:49.880 | If I go to court, they'll kill me in court
00:56:52.080 | 'cause they're gonna go, look,
00:56:52.920 | this is how the blockchain works.
00:56:54.120 | But then they do it, they do the exact same thing.
00:56:56.080 | They send this lady $10 million,
00:56:57.720 | they're suing her and they're gonna win.
00:56:58.960 | Now what's in court is not whether they get the money back,
00:57:02.000 | it's should she be liable for theft, I believe.
00:57:04.960 | So, and that's just another case of
00:57:08.440 | the same rules apply differently to different people,
00:57:12.080 | whether you have the money to back you or not.
00:57:14.200 | It's a very sad thing.
00:57:15.600 | And that's why I think people like,
00:57:17.200 | you need journalists fighting for the little person.
00:57:22.200 | We really need it.
00:57:25.040 | And it's kind of like this unfortunate thing
00:57:26.560 | where that's the most risky thing to do, like legally.
00:57:29.520 | You should not be doing that,
00:57:31.000 | but I think it's important to do.
00:57:33.600 | - It's the ethical thing, it's the right thing to do.
00:57:36.880 | What do you think about influencers and celebrities
00:57:39.440 | that supported FTX and SBF?
00:57:41.320 | Do they, should they be punished?
00:57:43.360 | - Yeah, I think they should take a huge reputational hit.
00:57:46.200 | I mean, I think they should be embarrassed.
00:57:48.640 | I think they should be ashamed of themselves.
00:57:51.680 | - But it was really hard to know, sorry to interrupt,
00:57:54.280 | for them to know, like for example,
00:57:56.440 | I think about this a lot.
00:57:58.120 | It's like, who do I, because I don't investigate,
00:58:03.120 | like sponsored by Athletic Greens, okay?
00:58:07.000 | It's a nutritional drink.
00:58:08.840 | Should I investigate them deeply?
00:58:10.920 | I don't know.
00:58:12.000 | You just kind of use reputational,
00:58:13.640 | like it seems to work for me.
00:58:15.080 | Should I like investigate them deeply?
00:58:19.040 | - I think your credibility hit will depend
00:58:21.840 | on what domain you're an expert in.
00:58:23.880 | If you're sponsored by a robotics company
00:58:26.520 | and you're an expert in robotics,
00:58:28.360 | if that company turns out to be a disaster and a fraud,
00:58:31.400 | then you should have looked more deeply.
00:58:32.960 | We're talking mostly about,
00:58:34.720 | like I hold Tom Brady a lot less accountable
00:58:37.400 | than financial advisors, financial influencers,
00:58:40.200 | because that is their world of expertise.
00:58:42.400 | And you treat their recommendation differently,
00:58:46.080 | proportionally to what you think their expertise is.
00:58:48.880 | So in some ways, I don't actually think,
00:58:50.760 | Tom Brady, I'm sure he reached a lot of people.
00:58:52.600 | I personally didn't feel at all moved by his recommendation
00:58:55.360 | 'cause you know it's just money.
00:58:57.100 | But when you hear somebody who should be an expert
00:58:58.960 | in that thing, endorse a product in that space.
00:59:03.160 | You hold that opinion to a higher standard.
00:59:06.080 | And when they're completely cataclysmically wrong,
00:59:10.320 | it's gonna be a different level of accountability.
00:59:12.440 | And I think rightfully so.
00:59:13.600 | When Jim Cramer was saying Bear Stearns is fine,
00:59:17.600 | he made that terrible call with Bear Stearns in 2008.
00:59:20.680 | He was rightfully reamed for all of that.
00:59:24.480 | Even though it could be considered that like,
00:59:26.720 | well, did he have all the information?
00:59:29.080 | Maybe not.
00:59:29.960 | But he's a financial advisor.
00:59:31.780 | He does this for a living.
00:59:33.760 | If you go on and you make a big call
00:59:35.960 | and you turn out to be wrong and people lose tons of money,
00:59:39.320 | you are going to take a hit and I think rightfully so.
00:59:41.400 | But no, I don't think these people should go to jail
00:59:43.220 | or anything like that.
00:59:44.060 | - No, but it's such a complicated thing.
00:59:45.320 | I mean, I just feel it personally myself.
00:59:47.040 | I get it, but you still feel the burden of the fact
00:59:50.980 | that your opinion has influence.
00:59:53.800 | I know it shouldn't.
00:59:55.080 | I know Tom Brady's opinion on financial investment
00:59:57.320 | should not have influence, but it does.
00:59:59.920 | That's just the reality of it.
01:00:01.480 | That's a real burden.
01:00:02.900 | I didn't know anything about SBF or FTX.
01:00:05.840 | It wasn't on my radar at all.
01:00:08.560 | But I could have seen myself taking them on as a sponsor.
01:00:12.000 | I've seen a lot of people I respect,
01:00:14.360 | Sam Harris and others, talk with SBF
01:00:17.880 | like he's doing good for the world.
01:00:20.800 | So I could see myself being hoodwinked
01:00:22.520 | having not done research.
01:00:24.360 | And the same thing, it makes me wonder.
01:00:26.800 | I don't want to become cynical, man,
01:00:29.400 | but it makes you wonder who are the people in your life
01:00:31.560 | you trust that are like, that could be the next SBF
01:00:35.880 | or worse, big, powerful leaders, Hitler
01:00:39.680 | and all that kind of stuff.
01:00:41.040 | To what degree do you want to investigate?
01:00:44.840 | Do you want to hold their feet to the fire,
01:00:48.080 | see through their bullshit, call them on their bullshit?
01:00:50.240 | And also as a friend, if you happen to be friends
01:00:52.840 | or have a connection, how to help them
01:00:54.700 | not slip into the land of fraud.
01:00:58.440 | I don't know, all of that is just overwhelming.
01:01:00.840 | - Yeah, I mean, we should be clear.
01:01:03.000 | Finance is sort of a special space
01:01:06.000 | where you're talking about people's money.
01:01:09.800 | You're not talking about whether someone
01:01:11.280 | takes a bad supplement or like a supplement
01:01:13.300 | that is just, they're $50 out.
01:01:15.620 | I think the scale of harm and therefore responsibility
01:01:20.480 | escalates depending on what field you're in.
01:01:22.560 | Just like I wouldn't hold Tom Brady as,
01:01:25.880 | like if he gives a bad football opinion,
01:01:27.860 | and he should have known better,
01:01:29.440 | that is a different scale of harm
01:01:31.280 | than a doctor giving bad advice, right?
01:01:33.480 | Like he tells you a pill works
01:01:35.920 | and the pill kills you or something like that.
01:01:38.280 | There's just different levels of accountability
01:01:40.200 | depending on the field you're in,
01:01:41.120 | and you have to be aware of it.
01:01:42.440 | Finance is an extreme, you have to be extremely conservative
01:01:45.900 | if you're gonna give financial advice
01:01:47.680 | because you're playing with people's lives
01:01:49.420 | and you cannot play with them haphazardly.
01:01:51.920 | You cannot gamble with them.
01:01:53.280 | You cannot play with them on a bet
01:01:54.600 | 'cause you're getting paid a lot of money.
01:01:56.520 | It's just the nature of the space.
01:01:58.800 | And so with the space comes the responsibility
01:02:02.040 | and the accountability.
01:02:02.920 | And I don't think you can get around that.
01:02:04.560 | - Who was Dan Friedberg that you mentioned?
01:02:07.160 | Some of these figures in the SBF realm
01:02:09.440 | that are interesting to you.
01:02:10.680 | - Super interesting kind of subject
01:02:12.620 | because Dan Friedberg is the former general counsel
01:02:16.520 | for Ultimate Bet.
01:02:17.840 | Ultimate Bet was a poker site
01:02:22.360 | where famously they got in a scandal
01:02:24.280 | because the owner, Russ Hamilton,
01:02:27.760 | was cheating with a little software piece of code
01:02:30.840 | they called God Mode.
01:02:32.080 | God Mode allowed you to see the hand of the guy across from you.
01:02:35.960 | Obviously you can imagine you can win pretty consistently
01:02:38.300 | if you know exactly what your opponent has.
01:02:40.680 | Very unethical.
01:02:42.520 | They, I should be clear that for some inexplicable reason,
01:02:46.360 | I don't think they were ever charged
01:02:48.040 | and convicted of a crime,
01:02:49.580 | but they were investigated by a gambling commission
01:02:51.880 | that found they made tens of millions of dollars this way.
01:02:54.160 | For sure.
01:02:55.520 | And Dan Friedberg is the general counsel.
01:02:58.320 | He's caught on a call,
01:02:59.680 | basically conspiring with Russ to hide this fraud.
01:03:04.040 | He's saying we should blame it on a consultant third party.
01:03:07.600 | And Russ Hamilton famously says,
01:03:09.680 | "It was me, I did it.
01:03:10.840 | I don't wanna give the money back.
01:03:12.680 | Find basically a way to get rid of this."
01:03:14.580 | So that's Dan Friedberg's big achievement.
01:03:16.840 | That's what he's known for, he's most known for.
01:03:19.400 | And this is the guy they pick
01:03:20.840 | as their chief regulatory officer for FTX.
01:03:24.600 | Why do you hire somebody who, I get it,
01:03:27.760 | not formally charged and convicted, investigated,
01:03:30.320 | there's all the, and there's tape out there.
01:03:31.920 | So I wanna be clear about
01:03:33.240 | what's actually available evidence.
01:03:35.860 | But someone who's seemingly only achievement
01:03:39.040 | is hiding fraud, why do you hire that guy
01:03:42.260 | if the intention is not to hide fraud?
01:03:46.080 | So this is a question I put to Sam Bankman-Fried
01:03:49.400 | and his answer was, "Well, we have a lot of lawyers."
01:03:53.120 | And I said, "Well, it's your chief regulatory officer."
01:03:55.200 | He's like, "Well, it wasn't, we did regulate a lot."
01:03:57.640 | And it was just this big dance of,
01:03:59.720 | basically he's done great work, he's a great guy.
01:04:04.000 | And I think that tells you everything you need to know.
01:04:07.320 | - And there's figures like that probably
01:04:09.120 | even at the lower levels,
01:04:10.260 | like just infiltrate the entire organization.
01:04:12.360 | - Well, it's just like, yeah, why wasn't there a CFO?
01:04:14.920 | Why wasn't there anyone in that space
01:04:17.320 | who could seemingly be the eyes that goes,
01:04:20.960 | holy whatever, we're in dangerous territory here, right?
01:04:25.960 | So yeah, it seems very deliberate.
01:04:28.880 | I mean, I talked to one FTX employee
01:04:30.280 | that they talked about, who's told me they talked about
01:04:33.600 | taking, I think it was taking FTX US public.
01:04:37.360 | And Sam was very against the idea.
01:04:40.360 | And the employee in retrospect speculated
01:04:44.000 | that it might've been because you'd faced so much scrutiny.
01:04:46.640 | Like regulation-wise, like you'd have to go through a lot,
01:04:50.040 | like more thorough audits, all that kind of stuff
01:04:52.320 | that basically he knew they would never pass.
01:04:54.560 | So yeah, I mean, it's red flags all the way down
01:04:58.160 | with that guy.
01:04:59.480 | - And you hope all of them get punished.
01:05:02.720 | - Everyone who knew.
01:05:03.680 | I mean, I think for sure there are people at FTX
01:05:06.120 | who didn't know.
01:05:07.080 | I think there are some people at Alameda who didn't know.
01:05:09.880 | - There's degrees, sorry to interrupt,
01:05:12.200 | but there's degrees of not knowing.
01:05:13.520 | - Yes.
01:05:14.360 | - There's a looking away when you kind of know shady stuff.
01:05:19.000 | That's still the same as knowing, right?
01:05:20.920 | That's might be even worse.
01:05:22.240 | - Well, yeah, like I was talking to one insider
01:05:24.760 | and we were talking about the insider trading.
01:05:26.600 | They were telling me about this insider trading.
01:05:28.840 | And I said, do you think this was criminal?
01:05:32.920 | And they said, it was probably criminal in hindsight, yes.
01:05:36.440 | And the question is, someone who answers a question
01:05:39.280 | like that, what does that like mean?
01:05:41.800 | You know, like it was probably criminal.
01:05:43.600 | So you're right, there are different degrees.
01:05:46.520 | I mean, I'll say at the most basic,
01:05:48.940 | I would be very happy if everyone
01:05:50.400 | who had direct knowledge went to jail,
01:05:52.720 | which I don't think will happen to be clear.
01:05:54.280 | I think a lot of people are gonna cut deals.
01:05:56.040 | Prosecutors are gonna cut deals
01:05:57.520 | so they actually nail Sam Bankman-Fried.
01:05:59.520 | I think that's their only focus.
01:06:01.160 | - What about his reputation?
01:06:02.360 | What do you think about all these interviews?
01:06:04.560 | Do you think they are helping him?
01:06:07.680 | Do you think they're good for the world?
01:06:09.560 | Do you think they're bad for the world?
01:06:10.920 | Like, what's your sense?
01:06:12.320 | And like, say you get a sit down interview
01:06:15.480 | with him for three hours,
01:06:18.120 | and I'm holding the door closed.
01:06:19.780 | Is that a useful conversation or not?
01:06:23.140 | Or at this point, it should be legal.
01:06:25.520 | And that's it.
01:06:27.700 | - I think it's useful.
01:06:29.020 | I mean, I think it's all about how you interview him.
01:06:32.260 | You can interview someone responsibly,
01:06:33.940 | you can interview him irresponsibly.
01:06:36.580 | I think we've seen examples of both.
01:06:39.780 | - What's an irresponsible?
01:06:41.060 | I keep interrupting you rudely.
01:06:42.260 | - That's okay, no, no, no.
01:06:43.300 | - It's unacceptable.
01:06:44.140 | - No, no, no, I think it's fine.
01:06:45.900 | There was like a New York Times interview,
01:06:47.260 | which spends any amount of time talking about his sleep.
01:06:51.500 | And he's like, "Yeah, I'm sleeping great."
01:06:52.980 | I mean, I think that's so deeply disrespectful
01:06:55.320 | to the victims.
01:06:56.300 | And especially when you're not even releasing
01:06:58.480 | an interview live, it's like,
01:06:59.940 | you have time to triage what you're gonna talk about.
01:07:02.400 | Why would you spend any amount of time
01:07:03.840 | talking about the sleep that a fraudster is getting?
01:07:08.000 | It's just so weird.
01:07:09.760 | - Well, let's do it.
01:07:10.600 | Can I steal a man in that case?
01:07:12.240 | I don't think it turned out well.
01:07:14.520 | - I think that's true.
01:07:15.480 | I think, okay, here's the thing.
01:07:17.180 | I could see myself talking about somebody's sleep
01:07:20.920 | or getting in somebody's mind
01:07:23.080 | if I knew I have unlimited time with them.
01:07:25.720 | If I knew I had like four hours.
01:07:27.280 | 'Cause you get into the mind of the person,
01:07:28.760 | how they think, how they see the world.
01:07:31.080 | Because I think that ultimately reveals,
01:07:34.080 | if they're actually really good at lying,
01:07:36.120 | it reveals the depths, the complexity of the mind
01:07:40.820 | that through like osmosis, you get to understand
01:07:43.000 | like this person is not as trivial as you realize.
01:07:47.440 | Also, it makes you maybe realize that this person
01:07:50.160 | has a lot of hope, has a lot of positive ambition
01:07:53.760 | that has developed over their life.
01:07:57.120 | And then certain interesting ways, things went wrong.
01:08:00.120 | - Yes.
01:08:00.940 | - They become corrupt and all that kind of stuff.
01:08:02.600 | - That's all fine.
01:08:04.200 | But this conversation was not properly contextualized
01:08:08.280 | in the world of what he did.
01:08:12.920 | I've asked about this interview
01:08:14.360 | 'cause I was like so curious.
01:08:15.200 | It was out of the New York Times.
01:08:17.120 | And there was not much mention of fraud or jail
01:08:19.720 | or the big crimes like misappropriation
01:08:22.280 | of even client assets.
01:08:24.000 | It was just sort of this, Sam sat down with me,
01:08:27.440 | he's under investigation, but there's not much specifics.
01:08:30.360 | And then it's like, yeah, he's playing storybook brawl,
01:08:33.120 | he's sleeping, and it's just like,
01:08:35.160 | okay, this isn't adding to the conversation.
01:08:37.920 | - Especially when the New York Times,
01:08:39.080 | it's like you should be grilling.
01:08:41.360 | - Right, right, exactly.
01:08:42.560 | So, but as I said, it's all range, the gamut.
01:08:46.080 | And some interviews, some of it's okay,
01:08:48.280 | and then some of it's weird.
01:08:49.640 | The Andrew Sorkin interview,
01:08:51.520 | he asked some hard-hitting questions,
01:08:52.940 | which I really appreciated.
01:08:54.680 | And then at the end, he goes,
01:08:55.680 | "Ladies and gentlemen, Sam Bankman-Fried,"
01:08:57.160 | and everyone gives an ovation for Sam.
01:09:01.760 | I mean, the steel man of that, of course,
01:09:03.200 | is they're actually applauding Andrew Sorkin.
01:09:06.200 | But the way you lay it up,
01:09:07.660 | I wouldn't go like, ladies and gentlemen,
01:09:09.640 | it's like an applause line.
01:09:10.480 | It's like, ladies and gentlemen, the Eagles,
01:09:11.760 | Elton John, Lex Friedman.
01:09:13.680 | And so to go, so you have this like deal book summit
01:09:17.500 | where you have all these important figures
01:09:19.440 | that are positively important.
01:09:21.360 | And at the end, you have Sam Bankman-Fried, a fraudster,
01:09:24.040 | and you go, "Ladies and gentlemen, Sam Bankman-Fried,"
01:09:25.480 | everyone's applauding.
01:09:26.760 | That I think is a net, like I think that's a negative.
01:09:29.180 | I think the way that the optics of that just were all wrong.
01:09:32.920 | And so I think, yeah, you have to be very responsible.
01:09:36.300 | I think it's useful,
01:09:37.480 | going back to how you can usefully do this,
01:09:40.360 | you can, even when somebody's determined to lie to you,
01:09:43.560 | it's always important to pin them down
01:09:48.520 | to an accounting of events,
01:09:50.480 | because that is unimaginably helpful
01:09:52.900 | when it comes to a prosecutor
01:09:54.160 | trying to prove this guy's guilty,
01:09:55.880 | is if you say you didn't do a crime,
01:09:59.080 | but you don't tell me any details about it,
01:10:01.500 | day of the trial,
01:10:03.400 | you can basically make up any story, right?
01:10:05.940 | But if you tell me in detail where you were that day,
01:10:09.140 | I can go hunt down, you say you were with Joe,
01:10:11.100 | I go hunt down Joe and he says he wasn't with you,
01:10:12.980 | boom, you've lost credibility,
01:10:14.140 | and now you're much more likely to be convicted.
01:10:18.580 | So it's really important to get SBF's exact accounting
01:10:22.620 | of how things went wrong,
01:10:24.100 | because right now he's positioning himself
01:10:26.540 | to throw his Alameda CEO, Caroline Ellison, under the bus.
01:10:30.300 | Like she did everything, she knew everything,
01:10:32.460 | I knew nothing.
01:10:33.860 | Well, if Caroline Ellison's gonna take the stand and go,
01:10:35.620 | "Well, I have all these text messages,
01:10:37.020 | "and this is all a lie,
01:10:37.860 | "then Sam Bankman-Fried is gonna be completely ruined,
01:10:42.860 | "self-ruined by his own design."
01:10:44.580 | So I think it's important.
01:10:45.420 | - So more like a legal type of,
01:10:47.740 | like get the details of where he was,
01:10:49.580 | what he was thinking, what the--
01:10:50.780 | - I think it's like, yeah,
01:10:51.700 | I think the public probably cares
01:10:54.060 | to get to know what happened to,
01:10:55.580 | and again, I think if you're careful,
01:10:59.460 | you can expose someone as they lie to you
01:11:02.980 | without giving into those lies, right?
01:11:05.420 | Like without capitulating to,
01:11:07.940 | "Oh, I'm just gonna assume you're correct."
01:11:10.140 | I think you can point to,
01:11:11.520 | "Well, Lex, you say it happened this way,
01:11:13.260 | "but you've lied about X, Y, and Z,
01:11:14.920 | "why should we believe you?"
01:11:16.260 | That's a suddenly a totally different conversation
01:11:18.460 | than just being like, "Oh, okay, that's how it happened."
01:11:20.740 | - The thing I caught that bothered me,
01:11:24.620 | and the thing I hope to do in interviews
01:11:26.740 | if I eventually get good at this thing,
01:11:32.020 | is the human aspect of it,
01:11:34.420 | which I think you have to do in person,
01:11:36.820 | is he seems a bit nonchalant about the pain
01:11:39.100 | and the suffering of people.
01:11:41.060 | I have red flags about,
01:11:43.500 | in the way he communicates about the loss of money,
01:11:48.500 | like the pain that people are feeling about the money,
01:11:53.100 | I get red flags.
01:11:54.420 | Forget if you're involved in that pain or not,
01:11:57.980 | you're not feeling that pain.
01:12:01.220 | - Well, he'll say he is,
01:12:02.840 | but he'll be playing a game of League of Legends
01:12:05.080 | while doing it.
01:12:05.920 | - No, but I just see it from his face,
01:12:07.320 | that the dynamic, and that needs to be grilled,
01:12:10.360 | like that little human dance there.
01:12:12.720 | I considered, I talked to him,
01:12:17.480 | I considered doing an in-person interview with him.
01:12:20.040 | - Are you still considering it?
01:12:25.840 | - I don't know, do you think I should in person?
01:12:30.880 | - I think it depends if you think you have anything
01:12:33.220 | to add to the conversation.
01:12:34.180 | A lot of people have already done.
01:12:35.020 | - Yeah, there's been already, you did an incredible job.
01:12:37.340 | - Thanks.
01:12:38.180 | I think--
01:12:40.500 | - I think I would like to grill the shit out of him
01:12:42.620 | as a fellow human, but not investigative,
01:12:44.860 | like Coffeezilla the investigator.
01:12:46.180 | - Yeah, yeah, yeah.
01:12:47.020 | - Like another human being,
01:12:49.780 | another human being who I can have compassion for,
01:12:52.460 | who has caused a lot of suffering in the world.
01:12:54.880 | Like that, that grilling,
01:12:57.680 | like basically convey the anger that people,
01:13:01.880 | and the pain that people are feeling, right?
01:13:04.440 | Like that.
01:13:05.320 | - Yeah, I think it'd be really hard.
01:13:09.520 | I mean, like that guy is sort of a master dancer,
01:13:12.780 | and what he would say at the end of it,
01:13:14.320 | 'cause I've listened to so many interviews of him,
01:13:17.840 | I probably am like a GPT model for Sam.
01:13:20.720 | I think he would do some kind of thing about like,
01:13:24.060 | yeah, I really hear you,
01:13:26.560 | and it's just terrible.
01:13:28.500 | I feel such an obligation to the people who've lost money,
01:13:30.620 | and it's just, it's a lot of money.
01:13:32.380 | It's a lot of money.
01:13:33.380 | You know, he'd do something like that,
01:13:34.760 | and it would be very superficially like, okay,
01:13:37.980 | but when you drill down to the details of what he did,
01:13:42.340 | it's just impossible that he didn't know.
01:13:44.620 | And one of the things that I wish I had asked,
01:13:47.020 | maybe I can talk about like,
01:13:48.380 | I wish I had gone in on this.
01:13:49.820 | It's just so hard when you're doing a live interview
01:13:51.460 | to kind of focus on one thing.
01:13:53.420 | Everyone's asked about the terms of service.
01:13:55.180 | So in the terms of service, there was like,
01:13:57.980 | we can't touch your funds, your funds are safe,
01:13:59.700 | we're never gonna do anything with that.
01:14:00.820 | Anytime anyone brings that up, he says, oh,
01:14:03.060 | well, there's this other terms of service over here
01:14:05.460 | with margin trading accounts.
01:14:06.700 | Remember we talked about it, it's a derivatives platform.
01:14:09.100 | If you're in our derivative side,
01:14:11.220 | you're subject to different terms of service,
01:14:13.260 | which kind of lets us like move your money around
01:14:15.360 | with everyone else, okay?
01:14:17.140 | So we treat it as one big pool of funds.
01:14:19.140 | And that's sort of the explanation
01:14:20.140 | of how this all happened is,
01:14:21.700 | we had this huge leverage position,
01:14:23.420 | and we lost everything.
01:14:25.180 | But what no one has sort of done a good enough job
01:14:27.820 | getting to the heart of,
01:14:29.260 | is that this pool of funds never was segregated properly.
01:14:33.180 | It was all treated under the same umbrella
01:14:35.420 | of we can use your funds.
01:14:37.260 | There was no amount of, we have the client deposits,
01:14:40.400 | which were just deposited with us
01:14:41.940 | and not like used to margin trade or do anything over here.
01:14:45.900 | These funds over here, we have saved, they didn't.
01:14:48.900 | Fundamentally, they lied from the get go
01:14:51.560 | about how they were treating the most precious assets,
01:14:55.000 | which is your customer deposits
01:14:56.960 | that you said you didn't invest.
01:14:58.540 | Clearly you put them all over here, you YOLO gambled them.
01:15:01.960 | And then when everyone starts withdrawing from here,
01:15:05.460 | they don't have any money over here.
01:15:06.880 | So that is like one of the most fundamental things
01:15:09.720 | that I haven't seen anyone grill him on.
01:15:12.420 | And the next time if I get the chance to ambush him again,
01:15:15.640 | that's what I'm gonna drill down on
01:15:17.000 | because it's impossible for that not to be fraud.
01:15:21.200 | There's no world where you had a pool of funds over here
01:15:24.680 | and now you don't have them
01:15:26.080 | without you somehow borrowing over here.
01:15:30.120 | 'Cause if you deposited one Bitcoin
01:15:31.400 | and I never sold that Bitcoin and it's earmarked,
01:15:33.760 | Lex Friedman, and you come and it's not there,
01:15:36.520 | something had to happen, right?
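The earmarking argument here can be sketched in a few lines of code. This is a hypothetical illustration, not FTX's actual system; the `SegregatedCustodian` class and its names are invented for the example. The point is the invariant: if deposits are earmarked per customer and never pooled, lent, or traded, a withdrawal can only fail if someone moved the funds.

```python
# Hypothetical sketch of segregated custody (names invented for
# illustration; this is not FTX's code). Each deposit is earmarked
# to one customer and never pooled, lent, or traded.
class SegregatedCustodian:
    def __init__(self):
        self.buckets = {}  # customer -> amount held on deposit

    def deposit(self, customer, amount):
        self.buckets[customer] = self.buckets.get(customer, 0.0) + amount

    def withdraw(self, customer, amount):
        # Invariant: funds only leave a bucket through this method,
        # so "the money isn't there" implies someone moved it.
        if self.buckets.get(customer, 0.0) < amount:
            raise RuntimeError("funds missing: impossible without commingling")
        self.buckets[customer] -= amount
        return amount

custodian = SegregatedCustodian()
custodian.deposit("lex", 1.0)          # 1 BTC, earmarked
print(custodian.withdraw("lex", 1.0))  # always succeeds: 1.0
```

Commingling breaks the invariant: once every deposit sits in one shared pool that margin trading can draw down, nothing guarantees any individual customer's withdrawal can be honored.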
01:15:39.180 | - Well, so this is so interesting.
01:15:40.520 | So for me, the approach that,
01:15:45.320 | like you said, the most important question of,
01:15:49.040 | 'cause for you it's like,
01:15:50.340 | were those funds segregated?
01:15:51.820 | For me, the question is, as a human being,
01:15:57.120 | how would you feel if you were observing that?
01:15:59.760 | So like, you know, that like marshmallow test with the babies
01:16:03.960 | like it's the human thing,
01:16:05.360 | this is a human nature question.
01:16:06.760 | Like I can understand there's a pile of money and you,
01:16:11.400 | the good faith interpretation is like,
01:16:14.680 | well, I know what to do with that pile of money
01:16:16.520 | to grow that pile of money.
01:16:17.620 | Let me just take a little bit of that.
01:16:19.920 | Like, how willing are you to do that kind of thing?
01:16:23.080 | How able are you to do that kind of thing?
01:16:25.000 | And when shit goes wrong, what goes through your mind?
01:16:28.680 | How does it become corrupted?
01:16:30.000 | How do you begin to delude yourself?
01:16:31.960 | How do you delegate responsibility for the failures?
01:16:35.640 | Like, as opposed to getting facts,
01:16:38.280 | try to sneak into the human mind of a person
01:16:42.720 | when they're thinking of that.
01:16:43.760 | Because the facts, they're gonna start waffling.
01:16:47.720 | They're gonna start like trying to make sure
01:16:50.800 | they don't say anything that gets them incriminated.
01:16:53.680 | But I just, I want to understand the human being there
01:16:57.400 | because I think that indirectly gives you a sense of
01:16:59.640 | where were you in this big picture?
01:17:01.280 | - I think I've talked to so many people
01:17:03.520 | who have sort of committed some range of like outright fraud
01:17:07.680 | to like misleading marketing.
01:17:10.000 | No one thinks they're a bad person.
01:17:12.760 | Nobody admits that they did it and they knew they did,
01:17:16.560 | or almost nobody does.
01:17:18.440 | There's actually one funny exception.
01:17:19.840 | But I had a guy who admitted like,
01:17:23.040 | "Yeah, I did it, it was wrong.
01:17:24.680 | And, you know, but I did it and I wanted the money."
01:17:27.120 | Which was kind of like almost refreshing in its honesty.
01:17:30.120 | But the reason I focus on like the facts
01:17:33.880 | is because unless you find a bright red line,
01:17:38.000 | humans can rationalize anything.
01:17:40.000 | I can rationalize any level of like,
01:17:42.160 | "Well, I did this because I had the best of intentions."
01:17:44.200 | And if you play the intention game,
01:17:45.960 | you'll never convict anyone
01:17:47.280 | because everyone has good intentions.
01:17:48.640 | Everyone's honest, everyone's doing the best they can
01:17:51.160 | and got misled and got misguided and dah, dah, dah, dah.
01:17:54.040 | Ultimately, you have to drill down to the concrete
01:17:57.120 | and go, "Look, I get it.
01:17:59.720 | You're just like the last 50 guys that I interviewed.
01:18:02.840 | You had the best of intentions.
01:18:04.240 | It all went wrong, I'm very sorry for you.
01:18:06.080 | But at the end of the day, there's people hurting
01:18:07.640 | and there's people that have significant damage
01:18:09.560 | to their life because of you.
01:18:11.080 | What did you actually do?
01:18:12.800 | And what can we prove taking intention out of it,
01:18:15.720 | taking motivation out?
01:18:16.640 | What can we prove that you did
01:18:18.080 | that was unethical, illegal, or immoral?"
01:18:21.040 | And that is sort of what usually I try to go to
01:18:24.680 | because I will do those human interviews,
01:18:27.200 | but it's just like the same record on repeat.
01:18:32.040 | I mean, a lot of people go to the same-
01:18:34.040 | - I'm with you, I'm with you on everything you said,
01:18:36.840 | but there is ways to avoid the record on repeat.
01:18:40.920 | I mean, those are different skillset.
01:18:42.120 | You're exceptionally good at the investigative,
01:18:44.680 | like investigating.
01:18:46.200 | I do believe there's a way to break through the repeat.
01:18:49.440 | There's different techniques to it.
01:18:51.680 | One of which is like taking outside
01:18:53.320 | of their particular story.
01:18:54.640 | Yes, when everyone looks at their own story,
01:18:57.520 | they can see themselves as a good player doing good.
01:19:00.160 | But you can do other thought experiments.
01:19:03.640 | I mean, there's-
01:19:04.840 | - But they'll follow you.
01:19:05.720 | They'll know what the thought experiment is.
01:19:07.320 | - No, well, it depends.
01:19:08.920 | It depends, my friend.
01:19:10.440 | I mean, to me, there's a million of them,
01:19:15.440 | but just exploring your ethics.
01:19:19.680 | Would you kill somebody to protect your family?
01:19:22.760 | And you explore that.
01:19:24.440 | You start to sneak into like,
01:19:25.880 | what's your sense of the in-group versus the out-group?
01:19:30.880 | How much damage you can do to the out-group?
01:19:33.200 | And who is the out-group?
01:19:34.720 | And you start to build that sense of the person.
01:19:37.400 | Are we like the two mobsters that we're dressed as?
01:19:40.080 | Do we protect the family and fuck everyone else?
01:19:43.320 | You're with us and the ones who are against us, fuck them.
01:19:46.440 | Or do we have a sense that human beings
01:19:49.560 | are all have value, equal value,
01:19:52.000 | and we want to, we're a joint humanity.
01:19:54.040 | There's ways to get to that.
01:19:55.320 | And you start to build up this sense of like,
01:19:58.280 | some people that make a lot of money are better than others.
01:20:02.200 | They deserve to be at the top.
01:20:03.980 | If you have that feeling,
01:20:05.600 | you start to get a sense of like,
01:20:08.160 | yeah, the poor people are the dumbasses.
01:20:10.160 | They're the idiots.
01:20:11.000 | If you believe that,
01:20:12.320 | then you start to understand that this person
01:20:14.040 | may have been at the core of this whole corrupt organization.
01:20:18.160 | - Yes, two things.
01:20:19.760 | One, I think you should join me on this side of the table.
01:20:22.880 | We'll put SBF over here.
01:20:24.400 | We'll good guy, bad guy, human, facts.
01:20:27.720 | - You're the bad guy.
01:20:28.800 | I'm like, no, no, no, slow down, coffee.
01:20:30.800 | - What is your feeling about humanity?
01:20:32.940 | Yeah.
01:20:34.640 | - Have you been getting enough sleep?
01:20:36.120 | - Yeah, right.
01:20:37.480 | So I think, no,
01:20:38.320 | I think there's a lot of truth to what you said.
01:20:40.640 | One thing I've noticed that is hard to combat
01:20:44.360 | is sort of like preference falsification.
01:20:47.240 | And just like,
01:20:48.360 | just the outright lying about those things
01:20:51.480 | is tough to kind of pin down.
01:20:53.260 | But yeah, you're absolutely right.
01:20:55.060 | There's ways to interview people.
01:20:56.120 | There's all sorts of interesting techniques.
01:20:58.000 | And yeah, I don't disagree.
01:20:59.600 | - Good cop, bad cop.
01:21:01.520 | We should do this.
01:21:02.360 | This should be like a sitcom.
01:21:03.180 | Okay.
01:21:04.020 | You did an incredible documentary on SafeMoon.
01:21:07.480 | The title is "I Uncovered a Billion Dollar Fraud."
01:21:11.560 | Can you tell me the story of SafeMoon?
01:21:14.080 | - Sure.
01:21:14.920 | So SafeMoon was a crypto coin that exploded on the scene
01:21:19.920 | in 2021, I think at this point.
01:21:24.040 | Sorry, I'm losing track of my years.
01:21:25.560 | One year in crypto is like five years in real life.
01:21:28.920 | But it kind of gained a huge amount of popularity
01:21:31.980 | because of this idea that,
01:21:34.200 | it's in the name,
01:21:35.040 | you go safely to the moon.
01:21:36.480 | How they were gonna do this
01:21:37.880 | is with sort of a sophisticated smart contract idea
01:21:41.200 | where there's,
01:21:43.600 | I kind of have to explain the way smart contracts
01:21:46.600 | get rug pulled for a second,
01:21:48.200 | or how these scams happen.
01:21:50.280 | So sometimes it's called like the shit coin space,
01:21:54.000 | the alt coin space,
01:21:54.840 | anything like below Bitcoin, Ethereum,
01:21:56.960 | and maybe the top five or 10
01:21:58.800 | is kind of seen as this wasteland of gambling.
01:22:01.520 | And you don't know if the developers
01:22:04.520 | are gonna become anything or not.
01:22:06.160 | You're kind of like reading the white paper,
01:22:08.160 | trying to figure it out.
01:22:09.400 | So there's this big question about like,
01:22:11.200 | how can you get scammed?
01:22:12.360 | How can, back to the interests,
01:22:14.940 | you don't want the developer to have some like,
01:22:18.800 | parachute cord where they can pull all the money out.
01:22:21.240 | So one way this happens is that in decentralized finance,
01:22:26.240 | there's something called the liquidity pool.
01:22:29.840 | Okay?
01:22:30.680 | It's basically this big pot of money
01:22:32.120 | that allows people to trade
01:22:34.120 | between two different currencies.
01:22:36.320 | So let's say like SafeMoon and Bitcoin, right?
01:22:39.180 | Or Ethereum, or it's actually on the Binance Smart Chain.
01:22:41.640 | So it'd be BNB.
01:22:43.180 | And this pool of money can be controlled by the developers
01:22:48.240 | in such a way they can steal it all, right?
01:22:50.760 | They can just grab it.
01:22:52.320 | I don't wanna go too much into details
01:22:53.600 | 'cause I feel like I'll lose people here.
01:22:54.640 | But the point of SafeMoon was,
01:22:56.520 | the core idea was we're locking this money up.
01:23:00.120 | You can't touch it.
01:23:01.520 | And actually every transaction that you buy SafeMoon with,
01:23:05.680 | we'll take a 5% tax of that.
01:23:07.860 | We'll do a 10% tax, but 5% of it,
01:23:10.320 | will go back to all the holders of SafeMoon.
01:23:13.240 | Okay?
01:23:14.060 | And 5% of it will go back into this little pool of money.
01:23:16.640 | Okay?
01:23:17.480 | So the idea is as you trade,
01:23:19.560 | as this token becomes more viral,
01:23:22.280 | two things will happen.
01:23:23.340 | One, the people who are holding it long-term
01:23:25.280 | will be rewarded for holding it long-term
01:23:26.940 | by receiving this 5% tax that's distributed to everyone.
01:23:30.200 | And two, you can kind of trust
01:23:32.200 | that your money's gonna have this stable value
01:23:34.900 | because this pool of money here in the middle,
01:23:37.360 | that's kind of guaranteeing you can get your SafeMoon out
01:23:39.800 | into this actually valuable currency,
01:23:42.640 | it's not gonna move.
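The tax-and-reflection scheme described here can be sketched numerically. A hedged illustration only: the `trade` function and the 10% split (5% reflected to holders pro-rata, 5% to the liquidity pool) follow the description in the conversation, not SafeMoon's actual contract code.

```python
# Minimal sketch of "reflection" tokenomics as described above
# (illustrative only; not SafeMoon's contract).
def trade(amount, holders, liquidity_pool):
    """Apply a 10% tax on a trade: 5% reflected to existing holders
    pro-rata, 5% deposited into the liquidity pool."""
    reflect = amount * 0.05
    to_pool = amount * 0.05
    total_held = sum(holders.values())
    for h in holders:
        holders[h] += reflect * holders[h] / total_held
    # Buyer receives the post-tax amount; pool grows by its 5% cut.
    return amount - reflect - to_pool, liquidity_pool + to_pool

holders = {"alice": 600.0, "bob": 400.0}
received, pool = trade(100.0, holders, 0.0)
print(received)                    # 90.0 tokens reach the buyer
print(pool)                        # 5.0 added to the "locked" pool
print(round(holders["alice"], 2))  # 603.0 (pro-rata reflection)
```

The fraud claim then reduces to one check: whether the pool's 5% cut actually stayed locked, which, per the story that follows, it did not.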
01:23:43.760 | So the story of SafeMoon was that fundamentally,
01:23:47.660 | this was not the case.
01:23:49.380 | They promised that this money was gonna be locked up.
01:23:51.620 | It was not actually locked up at all.
01:23:53.080 | They said it was automatically locked up.
01:23:54.480 | You don't have to worry about it.
01:23:55.660 | Well, it was very manually locked up
01:23:57.280 | and they didn't actually lock a lot of it up.
01:24:00.000 | They took a lot of it for themselves, for the developers.
01:24:02.640 | So there's a lot of players in this.
01:24:05.240 | A lot of them have left by now.
01:24:08.240 | There's kind of this main CEO
01:24:09.520 | that everyone knows, John Karony now.
01:24:11.560 | And despite saying that they were gonna lock up
01:24:15.000 | all the funds for four years,
01:24:16.440 | somehow he's gone from,
01:24:18.000 | as everyone else in the token
01:24:20.680 | has lost 99% of the value of the token.
01:24:23.460 | So they've lost 99%.
01:24:25.860 | He's gotten like a $6 million crypto portfolio,
01:24:30.860 | multimillion dollar real estate portfolio,
01:24:34.500 | invested millions into various companies.
01:24:37.460 | So he's accrued this huge wealth.
01:24:40.140 | And so I made a video basically exposing that
01:24:42.060 | and showing how this coin,
01:24:43.460 | which once had a $4 billion market cap,
01:24:46.020 | is just viral everywhere.
01:24:47.460 | Everyone was talking about it.
01:24:48.940 | Because of these viral ideas,
01:24:50.980 | it is sort of a captivating idea
01:24:53.040 | that by holding it, you could get returns, right?
01:24:55.940 | Like you just hold onto it, you automatically get money.
01:24:58.180 | And it's a viral idea that this money
01:25:00.020 | in the middle in the pot isn't gonna leave you.
01:25:03.260 | When those things turned out to be false,
01:25:05.780 | this community has had a slow death
01:25:08.380 | as a lot of people realized it was a scam.
01:25:10.660 | And there's been a core part of the community,
01:25:13.220 | which gets to an interesting dynamic
01:25:14.720 | we can talk about if you want to,
01:25:16.780 | where they have like doubled down on the belief in Karony.
01:25:20.740 | And so part of it was out of a hope
01:25:22.820 | to let those people know what was really going on
01:25:24.740 | in their coin and like hopefully save some of them.
01:25:27.520 | Not in like some altruistic sense,
01:25:30.820 | but like, or not in like some like, I'm like a hero sense,
01:25:33.260 | but in the sense of like,
01:25:35.440 | I think a lot of them didn't know,
01:25:37.080 | like literally didn't know.
01:25:37.920 | So just sort of like as a public service,
01:25:39.820 | letting them know so they could get their money out
01:25:42.700 | and hopefully save themselves a lot of pain and suffering.
01:25:46.780 | So yeah.
01:25:47.620 | So they really dug in.
01:25:49.180 | So there's- - Some did, some did.
01:25:50.660 | Some did, some left.
01:25:51.860 | I mean, a lot of people have left,
01:25:52.940 | but the people who are left
01:25:54.780 | are people with large amounts of safe moon holdings
01:25:57.940 | that are down immensely.
01:25:59.260 | And you can imagine at a certain point in losses,
01:26:01.880 | there's a tremendous psychological pressure to go,
01:26:05.180 | look, I'm in it.
01:26:06.980 | I got to go for the long haul.
01:26:08.780 | And then you want to believe that this thing is legitimate
01:26:11.940 | and will succeed because A,
01:26:13.900 | there's an ego component around, I haven't been scammed.
01:26:16.880 | I'm too smart to get scammed.
01:26:18.580 | It's tremendously,
01:26:20.780 | it hurts psychologically to acknowledge
01:26:22.620 | you've been taken for a ride.
01:26:24.860 | And also you just want this thing to succeed
01:26:26.520 | for your financial wellbeing.
01:26:27.540 | So you like want to believe it.
01:26:28.860 | So there's tremendous psychological pressure
01:26:30.820 | to build cult-like communities around these tokens.
01:26:35.660 | And I've noticed with the incentive of like community building,
01:26:40.420 | it's sort of new to finance.
01:26:41.820 | There's like these meme coins or these cults.
01:26:45.820 | I don't want to,
01:26:46.660 | it's not really fair to call all of them cults.
01:26:47.940 | Like some of them are open to criticism,
01:26:49.620 | but one of the things that defines cults
01:26:52.060 | is they're not open to sunlight or criticism.
01:26:54.900 | There's these financial communities that are opening up
01:26:58.020 | with crypto, with a few stocks,
01:27:00.940 | where if you criticize them, you are attacked.
01:27:03.460 | And the entire community has every incentive
01:27:05.780 | to kind of like downplay your legitimate criticisms
01:27:08.940 | or kind of go after you.
01:27:11.940 | And so it creates this interesting dynamic
01:27:14.420 | that I'm fascinated by.
01:27:16.180 | - What do you think about Bitcoin then?
01:27:19.920 | Do you think it's one of those communities
01:27:22.580 | that does attack you when criticized?
01:27:24.860 | So which, I guess, which coins do you think
01:27:27.460 | are open to criticism and which are not?
01:27:29.940 | - It's kind of tough.
01:27:30.780 | Like no community is a monolith.
01:27:33.100 | So just like, it's just a spectrum of how open they are.
01:27:37.980 | There's just like, there's always this core contingent
01:27:40.940 | of extreme believers who will go after anyone
01:27:44.700 | who criticizes them.
01:27:46.060 | And it's just about how wide of a band
01:27:48.620 | that makes up of the entire token.
01:27:50.540 | - Sure.
01:27:51.380 | How intensely, how active that small community is.
01:27:54.140 | - Correct.
01:27:54.980 | - So it's in Bitcoin, they're called Bitcoin maximalists.
01:27:57.380 | - Yes.
01:27:58.220 | - But you could also call any community's subgroup
01:28:02.220 | like that maximalist, whatever the belief is.
01:28:04.460 | - Correct.
01:28:05.300 | - I don't know.
01:28:06.140 | Dunkin Donuts maximalists.
01:28:08.020 | That community is probably small
01:28:09.260 | in terms of attacking online.
01:28:11.060 | You know which community has a very intense following?
01:28:14.740 | So I got attacked on the internet
01:28:18.380 | when I said Messi's better than Ronaldo.
01:28:21.460 | - Oh yeah.
01:28:22.380 | That's controversial.
01:28:23.660 | - And so that's a very intense maximalist community there.
01:28:27.460 | The other one that surprised me is when I said,
01:28:31.580 | now I did it in jest, okay folks?
01:28:35.460 | I said Emacs is a better ID than Vim.
01:28:38.620 | - I love Emacs.
01:28:39.500 | I agree.
01:28:41.340 | - Listen, I have trauma.
01:28:42.700 | I wake up sweating sometimes at night thinking-
01:28:45.740 | - Emacs master race.
01:28:47.020 | - The Vim people are after me.
01:28:49.620 | They're everywhere.
01:28:50.620 | They're in the shadows.
01:28:51.460 | No, Vim is an amazing,
01:28:52.900 | and it's actually a surprise.
01:28:54.460 | I've recently learned that it's still even more so
01:28:58.540 | than before, an incredibly active community.
01:29:00.540 | So a lot of people wrote to me.
01:29:01.380 | - But do you use Spacemacs?
01:29:02.580 | It's just Emacs and Vim?
01:29:04.380 | - No, I haven't.
01:29:05.420 | I use Raw.
01:29:06.260 | - Old school Emacs?
01:29:07.940 | - But-
01:29:08.780 | - Oh, you gotta use, yeah, yeah, yeah.
01:29:09.620 | - But hold on a second.
01:29:10.740 | I actually recently, I have recently said,
01:29:13.020 | "You know what?
01:29:13.860 | "Let's make love, not war."
01:29:16.420 | And I went to VS Code.
01:29:17.820 | I went to a more modern ID.
01:29:22.140 | - Sure.
01:29:22.980 | - But, 'cause I did most of my programming in Emacs.
01:29:25.180 | I did most of anything as one does in Emacs,
01:29:27.580 | just 'cause I also love Lisp,
01:29:28.820 | so I can customize everything.
01:29:30.300 | Then I realized, like,
01:29:31.420 | like how long will Vim and Emacs be around, really?
01:29:35.300 | I was thinking, as a programmer,
01:29:38.700 | looking like 10, 20 years out,
01:29:40.900 | you know, I should challenge myself to learn new IDEs,
01:29:44.580 | to learn the new tools
01:29:46.860 | that the majority of the community is using
01:29:48.780 | so that I can understand what are the benefits and the cost.
01:29:51.300 | I found myself getting a little too comfortable
01:29:53.060 | with the tools that I grew up with.
01:29:54.980 | - Sure.
01:29:55.820 | - And I think one of the fundamental ways
01:29:56.980 | of being as a programmer,
01:29:58.140 | as anyone involved with technology,
01:29:59.900 | based on how quickly it's evolving,
01:30:02.860 | is to keep learning new tools.
01:30:04.140 | Like, the way of life should be constantly learning.
01:30:06.940 | You're not a mathematician or a physicist
01:30:08.940 | or any of those disciplines that are more stable.
01:30:11.780 | This is like, everything is changing.
01:30:13.100 | Crypto's, like you said, a perfect example of that.
01:30:15.780 | You have to constantly update your understanding
01:30:18.220 | of digital finance, constantly,
01:30:22.540 | in order to be able to function,
01:30:23.900 | in order to be able to criticize it,
01:30:26.380 | in order to be able to know what to invest in.
01:30:28.060 | So yeah, that was why I did,
01:30:30.660 | I tried PyCharm a bunch,
01:30:33.820 | the whole JetBrains infrastructure,
01:30:36.580 | and then also VS Code, 'cause that's really popular.
01:30:39.500 | I mean, you know, Atom and Sublime, all of those.
01:30:43.660 | I've been exploring, I've been exploring.
01:30:45.780 | But VS Code is amazing.
01:30:46.820 | - You should check out Spacemacs.
01:30:48.220 | I'm just gonna give one more pitch for it.
01:30:49.740 | It's just basically like a customizable configuration.
01:30:53.780 | Well, Emacs is already customizable,
01:30:55.100 | but it's pretty useful.
01:30:58.220 | I'm not even much of a coder,
01:30:59.940 | but for like certain journaling applications
01:31:03.620 | or like time management, like I find it really useful, so.
01:31:06.380 | But anyway, we're so like,
01:31:09.020 | I feel like half this podcast is what it should have been,
01:31:11.060 | and half of it's just us nerding out
01:31:12.660 | about our own engineering, like idiosyncrasies.
01:31:15.860 | Sorry, sorry guys.
01:31:17.700 | - All right, so what were we talking about?
01:31:20.260 | SafeMoon and Bitcoin, Bitcoin.
01:31:23.700 | What do you think, is there,
01:31:25.100 | have you made enemies in certain communities?
01:31:29.260 | What do you think about Bitcoin?
01:31:30.620 | - So I've made certain enemies
01:31:33.380 | in the sort of crypto skeptics space,
01:31:36.180 | 'cause there's sort of this range of skepticism
01:31:38.220 | you can have about cryptocurrency.
01:31:40.260 | I'm obviously a skeptic of a lot of it,
01:31:42.660 | but there are certain aspects of crypto
01:31:46.780 | that I think are inevitable,
01:31:48.580 | and I'm gonna do my best to kind of describe those here,
01:31:51.460 | but I'm not committed to any crypto specifically,
01:31:53.580 | but there are some, I've taken a lot of heat,
01:31:55.900 | ironically, for not being skeptical enough.
01:31:57.780 | There's some people who believe that like the entire thing
01:32:00.020 | is a complete waste of time.
01:32:01.740 | There are r/buttcoin on Reddit.
01:32:04.700 | It's an amazing community, actually, it's very funny.
01:32:07.220 | They have--
01:32:08.060 | - What's a buttcoin?
01:32:08.900 | Is that-- - It's like a play
01:32:09.740 | on Bitcoin.
01:32:10.580 | They're like, they're just like,
01:32:12.580 | at least we admit it's a scam.
01:32:14.180 | Very funny guys, very funny people there.
01:32:17.180 | So, but they'll be like, you know,
01:32:19.660 | Coffeezilla, just admit
01:32:20.980 | that all of it's a giant Ponzi scheme.
01:32:22.460 | All of it's basically like not real.
01:32:24.980 | - So everything including Bitcoin?
01:32:26.980 | - Yeah, it's all basically, all the Ponzinomics,
01:32:29.780 | it's Ponzinomics all the way down.
01:32:32.140 | It's like there is no fundamental use case
01:32:34.580 | that is that useful.
01:32:36.420 | I don't know if, I guess I don't wanna straw man them here.
01:32:39.540 | I don't wanna say that,
01:32:40.780 | I don't know if they're saying that it's all useless.
01:32:44.260 | At minimum, they're saying the level of interest
01:32:47.380 | in cryptocurrencies is far,
01:32:50.700 | the actual usefulness of it is far less
01:32:53.460 | than the amount of attention and time and money
01:32:56.220 | that's being poured into it.
01:32:57.060 | So like the revolutionariness of this technology
01:32:59.580 | is not at all revolutionary.
01:33:01.420 | Let me kind of steel man what I think the pro crypto take is.
01:33:06.060 | I think that technologies are sort of this inert thing
01:33:11.980 | and the success of them in my opinion is not based on PR,
01:33:16.700 | it's not based on marketing,
01:33:18.260 | it's based on cheaper, faster, better.
01:33:20.860 | Fundamentally, the success of any technology
01:33:22.860 | relies on those three things and longevity of it.
01:33:26.220 | So I have two employees
01:33:28.660 | and both of them are out of the country.
01:33:31.780 | So I have to frequently make
01:33:33.140 | international wire payments to them.
01:33:35.220 | - Is one of them SPF, just as a reporter I have to ask.
01:33:39.340 | - No. - Okay.
01:33:41.100 | He's not on the payroll.
01:33:42.500 | - Yeah, I think he'd have to pay me.
01:33:44.100 | - I'm trying to do my best Coffeezilla words,
01:33:46.140 | I can hard hit investigative questions.
01:33:49.340 | - So with these international payments,
01:33:51.540 | you face all sorts of slow fees
01:33:55.900 | and you face like kind of like this time thing
01:33:59.180 | and it's this painful process.
01:34:01.140 | So if I use different cryptocurrencies,
01:34:06.380 | some of them are like really fast,
01:34:07.860 | some of them have really low fees.
01:34:09.740 | I just believe in a world where digital currencies
01:34:14.180 | with fast payments, with cheap payments,
01:34:16.620 | revolutionize the global exchange of currency.
01:34:21.260 | And I don't know if this is going to include the blockchain,
01:34:25.500 | it's just that the blockchain is the first thing
01:34:27.580 | that's really embraced truly digital currency,
01:34:30.460 | which doesn't need to go through this complicated system
01:34:33.980 | of wire transfers and just happens.
01:34:36.300 | So I can send you,
01:34:37.580 | let's say I wanna send you Ethereum or Bitcoin,
01:34:40.260 | I can send it to you just as fast,
01:34:42.380 | if I send you a dollar or a billion dollars
01:34:46.820 | and I can send it to you just as fast
01:34:48.740 | if you're across from me
01:34:50.220 | or if you're across the world from me.
01:34:52.300 | That I think is a step change in easier, faster, better
01:34:57.300 | in terms of like just this really basic
01:35:00.540 | international payments kind of idea.
01:35:03.100 | So I think at like its core,
01:35:04.660 | if the lowest form use case of cryptocurrencies is that,
01:35:08.380 | I think it will change the world in some variety.
01:35:12.380 | It's just kind of the larger question is,
01:35:14.020 | is that technology going to include
01:35:16.220 | the blockchain specifically or not?
01:35:19.500 | The other benefit is transparency,
01:35:22.620 | which I personally like as an investigator.
01:35:24.940 | It's just that previously it's like hard to describe
01:35:28.380 | how opaque our financial system is
01:35:32.500 | until you've tried to investigate someone or something.
01:35:35.340 | Understanding finances, unless you have a subpoena,
01:35:38.700 | unless you're like the FBI or like the SEC
01:35:41.620 | and you can get a subpoena for someone's finances
01:35:43.700 | or you're going through discovery,
01:35:45.260 | you don't know what someone has.
01:35:46.660 | You're basically playing poker with everyone
01:35:48.300 | and the cards are face down.
01:35:50.380 | For the first time, the blockchain to some extent,
01:35:53.500 | 'cause there are ways to obfuscate it.
01:35:55.460 | And in some ways cryptocurrency has enabled more fraud,
01:35:58.220 | which is kind of this irony.
01:35:59.700 | But in some ways it's enabled people
01:36:01.220 | to also audit a lot better and in real time.
01:36:04.420 | And I think that is a structural change
01:36:06.540 | that is fundamentally for the better.
01:36:10.380 | The question of all this is,
01:36:12.940 | do those betters outweigh the cons that this introduces?
01:36:18.540 | And how much can regulation mitigate those cons?
01:36:22.220 | Some of those cons being like fraud, money laundering,
01:36:25.100 | all these negative externalities
01:36:27.300 | that are easier with cryptocurrency.
01:36:28.820 | - Why do you think cryptocurrency in particular
01:36:30.500 | seems to attract fraudulent people?
01:36:33.340 | Like scammers and fraudsters?
01:36:37.300 | - 'Cause it's unregulated, it's the wild west
01:36:39.060 | and you can transmit large amounts of money
01:36:40.780 | very quickly across the world.
01:36:42.460 | - What about-- - With very little oversight.
01:36:43.940 | - Creating new crypto projects, like new coins.
01:36:47.420 | Because you have to show very little actual use case,
01:36:50.700 | you can just promise.
01:36:51.780 | So it's like true of any emerging technology.
01:36:53.980 | So much vaporware happens at the beginning
01:36:56.220 | when it's all promise.
01:36:57.580 | Because fundamentally, let's say you're legitimate,
01:36:59.860 | I'm illegitimate.
01:37:00.820 | We look the same at the start of a technology.
01:37:04.340 | 'Cause both of us are promising what this can do.
01:37:06.100 | And in fact, the less scruples and morals I have,
01:37:08.700 | in some ways I can out-compete you.
01:37:10.020 | 'Cause I can say mine does what Lex's does,
01:37:12.260 | but like way better and way faster
01:37:13.660 | and it's gonna happen in a year rather than 10 years.
01:37:15.740 | You're being honest, I'm playing a dishonest game,
01:37:18.220 | I look better.
01:37:19.500 | Once this space matures
01:37:21.260 | and you actually have some people actually doing the things
01:37:23.740 | that they say they're going to do,
01:37:25.580 | suddenly this equation changes.
01:37:27.060 | Now you're Amazon, you're delivering in two days,
01:37:29.380 | I can say whatever I want.
01:37:31.460 | You do the thing you do and I have no credibility.
01:37:34.020 | So I think that like part of the fraud is,
01:37:38.660 | just the ability to transmit so much money so quickly
01:37:40.860 | with such little oversight.
01:37:42.220 | Part of it is like, this just happens
01:37:44.020 | with any emergent technology.
01:37:45.780 | Vaporware is a real thing.
01:37:47.220 | And hopefully as this space matures,
01:37:49.980 | as regulation comes in, things will improve.
01:37:53.480 | - Well, let me ask you your own psychology.
01:37:57.020 | - Sure.
01:37:57.940 | - You're going after some of the richest,
01:38:01.020 | some of the most powerful people in the world.
01:38:02.900 | Do you worry about your own financial,
01:38:05.580 | legal and psychological wellbeing?
01:38:07.380 | - Yes.
01:38:10.180 | Yeah, I do.
01:38:11.020 | I mean, I'm not totally oblivious to the precariousness
01:38:14.060 | of like any kind of journalism like this.
01:38:19.260 | Obviously there's risks.
01:38:20.660 | I've always believed, there's a quote
01:38:23.180 | and I'm going to butcher it,
01:38:24.460 | but I hope you guys understand the spirit of it.
01:38:26.700 | News is when you print something
01:38:29.540 | someone else doesn't want you to print;
01:38:30.720 | everything else is public relations.
01:38:32.800 | I really believe to do meaningful journalism,
01:38:36.320 | you have to go after people.
01:38:38.100 | Like it's not inherently a safe profession.
01:38:40.820 | I mean, if you're going to do important work,
01:38:42.520 | you have to have risk tolerances.
01:38:44.780 | And I think everyone has a line
01:38:46.740 | of what that risk tolerance is.
01:38:48.340 | And it's different for everyone.
01:38:50.900 | I don't think I could do what Edward Snowden did.
01:38:53.580 | I think that would be my bright red line,
01:38:55.120 | is going against my own government.
01:38:57.260 | It's such a, in my opinion, I really see him as a hero.
01:39:01.300 | Like it's such a selfless act of self-destruction.
01:39:04.320 | You know that the party you're going after
01:39:07.300 | has all the power and will crush you.
01:39:10.740 | And you do it anyway out of the like the true,
01:39:14.180 | I don't know, platonic ideal of journalism.
01:39:15.900 | I think that's beautiful.
01:39:16.940 | I don't think I could do that.
01:39:17.980 | I think I need some ability to live and subsist
01:39:21.580 | in the society that I am in.
01:39:24.260 | And I think my bright red line would be like,
01:39:27.500 | if I'm forced to flee the country for my work,
01:39:29.700 | I think I'd finally have to say no.
01:39:32.260 | But as far as journalists go, I'm pretty risk,
01:39:36.020 | I take risk pretty well.
01:39:38.780 | I especially like think risk is important to take
01:39:41.580 | when you're young and when you can do that.
01:39:44.020 | I think when I have, I mean, I'm married.
01:39:46.140 | So when I have a family,
01:39:47.460 | I think I will probably dial this risk thing down,
01:39:51.020 | just being honest.
01:39:52.580 | I mean, I think you kind of have to.
01:39:54.380 | But right now, I mean,
01:39:57.040 | I'm kind of like running on all cylinders.
01:39:58.420 | I'm willing to take on quite a range of people,
01:40:02.080 | but I think a lot about it.
01:40:04.540 | - Wolfpack of one or small.
01:40:07.340 | - Yeah.
01:40:08.180 | - As opposed to having like a New York Times behind you
01:40:10.660 | or a huge organizations with lawyers,
01:40:13.880 | with a team, with a history.
01:40:15.700 | - These people are less courageous.
01:40:17.700 | This is the dirty truth.
01:40:20.020 | The bigger the organization,
01:40:21.500 | the more conservative a lot of them are.
01:40:23.500 | It's true that sometimes they like,
01:40:26.100 | and this is not to bash big organizations.
01:40:28.140 | I'm just saying this as an observation
01:40:30.080 | of someone who's talked to a lot of people.
01:40:32.660 | And especially in the world of fraud,
01:40:35.220 | a lot of them are scared to engage with fraud
01:40:39.140 | that is obvious, but hasn't been litigated yet.
01:40:42.700 | This is why you'll never see documentaries
01:40:45.000 | about ongoing fraud on Netflix.
01:40:46.740 | It's too much of a liability.
01:40:48.220 | They'll sue Netflix to hell.
01:40:49.660 | And they know that if they win,
01:40:51.140 | Netflix has the money to pay it.
01:40:52.980 | So corporations like the New York Times,
01:40:55.940 | a lot of these, some of them are very,
01:40:58.320 | like they're as courageous as they can be.
01:41:00.380 | But at the bottom line,
01:41:01.560 | if someone sues you to hell and back
01:41:03.620 | and you have to pay up,
01:41:04.700 | you will disappear.
01:41:06.140 | And you're relying on liability insurance,
01:41:08.680 | which you're already paying out the ass for
01:41:10.700 | to try to cover you if you get sued.
01:41:12.740 | But if you get sued, even if you win,
01:41:14.660 | that liability insurance now goes up in price
01:41:16.800 | the next year.
01:41:17.760 | And if you're the New York Times,
01:41:18.600 | it goes up by a lot.
01:41:19.740 | So, I mean, I think there's work
01:41:23.700 | that independent journalists can do uniquely
01:41:26.860 | that they can actually take like in some ways more risk
01:41:31.040 | than a giant institution,
01:41:32.300 | which has a lot more in my sense to lose,
01:41:34.540 | even though it would appear like they have
01:41:36.180 | more in terms of defense too.
01:41:37.740 | - But you get, you can be bullied legally.
01:41:41.060 | - Yeah.
01:41:41.900 | - Do you get afraid of that?
01:41:44.020 | - Sure, I mean, I just,
01:41:46.300 | all these things are things you have to be aware of
01:41:49.860 | and then forget to do your job.
01:41:51.500 | Like you have to be,
01:41:52.700 | you know, it's like being like a snowboarder
01:41:54.540 | and it's like, do you realize you could hit your head?
01:41:56.700 | And it's like, yeah, of course.
01:41:57.780 | But in order to go do the flip or whatever,
01:42:00.140 | you have to just accept the risk,
01:42:02.700 | mitigate the risk as much as possible and move on.
01:42:04.800 | So we have like insurance,
01:42:08.460 | we keep like a pool of funds for that kind of thing.
01:42:12.060 | Like I'm very conservative with how I spend my money
01:42:15.140 | basically all on production
01:42:17.580 | and like trying to make my life as secure as I can.
01:42:21.620 | And then I just do the work that I, you know,
01:42:24.140 | I want to do because.
01:42:25.700 | - So 99% of your funds go into the studio
01:42:31.260 | and then into that elaborate space of yours.
01:42:35.540 | - Yeah, of course.
01:42:36.440 | - How many kittens had to die to manufacture that studio?
01:42:41.620 | But anyway, that's my investigation for later.
01:42:44.400 | What keeps you mentally strong through all of this?
01:42:50.260 | What's your source of mental like strength
01:42:52.780 | or your psychological strength through this?
01:42:54.940 | - I think there was a time
01:42:57.460 | when I was getting a lot of cease and desist.
01:43:00.500 | Some people were like actually like saying
01:43:03.060 | like they're gonna show up to my house,
01:43:04.220 | all that kind of stuff.
01:43:05.140 | I don't think I was that,
01:43:06.880 | I think I was pretty worried about that for a while.
01:43:08.620 | My wife was a huge source of strength here
01:43:12.220 | where she was like, hey, if you're not comfortable with it,
01:43:15.780 | you need to get out of the game
01:43:17.160 | or you need to basically like suck it up
01:43:19.460 | and like, this is what it is.
01:43:21.300 | If you're gonna go after these people,
01:43:23.020 | you have to basically be mentally strong around this
01:43:28.060 | and seeing her have that realization
01:43:31.060 | helped me have the same realization
01:43:33.680 | and I really deeply admire and respect that about her
01:43:37.620 | and it solved a lot of my concerns around that.
01:43:41.380 | It just made me realize every profession has risks.
01:43:45.580 | It is what it is, you mitigate and then you move on.
01:43:48.580 | - Why do you think there's so few journalists like you?
01:43:51.620 | You're basically the embodiment,
01:43:53.620 | at least in the space you operate,
01:43:55.060 | of what great journalism should be.
01:43:57.820 | Why do you think there's very few like you?
01:43:59.980 | - That's such an enormous compliment
01:44:05.060 | and probably overstatement,
01:44:06.980 | but I first want to pay respect.
01:44:11.980 | There are a lot of great journalists
01:44:15.220 | and a lot of them are like,
01:44:17.940 | I don't wanna just kind of take it and go,
01:44:20.080 | yeah, it's just me.
01:44:21.660 | There's so many great journalists.
01:44:25.420 | Matt Levine, Kelsey Piper,
01:44:27.440 | you've got anonymous journalists like Dirty Bubble,
01:44:30.940 | you've got citizen journalists like Tiffany Fong,
01:44:33.780 | but I think if you're gonna be in this space
01:44:38.780 | in the long term, you do need to accept certain risks
01:44:43.660 | and I think in the long term,
01:44:45.620 | it's like I don't know how easy it is to play that game
01:44:49.060 | for a long period of time
01:44:50.780 | because you make,
01:44:52.540 | to do great journalism, you don't get paid a lot
01:44:56.540 | compared to what you could get paid
01:44:57.700 | if you did press pieces or anything like that.
01:45:00.720 | You take a lot of risks legally,
01:45:02.380 | you take physical risks, you take,
01:45:04.980 | it's just like if you care about money,
01:45:08.620 | it's not the profession
01:45:09.820 | and I feel like a lot of people,
01:45:13.680 | when they get notoriety, they move to like,
01:45:15.620 | well, I can just maximize the money security side of things
01:45:23.700 | and I think it takes out a lot of would-be great journalists.
01:45:23.700 | - And also, so first of all, comfort
01:45:26.060 | of physical and mental wellbeing.
01:45:28.660 | - Yes.
01:45:29.500 | - And also being invited to parties with powerful people.
01:45:32.340 | - Absolutely.
01:45:33.180 | - You make enemies, rich and powerful enemies doing this.
01:45:38.180 | - Yes.
01:45:41.740 | - But that's why it makes it, that's why it's admirable.
01:45:46.260 | I mean, it's an interesting case study
01:45:49.740 | that you've been doing it as long as you have
01:45:52.340 | and I hope you keep doing it,
01:45:53.580 | but it's just interesting that it's rare.
01:45:55.800 | - I'll say, I wanna make a call,
01:46:00.140 | like I think societies can create better journalists
01:46:05.140 | and worse journalists insofar as they support the journalists
01:46:10.820 | who are doing great work
01:46:12.020 | and I wanna call out Edward Snowden specifically
01:46:15.500 | because what we have done to him is such a travesty
01:46:18.580 | and the only lesson you can learn
01:46:20.300 | if you're a logical human being
01:46:21.940 | is that you should never whistleblow
01:46:23.580 | on the United States government
01:46:24.720 | after looking at what they did to Snowden.
01:46:26.860 | So as a society, we can put pressure on lawmakers
01:46:31.860 | to make it easier for people to do the great work
01:46:35.460 | by not punishing the people who do great work,
01:46:38.340 | if that makes sense, and de-risking it for them
01:46:40.700 | because we shouldn't expect journalists to be martyrs
01:46:43.460 | to do great work, right, to do important work.
01:46:46.260 | And part of that comes from protecting whistleblowers.
01:46:49.780 | There's like very common sense things.
01:46:51.860 | I love, like, it's great to heroicize
01:46:55.820 | people like Edward Snowden and stuff like that,
01:46:57.500 | but we shouldn't expect them to be heroes to do that work.
01:47:01.140 | - Do you ever think about going,
01:47:03.220 | you've been focused on financial fraud.
01:47:05.780 | Do you ever think about going after other centers of power?
01:47:09.360 | Like--
01:47:12.620 | - Government. - Government, politics.
01:47:15.820 | - Politics, it seems to me, you can't do good work.
01:47:18.700 | Like everybody doing good work in politics is to some extent
01:47:23.620 | from my limited perspective, as I said,
01:47:25.800 | I'm not that into it.
01:47:27.160 | It seems like everyone has to take a side
01:47:30.260 | because even if you do great work,
01:47:32.260 | whoever you're exposing, half the other people,
01:47:35.700 | no matter how good your work is,
01:47:36.940 | are going to claim it's just for partisan hackery
01:47:39.860 | and they're going to malign you.
01:47:41.580 | So it seems like a lot of journalists have to take a slant,
01:47:46.580 | even if it's not explicit like bias,
01:47:48.540 | they have to take a slant on who they expose.
01:47:50.660 | I hate that.
01:47:51.500 | I would really like a world where you could freely expose
01:47:55.340 | both sides without having a constant malignment of like,
01:48:00.140 | you know, who are you working for?
01:48:03.980 | Or you did this for X, Y, Z or whatever.
01:48:06.020 | Like, I really find that deeply problematic
01:48:08.660 | about our current journalism in the political sphere.
01:48:13.620 | As far as government stuff, I think it's easy to do,
01:48:16.700 | not easy, but like, it's much more enticing
01:48:21.420 | to do foreign journalism than to do local journalism
01:48:25.380 | on positions of power.
01:48:26.780 | 'Cause if you question, it's so easy to just get,
01:48:30.720 | the bigger cases you expose locally, you get in danger.
01:48:35.720 | Like it's just like very clear cut.
01:48:38.420 | The bigger the case, the more your financial wellbeing,
01:48:42.540 | your access, your entire life is like sort of in jeopardy.
01:48:45.760 | Whereas if you do foreign journalism,
01:48:47.040 | you can do great work and largely you're protected
01:48:49.800 | by your own government.
01:48:51.260 | So it's kind of this weird thing where if you want
01:48:52.820 | great journalism on America, sometimes going abroad
01:48:55.700 | might be the way to go.
01:48:58.020 | - But the politician thing, that's interesting
01:49:00.060 | you mentioned that and going abroad.
01:49:01.860 | I think the way you think about your current work,
01:49:05.740 | I think applies in great journalism in politics as well.
01:49:09.420 | So what happens, I have that sense,
01:49:12.100 | 'cause I aspire to be like you in the conversation space
01:49:16.980 | of like with politicians.
01:49:18.860 | I tried to talk to people on the left and the right
01:49:21.540 | and do so in non-partisan way and criticize,
01:49:23.780 | but also steel man their cases.
01:49:25.540 | What happens, I've learned, is when you talk to somebody
01:49:28.340 | on the right, the right kind of brings you in,
01:49:31.820 | it's like, yes, we'll keep you comfortable, come with us.
01:49:34.860 | And then the left attacks you.
01:49:36.780 | And so, and the same happens on the left.
01:49:40.060 | You talk on the left, the right attacks you
01:49:41.820 | and the left is like, come with us.
01:49:43.700 | So like there's a temptation, a momentum to staying
01:49:48.340 | to that one side, whatever that side is.
01:49:51.320 | The same with foreign journalism.
01:49:52.660 | You can cover Putin critically.
01:49:55.260 | There's a strong pull to being pro-Ukraine,
01:49:58.900 | pro-Zelensky, pro, basically really covering
01:50:03.340 | in a favorable way to the point of propaganda,
01:50:06.020 | to the point of PR, the Zelensky regime.
01:50:08.940 | If you criticize the Zelensky regime,
01:50:10.500 | there's a strong pull towards then being supportive
01:50:13.900 | of not necessarily the Putin regime,
01:50:15.700 | but a very different perspective on it,
01:50:17.900 | which is like NATO is the one that created that war.
01:50:21.140 | There's narratives that pull you.
01:50:23.780 | And what I think a great journalist does
01:50:26.340 | is make enemies empathize and walk through that fire
01:50:30.860 | and not get pulled in to the protection of any one side
01:50:35.220 | because they get so harshly attacked
01:50:37.380 | anytime they deviate from the center.
01:50:39.060 | - Well, and I think also like there's a criticism
01:50:42.700 | of all centrists, which I think in some way is fair.
01:50:45.660 | And I say that as someone largely who's a centrist,
01:50:48.220 | which is that this whataboutism, or like this,
01:50:50.420 | like what about the left or what about the right,
01:50:52.860 | can skew when it's not a both-sides issue.
01:50:57.780 | So in the case of like Russia, Ukraine,
01:51:01.300 | I think like I'm strongly in favor of Ukraine,
01:51:04.380 | even though I tend to go like on both sides.
01:51:05.940 | And that might be partly because
01:51:07.260 | one of my employees is Ukrainian.
01:51:09.140 | And I think what a great journalist does,
01:51:13.420 | especially like in politics,
01:51:15.340 | is I think they criticize the regime that's most in power,
01:51:20.220 | most controls the keys and is the most corrupt at that time.
01:51:24.580 | And they might appear to be like,
01:51:27.140 | let's say during the realm of Trump,
01:51:29.700 | a great journalist would criticize Trump,
01:51:31.820 | but that same journalist who held Trump's feet to the fire
01:51:34.900 | should be capable of holding Biden's feet to the fire
01:51:37.660 | four years later, if that kind of makes sense.
01:51:39.860 | - That's exactly right, yeah.
01:51:41.380 | So any revealing any, so attacking any power center
01:51:46.180 | for the corruption, for the flaws they have.
01:51:48.900 | - Irrespective of like your political agenda
01:51:51.580 | or your political ideas.
01:51:52.960 | - So that, and that's what I mean about
01:51:55.180 | sort of the war in Ukraine.
01:51:56.220 | There's several key players,
01:51:58.620 | NATO, Russia, Ukraine, China, India.
01:52:03.620 | I mean, there's several less important players,
01:52:15.380 | maybe some like Iran and Israel and maybe Africa.
01:52:15.380 | And what great journalism requires
01:52:19.380 | is basically revealing the flaws
01:52:21.620 | of each one of those players,
01:52:23.500 | irrespective of the attacks you get.
01:52:25.060 | And you're right that throughout any particular situation,
01:52:29.100 | there is some parties that are worse than others
01:52:32.100 | and you have to weigh your perspective accordingly.
01:52:36.140 | But also it requires you to be fearless in certain things.
01:52:38.860 | Like for example, I don't even know
01:52:41.620 | what it's like to be a journalist covering China now.
01:52:44.940 | - Oh, that's an exact case of like,
01:52:47.940 | China has made it so difficult to be a good journalist
01:52:51.460 | that they've effectively squashed criticism
01:52:53.740 | because to be a journalist in China
01:52:56.060 | means constantly risking your life every single day
01:53:00.620 | to criticize that government.
01:53:02.140 | And so the best journalists
01:53:03.820 | are a lot of times outside the country
01:53:06.220 | or they have sources inside the country
01:53:08.020 | who are like, there's like this,
01:53:09.620 | different, there's layers to the journalism
01:53:12.740 | where there's insiders who are leaking information,
01:53:15.580 | but they themselves cannot publish
01:53:16.780 | because it's like, it's extremely risky.
01:53:19.840 | So yeah, I think as a society,
01:53:23.420 | one measure of how healthy the political structure is
01:53:28.420 | is how well you can criticize it
01:53:31.580 | without fearing for your safety.
01:53:33.280 | - In that sense, the chaos and the bickering
01:53:37.260 | going on in the United States politics is a good thing.
01:53:40.540 | That people can criticize very harshly.
01:53:43.660 | - Very harsh.
01:53:44.500 | - And be in terms of safety are pretty safe.
01:53:46.900 | - Yes, absolutely.
01:53:48.300 | I think our only challenge is like,
01:53:50.860 | where it gets dangerous is around top secret information.
01:53:54.760 | The government comes down so hard
01:53:57.180 | that the danger in covering politics here
01:54:01.380 | is you can expose something that's top secret
01:54:03.700 | that should be exposed and they'll ruin you.
01:54:07.060 | - So that's where you again give props to Snowden
01:54:10.500 | for stepping up.
01:54:11.340 | - 100%, 100%.
01:54:13.540 | - What's the origin of the suspender wearing Batman?
01:54:19.140 | How did you come to do what you do?
01:54:20.980 | Like we talked about where you are
01:54:22.900 | and how your mind works, but how did it start?
01:54:26.140 | - I've kind of always been interested in fraud
01:54:28.060 | or at least I saw fraud early on
01:54:30.620 | and I was just like curious about what is this?
01:54:32.420 | I didn't know what I was really looking at.
01:54:34.060 | So basically my mom got cancer when I was in high school
01:54:38.300 | and it was pretty traumatic.
01:54:40.220 | I mean, she's fine.
01:54:41.040 | She had thyroid cancer, which is,
01:54:42.340 | we didn't know it at the time.
01:54:43.180 | It's like cancer is cancer,
01:54:44.420 | but it's fairly easily treatable with surgery.
01:54:47.220 | It's one of the better survivable ones.
01:54:50.460 | And I just watched her get like bombarded
01:54:52.660 | with all these like phony health scams
01:54:54.460 | of just like colloidal silver,
01:54:56.700 | all these different like remedies.
01:54:58.520 | And she was very into all the different ways
01:55:03.300 | that she might treat her cancer.
01:55:05.100 | And obviously surgery is very daunting.
01:55:07.460 | And I was just confused.
01:55:09.460 | I was like, why are we doing so many different remedies
01:55:12.400 | that all seem a very dubious health value?
01:55:15.720 | Later I'd find out that these are like all grifters.
01:55:17.780 | I mean, they take advantage of free speech in America
01:55:21.140 | to like advertise their products as life-saving miracles,
01:55:24.820 | whatever, when they're of course not.
01:55:27.300 | Eventually she got the surgery, thank God.
01:55:28.800 | But I know people in my life whose parents passed away
01:55:33.800 | because they didn't have the surgery.
01:55:36.720 | They instead took the alternative option.
01:55:40.420 | I know like, I don't wanna go into specifics
01:55:42.660 | 'cause I don't wanna mention their specific like case,
01:55:44.780 | but their family member went to Mexico
01:55:47.980 | for some alternate treatment, health treatment,
01:55:49.980 | instead of getting an easy surgery and they died.
01:55:52.620 | And so it's like, I realized,
01:55:55.460 | where is the outrage about this?
01:55:56.660 | Where's the, who covers this stuff?
01:55:58.940 | And I realized, well, not many people do.
01:56:00.540 | Then I went to college,
01:56:01.660 | I was getting a chemical engineering degree
01:56:03.480 | and all my friends were like telling me,
01:56:05.280 | hey, you should come to this meeting.
01:56:06.980 | We don't need this.
01:56:07.900 | Like you're doing this engineering stuff, that's great.
01:56:09.900 | You're gonna make like 70K a year,
01:56:11.940 | but don't you wanna get like rich now?
01:56:13.180 | Like why wait till you're 60 years old to retire?
01:56:15.460 | Like you can be rich now, Lex.
01:56:17.100 | So I'd go show up to a hotel and there'd be an MLM,
01:56:21.780 | like multi-level marketing pitch for Amway
01:56:25.940 | or whatever it was that day.
01:56:27.620 | And I was once again fascinated.
01:56:30.700 | I didn't know what I was looking at, but I was like,
01:56:32.380 | what is this weird game we're all playing
01:56:34.420 | where we sit in this room,
01:56:35.920 | we're looking at the speaker who says he's so successful,
01:56:38.580 | right, but why is he taking a Friday or a Saturday
01:56:41.780 | to do this pitch at night?
01:56:43.620 | And they're telling me I'm gonna be financially free,
01:56:45.740 | but they're working on their Saturday and Sunday.
01:56:47.660 | And so it's like, how financially free are they?
01:56:49.660 | So I was just like confused.
01:56:50.620 | I was like, none of my friends were rich.
01:56:52.300 | They all said they were gonna be rich.
01:56:53.460 | No one ever seemed to get rich.
01:56:55.180 | And so I was sort of baffled by what I was looking at.
01:56:58.340 | Later, I graduate.
01:56:59.900 | I had no interest in doing engineering,
01:57:02.260 | which we can kind of get into,
01:57:03.400 | but I wanna do something in media.
01:57:05.200 | And I started covering a variety of topics,
01:57:08.340 | but eventually I sort of revisited this interest in fraud.
01:57:12.140 | And I started talking about these kind of
01:57:14.340 | get rich quick grifters that were online,
01:57:16.740 | sort of the Tai Lopez variety, 67 steps or whatever,
01:57:20.460 | like five steps to get rich, five coins to 5 million,
01:57:23.500 | these get rich quick schemes
01:57:25.100 | that a lot of people were interested in.
01:57:27.480 | No one seemed to get rich once again,
01:57:28.980 | except for the people at the tippity tippity top
01:57:30.860 | selling the get rich quick thing.
01:57:32.900 | And I was fascinated by the structure of it.
01:57:34.820 | I was like, does nobody see what this is?
01:57:37.140 | Does nobody get it?
01:57:38.300 | So I started making a series of videos on that.
01:57:40.860 | And the response was like palpable.
01:57:42.340 | I mean, it was like, I've made a lot of stuff before that.
01:57:44.780 | I'd made stuff that got a million views.
01:57:47.140 | I'd had like some marginal success,
01:57:49.380 | but the response of like the emails that came in,
01:57:53.820 | I could tell this work,
01:57:55.520 | even though it had far less views at the time,
01:57:58.880 | was having a different level of impact.
01:58:00.940 | And that's what I was really interested in.
01:58:02.740 | One of my problems with engineering was from my standpoint,
01:58:06.980 | I did chemical engineering at Texas A&M.
01:58:09.700 | And I was like, is my future
01:58:11.100 | just gonna be in a chemical plant,
01:58:12.520 | improving some process by 2%?
01:58:14.580 | And that's like my gift to the world.
01:58:16.540 | I didn't see the hard impact.
01:58:19.420 | And that maybe that's a lack of imagination
01:58:21.500 | 'cause chemicals matter,
01:58:22.540 | but I wanted to see an impact in the world.
01:58:26.580 | And so when I did started doing this fraud stuff,
01:58:29.260 | exposing fraud, it clicked in my brain.
01:58:30.780 | I was like, whoa, this is kind of doing what I want to do.
01:58:34.700 | And so I started posting videos.
01:58:37.580 | At first it really focused on like get rich quick scheme,
01:58:40.940 | grifty advertising, which I think is super predatory
01:58:44.340 | and we can go into why,
01:58:46.020 | but it eventually graduated to crypto
01:58:48.540 | and it snowballed I guess,
01:58:50.820 | 'cause now we're talking to Sam Bankman-Fried.
01:58:53.740 | - Okay, grifty advertising.
01:58:56.180 | So actually there's a step back.
01:58:57.780 | What is a multi-level marketing scheme?
01:59:02.740 | 'Cause I've experienced a similar thing.
01:59:05.020 | I remember I worked at,
01:59:07.140 | I sold women's shoes at Sears in a bank.
01:59:11.060 | And I just remember like some kind of,
01:59:13.420 | I forgot the name of the company,
01:59:14.620 | but you'd like, you can sell like knives or whatever.
01:59:16.620 | Like that's a common.
01:59:17.700 | - Oh, I know what you're talking about.
01:59:21.220 | - I'm sure there's a lot of things like this, right?
01:59:23.100 | And I remember feeling a similar thing, like why?
01:59:25.680 | To me, what was fascinating about it is like, wow,
01:59:30.500 | like human civilization is an interesting,
01:59:33.740 | like a pyramid scheme.
01:59:35.020 | Like I didn't, maybe didn't know the words for that,
01:59:37.680 | but it's like, it's cool.
01:59:38.740 | Like you can like get in a room
01:59:40.540 | and you convince each other of ideas
01:59:41.980 | and you have these ambitions.
01:59:43.220 | There's a general desire, especially when you're young
01:59:46.780 | to like, like life sucks right now.
01:59:49.620 | Nobody respects you.
01:59:50.620 | You have no money and you wanna do good.
01:59:53.060 | And you wanna be sold this dream of like,
01:59:55.900 | if I work hard enough at this weird thing,
01:59:58.540 | I can shortcut and get to the very top.
02:00:01.220 | I don't know what that is.
02:00:02.060 | And I also in me felt that like life really sucks
02:00:05.900 | and I could do good.
02:00:07.220 | And I'm lucky I found a way to do good.
02:00:10.360 | I'm like, and I don't know, you connect with that somehow.
02:00:13.480 | I think there's like this weird fire inside people
02:00:16.080 | to like, to make better of themselves, right?
02:00:18.560 | I don't know if it's just an American thing or if it,
02:00:21.000 | but anyway, that was fascinating to me
02:00:22.300 | from a human nature perspective.
02:00:23.700 | - Grifters play on this though, right?
02:00:25.180 | Like this is, so the best salesmen play on true narratives
02:00:30.180 | that you already believe.
02:00:32.420 | So the true narrative is, you know, life is unfair.
02:00:36.040 | It is tough.
02:00:37.080 | The American dream as described by go to a job,
02:00:42.180 | work at the same company for 40 years and then retire
02:00:45.460 | to a safety net that you're positive is gonna be there.
02:00:47.980 | That is largely dead.
02:00:49.380 | And so they like play on those fears and those problems
02:00:52.780 | to then sell you a pill, a solution, a thing.
02:00:55.860 | And the problem is that the solution is usually worse
02:01:00.080 | than even the problem they sketched out.
02:01:01.900 | Like you will do better most of the time
02:01:03.980 | by going with the regular company
02:01:05.660 | than you will by going with these goofy
02:01:07.060 | multi-level marketing.
02:01:07.900 | But let me answer your question.
02:01:09.060 | What is a multi-level marketing?
02:01:10.820 | So there's a criticism, first of all, that,
02:01:13.980 | well, let me get to what it is in theory.
02:01:15.900 | Like at its most ideal, multi-level marketing
02:01:19.100 | is where you have a product that you're selling.
02:01:21.360 | And one of the ways you help sell it
02:01:23.320 | is by rather than going through traditional marketing,
02:01:25.900 | like where you go and you put out print advertising,
02:01:28.980 | it's like sort of a social network of marketing.
02:01:31.800 | I sell to you and then actually Lex,
02:01:34.320 | not only can I sell to you,
02:01:35.380 | you can then go sell this product
02:01:36.520 | and you'll make money selling it.
02:01:37.880 | And you know what, to incentivize me to get other salesmen,
02:01:40.400 | 'cause when I get another salesman,
02:01:41.400 | I'm actually giving myself competition.
02:01:43.140 | So that's bad.
02:01:44.320 | So to incentivize me to do that,
02:01:45.940 | they'll pay me part of what you make, right?
02:01:48.600 | And then you go out and you go,
02:01:50.000 | okay, well, I can sell this product.
02:01:51.360 | I also can get new salesmen to like sell for me
02:01:53.960 | and I'm gonna make money from you, whatever.
02:01:55.720 | So it goes down the line of you create multiple levels
02:01:58.760 | where you can profit from their marketing, right?
02:02:01.820 | The problem with this system
02:02:03.880 | is that however well-intentioned it is,
02:02:06.680 | is that usually the emphasis of that selling
02:02:09.980 | and making money ends up not being about the product at all
02:02:14.040 | and ends up entirely being about recruiting new people
02:02:16.800 | to recruit new people to recruit new people.
02:02:18.720 | That's the real way to make money in multi-level marketing.
02:02:21.200 | This is where the very true criticism
02:02:23.840 | of most multi-level marketing, if not all,
02:02:26.480 | are pyramid schemes in structure.
02:02:28.520 | Because what a pyramid scheme is,
02:02:30.160 | is it's all about I put in $500
02:02:33.200 | and I recruit two people to put in $500
02:02:34.880 | and that comes up to me.
02:02:36.000 | And they get two people to put in $500 and it goes to them.
02:02:39.000 | And the reason it's a flawed business model
02:02:41.160 | is in order for it to work and everyone to make money,
02:02:43.360 | you'd have to assume an infinite human race.
02:02:45.920 | And so that's not the case.
02:02:47.680 | Most people end up getting screwed in multi-level marketing
02:02:50.000 | and in pyramid schemes, that's what that is.
02:02:53.040 | That's that thing.
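Coffeezilla's "infinite human race" point can be checked with a few lines of arithmetic. This is an illustrative sketch, not anything from the episode beyond his two-recruits-each example; the world-population figure is a rough assumption. Counting the recruits each level of a doubling pyramid needs shows the scheme exhausts the planet after only about 32 levels:

```python
# Illustrative sketch: each participant must recruit 2 new people
# for the levels above to profit. Level k of the pyramid therefore
# needs branching**k fresh recruits.

WORLD_POPULATION = 8_000_000_000  # rough 2022 figure (assumption)

def pyramid_levels(branching: int = 2) -> int:
    """Return the level at which cumulative participants exceed Earth's population."""
    total, level = 1, 0  # one founder at level 0
    while total < WORLD_POPULATION:
        level += 1
        total += branching ** level  # everyone at this level is newly recruited
    return level

print(pyramid_levels())  # with 2 recruits each, level 32 already needs all of humanity
```

So even before most participants lose their $500, the model is mathematically guaranteed to run out of new recruits.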
02:02:54.160 | - And the quality of the product,
02:02:55.520 | it doesn't necessarily matter what you're selling.
02:02:57.760 | - To people who are financially incentivized
02:02:59.440 | to buy this thing.
02:03:01.400 | - Yeah, and so you're selling the dream of becoming rich
02:03:06.400 | to the people down in the pyramid.
02:03:08.080 | - That's the real product of multi-level marketing,
02:03:10.160 | unfortunately.
02:03:11.040 | And so you look at the statistics of these companies
02:03:15.080 | and although they'll make it seem like it's so easy
02:03:17.520 | to be the top 0.1% who's making all this money,
02:03:21.400 | the statistics are that 97% make less than a minimum wage
02:03:26.280 | doing this.
02:03:27.120 | They spend an enormous amount of time.
02:03:28.680 | And just what's so cruel about it is
02:03:30.460 | that's not advertised up front.
02:03:32.520 | I mean, it's like, if I go work at McDonald's,
02:03:34.200 | I know what I'm getting.
02:03:35.400 | If I go work at Amway, I have visions of,
02:03:37.920 | they've sold me visions of beaches and whatever
02:03:40.520 | and more than likely I'm losing money.
02:03:42.680 | So better than 50% of people lose money,
02:03:44.920 | but 97% of people make less than minimum wage.
02:03:47.080 | It's like, it's such a bad business
02:03:50.720 | for the vast majority of people who join it
02:03:53.200 | and the people at the very top
02:03:54.920 | who are lying to the people at the bottom
02:03:56.600 | saying they all can do it when they can't
02:03:58.800 | are making all the money.
02:03:59.640 | So it's, yeah, it's really messed up.
02:04:01.920 | - The interesting thing I've noticed,
02:04:03.920 | maybe myself too, 'cause I've participated
02:04:05.840 | in the knife selling for like a short amount of time.
02:04:08.160 | That's probably the experience that most people have.
02:04:09.640 | - That's Cutco, is it?
02:04:11.360 | - I don't know.
02:04:12.180 | I don't think it was Cutco.
02:04:13.560 | - I know what you're talking about.
02:04:14.520 | It's killing me.
02:04:15.360 | - There's several variations of it.
02:04:17.160 | I think I was part of a less popular one.
02:04:22.160 | It doesn't, I keep wanting to say it was called Vector.
02:04:25.680 | - Yeah, yes.
02:04:27.920 | Something with a V was what I was gonna say.
02:04:30.080 | It might be Vector.
02:04:31.680 | Yeah, I get what you're saying though.
02:04:32.720 | It's a multi-level marketing knife selling.
02:04:34.360 | - But the thing is, I just remember
02:04:37.160 | my own small experience with it
02:04:39.360 | is I was too, I was embarrassed at myself
02:04:41.480 | for having like participated.
02:04:44.520 | I think there's an embarrassment.
02:04:45.960 | That's why people down in the pyramid
02:04:48.560 | don't like speak about it, right?
02:04:51.120 | I'm trying to understand the aspects
02:04:54.960 | of human nature that facilitate this.
02:04:57.080 | - Well, this is one of the problems with fraud
02:04:58.680 | is there's a tremendous embarrassment to being had.
02:05:01.920 | Also, if you buy, so slightly different human nature
02:05:05.560 | is that if you buy into a get-rich-quick scheme
02:05:08.320 | and then it doesn't deliver,
02:05:09.840 | you're more likely to blame yourself
02:05:11.960 | than blame the product for not actually working.
02:05:14.000 | You go, well, there must be something flawed with me.
02:05:15.680 | - That's true.
02:05:16.520 | - And they constantly reinforce this.
02:05:17.600 | They go, well, it's all about your hard work.
02:05:19.480 | The system works.
02:05:20.320 | Look at me, I did it.
02:05:21.520 | So if you're failing,
02:05:22.600 | it must be some indictment of your character
02:05:25.160 | and you have to always double down.
02:05:26.720 | You have to double, the system can't be flawed.
02:05:28.720 | You must be flawed.
02:05:30.120 | And so, yeah, it's a really messed up system.
02:05:32.600 | It really preys on people's psychology
02:05:35.120 | to keep them in this loop.
02:05:37.240 | And that's why in some ways these things are so viral,
02:05:39.240 | even though they don't actually get most people
02:05:41.840 | a significant amount of wealth
02:05:43.200 | and they cost most people money.
02:05:44.560 | So it's very unfortunate.
02:05:46.200 | - Most people do have the dream of becoming rich.
02:05:49.040 | Most young people.
02:05:50.600 | - Right, and the thing is,
02:05:51.960 | is that everyone knows in business, what do you sell?
02:05:55.320 | You sell solutions to problems.
02:05:57.200 | So if so many young people wanna get rich,
02:05:59.720 | the product is that pitch.
02:06:01.760 | It's you sell them the dream.
02:06:04.080 | Why this gets so grifty and so cruel and predatory
02:06:08.520 | is because there is no easy solution to this.
02:06:11.080 | There is no solution that people are gonna buy
02:06:14.120 | because the real solution people want is no work,
02:06:16.600 | no education, no skills required, no money upfront.
02:06:21.240 | And people will pay any price for that magic pill
02:06:24.800 | and people are happy to sell that magic pill.
02:06:27.520 | And I think those people are very cruel
02:06:30.120 | and I think deserve to get exposed for it.
02:06:33.200 | - So somebody that's been criticized
02:06:36.120 | for MLM type schemes is Andrew Tate.
02:06:40.680 | Somebody that I'm very likely to talk with.
02:06:43.760 | So for people who have been telling me
02:06:46.800 | that I'm too afraid to talk to Andrew Tate,
02:06:48.600 | first of all, let me just say,
02:06:50.480 | I'm not afraid to talk to anyone.
02:06:54.560 | It's just that certain people require preparation
02:06:57.320 | and you have to allocate your life in such a way
02:07:01.860 | that you wanna prepare properly for them.
02:07:04.040 | And so you have to kinda think who you want to prepare for.
02:07:08.240 | 'Cause I have other folks that have more power
02:07:11.560 | than this particular figure that I'm preparing for.
02:07:14.280 | So you have to make sure you allocate your time wisely.
02:07:17.840 | But I do think he's a very influential person
02:07:20.440 | that raises questions of what it means
02:07:23.320 | to be a man in the 21st century.
02:07:25.360 | And that's a very important and interesting question
02:07:30.120 | because young people look up to philosophers,
02:07:33.320 | to influencers about what it means to be a man.
02:07:36.160 | They look up to Jordan Peterson,
02:07:37.480 | they look up to Andrew Tate,
02:07:38.600 | they look up to others, to other figures.
02:07:40.880 | And I think it's important to talk about that,
02:07:43.880 | to think what does it mean to be a good man in this society?
02:07:47.920 | Of course, in the other gender, there's the same question,
02:07:50.400 | what does it mean to be a good woman?
02:07:52.080 | I think obviously the bigger question
02:07:53.680 | is what does it mean to be a good person?
02:07:56.560 | But-- - It's a for-men podcast.
02:07:58.000 | - I swear to God.
02:07:59.860 | (laughs)
02:08:03.600 | Now we're into, okay.
02:08:06.640 | So that said, one aspect of the criticism
02:08:10.400 | that Andrew's received is not just on the misogyny,
02:08:14.720 | is on the MLM aspect of the multi-level marketing schemes.
02:08:19.720 | So is there some truth to that?
02:08:22.080 | Is there some fraudulent aspect to that?
02:08:24.200 | - So yeah, I definitely think so.
02:08:26.280 | I mean, that's the main reason I criticized him.
02:08:28.680 | So let's back up.
02:08:32.680 | There's a few clarifications I need to make.
02:08:34.600 | What Andrew Tate is selling is not multi-level marketing,
02:08:37.760 | although he is selling the dream.
02:08:39.280 | He's selling an affiliate marketing thing,
02:08:40.880 | which is slightly different.
02:08:42.540 | So in multi-level marketing, if I sell to you,
02:08:45.720 | and then you go sell to two other people,
02:08:47.220 | I make money from those two other people down the chain,
02:08:49.280 | multi-level.
02:08:50.400 | Affiliate marketing is sort of like one level.
02:08:52.280 | I only make money.
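The payout difference he's describing can be sketched with made-up commission rates — the 10% direct cut and 5% override below are hypothetical numbers for illustration, not figures from the episode or from any real program:

```python
# Contrast of the two structures: an MLM recruiter earns an override
# on every level beneath them; an affiliate earns only on direct sales.

def mlm_payout(direct_sales, downline_sales_by_level, cut=0.10, override=0.05):
    """Recruiter earns a cut of their own sales plus an override per downline level."""
    earnings = direct_sales * cut
    for level_sales in downline_sales_by_level:
        earnings += level_sales * override  # money flows up from every level
    return earnings

def affiliate_payout(direct_sales, cut=0.10):
    """Affiliate earns only on sales they personally referred -- one level deep."""
    return direct_sales * cut

# e.g. $1,000 of your own sales, two downline levels selling $5,000 and $20,000
print(mlm_payout(1_000, [5_000, 20_000]))  # 100 + 250 + 1000 = 1350.0
print(affiliate_payout(1_000))             # 100.0
```

The multi-level version is why recruiting, not the product, becomes the real business.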
02:08:54.040 | So Andrew Tate had this affiliate program
02:08:56.000 | where if you sold Hustler's University to somebody else,
02:09:00.180 | which sounds like something people would,
02:09:01.480 | boomers would put on Facebook in like 2010,
02:09:03.600 | like I went to Hustler's University.
02:09:05.800 | School of Hard Knocks.
02:09:06.620 | - By the way, you were a member of Hustler's University.
02:09:09.800 | - Yeah, I joined.
02:09:10.640 | I joined, I became a Hustler.
02:09:11.560 | That's in large part due to why I'm so successful
02:09:14.840 | is because of my Hustler University membership.
02:09:17.360 | I'm just kidding.
02:09:18.560 | But so it's an affiliate program.
02:09:20.600 | So you'd sell, like I sell to you this $50 course,
02:09:24.080 | and I make like $5,
02:09:25.680 | and Andrew Tate in perpetuity makes $50 a month off of you.
02:09:29.160 | Okay.
02:09:30.320 | What does this course actually sell, right?
02:09:32.440 | 'Cause ultimately he's selling the dream.
02:09:34.440 | He's selling, hey, the matrix has enslaved you.
02:09:37.320 | He's really gone down this like neo rabbit hole.
02:09:40.600 | So the matrix has enslaved you.
02:09:42.260 | Your life is controlled by these people
02:09:44.860 | who wanna keep you like kind of weak, you know, lazy,
02:09:47.960 | whatever, you need to break out,
02:09:49.480 | and you need to achieve the new dream,
02:09:52.400 | which is sort of like hustling your way to the top.
02:09:54.640 | You don't need the antiquated systems of school.
02:09:57.520 | You can just pay me $50 a month,
02:09:58.920 | and I will teach you everything.
02:10:00.000 | Okay.
02:10:00.840 | So what do you actually get?
02:10:01.760 | Well, and why is it a scam?
02:10:03.120 | So you actually,
02:10:04.240 | I think it's just a scam in terms of like value
02:10:06.320 | and like you're selling based on
02:10:07.880 | these completely unrealistic things.
02:10:09.280 | He's like, let's get rich.
02:10:10.480 | Okay.
02:10:11.440 | You get a series of Discord rooms.
02:10:13.680 | So you, do you know what Discord,
02:10:14.920 | most people know what Discord is.
02:10:16.000 | It's like a bunch of, you know, chat rooms basically, right?
02:10:18.680 | - So it's like AOL or is it like-
02:10:20.200 | - Yeah, yeah, yeah, right, right.
02:10:21.160 | Well, I'm talking to the guy who quit Emacs,
02:10:23.760 | so I don't know.
02:10:24.920 | There's Discord servers,
02:10:25.760 | and in these like, there's like seven different rooms
02:10:30.000 | you can go in, or there's several rooms,
02:10:31.620 | and each one is like a different field of making money.
02:10:34.320 | - Yes.
02:10:35.160 | - E-commerce, trading, cryptocurrency,
02:10:39.320 | I think fulfillment by Amazon,
02:10:45.080 | like copywriting.
02:10:46.520 | Okay.
02:10:48.320 | So I went to all of these, I checked them out.
02:10:50.000 | I checked all their money-making tools.
02:10:52.140 | The first funny thing is that Andrew Tate
02:10:53.760 | is nowhere to be found.
02:10:54.580 | The supposed successful guy that you like bought into
02:10:56.880 | is nowhere to be found.
02:10:57.880 | It's these professors that you have now,
02:11:01.200 | he is hired and said, "These guys are super qualified."
02:11:04.440 | So like looked up some of what some of these guys have done,
02:11:07.920 | and some of them have launched like scammy crypto coins.
02:11:11.960 | The cryptocurrency professor was like shilling
02:11:14.720 | a bunch of coins that did bad,
02:11:15.920 | and then like deleted the tweets.
02:11:17.160 | I mean, just completely exactly what you'd expect.
02:11:20.760 | Behind the paywall, it's nothing of substance.
02:11:23.520 | You're not gonna learn to get rich by escaping the matrix
02:11:26.560 | and going to work for Jeff Bezos.
02:11:28.040 | Fulfillment by Amazon is not escaping the matrix, right?
02:11:31.240 | Like that's not the way to hustle to the top.
02:11:33.240 | It's literally a field of making money
02:11:35.660 | that everyone in the world has access to.
02:11:38.620 | If you wanna differentiate yourself and make money,
02:11:40.600 | the first thing you realize is going into skill sets
02:11:43.240 | that literally anyone with an internet account can do
02:11:45.720 | is a bad way to do that,
02:11:47.200 | because you have to have some differentiating factor
02:11:49.820 | to add value.
02:11:51.000 | So it's just such a obvious and complete scam,
02:11:56.960 | because there is no value to this like so-called education.
02:12:00.840 | The professors are crap.
02:12:03.320 | The advice, they're like hiding
02:12:04.920 | some of the bad things they've done,
02:12:06.120 | and Andrew Tate's nowhere to be found.
02:12:07.400 | Ultimately, that's why everyone joins.
02:12:09.320 | What he's done is very interesting,
02:12:10.760 | because, and I'll give him credit in his marketing,
02:12:14.080 | he's been very savvy to like make the reasons
02:12:16.400 | you admire him, not the thing he's sort of selling,
02:12:20.120 | which is weird.
02:12:21.160 | Like he's selling get rich quick,
02:12:23.660 | which seems like it relates to his persona,
02:12:27.260 | but is actually very orthogonal to it.
02:12:29.460 | His persona is like the tell it how it is,
02:12:32.500 | like tall, buff, rich guy.
02:12:36.140 | It's like actually his persona that you're buying into,
02:12:39.420 | and then he's selling you this thing to the side,
02:12:41.700 | which when people get in there
02:12:42.880 | and they're not delivered on the product,
02:12:44.460 | he still is those things that you first thought he was.
02:12:46.700 | So it's like, I think to some extent,
02:12:48.460 | he's made a lot of money by making the thing
02:12:52.420 | that he makes money on,
02:12:53.780 | not the thing he gets so much pushback online for,
02:12:56.940 | and what he's also loved for.
02:12:58.400 | So people will push back for his misogyny,
02:13:00.340 | but the real way he's making money is just like basic
02:13:02.740 | get rich quick schemes that are super obvious to spot,
02:13:05.540 | but everyone's distracted by like,
02:13:07.340 | oh, he said some crazy stuff about women,
02:13:09.380 | or all these various other scandals he's gotten himself in.
02:13:12.980 | - To get more and more and more attention.
02:13:14.860 | So with the persona,
02:13:16.980 | is the Hustler University still operating?
02:13:20.380 | - I think they've rebranded it.
02:13:21.540 | I'm part of their pitch now.
02:13:23.420 | I'm like, they put me on as like,
02:13:25.080 | I mean, as like, the Matrix is trying to take us out, Lex.
02:13:28.340 | And then it's me saying like,
02:13:29.860 | you know, they put me in like saying,
02:13:31.060 | I'm part of the Matrix.
02:13:32.200 | They put me in saying, oh, this guy sucks.
02:13:35.180 | You know, I joined, it sucks.
02:13:36.500 | And so they'll play that and they do like a bass drop
02:13:38.720 | and it's like, you know,
02:13:40.740 | don't listen to people like this, da da da da.
02:13:42.540 | I mean, it's, I'm basically like a-
02:13:44.020 | - Oh, so you've been co-opted by the Matrix to attack.
02:13:47.020 | - Yes.
02:13:47.860 | - You're an insider threat that infiltrated
02:13:49.820 | and now is being used by the Matrix to attack him.
02:13:52.700 | Well, to-
02:13:53.540 | - Everyone who criticizes him as part of the Matrix,
02:13:56.260 | he won't say who the Matrix is.
02:13:57.820 | It's just the shadowy cabal of rich, powerful people.
02:14:02.300 | It's just like the easy narrative
02:14:03.660 | for people who are disaffected
02:14:05.300 | and who feel cheated by the system.
02:14:08.180 | You just collectivize that system
02:14:09.600 | and you make it the bad guy and you go, look, look,
02:14:11.380 | those guys, those guys who have been cheating you,
02:14:13.740 | they're the bad guys.
02:14:14.780 | They want me shut up.
02:14:18.120 | And then now the person that, the people who harmed you,
02:14:21.340 | they want this guy shut up, you're gonna listen to him.
02:14:22.860 | So that's like, it's like the most basic
02:14:25.660 | psychological manipulation
02:14:27.340 | that everyone seems to constantly fall for.
02:14:29.220 | It's really trivial and stupid, but-
02:14:32.940 | - Can you steel man the case for Hustler's University
02:14:37.380 | where it's actually giving young people confidence,
02:14:42.380 | teaching invaluable lessons about like,
02:14:45.700 | actually incentivizing them to do something
02:14:47.920 | like Fulfillment by Amazon, the basic thing, to try it,
02:14:51.260 | to learn about it, to fail, or maybe see like,
02:14:54.560 | to try, to give a catalyst and incentive to try these things.
02:14:59.040 | - As much as it pains me, I will try to give a succinct,
02:15:03.180 | maybe steel man of it as best as I can,
02:15:07.440 | thinking that it's such a grift.
02:15:09.100 | But I think what you would say is that some people,
02:15:14.600 | in order to make a change in their life,
02:15:17.660 | need someone who they can look up to.
02:15:21.680 | And men don't have a lot of like strong role models,
02:15:25.460 | like big male presences in their life
02:15:29.520 | who can serve as a proper example.
02:15:32.000 | So the most charitable interpretation would be,
02:15:34.840 | Andrew Tate would encourage you
02:15:38.080 | to go reach for the stars, I guess.
02:15:40.780 | My problem is I have a deep, like,
02:15:43.640 | I have a deep issue with the like,
02:15:45.800 | lust and greed that centers all these things.
02:15:50.200 | It's like this glorification of wealth equals status,
02:15:55.200 | wealth equals good person,
02:15:57.420 | wealth and Bugatti equals you are meaningful, you matter.
02:16:02.280 | And like the dark underlying thing
02:16:05.620 | is that none of that, none of that matters.
02:16:09.640 | Like, it matters that you make a decent living,
02:16:12.260 | but past just like that,
02:16:14.720 | I think the like lust for more stuff
02:16:17.280 | and the idolization of these people
02:16:19.520 | that is just like opulence is a net bad.
02:16:23.120 | So that's like, my steel man has to stop there
02:16:25.240 | because I really disagree with like the values
02:16:27.640 | that are pushed by people like that.
02:16:30.840 | - Yeah, no, I agree.
02:16:31.720 | That's the thing that should be criticized.
02:16:33.320 | I shouldn't say it doesn't matter.
02:16:35.160 | I think it's just like an amazing meal
02:16:38.480 | at a great restaurant, it matters.
02:16:40.700 | Money and Bugattis for many people make life beautiful.
02:16:45.700 | Like those are all components,
02:16:47.120 | but I think money isn't,
02:16:49.500 | you can also enjoy a beautiful life on a hike in nature.
02:16:52.860 | There's a lot of ways to enjoy life.
02:16:55.940 | And one of the deepest has been tested through time
02:16:58.660 | is the intimacy, close connection,
02:17:00.420 | friendship with other people or with a loved one.
02:17:03.260 | They don't talk about like love.
02:17:05.940 | - Yeah, I--
02:17:06.780 | - What it means to be deeply connected with another person.
02:17:09.100 | It's just like get women, get money,
02:17:12.000 | all those kinds of things.
02:17:13.300 | But that, I think, I don't wanna dismiss that
02:17:15.980 | 'cause there's like value in that, there's fun in that.
02:17:18.480 | I think the positive,
02:17:20.660 | I haven't investigated Hustlers University,
02:17:23.540 | but the positive I see in general is young people
02:17:28.140 | don't get much respect from society.
02:17:32.140 | I know it's easy to call it the matrix and so on,
02:17:34.060 | but there's a kind of dismissal of that.
02:17:39.060 | As human beings, as capable of contributing,
02:17:42.140 | of doing anything special.
02:17:43.980 | And then here's, you have young people
02:17:45.920 | who are sitting there broke, with big dreams.
02:17:50.120 | They need the mentors, they need somebody to inspire them.
02:17:52.480 | So like, I would criticize the flawed nature of the message.
02:17:57.480 | But also it's just like, you have to realize
02:17:59.640 | like there needs to be institutions or people
02:18:03.900 | or influencers that help like inspire, right?
02:18:07.300 | The problem is though, is the people who are pitching
02:18:11.100 | unrealistic versions of that
02:18:13.580 | are getting a lot of attention.
02:18:15.640 | Whereas like, there's so many great free courses
02:18:18.460 | where you can learn everything and more
02:18:20.500 | about fulfillment by Amazon or about copywriting
02:18:25.500 | or all these different things.
02:18:27.620 | That I think like so often the air is just,
02:18:29.700 | the oxygen is sucked up by all the grifters
02:18:32.500 | who promise everything.
02:18:33.340 | It's back to what we said about vaporware.
02:18:35.220 | This is one of the reasons that like educational products
02:18:40.220 | can so often be co-opted by grifters,
02:18:43.280 | is vaporware is very hard to distinguish
02:18:46.300 | because prompt, like the feedback loop on education
02:18:50.880 | is not clear.
02:18:52.080 | It's not obvious immediately.
02:18:53.900 | So I can sell you a book and I can say,
02:18:56.100 | this is gonna change your life if you apply it.
02:18:58.660 | If you don't, if your life doesn't change,
02:19:00.500 | I just say, well, you didn't apply it, right?
02:19:01.880 | Like it's, there's this weird relationship.
02:19:03.460 | It's not clear the value.
02:19:05.340 | It's not so easy to like quantify education.
02:19:07.940 | So that gets co-opted by people who make all the promises.
02:19:12.240 | They get a ton of attention, a ton of money.
02:19:14.340 | And then those people are often left confused
02:19:18.340 | and like kind of disillusioned, maybe thinking,
02:19:21.660 | well, this didn't work in one year.
02:19:23.460 | So it's not gonna work at all.
02:19:25.820 | And so I think, yeah, there are problems there.
02:19:28.700 | There's certainly a need for like male role models.
02:19:31.660 | There's certainly a need for somebody kind of
02:19:33.780 | to speak to a younger generation.
02:19:36.860 | I just think that person shouldn't be maybe Andrew Tate,
02:19:41.860 | like personally.
02:19:42.820 | - Yeah, so you have to criticize
02:19:44.100 | those particular individuals.
02:19:45.660 | I also--
02:19:47.420 | - And yeah, I think like the Bugatti aspiration
02:19:49.740 | is so stupid.
02:19:51.180 | Like it's like, it's so--
02:19:52.700 | - Let me steel man.
02:19:53.540 | So I'm a person who doesn't care about money,
02:19:55.700 | don't like money.
02:19:56.680 | The women maybe appreciate the sort of the beauty
02:20:01.140 | of the other sex, but like, yeah, cars in particular
02:20:06.140 | is like, really, is this really the manifestation
02:20:09.140 | of all the highest accomplishments
02:20:11.180 | a human being can have in life?
02:20:12.240 | Yes, I can criticize all that.
02:20:14.060 | But to steal man, that case is a young person,
02:20:19.060 | a dreamer has ambition.
02:20:23.580 | And I often find that education throughout my whole life,
02:20:27.480 | there's been people who love me,
02:20:29.060 | teachers who saw ambition in me and tried to reason with me
02:20:34.060 | that my ambition is not justified.
02:20:37.020 | Looking at the data, look kid, you're not that special.
02:20:41.060 | Look at the data.
02:20:42.460 | You're not, and they want you to like,
02:20:46.200 | not dream, essentially.
02:20:48.180 | And then again, I look at the data,
02:20:50.500 | which is all the people, I just talked to Roger Gracie.
02:20:54.420 | This Roger Gracie is just a person,
02:20:57.140 | widely acknowledged as probably the greatest of all time,
02:21:00.540 | dominated everybody.
02:21:02.060 | But for the longest time, he sucked.
02:21:04.440 | And he was surrounded by people that kind of,
02:21:07.900 | you know, don't necessarily believe in him.
02:21:10.780 | So he had to believe in himself.
02:21:13.140 | It's nice to have somebody, I just,
02:21:15.620 | as older I get and I've seen it,
02:21:17.780 | it's so powerful to have somebody who comes to you,
02:21:20.860 | an older person, whether it's real or not,
02:21:24.060 | that says, you got this kid, I believe in you.
02:21:26.860 | - Sure. - Yeah.
02:21:28.580 | - Yeah, it does.
02:21:29.420 | I mean, I think, so I think dreaming
02:21:34.340 | is actually really important.
02:21:36.680 | I'm more protective over,
02:21:40.540 | people co-opt those dreams for money.
02:21:42.540 | - Yes, yes, absolutely. - And like,
02:21:43.740 | I do think it matters so much
02:21:45.580 | that we encourage people to take risks.
02:21:48.260 | It's one of the great things about America,
02:21:50.620 | is it lionizes like sort of people who have taken risks
02:21:53.100 | and won, but I think it's just a weird vapid thing
02:21:58.100 | when like the reason you do all of it
02:22:02.420 | is for this thing you can get out at the end of the day.
02:22:04.980 | When we all know, and you, like,
02:22:06.620 | well, you've just heard a million interviews,
02:22:07.860 | like nobody ever is, gets through Bugatti and goes,
02:22:10.500 | this now completely fulfills me.
02:22:12.260 | Everyone knows the beauty and the like fulfillment
02:22:15.580 | actually comes from becoming obsessed
02:22:18.220 | about what you're doing for its own sake,
02:22:19.900 | sort of the journey, the beauty of that thing.
02:22:23.540 | And I think money's just this thing we have to deal with
02:22:28.020 | to be able to do cool stuff.
02:22:30.060 | Like I acknowledge that, you know,
02:22:32.860 | you need money to build a $10 million studio,
02:22:35.300 | like you gotta get the cameras, you gotta get the lights
02:22:37.100 | and I'm very blessed to be able to have gotten that.
02:22:39.700 | But past a certain point, like I think that is really
02:22:42.800 | the function of money is to just do cool stuff.
02:22:46.040 | But ultimately, if you can't fall in love with the process
02:22:50.040 | and like the craft itself, you will be left very unhappy
02:22:53.680 | at the end of it.
02:22:55.320 | And so to start people off on that journey
02:22:58.440 | by pointing to the shiny object and going like,
02:23:00.600 | that's what you should care about,
02:23:02.400 | seems to me so backwards.
02:23:03.960 | We should learn from the actual people who have done it
02:23:06.160 | and said, that shiny thing did nothing for me.
02:23:09.320 | Learn to love the journey and like,
02:23:10.840 | that's the thing we pitch people,
02:23:12.680 | as unsexy as that might be.
02:23:14.880 | - Yeah, absolutely, that's awesome.
02:23:16.400 | And the same applies to like the Red Pill community
02:23:18.600 | that talks about dating and so on.
02:23:20.400 | That there is, it's not just about the number of hot women
02:23:25.400 | you go to bed with, it's also about intimacy and love
02:23:29.120 | and all those kinds of things.
02:23:30.060 | And so like, there's components to a fulfilling life
02:23:35.060 | that is important to sort of educate young people about.
02:23:40.920 | - Totally.
02:23:42.120 | - But at the same time, feeding the dream is saying,
02:23:44.000 | take big risks.
02:23:45.000 | - Sure.
02:23:45.840 | - The little you that has no evidence of ever being great
02:23:50.760 | can be great because there's evidence time and time again
02:23:54.180 | of people that come from very humble beginnings
02:23:56.120 | and doing incredible things that change the world.
02:23:58.240 | - Yeah, and there's just a tremendous like funny thing
02:24:01.080 | where you can't become great without having
02:24:03.960 | a willful denial of the statistics.
02:24:06.480 | Like in some ways you have to take the chance,
02:24:08.320 | even if that chance is so improbable.
02:24:10.920 | And it's always those people who did take that chance
02:24:13.000 | who end up winning, so I agree.
02:24:15.280 | - Probably SBF and FTX is the biggest thing
02:24:19.440 | you've ever covered, but previously you've called
02:24:22.640 | the Save the Kids scam the worst influencer scam
02:24:28.720 | you've ever seen.
02:24:30.240 | Can you describe it?
02:24:31.740 | - Sure, so Save the Kids was a charity coin
02:24:34.680 | that was launched by a number of extremely popular
02:24:39.180 | influencers, I think they had over 50 million followers
02:24:41.960 | together, huge names, and they basically said,
02:24:45.080 | "Hey guys, invest in this coin, we're gonna save the kids.
02:24:48.520 | "A portion of the proceeds go to charity."
02:24:51.240 | And this coin, it's unruggable.
02:24:54.280 | So rugging is the term for, remember earlier
02:24:57.100 | we talked about SafeMoon, you just grab the pool
02:24:58.880 | of funds in the middle and you take them out.
02:25:00.480 | Okay, it's unruggable because we have this smart contract code
02:25:04.560 | that is gonna prevent people who are quote whales,
02:25:08.600 | which is a crypto term for saying you have a large portion
02:25:11.960 | of the tokens, it'll prevent those people from selling
02:25:14.860 | a large amount of that at one time, right?
02:25:18.080 | And so basically you don't have to worry about trusting us,
02:25:21.160 | it just is what it is, join and we will change the world,
02:25:25.960 | save the kids, whatever.
02:25:27.160 | It was really skeezy from the beginning and sketchy
02:25:30.200 | because their logo matched the Save the Children logo,
02:25:33.760 | which is like an actual charity that, you know,
02:25:36.180 | so they basically copied it and said,
02:25:37.380 | "We're saving the kids," like a knockoff brand.
02:25:40.760 | And almost immediately the project rugged,
02:25:43.320 | they stole the money and tracing back through,
02:25:47.320 | the code was changed at the last second before launch.
02:25:51.500 | Like if you looked at their code that they launched
02:25:53.500 | as a test versus the code they launched in actuality,
02:25:56.800 | they changed only like two lines and it was the whale code
02:26:00.340 | to basically make the whale code non-existent.
02:26:02.000 | Like you can sell as much as you want, as fast as you want.
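The whale-limit mechanism described here can be reduced to a minimal sketch. Everything below is illustrative — the names, supply, thresholds, and the single `enabled` flag are assumptions, since the real token was a Solidity contract whose source isn't reproduced in the conversation; the flag just stands in for the effect of the two changed lines:

```typescript
// Toy model of an anti-whale sell check. A "whale" (holder of a large share
// of supply) is throttled to a small fraction per trade -- unless the rule
// is quietly disabled, which is the effect of the last-minute code change
// described above. All numbers are made up.

const TOTAL_SUPPLY = 1_000_000;

interface WhaleRule {
  enabled: boolean;    // flipping this off stands in for the two-line change
  maxSellPct: number;  // max fraction of total supply sellable in one trade
}

function canSell(balance: number, amount: number, rule: WhaleRule): boolean {
  if (!rule.enabled) return true;                  // rule gutted: dump freely
  const isWhale = balance > TOTAL_SUPPLY * 0.01;   // holds >1% of supply
  if (!isWhale) return true;
  return amount <= TOTAL_SUPPLY * rule.maxSellPct; // throttle whale sells
}

const advertised: WhaleRule = { enabled: true, maxSellPct: 0.001 };
console.log(canSell(50_000, 50_000, advertised)); // false: whale dump blocked

const shipped: WhaleRule = { enabled: false, maxSellPct: 0.001 };
console.log(canSell(50_000, 50_000, shipped));    // true: same dump goes through
```

The point is structural: buyers were told the `advertised` behavior was guaranteed by code, while the code that actually launched behaved like `shipped`.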
02:26:05.000 | And it turned out that some of the influencers
02:26:08.220 | had not only sold that and made money,
02:26:11.800 | but also had a pattern of pump and dumping tokens.
02:26:15.640 | So we can talk about what that is like.
02:26:16.980 | - Yeah, what's pump and dump?
02:26:18.400 | - A pump and dump is just where, you know,
02:26:20.680 | you have a huge following,
02:26:22.360 | you promote your little Lex coin to everybody
02:26:25.280 | while holding a big portion of it.
02:26:27.000 | And as everyone rushes in to buy it,
02:26:28.620 | the price is gonna pump and you dump at the same time.
02:26:32.440 | So that's where the name comes from, pump and dump.
02:26:34.740 | You pump the price, sell all your tokens,
02:26:37.000 | make a lot of money.
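The dynamic can be seen in a toy constant-product (x·y = k) pool, the mechanism tokens like these typically trade on. Every number below is invented purely to show why a rush of buyers lets a large pre-holder exit far above the launch price — this is not any real token's data:

```typescript
// Toy constant-product AMM pool (token * cash = k stays constant per trade).

class Pool {
  constructor(public token: number, public cash: number) {}

  price(): number {
    return this.cash / this.token; // spot price in cash per token
  }

  buy(cashIn: number): number {    // spend cash, receive tokens
    const k = this.token * this.cash;
    this.cash += cashIn;
    const tokensOut = this.token - k / this.cash;
    this.token -= tokensOut;
    return tokensOut;
  }

  sell(tokensIn: number): number { // dump tokens, receive cash
    const k = this.token * this.cash;
    this.token += tokensIn;
    const cashOut = this.cash - k / this.token;
    this.cash -= cashOut;
    return cashOut;
  }
}

const pool = new Pool(1_000_000, 10_000); // launch price: 0.01
const insiderBag = 200_000;               // insider's pre-held tokens

pool.buy(40_000);                         // followers pile in after the promo
console.log(pool.price());                // 0.25 -- the pump

const proceeds = pool.sell(insiderBag);   // the dump
console.log(proceeds);                    // 25000, vs. 2000 at launch price
```

The followers' 40,000 in buys pushes the spot price up 25x, and the insider's bag comes out at over ten times its launch-price value, with the loss borne by whoever bought on the way up.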
02:26:38.080 | So I traced basically their wallets on the blockchain
02:26:41.200 | and found that two of the actors specifically
02:26:44.360 | had had a long history of doing this,
02:26:46.420 | which really proved malicious intent.
02:26:49.920 | And why I called it the worst is not,
02:26:51.520 | it certainly wasn't the worst in terms of like
02:26:53.200 | the amount of people affected.
02:26:54.920 | It relatively was like a small pump and dump
02:26:58.240 | 'cause it rugged almost immediately.
02:27:00.840 | But in terms of the amount of people
02:27:03.160 | that were involved in it,
02:27:04.560 | in terms of the amount of malicious behavior before it
02:27:07.760 | that like sort of proved that this wasn't an accident,
02:27:10.760 | the fact that there was like this whale code,
02:27:12.720 | it was one of the most cynical attempts
02:27:14.600 | to just take the money of the followers you had
02:27:17.720 | and just like, that's mine now.
02:27:19.840 | So that's why I called it that, but that's to save the kids.
02:27:23.200 | - So that was-
02:27:24.980 | - That was a lot of the FaZe members.
02:27:26.800 | And it was, I think Addison Rae.
02:27:29.860 | There were a lot of people who seemed like
02:27:31.320 | they were kind of taking shrapnel on it.
02:27:33.600 | There was like this guy, Tico,
02:27:34.920 | who he didn't even sell the tokens.
02:27:36.240 | He just like held onto it the entire time
02:27:37.840 | and lost like a few thousand dollars or maybe even,
02:27:40.840 | I forget the exact amount.
02:27:41.920 | He lost a lot of money, a decent amount.
02:27:43.980 | And so like he took a lot of shrapnel with that,
02:27:47.120 | but there were also people who were maliciously doing this.
02:27:49.720 | So in that investigation,
02:27:51.920 | like several of the members of FaZe got kicked out.
02:27:55.560 | One of them got like permanently banned.
02:27:57.600 | And then this other guy that I talked about fled the country
02:28:00.880 | like he sold all his belongings and like fled the country
02:28:03.720 | and hid out in London or wherever he is now.
02:28:07.320 | I don't really know where he is.
02:28:08.200 | Somewhere in the UK area.
02:28:10.240 | - So the basic idea there is to try to convert
02:28:12.920 | your influence into money.
02:28:14.980 | - Correct.
02:28:16.300 | - Okay.
02:28:17.140 | - That's the basic idea behind
02:28:18.140 | a lot of influencer crypto promotions.
02:28:21.160 | - Well, but, right, but that little word influencer
02:28:25.440 | means something, 'cause most crypto scams
02:28:29.160 | are not influencer scams, right?
02:28:30.600 | Most kind of-
02:28:32.280 | - Right, the most high profile ones,
02:28:34.640 | like just by nature,
02:28:36.600 | they tend to be made high profile by the influencer.
02:28:39.240 | So sometimes they are, but you're right.
02:28:41.760 | A lot of money has been lost and like nobody finds out
02:28:44.360 | because there's no one big sort of attached to it.
02:28:46.380 | They just steal a lot of money.
02:28:47.960 | But influencers are great salespeople
02:28:51.500 | because like in order to overcome the resistance
02:28:55.340 | of getting you to buy some random coin,
02:28:58.540 | there has to be a reason.
02:28:59.800 | And so much of the 21st century content creator generation
02:29:03.200 | is defined by these strange parasocial relationships
02:29:06.480 | where people feel like they know you,
02:29:08.980 | not the character you play, but you,
02:29:11.440 | and you have some friendship with them.
02:29:13.900 | When in actuality, you don't know the viewers.
02:29:16.300 | You know, you have a sense,
02:29:17.240 | but you don't actually know all of these people.
02:29:20.480 | And so that relationship is extremely powerful
02:29:25.160 | in terms of persuasion.
02:29:26.760 | So you can say, I believe in this,
02:29:29.400 | and I've watched you for years,
02:29:31.800 | and all of a sudden I say, if Lex believes in it,
02:29:33.640 | I believe in it.
02:29:34.480 | I trust him as a human.
02:29:35.960 | And so that differentiates these coins.
02:29:38.440 | And all of a sudden the coin blows up, gets really popular.
02:29:41.880 | You made this side deal and you make a ton of money.
02:29:44.280 | - I have to say podcasting in particular is an intimacy.
02:29:47.080 | Like I'm a huge fan of podcasts
02:29:48.400 | and I feel like I'm friends with the people I listen to.
02:29:51.480 | - Right.
02:29:52.320 | - And boy, is that a responsibility.
02:29:54.520 | - Yes.
02:29:57.120 | And that's why it really hurts me to announce
02:30:01.960 | that I am launching LexCoin.
02:30:05.080 | No, man, I hate money.
02:30:11.560 | I hate this kind of, the schemingness of all of this,
02:30:14.320 | the use of any kind of degree of fame
02:30:18.440 | that you have for that kind of stuff.
02:30:19.560 | There's something just--
02:30:20.400 | - What makes it so like frustrating is these people--
02:30:23.000 | - Disgusting about it.
02:30:23.840 | - I have a general sense of what they were like,
02:30:26.440 | sort of the world I'm in,
02:30:29.160 | even though I wouldn't describe myself as an influencer,
02:30:31.000 | I make content on YouTube.
02:30:32.440 | I know that especially since they were taking
02:30:34.920 | these huge corporate sponsorships,
02:30:36.800 | they were making tons of money.
02:30:38.320 | They didn't have need for these scams.
02:30:41.040 | I mean, I think it's one thing to scam
02:30:43.140 | if you're like broke on the street,
02:30:45.680 | you know, and you're playing three card Monte to like live.
02:30:48.460 | And I think it's a whole other ethically cruel thing to do
02:30:51.880 | if you're basically trying to upgrade your penthouse
02:30:54.980 | to the building next door.
02:30:56.180 | And like you're already well off
02:30:57.600 | and you just kind of wanna get even further ahead.
02:31:00.040 | I think that's where it--
02:31:01.040 | - Well, this is the fascinating thing.
02:31:02.360 | So I've been very fortunate recently to sort of get,
02:31:06.960 | you know, whatever, a larger platform.
02:31:11.240 | And when you find out it's like, life is amazing.
02:31:13.520 | I always thought life is amazing,
02:31:14.960 | but it becomes more amazing.
02:31:16.480 | Like, you meet so many cool people and so on.
02:31:19.060 | But what you start getting is you have more opportunities
02:31:22.760 | to like, yeah, like scammers will come to you
02:31:26.840 | and then try to use you, right?
02:31:29.320 | And I could see for somebody could be tempting to be like,
02:31:32.540 | ooh, it'd be nice to make some money.
02:31:34.200 | - I wanted to say like on this kind of,
02:31:36.200 | we're on this topic of opportunities you get,
02:31:38.120 | you know, kind of when you get a platform.
02:31:40.480 | So one of the reasons kind of I railed a little bit earlier
02:31:43.160 | against materialism or whatever,
02:31:46.120 | I think to the extent to which
02:31:47.600 | you can moderate your own greed,
02:31:50.520 | you can play longer term games.
02:31:52.880 | And I think so many people end up cutting
02:31:56.060 | an otherwise promising career short
02:31:58.480 | by just wanting it too fast.
02:32:01.260 | So I think it's like a huge edge,
02:32:02.960 | just like discipline is in terms of like,
02:32:05.400 | achieving what you want.
02:32:06.600 | I think I'm like a very moderate,
02:32:08.360 | like being comfortable with a moderate existence
02:32:10.320 | and finding happiness in that is a huge edge
02:32:13.840 | because really your overhead is so much cheaper
02:32:16.000 | than the people who need a Ferrari
02:32:17.720 | or a super nice house to feel fulfilled.
02:32:20.360 | And when your overhead is less,
02:32:21.640 | you have the luxury to say no to like sketchy offers,
02:32:24.960 | to the different, you have freedom
02:32:26.140 | that other people don't have
02:32:27.120 | 'cause a lot of times people don't pitch it this way.
02:32:30.100 | They pitch a Ferrari as freedom
02:32:31.600 | or like a big house is like you've made it.
02:32:33.680 | In a lot of ways, those shackle you back to like,
02:32:36.440 | you gotta find the cashflow for those things.
02:32:38.160 | It's never a free ride.
02:32:39.800 | - Yeah, that's really beautifully put.
02:32:41.240 | I've always said that I had fuck you money
02:32:43.720 | at the very beginning.
02:32:44.880 | I was broke for most of my life.
02:32:47.000 | The way you have fuck you money
02:32:49.040 | is by not needing much to say fuck you.
02:32:51.160 | - That's right.
02:32:52.000 | - I mean, that's the overhead that you're talking about.
02:32:53.840 | If you can live simply and be truly happy
02:32:57.080 | and be truly free, I think that means
02:33:00.920 | you could be free in any kind of situation.
02:33:04.080 | You could make the wise kind of decisions.
02:33:06.160 | And in that case, money enables you in certain ways
02:33:10.960 | to do more cool stuff, but it doesn't shackle you,
02:33:13.120 | like you said.
02:33:14.200 | Too many people in this society are shackled,
02:33:17.080 | 'cause material possessions kind of
02:33:18.760 | draw you into this race of more and more and more and more.
02:33:23.760 | And then you feel the burden of that,
02:33:26.520 | bigger houses and all that kind of stuff.
02:33:28.080 | And now you have to keep working.
02:33:30.000 | Now you have to keep doing this thing.
02:33:31.840 | Now you have to make more money.
02:33:33.180 | And if it's a YouTube channel and so on,
02:33:35.520 | you have to get more.
02:33:36.640 | And the same, it's not even just about money.
02:33:38.920 | That's why I deliberately don't check views and likes
02:33:44.040 | and all that kind of stuff,
02:33:45.280 | is you don't want that dopamine, like,
02:33:48.500 | pulling you in.
02:33:49.820 | I have to do the thing that gets more and more attention
02:33:52.220 | or more and more like, yeah, like money.
02:33:56.580 | And- - Yeah, that's so wise.
02:33:58.340 | Yeah, agreed.
02:33:59.540 | - It's a huge negative hit on your ability
02:34:04.540 | to do creative work.
02:34:06.300 | - Can I ask you about that?
02:34:07.380 | 'Cause I'm always interested in this.
02:34:09.220 | I completely agree.
02:34:12.420 | I think it's funny because when you abstract yourself out
02:34:15.040 | to the people you admire and respect
02:34:17.060 | who inspired you to do the creative work you do,
02:34:20.100 | you never think about like the views they were getting
02:34:23.540 | or the money they were making or the influence they had.
02:34:26.420 | All you ever think about is the work itself.
02:34:29.260 | And it's funny when a lot of people get in this position,
02:34:33.220 | your temptation is to focus on that which you can measure,
02:34:36.260 | which is like all the stuff you said,
02:34:38.820 | like the likes, the views.
02:34:40.460 | That's not actually the target or what you got into it for.
02:34:45.220 | If you get into it for like, 'cause you're inspired
02:34:47.180 | or whatever, your goal is inspiration, it's impact.
02:34:49.740 | And like that can't be quantified that same way.
02:34:52.300 | So it's interesting, you have to find a way
02:34:56.180 | as a creator of any of this stuff
02:34:59.060 | to like deliberately detach yourself from the measurable
02:35:03.580 | and focus on this thing, which is kind of abstract.
02:35:05.620 | And I was wondering if you have any like ideas for that.
02:35:08.300 | - So one, yeah, there's a bunch of ideas.
02:35:10.340 | So one is figure out ways where you don't see
02:35:15.340 | the number of views on things.
02:35:20.220 | So I wrote a Chrome extension for myself
02:35:23.180 | that hides the number of views.
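A content script in that spirit might look like the sketch below. The matching regex and the blanket `span` selector are guesses at YouTube's ever-changing markup, not the actual extension's code; the pure helper is split out so the matching logic works outside a browser:

```typescript
// Sketch of a view-count-hiding content script. The regex and the broad
// "span" selector are assumptions about YouTube's markup, which changes
// frequently; a real extension would likely use more precise selectors.

const VIEW_COUNT_RE = /^[\d.,]+\s*[KMB]?\s*views$/i;

// Pure helper, so the matching logic can be exercised outside a browser.
function looksLikeViewCount(text: string): boolean {
  return VIEW_COUNT_RE.test(text.trim());
}

// Only touch the DOM when one exists (i.e. when running as a content script).
const doc = (globalThis as any).document;
if (doc) {
  const hide = () => {
    for (const el of doc.querySelectorAll("span")) {
      if (looksLikeViewCount(el.textContent ?? "")) {
        el.style.visibility = "hidden"; // keep layout, drop the number
      }
    }
  };
  hide();
  // Re-run as YouTube's single-page app swaps content in and out.
  new (globalThis as any).MutationObserver(hide).observe(doc.documentElement, {
    childList: true,
    subtree: true,
  });
}

console.log(looksLikeViewCount("1.2M views")); // true
console.log(looksLikeViewCount("3 days ago")); // false
```

Packaged as a Chrome extension, this file would be declared as a content script in `manifest.json`, scoped to youtube.com pages.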
02:35:24.740 | - That's really funny.
02:35:27.180 | No, what's funny is I have-- - For me.
02:35:28.860 | 'Cause it's useful for other people's content.
02:35:30.420 | - Right, oh my gosh, I'm gonna need to borrow this.
02:35:33.340 | That was my problem.
02:35:34.380 | I actually have some Chrome extensions for,
02:35:37.380 | like I don't like going down like recommendation rabbit holes
02:35:40.620 | when I'm at work.
02:35:41.500 | I just wanna like search for a video, find it.
02:35:43.420 | I don't wanna see like all the up next
02:35:45.540 | 'cause it's all a waste of time.
02:35:46.620 | So I use Chrome extensions for that.
02:35:48.060 | But the views is a problem because it's relevant to me
02:35:50.860 | as a creator, like, is this a big video?
02:35:52.540 | Is it a-- - Yeah, which is why
02:35:53.900 | it really hurt when they removed like likes and dislikes
02:35:56.180 | 'cause I wanna know for tutorials and so on.
02:35:58.860 | I mean, that's probably really useful for you,
02:36:02.060 | the dislikes, yeah. - Yeah, do you have that?
02:36:03.980 | Do you ever consider making that Chrome extension public?
02:36:07.540 | - Sure, actually, yeah.
02:36:09.020 | And there would be a good philosophy behind it, right?
02:36:10.740 | Like don't, if you're a creator--
02:36:11.940 | - I really like it, I love the idea.
02:36:13.340 | I've wanted this thing before.
02:36:14.760 | I don't know if it necessarily exists.
02:36:16.980 | - 'Cause I don't think I've made a Chrome extension public.
02:36:19.780 | That'd be cool, I would love to see,
02:36:21.740 | yeah, I would go to that process of adding it to the,
02:36:25.220 | 'cause I love like open sourcing stuff.
02:36:26.700 | So yeah, I'll go add it to the Chrome extension,
02:36:29.620 | like the store. - Yeah, 'cause I totally have,
02:36:32.300 | I've hated this for like a long time.
02:36:34.940 | YouTube made a change and they just continue
02:36:37.180 | to make the analytics front and center,
02:36:38.960 | which makes sense from their perspective.
02:36:40.860 | They're trying to give people better data
02:36:42.580 | on what is successful and what makes something successful.
02:36:45.380 | They're trying to train their creators,
02:36:47.020 | but in the process, it can lead to some unhealthy habits
02:36:51.700 | of thinking views define a video.
02:36:53.780 | And so I've long thought, okay, I've learned analytics,
02:36:56.300 | I understand retention.
02:36:57.640 | Now I sort of wanna do like the zen, like forget it all.
02:37:01.020 | And you can only do that if it's out of your sight.
02:37:03.100 | - Depends how many friends you have who are creatives.
02:37:05.500 | The other really important thing,
02:37:07.540 | and I found this, this has nothing to do with creatives,
02:37:10.100 | but people I respect very much in my life,
02:37:13.260 | some of them people would know that they could be famous.
02:37:17.980 | They will come to me and say,
02:37:20.100 | they will comment on how popular a video was on YouTube.
02:37:23.940 | They will sort of compliment--
02:37:25.940 | - The success.
02:37:27.220 | - The success defined by the popularity.
02:37:29.220 | Even for a podcast where most of the listenership
02:37:32.140 | is not on YouTube, or Spotify now is getting crazy,
02:37:36.120 | they will still compliment the YouTube number.
02:37:39.740 | So one of the deliberate things I do is I either,
02:37:42.740 | depending if it's a close friend,
02:37:43.860 | I'll get offended and made fun of them for that,
02:37:46.780 | and sort of signal to them this is not the right thing.
02:37:50.020 | I don't want that.
02:37:51.340 | And for people more like strangers
02:37:53.900 | that compliment that kind of stuff,
02:37:55.540 | I show zero interest.
02:37:57.900 | I don't receive the compliment well,
02:37:59.940 | and I focus on the aspects of the compliment
02:38:02.020 | that have to do with what do they find interesting.
02:38:05.420 | I kind of make them, reveal to them
02:38:09.180 | that you shouldn't care about the number of views.
02:38:11.900 | - It is strange.
02:38:12.740 | There's like this weird hypnotism that happens
02:38:14.960 | once you get past a certain number,
02:38:17.380 | and that number is some approximation.
02:38:19.260 | It's always like hard numbers.
02:38:20.380 | It's like 100,000, a million, 10 million.
02:38:23.300 | People just see a number and they just go like,
02:38:25.080 | wow, that is, and they assign a quality to it
02:38:27.500 | that may not, it usually means nothing at all.
02:38:30.700 | So I agree, I've never been good at handling that
02:38:34.500 | 'cause you're like, thank you?
02:38:36.300 | - Yeah. - And it's like, okay.
02:38:37.740 | - That said, I do admire, very different from me,
02:38:40.140 | but I admire Mr. Beast, who unapologetically
02:38:44.220 | says the number is all that matters.
02:38:46.700 | Like basically, the number shows,
02:38:50.460 | like the number of views you get
02:38:52.640 | shows how much, I don't know,
02:38:55.820 | joy you brought to people's lives
02:38:57.540 | because if they watched the thing,
02:38:59.400 | they kept watching the thing,
02:39:01.380 | they didn't turn it off, that means they loved it.
02:39:04.300 | You brought value to their life, you brought enjoyment,
02:39:06.860 | and I'm going to bring the maximum amount of enjoyment
02:39:09.900 | to the maximum number of people,
02:39:12.060 | and I'm gonna do the most epic videos
02:39:14.220 | and all that kind of stuff.
02:39:15.040 | So I admire that when you're so unapologetically
02:39:18.500 | into the numbers.
02:39:20.020 | - Yeah, he's sort of, it's interesting.
02:39:21.860 | He's like, gosh, we're getting way too in the way.
02:39:24.940 | Is this a, I don't know, I'm like constantly
02:39:27.260 | self-monitoring about like what topics we're on,
02:39:29.220 | but if we can, Mr. Beast is so interesting
02:39:31.700 | 'cause he's almost done what,
02:39:32.900 | have you ever seen Moneyball?
02:39:34.340 | - Yes.
02:39:36.340 | - It's the story of how someone brought statistics
02:39:37.700 | to baseball and it revolutionized everything.
02:39:39.980 | He's Moneyballed YouTube.
02:39:42.420 | He took statistics to YouTube and it changed everything.
02:39:45.060 | And everybody now, so many people are playing catch up.
02:39:49.380 | I think it would be interesting in a few years
02:39:52.560 | to see how he develops.
02:39:54.740 | And now that he's like kind of revolutionized
02:39:57.500 | like the data side of things,
02:39:59.460 | how he then approaches future videos.
02:40:03.180 | Because there's a point at which you've optimized,
02:40:06.500 | you've optimized, you've optimized,
02:40:08.140 | but optimizing for short-term video performance
02:40:10.740 | is not the same as optimizing
02:40:11.940 | for long-term viewer happiness.
02:40:13.860 | And how do you do that?
02:40:15.820 | Assuming the YouTube algorithm does not perfectly
02:40:18.780 | already do it for you, which it doesn't,
02:40:20.780 | but they're trying to obviously do that,
02:40:22.460 | optimize for long-term happiness, but.
02:40:24.060 | - And also growing, optimize for long-term creative growth.
02:40:28.220 | - Sure.
02:40:29.060 | - I think the thing that people don't,
02:40:30.860 | I mean, maybe I don't know,
02:40:32.180 | I actually don't know enough about Jimmy.
02:40:34.100 | But like, to me, the thing that seems to be special
02:40:38.540 | about him isn't the Moneyball aspect.
02:40:42.020 | That's really important.
02:40:43.640 | It's like taking the data seriously.
02:40:46.180 | But to me, it's the part of the idea generation.
02:40:49.400 | The constant brainstorming and coming up with videos.
02:40:52.140 | So it's nice to connect the idea generation with the data,
02:40:55.900 | but like how many people, when they create on YouTube
02:41:00.300 | and other platforms, really generate a huge amount of ideas?
02:41:04.820 | Like constantly brainstorm,
02:41:06.660 | constantly, constantly brainstorm.
02:41:08.740 | At least for me, I don't,
02:41:10.120 | I don't, like, I don't think I go
02:41:13.380 | so many steps ahead in my thinking.
02:41:17.660 | I don't like try to come up
02:41:18.820 | with all possible conversations.
02:41:20.200 | I don't come up with all possible videos I can make.
02:41:22.980 | - But you can't, so the one mistake to make
02:41:25.500 | is to map Jimmy's philosophy onto every genre,
02:41:29.100 | 'cause not every genre fits that model.
02:41:31.600 | Your model is not an idea-centric model.
02:41:34.740 | It's a people-centric model.
02:41:36.380 | And so you, like, if you were in the business
02:41:40.320 | of creating just mass entertainment
02:41:42.500 | for the sake of mass entertainment,
02:41:44.580 | you might focus on, okay, the reason going idea-focused
02:41:49.420 | instead of person-focused is such a revolutionary idea
02:41:52.740 | in some senses is because ideas can be broader,
02:41:55.880 | more broadly appealing than any single human can be.
02:41:58.500 | But you're not going for that.
02:42:01.500 | You're going for a podcast interview.
02:42:03.700 | And I think for you, the goal should always be
02:42:06.940 | how deep can you get with interesting guests
02:42:09.340 | and, like, finding the most interesting guests,
02:42:11.260 | which is a different, probably, set of skills.
02:42:13.940 | - Well put, really well put.
02:42:15.220 | But I think the right mapping there
02:42:17.380 | is finding the most interesting guests.
02:42:19.060 | - Yeah, right.
02:42:19.980 | - I think I don't do enough work on that.
02:42:23.220 | So for example, I try to be,
02:42:26.620 | something I do prioritize is talking to people
02:42:29.460 | that nobody's talked to before.
02:42:31.020 | 'Cause it's like, I kind of see myself as not a good,
02:42:35.580 | like, I know a lot of people that are much better than me.
02:42:37.200 | I really admire, I think Joe Rogan is still the GOAT.
02:42:41.580 | He's just an incredible conversationalist.
02:42:43.440 | So it's like, all right,
02:42:45.340 | who is somebody Joe's not gonna talk to?
02:42:48.420 | Either he's not interested or it's not gonna happen.
02:42:53.580 | Like, I wanna talk to that person.
02:42:54.780 | I wanna reveal the interesting aspects of that person.
02:42:57.580 | And I think I should do a Mr. Beast-style rigor
02:43:02.580 | in searching for interesting people.
02:43:07.220 | - And you should probably find people to help you search.
02:43:11.660 | - Sure, he does that.
02:43:12.700 | But if we're being honest, he does that, of course,
02:43:15.260 | with other folks, but he's the main engine.
02:43:18.740 | - Yeah, you need like sort of like a pre-filter,
02:43:22.860 | you're the final filter.
02:43:24.300 | 'Cause your problem is you're only able to think of humans
02:43:29.300 | that you've thought of before or been exposed to.
02:43:32.660 | And most of the world you've never been exposed to.
02:43:35.220 | So you need people to like pre-filter and go,
02:43:37.900 | okay, these guys are just interesting humans.
02:43:40.380 | Lex has never heard of them.
02:43:41.620 | And then you sort of take a batch of like 100 people
02:43:45.220 | and you go, who seems the most interesting for me?
02:43:48.020 | - But by the way, on that topic, we're going weeds into weeds,
02:43:51.020 | I'm almost done, I'm building it up,
02:43:53.580 | I programmed this guest recommendation thing
02:43:58.420 | where I wanna get suggestions from other people.
02:44:00.580 | 'Cause I really wanna find people that nobody knows.
02:44:03.900 | This is the tricky thing.
02:44:05.380 | Not, you're not famous, but the idea is
02:44:09.180 | there's probably fascinating humans out there
02:44:11.620 | that nobody knows.
02:44:12.860 | - Correct.
02:44:13.780 | - That, I wanna find those.
02:44:14.940 | And I believe the crowdsourcing aspect will raise them.
02:44:17.820 | And now of course the top 100 will be crypto scams.
02:44:20.940 | No, but yes.
02:44:22.460 | So like I have to make sure that these kinds of swarms
02:44:25.860 | of humans that recommend, I can filter through
02:44:28.620 | and there's whole kinds of systems for that.
02:44:30.660 | But I wanna find the fascinating people out there
02:44:33.660 | that nobody's ever heard from.
02:44:34.980 | And from a programmer perspective,
02:44:38.100 | I thought surely I could do that
02:44:39.300 | by just building the system.
02:44:41.460 | - Yeah, that's how programmers always think.
02:44:42.820 | They'll just automate a system to do it.
02:44:44.460 | - Well, that's the Jimmy Moneyball, right?
02:44:46.140 | Like looking at the data.
02:44:47.540 | - Yep.
02:44:48.380 | (laughing)
02:44:50.300 | Weeds on weeds.
02:44:51.220 | - How do we get to Mr. Beast exactly?
02:44:53.220 | - I'm not sure.
02:44:54.060 | - Okay, save the kids.
02:44:56.220 | Influencer.
02:44:57.060 | - Influencer, that's probably how.
02:44:58.400 | - Let me ask you more on the guru front.
02:45:00.580 | You've, okay, let's start with somebody
02:45:03.060 | that you've covered that I think you've covered a lot
02:45:05.760 | and I'm really embarrassed to not know much about him.
02:45:08.660 | I think this is like old school coffee.
02:45:10.060 | So you've been through stages, okay?
02:45:11.660 | - I've been through stages and phases, true.
02:45:13.660 | - So a character named Dan Lok.
02:45:15.700 | - Yeah.
02:45:16.540 | - Who is he?
02:45:18.780 | You've exposed him for a cult-like human
02:45:23.140 | and his cult-like practices.
02:45:24.860 | Who is he?
02:45:25.700 | What has he done?
02:45:26.580 | - So Dan Lok is sort of,
02:45:29.340 | he's gone through a number of iterations,
02:45:31.100 | but he was kind of this like sales trainer guy
02:45:34.620 | who really made a hard push
02:45:36.800 | into what he called high ticket sales.
02:45:39.140 | And he was telling people that they could kind of escape
02:45:41.780 | the nine to five rat race
02:45:43.460 | if they just learn high ticket sales
02:45:45.340 | and they can have the life of their dreams.
02:45:47.680 | Basically, it's like, I'll teach you to sell,
02:45:50.040 | but I'll teach you to ask,
02:45:51.940 | like not only will I teach you to sell you that pen,
02:45:54.480 | but I'll teach you to sell it for $50,000
02:45:57.140 | instead of a dollar, right?
02:45:58.800 | So I talked to a lot of the people
02:46:02.040 | who had taken this course 'cause it was pretty expensive.
02:46:04.260 | I think it was like $2,500 or $1,000.
02:46:08.060 | And mind you, the people who are taking it are like teachers
02:46:11.000 | and like people who don't have a lot of money.
02:46:14.140 | And then you take the course and immediately you find out,
02:46:17.380 | okay, well, there's an upsell.
02:46:18.620 | At the end of the course, you're not ready.
02:46:21.020 | You need to go from like high ticket closer,
02:46:23.320 | which is one of the products to inner circle
02:46:26.460 | or like the level up, right?
02:46:28.740 | And all of these courses are structured like this.
02:46:31.340 | So they spend a tremendous amount on Google ads
02:46:34.620 | to get people in the door, promising the dream.
02:46:37.300 | And then once you're in,
02:46:38.820 | you're actually not done being like the product.
02:46:41.640 | You're actually in the system that tries to upsell you
02:46:45.040 | again and again and again and again.
02:46:46.880 | And eventually you're paying monthly
02:46:48.520 | and you're getting more and more.
02:46:49.780 | You're constantly paying for access to Dan Lok's wisdom
02:46:52.360 | and like ideas.
02:46:54.040 | And fundamentally,
02:46:55.840 | the sales system wasn't working for people.
02:46:59.480 | I mean, I talked to like, for example,
02:47:00.600 | a teacher who put in like $25,000,
02:47:04.400 | was in debt at one point and has nothing to show for it.
02:47:07.080 | I know, and it was sort of these tactics
02:47:10.000 | of pressuring, pressuring, pressuring.
02:47:12.000 | And then anytime anyone would complain,
02:47:13.740 | he would try to silence them.
02:47:14.860 | So I heard from like, funny enough,
02:47:18.500 | this lady was a teacher as well.
02:47:20.100 | She put together a Facebook group,
02:47:23.040 | basically saying, I think this guy's a scam.
02:47:24.720 | His course didn't work.
02:47:25.820 | It's not working for a lot of people
02:47:28.140 | because fundamentally the promise of turning someone
02:47:30.300 | from a non-salesman into a person
02:47:33.640 | who's making six figures selling
02:47:35.660 | is not an easy thing to do.
02:47:37.040 | It's not just a matter of just like take my course.
02:47:39.080 | But anyways, it wasn't working.
02:47:40.700 | She created a Facebook group about it
02:47:42.520 | and he like sues her
02:47:43.820 | or was like legally pressuring her
02:47:47.440 | And I realized like somebody has to speak out about this
02:47:50.960 | and everyone who is, is getting silenced.
02:47:53.720 | So I was like, I'm gonna use my platform
02:47:55.060 | to raise awareness to this.
02:47:56.920 | And people came out of the woodwork.
02:47:58.340 | I mean, saying that this guy defrauded me or he scammed me.
02:48:01.560 | And I wanna just really quickly take a second,
02:48:05.080 | take a beat to explain why get rich quick schemes
02:48:10.060 | are different than let's say selling a water bottle
02:48:13.980 | and saying it's the greatest water bottle ever.
02:48:16.320 | Right, 'cause sometimes people wonder,
02:48:17.440 | they go like, well, doesn't like Nissan
02:48:20.240 | say their car is gonna make you happy
02:48:21.560 | and then it doesn't make you happy.
02:48:22.460 | Like, why is that different
02:48:24.200 | from the kind of advertising of a get rich quick course?
02:48:27.720 | I mean, both of them are sort of promising things
02:48:29.240 | that aren't true, but you get something,
02:48:31.440 | you take some kind of a training.
02:48:32.880 | You know, isn't it the same thing?
02:48:34.820 | No, here's why.
02:48:36.140 | There's this concept in economics called elastic demand
02:48:40.400 | and inelastic demand.
02:48:41.940 | What it essentially means is that if I raise the value
02:48:45.780 | of this water bottle, there's a point at which
02:48:47.820 | you're just gonna be like, no,
02:48:49.080 | it doesn't make any sense, right?
02:48:51.040 | But there are areas in our lives
02:48:53.000 | where we have desperation around them
02:48:57.020 | that can get deeply predatory very quickly
02:49:00.240 | because they have no,
02:49:02.760 | there's no elasticity around their demand.
02:49:05.180 | For example, your health.
02:49:07.460 | If you get cancer and I have the pill that will solve it,
02:49:11.700 | or at least let's say I don't, I have a sugar pill here,
02:49:14.540 | but if I can convince you that this pill
02:49:16.100 | will solve your cancer or treat your cancer,
02:49:19.220 | you will pay any amount of money you have on this earth
02:49:22.500 | to get this pill.
02:49:24.780 | But obviously that gets really predatory really quick
02:49:26.780 | because selling something that isn't real
02:49:30.340 | is almost as compelling as selling something
02:49:31.820 | that is real, right?
02:49:33.100 | So this happens in the get rich quick space too.
02:49:35.800 | There's any amount of money you would pay
02:49:37.560 | to make a lot more money, right?
02:49:39.800 | So these products have inelastic demand.
02:49:42.480 | That's why you see what is essentially a few webinars
02:49:46.220 | getting sold for $2,500.
02:49:48.680 | Courses that literally have identical videos on YouTube,
02:49:52.680 | like very similar course curriculums
02:49:54.540 | that are selling for such extravagant amounts of money.
02:49:57.440 | And I think there can be comparisons made to college
02:50:01.680 | 'cause obviously there's similar questions about benefits,
02:50:06.680 | but in this case, there's not even statistics available
02:50:09.060 | that even shows the average person
02:50:10.400 | gets something out of it.
02:50:11.720 | That's true of like, if you go to college,
02:50:13.520 | your average income will improve, right?
02:50:15.740 | That's the justification there.
02:50:16.840 | There's none of that.
02:50:17.720 | There's no case studies.
02:50:19.020 | There's nothing backing their extravagant claims
02:50:21.680 | of you're gonna make all this money,
02:50:22.920 | you're gonna make all this wealth.
02:50:23.920 | Instead, they're just, as we said before,
02:50:25.840 | they're selling you a dream.
02:50:27.160 | So that's why I find all those types
02:50:30.000 | of get-rich-quick schemes so problematic,
02:50:32.000 | and it's why I've railed against them
02:50:33.920 | for a significant amount of time.
02:50:35.440 | - What have you learned from attacking,
02:50:38.280 | exposing some of the things that Dan Lok is doing?
02:50:41.080 | What have you learned about human nature
02:50:45.640 | and fraudsters and gurus and so on?
02:50:45.640 | - That's a great question.
02:50:46.480 | So I think one of them is that
02:50:49.600 | there's this systemic problem that the phrase,
02:50:54.840 | there's a sucker born every minute, is very true.
02:50:57.940 | There is no end to the people
02:51:00.480 | who will fall for something like this.
02:51:03.560 | And the problem is, is because there's just no end
02:51:05.880 | to need and want and just lack.
02:51:09.440 | I mean, it's easy to, on the one hand,
02:51:11.640 | criticize people's greed, but a lot of times,
02:51:13.640 | you have to put yourself in their shoes.
02:51:15.440 | If you're at a dead-end job, you have nothing going for you,
02:51:19.280 | you don't have the money to go to college,
02:51:20.480 | you don't wanna get in debt, fair play,
02:51:22.740 | where do you go, right?
02:51:24.720 | As you said, there's somebody who's there saying
02:51:26.600 | they believe in you, they believe you can make six figures.
02:51:29.840 | You're gonna believe in that.
02:51:32.000 | And so I really felt like it made a lot more sense
02:51:37.000 | to tackle it from the other side,
02:51:39.920 | from the side of people that can stop,
02:51:42.240 | that can basically be exposed and basically be,
02:51:46.460 | have sort of like a negative put on their work.
02:51:50.120 | I mean, they're largely going under the radar.
02:51:52.280 | So I kind of felt like, do you wanna educate,
02:51:56.200 | do you wanna blame it on the victims and say,
02:51:58.880 | you should have known better, you should have done this,
02:52:00.400 | you should stop, but there's no end to that.
02:52:04.320 | Or do you go after the grifters themselves?
02:52:06.720 | And so that's what I realized.
02:52:07.960 | I realized that's the tactic that I went with.
02:52:12.400 | And it's tough 'cause it's a little
02:52:13.600 | legally risky to do that, but yeah,
02:52:17.480 | you just kinda gotta be smart about it, I guess.
02:52:18.720 | - So your platform has gotten really big,
02:52:21.360 | so there's some responsibility to that.
02:52:23.000 | - Weirdly big, yeah.
02:52:24.120 | - Yeah.
02:52:24.960 | Let's say--
02:52:27.520 | - 'Cause like only a year ago, it was like a lot,
02:52:29.400 | a lot smaller, and then it's hard to make that adjustment.
02:52:34.200 | 'Cause like to me, it's just the same,
02:52:36.320 | it's the same show I've been doing.
02:52:38.300 | - So how do you avoid becoming a guru yourself
02:52:41.720 | or your ego growing?
02:52:44.440 | There's different trajectories it could take,
02:52:48.360 | one of which is you can start seeing everybody as a scammer
02:52:53.080 | and only you can reveal it.
02:52:55.080 | And like you have an audience of people
02:52:58.320 | who love seeing the epic Coffeezilla grilling,
02:53:02.600 | and you can destroy everyone,
02:53:04.320 | and that power now is getting to your head.
02:53:06.680 | How do you avoid that?
02:53:08.040 | - Well, I mean, this is like less optically obvious.
02:53:11.600 | I think the main way is like,
02:53:13.280 | my circle of friends doesn't care about any of that.
02:53:16.120 | Like my wife doesn't care.
02:53:17.920 | The people whose opinion I value has no relation
02:53:21.240 | to like a subscriber metric or anything like that.
02:53:24.520 | I think that's like tangibly the most important thing
02:53:26.960 | to just staying grounded.
02:53:28.200 | As far as like becoming a guru,
02:53:30.120 | I just don't have anything to like sell.
02:53:32.520 | I mean, I'm not interested in teaching people finance,
02:53:35.320 | I'm not interested in teaching people,
02:53:36.960 | not interested in selling a course.
02:53:39.240 | And I've kind of given myself a hard line on that,
02:53:42.100 | which I think has helped me a bit,
02:53:43.940 | is there's a temptation to go,
02:53:46.560 | well, I can tell what's a scam,
02:53:48.280 | so let me tell you what's not a scam.
02:53:50.440 | And a lot of people have offered a lot of money to do that,
02:53:52.960 | and basically be like,
02:53:53.920 | hey, I have such and such legitimate product,
02:53:57.080 | come be like an endorser.
02:53:59.120 | And I just don't do that,
02:54:01.120 | because I think it undermines a lot of what I do,
02:54:04.080 | is if you get like,
02:54:05.620 | if you're taking money in on the side to say this is legit,
02:54:09.480 | and you're saying this isn't legit,
02:54:10.640 | that's a huge conflict of interest.
02:54:12.160 | So I think it's about managing conflicts of interest
02:54:14.320 | and keeping people around me
02:54:16.160 | that are grounded.
02:54:17.800 | And also I think,
02:54:19.040 | yeah, my only interest really is just like,
02:54:23.180 | make cool stuff,
02:54:24.480 | and I guess I'll do that until people stop watching.
02:54:27.260 | - A question on that topic
02:54:30.720 | from the Coffeezilla subreddit, shout out.
02:54:33.040 | - Shout out.
02:54:33.880 | - How does Coffee find the strength
02:54:36.640 | to maintain his integrity
02:54:38.080 | and resist temptation of being paid a great amount of money
02:54:41.160 | to advertise or promote a potential scam?
02:54:45.280 | - I think that goes back
02:54:46.680 | to what we've been talking about a lot,
02:54:47.880 | which is just on what you prioritize, what you value.
02:54:50.480 | I've just never,
02:54:51.680 | I guess I grew up kind of lower middle class,
02:54:54.560 | and I had a great time.
02:54:57.600 | Like I had a great childhood,
02:54:58.760 | I had very loving parents.
02:55:00.640 | And because of that,
02:55:02.020 | I guess intuited at an early age
02:55:05.400 | that money doesn't do a whole lot.
02:55:07.880 | And I knew a lot of people who were way better off,
02:55:10.560 | who had miserable childhoods,
02:55:13.040 | because whether their dad was always gone at work,
02:55:15.520 | or like they just had other family issues
02:55:18.240 | that just money can't buy.
02:55:20.320 | And I realized,
02:55:22.560 | I guess quickly,
02:55:24.060 | that money's a very like,
02:55:28.520 | it's a glittery object that isn't what it appears to be.
02:55:31.920 | And so to me, I'm like,
02:55:34.440 | I'm having the time of my life making my show.
02:55:37.360 | I'm not gonna have the time,
02:55:38.240 | like I could,
02:55:39.240 | you could ruin all that
02:55:40.160 | just trying to go for this quick check
02:55:42.080 | when it's like, no, I'm having a great time.
02:55:44.240 | - Yeah, it's actually,
02:55:45.340 | maybe you're probably the same way,
02:55:48.100 | but for me, there's a lot of happiness in having integrity,
02:55:52.080 | in looking in the mirror and knowing
02:55:53.840 | that you're the kind of person that has that.
02:55:56.040 | In fact, walking away from money is also fun.
02:55:59.640 | 'Cause it's like promising yourself,
02:56:01.640 | like it's showing,
02:56:02.900 | it's easy to like just say you have integrity.
02:56:04.760 | It's nice to like,
02:56:06.000 | ah, I actually,
02:56:07.160 | I've discovered several times in my life
02:56:08.800 | that I have integrity.
02:56:11.800 | - So you get put,
02:56:12.640 | yeah, you get put like basically to the test.
02:56:14.720 | - Yeah.
02:56:15.560 | I've said,
02:56:16.960 | like, I don't know if I publicly said,
02:56:18.760 | but to myself, I say like,
02:56:20.040 | you can't buy,
02:56:21.460 | there's a lot of things you can't buy with me,
02:56:23.020 | like for a billion dollars,
02:56:24.680 | like a trillion dollars.
02:56:26.820 | But it'd be nice to get tested that way.
02:56:28.760 | It'd be cool to see,
02:56:30.000 | 'cause you never know until you're in that room.
02:56:32.000 | - It's true.
02:56:33.040 | - The same with power, given power.
02:56:34.560 | I'd like to believe I'm the kind of person
02:56:36.880 | that wouldn't abuse power,
02:56:38.080 | but you don't know until you're tested.
02:56:39.940 | So anyway,
02:56:41.640 | you're in a really tricky position
02:56:42.960 | because you're doing incredible,
02:56:44.840 | I mean, you are a world-class journalist,
02:56:46.760 | straight up.
02:56:47.680 | And so there is pressure on that
02:56:49.920 | of like not having,
02:56:53.160 | like erring on the side of caution
02:56:55.720 | with like having conflicts of interest
02:56:57.480 | and stuff like that.
02:56:58.320 | It's tough.
02:56:59.160 | It's a really tough seat to sit in.
02:57:00.800 | It's really tough.
02:57:03.560 | It's really tough,
02:57:04.400 | but it's unfairly tough, I feel like.
02:57:06.240 | But it's good that you're sort of weighing all of those.
02:57:10.480 | That said, go donate to Coffeezilla.
02:57:14.200 | Donate everything.
02:57:16.320 | Support him.
02:57:17.280 | He's a really, really important human being.
02:57:19.880 | The other guy I did,
02:57:22.980 | I think is the first person I discovered
02:57:24.680 | that you investigated is Brian Rose of London Real.
02:57:28.840 | Can you talk about his story?
02:57:31.400 | - Brian Rose, he was sort of this interesting figure
02:57:36.720 | 'cause he was like trying to be
02:57:39.480 | to one level or another,
02:57:40.720 | the Joe Rogan of London,
02:57:43.640 | which I don't think he did a terribly bad job of,
02:57:45.920 | especially initially.
02:57:46.840 | He had some really interesting podcasts
02:57:48.920 | with some really interesting people.
02:57:50.960 | And it's funny enough,
02:57:51.980 | I started out as like I would watch him.
02:57:53.880 | I mean, I don't know if I was like a huge fan,
02:57:55.300 | but I was like, I like some of his interviews.
02:57:57.000 | He had some really good, like big gets
02:57:59.360 | in terms of great guests.
02:58:02.520 | However, when kind of COVID started,
02:58:06.960 | he went down this really weird grifting rabbit hole
02:58:11.320 | where he did like this interview with David Icke,
02:58:15.680 | who's, as you know,
02:58:16.920 | like a pretty big COVID conspiracy theorist.
02:58:19.340 | And I mean like actual,
02:58:22.080 | like he believes some of the royals are literally lizards.
02:58:25.720 | So he got shut down for that.
02:58:29.440 | And he kind of made a big stink,
02:58:31.840 | which I think it's fine.
02:58:33.000 | Nobody likes to be censored.
02:58:34.520 | And I'm not even saying
02:58:35.520 | that he should have been censored.
02:58:37.880 | But his reaction to that was to like
02:58:39.880 | raise a ton of money from his audience,
02:58:41.400 | promising this digital freedom platform.
02:58:44.980 | And at first it was like,
02:58:46.200 | oh, we want to raise $100,000.
02:58:47.840 | And then they raised it like within a day.
02:58:49.520 | So he's like, well, we got to raise a lot more money.
02:58:52.080 | And so eventually they raised a million dollars
02:58:54.020 | and he's trying to raise $250,000 a month
02:58:57.000 | to kind of keep putting his viewers money into this stuff.
02:59:00.840 | So I started digging into the platform they were building
02:59:02.520 | and there was nothing free about it.
02:59:04.520 | They had censorship guidelines
02:59:06.280 | and there was nothing about a platform at all.
02:59:07.960 | There was no underlying infrastructure.
02:59:09.960 | He just got some white label live streaming thing.
02:59:14.360 | So I criticized him for that.
02:59:16.600 | It was just this ridiculous thing.
02:59:17.920 | All the donators expected one thing.
02:59:19.920 | They thought Brian Rose was going to take on Google
02:59:22.440 | and Facebook and like bring free speech back for everybody.
02:59:25.560 | And of course he didn't.
02:59:26.760 | And then it kind of got worse
02:59:29.860 | because he started taking a lot of heat for that.
02:59:32.980 | And he really pivoted hard into like the DeFi grift.
02:59:37.200 | So he started selling this course about DeFi mastery.
02:59:40.220 | And this is a guy who knows nothing about crypto
02:59:43.480 | or very little at the least.
02:59:45.160 | So it just got really kind of,
02:59:47.660 | he just kind of doubled down on this course model
02:59:51.080 | of you're going to be rich if you just follow me.
02:59:52.720 | And it was ultimately, you just type in Brian Rose on YouTube
02:59:57.120 | you can see what his audience thought of that.
02:59:59.000 | 'Cause almost all of them have left him at this point.
03:00:01.800 | He's getting like a thousand views a video.
03:00:04.400 | And it wasn't because of me.
03:00:05.760 | I mean, it was like people lost taste
03:00:09.040 | in just the constant ask for more money,
03:00:12.280 | more money, more money.
03:00:13.560 | At some point people get sick of it.
03:00:15.320 | And it's like, everyone has an understanding
03:00:18.040 | that like no one works for free,
03:00:19.480 | but when it starts to be ego driven
03:00:22.600 | and driven around money, everything's about money,
03:00:25.560 | it drives people away.
03:00:28.160 | - Well, you're a part of that sort of helping.
03:00:30.560 | It's nice to have a voice.
03:00:32.640 | - Yeah, I certainly spoke out.
03:00:33.760 | I mean, it wasn't like I was quiet.
03:00:35.120 | I was very loud about it at the time.
03:00:36.980 | But I mean, in the sense that there,
03:00:42.500 | if you look at someone like Andrew Tate,
03:00:45.920 | I've made a video about him,
03:00:47.440 | even though he's been banned off all the platforms,
03:00:49.040 | he gets more views than Brian Rose.
03:00:51.400 | And I think it's just like,
03:00:53.600 | it was a testament to how much Brian Rose
03:00:55.360 | was like doing like the grift that people could,
03:00:58.120 | even people who were fans
03:00:59.480 | and didn't care about what I said,
03:01:01.320 | like couldn't look past,
03:01:02.920 | just the constant ask for more and more money.
03:01:05.720 | People just get burned out.
03:01:07.080 | - Is there some aspect that you worry about
03:01:10.040 | where with a large audience,
03:01:12.960 | there seems to be a certain aspect of human nature
03:01:16.600 | where people like to see others destroyed?
03:01:19.760 | - Sure.
03:01:21.240 | - Do you worry about hurting people that don't deserve it?
03:01:25.320 | Or rather sort of attacking people that are grifter light,
03:01:30.320 | but they get like a giant storm of negativity towards them
03:01:36.960 | and therefore sort of overpoweringly cancel them
03:01:41.240 | or like hurt them disproportionately?
03:01:44.240 | - Sure.
03:01:45.360 | I mean, I try to be sensitive to my platform.
03:01:50.360 | And as I've grown,
03:01:52.240 | I've tried to make sure my video topics have grown with me.
03:01:56.240 | And like, it does reach this tricky point
03:02:00.240 | where if you're exposing a grifter with like 50,000 subs,
03:02:04.680 | who's doing some harm, are you punching down?
03:02:08.480 | - Right.
03:02:09.320 | - And so far there's been enough high profile things
03:02:14.320 | that I can distract myself with
03:02:16.540 | to where this has never been a problem.
03:02:18.240 | You don't ever wanna be,
03:02:20.200 | sort of like Sir Lancelot in retirement.
03:02:25.200 | Where have you heard this analogy?
03:02:28.520 | Okay, so there's this great analogy where it's like,
03:02:31.000 | Sir Lancelot's the guy who slays the dragon, right?
03:02:34.640 | He gets a lot of fame
03:02:35.800 | and he gets a lot of fortune for slaying the dragon,
03:02:37.880 | or at least a lot of people love him.
03:02:39.720 | But what happens after he slays the first dragon?
03:02:43.920 | He's gotta go find a bigger dragon.
03:02:45.120 | So he goes find a bigger dragon.
03:02:46.920 | And eventually, depending on how many dragons
03:02:49.560 | you think are there in the world,
03:02:51.200 | maybe he kills all the dragons.
03:02:53.280 | And one day people go see Sir Lancelot
03:02:55.360 | and he's in a field with cows and he's chopping their heads.
03:02:59.160 | And he's sort of put himself in retirement,
03:03:03.360 | but he can't even enjoy the fruits
03:03:04.740 | because his whole thing is like, I'm killing the dragon.
03:03:08.360 | So I try to be cognizant
03:03:10.160 | and I try to always make myself willing
03:03:11.960 | to hang up my suspenders, I guess, hang up my hat.
03:03:18.000 | I try to be aware, if I significantly improve the problem,
03:03:21.840 | I put myself out of business,
03:03:23.700 | I want to be okay with that, basically.
03:03:27.060 | And just be fine with it.
03:03:30.160 | The funny thing is I was more worried about this
03:03:32.240 | as an issue earlier,
03:03:35.000 | because I thought there was a finite,
03:03:37.400 | I was like, I'm gonna solve this faster,
03:03:40.000 | especially as it started gaining traction.
03:03:41.720 | I'm gonna solve this fast, I got this.
03:03:43.640 | Classic naive, we all think we're so influential.
03:03:47.960 | - FTX comes along.
03:03:49.480 | - Well, yeah, and you just get,
03:03:52.960 | with time you get humbled 'cause you talk to people.
03:03:55.320 | I've talked to versions of Coffeezilla that are older
03:03:59.680 | and it's like, oh yeah, they didn't solve it
03:04:02.560 | and they probably were better.
03:04:03.920 | - I just imagine a smoke-filled room
03:04:06.320 | of just retired Batman
03:04:11.440 | and you're this young, bright-eyed,
03:04:17.080 | fiery-spirited investigator.
03:04:19.080 | - Yeah, exactly.
03:04:19.920 | - What's the process of investigation that you can speak to?
03:04:24.200 | What are some interesting things you've learned
03:04:27.520 | about what it takes to do great investigations?
03:04:30.800 | - Sure, great investigations reveal something new
03:04:34.600 | or bring something to light.
03:04:36.440 | So I think what everyone thinks in terms of investigations
03:04:40.280 | is a lot of Googling or searching through articles.
03:04:44.400 | I think that's the first thing you wanna get away from
03:04:47.360 | and you wanna try to talk to people,
03:04:51.080 | doing the non-obvious things
03:04:52.800 | and just trying to get perspectives
03:04:55.040 | that are beyond just what is available.
03:04:57.200 | So a lot of it's just having conversations
03:05:00.040 | is so enlightening, both to victims
03:05:03.160 | and also obviously trying to get,
03:05:04.400 | talk to the people themselves.
03:05:05.920 | Secondly, there's sometimes some analysis you can generate
03:05:09.800 | that's meaningful, like blockchain evidence.
03:05:12.040 | So in the case of SafeMoon, for example, going back to that,
03:05:15.880 | I found someone's secret account
03:05:18.660 | where they were pumping, dumping coins.
03:05:20.100 | They were saying things like, "Who sold?
03:05:22.100 | "I'm so mad at the guy who sold, F the guy who sold."
03:05:26.160 | And you look at his account and he was the guy selling.
03:05:28.440 | And it's like, that is just, that's great stuff.
03:05:30.920 | So digging through the blockchain,
03:05:34.000 | I've gained some skills there and that's kind of this fun,
03:05:38.400 | I guess I would say it's this weird edge I have right now
03:05:41.080 | because a lot of people don't know too much about that.
03:05:43.880 | And so I have this weird expertise that works now.
03:05:46.520 | I don't think that'll work forever
03:05:47.480 | 'cause I think people kind of figure out
03:05:48.920 | how to do very similar analyses.
03:05:50.680 | But so it's like kind of an interesting edge right now
03:05:54.320 | that I have.
03:05:55.160 | - So that's like a data-driven investigation,
03:05:56.680 | but you also do interviews, right?
03:05:58.120 | - Yeah, definitely.
03:05:58.960 | And then also recently I've tried to get more response,
03:06:01.840 | speaking to your point about like,
03:06:03.240 | as your platform gets bigger,
03:06:04.400 | you need more responsibility.
03:06:05.800 | I've tried to get much more responsible
03:06:09.560 | about like reaching out or somehow giving the subject
03:06:13.320 | some way to talk.
03:06:15.040 | Because I think in early on, I was such a small channel
03:06:18.360 | that A, if I asked them, they wouldn't answer.
03:06:21.480 | But B, I kind of felt like I was launching these videos
03:06:24.360 | into the abyss.
03:06:26.080 | And when some of my videos had real traction,
03:06:28.720 | I was like, okay, hang on a second.
03:06:31.200 | Let's double check this, let's triple check it.
03:06:33.480 | Let's try to make sure all this stuff is correct.
03:06:36.680 | And there's no other side of the story.
03:06:38.920 | I'll say this has interesting implications
03:06:41.080 | because for example,
03:06:42.480 | I investigated this thing called Genesis,
03:06:44.280 | they're a billion dollar crypto lender.
03:06:46.160 | And my conclusion was that they were insolvent.
03:06:49.200 | That's a huge accusation.
03:06:50.240 | So what do you do?
03:06:51.080 | Well, I emailed their press team, everybody.
03:06:52.840 | I said, hey, I think you're insolvent.
03:06:55.240 | I think you're this.
03:06:56.080 | I think I laid out all my accusations.
03:06:57.920 | And I said, you have till, I think 2 p.m.
03:07:00.920 | the next day to respond.
03:07:02.200 | At 8 a.m. before I made my video,
03:07:06.520 | they announced to all their investors
03:07:07.760 | that they're freezing withdrawals.
03:07:09.000 | They don't have the money.
03:07:10.360 | So they front, I don't know if they saw the,
03:07:13.040 | like I don't know if they actually saw that email.
03:07:14.600 | I don't wanna take credit for collapsing them or whatever.
03:07:17.320 | But my point is, had I not taken that level of kind of care
03:07:21.520 | and just said, hey, you're a scammer, you're frauds.
03:07:25.000 | Ironically, could I have done more good
03:07:27.740 | by allowing people to withdraw their money early?
03:07:30.560 | I made some tweets that people did see
03:07:32.640 | that like some people got their money out,
03:07:34.800 | but my YouTube audience is much larger.
03:07:36.680 | And could I have helped more people
03:07:38.280 | had I not given them basically the ability
03:07:41.040 | to know what I was gonna produce when I produced it?
03:07:43.360 | - Boy, your life is difficult.
03:07:45.400 | 'Cause you can potentially hurt the company
03:07:48.320 | that doesn't deserve it if you're wrong.
03:07:51.160 | Or if you're right and you warn the company,
03:07:54.520 | you might hurt the, whew.
03:07:56.720 | Well, I'm glad your wife is a supporter and keeps you strong.
03:08:02.800 | That's a tough, tough decision.
03:08:06.200 | Ultimately, I guess you wanna err on the side
03:08:08.280 | of the individual people, of the investors and so on.
03:08:13.280 | But it's tough.
03:08:15.680 | It's always a really, really tricky decision to make.
03:08:18.320 | - Very tricky.
03:08:19.640 | - Oh, boy.
03:08:20.600 | That's so interesting.
03:08:23.720 | And then the thing I've seen in your interviews
03:08:28.160 | that I don't remember, 'cause I think when you,
03:08:32.000 | I watched you earlier in your career,
03:08:34.940 | you were a little bit harsher.
03:08:36.160 | You were like trollier.
03:08:38.600 | You're having a little more fun.
03:08:40.520 | - Sure.
03:08:41.360 | - And when I've seen you recently, you do have the fun,
03:08:44.880 | but whenever you interview, you seem respectful.
03:08:47.920 | Like you attack in good faith,
03:08:50.840 | which is really important for an interviewer.
03:08:52.800 | So then people can go on
03:08:54.760 | and actually try to defend themselves.
03:08:56.780 | That's really important signal to send to people,
03:08:59.140 | 'cause then you're not just about tearing down.
03:09:01.480 | You're after like the, it's cliche to say, but the truth.
03:09:05.120 | Like you're really trying to actually investigate
03:09:07.800 | in good faith, which is great.
03:09:09.440 | So that signal is out there.
03:09:10.880 | So like people like SBF could,
03:09:13.160 | like he should go on your platform, I think.
03:09:16.400 | I mean, now it's like in full,
03:09:19.920 | not just like a half-assed conversation
03:09:21.800 | on Twitter space, but in full.
03:09:23.160 | So that's great that that signal is out there.
03:09:25.820 | But of course the downside of sort of,
03:09:27.680 | as you become more famous,
03:09:29.840 | people might be scared to sort of go on.
03:09:32.760 | But you do put that signal of being respectful out there,
03:09:35.360 | which is really, really important.
03:09:36.560 | - You know, it's interesting.
03:09:38.140 | It surprises me.
03:09:39.920 | I know it surprises other people
03:09:41.000 | 'cause other people have commented,
03:09:42.500 | but it consistently surprises me
03:09:44.100 | how many people still talk to me.
03:09:45.780 | And maybe it's because they,
03:09:48.960 | and I really do give a good attempt
03:09:50.780 | to try to argue in good faith.
03:09:52.480 | I try not to just like load up ad hominems
03:09:54.960 | or anything like that.
03:09:55.800 | I just try to present the evidence
03:09:56.920 | and let the audience make up their mind.
03:10:00.560 | But it surprises me sometimes
03:10:02.260 | that people will just be like,
03:10:04.160 | "Yeah, they wanna talk, they wanna talk, they wanna talk."
03:10:06.300 | I think it's very human in a way.
03:10:11.160 | And I think it's like almost,
03:10:12.760 | it's almost like good.
03:10:14.900 | Like one of the things that is always told
03:10:17.040 | to everyone who's gonna talk to the cops
03:10:18.360 | is like, "You should never talk to the cops, whatever."
03:10:21.400 | Which is true.
03:10:22.240 | You shouldn't talk to the cops.
03:10:23.400 | 'Cause even if you're innocent,
03:10:25.000 | they can use your words,
03:10:25.840 | they can twist your words, da-da-da.
03:10:27.360 | But there's something that gets lost
03:10:29.800 | in that like almost robotic, like, you know, self-interest
03:10:34.360 | that I think having open conversations,
03:10:38.040 | even if you've done something wrong,
03:10:39.920 | I think there's something really compelling about that
03:10:42.040 | that continues to make people talk in interrogation rooms,
03:10:45.920 | in Twitter spaces, wherever you are,
03:10:48.200 | regardless of whether you totally shouldn't be talking.
03:10:51.840 | And I don't wanna downplay that.
03:10:53.940 | That's actually really important.
03:10:55.140 | I mean, it's like a lot of cases get solved.
03:10:58.600 | A lot of investigations go farther
03:11:00.600 | because people sort of make the miscalculation to talk.
03:11:04.480 | But I think it's like almost important in a way
03:11:06.560 | that we have that human bias to like connect
03:11:09.720 | in spite of self-interest.
03:11:11.380 | - Yeah, but also they're judging the integrity
03:11:14.800 | and the good faith of the other person.
03:11:16.720 | So I think when people consume your content,
03:11:18.760 | especially your latest content,
03:11:20.800 | they know that you're a good person.
03:11:23.760 | I found myself,
03:11:27.440 | like there's a lot of journalists that reach out to me
03:11:30.440 | and I find myself like not wanting to talk to them
03:11:33.440 | because I don't know if the other person on the side
03:11:35.640 | is coming in good faith.
03:11:37.320 | Even on silly stuff, I'm not a-
03:11:38.960 | - Same way.
03:11:39.920 | - Like I'm not a, I don't have anything to hide.
03:11:42.680 | Like you don't really have anything to hide,
03:11:44.440 | but you don't know what their like spin is.
03:11:49.200 | - Can I tell you an example?
03:11:50.800 | I'm dying 'cause I believe so strongly
03:11:53.020 | that journalists have done themselves such a disservice.
03:11:55.960 | Okay, one of the truest things is that like
03:11:57.840 | everyone loves journalism in theory
03:12:00.540 | and almost everyone dislikes journalists as a whole.
03:12:04.360 | Like there's a deep distrust of journalists
03:12:05.960 | and there's a deep love for journalism.
03:12:08.280 | It's this weird disconnect.
03:12:09.640 | I think a lot of it can be summarized in,
03:12:11.760 | there's this book called the,
03:12:13.160 | God, what's it called?
03:12:15.760 | I think it's called "The Journalist and the Murderer."
03:12:17.900 | It's written by Janet Malcolm.
03:12:20.400 | The first line of this book is that like
03:12:23.280 | every journalist who knows what they're doing,
03:12:26.920 | who isn't too like, is smart enough to know
03:12:29.520 | what they're doing, knows what they're doing
03:12:31.280 | is deeply unethical or something like that.
03:12:33.400 | And what they're talking about
03:12:35.120 | is that there's a tradition in journalism
03:12:37.020 | to betray the subject, to lie to them
03:12:41.440 | in the hopes of getting a story
03:12:43.200 | and play to their ego and to their sense of self
03:12:47.320 | to make it seem like you're gonna write one article
03:12:49.640 | and you stab them in the back at the end
03:12:51.280 | when you press publish
03:12:52.200 | and you write the totally different article.
03:12:54.000 | This is what actually everyone hates about journalists.
03:12:56.800 | And it's happened to me before.
03:12:57.840 | So I did a story like way back in the day,
03:13:00.480 | I got interviewed about something
03:13:01.840 | that was like data with YouTube.
03:13:03.680 | I made a few comments about data and YouTube.
03:13:06.840 | And somehow by the time the article got published,
03:13:09.240 | it's about me endorsing their opinion
03:13:11.800 | that PewDiePie is an anti-Semite.
03:13:13.840 | And I'm like, I reached out to this person,
03:13:15.320 | I said, I never said that.
03:13:16.640 | Like, what are you talking?
03:13:17.480 | How did you even twist my words to say that?
03:13:19.440 | And I felt so disgusted and betrayed
03:13:21.800 | to have like, I'm like this mouthpiece for an ideology
03:13:25.800 | or like a thought that I do not actually agree with.
03:13:29.000 | So, and when journalists do this, they think,
03:13:32.600 | well, I'm never gonna interview this person again,
03:13:34.160 | so it's okay.
03:13:35.000 | So it's like, it's almost like the ends justify the means,
03:13:38.080 | I get the story.
03:13:39.600 | But the ends don't justify the means
03:13:41.040 | because you've now undermined
03:13:43.120 | the entire field's credibility with that person.
03:13:46.920 | And when that happens enough times,
03:13:49.840 | you end up sitting across from Lex Fridman
03:13:52.160 | and it's like, well, I don't know
03:13:53.280 | if they're gonna represent me fairly.
03:13:54.720 | Because the base assumption is that
03:13:57.960 | regardless of what the journalist says,
03:13:59.800 | they could betray you and they might betray you
03:14:01.640 | at the end of the day and be saying you're great
03:14:03.840 | while they're secretly writing like a hit piece
03:14:05.860 | about like, you know, how much, you know,
03:14:08.160 | you're a bad force for the world.
03:14:10.920 | Where, whereas there's an alternate universe
03:14:14.380 | where if the journalist was somewhat upfront
03:14:17.000 | about their approach, or at least didn't mislead
03:14:20.080 | and didn't say like, I love you, I think you're great,
03:14:22.780 | you would end up with less access,
03:14:27.920 | but you would end up with more trusted journalism,
03:14:30.560 | which I think in the long run would be better.
03:14:33.480 | - I think you get more access.
03:14:35.640 | - I think in the long term, yeah, but all of these,
03:14:37.320 | like everything we're talking about is long-term games
03:14:39.320 | versus short-term games.
03:14:40.560 | - Yeah.
03:14:41.400 | - In the short term, you get more access
03:14:42.280 | if you suck up to the person,
03:14:43.780 | if you say this, say this, say this,
03:14:45.520 | and you stab them in the back later.
03:14:47.160 | Long-term, you build a long-term reputation,
03:14:49.600 | people trust you, it actually matters more.
03:14:51.680 | - And it's nice when that reputation
03:14:53.240 | is your own individual.
03:14:54.440 | So like you have a YouTube channel, you're one individual.
03:14:58.480 | So people trust that because you have a huge disincentive
03:15:02.520 | to screw people over.
03:15:04.320 | - True.
03:15:05.240 | - I feel like if you're in the New York Times,
03:15:07.880 | if you screw somebody over,
03:15:09.040 | the New York Times gets the hit, not you individually.
03:15:11.760 | So you can like, you're safer,
03:15:15.800 | but like the reason I don't screw people over
03:15:18.320 | is I know that, well, there's my own ethics and integrity.
03:15:23.240 | - Sure, but yeah.
03:15:24.080 | - Also there's a strong disincentive to like,
03:15:25.920 | 'cause you're now, I'm going,
03:15:27.240 | that person is gonna go public with me screwing them over,
03:15:30.640 | completely lying about everything,
03:15:32.720 | how I presented the person, for example.
03:15:35.020 | And that's just gonna, you know,
03:15:36.520 | that's gonna percolate throughout the populace,
03:15:39.640 | and they're gonna be like,
03:15:40.480 | this is the person that's a lying sack of shit.
03:15:43.840 | And so there's a huge disincentive to do that.
03:15:46.040 | Yeah, journalists don't have that.
03:15:47.760 | - That is what's interesting about, yeah,
03:15:50.120 | the move towards independent journalism.
03:15:52.760 | I think we'll probably end up at a space where,
03:15:56.500 | it's so interesting.
03:15:59.080 | Mainstream journalism has so much work to do
03:16:01.440 | to repair the trust with the average individual.
03:16:04.840 | And it's going to take a lot of like self-reflection.
03:16:09.680 | I've talked to a few mainstream journalists about this,
03:16:12.600 | and a lot of them will admit it behind closed doors,
03:16:16.120 | but like there's this general sense that,
03:16:18.320 | oh, the public's not being fair to us.
03:16:19.960 | Like they're very self, they're defensive, I guess,
03:16:23.360 | in a way.
03:16:24.200 | And I understand why, because sometimes
03:16:27.720 | it's just a few bad apples that ruin it for everybody,
03:16:30.720 | but without the acknowledgement of the deep distrust
03:16:34.280 | that they have with a good portion of our society,
03:16:39.320 | there's no way to rebuild that.
03:16:41.480 | Just like when there's no acknowledgement
03:16:43.120 | of the corruption of the 2008 financial crash,
03:16:46.620 | there's no way to rebuild that.
03:16:48.080 | Even if most bankers, most traders are not unethical
03:16:52.880 | or duplicitous or they're totally normal people
03:16:56.760 | who maybe aren't deserving of the bad reputation,
03:16:58.840 | but you have to acknowledge the damage that's been done
03:17:01.680 | by bad actors before you can like heal that system.
03:17:05.640 | - Well, what do you think about Elon just opening the door
03:17:08.160 | to a journalist to see all the emails that were sent,
03:17:11.560 | the quote unquote Twitter files?
03:17:14.680 | - Yeah, that's really interesting.
03:17:15.960 | I mean, I saw a lot of, I'm like in this weird thing
03:17:19.960 | where I see, I follow a lot of independent people
03:17:23.840 | and I follow a lot of mainstream journalists
03:17:25.640 | and that there are very polar opposite takes on that.
03:17:29.480 | - People really quickly politicize it,
03:17:31.120 | but to me, the thing that was fascinating
03:17:32.640 | is just the transparency that I've never seen from,
03:17:35.680 | one of the really frustrating things to me,
03:17:37.360 | is a lot of this podcast has been about interviewing
03:17:39.960 | tech people, CEOs and so on,
03:17:41.960 | and they're just so guarded with everything.
03:17:44.640 | It's hard to get to, and so it's nice to get,
03:17:47.300 | hopefully this is a signal, look, you can be transparent.
03:17:52.160 | Like this is a signal to increase transparency.
03:17:55.040 | - Hopefully so.
03:17:56.880 | I don't, yeah, it's been tribalized so quickly,
03:17:59.720 | it's like I've lost a lot of faith in that.
03:18:01.840 | And unfortunately it's been this like bludgeon match
03:18:04.320 | of like, if you're on the right,
03:18:08.280 | you think it's uncovering the greatest story ever
03:18:10.680 | about Hunter Biden.
03:18:11.520 | If you're on the left, you think they were just sharing,
03:18:13.920 | they were just silencing revenge porn pics of Hunter Biden.
03:18:16.920 | So therefore it was justified.
03:18:18.000 | And by the way, Trump also sent messages to Twitter.
03:18:20.600 | So doesn't that mean that like we should be criticizing?
03:18:22.600 | It just, like, this goes back to why I don't touch
03:18:26.200 | politics, 'cause I think as many problems as I have,
03:18:29.920 | I think when you become a journalist that,
03:18:32.560 | not even a political journalist,
03:18:33.880 | when you become a journalist in politics,
03:18:36.480 | you have like twice the problem.
03:18:38.240 | So I'm like, I'm happy to be well outside
03:18:41.040 | of that kind of sphere.
03:18:43.320 | - But it's an interesting--
03:18:44.960 | - It is interesting.
03:18:45.800 | - It, you know, forget Twitter files,
03:18:47.400 | but Twitter itself is really, really interesting
03:18:50.560 | from the virality of information transfer.
03:18:54.800 | - Yeah.
03:18:55.640 | - And from a journalistic perspective,
03:18:57.240 | it's like how information travels,
03:18:59.580 | how it becomes distributed, it's interesting.
03:19:03.000 | - What do you think about Twitter?
03:19:04.960 | I'm always conflicted on Twitter
03:19:07.080 | 'cause I almost hate how much I enjoy using it.
03:19:12.080 | 'Cause I'm like, this is like this mindless bird app
03:19:15.680 | is consuming my time.
03:19:16.880 | It's this incredible networking tool.
03:19:20.460 | But what's weird is when I think about my own presence
03:19:23.040 | on Twitter, they've almost made it too easy
03:19:25.720 | to like say something that you've half thought.
03:19:29.040 | Like the friction to send a tweet is so much less
03:19:31.240 | than like if I'm gonna make a YouTube video,
03:19:33.320 | there's several points at which I'm like,
03:19:34.720 | well, what's the other side?
03:19:36.000 | What's this, what's that?
03:19:37.200 | There's no friction there.
03:19:38.480 | And so one thing I've noticed
03:19:39.480 | is everyone I follow on Twitter,
03:19:41.080 | a lot of them after reading all their tweets,
03:19:44.500 | I think nothing more of them, nothing less of them.
03:19:47.440 | But there's a lot of them that I think less of.
03:19:49.400 | And I don't think I've ever had an experience
03:19:51.600 | where I've read someone's tweets
03:19:53.500 | and I think more of them in a way.
03:19:55.320 | And I'm like, what does that say that?
03:19:57.480 | - Yeah, what is that?
03:19:59.560 | Like there's so many people I admire
03:20:03.560 | that the worst of them is represented on Twitter.
03:20:07.600 | Like there's a lot of people.
03:20:10.400 | - There's a million examples.
03:20:11.280 | - They become like snarky and sometimes mocking
03:20:15.440 | and derisive and negative and like emotional messes.
03:20:19.400 | I don't know, yeah, what is that?
03:20:23.480 | Maybe we shouldn't criticize it
03:20:26.280 | and accept that as like a beautiful,
03:20:28.500 | raw aspect of the human being,
03:20:30.420 | but not encompassing,
03:20:32.040 | not representing the full entirety of the human being.
03:20:34.120 | - But it does reflect like,
03:20:35.260 | it's impossible to not reflect it to some extent.
03:20:38.400 | Or you'd have to counter that bias really carefully
03:20:40.360 | 'cause that is them.
03:20:41.680 | It is a thought they had.
03:20:43.180 | It's just probably something
03:20:44.200 | that should have been an unexpressed thought perhaps.
03:20:46.940 | So yeah, I kind of wonder like my,
03:20:51.120 | I'm like, should I be on Twitter?
03:20:52.560 | But the problem is, is it's such a great place
03:20:56.000 | where so many, like so much of the news happens on Twitter,
03:20:59.940 | so much of the journalism breaks on Twitter.
03:21:03.100 | Even people in the New York Times,
03:21:04.460 | they'll tweet their scoop.
03:21:05.860 | And they'll like, they'll put that out on Twitter first.
03:21:07.980 | So it's this really weird thing where I'd love to be off it
03:21:11.980 | and it's like too useful for my job,
03:21:13.700 | but I kind of hate it.
03:21:15.620 | - No, no, you need to, well, it depends.
03:21:17.820 | But from my perspective, you should be on it.
03:21:20.580 | - Oh, I definitely am, yeah.
03:21:22.300 | - So like Coffeezilla should definitely be on Twitter.
03:22:25.940 | But have developed the calluses
03:21:30.420 | and the strength to not give into the lesser aspects.
03:21:35.420 | 'Cause you're silly, you're funny,
03:21:39.540 | you could be cutting with your humor.
03:21:42.380 | I wouldn't give into the darker aspects of that,
03:21:47.380 | like low effort negativity.
03:21:50.180 | If you're, the way you are in your videos,
03:21:52.260 | I would say if you're ever negative or making fun of stuff,
03:21:55.420 | I think that's high effort.
03:21:57.100 | So I would still put a lot of effort into it,
03:21:59.340 | like calmly thinking through that.
03:22:01.660 | And also not giving into the dopamine desire
03:22:06.140 | to say something that's gonna get a lot of likes.
03:22:08.940 | I have that all the time.
03:22:11.500 | You use Twitter enough, you realize certain messages
03:22:14.740 | that are going to get more likes than others.
03:22:16.980 | - And are usually the ones that are extreme, more extreme.
03:22:20.860 | And like emotional, like Lex is an idiot.
03:22:23.940 | Or like Lex is the greatest human being ever.
03:22:26.580 | It's much better than,
03:22:27.500 | oh wow, what a polite nuanced conversation.
03:22:30.100 | I can tell you right now which of those three tweets
03:22:31.900 | isn't gonna perform well.
03:22:33.740 | - Yeah, yeah.
03:22:35.140 | So I think the extremes are okay if you believe it.
03:22:38.500 | Like I will sometimes say positive things.
03:22:42.820 | I said that the Twitter files release was historic.
03:22:46.780 | Of course, this before I realized,
03:22:49.500 | I mean, the reason I said it is not,
03:22:53.420 | is because of the transparency.
03:22:55.540 | It's so refreshing to see some, any level of transparency.
03:23:00.540 | And then of course, those kinds of comments,
03:23:04.260 | the way Twitter does, is every side will interpret it
03:23:08.140 | in the worst possible way for them,
03:23:10.140 | and they will run with it.
03:23:12.500 | Or some side, when it's political,
03:23:14.300 | yeah, one side will interpret, yes, I agree with Lex.
03:23:18.860 | - It's historic.
03:23:20.140 | They might not have even read the article.
03:23:21.620 | They just like, they literally, or the tweet thread,
03:23:23.980 | and they're just like, it is historic.
03:23:25.940 | - Historic because Hunter Biden was finally the collusion,
03:23:29.180 | or whatever it is.
03:23:30.180 | And then the other side is like, no, it's a nothing burger.
03:23:33.520 | But yeah, that aspect of nuance, and that it's frozen
03:23:37.580 | in time, even with editing, there's a--
03:23:39.740 | - So tough.
03:23:40.780 | - It's tricky, but if you maintain a cool head
03:23:44.100 | through all of that, and hold to your ethics and your ideas,
03:23:49.180 | and use it to spread the ideas,
03:23:52.140 | which you do extremely well on YouTube,
03:23:54.860 | I think it'd be a really powerful platform.
03:23:57.100 | There's no other platform that allows
03:24:00.220 | for the viral spread of ideas, good ideas, than this.
03:24:04.540 | And this is where, like, especially with Twitter spaces,
03:24:10.500 | I mean, where else would I see, I think,
03:24:15.500 | twice impromptu conversations with coffees
03:24:18.860 | with Yolo and SBF, like--
03:24:20.100 | - Never, yeah, nowhere else.
03:24:21.540 | 'Cause he wasn't gonna come on my show.
03:24:22.980 | He wasn't gonna come on some big prepared thing.
03:24:24.660 | It's like, hey, Yolo, let's go on a Twitter space.
03:24:28.100 | And I like pop up, and you know what's funny?
03:24:31.100 | And this, I hope this release is late enough,
03:24:32.860 | or well, SBF probably won't see this.
03:24:35.460 | Although I'm sure he's--
03:24:36.300 | - And unfortunately, unfortunately he will.
03:24:39.140 | - Oh, okay. - Yes.
03:24:39.980 | - Well, hopefully I'll have time to enact my little plan,
03:24:44.140 | but I'm hoping if he goes on any future spaces,
03:24:46.900 | I can haunt him from interview to interview,
03:24:49.860 | where just I keep showing up, and he's like,
03:24:51.740 | ah, I hate this guy.
03:24:52.980 | - But I think he's already kind of probably
03:24:55.580 | has PTSD of, in the shadows lurks a Coffeezilla.
03:25:02.740 | - That just would, that would just,
03:25:04.580 | it's just like, that really amuses me.
03:25:07.260 | - I mean, there is, I think he honestly
03:25:10.100 | would enjoy talking to you.
03:25:11.100 | There's an aspect of Twitter spaces
03:25:12.900 | that's a little uncertain of what are we doing here,
03:25:16.020 | because there's an urgency,
03:25:17.580 | because other voices might want to butt in.
03:25:19.300 | - Exactly, exactly.
03:25:20.180 | - If it's an intimate one-on-one,
03:25:21.940 | where you can like breathe, like hold on a second,
03:25:25.140 | I think it's much easier.
03:25:26.500 | So that sense, Twitter spaces is a little bit negative,
03:25:29.860 | that there's too many voices,
03:25:32.860 | especially if it's a very controversial kind of thing.
03:25:35.620 | So it's tricky, but at the same time,
03:25:37.940 | the friction, it's so easy to just jump in.
03:25:40.720 | So I can just, I mean, I could,
03:25:45.460 | I mean, you just imagine like a Twitter space
03:25:47.500 | with like Zelensky and Putin.
03:25:49.100 | Like how else are these two going to talk, right?
03:25:52.900 | Like, can't you imagine?
03:25:54.460 | - If you try to set it up on Zoom,
03:25:56.060 | like it never happens.
03:25:56.900 | - It's never happening.
03:25:57.720 | - Too many delegates, like the only way it happens.
03:25:59.980 | - Just imagine Putin like sitting there like.
03:26:01.580 | - Zelensky's live.
03:26:02.860 | - Just live, right, just jumping in.
03:26:05.300 | - It's hilarious.
03:26:06.260 | - (speaking in foreign language)
03:26:09.340 | Okay, actually just on a small tangent.
03:26:12.500 | So how do you have a productive day?
03:26:14.380 | Do you have any insights on how to manage your time optimally?
03:26:17.820 | - Yeah, I mean, I've gone the gamut of,
03:26:20.540 | from obsessive time tracking in 15 minute buckets
03:26:23.740 | to kind of like the other extreme
03:26:27.100 | where it's more kind of like large scale,
03:26:32.100 | some deep work here, two hour bucket,
03:26:35.180 | account for an hour of lunch and some other thing.
03:26:37.940 | But now, I just roughly, because I manage a team
03:26:41.660 | and there's some things that kind of come up,
03:26:44.660 | it's only a team of two, it's not like big,
03:26:46.180 | but I just have things that are not necessarily
03:26:47.860 | controllable by me.
03:26:48.860 | I like have to take some meetings or whatever.
03:26:50.900 | It's not as easy to plan out my day ahead of time.
03:26:53.340 | So I do a lot of retrospective time management
03:26:56.540 | where I look at my day and that's what I mostly do now.
03:26:59.300 | And I account, did I spend this day productively?
03:27:02.340 | What could I do better?
03:27:03.460 | And then try to implement it in the future.
03:27:05.700 | So a lot of this I realized is very personal for me.
03:27:09.940 | I do very well in long streaks of working.
03:27:14.100 | And if I can't do a lot of work in 15 minutes,
03:27:16.700 | I can't do a lot of work in even an hour,
03:27:19.060 | but if you give me like three hours or five hours
03:27:21.540 | or six hours of uninterrupted work,
03:27:23.680 | that's where I get most of my stuff done.
03:27:25.580 | - So from the, it'd be fun to explore those,
03:27:28.500 | when you did 15 minute buckets.
03:27:30.500 | So you have a day in front of you
03:27:33.940 | and you have like a Google sheet or a spreadsheet
03:27:35.580 | or something.
03:27:36.420 | - Yeah, I did an Excel.
03:27:37.340 | - Excel.
03:27:39.420 | And you're, do you have a plan for the day
03:27:42.340 | or do you go like when you did it,
03:27:44.300 | or you just literally sort of focus on a particular task
03:27:50.260 | and then you're tracking as you're doing that task
03:27:52.540 | of every 15 minutes?
03:27:53.580 | - Yeah, I would kind of do it live.
03:27:55.340 | I'm not, so one of the reasons I'm so obsessive about it
03:27:58.620 | is 'cause I'm not organized by nature.
03:28:00.820 | And I lost, like in college I learned how much
03:28:05.340 | lack of organization can just hurt you in terms of output.
03:28:09.820 | And so I realized like I just had to build systems
03:28:12.580 | that would enable me to become more organized.
03:28:15.700 | So really, I think that doing that really taught me a lot
03:28:20.700 | about time in the same way that tracking calories
03:28:24.820 | can teach you about food.
03:28:26.260 | Like just learning accounting for these things
03:28:28.980 | will give you skills that eventually you might not need
03:28:31.540 | to track on such a granular level
03:28:33.180 | because you've kind of like figured out.
03:28:35.260 | So that's kind of how I feel about it.
03:28:37.340 | I think everyone should,
03:28:38.540 | if you care about productivity and stuff,
03:28:40.360 | should do a little bit of it.
03:28:41.780 | I don't think it's sustainable in the long term.
03:28:43.860 | It just takes so much effort and time to like,
03:28:46.060 | and I think the marginal effect of it in the long term
03:28:49.460 | is kind of minimal once you learn these basic skills.
03:28:52.680 | But yeah, I was basically tracking like live what I did.
03:28:56.560 | And what I saw is that a lot of my real work
03:29:01.500 | would be done in small sections of the day.
03:29:04.140 | And then it'd be like a lot of just nothing,
03:29:06.140 | like a lot of small things where I'm busy,
03:29:08.980 | but little is being achieved.
03:29:11.540 | And so I think that's a really interesting insight.
03:29:14.740 | I've never figured out how to un-busy myself
03:29:16.980 | and focus on the like core essentials.
03:29:19.340 | I'm still getting to that.
03:29:21.100 | But it is interesting realizing most of your day
03:29:23.840 | is like a lot of nothing.
03:29:25.260 | And then like some real deep work
03:29:26.860 | where most of your value comes from is like 20% of your day.
03:29:29.740 | - Yeah, I tried to start every day with that.
03:29:31.900 | So the hardest task of the day,
03:29:33.100 | you focus for long periods of time.
03:29:34.980 | And I also have the segment of two hours
03:29:37.620 | where it's a set of tasks that I do every single day.
03:29:40.100 | The idea is you do that for like your whole life.
03:29:43.180 | It's like long-term investment of Anki
03:29:45.780 | and it's just like learning and reminding yourself of facts.
03:29:50.780 | You know, they're useful in your everyday life.
03:29:54.660 | And then for me, also music.
03:29:56.300 | I'll play a little bit of music.
03:29:57.460 | - Play a piano.
03:29:58.460 | - Piano.
03:29:59.300 | And so like keeping that regular thing
03:30:01.020 | is part of your life.
03:30:01.860 | - And one thing that I've really taken from this
03:30:04.380 | is 'cause I've read all the like,
03:30:06.060 | I had a self-improvement phase in my early 20s.
03:30:08.820 | And one thing you learn is that everyone wants to give you
03:30:12.980 | a broad general solution.
03:30:15.100 | But really the real trick of figuring out like optimizing
03:30:20.100 | is figuring out the things that work for you specifically.
03:30:23.400 | So like one interesting thing you said is like,
03:30:25.740 | oh, I like to do my hard work at the beginning of the day.
03:30:29.940 | I know a lot of people recommend this.
03:30:32.300 | I've tried so many times
03:30:34.380 | and I just do better work late at night.
03:30:37.320 | And so usually my streak of work is like from like
03:30:40.060 | after dinner, 7:30 to like 2 a.m.
03:30:43.340 | That's my prime time.
03:30:45.120 | And so like a lot of my videos, which you'll see,
03:30:47.680 | which is like lit from this studio,
03:30:49.820 | which appears to be daytime,
03:30:50.740 | it's like shot at 3 a.m., you know,
03:30:52.780 | just like in a caffeine fueled rush.
03:30:54.960 | But that's kind of how it works for me.
03:30:58.100 | And then also like with the social outlets
03:31:00.980 | and stuff like that, which it's easy.
03:31:03.220 | And I know, I feel like we think similarly on this.
03:31:05.980 | So it's easy to discount these things as less relevant
03:31:09.700 | 'cause they don't have quantitative metrics
03:31:13.060 | associated with them.
03:31:14.900 | But in terms of longevity and like,
03:31:17.780 | I think to be able to do creative work,
03:31:21.460 | there's an amount of recharge and like re-inputting stuff
03:31:25.100 | that is frequently discounted by people like us
03:31:27.680 | who are like obsessed with quantitative metrics.
03:31:30.660 | And so I really found that some of my best work
03:31:33.380 | gets done after I take like a break
03:31:35.220 | or I'll go play like live sets of music.
03:31:39.380 | And I mean, like that's like for me really recharging,
03:31:41.860 | but nowhere on a spreadsheet is that gonna show up
03:31:44.460 | as productive or like meaningful.
03:31:46.580 | But for me, for whatever reason,
03:31:49.300 | it recharges me in a way that like
03:31:51.060 | I need to pay attention to.
03:31:52.140 | - Yeah, for sure.
03:31:52.980 | I usually have a spreadsheet in 15 minute increments
03:31:55.240 | when I'm socially interacting with people.
03:31:57.180 | And I evaluate how--
03:31:58.580 | - I'm getting roasted right now.
03:31:59.740 | - No, I'm not.
03:32:00.580 | It's actually, I'm probably roasting myself.
03:32:04.060 | But I do find that when I do have social interactions,
03:32:08.200 | I like to do it with people that are,
03:32:10.440 | outside of that, exceptionally busy themselves.
03:32:13.900 | 'Cause then you understand the value of time.
03:32:16.660 | And when you understand the value of time,
03:32:18.620 | your interaction becomes more intimate and intense.
03:32:22.700 | Like the cliche of work hard, party hard
03:32:26.500 | or whatever the cliche is.
03:32:27.700 | - Play hard.
03:32:28.540 | - Play hard, damn it.
03:32:29.380 | Whatever the--
03:32:30.220 | - In good social interaction.
03:32:32.340 | No, I'm just kidding.
03:32:33.380 | - I mean, that cliche, there's a truth to that.
03:32:39.500 | But the intensity of the social interaction,
03:32:43.060 | even like, you know, it's not even the intensity,
03:32:45.580 | it's not even the party hard.
03:32:46.700 | It's like, even if you're going hiking and relaxing
03:32:49.740 | and taking in nature, so it's very relaxed,
03:32:52.060 | but you understand the value of that.
03:32:54.380 | When you put a huge amount of value on those moments
03:32:57.460 | spent in nature, that recharges me much more.
03:33:01.260 | So you have to surround yourself with people
03:33:02.660 | that think of life that way,
03:33:04.180 | that think about the value of every single moment.
03:33:06.660 | That's one of the things you do
03:33:07.580 | when you break it up in 15 minute increments,
03:33:10.240 | is you realize how much time there is in the day,
03:33:13.380 | how much awesomeness there is in the day
03:33:15.180 | to experience, to get done and so on.
03:33:17.340 | And then so you can feel that when you're with somebody.
03:33:21.500 | And then for me personally, like when I interact with people
03:33:24.620 | I really like to be fully present for the interaction.
03:33:29.060 | - I can tell this is, for anyone who has, you know,
03:33:32.360 | I've been the audience forever.
03:33:33.740 | So I haven't been on this side of the table before.
03:33:36.020 | You're very intense.
03:33:36.860 | You look right in the eye.
03:33:38.660 | - Well, I don't know about right in the eye.
03:33:39.980 | Eye contact is an issue, but yes, I'm there.
03:33:41.740 | - Lex has a soulful gaze, guys,
03:33:43.520 | just in case you were wondering.
03:33:44.980 | It's very soulful, it's very comforting.
03:33:47.100 | It's like a warm hug.
03:33:48.900 | - Back to serious talk.
03:33:50.060 | Okay, you've studied a lot of people who lie,
03:33:53.740 | who defraud, cheat and scam on a basic human level.
03:33:58.740 | How do you have trouble trusting human beings
03:34:05.100 | in your own life?
03:34:06.020 | What's your relationship with trust of other humans?
03:34:10.660 | - It's a great question.
03:34:11.660 | So funny enough, before I did this,
03:34:14.280 | I was like an incorrigible optimist.
03:34:18.380 | Everything, the sun shined, every which place.
03:34:22.340 | I always saw like everybody as fundamentally good.
03:34:27.300 | Nobody was bad.
03:34:28.420 | It just was like sort of that wrong place,
03:34:29.940 | wrong time, bad incentives.
03:34:31.300 | That view has darkened significantly,
03:34:36.100 | but I just try to remember my sample set
03:34:40.420 | and just like, I'm just sampling sort of the worst.
03:34:43.260 | And I try not to let it bleed into my day-to-day life.
03:34:48.220 | And I think it's probably because
03:34:50.940 | I was such an optimist early on
03:34:52.580 | that I've been able to kind of retain some of it.
03:34:55.900 | I call it enlightened optimism.
03:34:57.660 | Like choosing to be optimistic in the face of
03:35:00.300 | a realistic sense of the problems in the world
03:35:04.100 | and with a realistic sense of like the scale
03:35:06.580 | and the challenge ahead.
03:35:08.180 | I actually think it's much braver to be an optimist
03:35:14.100 | when you're aware of what's going on in the world
03:35:16.380 | than to be a cynic.
03:35:17.220 | I think being a complete cynic is maybe,
03:35:21.100 | I'm not saying it's wrong,
03:35:22.300 | but I'm saying it's maybe the easier way
03:35:24.980 | just mentally to cope with so much negativity.
03:35:29.340 | It's like just saying, well, it's all bad.
03:35:31.540 | It's all doomed to fail.
03:35:32.380 | It's all gonna go bust is easier.
03:35:36.940 | - Yeah, that leap into believing it's a good world
03:35:42.260 | is a little baby act of courage.
03:35:47.540 | - At least I think so.
03:35:48.700 | I don't think it's naive.
03:35:51.060 | - No, it can be.
03:35:53.540 | Some people are naive that are optimistic,
03:35:55.140 | but oftentimes just because someone's optimistic
03:35:59.060 | does not mean they're naive.
03:35:59.980 | They could be full well aware of how troubling the world is.
03:36:03.580 | - And I also believe some of the people you study,
03:36:07.340 | you know, I'm a big believer that all of us
03:36:09.020 | are capable of good and evil.
03:36:10.940 | So in some sense, the people you study
03:36:14.980 | are just the successful ones.
03:36:17.860 | The ones who chose sort of the dark path
03:36:22.740 | and were successful at it.
03:36:23.780 | And I think all of us can choose the dark path in life.
03:36:26.540 | That's the responsibility we all carry
03:36:34.380 | is we get to choose that at every moment.
03:36:37.340 | And it's like a big responsibility.
03:36:40.180 | And it's a chance to really have integrity.
03:36:42.300 | It is a chance to stand for something good in this world.
03:36:45.360 | All of us have that because I think
03:36:47.280 | all of us are capable of evil.
03:36:49.320 | All of us could be good Germans.
03:36:50.600 | All of us could be part of the atrocities.
03:36:54.860 | - Yeah, I think it's, I really have,
03:36:59.440 | especially in recent years,
03:37:01.160 | tried to somewhat depersonalize my work
03:37:05.400 | and see it almost like as like a, I don't know,
03:37:10.400 | like a force of nature that I'm fighting
03:37:13.640 | more than like individuals.
03:37:15.960 | Because of this exact thing, I think like sort of,
03:37:18.720 | there, but for the grace of God, go I,
03:37:20.680 | is kind of a really profound way to understand yourself
03:37:24.960 | rather than it's just like fundamentally good
03:37:27.900 | and like full of integrity.
03:37:30.520 | Acknowledging that so much of that
03:37:32.140 | is a product of your environment
03:37:33.720 | and your family and your upbringing.
03:37:36.160 | And so much of the people who don't have that
03:37:38.480 | is a product of their environment.
03:37:40.880 | It doesn't absolve them,
03:37:42.720 | but it gives you more perspective
03:37:44.760 | to like to sort of deal fairly, if that makes sense,
03:37:50.080 | and not approach it from a place of anger
03:37:52.280 | or a place of outrage.
03:37:54.520 | There is a sense of like sadness for the victims.
03:37:56.960 | There's a sense of outrage for the victims,
03:37:59.120 | but approach the individual who's done the thing
03:38:02.440 | from that place of understanding of,
03:38:07.040 | this isn't just this person.
03:38:08.560 | There's like a whole broader thing going on here.
03:38:11.600 | Do you have advice for young people
03:38:13.560 | of how to live this life?
03:38:16.500 | How to have a career they can be proud of?
03:38:19.000 | So high school students, college students,
03:38:20.600 | or maybe a life they can be proud of?
03:38:22.440 | - That's a great, well, let me think about this
03:38:24.960 | for a second.
03:38:25.800 | I think, don't be afraid to go against the grain
03:38:36.280 | and sort of challenge the expectations on you.
03:38:42.520 | Like you sort of have to do this weird thing
03:38:46.240 | where you acknowledge how difficult it will be
03:38:48.800 | to achieve something great
03:38:49.980 | while also having the courage to go for it anyways.
03:38:53.640 | And understanding that other people
03:38:56.040 | don't have it figured out, I think is a big theme of my work,
03:38:59.300 | which is that everyone wants like the guru
03:39:02.400 | to show them the way, to show them the secrets.
03:39:05.120 | So much of life and achieving anything
03:39:07.560 | is learning to figure it out yourself.
03:39:09.880 | And like the meta skill of being an autodidactic
03:39:14.160 | where you can, I don't know if I said that word right.
03:39:15.960 | Basically you self-teach.
03:39:17.280 | You learn the meta skill of like learning to learn.
03:39:20.620 | I think that's such an underrated aspect of education.
03:39:23.540 | People leave education, they go,
03:39:24.560 | when am I gonna use two plus two?
03:39:26.200 | When am I gonna learn, use calculus?
03:39:28.580 | But so much of it is learning this higher level
03:39:30.720 | abstract thinking that can apply to anything.
03:39:33.980 | And getting that early on is incredibly important
03:39:39.580 | and incredibly powerful.
03:39:41.340 | So yeah, I would say like a lot of it is,
03:39:44.840 | is I guess to some extent,
03:39:47.140 | like you kind of have to do that Steve Jobs thing
03:39:48.960 | where you realize that nobody else in the world
03:39:51.340 | is smarter than you.
03:39:52.980 | And that both means that like
03:39:55.140 | they can't show you the perfect way,
03:39:56.420 | but it also means you could do great things
03:39:58.300 | and kind of chart your own path, I don't know.
03:40:00.260 | That's so cheesy.
03:40:01.340 | This is why I hate giving advice.
03:40:03.140 | (laughs)
03:40:03.980 | I feel like it's cheesy.
03:40:04.800 | I mean, and I don't think it is.
03:40:05.640 | Like I think my journey is so full of luck
03:40:08.860 | and like specific experience.
03:40:10.900 | I wonder how generalizable it is.
03:40:12.620 | But if I've learned anything
03:40:14.940 | and if I could talk specifically to myself,
03:40:17.240 | I guess that's what I would say.
03:40:18.300 | - I mean, you've taken a very difficult path.
03:40:21.660 | And I think part of taking that path,
03:40:23.520 | like of a great journalist, frankly,
03:40:26.020 | is like I can be that person.
03:40:30.120 | Like just believing in yourself that you could take that.
03:40:33.680 | 'Cause like if you see a problem in the world,
03:40:38.780 | you could be the solution to that problem.
03:40:41.020 | Like you can solve that problem.
03:40:42.620 | I think that's like, it's really important to believe that.
03:40:46.060 | It depends.
03:40:47.420 | Maybe you're lucky to have the belief inside yourself.
03:40:50.920 | Maybe the thing that you're saying is like,
03:40:54.140 | don't look to others for that strength.
03:40:56.780 | - And also like be really comfortable failing.
03:40:59.580 | I think one of the best things
03:41:02.580 | that like you would never know about me,
03:41:05.120 | just looking at my background,
03:41:06.560 | that helped me was playing music live.
03:41:11.360 | I had incredible amounts of stage fright growing up,
03:41:17.340 | mostly because I was terrible at piano.
03:41:19.260 | I was like, sucked.
03:41:20.620 | And I specifically, I taught myself how to play.
03:41:23.180 | And I joined jazz band in like high school,
03:41:25.980 | did it through college.
03:41:27.260 | I remember all my recitals,
03:41:29.780 | I messed up every single solo I ever did.
03:41:32.340 | I never like actually nailed it.
03:41:34.660 | And every time I'd go up there,
03:41:35.660 | I'd like have so much dread around this.
03:41:38.420 | And it was easier to get up there
03:41:41.320 | 'cause there were sort of some people up there.
03:41:43.140 | But eventually I started like playing live too.
03:41:45.660 | And I sucked at that.
03:41:47.360 | And I've just gone through the trenches
03:41:48.780 | of like just like being publicly sort of,
03:41:52.160 | in my mind humiliated,
03:41:54.300 | like that prepared me so much for what I do now
03:41:58.380 | of trying to basically being fearless of failure
03:42:02.460 | in the face of like a wide audience.
03:42:05.140 | I don't have that anymore
03:42:06.480 | 'cause kind of I've experienced so many iterations of it
03:42:09.900 | at a smaller scale of just like abject public humiliation
03:42:12.980 | to where it's like not something that bothers me.
03:42:14.780 | I have no stage fright; it doesn't bother me anymore.
03:42:17.660 | But you'd think like,
03:42:19.460 | oh, maybe he just was always good at this.
03:42:21.000 | I was terrible at it.
03:42:21.900 | I had a complete phobia about public anything.
03:42:25.400 | So it was that rapid iteration of just failure.
03:42:30.340 | And eventually I just like came to the conclusion
03:42:31.780 | of like, I wanna love it.
03:42:33.060 | I wanna like love like getting up on a stage and bombing.
03:42:36.500 | If you can learn to like love that and be fearless there,
03:42:40.360 | there's almost nothing you can't do.
03:42:42.900 | - Yeah, that's brilliant advice.
03:42:44.420 | I'm with you, still terrifying to me, like live performance.
03:42:48.060 | But yeah, that's exactly the feeling
03:42:49.960 | is loving the fact that you tried.
03:42:54.140 | And somehow failure is like a deeper celebration
03:42:58.860 | of the fact that you tried.
03:43:00.620 | 'Cause success is easy.
03:43:01.900 | But like failure is like bombing.
03:43:04.860 | I mean, music, yeah, on small scale,
03:43:08.060 | on the smallest and the largest of stages,
03:43:10.740 | I'm not gonna say who, but there's a huge band,
03:43:14.620 | huge band that wanted to be on stage.
03:43:18.080 | And it probably will happen.
03:43:21.540 | But like, but I turned it down because I was like, no.
03:43:26.540 | 'Cause I'm gonna suck for sure.
03:43:28.580 | So the question is,
03:43:30.340 | do I wanna suck in front of a very large live audience?
03:43:35.340 | And then I turned it down 'cause I was like, no, no, no.
03:43:40.920 | But now I realize, yes.
03:43:43.860 | - Embrace it.
03:43:44.700 | - It's gonna be good.
03:43:45.660 | It's gonna be good for you.
03:43:46.900 | It's gonna crush your ego to the degree it's remaining.
03:43:50.260 | And it's just good for you.
03:43:51.220 | It's good not to take yourself seriously
03:43:53.020 | and do those kinds of things.
03:43:54.260 | But honestly, I feel that way in an audience,
03:43:58.060 | like in an open mic.
03:44:00.140 | It hurts.
03:44:01.540 | That's why I really admire comedians.
03:44:02.980 | And like, I go to open mics all the time
03:44:06.860 | with comedians and musicians,
03:44:08.780 | and I just see them bomb and play in front of
03:44:12.060 | like just a few folks,
03:44:13.580 | and they're putting their heart out.
03:44:15.340 | And especially the ones that kind of suck,
03:44:17.460 | but are going all out anyway.
03:44:19.740 | - I think open mics are the best place to learn though,
03:44:22.760 | because it's the lowest stakes you can get
03:44:25.020 | while still being public.
03:44:26.780 | If you're gonna face like fears around this,
03:44:29.140 | 'cause we're talking very specifically like public speaking
03:44:31.660 | or any kind of like, you know, being in front of a camera.
03:44:34.340 | If you're gonna face your fear, you have to do it.
03:44:37.140 | And the easiest way to do it is to lower the stakes.
03:44:39.300 | You're not gonna start being Lex Fridman on stage
03:44:41.500 | with a huge band.
03:44:42.340 | You don't wanna be.
03:44:43.160 | Like it's like in that way, it is so impossible.
03:44:46.120 | But the more you lower the stakes
03:44:48.980 | and just like open it up to like two strangers,
03:44:51.140 | five strangers, like the most dive bar open mic
03:44:54.260 | you can go to and like start performing.
03:44:57.300 | That's really what I did is like,
03:44:59.220 | like I love open mics now,
03:45:00.740 | 'cause it's like low stakes on the one hand,
03:45:03.200 | but you really get the feeling of like going for it.
03:45:06.820 | - And you get better and better and better at that.
03:45:08.300 | Yeah, for sure.
03:45:09.140 | And then you'll get the strength to take a bigger
03:45:13.380 | and bigger and bigger risks.
03:45:14.820 | Listen, Coffee, I'm a huge fan of yours,
03:45:17.500 | not just for who you are, but for what you stand for.
03:45:21.540 | People like you are rare and they're a huge inspiration.
03:45:25.900 | I just, I'm inspired by your fearlessness,
03:45:29.140 | that you're taking on some of the most powerful
03:45:32.980 | and richest people in this world and doing so
03:45:35.340 | with respect, I think, with good faith,
03:45:37.780 | but also with the boldness and fearlessness.
03:45:40.460 | Listen, man, I'm a huge fan.
03:45:42.060 | Keep doing what you're doing
03:45:43.500 | as long as you got the strength for it,
03:45:45.140 | 'cause I think you inspire all of us.
03:45:46.540 | You're doing important work, brother.
03:45:48.020 | - Thanks for having me.
03:45:49.740 | - Thanks for listening to this conversation
03:45:52.460 | with Coffee Zilla.
03:45:52.460 | To support this podcast,
03:45:53.580 | please check out our sponsors in the description.
03:45:55.980 | And now let me leave you with some words
03:45:57.980 | from Walter Lippmann.
03:45:59.900 | "There can be no higher law in journalism
03:46:02.620 | than to tell the truth and to shame the devil."
03:46:06.660 | Thank you for listening and hope to see you next time.
03:46:09.620 | (upbeat music)
03:46:12.220 | (upbeat music)