
Unlocking the Power of Razors: The Key to Simplifying Your Life



00:00:00.000 | (upbeat music)
00:00:02.580 | - So I thought back to kind of the first content of yours
00:00:08.200 | that I'd read and it was all about razors,
00:00:09.720 | which at the time I was like,
00:00:10.960 | I don't even know what that is.
00:00:12.760 | And I felt like that would be a great place to start
00:00:15.420 | for this conversation.
00:00:16.760 | But like everyone else listening,
00:00:18.880 | I'm sure people don't know what razors are.
00:00:21.440 | So can you talk a little bit about what they even are
00:00:23.600 | and we can jump into why they're important?
00:00:25.080 | - Yeah, absolutely.
00:00:25.960 | So it's not for shaving,
00:00:28.760 | would be the first thing I would start with, I suppose.
00:00:31.180 | No, I mean, a philosophical razor,
00:00:32.880 | the term is from philosophy.
00:00:35.720 | And basically the idea is that it's a rule of thumb
00:00:37.920 | or some sort of decision-making heuristic
00:00:40.000 | that allows you to cut through available options
00:00:42.600 | and make a decision.
00:00:43.600 | So the whole idea was like in philosophy,
00:00:46.280 | you can take a razor and it allows you to like strip away
00:00:49.240 | any of the unnecessary things
00:00:50.680 | in order to cut through the noise and just make a decision.
00:00:52.880 | So Occam's razor is the one that
00:00:54.880 | if people know a single razor,
00:00:56.520 | Occam's razor is the thing they've heard of.
00:00:58.320 | And that's like the whole idea
00:00:59.440 | that basically simple is beautiful.
00:01:01.040 | It's like, if you have a whole bunch of hypotheses,
00:01:04.040 | the one that requires the fewest assumptions
00:01:06.120 | is generally the one that is right.
00:01:08.840 | If you're kind of like looking to decide which path
00:01:13.680 | you believe is the driver of some situation.
00:01:16.660 | So I've written a bunch about razors
00:01:18.560 | and basically like extending the concept
00:01:20.840 | into a whole bunch of different realms.
00:01:22.280 | I think a lot about ways to simplify decision-making.
00:01:24.940 | It's always been something that I've liked.
00:01:26.400 | You're sort of similar to that,
00:01:27.520 | like life hacks are all sort of razors
00:01:30.000 | in their own way, shape, or form,
00:01:31.120 | like ways to make a decision, cut through things,
00:01:34.920 | you know, that'll just simplify your life hopefully,
00:01:36.560 | like tools in your toolkit.
00:01:37.800 | So that's been the kind of genesis
00:01:40.120 | of me writing about them in the past.
00:01:41.720 | - Yeah, I think when it comes to my whole scope in life,
00:01:44.480 | it's like, how do I optimize everything?
00:01:46.000 | I think one of the challenges you get into
00:01:47.720 | is there's just too much and you get overwhelmed.
00:01:50.560 | So for me, I love this idea of razors, rules of thumb,
00:01:54.140 | to just make some of it a little easier.
00:01:55.800 | - Yeah.
00:01:56.640 | - The risk is that you then have like a million razors,
00:01:58.840 | which would be the pushback against this,
00:02:00.520 | which is like, oh, I have a million razors,
00:02:02.120 | I can't remember a single one of them.
00:02:03.560 | So the way I always think about these things
00:02:05.600 | is like tools in your toolkit.
00:02:07.240 | You don't need to like know that you have
00:02:09.360 | all of those different wrenches
00:02:10.760 | or all those different things at every single point in time.
00:02:12.840 | You just need to like have your toolkit
00:02:15.000 | and be able to open it up and think about,
00:02:17.280 | okay, what applies to a situation at any point in time?
00:02:20.220 | - Yeah, and I don't expect anyone listening to go home
00:02:21.920 | and be like, oh, I got 17 razors to use in my life.
00:02:25.200 | I did try to bucket them into a couple categories
00:02:28.520 | 'cause I thought that might be helpful
00:02:29.600 | for people to think, okay,
00:02:30.560 | I need to make a decision on this thing.
00:02:33.080 | Is there something that might make this easier?
00:02:34.840 | So the first one I thought about was related to people.
00:02:37.880 | There's people in your life,
00:02:39.200 | first one I had was the optimist razor.
00:02:41.280 | I consider myself an optimist, so I like this one.
00:02:43.920 | Let's start there and kind of go through a few of the people ones
00:02:46.200 | and see where we get.
00:02:47.040 | - Yeah, so the optimist razor is basically the idea
00:02:49.680 | that you should always default
00:02:52.360 | to trying to spend time with optimists.
00:02:54.200 | And if you have a choice between spending time
00:02:55.880 | with optimists and pessimists,
00:02:57.040 | you're always better off spending time with optimists.
00:02:59.240 | And the grounding of this,
00:03:00.880 | I mean, I originally came up with this
00:03:02.300 | during like the period of COVID
00:03:04.560 | when there was a whole ton of pessimism out there
00:03:07.080 | around like, oh, the markets are going to hell,
00:03:09.240 | the economy is going to hell.
00:03:10.320 | And then there were like little shades of optimism
00:03:12.880 | in like kind of mid-2020 about,
00:03:15.560 | okay, but a lot of these things are happening X, Y, and Z
00:03:18.780 | that might be positive actually for the future.
00:03:20.740 | Like the, I mean, the Fed was printing
00:03:22.520 | trillions and trillions of dollars.
00:03:24.000 | And what I found was my initial skew was to pessimism.
00:03:27.680 | And I was like, oh, short the market,
00:03:29.120 | everything's going bad, everything's going bad.
00:03:31.080 | And really regretted that after a few months
00:03:33.520 | when I realized that the optimists
00:03:34.960 | were actually the ones that were getting rich
00:03:36.520 | by betting on things going well in that time period.
00:03:39.920 | And so I started just thinking about,
00:03:42.200 | well, there's this little heuristic
00:03:44.600 | that pessimists sound really smart,
00:03:48.280 | but optimists seem to be getting really rich.
00:03:51.360 | And it all of a sudden cemented in my mind,
00:03:53.780 | like, okay, there's something to this.
00:03:55.320 | And really, I want to be spending more time
00:03:57.360 | around optimists in general.
00:03:59.200 | - But it's not just about investing and making money.
00:04:01.720 | It can be just who you want to hang out with on the weekend.
00:04:03.880 | - Totally.
00:04:04.720 | I mean, that's why it applies so broadly to life.
00:04:07.120 | And when I took it beyond that, I just started thinking,
00:04:10.600 | you know, who do you want?
00:04:11.540 | Like, who do you feel good when you're around
00:04:13.400 | just as a general rule of thumb?
00:04:14.840 | You're like, who do I get energy from?
00:04:17.660 | Who makes me happy and feel good to be around?
00:04:20.120 | And that tends to be optimists.
00:04:21.460 | Again, pessimists, they sound smart,
00:04:24.200 | and they have a whole lot of things,
00:04:25.540 | and negativity, et cetera.
00:04:26.960 | But being around people that are positive and optimistic
00:04:29.080 | about the future, about what it looks like in any arena,
00:04:31.880 | just tends to feel better.
00:04:33.260 | And what I find is that good things happen
00:04:35.400 | when you spend time with optimists.
00:04:37.080 | They just have a better outlook on the future.
00:04:38.780 | And if you believe that energy attracts energy,
00:04:42.340 | optimists tend to attract good outcomes.
00:04:44.660 | - What about finding people to work with,
00:04:47.560 | more than just hanging out?
00:04:49.020 | - Yeah, I mean, it applies to another kind of area
00:04:51.720 | that I've thought about,
00:04:52.560 | which is originally from Nassim Taleb,
00:04:55.960 | the author of "The Black Swan,"
00:04:58.760 | who, I don't know, I forget what book it's in.
00:05:01.520 | It's either in, I don't think it's in "The Black Swan."
00:05:03.520 | It might be in "Skin in the Game" or "Antifragile,"
00:05:06.680 | where he talks about this whole thing of the surgeon
00:05:09.420 | and how you pick a surgeon.
00:05:10.600 | And so he tells this story of, like,
00:05:12.400 | you're choosing between two surgeons
00:05:14.320 | who are of equal merit.
00:05:16.540 | Assume they both have the same track record
00:05:18.340 | of successful surgery, say.
00:05:19.880 | And one of them looks like this beautiful, polished,
00:05:22.520 | like, Harvard Medical School credentialed surgeon,
00:05:25.560 | like, perfect, clean cut, everything's good.
00:05:27.520 | And then the other one looks like a butcher.
00:05:29.320 | Like, he's got blood all over him.
00:05:30.620 | Like, he just doesn't look the part.
00:05:31.840 | He's like big, you know, big hands, whatever,
00:05:34.320 | like, big beard, scraggly.
00:05:36.040 | And his whole thing is that most people
00:05:38.480 | pick the, like, nice, clean cut surgeon
00:05:41.120 | when what you should actually do
00:05:42.960 | is pick the one who doesn't look the part.
00:05:44.960 | And his logic is that the one who doesn't look the part
00:05:47.760 | has had to overcome not looking the part
00:05:50.040 | in order to get to where they are.
00:05:51.560 | And so if they have equal merit,
00:05:53.520 | that one is actually the better one
00:05:54.880 | because they've had to overcome not looking like they should
00:05:58.120 | all along the way in order to get to that level.
00:06:00.640 | And I've always thought that that was, like,
00:06:01.760 | another interesting way of thinking about who to work with
00:06:04.000 | because we do have all of these little, like,
00:06:06.280 | biases and prejudices that are in our head.
00:06:08.440 | And people that have managed to overcome all of those
00:06:11.000 | all along the way to get to where they are
00:06:13.040 | tend to be great, great people to work with.
00:06:15.400 | - Yeah, I don't know if you've ever done
00:06:16.440 | unconscious bias training.
00:06:18.960 | But it just kind of, like, opens your eyes up to,
00:06:21.200 | oh, okay, like, even someone who might not think
00:06:23.480 | you have bias, you have bias for sure.
00:06:25.240 | - Yeah, I mean, the statistics on those things are insane.
00:06:27.520 | Like, you can go on Google and just, like,
00:06:29.480 | look one up and do one of the tests quickly
00:06:31.480 | and you can get, like, a score on it.
00:06:32.880 | It's usually pretty shocking.
00:06:34.120 | - Yeah.
00:06:35.040 | What about smart friends?
00:06:37.160 | - Again, like with people, man.
00:06:39.360 | The smart friends one is basically
00:06:42.320 | that if you have enough smart friends
00:06:44.600 | talking about something, you should take it seriously.
00:06:46.960 | And again, like, a lot of these were, for me,
00:06:51.040 | derived from bad experiences.
00:06:53.040 | Like, where I didn't listen to this
00:06:54.560 | and then I came to regret it.
00:06:55.760 | And so, like, smart friends for me,
00:06:57.840 | I mean, again, going back to early COVID,
00:06:59.520 | like, all of these smart friends of mine
00:07:01.120 | were talking about board API club as, like, a silly example.
00:07:04.920 | Which maybe now it's, like, not as much of a thing.
00:07:06.760 | But at the time, it was, like, this, you know,
00:07:09.320 | whatever, you know, NFT project.
00:07:11.040 | I didn't understand what NFTs were,
00:07:12.160 | but I had all these friends that were talking about it.
00:07:14.480 | And I was, like, dude, that sounds stupid.
00:07:15.960 | I'm just gonna ignore this.
00:07:16.800 | Like, that sounds super dumb.
00:07:17.800 | But I had multiple friends saying it over and over again.
00:07:20.080 | And then, like, you know, if I had listened to them
00:07:23.080 | and done something about it,
00:07:24.920 | 12 months after that, I probably would have made,
00:07:26.560 | like, a million dollars off of just
00:07:28.560 | putting in a little bit of money into this thing.
00:07:30.600 | And so, I developed a rule, basically,
00:07:32.160 | that was if three smart friends,
00:07:33.760 | like, three friends that I consider intelligent
00:07:35.480 | tell me about the same thing,
00:07:37.160 | I'm gonna put, like, a little bit of skin
00:07:38.680 | into the game no matter what.
00:07:39.880 | Even if I'm not gonna go deeper on it,
00:07:41.520 | it just, like, gives me kind of a hedge
00:07:43.520 | against looking like an idiot later on,
00:07:45.280 | like I did with that one.
00:07:46.720 | But again, it's just another way of thinking about,
00:07:49.800 | you know, drafting off of the intelligence
00:07:53.000 | of your peer group.
00:07:54.520 | Like, if you have smart people
00:07:55.720 | that are talking about things,
00:07:56.800 | and you're in the circles with intelligent people
00:07:58.520 | that have domain expertise that goes beyond yours,
00:08:01.320 | listening to them and taking it seriously
00:08:03.640 | when there's enough density in a single idea
00:08:05.960 | is usually a good bet.
00:08:07.280 | - Presumably, they have to be excited about it
00:08:09.240 | or just talking about it.
00:08:10.600 | - I think excited about it.
00:08:11.720 | You know, like, and again, domain experts, right?
00:08:14.440 | Like, not just, like, you don't wanna have,
00:08:17.560 | you know, your, like, coder and engineer friends
00:08:20.080 | maybe, like, weighing in on, like, culture and fashion
00:08:22.960 | and being like, oh, I'm gonna listen to them.
00:08:24.560 | Like, you want it to be people,
00:08:26.120 | like, if they're in technology
00:08:27.360 | and they're weighing in on some new technology,
00:08:29.000 | like AI could be an example today.
00:08:31.040 | If you have a certain domain within AI
00:08:33.000 | that a bunch of your smart friends
00:08:34.480 | are really interested and excited about,
00:08:36.200 | it's worth taking seriously.
00:08:37.240 | It doesn't mean you have to invest in it or do something,
00:08:39.440 | but it's worth at least digging into it
00:08:40.880 | trying to understand more because it might be like an asymmetric bet on the future.