
Bryan Johnson: Kernel Brain-Computer Interfaces | Lex Fridman Podcast #186


Chapters

0:00 Introduction
1:13 Kernel Flow demo
10:25 The future of brain-computer interfaces
43:54 Existential risk
49:33 Overcoming depression
64:52 Zeroth principles thinking
73:05 Engineering consciousness
79:19 Privacy
83:48 Neuralink
93:27 Braintree and Venmo
109:10 Eating one meal a day
115:22 Sleep
135:04 Climbing Mount Kilimanjaro
142:02 Advice for young people
146:38 Meaning of life


00:00:00.000 | The following is a conversation with Bryan Johnson,
00:00:02.560 | founder of Kernel, a company that has developed devices
00:00:06.440 | that can monitor and record brain activity.
00:00:09.360 | And previously, he was the founder of Braintree,
00:00:12.080 | a mobile payment company that acquired Venmo
00:00:14.760 | and then was acquired by PayPal and eBay.
00:00:17.840 | Quick mention of our sponsors,
00:00:19.720 | Four Sigmatic, NetSuite, Grammarly, and ExpressVPN.
00:00:24.280 | Check them out in the description to support this podcast.
00:00:27.760 | As a side note, let me say that this was a fun
00:00:30.160 | and memorable experience,
00:00:31.720 | wearing the Kernel Flow brain interface
00:00:34.040 | in the beginning of this conversation,
00:00:35.920 | as you can see if you watch the video version
00:00:38.080 | of this episode.
00:00:39.520 | And there was a Ubuntu Linux machine sitting next to me,
00:00:42.880 | collecting the data from my brain.
00:00:44.840 | The whole thing gave me hope that the mystery
00:00:47.840 | of the human mind will be unlocked in the coming decades,
00:00:51.480 | as we begin to measure signals from the brain
00:00:53.760 | in a high bandwidth way.
00:00:55.400 | To understand the mind, we either have to build it
00:00:58.720 | or to measure it.
00:01:00.160 | Both are worth a try.
00:01:02.200 | Thanks to Bryan and the rest of the Kernel team
00:01:04.200 | for making this little demo happen.
00:01:06.320 | This is the Lex Fridman Podcast,
00:01:08.640 | and here is my conversation with Bryan Johnson.
00:01:12.280 | - You ready, Lex?
00:01:14.080 | - Yes, I'm ready.
00:01:15.680 | - Do you guys wanna come in
00:01:16.560 | and put the interfaces on our heads?
00:01:18.840 | And then I will proceed to tell you a few jokes.
00:01:22.040 | So we have two incredible pieces of technology
00:01:25.760 | and a machine running Ubuntu 20.04 in front of us.
00:01:30.480 | What are we doing?
00:01:31.520 | - All right.
00:01:32.360 | - Are these going on our head?
00:01:33.180 | - They're going on our heads, yeah.
00:01:34.080 | And they will place it on our heads for proper alignment.
00:01:39.080 | - Does this support giant heads?
00:01:43.160 | Because I kind of have a giant head.
00:01:45.120 | Is this just giant head fine?
00:01:46.800 | - Are you saying as like an ego,
00:01:48.040 | or are you saying physically both?
00:01:51.000 | - I'm gonna drop it on you.
00:01:53.240 | - It's a nice massage.
00:01:57.080 | - Yes.
00:01:57.920 | Okay, how does this feel?
00:02:01.240 | - It's okay to move around?
00:02:04.760 | - Yeah.
00:02:05.720 | - It feels, oh yeah.
00:02:07.480 | Hey, hey.
00:02:08.320 | (laughing)
00:02:10.960 | This feels awesome.
00:02:11.800 | - It's a pretty good fit.
00:02:12.880 | - Thank you.
00:02:13.720 | - That feels good.
00:02:14.540 | - So this is big head friendly.
00:02:17.680 | - It suits you well, Lex.
00:02:19.160 | - Thank you.
00:02:20.000 | (laughing)
00:02:22.080 | I feel like I need to,
00:02:23.600 | I feel like when I wear this,
00:02:25.800 | I need to sound like Sam Harris,
00:02:27.080 | calm, collected, eloquent.
00:02:31.800 | I feel smarter, actually.
00:02:34.120 | I don't think I've ever felt quite as much
00:02:36.880 | like I'm part of the future as now.
00:02:38.760 | - Have you ever worn a brain interface,
00:02:40.720 | or had your brain imaged?
00:02:42.560 | - Oh, never had my brain imaged.
00:02:44.720 | The only way I've analyzed my brain
00:02:47.280 | is by talking to myself and thinking.
00:02:52.280 | No direct data.
00:02:53.320 | - Yeah, that is definitely a brain interface.
00:02:56.440 | (laughing)
00:02:57.400 | That has a lot of blind spots.
00:02:59.200 | - It has some blind spots, yeah.
00:03:01.280 | Psychotherapy.
00:03:02.480 | - That's right.
00:03:03.320 | All right, are we recording?
00:03:07.000 | - Yep, we're good.
00:03:08.240 | - All right.
00:03:09.560 | So Lex, the objective of this,
00:03:12.640 | I'm going to tell you some jokes,
00:03:16.940 | and your objective is to not smile,
00:03:18.920 | which as a Russian, you should have an edge.
00:03:22.960 | - Make the motherland proud, I gotcha.
00:03:25.520 | Okay.
00:03:26.340 | Let's hear the jokes.
00:03:29.440 | - Lex, and this is from the Kernel Crew.
00:03:32.300 | We've been working on a device that can read your mind,
00:03:37.080 | and we would love to see your thoughts.
00:03:39.040 | - Is that the joke?
00:03:43.960 | - That's the opening.
00:03:44.840 | - Okay.
00:03:45.680 | - All right.
00:03:46.520 | - If I'm seeing the muscle activation correctly
00:03:52.460 | on your lips, you're not gonna do well on this.
00:03:54.760 | Let's see.
00:03:55.600 | All right, here comes the first.
00:03:56.420 | - I'm screwed.
00:03:57.260 | - Here comes the first one.
00:03:58.100 | - Is this gonna break the device?
00:03:59.800 | Is it resilient to laughter?
00:04:04.240 | - Lex, what goes through a potato's brain?
00:04:09.440 | (laughing)
00:04:14.320 | - I already failed.
00:04:15.800 | That's the hilarious opener.
00:04:17.320 | Okay, what?
00:04:19.320 | - Tater thoughts.
00:04:20.240 | What kind of fish performs brain surgery?
00:04:26.920 | - I don't know.
00:04:29.040 | - A neurosturgeon.
00:04:30.240 | - And so we're getting data of everything
00:04:37.720 | that's happening in my brain right now?
00:04:39.080 | - Lifetime, yeah.
00:04:39.920 | - We're getting activation patterns of your entire cortex.
00:04:44.920 | - I'm gonna try to do better.
00:04:46.280 | I'll edit out all the parts where I laughed.
00:04:48.400 | Photoshop a serious face over me.
00:04:50.600 | - You can recover.
00:04:51.960 | - Yeah, all right.
00:04:52.840 | - Lex, what do scholars eat when they're hungry?
00:04:57.000 | - I don't know, what?
00:04:58.440 | - Academia nuts.
00:04:59.860 | - That was a pretty good one.
00:05:05.880 | - So what we'll do is, so you're wearing
00:05:08.240 | a Kernel Flow, which is an interface built
00:05:13.240 | using technology called spectroscopy.
00:05:17.720 | So it's similar to what we wear wearables on the wrist
00:05:20.680 | using light, so using LIDAR as you know.
00:05:24.720 | And we're using that to image,
00:05:27.880 | it's a functional imaging of brain activity.
00:05:30.360 | And so as your neurons fire,
00:05:32.640 | electrically and chemically,
00:05:35.160 | it creates blood oxygenation levels.
00:05:37.440 | We're measuring that.
00:05:38.280 | And so you'll see in the reconstructions we do for you,
00:05:41.160 | you'll see your activation patterns in your brain
00:05:43.080 | as throughout this entire time we were wearing it.
00:05:45.840 | So in the reaction to the jokes
00:05:47.760 | and as we were sitting here talking.
00:05:50.320 | And so it's a, we're moving towards a real time feed
00:05:54.620 | of your cortical brain activity.
00:05:56.920 | - So there's a bunch of things that are in contact
00:05:59.480 | with my skull right now.
00:06:02.280 | How many of them are there?
00:06:03.680 | And so how many of them are, what are they?
00:06:05.880 | What are the actual sensors?
00:06:07.120 | - There's 52 modules and each module has one laser
00:06:11.440 | and six sensors.
00:06:13.700 | And the sensors fire in about 100 picoseconds.
00:06:18.480 | And then the photons scatter and absorb in your brain.
00:06:21.480 | And then a few go in, a bunch go in,
00:06:25.080 | then a few come back out and we sense those photons
00:06:27.920 | and then we do the reconstruction for the activity.
00:06:30.360 | Overall, there's about a thousand plus channels
00:06:33.520 | that are sampling your activity.
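
A rough sketch of that channel arithmetic, in Python. The within-module and neighbor-pairing rules below are illustrative assumptions rather than Kernel's actual optode layout or reconstruction pipeline; each channel here stands for one laser-to-detector light path whose detected photons track changes in blood oxygenation.

    # Toy estimate of source-detector channels for a Flow-like headset.
    # The pairing rule is a guess for illustration only.
    N_MODULES = 52            # modules on the head
    LASERS_PER_MODULE = 1     # one laser (source) per module
    DETECTORS_PER_MODULE = 6  # six sensors (detectors) per module
    ASSUMED_NEIGHBORS = 2     # hypothetical: each laser also reaches 2 nearby modules

    within_module = N_MODULES * LASERS_PER_MODULE * DETECTORS_PER_MODULE
    cross_module = N_MODULES * LASERS_PER_MODULE * ASSUMED_NEIGHBORS * DETECTORS_PER_MODULE

    print("within-module channels:", within_module)                      # 312
    print("approximate total channels:", within_module + cross_module)   # 936, on the order of 1000+
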
00:06:35.640 | - How difficult is it to make it as comfortable as it is?
00:06:38.760 | 'Cause it's surprisingly comfortable.
00:06:40.720 | I would not think it would be comfortable.
00:06:42.820 | Something that's measuring brain activity,
00:06:48.320 | I would not think it would be comfortable, but it is.
00:06:51.160 | - I agree.
00:06:52.000 | - In fact, I wanna take this home.
00:06:52.840 | - Yeah, yeah, that's right.
00:06:54.200 | So people are accustomed to being in big systems like fMRI
00:06:59.000 | where there's 120 decibel sounds
00:07:02.200 | and you're in a claustrophobic encasement
00:07:05.600 | or EEG, which is just painful or surgery.
00:07:09.240 | And so, yes, I agree that this is a convenient option
00:07:12.280 | to be able to just put on your head.
00:07:14.400 | It measures your brain activity
00:07:15.960 | in the contextual environment you choose.
00:07:18.000 | So if we want to have it during a podcast
00:07:20.200 | or if we wanna be at home in a business setting,
00:07:22.660 | it's the freedom to record your brain activity
00:07:27.480 | in the setting that you choose.
00:07:29.000 | - Yeah, but sort of from an engineering perspective,
00:07:31.000 | are these, what is it?
00:07:32.920 | There's a bunch of different modular parts
00:07:34.960 | and they're kinda, there's like a rubber band thing
00:07:38.080 | where they mold to the shape of your head.
00:07:40.880 | - That's right.
00:07:41.700 | So we built this version of the mechanical design
00:07:45.480 | to accommodate most adult heads.
00:07:48.760 | - But I have a giant head and it fits fine.
00:07:51.760 | It fits well, actually.
00:07:53.800 | So I don't think I have an average head.
00:07:55.840 | Okay, maybe I feel much better about my head now.
00:08:01.640 | Maybe I'm more average than I thought.
00:08:05.260 | Okay, so what else is there,
00:08:08.120 | interesting that you could say while it's on our heads.
00:08:10.820 | I can keep this on the whole time.
00:08:12.120 | This is kind of awesome.
00:08:13.480 | And it's amazing for me as a fan of Ubuntu.
00:08:16.080 | I use Ubuntu Mate.
00:08:17.160 | You guys should use that too.
00:08:18.880 | But it's amazing to have code running
00:08:22.520 | to the side measuring stuff and collecting data.
00:08:26.200 | I mean, I just, I feel like much more important now
00:08:30.360 | that my data is being recorded.
00:08:32.560 | Like somebody cares,
00:08:33.400 | like you know when you have a good friend
00:08:35.280 | that listens to you,
00:08:36.920 | that actually is listening to you?
00:08:40.240 | This is what I feel like.
00:08:41.880 | I'm like a much better friend
00:08:43.080 | because it's like accurately listening to me, Ubuntu.
00:08:47.120 | - What a cool perspective.
00:08:49.120 | I hadn't thought about that, of feeling understood.
00:08:53.400 | - Yeah.
00:08:54.240 | - Heard, yeah, heard deeply by the mechanical system
00:08:59.240 | that is recording your brain activity
00:09:01.960 | versus the human that you're engaging with,
00:09:04.480 | that your mind immediately goes to
00:09:06.280 | that there's this dimensionality
00:09:09.520 | and depth of understanding of this software system,
00:09:12.600 | which you're intimately familiar with.
00:09:14.600 | And now you're able to communicate with this system
00:09:17.360 | in ways that you couldn't before.
00:09:19.280 | - Yeah, I feel heard.
00:09:20.760 | - Yeah, I mean, I guess what's interesting about this is
00:09:26.880 | your intuitions are spot on.
00:09:28.760 | Most people have intuitions about brain interfaces
00:09:30.920 | that they've grown up with this idea
00:09:32.960 | of people moving cursors on the screen
00:09:34.800 | or typing or changing the channel or skipping a song.
00:09:38.600 | It's primarily been anchored on control.
00:09:41.840 | And I think the more relevant understanding
00:09:44.880 | of brain interfaces or neuroimaging
00:09:47.280 | is that it's a measurement system.
00:09:49.800 | And once you have numbers for a given thing,
00:09:53.320 | a seemingly endless number of possibilities
00:09:54.920 | emerge around that of what to do with those numbers.
00:09:57.760 | - So before you tell me about the possibilities,
00:09:59.880 | this was an incredible experience.
00:10:01.520 | I can keep this on for another two hours,
00:10:03.880 | but I'm being told that for a bunch of reasons,
00:10:08.880 | just because we probably wanna keep the data small
00:10:11.240 | and visualize it nicely for the final product,
00:10:13.840 | we wanna cut this off
00:10:15.000 | and take this amazing helmet away from me.
00:10:17.960 | So Bryan, thank you so much for this experience
00:10:21.240 | and let's continue, helmet-less.
00:10:25.200 | - All right.
00:10:26.200 | - So that was an incredible experience.
00:10:28.920 | Can you maybe speak to what kind of opportunities
00:10:31.240 | that opens up that stream of data,
00:10:32.960 | that rich stream of data from the brain?
00:10:35.520 | - First, I'm curious, what is your reaction?
00:10:38.360 | What comes to mind when you put that on your head?
00:10:41.560 | What does it mean to you and what possibilities emerge
00:10:44.120 | and what significance might it have?
00:10:46.760 | I'm curious where your orientation is at.
00:10:49.480 | - Well, for me, I'm really excited by the possibility
00:10:54.840 | of various information about my body,
00:10:59.440 | about my mind being converted into data
00:11:03.320 | such that data can be used to create products
00:11:06.320 | that make my life better.
00:11:07.360 | So that to me is a really exciting possibility.
00:11:09.760 | Even just like a Fitbit that measures, I don't know,
00:11:13.080 | some very basic measurements about your body
00:11:15.680 | is really cool.
00:11:17.120 | But the bandwidth of information,
00:11:20.560 | the resolution of that information is very crude,
00:11:22.720 | so it's not very interesting.
00:11:23.960 | The possibility of recording,
00:11:25.520 | of just building a dataset coming in a clean way
00:11:31.320 | and a high bandwidth way from my brain
00:11:33.960 | opens up all kinds of, at the very,
00:11:39.120 | I was kind of joking when we were talking,
00:11:40.840 | but it's not really, is like I feel heard
00:11:44.680 | in the sense that it feels like the full richness
00:11:49.280 | of the information coming from my mind
00:11:51.680 | is actually being recorded by the machine.
00:11:56.080 | I mean, there's a, I can't quite put it into words,
00:12:00.160 | but there is a genuinely for me,
00:12:02.880 | this is not some kind of joke about me being a robot,
00:12:05.160 | it just genuinely feels like I'm being heard
00:12:09.080 | in a way that's going to improve my life.
00:12:13.880 | As long as the thing that's on the other end
00:12:16.320 | can do something useful with that data.
00:12:17.740 | But even the recording itself is like,
00:12:20.400 | once you record, it's like taking a picture.
00:12:24.180 | That moment is forever saved in time.
00:12:28.480 | Now, a picture cannot allow you to step back
00:12:32.720 | into that world, but perhaps recording your brain
00:12:37.720 | is a much higher resolution thing,
00:12:41.160 | much more personal recording of that information
00:12:44.720 | than a picture that would allow you to step back
00:12:48.040 | into that, where you were in that particular moment
00:12:51.800 | in history and then map out a certain trajectory
00:12:54.160 | to tell you certain things about yourself.
00:12:58.520 | That could open up all kinds of applications.
00:13:00.360 | Of course, there's health that I consider,
00:13:02.640 | but honestly, to me, the exciting thing is just being heard.
00:13:06.600 | My state of mind, the level of focus,
00:13:08.960 | all those kinds of things, being heard.
00:13:10.920 | - What I heard you say is you have an entirety
00:13:13.820 | of lived experience, some of which you can communicate
00:13:17.080 | in words and in body language,
00:13:20.020 | some of which you feel internally,
00:13:21.440 | which cannot be captured in those communication modalities.
00:13:24.440 | And that this measurement system captures
00:13:27.640 | both the things you can try to articulate in words,
00:13:29.920 | maybe in a lower dimensional space,
00:13:31.520 | using one word, for example, to communicate focus,
00:13:34.560 | when it really may be represented in a 20 dimensional space
00:13:37.160 | of this particular kind of focus,
00:13:39.740 | and that this information is being captured.
00:13:42.560 | So it's a closer representation to the entirety
00:13:46.200 | of your experience captured in a dynamic fashion
00:13:48.600 | that is not just a static image
00:13:50.960 | of your conscious experience.
00:13:53.080 | - Yeah, that's the promise.
00:13:56.440 | That was the feeling, and it felt like the future.
00:13:59.380 | So it was a pretty cool experience.
00:14:00.840 | And from the sort of mechanical perspective,
00:14:04.000 | it was cool to have an actual device
00:14:06.680 | that feels pretty good,
00:14:08.380 | that doesn't require me to go into the lab.
00:14:11.600 | And also the other thing I was feeling,
00:14:14.480 | there's a guy named Andrew Huberman,
00:14:16.160 | he's a friend of mine, amazing podcast,
00:14:17.940 | people should listen to it, Huberman Lab Podcast.
00:14:21.600 | We're working on a paper together
00:14:24.040 | about eye movement and so on.
00:14:25.500 | And we're kind of, he's a neuroscientist,
00:14:28.440 | and I'm a data person, machine learning person.
00:14:30.960 | We're both excited by how much the data,
00:14:35.960 | measurements of the human mind, the brain,
00:14:43.520 | and all the different metrics that come from that
00:14:45.960 | can be used to understand human beings
00:14:48.840 | and in a rigorous scientific way.
00:14:50.520 | So the other thing I was thinking about
00:14:52.000 | is how this could be turned into a tool for science.
00:14:55.960 | Sort of not just personal science,
00:14:57.700 | not just like Fitbit style,
00:14:59.760 | how am I doing my personal metrics of health,
00:15:04.040 | but doing larger scale studies of human behavior and so on.
00:15:07.760 | So like data not at the scale of an individual,
00:15:10.480 | but data at the scale of many individuals
00:15:12.440 | or a large number of individuals.
00:15:14.720 | So personal being heard was exciting
00:15:17.000 | and also just for science is exciting.
00:15:19.920 | 'Cause it's very easy.
00:15:21.160 | Like there's a very powerful thing to it
00:15:23.420 | being so easy to just put on
00:15:25.600 | that you could scale much easier.
00:15:28.320 | - If you think about that second thing you said
00:15:30.120 | about the science of the brain,
00:15:35.120 | most, we've done a pretty good job,
00:15:40.600 | like we, the human race has done a pretty good job
00:15:43.240 | figuring out how to quantify the things around us
00:15:46.120 | from distant stars to calories and steps and our genome.
00:15:51.120 | So we can measure and quantify pretty much everything
00:15:55.360 | in the known universe except for our minds.
00:15:59.200 | And we can do these one-offs
00:16:02.160 | if we're going to get an fMRI scan
00:16:03.640 | or do something with a low res EEG system,
00:16:08.880 | but we haven't done this at population scale.
00:16:11.760 | And so if you think about human thought
00:16:14.840 | or human cognition is probably the single
00:16:19.320 | largest raw input material into society
00:16:23.340 | at any given moment.
00:16:24.760 | It's our conversations with ourselves
00:16:26.680 | and with other people.
00:16:27.920 | And we have this raw input that we can't,
00:16:32.920 | haven't been able to measure yet.
00:16:35.180 | And if you, when I think about it through that frame,
00:16:38.740 | it's remarkable.
00:16:42.260 | It's almost like we live in this wild, wild west
00:16:46.140 | of unquantified communications within ourselves
00:16:50.140 | and between each other
00:16:51.980 | when everything else has been grounded.
00:16:53.380 | I mean, for example, I know if I buy an appliance
00:16:55.460 | at the store or on a website,
00:16:58.420 | I don't need to look at the measurements on the appliance
00:17:01.260 | and make sure it can fit through my door.
00:17:03.040 | That's an engineered system of appliance manufacturing
00:17:06.160 | and construction.
00:17:07.120 | Everyone's agreed upon engineering standards.
00:17:10.640 | And we don't have engineering standards around cognition.
00:17:15.400 | It's not a, it has not entered
00:17:16.900 | as a formal engineering discipline
00:17:19.280 | that enables us to scaffold in society
00:17:21.780 | with everything else we're doing,
00:17:23.040 | including consuming news, our relationships,
00:17:26.560 | politics, economics, education, all the above.
00:17:29.600 | And so to me, the most significant contribution
00:17:33.720 | that kernel technology has to offer
00:17:36.520 | would be the formal,
00:17:37.900 | the introduction of the formal engineering of cognition
00:17:42.640 | as it relates to everything else in society.
00:17:45.300 | - I love that idea that you kind of think
00:17:49.120 | that there is just this ocean of data
00:17:51.640 | that's coming from people's brains
00:17:53.120 | as being in a crude way, reduced down to like tweets
00:17:57.360 | and texts and so on.
00:17:58.920 | So it's a very hardcore, many scale compression
00:18:03.640 | of actual, the raw data.
00:18:06.040 | But maybe you can comment,
00:18:08.280 | 'cause you're using the word cognition.
00:18:10.240 | I think the first step is to get the brain data.
00:18:15.200 | But is there a leap to be taken
00:18:19.160 | to sort of interpreting that data in terms of cognition?
00:18:22.120 | So is your idea is basically you need
00:18:24.440 | to start collecting data at scale from the brain,
00:18:27.760 | and then we start to really be able to take little steps
00:18:32.640 | along the path to actually measuring
00:18:35.280 | some deep sense of cognition.
00:18:37.720 | Because as I'm sure you know,
00:18:40.960 | we understand a few things,
00:18:43.080 | but we don't understand most of what makes up cognition.
00:18:47.280 | - This has been one of the most significant challenges
00:18:50.360 | of building kernel.
00:18:51.240 | And kernel wouldn't exist if I wasn't able to fund it
00:18:53.720 | initially by myself.
00:18:55.720 | Because when I engage in conversations with investors,
00:18:59.240 | the immediate thought is what is the killer app?
00:19:02.760 | And of course, I understand that heuristic.
00:19:04.960 | That's what they're looking at is they're looking to de-risk.
00:19:07.880 | Is the product solved?
00:19:09.520 | Is there a customer base?
00:19:10.680 | Are people willing to pay for it?
00:19:11.880 | How does it compare to competing options?
00:19:14.280 | And in the case with Brain Interfaces,
00:19:15.560 | when I started the company,
00:19:17.000 | there was no known path to even build a technology
00:19:21.480 | that could potentially become mainstream.
00:19:24.360 | And then once we figured out the technology,
00:19:26.920 | we could commence having conversations with investors
00:19:29.400 | and it became what is the killer app?
00:19:31.440 | And so what has been...
00:19:33.240 | So I funded the first $53 million for the company
00:19:36.360 | and to raise the round of funding,
00:19:38.960 | the first one we did, I spoke to 228 investors.
00:19:42.520 | One said, yes.
00:19:43.920 | It was remarkable.
00:19:45.720 | And it was mostly around this concept
00:19:48.280 | around what is a killer app.
00:19:49.480 | And so internally, the way we think about it
00:19:51.600 | is we think of the go-to-market strategy
00:19:55.320 | much more like the Drake equation,
00:19:57.840 | where if we can build technology
00:20:00.680 | that has the characteristics of,
00:20:02.480 | it has the data quality is high enough,
00:20:05.120 | it meets some certain threshold,
00:20:07.480 | cost, accessibility, comfort,
00:20:11.200 | it can be worn in contextual environments.
00:20:12.680 | If it meets the criteria of being a mass market device,
00:20:17.320 | then the responsibility that we have
00:20:21.560 | is to figure out how to create the algorithm
00:20:24.240 | that enables the human...
00:20:27.840 | To enable humans to then find value with it.
00:20:32.400 | So the analogy is like brain interfaces
00:20:34.920 | are like early nineties of the internet.
00:20:37.080 | Is you wanna populate an ecosystem
00:20:39.480 | with a certain number of devices,
00:20:40.560 | you want a certain number of people
00:20:41.800 | who play around with them,
00:20:42.720 | who do experiments of certain data collection parameters,
00:20:44.480 | you want to encourage certain mistakes
00:20:46.440 | from experts and non-experts.
00:20:48.000 | These are all critical elements that ignite discovery.
00:20:51.920 | And so we believe we've accomplished
00:20:56.720 | the first objective of building technology
00:20:59.160 | that reaches those thresholds.
00:21:01.800 | And now it's the Drake equation component
00:21:03.480 | of how do we try to generate 20 years of value discovery
00:21:08.480 | in a two or three year time period?
00:21:12.840 | How do we compress that?
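
The Drake-equation framing can be made concrete with a toy product of factors; every number below is invented purely to show the shape of the reasoning, not an actual Kernel estimate.

    # Drake-equation-style sketch of expected discoveries (all factors made up).
    devices_in_the_wild = 10_000            # headsets deployed
    experiments_per_device_per_year = 20    # people playing around with them
    p_experiment_yields_insight = 0.01      # experiments that surface something real
    p_insight_becomes_application = 0.1     # insights that turn into lasting value

    expected_applications_per_year = (devices_in_the_wild
                                      * experiments_per_device_per_year
                                      * p_experiment_yields_insight
                                      * p_insight_becomes_application)
    print(expected_applications_per_year)   # 200.0 under these hypothetical factors
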
00:21:14.280 | - So just to clarify,
00:21:15.360 | so when you mean the Drake equation,
00:21:17.520 | which for people who don't know,
00:21:19.080 | I don't know why you...
00:21:19.920 | If you listen to this,
00:21:20.760 | I bring up aliens every single conversation.
00:21:22.600 | So I don't know how you wouldn't know
00:21:23.960 | what the Drake equation is,
00:21:25.000 | but you mean like the killer app,
00:21:28.600 | it would be one alien civilization in that equation.
00:21:31.240 | So meaning like this is in search of an application
00:21:34.800 | that's impactful and transformative.
00:21:36.520 | By the way, it should be...
00:21:37.360 | We need to come up with a better term than killer app.
00:21:40.400 | - It's also violent, right?
00:21:43.160 | - Very violent.
00:21:44.000 | You can go like viral app.
00:21:45.200 | That's horrible too, right?
00:21:46.680 | It's some very inspiringly impactful application.
00:21:51.520 | How about that?
00:21:52.360 | No. - Yeah.
00:21:53.200 | - Okay, so let's stick with killer app.
00:21:54.640 | That's fine.
00:21:55.480 | Nobody's--
00:21:56.320 | - But I concur with you.
00:21:57.140 | I dislike the chosen words in capturing the concept.
00:22:02.140 | - You know, it's one of those sticky things
00:22:03.840 | that is effective to use in the tech world,
00:22:08.440 | but when you now become a communicator
00:22:11.520 | outside of the tech world,
00:22:12.720 | especially when you're talking about software
00:22:14.600 | and hardware and artificial intelligence applications,
00:22:17.360 | it sounds horrible.
00:22:18.200 | - Yeah, no, it's interesting.
00:22:19.020 | I actually regret now having called attention
00:22:21.360 | to this, I regret having used that word
00:22:23.100 | in this conversation
00:22:24.200 | because it's something I would not normally do.
00:22:26.120 | I used it in order to create a bridge
00:22:29.080 | of shared understanding of how others would,
00:22:31.100 | what terminology others would use.
00:22:32.800 | - Yeah.
00:22:33.640 | - But yeah, I concur.
00:22:34.840 | - Let's go with impactful application.
00:22:38.200 | - Or just value creation.
00:22:40.040 | - Value creation.
00:22:41.360 | Something people love using.
00:22:43.600 | - There we go.
00:22:44.440 | - That's it.
00:22:45.280 | - Love app.
00:22:46.760 | Okay, so what, do you have any ideas?
00:22:49.400 | So you're basically creating a framework
00:22:51.040 | where there's the possibility of a discovery
00:22:54.200 | of an application that people love using.
00:22:57.380 | Is, do you have ideas?
00:22:59.640 | - We've begun to play a fun game internally
00:23:01.560 | where when we have these discussions,
00:23:03.400 | we begin circling around this concept of,
00:23:06.820 | does anybody have an idea?
00:23:10.160 | Does anyone have intuitions?
00:23:11.400 | And if we see the conversation starting
00:23:13.840 | to veer in that direction,
00:23:15.840 | we flag it and say, human intuition alert, stop it.
00:23:20.480 | And so we really want to focus on the algorithm
00:23:23.120 | of there's a natural process of human discovery
00:23:26.280 | that when you populate a system with devices
00:23:30.480 | and you give people the opportunity to play around with it
00:23:33.080 | in expected and unexpected ways,
00:23:35.440 | we are thinking that is a much better system of discovery
00:23:40.200 | than us exercising intuitions.
00:23:41.480 | And it's interesting,
00:23:42.320 | we're also seeing a few neuroscientists
00:23:43.640 | who have been talking to us
00:23:45.920 | where I was speaking to this one young associate professor
00:23:49.440 | and I approached a conversation and said,
00:23:50.960 | hey, we have these five data streams that we're pulling off.
00:23:54.820 | When you hear that,
00:23:57.120 | what weighted value do you add to each data source?
00:23:59.360 | Like which one do you think is gonna be valuable
00:24:00.900 | for your objectives and which one's not?
00:24:03.160 | And he said, I don't care, just give me the data.
00:24:05.760 | All I care about is my machine learning model.
00:24:08.080 | But importantly, he did not have a theory of mind.
00:24:10.800 | He did not come to the table and say,
00:24:12.380 | I think the brain operates in this way
00:24:15.100 | and these reasons or have these functions.
00:24:16.880 | He didn't care.
00:24:17.880 | He just wanted the data.
00:24:19.040 | And we're seeing that more and more
00:24:20.800 | that certain people are devaluing human intuitions
00:24:25.800 | for good reasons,
00:24:27.420 | as we've seen in machine learning
00:24:28.360 | over the past couple of years.
00:24:30.580 | And we're doing the same
00:24:31.480 | in our value creation market strategy.
00:24:35.100 | - So collect more data, clean data,
00:24:40.380 | make the product such that the collection of data
00:24:42.960 | is easy and fun,
00:24:46.320 | and then the rest will just spring to life
00:24:50.400 | through humans playing around with it.
00:24:52.880 | - Our objective is to create the most valuable
00:24:56.400 | data collection system of the brain ever.
00:25:01.360 | And with that, then applying all the best tools
00:25:07.980 | of machine learning and other techniques
00:25:11.520 | to extract out, to try to find insight.
00:25:15.880 | But yes, our objective is really to systematize
00:25:18.400 | the discovery process,
00:25:19.240 | because we can't put definite timeframes on discovery.
00:25:24.040 | The brain is complicated
00:25:26.300 | and science is not a business strategy.
00:25:30.120 | And so we really need to figure out how to...
00:25:32.640 | This is the difficulty of bringing
00:25:35.800 | technology like this to market.
00:25:36.880 | That's why most of the time it just languishes
00:25:40.420 | in academia for quite some time.
00:25:42.980 | But we hope that we will cross over
00:25:46.060 | and make this mainstream in the coming years.
00:25:48.980 | - The thing was cool to wear,
00:25:50.740 | but are you chasing a good reason
00:25:54.020 | for millions of people to put this on their head
00:25:58.260 | and keep on their head regularly?
00:26:00.860 | Is there...
00:26:01.700 | Like who's going to discover that reason?
00:26:04.740 | Is it going to be people just kind of organically,
00:26:08.080 | or is there going to be a Angry Birds style application
00:26:12.520 | that's just too exciting to not use?
00:26:17.120 | - If I think through the things
00:26:19.320 | that have changed my life most significantly
00:26:22.480 | over the past few years,
00:26:23.480 | when I started wearing a wearable on my wrist
00:26:27.160 | that would give me data about my heart rate,
00:26:29.120 | heart rate variability, respiration rate,
00:26:33.680 | metabolic approximations, et cetera.
00:26:36.140 | For the first time in my life,
00:26:38.660 | I had access to information, sleep patterns,
00:26:41.780 | that were highly impactful.
00:26:43.180 | They told me, for example, if I eat close to bedtime,
00:26:48.180 | I'm not going to get deep sleep.
00:26:50.860 | And not getting deep sleep means
00:26:52.540 | you have all these follow-on consequences in life.
00:26:54.420 | And so it opened up this window of understanding of myself
00:26:59.420 | that I cannot self-introspect and deduce these things.
00:27:03.180 | This is information that was available to be acquired,
00:27:06.220 | but it just wasn't.
00:27:07.100 | I would have to get an expensive sleep study,
00:27:08.880 | then it's an end, like one night,
00:27:10.480 | and that's not good enough to run all my trials.
00:27:12.540 | And so if you look just at the information
00:27:15.380 | that one can acquire on their wrist,
00:27:17.300 | and now you're applying it to the entire cortex on the brain,
00:27:21.660 | and you say, what kind of information could we acquire?
00:27:25.060 | It opens up a whole new universe of possibilities.
00:27:28.400 | For example, we did this internal study at Kernel
00:27:30.180 | where I wore a prototype device,
00:27:32.580 | and we were measuring the cognitive effects of sleep.
00:27:36.220 | So I had a device measuring my sleep.
00:27:38.440 | I performed with 13 of my coworkers,
00:27:41.780 | we performed four cognitive tasks over 13 sessions.
00:27:45.780 | And we focused on reaction time, impulse control,
00:27:48.640 | short-term memory, and then a resting state task.
00:27:52.800 | And with mine, we found, for example,
00:27:55.180 | that my impulse control
00:27:57.540 | was independently correlated with my sleep
00:28:01.260 | outside of behavioral measures
00:28:02.340 | of my ability to play the game.
00:28:04.140 | The point of the study was I had,
00:28:07.580 | the brain study I did at Kernel
00:28:09.580 | confirmed my life experience.
00:28:12.100 | That if I, my deep sleep determined
00:28:17.100 | whether or not I would be able to resist temptation
00:28:20.020 | the following day.
00:28:21.660 | And my brain did show that as one example.
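
A minimal sketch of the kind of analysis being described: checking whether a brain-derived measure still tracks deep sleep after the purely behavioral score is regressed out (a partial correlation). The data below are synthetic; this is not the Kernel study's actual pipeline or its numbers.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 13  # sessions, as in the study described
    deep_sleep = rng.normal(90, 20, n)                       # minutes of deep sleep (made up)
    task_score = 0.5 * deep_sleep + rng.normal(0, 10, n)     # behavioral performance
    brain_signal = 0.3 * deep_sleep + 0.2 * task_score + rng.normal(0, 5, n)

    def residuals(y, x):
        # Residuals of y after least-squares regression on x (with intercept).
        X = np.column_stack([np.ones_like(x), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return y - X @ beta

    # Correlation between brain signal and deep sleep once behavior is regressed out.
    r = np.corrcoef(residuals(brain_signal, task_score),
                    residuals(deep_sleep, task_score))[0, 1]
    print("partial correlation:", round(float(r), 2))
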
00:28:24.460 | And so if you start thinking,
00:28:25.740 | if you actually have data on yourself,
00:28:27.940 | on your entire cortex, you can control the settings,
00:28:30.880 | I think there's probably a large number of things
00:28:36.860 | that we could discover about ourselves.
00:28:37.980 | Very, very small and very, very big.
00:28:39.860 | I just, for example, like when you read news,
00:28:42.660 | what's going on?
00:28:44.820 | - Like when you use social media, when you use news,
00:28:47.620 | like all the ways we allocate attention.
00:28:51.340 | - That's right. - With a computer.
00:28:52.780 | I mean, that seems like a compelling place
00:28:54.520 | to where you would want to put on a Kernel.
00:28:58.260 | By the way, what is it called?
00:28:59.180 | Kernel flux, Kernel, like what?
00:29:00.860 | - Flow. - Flow.
00:29:02.260 | - Two technologies, you wore flow.
00:29:04.260 | - Flow, okay.
00:29:05.540 | So when you put on the Kernel Flow,
00:29:08.280 | it is, seems like to be a compelling time
00:29:15.540 | and place to do it is when you're behind a desk,
00:29:17.260 | behind a computer.
00:29:18.660 | 'Cause you could probably wear it
00:29:19.660 | for prolonged periods of time
00:29:21.660 | as you're taking in content.
00:29:23.700 | And there could, a lot of,
00:29:25.300 | because some of our, so much of our lives
00:29:27.500 | happens in the digital world now.
00:29:29.940 | That kind of coupling the information
00:29:32.860 | about the human mind with the consumption
00:29:36.580 | and the behaviors in the digital world
00:29:39.500 | might give us a lot of information
00:29:40.980 | about the effects of the way we behave
00:29:43.300 | and navigate the digital world
00:29:45.120 | to the actual physical meat, space, effects on our body.
00:29:50.120 | It's interesting to think,
00:29:51.420 | so in terms of both like for work,
00:29:53.620 | I'm a big fan of, so Cal Newport,
00:29:57.580 | his ideas of deep work that I spend,
00:30:01.720 | with few exceptions, I try to spend
00:30:05.700 | the first two hours of every day,
00:30:07.860 | usually if I'm like at home
00:30:09.940 | and have nothing on my schedule,
00:30:11.340 | is going to be up to eight hours of deep work,
00:30:15.020 | of focus, zero distraction.
00:30:16.900 | And for me to analyze the,
00:30:18.540 | I mean I'm very aware of the waning of that,
00:30:22.980 | the ups and downs of that.
00:30:24.780 | And it's almost like you're surfing
00:30:27.240 | the ups and downs of that as you're doing programming,
00:30:29.440 | as you're doing thinking about particular problems,
00:30:32.580 | you're trying to visualize things in your mind,
00:30:34.260 | you start trying to stitch them together.
00:30:37.300 | You're trying to, when there's a dead end
00:30:40.060 | about an idea, you have to kind of calmly
00:30:43.580 | like walk back and start again,
00:30:45.520 | all those kinds of processes.
00:30:47.260 | It'd be interesting to get data
00:30:49.020 | on what my mind is actually doing.
00:30:51.100 | And also recently started doing,
00:30:53.380 | I just talked to Sam Harris a few days ago
00:30:55.960 | and been building up to that.
00:30:58.100 | I started using, started meditating using his app,
00:31:01.900 | waking up, very much recommend it.
00:31:05.340 | And be interested to get data on that.
00:31:07.500 | 'Cause it's, you're very, it's like,
00:31:09.880 | you're removing all the noise from your head,
00:31:12.620 | and you're very much, it's an active process of,
00:31:15.940 | active noise removal, active noise canceling,
00:31:19.900 | like the headphones.
00:31:21.220 | And it'd be interesting to see what is going on in the mind
00:31:24.900 | before the meditation, during it, and after,
00:31:28.380 | all those kinds of things.
00:31:29.220 | - And in all of your examples, it's interesting that
00:31:32.300 | everyone who's designed an experience for you,
00:31:35.380 | so whether it be the meditation app or the deep work,
00:31:38.540 | or all the things you mentioned,
00:31:40.420 | they constructed this product
00:31:43.280 | with a certain number of knowns.
00:31:45.900 | - Yeah.
00:31:47.020 | - Now, what if we expanded the number of knowns by 10X,
00:31:50.540 | or 20X, or 30X?
00:31:51.960 | They would reconstruct their product,
00:31:54.220 | to incorporate those knowns.
00:31:55.500 | So it'd be, and so this is the dimensionality
00:31:58.540 | that I think is the promising aspect,
00:32:00.300 | is that people will be able to use this quantification,
00:32:04.620 | use this information to build more effective products.
00:32:09.260 | And this is, I'm not talking about better products
00:32:11.340 | to advertise to you or manipulate you.
00:32:13.980 | I'm talking about our focus is helping people,
00:32:17.820 | individuals, have this contextual awareness
00:32:20.340 | and this quantification, and then to engage with others
00:32:23.020 | who are seeking to improve people's lives.
00:32:26.060 | That the objective is betterment across ourselves,
00:32:31.060 | individually and also with each other.
00:32:33.480 | - Yeah, so it's a nice data stream to have
00:32:35.020 | if you're building an app.
00:32:36.100 | Like if you're building a podcast listening app,
00:32:38.020 | it would be nice to know data about the listener
00:32:40.300 | so that if you're bored or you fell asleep,
00:32:42.820 | maybe pause the podcast.
00:32:44.540 | It's like really dumb, just very simple applications
00:32:48.380 | that could just improve the quality of the experience
00:32:50.620 | of using the app.
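
A toy sketch of that "pause it when the listener drifts off" idea. The attention score, threshold, and smoothing window are all hypothetical placeholders for whatever a real brain-derived metric would provide.

    from collections import deque

    class DrowsinessPauser:
        """Decide a player action from a stream of attention scores in [0, 1]."""
        def __init__(self, threshold=0.3, window=5):
            self.threshold = threshold
            self.recent = deque(maxlen=window)  # smooth over the last few samples

        def update(self, attention_score):
            self.recent.append(attention_score)
            avg = sum(self.recent) / len(self.recent)
            return "pause" if avg < self.threshold else "keep_playing"

    player = DrowsinessPauser()
    for score in (0.9, 0.8, 0.4, 0.2, 0.1, 0.1, 0.1):
        print(score, "->", player.update(score))
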
00:32:51.980 | - I'm imagining if you have your neuron, this is Lex,
00:32:56.660 | and there's a statistical representation of you,
00:32:59.900 | and you engage with the app, and it says,
00:33:02.540 | "Lex, your best to engage with this meditation exercise
00:33:07.300 | "in the following settings.
00:33:10.980 | "At this time of day, after eating this kind of food
00:33:13.940 | "or not eating, fasting, with this level of blood glucose
00:33:17.320 | "and this kind of night's sleep."
00:33:19.740 | But all these data combined to give you
00:33:23.480 | this contextually relevant experience,
00:33:26.660 | just like we do with our sleep.
00:33:27.860 | You've optimized your entire life
00:33:29.820 | based upon what information you can acquire
00:33:32.060 | and know about yourself.
00:33:33.260 | And so the question is, how much do we really know
00:33:37.260 | of the things going around us?
00:33:38.500 | And I would venture to guess in my life experience,
00:33:41.220 | my self-awareness captures an extremely small percent
00:33:46.440 | of the things that actually influence
00:33:47.760 | my conscious and unconscious experience.
00:33:50.500 | - Well, in some sense, the data would help encourage you
00:33:54.540 | to be more self-aware,
00:33:55.700 | not just because you trust everything the data is saying,
00:33:58.600 | but it'll give you a prod to start investigating.
00:34:04.620 | Like I would love to get a rating,
00:34:08.020 | like a ranking of all the things I do
00:34:11.140 | and what are the things,
00:34:12.300 | it's probably important to do without the data,
00:34:14.740 | but the data will certainly help.
00:34:16.260 | It's like rank all the things you do in life
00:34:19.800 | and which ones make you feel shitty,
00:34:21.740 | which ones make you feel good.
00:34:23.540 | Like you were talking about evening, Bryan.
00:34:26.080 | It's a good example of somebody like,
00:34:30.300 | I do pig out at night as well.
00:34:32.660 | (Lex laughs)
00:34:33.500 | And it never makes me feel good.
00:34:35.700 | - Lex, you're in a safe space.
00:34:37.380 | - This is a safe space. - You're in a safe space.
00:34:39.460 | Let's hear it.
00:34:40.420 | - No, I definitely have much less self-control at night
00:34:43.140 | and it's interesting.
00:34:44.380 | And the same, people might criticize this,
00:34:47.740 | but I know my own body.
00:34:49.980 | I know when I eat carnivore, just eat meat,
00:34:52.800 | I feel much better
00:34:54.020 | than if I eat more carbs.
00:35:00.100 | The more carbs I eat, the worse I feel.
00:35:02.300 | I don't know why that is.
00:35:04.180 | There is science supporting it,
00:35:05.260 | but I'm not leaning on science.
00:35:06.620 | I'm leaning on personal experience.
00:35:07.700 | And that's really important.
00:35:09.340 | I don't need to read,
00:35:11.420 | I'm not gonna go on a whole rant about nutrition science,
00:35:13.620 | but many of those studies are very flawed.
00:35:17.140 | They're doing their best,
00:35:18.380 | but nutrition science is a very difficult field of study
00:35:21.660 | because humans are so different
00:35:24.220 | and the mind has so much impact
00:35:26.120 | on the way your body behaves.
00:35:28.460 | And it's so difficult from a scientific perspective
00:35:30.820 | to conduct really strong studies
00:35:32.780 | that you have to be almost like a scientist of one.
00:35:36.980 | You have to do these studies on yourself.
00:35:39.060 | That's the best way to understand what works for you or not.
00:35:41.740 | And I don't understand why, 'cause it sounds unhealthy,
00:35:44.580 | but eating only meat always makes me feel good.
00:35:47.480 | Just eat meat, that's it.
00:35:49.740 | And I don't have any allergies, any of that kind of stuff.
00:35:52.900 | I'm not full like Jordan Peterson,
00:35:54.560 | where if he deviates a little bit from the carnivore diet,
00:36:01.380 | he goes off like the cliff.
00:36:03.180 | No, I can have, like, chocolate.
00:36:04.980 | I can go off the diet, I feel fine.
00:36:07.700 | It's a gradual worsening of how I feel.
00:36:12.700 | But when I eat only meat, I feel great.
00:36:17.340 | And it'd be nice to be reminded of that.
00:36:19.300 | Like it's a very simple fact
00:36:20.940 | that I feel good when I eat carnivore.
00:36:24.180 | And I think that repeats itself in all kinds of experiences.
00:36:27.580 | Like I feel really good when I exercise.
00:36:32.580 | Not I hate exercise, okay.
00:36:37.180 | But in the rest of the day,
00:36:39.780 | the impact it has on my mind and the clarity of mind
00:36:43.660 | and the experiences and the happiness
00:36:45.820 | and all those kinds of things, I feel really good.
00:36:48.140 | And to be able to concretely express that through data
00:36:52.460 | would be nice.
00:36:53.820 | It would be a nice reminder, almost like a statement.
00:36:55.840 | Like remember what feels good and what not.
00:36:58.300 | And there could be things like that I'm not,
00:37:02.460 | many things like you're suggesting
00:37:04.600 | that I could not be aware of.
00:37:07.060 | They might be sitting right in front of me
00:37:09.100 | that make me feel really good and make me feel not good.
00:37:12.180 | And the data would show that.
00:37:14.020 | - I agree with you.
00:37:14.840 | I've actually employed the same strategy.
00:37:17.340 | I fired my mind entirely
00:37:20.140 | from being responsible for constructing my diet.
00:37:23.260 | And so I started doing a program
00:37:24.640 | where I now track over 200 biomarkers every 90 days.
00:37:28.700 | And it captures, of course,
00:37:30.740 | the things you would expect like cholesterol,
00:37:32.940 | but also DNA methylation
00:37:34.360 | and all kinds of things about my body,
00:37:37.120 | all the processes that make up me.
00:37:39.380 | And then I let that data generate the shopping list.
00:37:42.080 | And so I never actually ask my mind what it wants.
00:37:45.580 | It's entirely what my body is reporting that it wants.
00:37:48.340 | And so I call this goal alignment within Bryan.
00:37:51.780 | And there's 200 plus actors
00:37:53.460 | that I'm currently asking their opinion of.
00:37:55.220 | And so I'm asking my liver, how are you doing?
00:37:57.760 | And it's expressing via the biomarkers.
00:38:00.180 | And so then I construct that diet
00:38:02.220 | and I only eat those foods until my next testing round.
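
A minimal sketch of that "let the measurements write the shopping list" loop. The markers, reference ranges, and food mappings below are invented for illustration; they are not Bryan's actual protocol and not nutrition advice.

    # Map out-of-range biomarkers to shopping-list changes (all values hypothetical).
    reference_ranges = {"ferritin": (30, 300), "vitamin_d": (30, 80), "hs_crp": (0, 1)}
    food_for_low_marker = {"ferritin": "lentils", "vitamin_d": "salmon"}
    food_to_drop_when_high = {"hs_crp": "refined sugar"}

    def shopping_list(panel):
        add, remove = [], []
        for marker, value in panel.items():
            low, high = reference_ranges[marker]
            if value < low and marker in food_for_low_marker:
                add.append(food_for_low_marker[marker])
            if value > high and marker in food_to_drop_when_high:
                remove.append(food_to_drop_when_high[marker])
        return {"add": add, "remove": remove}

    print(shopping_list({"ferritin": 20, "vitamin_d": 50, "hs_crp": 2.4}))
    # {'add': ['lentils'], 'remove': ['refined sugar']}
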
00:38:06.140 | And that has changed my life more than I think anything else
00:38:10.780 | because in the demotion of my conscious mind
00:38:15.780 | that I gave primacy to my entire life,
00:38:17.940 | it led me astray because like you were saying,
00:38:20.460 | the mind then goes out into the world
00:38:22.460 | and it navigates the dozens of different dietary regimens
00:38:27.460 | people put together in books.
00:38:29.060 | And it all has their supporting science
00:38:32.740 | in certain contextual settings, but it's not N of one.
00:38:35.680 | And like you're saying, this dietary really is an N of one.
00:38:39.380 | What people have published scientifically, of course,
00:38:41.880 | can be used for nice groundings,
00:38:46.420 | but it changes when you get to an N of one level.
00:38:48.540 | And so that's what gets me excited about brain interfaces
00:38:51.620 | is if I could do the same thing for my brain
00:38:54.600 | where I can stop asking my conscious mind for its advice
00:38:57.980 | or for its decision-making, which is flawed.
00:39:01.500 | And I'd rather just look at this data.
00:39:03.540 | And I've never had better health markers in my life
00:39:05.640 | than when I stopped actually asking myself
00:39:08.020 | to be in charge of it.
00:39:09.680 | - The idea of demotion of the conscious mind
00:39:13.900 | is such a sort of engineering way of phrasing
00:39:19.040 | like meditation with the-
00:39:20.240 | (laughs)
00:39:21.080 | - I mean, that's what we're doing, right?
00:39:22.520 | - Yeah, yeah, that's beautiful.
00:39:23.520 | That means really beautifully put.
00:39:26.120 | By the way, testing round, what does that look like?
00:39:28.560 | What's that?
00:39:29.900 | Well, you mentioned-
00:39:31.340 | - Yeah, the tests I do?
00:39:33.240 | - Yes.
00:39:34.080 | - So includes a complete blood panel.
00:39:37.560 | I do a microbiome test.
00:39:39.020 | I do a food inflammation, a diet-induced inflammation.
00:39:43.560 | So I look for cytokine expressions.
00:39:45.440 | So foods that produce inflammatory reactions.
00:39:48.400 | I look at my neuroendocrine systems.
00:39:50.160 | I look at all my neurotransmitters.
00:39:52.140 | I do, yeah, there's several micronutrient tests
00:39:57.480 | to see how I'm looking at the various nutrients.
00:39:59.440 | - What about like self-report of like how you feel?
00:40:04.080 | You know, almost like,
00:40:05.320 | you can't demote your,
00:40:09.200 | you still exist within your conscious mind, right?
00:40:12.560 | So that lived experience is of a lot of value.
00:40:16.320 | So how do you measure that?
00:40:17.960 | - I do a temporal sampling over some duration of time.
00:40:20.960 | So I'll think through how I feel over a week,
00:40:23.760 | over a month, over three months.
00:40:25.800 | I don't do a temporal sampling of
00:40:27.580 | if I'm at the grocery store in front of a cereal box
00:40:29.720 | and be like, you know what?
00:40:30.960 | Captain Crunch is probably the right thing for me today
00:40:33.320 | 'cause I'm feeling like I need a little fun in my life.
00:40:36.600 | And so it's a temporal sampling.
00:40:37.680 | If the data set's large enough,
00:40:38.880 | then I smooth out the function of my natural oscillations
00:40:42.240 | of how I feel about life,
00:40:43.200 | where some days I may feel upset
00:40:45.400 | or depressed or down or whatever.
00:40:47.440 | And I don't want those moments
00:40:48.960 | to then rule my decision-making.
00:40:50.520 | That's why the demotion happens.
00:40:52.520 | And it says, really,
00:40:53.360 | if you're looking at health over a 90-day period of time,
00:40:56.560 | all my 200 voices speak up on that interval.
00:41:00.080 | And they're all given voice to say,
00:41:02.400 | this is how I'm doing and this is what I want.
00:41:04.640 | And so it really is an accounting system for everybody.
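
That temporal-sampling idea can be read as a simple smoothing operation over self-reports, so a single low day does not dominate a 90-day decision. A sketch with made-up daily ratings:

    def rolling_mean(values, window=7):
        # Average each point with up to `window - 1` preceding points.
        out = []
        for i in range(len(values)):
            chunk = values[max(0, i - window + 1): i + 1]
            out.append(sum(chunk) / len(chunk))
        return out

    daily_mood = [7, 8, 6, 3, 7, 8, 7, 2, 7, 8]  # 1-10 self-reports, two rough days
    print([round(m, 1) for m in rolling_mean(daily_mood)])
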
00:41:07.160 | So that's why I think that
00:41:09.000 | if you think about the future of being human,
00:41:13.760 | there's two things I think that are really going on.
00:41:17.800 | One is the design, manufacturing,
00:41:20.000 | and distribution of intelligence
00:41:21.600 | is heading towards zero, kind of cost curve,
00:41:25.680 | over a certain design, over a certain timeframe.
00:41:30.200 | But our ability to, you know,
00:41:31.720 | evolution produced us an intelligent,
00:41:34.280 | a form of intelligence.
00:41:35.600 | We are now designing our own intelligence systems.
00:41:38.560 | And the design, manufacturing,
00:41:39.720 | and distribution of that intelligence
00:41:40.880 | over a certain timeframe
00:41:42.800 | is gonna go to a cost of zero.
00:41:45.080 | - Design, manufacturing, distribution of intelligence,
00:41:48.040 | the cost is going to zero.
00:41:49.680 | - For example-- - Again,
00:41:51.280 | just give me a second.
00:41:52.520 | That's brilliant, okay.
00:41:54.520 | And evolution is doing the design, manufacturing,
00:41:58.280 | distribution of intelligence.
00:41:59.840 | And now we are doing the design, manufacturing,
00:42:02.360 | distribution of intelligence.
00:42:04.360 | And the cost of that is going to zero.
00:42:06.560 | That's a very nice way of looking at life on Earth.
00:42:10.760 | - So if that, that's going on.
00:42:12.200 | And then now in parallel to that,
00:42:14.680 | then you say, okay, what then happens
00:42:19.000 | if when that cost curve is heading to zero,
00:42:23.360 | our existence becomes a goal alignment problem,
00:42:28.720 | a goal alignment function.
00:42:31.320 | And so the same thing I'm doing
00:42:33.000 | where I'm doing goal alignment within myself
00:42:34.840 | of these 200 biomarkers, where I'm saying,
00:42:37.520 | when Bryan exists on a daily basis
00:42:41.680 | and this entity is deciding what to eat
00:42:43.680 | and what to do and et cetera,
00:42:45.280 | it's not just my conscious mind, which is opining,
00:42:47.760 | it's 200 biological processes.
00:42:50.160 | And there's a whole bunch of more voices involved.
00:42:52.080 | So in that equation,
00:42:55.400 | we're going to increasingly automate
00:43:00.000 | the things that we spend high energy on today
00:43:03.960 | 'cause it's easier.
00:43:05.320 | And now we're going to then negotiate
00:43:08.600 | the terms and conditions of intelligent life.
00:43:11.160 | Now we say conscious existence because we're biased
00:43:13.160 | because that's what we have,
00:43:15.040 | but it will be the largest computational exercise in history
00:43:18.640 | 'cause you're now doing goal alignment with planet Earth,
00:43:22.160 | within yourself, with each other,
00:43:24.040 | within all the intelligent agents we're building,
00:43:26.880 | bots and other voice assistants.
00:43:29.400 | You basically have a trillions and trillions of agents
00:43:32.000 | working on the negotiation of goal alignment.
00:43:35.840 | - Yeah, this is in fact true.
00:43:39.160 | And what was the second thing?
00:43:40.560 | - That was it.
00:43:41.400 | So the cost, the design, manufacturing,
00:43:43.840 | distribution of intelligence going to zero,
00:43:45.840 | which then means what's really going on?
00:43:48.640 | What are we really doing?
00:43:50.160 | We're negotiating the terms and conditions of existence.
00:43:55.040 | - Do you worry about the survival of this process?
00:43:59.280 | That life as we know it on Earth comes to an end
00:44:04.920 | or at least intelligent life,
00:44:06.640 | that as the cost goes to zero,
00:44:09.600 | something happens where all of that intelligence
00:44:13.400 | is thrown in the trash by something like nuclear war
00:44:17.600 | or development of AGI systems that are very dumb.
00:44:20.800 | Not AGI, I guess, but AI systems, the paperclip thing,
00:44:25.360 | en masse is dumb but has unintended consequences
00:44:28.960 | to where it destroys human civilization.
00:44:31.080 | Do you worry about those kinds of things?
00:44:32.320 | - I mean, it's unsurprising that a new thing comes into
00:44:38.640 | this sphere of human consciousness.
00:44:41.000 | Humans identify the foreign object,
00:44:43.040 | in this case, artificial intelligence.
00:44:45.480 | Our amygdala fires up and says, "Scary, foreign.
00:44:49.120 | "We should be apprehensive about this."
00:44:53.400 | And so it makes sense from a biological perspective
00:44:57.040 | that humans, the knee jerk reaction is fear.
00:45:00.940 | What I don't think has been properly weighted with that
00:45:08.240 | is that we are the first generation
00:45:12.480 | of intelligent beings on this Earth
00:45:15.160 | that has been able to look out over their expected lifetime
00:45:18.880 | and see there is a real possibility of evolving
00:45:23.400 | into entirely novel forms of consciousness.
00:45:25.880 | So different that it would be totally unrecognizable
00:45:32.600 | to us today.
00:45:33.440 | We don't have words for it.
00:45:34.260 | We can't hint at it.
00:45:35.100 | We can't point at it.
00:45:35.920 | We can't, you can't look in the sky and see that
00:45:38.160 | thing that is shining, we're gonna go up there.
00:45:40.920 | You cannot even create an aspirational statement about it.
00:45:45.920 | And instead, we've had this knee jerk reaction of fear
00:45:51.800 | about everything that could go wrong.
00:45:53.720 | But in my estimation, this should be the defining aspiration
00:45:58.720 | of all intelligent life on Earth that we would aspire.
00:46:04.800 | That basically every generation surveys the landscape
00:46:08.240 | of possibilities that are afforded,
00:46:09.600 | given the technological, cultural
00:46:11.200 | and other contextual situation that they're in.
00:46:14.720 | We're in this context.
00:46:16.560 | We haven't yet identified this and said,
00:46:18.840 | this is unbelievable.
00:46:21.920 | We should carefully think this thing through,
00:46:24.200 | not just of mitigating the things that'll wipe us out,
00:46:27.280 | but like we have this potential.
00:46:29.720 | And so we just haven't given a voice to it,
00:46:31.760 | even though it's within this realm of possibilities.
00:46:33.920 | - So you're excited about the possibility
00:46:35.520 | of superintelligent systems
00:46:37.200 | and what the opportunities that bring.
00:46:39.160 | I mean, there's parallels to this.
00:46:41.840 | You think about people before the internet,
00:46:44.120 | as the internet was coming to life,
00:46:46.120 | I mean, there's kind of a fog through which you can't see.
00:46:51.020 | What does the future look like?
00:46:52.620 | Predicting collective intelligence,
00:46:56.440 | which I don't think we're understanding
00:46:57.720 | that we're living through right now,
00:46:59.280 | is that we've, in some sense,
00:47:03.080 | stopped being individual intelligences
00:47:05.560 | and become much more like collective intelligences,
00:47:09.400 | because ideas travel much, much faster now.
00:47:13.760 | And they can, in a viral way,
00:47:16.360 | like sweep across the population.
00:47:18.880 | And so it's almost, I mean, it almost feels like
00:47:23.320 | a thought is had by many people now,
00:47:26.480 | thousands or millions of people,
00:47:27.880 | as opposed to an individual person.
00:47:29.760 | And that's changed everything.
00:47:30.820 | But to me, I don't think we're realizing
00:47:33.040 | how much that actually changed people or societies.
00:47:36.200 | But to predict that before the internet
00:47:38.960 | would have been very difficult.
00:47:41.200 | And in that same way, we're sitting here
00:47:43.180 | with the fog before us, thinking,
00:47:45.280 | what are superintelligent systems,
00:47:49.600 | how is that going to change the world?
00:47:51.780 | What is increasing the bandwidth,
00:47:54.760 | like plugging our brains into this whole thing,
00:47:59.360 | how is that going to change the world?
00:48:01.320 | And it seems like it's a fog, you don't know.
00:48:05.760 | And it could be, it could,
00:48:09.000 | whatever comes to be could destroy the world.
00:48:12.440 | We could be the last generation.
00:48:14.040 | But it also could transform in ways that creates
00:48:18.720 | an incredibly fulfilling life experience
00:48:24.520 | that's unlike anything we've ever experienced.
00:48:27.360 | It might involve dissolution of ego and consciousness
00:48:30.400 | and so on, you're no longer one individual.
00:48:32.640 | It might be more, that might be a certain kind of death,
00:48:37.440 | an ego death.
00:48:38.920 | But the experience might be really exciting and enriching.
00:48:42.240 | Maybe we'll live in a virtual,
00:48:44.040 | like it's funny to think about
00:48:47.040 | a bunch of sort of hypothetical questions of,
00:48:50.480 | would it be more fulfilling to live in a virtual world?
00:48:55.480 | Like if you were able to plug your brain in
00:48:59.520 | in a very dense way into a video game,
00:49:02.700 | like which world would you want to live in?
00:49:05.680 | In the video game or in the physical world?
00:49:08.840 | For most of us, we kind of
00:49:11.460 | toy with the idea of the video game,
00:49:14.080 | but we still want to live in the physical world,
00:49:16.040 | have friendships and relationships in the physical world.
00:49:20.080 | But we don't know that.
00:49:21.760 | Again, it's a fog.
00:49:23.360 | And maybe in 100 years,
00:49:25.460 | we're all living inside a video game,
00:49:27.160 | hopefully not "Call of Duty."
00:49:29.040 | Hopefully more like "Sims 5."
00:49:31.680 | Which version is it on?
00:49:33.040 | For you individually though,
00:49:36.040 | does it make you sad that your brain ends?
00:49:39.900 | That you die one day very soon?
00:49:44.320 | That the whole thing,
00:49:46.800 | that data source just goes offline
00:49:50.480 | sooner than you would like?
00:49:53.580 | - That's a complicated question.
00:49:56.040 | I would have answered it differently.
00:49:59.200 | In different times in my life,
00:50:00.640 | I had chronic depression for 10 years.
00:50:03.280 | And so in that 10 year time period,
00:50:06.080 | I desperately wanted lights to be off.
00:50:09.140 | And the thing that made it even worse is,
00:50:12.120 | I was in a religious,
00:50:13.600 | I was born into a religion.
00:50:15.880 | It was the only reality I ever understood.
00:50:18.040 | It's difficult to articulate to people
00:50:20.320 | when you're born into that kind of reality,
00:50:22.200 | and it's the only reality you're exposed to,
00:50:24.880 | you are literally blinded
00:50:26.600 | to the existence of other realities,
00:50:28.200 | because it's so much the in-group, out-group thing.
00:50:31.440 | And so in that situation,
00:50:33.820 | it was not only that I desperately wanted lights out forever,
00:50:37.320 | it was that I couldn't have lights out forever.
00:50:38.840 | It was that there was an afterlife.
00:50:40.720 | And this afterlife had this system
00:50:45.480 | that would either penalize or reward you for your behaviors.
00:50:50.480 | And so it was almost like this indescribable hopelessness
00:50:57.820 | of not only being in hopeless despair
00:51:01.060 | of not wanting to exist,
00:51:02.200 | but then also being forced to exist.
00:51:05.160 | And so there was a duration of my time,
00:51:06.780 | of a duration of life where I'd say,
00:51:08.600 | like, yes, I have no remorse for lights being out,
00:51:13.300 | and actually wanted it more than anything in the entire world.
00:51:16.360 | There are other times where I'm looking out at the future
00:51:21.240 | and I say, this is an opportunity for a future of
00:51:25.920 | evolving human conscious experience
00:51:27.860 | that is beyond my ability to understand.
00:51:31.180 | And I jump out of bed and I race to work
00:51:34.380 | and I can't think about anything else.
00:51:37.600 | But I think the reality for me is,
00:51:42.600 | I don't know what it's like to be in your head,
00:51:45.740 | but in my head, when I wake up in the morning,
00:51:48.960 | I don't say, good morning, Brian, I'm so happy to see you.
00:51:52.880 | Like, I'm sure you're just gonna be beautiful to me today.
00:51:55.960 | You're not gonna make a huge long list
00:51:57.600 | of everything you should be anxious about.
00:51:59.120 | You're not gonna repeat that list to me 400 times.
00:52:01.420 | You're not gonna have me relive
00:52:03.040 | all the regrets I've made in life.
00:52:04.640 | I'm sure you're not doing any of that.
00:52:06.040 | You're just gonna just help me along all day long.
00:52:08.880 | I mean, it's a brutal environment in my brain.
00:52:11.800 | And we've just become normalized to this environment
00:52:15.560 | that we just accept that this is what it means to be human.
00:52:18.400 | But if we look at it,
00:52:20.560 | if we try to muster as much soberness as we can
00:52:24.080 | about the realities of being human, it's brutal.
00:52:27.280 | If it is for me.
00:52:28.240 | And so, am I sad that the brain may be off one day?
00:52:33.240 | It depends on the contextual setting.
00:52:38.280 | Like, how am I feeling?
00:52:39.120 | At what moment are you asking me that?
00:52:40.520 | And my mind is so fickle.
00:52:42.440 | And this is why, again, I don't trust my conscious mind.
00:52:45.200 | I have been given realities.
00:52:47.480 | I was given a religious reality
00:52:50.080 | that was a video game.
00:52:51.520 | And then I figured out it was not a real reality.
00:52:54.520 | And then I lived in a depressive reality,
00:52:56.240 | which delivered this terrible hopelessness.
00:52:59.760 | That wasn't a real reality.
00:53:00.760 | Then I discovered behavioral psychology,
00:53:03.160 | and I figured out how biased I am,
00:53:04.960 | the 188 cognitive biases,
00:53:06.400 | and how my brain is distorting reality all the time.
00:53:08.880 | I have gone from one reality to another.
00:53:11.320 | I don't trust reality.
00:53:13.600 | I don't trust realities that are given to me.
00:53:15.840 | And so, to try to make a decision on what I value
00:53:18.960 | or not value that future state,
00:53:20.600 | I don't trust my response.
00:53:22.000 | - So, not fully listening to the conscious mind
00:53:27.880 | at any one moment as the ultimate truth,
00:53:31.040 | but allowing you to go up and down as it does,
00:53:33.760 | and just kind of observing it.
00:53:35.320 | - Yes, I assume that whatever my conscious mind
00:53:38.000 | delivers up to my awareness is wrong,
00:53:41.600 | upon landing.
00:53:43.240 | And I just need to figure out where it's wrong,
00:53:45.080 | how it's wrong, how wrong it is,
00:53:46.920 | and then try to correct for it as best I can.
00:53:49.280 | But I assume that on impact,
00:53:52.400 | it's mistaken in some critical ways.
00:53:55.600 | - Is there something you can say by way of advice
00:53:58.320 | when the mind is depressive,
00:54:00.880 | when the conscious mind serves up something that,
00:54:03.680 | you know, dark thoughts,
00:54:07.800 | how you deal with that,
00:54:08.960 | like how in your own life you've overcome that,
00:54:11.080 | and others who are experienced in that can overcome it?
00:54:14.480 | - Two things.
00:54:16.320 | One, those depressive states
00:54:20.720 | are biochemical states.
00:54:25.680 | It's not you.
00:54:27.640 | And the suggestions that these things,
00:54:31.560 | that this state delivers to you
00:54:33.040 | about the hopelessness of life,
00:54:35.120 | or the meaninglessness of it,
00:54:37.440 | or that you should hit the eject button,
00:54:41.240 | that's a false reality.
00:54:44.320 | - Yeah.
00:54:45.160 | - And I completely understand
00:54:50.160 | the rational decision to commit suicide.
00:54:52.840 | It is not lost on me at all
00:54:55.160 | that that is an irrational situation,
00:54:57.360 | but the key is when you're in that situation
00:54:59.800 | and those thoughts are landing,
00:55:01.720 | to be able to say, "Thank you, you're not real.
00:55:04.880 | I know you're not real."
00:55:07.800 | - Yeah.
00:55:08.640 | - And so I'm in a situation where for whatever reason
00:55:10.600 | I'm having this neurochemical state,
00:55:13.760 | but that state can be altered.
00:55:16.320 | And so again, it goes back to the realities
00:55:18.640 | of the difficulties of being human.
00:55:21.240 | And when I was trying to solve my depression,
00:55:22.920 | I tried literally, if you name it,
00:55:25.920 | I tried it systematically and nothing would fix it.
00:55:29.360 | And so this is what gives me hope with brain interfaces.
00:55:32.320 | For example, could I have numbers on my brain?
00:55:35.400 | Can I see what's going on?
00:55:36.480 | Because I go to the doctor and it's like,
00:55:38.200 | "How do you feel?"
00:55:39.040 | "I don't know, terrible."
00:55:41.360 | Like on a scale from one to 10,
00:55:42.200 | how bad do you want to commit suicide?
00:55:44.320 | (laughing)
00:55:45.160 | You're like, "Okay."
00:55:45.980 | - Yeah, at this moment.
00:55:47.320 | - Here's this bottle.
00:55:48.400 | How much should I take?
00:55:49.240 | "Well, I don't know."
00:55:50.060 | Like just.
00:55:51.080 | - Yeah, it's very, very crude.
00:55:52.880 | And this data opens up the,
00:55:55.060 | yeah, it opens up the possibility of really helping
00:56:00.280 | in those dark moments to first understand
00:56:03.480 | the ways, the ups and downs of those dark moments.
00:56:06.080 | On the complete flip side of that,
00:56:11.160 | I am very conscious in my own brain
00:56:14.880 | and deeply, deeply grateful that,
00:56:18.280 | it's almost like a chemistry thing, a biochemistry thing,
00:56:23.000 | that I go many times throughout the day,
00:56:26.080 | I'll look at like this cup and I'll be overcome with joy
00:56:31.080 | how amazing it is to be alive.
00:56:34.120 | Like I actually think my biochemistry is such
00:56:37.440 | that it's not as common.
00:56:40.720 | Like I've talked to people
00:56:42.480 | and I don't think that's that common.
00:56:44.680 | Like it's, and it's not a rational thing at all.
00:56:48.240 | It's like, I feel like I'm on drugs
00:56:51.920 | and I'll just be like, "Whoa."
00:56:54.240 | And a lot of people talk about like
00:56:56.760 | the meditative experience will allow you to sort of,
00:56:59.600 | you know, look at some basic things
00:57:01.120 | like the movement of your hand as deeply joyful
00:57:04.520 | because it's like, that's life.
00:57:06.240 | But I get that from just looking at a cup.
00:57:08.480 | Like I'm waiting for the coffee to brew
00:57:10.040 | and I'll just be like, "Fuck, life is awesome."
00:57:15.040 | And I'll sometimes tweet that,
00:57:16.440 | but then I'll like regret it later.
00:57:17.760 | Like, "God damn it, you're so ridiculous."
00:57:20.040 | But yeah, so, but that is purely chemistry.
00:57:24.440 | There's no rational, it doesn't fit with the rest of my life.
00:57:28.120 | I have all this shit, I'm always late to stuff.
00:57:30.760 | I'm always like, there's all this stuff,
00:57:32.400 | you know, I'm super self-critical,
00:57:34.320 | like really self-critical about everything I do.
00:57:37.560 | To the point I almost hate everything I do.
00:57:39.840 | But there's this engine of joy for life outside of all that.
00:57:43.480 | And that has to be chemistry.
00:57:45.200 | And the flip side of that is what depression probably is,
00:57:48.480 | is the opposite of that feeling of like,
00:57:53.480 | 'cause I bet you that feeling of the cup being amazing
00:57:57.680 | would save anybody in a state of depression.
00:58:01.000 | Like that would be like fresh,
00:58:02.720 | you're in a desert and it's a drink of water.
00:58:07.520 | Shit, man, the brain is a,
00:58:09.320 | it would be nice to understand where that's coming from.
00:58:13.440 | To be able to understand how you hit those lows
00:58:18.440 | and those highs that have nothing to do
00:58:20.120 | with the actual reality.
00:58:21.840 | It has to do with some very specific aspects
00:58:24.320 | of how you maybe see the world.
00:58:27.800 | Maybe it could be just like basic habits that you engage in
00:58:31.080 | and then how to walk along the line
00:58:32.960 | to find those experiences of joy.
00:58:35.600 | - And this goes back to the discussion we're having
00:58:37.440 | of how human cognition is, in volume,
00:58:41.160 | the largest input of raw material into society.
00:58:45.600 | - Yeah.
00:58:46.440 | - And it's not quantified.
00:58:47.880 | We have no bearings on it.
00:58:50.240 | And so we just, you wonder,
00:58:53.520 | we both articulated some of the challenges
00:58:56.520 | we have in our own mind.
00:58:57.720 | And it's likely that others would say,
00:59:02.320 | I have something similar.
00:59:03.880 | And you wonder when you look at society,
00:59:05.960 | how does that contribute to all the other
00:59:10.920 | compounding problems that we're experiencing?
00:59:12.800 | How does that blind us to the opportunities
00:59:16.800 | we could be looking at?
00:59:18.600 | And so it really, it has this potential
00:59:21.840 | distortion effect on reality
00:59:25.000 | that just makes everything worse.
00:59:27.120 | And I hope if we can put some,
00:59:30.200 | if we can assign some numbers to these things
00:59:34.600 | just to get our bearings,
00:59:36.320 | so we're aware of what's going on,
00:59:38.080 | if we could find greater stabilization
00:59:40.160 | in how we conduct our lives and how we build society,
00:59:42.800 | it might be the thing that enables us to scaffold.
00:59:49.560 | 'Cause we've really, again, we've done,
00:59:52.600 | humans have done a fantastic job
00:59:53.960 | systematically scaffolding technology
00:59:58.160 | and science and institutions.
01:00:00.400 | It's human, it's our own selves,
01:00:02.840 | which we have not been able to scaffold.
01:00:05.320 | We are the one part of this intelligence infrastructure
01:00:08.440 | that remains unchanged.
01:00:09.680 | - Is there something you could say about
01:00:14.280 | coupling this brain data with
01:00:18.040 | not just the basic human experience,
01:00:19.600 | but say an experience, you mentioned sleep,
01:00:22.640 | but the wildest experience, which is psychedelics.
01:00:27.440 | And there's been quite a few studies now
01:00:29.960 | that are being approved and run,
01:00:33.440 | which is exciting from a scientific perspective
01:00:36.560 | on psychedelics.
01:00:38.160 | Do you think, what do you think happens
01:00:40.880 | to the brain on psychedelics?
01:00:42.400 | And how can data about this help us understand it?
01:00:47.460 | And when you're on DMT, do you see elves?
01:00:50.680 | And can we convert that into data?
01:00:54.440 | - Can you add aliens in there?
01:00:56.440 | - Yeah, aliens, definitely.
01:00:57.680 | Do you actually meet aliens?
01:00:58.800 | And elves, are elves the aliens?
01:01:00.640 | I'm asking for a few Austin friends
01:01:04.080 | that are convinced that they've actually met the elves.
01:01:08.960 | - What are elves like?
01:01:09.800 | Are they friendly?
01:01:11.040 | Are they helpful?
01:01:11.880 | - I haven't met them personally.
01:01:12.720 | - Are they like the Smurfs,
01:01:13.760 | so like they're industrious
01:01:15.280 | and they have different skill sets and...
01:01:17.280 | - Yeah, I think they're very critical as friends.
01:01:25.440 | - They're trolls, the elves are trolls.
01:01:27.440 | - No, but they care about you.
01:01:30.160 | So there's a bunch of different version of trolls.
01:01:33.760 | There's loving trolls that are harsh on you,
01:01:37.360 | but they want you to be better.
01:01:38.640 | And there's trolls that just enjoy your destruction.
01:01:42.640 | And I think they're the ones that care for you
01:01:45.240 | with their criticism.
01:01:47.520 | See, I haven't met them directly.
01:01:49.920 | It's like a friend of a friend.
01:01:51.160 | - Yeah, via the telephone game.
01:01:53.200 | - Yeah, a bit of that. And the whole point is, on psychedelics
01:01:57.400 | and certainly on DMT,
01:01:58.880 | this is where the brain data versus word data fails,
01:02:04.880 | which is, words can't convey the experience.
01:02:08.200 | For most people, you can be poetic and so on,
01:02:10.520 | but it really does not convey the experience
01:02:12.280 | of what it actually means to meet the elves.
01:02:16.160 | - To me, what baselines this conversation is,
01:02:18.920 | imagine if we were interested in the health of your heart
01:02:23.040 | and we started and said, okay, Lex, self-introspect,
01:02:28.480 | tell me how's the health of your heart.
01:02:29.960 | And you sit there and you close your eyes
01:02:31.680 | and you think, feels all right, like things feel okay.
01:02:36.600 | And then you went to the cardiologist
01:02:38.240 | and the cardiologist is like, hey Lex,
01:02:39.960 | tell me how you feel.
01:02:41.000 | And you're like, well, actually what I'd really like you
01:02:43.000 | to do is do an EKG and a blood panel
01:02:45.800 | and look at arterial plaques
01:02:47.480 | and let's look at my cholesterol.
01:02:49.360 | And there's like five to 10 studies you would do.
01:02:53.240 | They would then give you this report and say,
01:02:54.480 | here's the quantified health of your heart.
01:02:58.080 | Now with this data, I'm going to prescribe
01:03:01.080 | the following regime of exercise
01:03:03.120 | and maybe I'll put you on a statin, et cetera.
01:03:06.160 | But the protocol is based upon this data.
01:03:08.560 | You would think the cardiologist is out of their mind
01:03:11.560 | if they just gave you a bottle of statins based upon,
01:03:14.320 | you're like, well, I think something's kind of wrong.
01:03:16.240 | And they just kind of experiment and see what happens.
01:03:20.080 | But that's what we do with our mental health today.
01:03:22.600 | So it's kind of absurd.
01:03:25.120 | And so if you look at psychedelics,
01:03:27.440 | to have, again, to be able to measure the brain
01:03:29.440 | and get a baseline state,
01:03:31.540 | and then to measure during a psychedelic experience
01:03:33.880 | and post the psychedelic experience
01:03:35.160 | and then do it longitudinally,
01:03:37.160 | you now have a quantification of what's going on.
01:03:39.480 | And so you could then pose questions,
01:03:41.760 | what molecule is appropriate at what dosages,
01:03:45.080 | at what frequency, in what contextual environment?
01:03:47.760 | What happens when I have this diet with this molecule,
01:03:50.120 | with this experience?
01:03:51.360 | All the experimentation you do
01:03:52.680 | when you have good sleep data or HRV.
01:03:55.600 | And so that's what I think,
01:03:57.780 | what we could potentially do with psychedelics
01:04:00.420 | is we could add this level of sophistication
01:04:03.120 | that is not in the industry currently.
01:04:05.260 | And it may improve the outcomes people experience,
01:04:09.040 | it may improve the safety and efficacy.
01:04:11.600 | And so that's what I hope we are able to achieve.
01:04:14.160 | And it would transform mental health
01:04:19.080 | because we would finally have numbers
01:04:21.220 | to work with to baseline ourselves.
01:04:22.520 | And then if you think about it,
01:04:24.120 | when we talk about things related to the mind,
01:04:26.840 | we talk about the modality.
01:04:28.280 | We use words like meditation or psychedelics
01:04:30.520 | or something else,
01:04:32.020 | because we can't talk about a marker in the brain.
01:04:34.160 | We can't use a word the way
01:04:35.960 | we talk about cholesterol,
01:04:37.000 | or plaque in the arteries,
01:04:38.360 | or HRV for the heart.
01:04:40.480 | And so if we have numbers,
01:04:41.480 | then the solutions get mapped to numbers
01:04:45.480 | instead of the modalities being the thing we talk about.
01:04:47.680 | Meditation just does good things in a crude fashion.
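To make the measurement protocol described above concrete, here is a minimal sketch in Python of a longitudinal baseline-versus-session comparison. Everything in it is hypothetical: the marker names, the dose field, and the example recordings are placeholders, not Kernel's actual data model or any validated neural biomarkers; it only illustrates the idea of quantifying change from a personal baseline across dose and context.

```python
from dataclasses import dataclass
from statistics import mean
from collections import defaultdict

# Hypothetical per-recording summary: a few scalar "markers" stand in for
# whatever features a brain interface might actually extract.
@dataclass
class Recording:
    phase: str          # "baseline", "during", or "post"
    dose_mg: float      # 0.0 for baseline recordings
    context: str        # e.g. "clinic", "home"
    markers: dict       # e.g. {"alpha_power": 0.42, "connectivity": 0.31}

def baseline_profile(recordings):
    """Average each marker over all baseline recordings."""
    baselines = [r for r in recordings if r.phase == "baseline"]
    keys = baselines[0].markers.keys()
    return {k: mean(r.markers[k] for r in baselines) for k in keys}

def deltas_by_dose(recordings):
    """Group non-baseline recordings by (dose, phase) and report change from baseline."""
    base = baseline_profile(recordings)
    grouped = defaultdict(list)
    for r in recordings:
        if r.phase == "baseline":
            continue
        grouped[(r.dose_mg, r.phase)].append(
            {k: r.markers[k] - base[k] for k in base})
    # Average the per-marker deltas within each (dose, phase) group.
    return {key: {k: mean(d[k] for d in ds) for k in base}
            for key, ds in grouped.items()}

if __name__ == "__main__":
    data = [
        Recording("baseline", 0.0, "home",   {"alpha_power": 0.40, "connectivity": 0.30}),
        Recording("baseline", 0.0, "home",   {"alpha_power": 0.44, "connectivity": 0.32}),
        Recording("during",  10.0, "clinic", {"alpha_power": 0.55, "connectivity": 0.45}),
        Recording("post",    10.0, "clinic", {"alpha_power": 0.47, "connectivity": 0.36}),
    ]
    for (dose, phase), delta in deltas_by_dose(data).items():
        print(f"dose={dose}mg phase={phase}: {delta}")
```

With even this crude structure, the questions posed above, which molecule, what dose, what frequency, what context, become empirical comparisons against a personal baseline rather than one-to-ten self-reports.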
01:04:51.320 | - So in your blog post,
01:04:53.720 | Zeroth Principles Thinking, good title,
01:04:56.080 | you ponder how do people come up
01:04:57.640 | with truly original ideas.
01:04:59.800 | What's your thoughts on this as a human
01:05:02.840 | and as a person who's measuring brain data?
01:05:05.640 | - Zeroth principles are building blocks.
01:05:09.520 | First principles are understanding of system laws.
01:05:14.480 | So if you take, for example, like in Sherlock Holmes,
01:05:17.360 | he's a first principles thinker.
01:05:18.840 | So he says, "Once you've eliminated the impossible,
01:05:23.840 | "whatever remains, however improbable, must be the truth."
01:05:29.120 | Whereas Dirk Gently,
01:05:32.200 | the holistic detective by Douglas Adams says,
01:05:35.600 | "I don't like eliminating the impossible."
01:05:38.240 | So when someone reasons from a first principles perspective,
01:05:42.320 | they're trying to assume the fewest number of things
01:05:48.920 | within a given timeframe.
01:05:50.520 | And so when I, after Braintree and Venmo,
01:05:55.540 | I set my mind to the question of
01:05:59.080 | what single thing can I do
01:06:01.620 | that would maximally increase the probability
01:06:03.600 | that the human race thrives beyond what we can even imagine.
01:06:07.160 | And I found that in my conversations with others,
01:06:10.640 | in the books I read, in my own deliberations,
01:06:13.660 | I had a missing piece of the puzzle
01:06:16.400 | because I didn't feel like over,
01:06:20.600 | yeah, I didn't feel like the future could be deduced
01:06:25.120 | from first principles thinking.
01:06:26.640 | And that's when I read the book,
01:06:29.720 | "Zero: The Biography of a Dangerous Idea."
01:06:33.160 | And I-- - It's a really good book,
01:06:34.600 | by the way.
01:06:36.400 | - I think it's my favorite book I've ever read.
01:06:38.240 | - It's also a really interesting number, zero.
01:06:40.600 | - And I wasn't aware
01:06:42.720 | that the number zero had to be discovered.
01:06:44.280 | I didn't realize that it caused a revolution in philosophy
01:06:47.240 | and just tore up math and it tore up,
01:06:50.360 | I mean, it builds modern society,
01:06:52.120 | but it wrecked everything in its way.
01:06:55.120 | It was an unbelievable disruptor
01:06:56.760 | and it was so difficult for society
01:06:59.480 | to get their heads around it.
01:07:01.320 | And so zero is, of course,
01:07:05.120 | a representation of zeroth principle thinking,
01:07:06.880 | which is about the caliber
01:07:09.920 | and consequential nature of an idea.
01:07:12.260 | And so when you talk about what kind of ideas
01:07:17.160 | have civilization transforming properties,
01:07:22.160 | oftentimes they fall in the zeroth category.
01:07:25.480 | And so in thinking this through,
01:07:28.040 | I was wanting to find a quantitative structure
01:07:31.960 | on how to think about these zeroth principles.
01:07:35.720 | So I came up with that to be a coupler
01:07:39.600 | with first principles thinking.
01:07:40.800 | And so now it's a staple as part of how I think
01:07:42.480 | about the world and the future.
01:07:45.200 | - So it emphasizes trying to identify,
01:07:47.640 | it lands on that word impossible,
01:07:49.560 | like what is impossible?
01:07:51.260 | Essentially trying to identify what is impossible
01:07:54.440 | and what is possible and being as,
01:07:56.700 | how do you, I mean, this is the thing
01:08:00.960 | is most of society tells you the range of things
01:08:04.320 | they say is impossible is very wide.
01:08:06.280 | So you need to be shrinking that.
01:08:08.000 | I mean, that's the whole process of this kind of thinking
01:08:11.800 | is you need to be very rigorous
01:08:14.680 | in trying to be,
01:08:17.800 | trying to draw the lines of what is actually impossible
01:08:22.200 | because very few things are actually impossible.
01:08:26.360 | I don't know what is actually impossible.
01:08:29.440 | Like the Joe Rogan thing: it's entirely possible.
01:08:32.960 | I like that approach to science,
01:08:36.340 | to engineering, to entrepreneurship.
01:08:38.920 | It's entirely possible.
01:08:40.520 | Basically shrink the impossible to zero,
01:08:43.360 | to very small set.
01:08:45.360 | - Yeah, life constraints favor first principles thinking
01:08:50.360 | because it enables faster action
01:08:55.200 | with higher probability of success.
01:08:57.580 | Pursuing zeroth principle optionality
01:09:00.880 | is expensive and uncertain.
01:09:02.560 | And so in a society constrained by resources,
01:09:06.080 | time and money and desire for social status,
01:09:10.040 | accomplishment, et cetera,
01:09:11.400 | it minimizes zeroth principle thinking.
01:09:14.520 | But the reason why I think zeroth principle thinking
01:09:16.800 | should be a staple of our shared cognitive infrastructure
01:09:21.800 | is if you look at the history of the past couple of thousand
01:09:25.760 | years, and let's just say we arbitrarily,
01:09:29.960 | we subjectively try to assess what is a zeroth-level idea.
01:09:33.840 | And we say, how many have occurred on what time scales
01:09:37.900 | and what were the contextual settings for it?
01:09:40.160 | I would argue that if you look at AlphaGo,
01:09:43.360 | when it played Go, the human Go players,
01:09:47.860 | when they saw AlphaGo's moves,
01:09:51.600 | attributed it to playing with an alien,
01:09:53.520 | playing Go with something
01:09:55.720 | from another dimension.
01:10:00.160 | And so if you say computational intelligence
01:10:04.360 | has an attribute of introducing zero-like insights,
01:10:09.360 | then if you say, what is going to be the occurrence
01:10:14.360 | of zeros in society going forward?
01:10:16.980 | And you could reasonably say,
01:10:19.600 | probably a lot more than have occurred
01:10:21.480 | and probably more at a faster pace.
01:10:24.200 | So then if you say, what happens if you have
01:10:26.360 | this computational intelligence throughout society
01:10:29.000 | that the manufacturing, design, and distribution
01:10:31.120 | of intelligence is now heading towards zero,
01:10:33.800 | you have an increased number of zeros being produced
01:10:37.060 | with a tight connection between human and computers.
01:10:40.360 | That's when I got to a point and said,
01:10:43.320 | we cannot predict the future with first principle thinking.
01:10:47.520 | We can't, that cannot be our imagination set.
01:10:50.400 | It can't be our sole anchor in the situation
01:10:55.400 | that basically the future of our conscious existence,
01:10:57.520 | 20, 30, 40, 50 years is probably a zero.
01:11:01.440 | - So just to clarify, when you say zero,
01:11:06.960 | you're referring to basically a truly revolutionary idea.
01:11:11.500 | - Yeah, something that is currently not a building block
01:11:17.040 | of our shared conscious existence,
01:11:21.440 | either in the form of knowledge.
01:11:23.100 | Yeah, it's currently not manifest in what we acknowledge.
01:11:28.520 | - So zeroth principle thinking is playing with ideas
01:11:32.440 | that are so revolutionary that we can't even clearly reason
01:11:37.440 | about the consequences once those ideas come to be.
01:11:42.000 | - Yeah, or for example, like Einstein,
01:11:46.060 | that was a zeroth, I would categorize it
01:11:49.520 | as a zeroth principle insight.
01:11:51.400 | - You mean general relativity, space-time.
01:11:53.000 | - Yeah, space-time, yep, yep.
01:11:55.040 | He basically built upon what Newton had done
01:11:59.080 | and said, yes, and also, and it just changed the fabric
01:12:04.080 | of our understanding of reality.
01:12:06.680 | And so that was unexpected, it existed.
01:12:09.060 | It became part of our awareness.
01:12:13.240 | And the moves AlphaGo made existed,
01:12:16.740 | it just came into our awareness.
01:12:19.060 | And so to your point, there's this question of,
01:12:24.060 | what do we know and what don't we know?
01:12:28.240 | Do we think we know 99% of all things
01:12:30.620 | or do we think we know 0.001% of all things?
01:12:34.220 | And that goes back to
01:12:35.740 | known knowns and unknown unknowns.
01:12:37.460 | And first principles and zeroth principle thinking
01:12:39.420 | gives us a quantitative framework to say,
01:12:42.720 | there's no way for us to mathematically
01:12:45.100 | try to create probabilities for these things.
01:12:47.720 | Therefore, it would be helpful
01:12:50.660 | if they were just part of our standard thought processes,
01:12:53.400 | because it may encourage different behaviors
01:12:58.340 | in what we do individually, collectively as a society,
01:13:01.300 | what we aspire to, what we talk about,
01:13:03.500 | the possibility sets we imagine.
01:13:05.700 | - Yeah, I've been engaged in that kind of thinking
01:13:09.100 | quite a bit and thinking about
01:13:12.320 | engineering of consciousness.
01:13:14.560 | I think it's feasible.
01:13:16.200 | I think it's possible in the language that we're using here.
01:13:19.240 | And it's very difficult to reason about a world
01:13:21.720 | when inklings of consciousness can be engineered
01:13:26.800 | into artificial systems.
01:13:29.800 | Not from a philosophical perspective,
01:13:33.440 | but from an engineering perspective,
01:13:35.600 | I believe a good step towards engineering consciousness
01:13:39.100 | is creating engineering the illusion of consciousness.
01:13:44.320 | I'm captivated by our natural predisposition
01:13:50.720 | to anthropomorphize things.
01:13:53.940 | And I think that's what we,
01:13:56.640 | I don't wanna hear from the philosophers,
01:14:00.720 | but I think that's what we kind of do to each other.
01:14:05.060 | - Okay.
01:14:05.900 | - That consciousness is created socially.
01:14:10.900 | That much of the power of consciousness
01:14:16.040 | is in the social interaction.
01:14:18.040 | I create your consciousness, no,
01:14:21.240 | I create my consciousness by having interacted with you.
01:14:24.480 | And that's the display of consciousness.
01:14:27.760 | It's the same as the display of emotion.
01:14:30.000 | Emotion is created through communication.
01:14:32.880 | Language is created through its use.
01:14:36.140 | And then we somehow, humans kinda,
01:14:38.060 | especially philosophers,
01:14:39.460 | the hard problem of consciousness,
01:14:40.840 | really wanna believe that we possess this thing
01:14:44.500 | that's like there's an elf sitting there
01:14:49.500 | with a hat that says consciousness,
01:14:52.700 | and they're feeding this subjective experience to us,
01:14:57.620 | as opposed to it actually being an illusion
01:15:02.180 | that we construct to make social communication
01:15:04.160 | more effective.
01:15:05.480 | And so I think if you focus on
01:15:08.560 | creating the illusion of consciousness,
01:15:10.600 | you can create some very fulfilling experiences in software.
01:15:14.840 | And so that to me is a compelling space of ideas to explore.
01:15:18.760 | - I agree with you.
01:15:19.600 | And I think going back to our experience together
01:15:21.680 | with brain interfaces on,
01:15:23.280 | you could imagine if we get to a certain level of maturity.
01:15:26.080 | So first let's take the inverse of this.
01:15:28.760 | So you and I text back and forth,
01:15:30.720 | and we're sending each other emojis.
01:15:33.140 | That has a certain amount of information transfer rate
01:15:37.120 | as we're communicating with each other.
01:15:39.300 | And so in our communication with people via email
01:15:41.940 | and text and whatnot,
01:15:42.780 | we've taken the bandwidth of human interaction,
01:15:46.900 | the information transfer rate, and we've reduced it.
01:15:49.940 | We have less social cues.
01:15:51.500 | We have less information to work with.
01:15:53.020 | There's a lot more opportunity for misunderstanding.
01:15:55.320 | So that is altering the conscious experience
01:15:57.500 | between two individuals.
01:15:59.420 | And if we add brain interfaces to the equation,
01:16:01.600 | let's imagine now we amplify
01:16:03.400 | the dimensionality of our communications.
01:16:05.560 | That to me is what you're talking about,
01:16:07.240 | which is consciousness engineering.
01:16:09.100 | Perhaps I understand you with more dimensions.
01:16:13.360 | So maybe I understand your,
01:16:15.160 | when you look at the cup
01:16:16.400 | and you experience that happiness,
01:16:17.620 | you can tell me you're happy.
01:16:18.620 | And I then do theory of mind and say,
01:16:20.520 | I can imagine what it might be like to be Lex
01:16:23.160 | and feel happy about seeing this cup.
01:16:25.320 | But if the interface could then quantify
01:16:26.760 | and give me a 50-dimensional vector space model
01:16:29.240 | and say, this is the version of happiness
01:16:31.560 | that Lex is experiencing as he looks at this cup,
01:16:34.240 | then it would allow me potentially
01:16:36.440 | to have much greater empathy for you
01:16:38.040 | and understand you as a human.
01:16:39.160 | This is how you experience joy,
01:16:41.880 | which is entirely unique from how I experienced joy,
01:16:44.160 | even though we assumed ahead of time
01:16:46.080 | that we're having some kind of similar experience.
01:16:48.120 | But I agree with you that
01:16:50.120 | we do consciousness engineering today in everything we do.
01:16:52.820 | When we talk to each other, when we're building products
01:16:55.760 | and we're entering into a stage
01:16:59.780 | where it will be much more methodical
01:17:02.200 | and quantitative based and computational
01:17:06.200 | in how we go about doing it,
01:17:07.280 | which to me I find encouraging
01:17:09.120 | because I think it creates better guardrails
01:17:12.320 | to create ethical systems versus right now,
01:17:18.780 | I feel like it's really a wild, wild west
01:17:21.200 | on how these interactions are happening.
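As a toy illustration of the "50-dimensional vector space model" of an emotional state mentioned above, here is a minimal Python sketch that compares two hypothetical state vectors with cosine similarity. The vectors are random stand-ins, not anything decoded from real brain data, and nothing here reflects how Kernel actually represents brain states.

```python
import math
import random

DIMENSIONS = 50  # matches the "50-dimensional vector space model" above; arbitrary here

def random_state(seed):
    """Stand-in for an emotion-state embedding decoded from brain data."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(DIMENSIONS)]

def cosine_similarity(a, b):
    """1.0 means the two states point the same way in the embedding space."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

if __name__ == "__main__":
    lex_joy = random_state(seed=1)    # hypothetical "Lex looking at the cup"
    bryan_joy = random_state(seed=2)  # hypothetical "Bryan's version of joy"
    print(f"similarity: {cosine_similarity(lex_joy, bryan_joy):.3f}")
```

The interesting part of the conversation is not the arithmetic, which is trivial, but the claim that shared representations like this could raise the information transfer rate of human communication well beyond what text and emojis carry.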
01:17:23.440 | - Yeah, and it's funny you focus on human to human,
01:17:25.600 | but this kind of data enables
01:17:27.800 | human to machine interaction,
01:17:30.480 | which is what we're kind of talking about
01:17:33.560 | when we say engineering consciousness.
01:17:35.460 | - And that will happen, of course,
01:17:39.080 | let's flip that on its head.
01:17:40.560 | Right now we're putting humans as the central node.
01:17:44.560 | What if we gave GPT-3 a bunch of human brains
01:17:48.640 | and said, "Hey, GPT-3, learn some manners when you speak
01:17:54.120 | and run your algorithms on humans' brains
01:17:56.560 | and see how they respond so you can be polite
01:18:00.380 | and so that you can be friendly
01:18:01.760 | and so that you can be conversationally appropriate."
01:18:04.680 | But to inverse it, to give our machines a training set
01:18:09.280 | in real time with closed loop feedback
01:18:11.840 | so that our machines were better equipped
01:18:14.700 | to find their way through our society
01:18:19.700 | in polite and kind and appropriate ways.
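A minimal sketch of that closed-loop idea follows, with everything hypothetical: no language model is actually called, and the measured human response is simulated by a placeholder function with a hidden preference. It only shows the shape of the loop, a behavioral parameter being nudged toward whatever the measured listener responds to best.

```python
import random

PREFERRED_POLITENESS = 0.8  # hidden "ground truth" the agent cannot observe directly

def simulated_brain_response(politeness):
    """Placeholder for a real-time measurement of how a listener reacts.
    Peaks when the agent's politeness matches the listener's preference."""
    return 1.0 - abs(politeness - PREFERRED_POLITENESS) + random.gauss(0.0, 0.02)

def closed_loop_tuning(steps=200, step_size=0.05):
    """Simple hill climb: propose a small change to the politeness setting
    and keep it only if the measured response improves."""
    politeness = 0.2
    best = simulated_brain_response(politeness)
    for _ in range(steps):
        candidate = min(max(politeness + random.uniform(-step_size, step_size), 0.0), 1.0)
        response = simulated_brain_response(candidate)
        if response > best:
            politeness, best = candidate, response
    return politeness

if __name__ == "__main__":
    random.seed(0)
    print(f"tuned politeness setting: {closed_loop_tuning():.2f}")
```

In a real system the feedback would come from something like the brain measurements discussed earlier rather than a simulation, and the tuned behavior would be far richer than a single politeness number.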
01:18:22.400 | - I love that idea, or better yet,
01:18:24.540 | teach it some, have it read the founding documents
01:18:29.540 | and have it visit Austin, Texas,
01:18:32.120 | and so that when you tell it,
01:18:34.540 | "Why don't you learn some manners?"
01:18:36.600 | GPT-3 learns to say no.
01:18:38.720 | It learns what it means to be free
01:18:43.600 | and a sovereign individual.
01:18:45.320 | So it depends, so it depends
01:18:47.880 | what kind of version of GPT-3 you want.
01:18:50.200 | One that's free, one that behaves well
01:18:52.240 | with the social... - It'd be met by a revolution.
01:18:54.760 | - You want a socialist GPT-3,
01:18:57.680 | you want an anarchist GPT-3,
01:19:00.200 | you want a polite, like you take it home
01:19:03.320 | to visit mom and dad GPT-3,
01:19:05.640 | and you want like party in like Vegas
01:19:08.440 | to a strip club GPT-3, you want all flavors.
01:19:11.200 | - And then you've gotta have goal alignment
01:19:13.240 | between all those.
01:19:14.200 | - Yeah, you don't want them to manipulate each other, for sure.
01:19:20.880 | - So that's, I mean, you kind of spoke to ethics.
01:19:24.640 | One of the concerns that people have
01:19:26.400 | in this modern world, the digital data,
01:19:29.440 | is that of privacy and security.
01:19:32.000 | But privacy, you know, they're concerned
01:19:34.360 | that when they share data,
01:19:35.920 | it's the same thing as
01:19:38.480 | when we trust other human beings,
01:19:40.840 | being fragile and revealing something
01:19:44.200 | that we're vulnerable about.
01:19:48.200 | There's a leap of faith, there's a leap of trust
01:19:51.960 | that that's going to be just between us,
01:19:54.240 | there's a privacy to it.
01:19:55.360 | And then the challenge is when you're in the digital space,
01:19:58.280 | then sharing your data with companies
01:20:01.680 | that use that data for advertisement
01:20:03.680 | and all those kinds of things,
01:20:05.080 | there's a hesitancy to share that much data,
01:20:08.440 | to share a lot of deep personal data.
01:20:10.720 | And if you look at brain data,
01:20:12.800 | that feels a whole lot like it's richly,
01:20:16.040 | deeply personal data.
01:20:17.800 | So how do you think about privacy
01:20:20.080 | with this kind of ocean of data?
01:20:22.160 | - I think we got off to a wrong start with the internet,
01:20:26.120 | where the basic rules of play
01:20:29.600 | for the companies that be were,
01:20:32.960 | if you're a company, you can go out
01:20:36.000 | and get as much information on a person
01:20:39.120 | as you can find without their approval.
01:20:42.960 | And you can also do things to induce them
01:20:46.360 | to give you as much information.
01:20:48.120 | And you don't need to tell them what you're doing with it.
01:20:51.000 | You can do anything on the backside,
01:20:52.360 | you can make money on it,
01:20:53.880 | but the game is who can acquire the most information
01:20:56.840 | and devise the most clever schemes to do it.
01:21:00.440 | That was a bad starting place.
01:21:01.960 | And so we are in this period
01:21:05.760 | where we need to correct for that.
01:21:07.600 | And we need to say, first of all,
01:21:09.800 | the individual always has control over their data.
01:21:14.800 | It's not a free for all.
01:21:15.880 | It's not like a game of hungry hippo,
01:21:17.560 | but they can just go out and grab as much as they want.
01:21:20.240 | So for example, when your brain data was recorded today,
01:21:22.520 | the first thing we did in the kernel app
01:21:24.280 | was you have control over your data.
01:21:27.760 | And so it's individual consent, it's individual control,
01:21:31.360 | and then you can build up on top of that.
01:21:33.320 | But it has to be based upon some clear rules of play.
01:21:36.520 | If everyone knows what's being collected,
01:21:39.240 | they know what's being done with it,
01:21:40.560 | and the person has control over it.
01:21:42.200 | - So transparency and control.
01:21:43.800 | So everybody knows, what does control look like?
01:21:46.960 | My ability to delete the data if I want.
01:21:48.920 | - Yeah, delete it and to know who it's being shared with
01:21:51.080 | under what terms and conditions.
01:21:53.240 | We haven't reached that level of sophistication
01:21:55.800 | with our products of, if you say, for example,
01:21:59.840 | hey, Spotify, please give me a customized playlist
01:22:04.400 | according to my neurons.
01:22:06.320 | You could say, you can have access
01:22:09.040 | to this vector space model,
01:22:10.800 | but only for this duration of time.
01:22:12.400 | And then you've got to delete it.
01:22:15.280 | We haven't gotten to that level of sophistication yet,
01:22:16.960 | but these are ideas we need to start talking about:
01:22:19.360 | how would you actually structure permissions?
01:22:22.240 | - Yeah.
01:22:23.080 | - And I think it creates a much more stable set
01:22:26.280 | for society to build upon, where we understand the rules of play
01:22:31.120 | and people aren't vulnerable to being taken advantage of.
01:22:33.880 | It's not fair for an individual to be taken advantage of
01:22:39.760 | without their awareness, with some other practice
01:22:42.240 | that some company is doing for their sole benefit.
01:22:44.640 | And so hopefully we are going through a process now
01:22:46.680 | where we're correcting for these things
01:22:48.240 | and that it can be an economy wide shift
01:22:53.240 | that, because really these are,
01:22:56.160 | these are fundamentals we need to have in place.
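To give the permission idea a concrete shape, here is a minimal sketch of a time-limited, revocable data-access grant in Python. The class and field names are invented for illustration and are not Kernel's actual data model or API; the point is only that consent can be scoped to a recipient and a purpose, given an expiry, and withdrawn at any time by the data owner.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DataGrant:
    """Consent record: who may use which data stream, for what, and until when."""
    recipient: str       # e.g. "spotify"
    scope: str           # e.g. "playlist-personalization vector"
    expires_at: datetime
    revoked: bool = False

    def is_active(self, now=None):
        now = now or datetime.utcnow()
        return not self.revoked and now < self.expires_at

    def revoke(self):
        self.revoked = True

def access_allowed(grants, recipient, scope):
    """The data owner's side of the check: no active grant, no access."""
    return any(g.recipient == recipient and g.scope == scope and g.is_active()
               for g in grants)

if __name__ == "__main__":
    grants = [DataGrant("spotify", "playlist-personalization vector",
                        datetime.utcnow() + timedelta(days=7))]
    print(access_allowed(grants, "spotify", "playlist-personalization vector"))  # True
    grants[0].revoke()
    print(access_allowed(grants, "spotify", "playlist-personalization vector"))  # False
```

The Spotify playlist example above maps directly onto this: a grant scoped to a playlist-personalization purpose that simply stops working after its window closes or when the owner revokes it.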
01:23:01.280 | - It's kind of fun to think about like in Chrome,
01:23:05.000 | when you install an extension or like install an app,
01:23:07.800 | it asks you like what permissions you're willing to give
01:23:10.560 | and it'd be cool if in the future,
01:23:11.680 | it says like you can have access to my brain data.
01:23:15.120 | - I mean, it's not unimaginable in the future
01:23:20.400 | that the big technology companies have built a business
01:23:24.440 | based upon acquiring data about you
01:23:26.200 | that they can then use to create a model of you
01:23:27.720 | and sell that predictability.
01:23:29.480 | And so it's not unimaginable that you will create
01:23:31.520 | with like Kernel device, for example,
01:23:33.920 | a more reliable predictor of you than they could.
01:23:37.280 | And that they're asking you for permission
01:23:39.200 | to complete their objectives.
01:23:40.240 | And you're the one that gets to negotiate that with them
01:23:42.640 | and say, sure.
01:23:44.000 | But it's not unimaginable that might be the case.
01:23:47.760 | - So there's a guy named Elon Musk
01:23:51.760 | and he has a company,
01:23:52.800 | one of his many companies, called Neuralink,
01:23:55.400 | that has, that's also excited about the brain.
01:23:59.200 | So it'd be interesting to hear your kind of opinions
01:24:01.480 | about a very different approach that's invasive,
01:24:03.840 | that requires surgery,
01:24:05.520 | that implants a data collection device in the brain.
01:24:09.080 | How do you think about the difference
01:24:10.240 | between Kernel and Neuralink in the approaches
01:24:12.760 | of getting that stream of brain data?
01:24:17.480 | - Elon and I spoke about this a lot early on.
01:24:20.480 | We met up, I had started Kernel
01:24:23.040 | and he had an interest in brain interfaces as well.
01:24:25.600 | And we explored doing something together,
01:24:28.360 | him joining Kernel.
01:24:29.200 | And ultimately it wasn't the right move.
01:24:31.920 | And so he started Neuralink
01:24:33.200 | and I continued building Kernel.
01:24:35.360 | But it was interesting because we were both
01:24:39.160 | at this very early time where it wasn't certain
01:24:44.160 | if there was a path to pursue,
01:24:47.720 | if now was the right time to do something
01:24:51.080 | and then the technological choice of doing that.
01:24:53.080 | And so we were both,
01:24:55.000 | our starting point was looking at invasive technologies.
01:24:58.240 | And I was building invasive technology at the time.
01:25:01.400 | That's ultimately where he's gone.
01:25:04.920 | A little less than a year after Elon and I were engaged,
01:25:08.840 | I shifted Kernel to do non-invasive.
01:25:11.480 | And we had this neuroscientist come to Kernel.
01:25:15.200 | We were talking about,
01:25:16.040 | he had been doing neurosurgery for 30 years,
01:25:17.680 | one of the most respected neuroscientists in the US.
01:25:20.160 | And we brought him to Kernel to figure out
01:25:21.900 | the ins and outs of his profession.
01:25:23.520 | And at the very end of our three hour conversation,
01:25:26.160 | he said, "You know, every 15 or so years,
01:25:30.240 | "a new technology comes along that changes everything."
01:25:34.760 | He said, "It's probably already here.
01:25:37.280 | "You just can't see it yet."
01:25:38.680 | And my jaw dropped.
01:25:40.800 | I thought, 'cause I had spoken to Bob Greenberg
01:25:44.560 | who had built Second Sight, first on the optic nerve,
01:25:48.240 | and then he did an array on the visual cortex.
01:25:53.240 | And then I also became friendly with NeuroPace,
01:25:57.240 | who does the implants for seizure detection and remediation.
01:26:01.520 | And I saw in their eyes what it was like
01:26:06.520 | to take something through,
01:26:08.960 | an implantable device, for a 15-year run.
01:26:11.720 | They initially thought it's seven years
01:26:13.760 | and it ended up being 15 years
01:26:15.120 | and they thought it'd be a hundred million,
01:26:16.200 | it was 300 or 400 million.
01:26:18.960 | And I really didn't want to build invasive technology.
01:26:22.260 | It was the only thing that appeared to be possible.
01:26:25.680 | But then once I spun up an internal effort
01:26:28.620 | to start looking at non-invasive options,
01:26:30.380 | we said, "Is there something here?
01:26:31.920 | "Is there anything here that again has the characteristics
01:26:34.620 | "of it has the high quality data,
01:26:37.220 | "it could be low cost, it could be accessible.
01:26:39.160 | "Could it make brain interfaces mainstream?"
01:26:42.200 | And so I did a bet the company move.
01:26:43.860 | We shifted from invasive to non-invasive.
01:26:47.520 | - So the answer is yes to that.
01:26:49.040 | There is something there, it's possible.
01:26:51.600 | - The answer is we'll see.
01:26:52.700 | We've now built both technologies
01:26:55.100 | and they're now, you experienced one of them today.
01:26:58.100 | We're now deploying it.
01:27:02.560 | So we're trying to figure out what value is really there.
01:27:04.640 | But I'd say it's really too early to express confidence.
01:27:07.800 | I think it's too early to assess
01:27:12.680 | which technological choice is the right one
01:27:17.600 | on what timescales.
01:27:19.240 | - Yeah, timescales are really important here.
01:27:20.640 | - Very important.
01:27:21.480 | Because if you look at the, like on the invasive side,
01:27:24.280 | there's so much activity going on right now
01:27:27.000 | of less invasive techniques to get at the neuron firings,
01:27:32.000 | which what Neuralink is building,
01:27:36.840 | it's possible that in 10, 15 years,
01:27:39.840 | when they're scaling that technology,
01:27:40.960 | other things have come along,
01:27:42.480 | and you'd much rather do that thing,
01:27:44.020 | and that thing starts its clock again.
01:27:45.720 | It may not be the case.
01:27:47.480 | It may be the case that Neuralink has properly chosen
01:27:49.760 | the right technology and that that's exactly
01:27:52.280 | what they wanna be, totally possible.
01:27:53.940 | And it's also possible that the paths we've chosen
01:27:55.720 | for non-invasive fall short for a variety of reasons.
01:27:58.680 | It's just, it's unknown.
01:28:00.640 | And so right now, the two technologies we chose,
01:28:03.240 | the analogy I'd give you to create a baseline
01:28:06.360 | of understanding is, if you think of it
01:28:09.040 | like the internet in the '90s,
01:28:11.360 | the internet became useful when people could do
01:28:13.920 | a dial-up connection.
01:28:15.800 | And then, as bandwidth increased,
01:28:19.360 | so did the utility of that connection,
01:28:21.120 | and so did the ecosystem improve.
01:28:22.920 | And so if you say, what Kernel Flow is going to give you is
01:28:27.920 | a full-screen picture of the information,
01:28:30.720 | as if you're gonna be watching a movie,
01:28:32.480 | but the image is going to be blurred
01:28:34.600 | and the audio is gonna be muffled.
01:28:36.300 | So it has a lower resolution of coverage.
01:28:39.780 | Kernel Flux, our MEG technology,
01:28:43.480 | is gonna give you the full movie in 1080p.
01:28:46.140 | And Neuralink is gonna give you a circle on the screen
01:28:52.720 | of 4K.
01:28:53.560 | And so each one has their pros and cons,
01:28:57.760 | and it's give and take.
01:28:59.920 | And so the decision I made with Kernel was
01:29:03.600 | that these two technologies, Flux and Flow,
01:29:06.280 | were basically the answer for the next seven years.
01:29:10.560 | And they would give rise to the ecosystem,
01:29:12.640 | which would become much more valuable
01:29:14.000 | than the hardware itself,
01:29:15.420 | and that we would just continue to improve
01:29:16.620 | on the hardware over time.
01:29:18.480 | And you know, it's early days, so--
01:29:20.880 | - It's kind of fascinating to think about that.
01:29:23.520 | You don't, it's very true that you don't know.
01:29:27.120 | Both paths are very promising.
01:29:30.880 | And it's like 50 years from now,
01:29:36.240 | we will look back and maybe not even remember one of them.
01:29:39.640 | And the other one might change the world.
01:29:43.120 | It's so cool how technology is.
01:29:44.840 | I mean, that's what entrepreneurship is.
01:29:47.000 | It's like, it's the zeroth principle.
01:29:50.040 | It's like you're marching ahead into the darkness,
01:29:52.360 | into the fog, not knowing.
01:29:54.720 | - It's wonderful to have someone else
01:29:56.360 | out there with us doing this.
01:29:57.920 | Because if you look at brain interfaces,
01:30:00.520 | anything that's off the shelf right now is inadequate.
01:30:06.520 | It's had its run for a couple of decades.
01:30:09.560 | It's still in hacker communities.
01:30:11.240 | It hasn't gone to the mainstream.
01:30:12.880 | The room-sized machines are on their own path.
01:30:19.040 | But there is no answer right now
01:30:20.560 | of bringing brain interfaces mainstream.
01:30:23.600 | And so it both, both they and us,
01:30:27.000 | we've both spent over $100 million.
01:30:29.600 | And that's kind of what it takes to have a go at this.
01:30:32.720 | 'Cause you need to build full stack.
01:30:34.920 | I mean, at Kernel, we are from the photon and the atom
01:30:38.560 | through the machine learning.
01:30:40.020 | We have just under 100 people,
01:30:41.520 | I think it's something like 36, 37 PhDs
01:30:45.100 | in these specialties, these areas,
01:30:47.440 | where there's only a few people in the world
01:30:48.680 | who have these abilities.
01:30:50.240 | And that's what it takes to build next generation,
01:30:53.200 | to make an attempt at breaking into brain interfaces.
01:30:57.560 | And so we'll see over the next couple of years,
01:30:58.940 | whether it's the right time
01:31:00.080 | or whether we were both too early
01:31:01.680 | or whether something else comes along in seven to 10 years,
01:31:04.440 | which is the right thing that brings it mainstream.
01:31:08.040 | - So you see Elon as a kind of competitor
01:31:11.080 | or a fellow traveler along the path of uncertainty,
01:31:16.080 | or both?
01:31:17.160 | - Fellow traveler.
01:31:19.000 | It's like at the beginning of the internet
01:31:21.800 | is how many companies are going to be invited
01:31:25.480 | to this new ecosystem?
01:31:27.480 | Like an endless number.
01:31:30.520 | Because if you think the hardware just starts the process.
01:31:35.520 | And so, okay, back to your initial example,
01:31:37.960 | if you take the Fitbit, for example,
01:31:39.400 | you say, okay, now I can get measurements on the body.
01:31:42.840 | And what do we think the ultimate value
01:31:44.840 | of this device is going to be?
01:31:45.800 | What is the information transfer rate?
01:31:47.760 | And they were in the market for a certain duration of time
01:31:50.120 | and Google bought them for two and a half billion dollars.
01:31:53.840 | They didn't have ancillary value add.
01:31:55.600 | There weren't people building on top of the Fitbit device.
01:31:58.520 | They also didn't have increased insight
01:32:01.240 | with additional data streams.
01:32:02.400 | So it was really just the device.
01:32:04.240 | If you look, for example, at Apple
01:32:05.840 | and the device they sell,
01:32:07.120 | you have value in the device that someone buys,
01:32:09.160 | but also you have everyone who's building on top of it.
01:32:11.760 | So you have this additional ecosystem value.
01:32:13.440 | And then you have additional data streams that come in,
01:32:15.600 | which increase the value of the product.
01:32:17.480 | And so if you say, if you look at the hardware
01:32:20.600 | as the instigator of value creation,
01:32:24.440 | over time, what we've built may constitute
01:32:27.360 | five or 10% of the value of the overall ecosystem.
01:32:29.800 | And that's what we really care about.
01:32:30.960 | What we're trying to do is kickstart
01:32:33.280 | the mainstream adoption of quantifying the brain.
01:32:38.280 | And the hardware just opens the door to say
01:32:41.840 | what kind of ecosystem could exist.
01:32:45.000 | And that's why the examples are so relevant
01:32:47.520 | of the things you've outlined in your life.
01:32:50.120 | I hope those things, the books people write,
01:32:53.640 | the experiences people build,
01:32:54.960 | the conversations you have,
01:32:56.480 | your relationship with your AI systems,
01:32:59.080 | I hope those all are feeding on the insights
01:33:02.040 | built upon this ecosystem we've created to better your life.
01:33:05.080 | And so that's the thinking behind it.
01:33:07.080 | Again, with the Drake equation
01:33:08.560 | being the underlying driver of value.
01:33:11.880 | And the people at Kernel have joined
01:33:14.960 | not because we have certainty of success,
01:33:19.000 | but because we find it to be
01:33:21.520 | the most exhilarating opportunity we could ever pursue
01:33:24.080 | in this time to be alive.
01:33:26.520 | - You founded the payment system Braintree in 2007
01:33:32.080 | that acquired Venmo in 2012.
01:33:35.160 | And then in 2013 it was acquired by PayPal,
01:33:39.120 | which was then part of eBay.
01:33:40.400 | Can you tell me the story of the vision
01:33:45.400 | and the challenge of building an online payment system
01:33:48.120 | and just building a large successful business in general?
01:33:51.520 | - I discovered payments by accident.
01:33:54.320 | When I was 21,
01:33:56.600 | I had just returned from Ecuador
01:34:00.000 | living among extreme poverty for two years.
01:34:02.720 | And I came back to the US and I was shocked by the opulence
01:34:06.800 | and the of the United States.
01:34:09.480 | And I just thought this is, I couldn't believe it.
01:34:12.040 | And I decided I wanted to try to spend my life
01:34:14.360 | helping others.
01:34:16.720 | Like that was the life objective
01:34:18.640 | that I thought was worthwhile to pursue
01:34:20.360 | versus making money and whatever the case may be
01:34:23.080 | for its own right.
01:34:24.520 | And so I decided in that moment
01:34:25.600 | that I was going to try to make enough money
01:34:28.680 | by the age of 30 to never have to work again.
01:34:32.200 | And then with some abundance of money,
01:34:33.960 | I could then choose to do things
01:34:36.800 | that might be beneficial to others
01:34:38.920 | but may not meet the criteria of being
01:34:41.640 | a standalone business.
01:34:42.480 | And so in that process, I started a few companies,
01:34:46.200 | had some small successes, had some failures.
01:34:49.800 | In one of the endeavors, I was up to my eyeballs in debt.
01:34:53.040 | Things were not going well
01:34:54.080 | and I needed a part-time job to pay my bills.
01:34:57.880 | And so I, one day I saw in the paper in Utah
01:35:02.880 | where I was living, the 50 richest people in Utah.
01:35:05.760 | And I emailed each one of their assistants and said,
01:35:08.240 | you know, I'm young, I'm resourceful,
01:35:10.440 | I'll do anything, I'm entrepreneurial.
01:35:13.960 | I tried to get a job that would be flexible
01:35:15.840 | and no one responded.
01:35:17.760 | And then I interviewed at a few dozen places.
01:35:21.600 | Nobody would even give me the time of day.
01:35:23.640 | Like, they wouldn't even take me seriously.
01:35:25.720 | And so finally, it was on monster.com
01:35:28.040 | that I saw this job posting for credit card sales
01:35:31.280 | door to door.
01:35:32.120 | Commission. (laughs)
01:35:34.720 | - I did not know this story.
01:35:36.080 | This is great.
01:35:36.920 | - I love the head drop.
01:35:38.000 | That's exactly right.
01:35:39.000 | So it was-
01:35:39.840 | - The low points to which we're going like.
01:35:43.360 | - So I responded and, you know,
01:35:45.760 | the person made an attempt at suggesting
01:35:48.800 | that they had some kind of standards
01:35:50.320 | that they would consider hiring.
01:35:52.000 | But it's kind of like, if you could fog a mirror,
01:35:55.000 | like come and do this because it's 100% commission.
01:35:57.240 | And so I started walking up and down the street
01:35:59.200 | in my community, selling credit card processing.
01:36:02.440 | And so what you learn immediately in doing that is
01:36:05.360 | if you walk into a business,
01:36:07.680 | first of all, the business owner is typically there.
01:36:10.360 | And you walk in the door and they can tell
01:36:12.840 | by how you're dressed or how you walk,
01:36:14.040 | whatever their pattern recognition is.
01:36:16.440 | And they just hate you immediately.
01:36:17.640 | It's like, stop wasting my time.
01:36:18.960 | I really am trying to get stuff done.
01:36:20.360 | I don't want to listen to a sales pitch.
01:36:21.800 | And so you have to overcome the initial "get out."
01:36:25.200 | And then once you engage,
01:36:27.360 | when you say the word credit card processing,
01:36:29.880 | the person's like, I already hate you
01:36:31.240 | because I have been taken advantage of dozens of times
01:36:34.200 | because you all are weasels.
01:36:37.320 | And so I had to figure out an algorithm
01:36:39.360 | to get past all those different conditions
01:36:41.080 | 'cause I was still working on my other startup
01:36:43.320 | for the majority of my time.
01:36:44.480 | So I was doing this part time.
01:36:45.960 | And so I figured out that the industry really was
01:36:50.040 | built on deceit.
01:36:55.040 | Basically people promising things that were not reality.
01:36:58.480 | And so I'd walk into a business, I'd say, look,
01:37:00.840 | I would give you a hundred dollars.
01:37:01.760 | I'd put a hundred dollar bill and say,
01:37:02.920 | I'll give you a hundred dollars
01:37:04.000 | for three minutes of your time.
01:37:05.440 | If you don't say yes to what I'm saying,
01:37:06.760 | I'll give you a hundred dollars.
01:37:08.400 | And then they'd usually crack a smile and say, okay,
01:37:10.680 | let's see what you've got for me, son.
01:37:12.640 | And so I'd sit down, I just opened my book and I'd say,
01:37:15.000 | here's the credit card industry.
01:37:16.280 | Here's how it works.
01:37:17.120 | Here are the players.
01:37:17.960 | Here's what they do.
01:37:18.780 | Here's how they deceive you.
01:37:20.520 | Here's what I am.
01:37:21.360 | I'm no different than anyone else.
01:37:22.640 | It's like, you're gonna process your credit card.
01:37:24.040 | You're gonna get the money in the account.
01:37:25.040 | You're just gonna get a clean statement.
01:37:26.360 | You're gonna have someone who answers the call
01:37:28.280 | when you call, and just the basics, like, you're okay.
01:37:31.760 | And people started saying yes.
01:37:32.800 | And then of course I went to the next business
01:37:34.120 | and I'd be like, you know, Joe and Susie
01:37:36.280 | and whoever else said yes,
01:37:37.520 | and so I built a social proof structure
01:37:39.400 | and I became the number one salesperson
01:37:42.400 | out of 400 people nationwide doing this.
01:37:45.400 | And I worked, you know, half time
01:37:47.080 | still doing this other startup.
01:37:48.700 | And--
01:37:49.540 | - That's a brilliant strategy, by the way.
01:37:51.080 | It's very well, very well strategized and executed.
01:37:54.840 | - I did it for nine months.
01:37:57.640 | And at the time my customer base
01:38:00.800 | was generating,
01:38:03.680 | if I remember correctly, $62,504 a month
01:38:07.480 | in overall revenues.
01:38:08.520 | I thought, wow, that's amazing.
01:38:09.920 | If I built that as my own company,
01:38:13.240 | I would just make $62,000 a month of income passively
01:38:16.880 | with these merchants processing credit cards.
01:38:19.280 | So I thought, hmm.
01:38:20.520 | And so that's when I thought I'm gonna create a company.
01:38:23.720 | And so then I started Braintree
01:38:25.860 | and the idea was the online world was broken
01:38:29.920 | because PayPal had been acquired by eBay
01:38:34.920 | around, I think, 1999 or 2000.
01:38:38.160 | And eBay had not innovated much with PayPal.
01:38:39.840 | So it basically sat still for seven years
01:38:42.880 | as the software world moved along.
01:38:45.160 | And then Authorize.net was also a company
01:38:46.840 | that was relatively stagnant.
01:38:47.660 | So you basically had software engineers
01:38:49.840 | who wanted modern payment tools,
01:38:51.860 | but there were none available for them.
01:38:53.540 | And so they just dealt with software they didn't like.
01:38:55.060 | And so with Braintree, I thought the entry point
01:38:58.040 | is to build software that engineers will love.
01:39:00.400 | And if we can find the entry point via software
01:39:02.160 | and make it easy and beautiful and just a magical experience
01:39:05.400 | and then provide customer service on top of that,
01:39:06.960 | that was easy, that would be great.
01:39:08.720 | What I was really going after though, it was PayPal.
01:39:11.560 | They were the only company in payments making money
01:39:14.080 | because they had the relationship with eBay early on.
01:39:19.820 | People created a PayPal account.
01:39:22.240 | They'd fund their account with their checking account
01:39:24.080 | versus their credit cards.
01:39:25.480 | And then when they'd use PayPal to pay a merchant,
01:39:28.320 | PayPal had a cost of payment of zero
01:39:31.000 | versus if you have coming from a credit card,
01:39:33.880 | you have to pay the bank the fees.
01:39:35.120 | So PayPal's margins were 3% on a transaction
01:39:39.920 | versus a typical payments company,
01:39:41.600 | which may be a nickel or a penny or a dime
01:39:43.360 | or something like that.
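As a rough illustration of the margin asymmetry being described here, a back-of-the-envelope comparison of the two funding models might look like the sketch below. The specific fee and interchange rates are assumptions for illustration only; the conversation itself only gives the roughly 3% figure and the "nickel or a penny or a dime."

```python
# Illustrative sketch only: rough per-transaction economics of the two funding
# models described above. All specific rates are assumptions, not real pricing.

def gross_margin(amount, fee_pct=0.03, card_funded=True, interchange_pct=0.027):
    """Processor's gross margin on one transaction, in dollars."""
    revenue = amount * fee_pct            # what the merchant is charged (~3%)
    # If the payment is funded from a stored balance or checking account,
    # the cost of moving the money is close to zero; if it is funded by a
    # credit card, most of the fee is passed through to the card networks
    # and the issuing bank as interchange.
    cost = amount * interchange_pct if card_funded else 0.0
    return revenue - cost

if __name__ == "__main__":
    amount = 100.00
    print(f"Card-funded margin on ${amount:.0f}:    ${gross_margin(amount, card_funded=True):.2f}")
    print(f"Balance-funded margin on ${amount:.0f}: ${gross_margin(amount, card_funded=False):.2f}")
```

With those assumed numbers, a $100 card-funded payment leaves the processor only a few dimes, while a balance-funded payment keeps roughly the full 3%, which is the asymmetry being described.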
01:39:44.920 | And so I knew PayPal really was the model to replicate,
01:39:48.320 | but a bunch of companies had tried to do that.
01:39:50.400 | They tried to come in and build a two-sided marketplace.
01:39:52.580 | So get consumers to fund the checking account
01:39:55.280 | and merchants to accept it,
01:39:56.640 | but they'd all failed because building both sides of a
01:39:58.640 | two-sided marketplace at the same time is very hard.
01:40:01.760 | So my plan was I'm going to build a company
01:40:04.520 | and get the best merchants in the whole world
01:40:06.840 | to use our service.
01:40:08.360 | Then in year five, I'm going to acquire
01:40:11.560 | a consumer payments company
01:40:12.720 | and I'm going to bring the two together.
01:40:15.720 | - So focus on the merchant side.
01:40:18.080 | - Exactly.
01:40:18.920 | - And then get the payments company
01:40:19.840 | that does the customer, the whatever,
01:40:22.480 | the other side of it.
01:40:24.240 | - Yeah, this is the plan I presented
01:40:26.000 | when I was at the University of Chicago.
01:40:28.440 | And weirdly it happened exactly like that.
01:40:32.360 | So four years in our customer base included Uber,
01:40:36.160 | Airbnb, GitHub, 37signals, now Basecamp.
01:40:40.240 | We had a fantastic collection of companies
01:40:43.640 | that represented some of
01:40:47.200 | the fastest growing tech companies in the world.
01:40:49.320 | And then we met up with Venmo
01:40:52.080 | and they had done a remarkable job in building product.
01:40:55.080 | They then did something very counterintuitive,
01:40:56.840 | which is make public your private financial transactions
01:40:59.440 | with people, which previously were thought to be something
01:41:01.800 | that should be hidden from others.
01:41:04.000 | And we acquired Venmo.
01:41:05.400 | And at that point we now had,
01:41:08.800 | we replicated the model
01:41:11.000 | 'cause now people could fund their Venmo account
01:41:12.480 | with their checking account, keep money in the account.
01:41:15.000 | And then you could just plug Venmo in as a form of payment.
01:41:17.560 | And so I think PayPal saw that,
01:41:19.920 | that we were getting the best merchants in the world.
01:41:22.760 | We had people using Venmo.
01:41:25.480 | They were both the up and coming millennials at the time
01:41:28.160 | who had so much influence online.
01:41:30.480 | And so they came in and offered us an attractive number.
01:41:34.880 | And my goal was not to build
01:41:39.280 | the biggest payments company in the world.
01:41:40.960 | It wasn't to try to climb the Forbes billionaire list.
01:41:44.160 | It was, the objective was I want to earn enough money
01:41:48.760 | so that I can basically dedicate my attention
01:41:52.640 | to doing something that could potentially be useful
01:41:56.240 | on a society-wide scale.
01:41:58.800 | And more importantly, that could be considered
01:42:01.880 | to be valuable from the vantage point of 2050, 2100 and 2500.
01:42:06.880 | So thinking about it on a few hundred year timescale.
01:42:13.120 | And there was a certain amount of money I needed to do that.
01:42:17.040 | So I didn't require the permission of anybody to do that.
01:42:21.000 | And so that, what PayPal offered was sufficient
01:42:23.080 | for me to get that amount of money to basically have a go.
01:42:25.760 | And that's when I set off to survey everything
01:42:30.760 | I could identify in existence to say
01:42:33.320 | of anything in the entire world I could do,
01:42:36.040 | what one thing could I do that would actually have
01:42:38.480 | the highest value potential for the species?
01:42:42.840 | And so it took me a little while to arrive at brain interfaces,
01:42:45.320 | - But payments in themselves are revolutionary technologies
01:42:50.320 | that can change the world.
01:42:54.640 | Like let's not sort of, let's not forget that too easily.
01:42:59.640 | I mean, obviously you know this,
01:43:03.160 | but there's quite a few lovely folks
01:43:08.160 | who are now fascinated with the space of cryptocurrency.
01:43:13.680 | And where payments are very much connected to this,
01:43:17.580 | but in general, just money.
01:43:19.200 | And many of the folks I've spoken with,
01:43:21.760 | they also kind of connect that to not just purely
01:43:25.240 | financial discussions, but philosophical
01:43:27.480 | and political discussions.
01:43:29.160 | And they see Bitcoin as a way, almost as activism,
01:43:34.160 | almost as a way to resist the corruption of centralized,
01:43:39.040 | of centers of power.
01:43:40.760 | And sort of basically in the 21st century,
01:43:42.800 | decentralizing control, whether that's Bitcoin
01:43:45.480 | or other cryptocurrencies, they see that's one possible way
01:43:50.200 | to give power to those that live in regimes
01:43:54.560 | that are corrupt or are not respectful of human rights
01:43:58.280 | and all those kinds of things.
01:43:59.780 | What's your sense, just all your expertise with payments
01:44:04.000 | and seeing how that changed the world,
01:44:05.640 | what's your sense about the lay of the land
01:44:09.720 | for the future of Bitcoin or other cryptocurrencies
01:44:12.480 | and the positive impact it may have on the world?
01:44:15.800 | - Yeah, and to be clear, my communication wasn't meant
01:44:20.120 | to minimize payments or to denigrate it in any way.
01:44:23.520 | It was an attempted communication
01:44:26.480 | that when I was surveying the world,
01:44:29.360 | it was an algorithm of what could I individually do?
01:44:35.260 | So there are things that exist that have a lot of potential
01:44:39.640 | that can be done, and then there's a filtering
01:44:43.160 | of how many people are qualified to do this given thing.
01:44:46.440 | And then there's a further characterization
01:44:48.200 | that can be done of, okay, given the number of qualified
01:44:50.280 | people, will somebody be a unique outperformer
01:44:55.280 | of that group, to make possible something
01:44:57.960 | that otherwise couldn't get done?
01:44:59.520 | So there's a process of assessing
01:45:02.440 | where can you add unique value in the world.
01:45:04.460 | - And some of that has to do with you being very,
01:45:07.640 | very formal and calculative here,
01:45:09.640 | but some of that is just like what you sense,
01:45:13.640 | like part of that equation is how much passion you sense
01:45:16.560 | within yourself to be able to drive that through,
01:45:19.100 | to discover the impossibilities and make them possible.
01:45:21.520 | - That's right, and so, at Braintree,
01:45:23.940 | I think we were the first company to integrate Coinbase
01:45:26.880 | into our, I think we were the first payments company
01:45:30.080 | to formally incorporate crypto, if I'm not mistaken.
01:45:35.080 | - For people who are not familiar, Coinbase,
01:45:37.600 | is a place where you can trade cryptocurrencies.
01:45:39.640 | - Yeah, which was one of the only places you could.
01:45:42.080 | So we were early in doing that.
01:45:45.080 | And of course, this was in the year 2013.
01:45:49.080 | So an eternity ago in cryptocurrency land.
01:45:52.400 | I concur with the statement you made of the potential
01:45:57.400 | of the principles underlying cryptocurrencies.
01:46:04.200 | And that many of the things that they're building
01:46:07.640 | in the name of money and of moving value
01:46:12.640 | is equally applicable to the brain
01:46:16.400 | and equally applicable to how the brain interacts
01:46:19.360 | with the rest of the world
01:46:20.840 | and how we would imagine doing goal alignment with people.
01:46:25.600 | So to me, it's a continuous spectrum of possibility.
01:46:29.120 | And your question is isolated on the money.
01:46:32.240 | And I think it just is basically a scaffolding layer
01:46:35.040 | for all of society.
01:46:35.880 | - So you don't see this money
01:46:37.440 | as particularly distinct from-
01:46:38.960 | - I don't.
01:46:39.800 | I think we, at Kernel, we will benefit greatly
01:46:44.560 | from the progress being made in cryptocurrency
01:46:47.120 | because it will be a similar technology stack
01:46:49.000 | we will want to use for many things we want to accomplish.
01:46:51.720 | And so I'm bullish on what's going on
01:46:55.080 | and think it could greatly enhance brain interfaces
01:46:58.840 | and the value of the brain interface ecosystem.
01:47:01.200 | - I mean, is there something you could say about,
01:47:02.920 | first of all, bullish on cryptocurrency versus fiat money?
01:47:05.960 | So do you have a sense that in 21st century,
01:47:08.760 | cryptocurrency will be embraced by governments
01:47:13.040 | and change the face of governments,
01:47:16.740 | the structure of government?
01:47:18.520 | - It's the same way I think about my diet,
01:47:23.520 | where previously it was conscious Brian
01:47:28.700 | looking at foods in certain biochemical states.
01:47:32.260 | Am I hungry?
01:47:33.100 | Am I irritated?
01:47:33.920 | Am I depressed?
01:47:34.760 | And then I choose based upon those momentary windows.
01:47:37.260 | Do I eat at night when I'm fatigued?
01:47:39.460 | Did I have low willpower?
01:47:40.660 | Am I gonna pig out on something?
01:47:42.820 | And the current monetary system
01:47:45.620 | is based upon human conscious decision-making
01:47:48.360 | and politics and power and this whole mess of things.
01:47:51.500 | And what I like about the building blocks
01:47:55.340 | of cryptocurrency is it's methodical, it's structured.
01:47:58.620 | It is accountable, it's transparent.
01:48:02.140 | And so it introduces this scaffolding,
01:48:04.740 | which I think again is the right starting point
01:48:07.620 | for how we think about building next generation
01:48:11.140 | institutions for society.
01:48:13.500 | And that's why I think it's much broader than money.
01:48:16.500 | - So I guess what you're saying is Bitcoin
01:48:18.380 | is the demotion of the conscious mind as well.
01:48:21.300 | In the same way you were talking about diet,
01:48:25.260 | it's like giving less priority to the ups and downs
01:48:29.700 | of any one particular human mind, in this case your own,
01:48:33.020 | and giving more power to the sort of data-driven.
01:48:37.060 | - Yes, yeah, I think that is accurate.
01:48:41.220 | That cryptocurrency is a version of what I would call
01:48:46.220 | my autonomous self that I'm trying to build.
01:48:51.380 | It is an introduction of an autonomous system
01:48:54.980 | of value exchange and the process of value creation
01:48:59.980 | in society, yes, there's similarities.
01:49:04.900 | - So I guess what you're saying is Bitcoin
01:49:06.540 | will somehow help me not pig out at night
01:49:08.700 | or the equivalent of, speaking of diet,
01:49:11.820 | if we could just linger on that topic a little bit.
01:49:15.460 | We already talked about your blog post where you fired yourself,
01:49:20.100 | fired evening Brian,
01:49:23.180 | the Brian who was not making good decisions
01:49:28.180 | for the long-term well-being and happiness
01:49:32.220 | of the entirety of the organism.
01:49:34.860 | Basically you were like pigging out at night.
01:49:36.740 | But it's interesting, 'cause I do the same.
01:49:41.140 | In fact, I often eat one meal a day.
01:49:43.440 | And like I have been this week, actually,
01:49:49.420 | and especially when I travel.
01:49:52.100 | And it's funny that it never occurred to me
01:49:57.100 | to just basically look at the fact
01:50:00.100 | that I'm able to be much smarter about my eating decisions
01:50:03.660 | in the morning and the afternoon than I am at night.
01:50:06.700 | So if I eat one meal a day,
01:50:08.180 | why not eat that one meal a day in the morning?
01:50:11.360 | Like, it never occurred to me.
01:50:16.020 | This is revolutionary.
01:50:17.060 | (Lex laughing)
01:50:18.660 | Until you've outlined that.
01:50:21.220 | So maybe, can you give some details in what,
01:50:24.260 | this is just you, this is one person, Brian,
01:50:27.460 | arrived at a particular thing that they do.
01:50:29.780 | But it's fascinating to kinda look
01:50:32.540 | at this one particular case study.
01:50:34.080 | So what works for you, diet-wise?
01:50:36.340 | What's your actual diet?
01:50:37.700 | What do you eat?
01:50:38.780 | How often do you eat?
01:50:40.220 | - My current protocol is basically the result
01:50:44.220 | of thousands of experiments and decision-making.
01:50:48.780 | So I do this every 90 days.
01:50:50.820 | I do the tests, I do the cycle-throughs,
01:50:53.460 | then I measure again, and then I'm measuring all the time.
01:50:56.260 | And so what I, I, of course,
01:50:58.300 | I'm optimizing for my biomarkers.
01:51:00.340 | I want perfect cholesterol,
01:51:01.780 | and I want perfect blood glucose levels,
01:51:03.700 | and perfect DNA methylation processes.
01:51:07.840 | I also want perfect sleep.
01:51:12.440 | And so, for example, recently, in the past two weeks,
01:51:14.900 | my resting heart rate has been at 42 when I sleep.
01:51:20.140 | And when my resting heart rate's at 42,
01:51:22.100 | my HRV is at its highest.
01:51:24.100 | And I wake up in the morning feeling more energized
01:51:29.100 | than any other configuration.
01:51:30.380 | And so I know from all these processes
01:51:32.360 | that eating at roughly 8.30 in the morning,
01:51:34.680 | right after I work out on an empty stomach,
01:51:37.600 | creates enough distance between that completed eating
01:51:41.420 | and bedtime, where I have no,
01:51:43.860 | almost no digestion processes going on in my body,
01:51:47.320 | therefore my resting heart rate goes very low.
01:51:49.760 | And when my resting heart rate's very low,
01:51:51.340 | I sleep with high quality.
01:51:52.400 | And so basically I've been trying to optimize
01:51:54.340 | the entirety of what I eat to my sleep quality.
01:51:58.240 | And my sleep quality then, of course,
01:51:59.480 | feeds into my willpower, so it creates this virtuous cycle.
01:52:02.540 | And so at 8.30, what I do is I eat what I call super veggie,
01:52:06.420 | which is, it's a pudding of 250 grams of broccoli,
01:52:10.220 | 150 grams of cauliflower,
01:52:11.580 | and a whole bunch of other vegetables.
01:52:13.300 | Then I eat what I call nutty pudding, which is--
01:52:16.020 | - You make the pudding yourself, like the,
01:52:19.540 | what do you call it?
01:52:20.380 | Like a veggie mix, whatever thing, like a blender?
01:52:23.500 | - Yeah, it can be made in a high-speed blender.
01:52:25.640 | But basically I eat the same thing every day,
01:52:27.800 | veggie bowl, as in a form of pudding,
01:52:30.940 | and then a bowl in the form of nuts.
01:52:34.400 | And then I have--
01:52:35.240 | - So vegan?
01:52:36.280 | - Vegan, yes.
01:52:37.120 | - Vegan, so that's fat, and that's like,
01:52:39.660 | that's fat and carbs and protein and so on.
01:52:43.640 | - And then I have a third dish-- - Does it taste good?
01:52:45.260 | - I love it.
01:52:46.200 | I love it so much, I dream about it.
01:52:49.500 | - Yeah, that's awesome.
01:52:50.900 | This is a--
01:52:52.220 | - And then I have a third dish,
01:52:53.180 | which is, it changes every day.
01:52:55.300 | Today it was kale and spinach and sweet potato.
01:52:58.220 | And then I take about 20 supplements
01:53:02.140 | that hopefully
01:53:06.740 | constitute a perfect nutritional profile.
01:53:09.100 | So what I'm trying to do is create the perfect diet
01:53:12.040 | for my body every single day.
01:53:14.180 | - Where sleep is part of the optimization.
01:53:16.340 | - That's right.
01:53:17.180 | - You're like, one of the things you're really tracking.
01:53:18.380 | I mean, can you, well, I have a million questions,
01:53:21.060 | but 20 supplements, like what kind would you say
01:53:24.260 | are essential?
01:53:25.140 | 'Cause I only take,
01:53:26.420 | I only take athleticgreens.com/lex.
01:53:30.580 | That's like the multivitamin, essentially.
01:53:33.100 | That's like the lazy man, you know?
01:53:35.700 | Like if you don't actually wanna think about shit,
01:53:37.660 | that's what you take.
01:53:38.620 | And then fish oil, and that's it, that's all I take.
01:53:41.620 | - Yeah, you know, Alfred North Whitehead said,
01:53:45.060 | "Civilization advances as it extends
01:53:48.420 | the number of important operations it can do
01:53:50.540 | without thinking about them."
01:53:52.540 | - Yes.
01:53:53.380 | - And so my objective on this is,
01:53:55.680 | I want an algorithm for perfect health
01:53:59.140 | that I never have to think about.
01:54:00.900 | And that I want that system to be scalable to anybody,
01:54:03.840 | so that they don't have to think about it.
01:54:05.500 | And right now it's expensive for me to do it,
01:54:07.700 | it's time consuming for me to do it,
01:54:09.060 | and I have infrastructure to do it,
01:54:10.380 | but the future of being human
01:54:13.680 | is not going to the grocery store
01:54:16.020 | and deciding what to eat.
01:54:17.540 | It's also not reading scientific papers
01:54:19.540 | trying to decide this thing or that thing.
01:54:21.340 | It's all N of one.
01:54:23.180 | So it's devices on the outside and inside your body
01:54:26.060 | assessing real time what your body needs,
01:54:28.300 | and then creating closed loop systems for that to happen.
01:54:30.780 | - Yeah, so right now you're doing the data collection
01:54:33.740 | and you're being the scientist.
01:54:35.860 | It'd be much better if the data collection
01:54:39.060 | was essentially being done for you,
01:54:40.820 | and you can outsource that to another scientist
01:54:43.800 | that's doing the N of one study of you.
01:54:46.080 | - That's right, because every time I spend time
01:54:48.240 | thinking about this or executing, spending time on it,
01:54:50.440 | I'm spending less time thinking about
01:54:52.880 | building Kernel or the future of being human.
01:54:55.360 | And so we just all have the budget of our capacity
01:55:00.080 | on an everyday basis,
01:55:01.340 | and we will scaffold our way up out of this.
01:55:05.600 | And so, yeah, hopefully what I'm doing is really,
01:55:07.840 | it serves as a model that others can also build on.
01:55:11.480 | That's why I wrote about it,
01:55:12.520 | is hopefully people can then take it and improve upon it.
01:55:15.520 | I hold nothing sacred.
01:55:16.520 | I change my diet almost every day
01:55:19.680 | based upon some new test result or science
01:55:21.680 | or something like that.
01:55:23.000 | - Can you maybe elaborate on the sleep thing?
01:55:24.920 | Why is sleep so important?
01:55:27.360 | And why, presumably, what does good sleep mean to you?
01:55:34.680 | - I think sleep is a contender
01:55:37.520 | for being the most powerful
01:55:40.560 | health intervention in existence.
01:55:45.960 | It's a contender.
01:55:49.440 | I mean, it's magical what it does if you're well-rested
01:55:53.480 | and what your body can do.
01:55:56.760 | And I mean, for example,
01:55:57.880 | I know when I eat close to my bedtime,
01:56:01.360 | and I've done a systematic study for years
01:56:05.480 | looking at, like, 15-minute increments on the time of day
01:56:07.800 | when I eat my last meal,
01:56:09.760 | my willpower is directly correlated
01:56:12.600 | to the amount of deep sleep I get.
01:56:14.500 | So my ability to not binge eat at night
01:56:18.640 | when Rascal Bryan's out and about
01:56:22.220 | is based upon how much deep sleep I got the night before.
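As a minimal sketch of the kind of N-of-1 analysis being described, the snippet below correlates the gap between the last meal and bedtime with deep sleep and resting heart rate from a personal tracking log. The file name, column names, and use of pandas are illustrative assumptions, not anything Brian or Kernel actually uses:

```python
# Minimal N-of-1 sketch (illustrative assumptions only): how the last-meal-to-bedtime
# gap relates to deep sleep and resting heart rate in a personal tracking log.
# Assumed CSV columns: date, last_meal (HH:MM), bedtime (HH:MM), deep_sleep_min, resting_hr
import pandas as pd

def meal_to_bed_hours(row):
    """Hours between the last meal and bedtime, handling bedtimes after midnight."""
    last_meal = pd.to_datetime(row["last_meal"], format="%H:%M")
    bedtime = pd.to_datetime(row["bedtime"], format="%H:%M")
    if bedtime < last_meal:                      # went to bed after midnight
        bedtime += pd.Timedelta(days=1)
    return (bedtime - last_meal).total_seconds() / 3600

log = pd.read_csv("sleep_log.csv")               # hypothetical personal log
log["fasting_gap_h"] = log.apply(meal_to_bed_hours, axis=1)

# Bin the gap into 15-minute increments, as in the study described above,
# and see how deep sleep and resting heart rate move with it.
log["gap_bin_h"] = (log["fasting_gap_h"] * 4).round() / 4
print(log.groupby("gap_bin_h")[["deep_sleep_min", "resting_hr"]].mean())
print("Correlation (gap vs. deep sleep):",
      round(log["fasting_gap_h"].corr(log["deep_sleep_min"]), 2))
```

A correlation on a single person's log is of course only suggestive, which is why the protocol described earlier pairs it with repeated measurement on a 90-day cycle.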
01:56:24.440 | - Yeah, yeah, there's a lot to that, yeah.
01:56:27.360 | - And so I've seen it manifest itself.
01:56:30.320 | And so I think the way I'd summarize this is,
01:56:34.040 | in society, we've had this myth of,
01:56:36.280 | we tell stories, for example, of entrepreneurship,
01:56:39.120 | where this person was so amazing,
01:56:41.600 | they stayed at the office for three days
01:56:43.240 | and slept under their desk.
01:56:44.880 | And we say, "Wow, that's amazing.
01:56:47.760 | "That's amazing."
01:56:48.820 | And now I think we're headed towards a state
01:56:52.680 | where we'd say that's primitive
01:56:54.960 | and really not a good idea on every level.
01:56:58.720 | And so the new mythology is going to be the exact opposite.
01:57:03.480 | - Yeah, by the way, just to sort of maybe push back
01:57:08.560 | a little bit on that idea.
01:57:10.240 | - Did you sleep under your desk, Lex?
01:57:13.840 | - Well, yeah, a lot.
01:57:14.680 | I'm a big believer in that, actually.
01:57:16.080 | I'm a big believer in chaos
01:57:19.520 | and giving into your passion.
01:57:22.440 | And sometimes doing things that are out of the ordinary,
01:57:25.580 | that are not trying to optimize health
01:57:29.320 | for certain periods of time in lieu of your passions
01:57:34.320 | is a signal to yourself that you're throwing everything away.
01:57:39.560 | So I think what you're referring to
01:57:42.380 | is how to have good performance
01:57:45.240 | for prolonged periods of time.
01:57:47.600 | I think there's moments in life
01:57:50.680 | when you need to throw all of that away,
01:57:52.680 | all the plans away, all the structure away.
01:57:54.960 | So I'm not sure I have an eloquent way
01:57:59.960 | to describe exactly what I'm talking about,
01:58:02.340 | but it all depends on different people.
01:58:06.080 | People are different.
01:58:07.720 | But there's a danger of over-optimization
01:58:10.240 | to where you don't just give in to the madness
01:58:14.400 | of the way your brain flows.
01:58:17.920 | I mean, to push back on my pushback is like,
01:58:21.920 | it's nice to have a setup where the foundations
01:58:26.920 | of your brain are not messed with.
01:58:31.660 | So you have a fixed foundation where the diet is fixed,
01:58:35.340 | where the sleep is fixed, and that all of that is optimal.
01:58:37.820 | And the chaos happens in the space of ideas
01:58:39.920 | as opposed to the space of biology.
01:58:41.720 | But I'm not sure if there's a,
01:58:46.000 | that requires real discipline and forming habits.
01:58:50.920 | There's some aspect to which some of the best days
01:58:54.920 | and weeks of my life have been, yeah,
01:58:56.480 | sleeping under a desk kind of thing.
01:58:58.520 | And I don't, I'm not too willing to let go of things
01:59:03.520 | that empirically worked for things that work in theory.
01:59:09.480 | So again, I'm absolutely with you on sleep.
01:59:16.040 | Also, I'm with you on sleep conceptually,
01:59:19.560 | but I'm also very humbled to understand
01:59:23.880 | that for different people,
01:59:26.760 | good sleep means different things.
01:59:28.760 | I'm very hesitant to trust science on sleep.
01:59:33.000 | I think you should also be a scholar of your own body.
01:59:35.760 | Again, experiment of N of one.
01:59:38.200 | I'm not so sure that a full night's sleep is great for me.
01:59:43.200 | There is something about that power nap
01:59:47.840 | that I just have not fully studied yet,
01:59:50.160 | but that nap is something special.
01:59:52.560 | That I'm not sure I found the optimal thing.
01:59:54.880 | So like there's a lot to be explored
01:59:57.600 | to what is exactly optimal amount of sleep,
02:00:00.200 | optimal kind of sleep, combined with diet
02:00:02.400 | and all those kinds of things. I mean, that all maps
02:00:04.120 | to the idea that data leads to truth,
02:00:06.400 | exactly what you're referring to.
02:00:08.840 | - Here's a data point for your consideration.
02:00:10.520 | - Yes.
02:00:11.360 | - The progress in biology over the past,
02:00:16.600 | say decade has been stunning.
02:00:18.800 | - Yes.
02:00:19.640 | - And it now appears as if we will be able to replace
02:00:24.200 | our organs, xenotransplantation.
02:00:28.280 | And so we probably have a path to replace
02:00:33.280 | and regenerate every organ of your body,
02:00:37.800 | except for your brain.
02:00:38.920 | You can lose your hand and your arm and a leg,
02:00:45.280 | you can have an artificial heart,
02:00:47.560 | you can't operate without your brain.
02:00:49.640 | And so when you make that trade-off decision
02:00:52.200 | of whether you're going to sleep under the desk or not
02:00:54.800 | and go all out for a four day marathon,
02:00:59.360 | there's a cost benefit trade-off of what's going on,
02:01:02.760 | what's happening to your brain in that situation.
02:01:05.800 | We don't know the consequences
02:01:07.640 | of modern day life on our brain.
02:01:09.240 | It's the most valuable organ in our existence.
02:01:15.000 | And we don't know what's going on
02:01:17.400 | with how we're treating it today,
02:01:19.600 | with stress and with sleep and with diet.
02:01:23.440 | And to me, then if you say that you're trying to,
02:01:28.440 | you're trying to optimize life
02:01:30.040 | for whatever things you're trying to do,
02:01:32.000 | the game is soon,
02:01:34.400 | with the progress in anti-aging and biology,
02:01:36.420 | the game is very soon going to become different
02:01:38.200 | than what it is right now,
02:01:40.200 | with organ rejuvenation, organ replacement.
02:01:42.720 | And I would conjecture that we will value
02:01:47.720 | the health status of our brain above all things.
02:01:52.480 | - Yeah, no, absolutely.
02:01:54.000 | Everything you're saying is true, but we die.
02:01:58.240 | We die pretty quickly.
02:01:59.840 | Life is short.
02:02:01.200 | And I'm one of those people that I would rather die
02:02:06.200 | in battle than stay safe at home.
02:02:11.200 | It's like, yeah, you look at kinda,
02:02:14.880 | there's a lot of things that you can reasonably say,
02:02:17.120 | this is the smart thing to do,
02:02:19.080 | that can prevent you, that becomes conservative,
02:02:21.640 | that can prevent you from fully embracing life.
02:02:24.480 | I think ultimately, you can be very intelligent
02:02:27.600 | and data-driven and also embrace life.
02:02:30.800 | But I err on the side of embracing life.
02:02:33.400 | It takes a very skillful person
02:02:36.760 | to not sort of that hovering parent
02:02:39.740 | that says, no, you know what?
02:02:41.000 | There's a 3% chance that if you go out,
02:02:44.600 | if you go out by yourself and play, you're going to die.
02:02:47.680 | Get run over by a car, come to a slow or a sudden end.
02:02:51.260 | And I'm more a supporter of just go out there.
02:02:57.480 | If you die, you die.
02:02:59.280 | And it's a balance you have to strike.
02:03:02.840 | I think there's a balance to strike
02:03:04.880 | in long-term optimization and short-term freedom.
02:03:10.880 | For me, for a programmer, for a programming mind,
02:03:15.320 | I tend to over-optimize,
02:03:16.480 | and I'm very cautious and afraid of that,
02:03:19.300 | to not over-optimize and thereby be overly cautious,
02:03:24.240 | suboptimally cautious about everything I do.
02:03:27.480 | And the ultimate thing I'm trying to optimize for,
02:03:30.360 | it's funny you said sleep and all those kinds of things.
02:03:33.440 | I tend to think, this is,
02:03:37.040 | you're being more precise than I am,
02:03:40.600 | but I think I tend to want to minimize stress,
02:03:45.080 | and everything feeds into that,
02:03:49.320 | from sleep to all those kinds of things.
02:03:50.860 | But I worry that whenever I'm trying
02:03:53.760 | to be too strict with myself, then the stress goes up
02:03:57.640 | when I don't follow the strictness.
02:04:00.520 | And so you have to kind of, it's a weird,
02:04:03.640 | there's so many variables in an objective function
02:04:05.680 | that it's hard to get right.
02:04:07.280 | And sort of not giving a damn about sleep
02:04:09.640 | and not giving a damn about diet
02:04:11.280 | is a good thing to inject in there every once in a while
02:04:14.040 | for somebody who's trying to optimize everything.
02:04:17.080 | But that's me just trying to,
02:04:19.120 | it's exactly like you said, you're just a scientist,
02:04:21.400 | I'm a scientist of myself, you're a scientist of yourself.
02:04:24.140 | It'd be nice if somebody else was doing it
02:04:25.800 | and had much better data than,
02:04:27.900 | because I don't trust my conscious mind,
02:04:29.560 | and I pigged out last night on some brisket in LA
02:04:32.500 | that I regret deeply.
02:04:34.500 | So-- - There's no point
02:04:37.600 | to anything I just said.
02:04:38.800 | (both laughing)
02:04:42.040 | - What is the nature of your regret on the brisket?
02:04:45.080 | Is it, do you wish you hadn't eaten it entirely?
02:04:49.800 | Is it that you wish you hadn't eaten as much as you did?
02:04:51.800 | Is it that?
02:04:53.000 | - I think, well, most regret, I mean,
02:04:58.480 | if we wanna be specific, I drank way too much diet soda.
02:05:04.420 | My biggest regret is having drank so much diet soda.
02:05:08.120 | That's the thing that really was the problem.
02:05:10.380 | I had trouble sleeping because of that,
02:05:12.140 | 'cause I was programming and then I was editing.
02:05:14.660 | And so I'd stay up late at night
02:05:15.740 | and then I'd have to get up to go pee a few times,
02:05:18.500 | and it was just a mess.
02:05:19.460 | - A mess of a night.
02:05:20.380 | - It was, well, it's not really a mess,
02:05:22.340 | but it's so many, it's like the little things.
02:05:25.980 | I know if I just eat, I drink a little bit of water
02:05:30.100 | and that's it, and there's a certain,
02:05:31.780 | all of us have perfect days that we know diet-wise
02:05:36.580 | and so on that's good to follow, you feel good.
02:05:39.180 | I know what it takes for me to do that.
02:05:41.700 | I didn't fully do that, and thereby,
02:05:43.820 | because there's an avalanche effect
02:05:47.680 | where the other sources of stress,
02:05:50.140 | all the other to-do items I have pile on,
02:05:53.180 | my failure to execute on some basic things
02:05:55.660 | that I know make me feel good,
02:05:56.780 | and all of that combines to create a mess of a day.
02:06:02.460 | But some of that chaos, you have to be okay with it,
02:06:05.020 | but some of it I wish was a little bit more optimal.
02:06:07.900 | And your ideas about eating in the morning
02:06:11.020 | are quite interesting as an experiment to try.
02:06:14.340 | Can you elaborate, are you eating once a day?
02:06:18.540 | - Yes.
02:06:19.780 | - In the morning, and that's it.
02:06:22.220 | Can you maybe speak to how that,
02:06:24.380 | you spoke, it's funny, you spoke about the metrics of sleep,
02:06:30.680 | but you're also, you run a business,
02:06:34.780 | you're incredibly intelligent,
02:06:36.180 | most of your happiness and success
02:06:40.260 | relies on you thinking clearly.
02:06:43.240 | So how does that affect your mind and your body
02:06:45.900 | in terms of performance?
02:06:47.180 | So not just sleep, but actual mental performance.
02:06:50.060 | - As you were explaining your objective function of,
02:06:53.620 | for example, in the criteria you were including,
02:06:56.700 | you like certain neurochemical states,
02:06:59.780 | like you like feeling like you're living life,
02:07:02.580 | that life has enjoyment,
02:07:04.660 | that sometimes you want to disregard certain rules
02:07:07.600 | to have a moment of passion, of focus.
02:07:10.280 | There's this architecture of the way Lex is,
02:07:13.340 | which makes you happy as a story you tell,
02:07:16.400 | as something you kind of experience,
02:07:17.820 | maybe the experience is a bit more complicated,
02:07:19.380 | but it's in this idea you have, this is a version of you.
02:07:22.580 | And the reason why I maintain the schedule I do
02:07:26.940 | is I've chosen a game to say,
02:07:29.660 | I would like to live a life where I care more
02:07:33.300 | about what intelligent,
02:07:36.240 | what people who live in the year 2500
02:07:41.740 | think of me than I do today.
02:07:43.460 | That's the game I'm trying to play.
02:07:46.660 | And so therefore the only thing I really care about
02:07:51.180 | on this optimization is trying to see past myself,
02:07:59.760 | past my limitations, using zeroth principles thinking,
02:07:59.760 | pull myself out of this contextual mesh we're in right now
02:08:04.260 | and say, what will matter 100 years from now
02:08:07.420 | and 200 years from now?
02:08:08.500 | What are the big things really going on
02:08:11.260 | that are defining reality?
02:08:13.340 | And I find that if I were to hang out with diet soda Lex
02:08:18.340 | and diet soda Brian were to play along with that
02:08:24.980 | and my deep sleep were to get crushed as a result,
02:08:28.540 | my mind would not be on what matters in 100 years
02:08:31.300 | or 200 years or 300 years.
02:08:32.180 | I would be irritable.
02:08:34.340 | I would be, you know, I'd be in a different state.
02:08:37.500 | And so it's just gameplay selection.
02:08:41.160 | It's what you and I have chosen to think about.
02:08:43.660 | It's what we've chosen to work on.
02:08:47.520 | And this is why I'm saying that no generation of humans
02:08:51.780 | have ever been afforded the opportunity to look
02:08:55.100 | at their lifespan and contemplate that they will
02:09:00.100 | have the possibility of experiencing
02:09:03.220 | an evolved form of consciousness that is undeniable.
02:09:06.900 | They would fall into the zero category of potential.
02:09:11.020 | That to me is the most exciting thing in existence.
02:09:14.140 | And I would not trade any momentary neurochemical state
02:09:19.140 | right now in exchange for that.
02:09:21.540 | I would, I'd be willing to deprive myself
02:09:23.700 | of all momentary joy in pursuit of that goal
02:09:27.100 | because that's what makes me happy.
02:09:28.900 | - That's brilliant.
02:09:30.820 | But I'm a bit, I just looked it up.
02:09:34.340 | I just looked up William Wallace's speech from Braveheart.
02:09:39.780 | But I don't know if you've seen it.
02:09:41.820 | Fight and you may die, run and you'll live at least a while
02:09:45.540 | and dying in your beds many years from now,
02:09:48.080 | would you be willing to trade all the days
02:09:51.480 | from this day to that for one chance,
02:09:53.740 | just one chance, picture Mel Gibson saying this,
02:09:57.120 | to come back here and tell our enemies
02:09:59.120 | that they may take our lives with growing excitement,
02:10:03.720 | but they'll never take our freedom.
02:10:06.440 | I get excited every time I see that in the movie.
02:10:08.740 | But that's kind of how I approach life.
02:10:10.920 | - Do you think they were tracking their sleep?
02:10:13.440 | - They were not tracking their sleep
02:10:14.520 | and they ate way too much brisket
02:10:16.200 | and they were fat, unhealthy, died early
02:10:19.480 | and were primitive.
02:10:22.480 | But there's something in my ape brain
02:10:25.480 | that's attracted to that,
02:10:26.980 | even though most of my life is fully aligned
02:10:31.000 | with the way you see yours.
02:10:32.680 | Part of it is for comedy, of course,
02:10:34.160 | but part of it is like,
02:10:35.960 | I'm almost afraid of over-optimization.
02:10:38.880 | - Really what you're saying though,
02:10:39.900 | if we're looking at this,
02:10:41.640 | let's say from a first principles perspective,
02:10:43.520 | when you read those words,
02:10:44.760 | they conjure up certain life experiences,
02:10:46.560 | but you're basically saying,
02:10:47.520 | I experienced a certain neurotransmitter state
02:10:50.400 | when these things are in action.
02:10:53.120 | That's all you're saying.
02:10:54.000 | So whether it's that or something else,
02:10:55.340 | you're just saying you have a selection
02:10:57.240 | for a certain state of your body.
02:10:59.440 | And so if you as an engineer of consciousness,
02:11:02.840 | that should just be engineerable.
02:11:04.480 | That's just triggering certain chemical reactions.
02:11:08.280 | And so it doesn't mean they have to be mutually exclusive.
02:11:11.360 | You can have that and experience that
02:11:13.000 | and also not sacrifice long-term health.
02:11:15.840 | And I think that's the potential of where we're going
02:11:17.600 | is we don't have to assume
02:11:21.920 | they are trade-offs that must be had.
02:11:25.720 | - Absolutely.
02:11:26.560 | And so I guess from my particular brain,
02:11:28.920 | it's useful to have the outlier experiences
02:11:32.160 | that also come along with the illusion of free will
02:11:35.720 | where I chose those experiences
02:11:37.840 | that make me feel like it's freedom.
02:11:39.600 | Listen, going to Texas made me realize,
02:11:41.940 | I spent, so I still am, but I lived at Cambridge at MIT.
02:11:46.760 | And I never felt like home there.
02:11:49.200 | I felt like home in the space of ideas with the colleagues
02:11:52.640 | like when I was actually discussing ideas.
02:11:54.380 | But there is something about the constraints,
02:11:57.840 | how cautious people are,
02:12:00.320 | how much they valued also kind of material success,
02:12:05.160 | career success.
02:12:06.840 | When I showed up to Texas, it felt like I belong.
02:12:12.060 | That was very interesting.
02:12:13.220 | But that's my neurochemistry, whatever the hell that is,
02:12:15.860 | whatever, maybe it probably is rooted in the fact
02:12:18.780 | that I grew up in the Soviet Union.
02:12:20.100 | It was such a constrained system
02:12:22.060 | that you really deeply value freedom.
02:12:23.980 | And you always want to escape the man
02:12:27.440 | and the control of centralized systems.
02:12:29.100 | I don't know what it is.
02:12:30.460 | But at the same time, I love strictness.
02:12:33.860 | I love the dogmatic authoritarianism of diet,
02:12:37.740 | of like the same habit, exactly the habit you have.
02:12:41.340 | I think that's actually when bodies perform optimally,
02:12:43.760 | my body performs optimally.
02:12:45.580 | So balancing those two, I think if I have the data,
02:12:48.880 | every once in a while, party with some wild people,
02:12:51.640 | but most of the time eat once a day,
02:12:53.660 | perhaps in the morning, I'm gonna try that.
02:12:56.240 | That might be very interesting.
02:12:57.960 | But I'd rather not try it.
02:13:00.040 | I'd rather have the data that tells me to do it.
02:13:03.160 | But in general, you're able to eating once a day,
02:13:07.240 | think deeply about stuff.
02:13:09.000 | Like that's a concern that people have is like,
02:13:11.700 | does your energy wane, all those kinds of things.
02:13:15.180 | Do you find that it's, especially 'cause it's unique,
02:13:19.260 | it's vegan as well.
02:13:21.060 | So you find that you're able to have a clear mind,
02:13:23.900 | a focus and just physically and mentally throughout?
02:13:27.780 | - Yeah, I find like my personal experience
02:13:29.860 | in thinking about hard things is,
02:13:36.020 | like oftentimes I feel like I'm looking through a telescope
02:13:40.200 | and I'm aligning two or three telescopes
02:13:42.280 | and you kind of have to close one eye
02:13:44.120 | and move back and forth a little bit
02:13:46.200 | and just find just the right alignment.
02:13:47.960 | Then you find just a sneak peek
02:13:49.840 | at the thing you're trying to find, but it's fleeting.
02:13:51.680 | If you move just one little bit, it's gone.
02:13:57.240 | And oftentimes,
02:13:57.240 | the ideas I value the most are like that.
02:14:00.640 | They're so fragile and fleeting and slippery
02:14:04.880 | and elusive and it requires a sensitivity to thinking
02:14:09.880 | and a sensitivity to maneuver through these things.
02:14:16.560 | If I concede to a world where I'm on my phone texting,
02:14:21.560 | I'm also on social media,
02:14:25.480 | I'm also doing 15 things at the same time
02:14:28.320 | because I'm running a company
02:14:29.280 | and I'm also feeling terrible from the last night.
02:14:33.660 | It all just comes crashing down
02:14:34.860 | and the quality of my thoughts goes to zero.
02:14:38.700 | I'm functional enough to respond to basic-level things,
02:14:42.980 | but I don't feel like I am doing anything interesting.
02:14:46.200 | - I think that's a good word, sensitivity,
02:14:49.820 | because that's what thinking deeply feels like,
02:14:53.860 | is you're sensitive to the fragile thoughts.
02:14:55.820 | And you're right, all those other distractions
02:14:57.980 | kind of dull your ability to be sensitive
02:15:01.320 | to the fragile thoughts.
02:15:02.940 | It's a really good word.
02:15:04.140 | Out of all the things you've done,
02:15:08.140 | you've also climbed Mount Kilimanjaro.
02:15:11.700 | Is this true?
02:15:14.100 | - It's true.
02:15:14.940 | - What do you, why and how,
02:15:23.220 | and what do you take from that experience?
02:15:25.700 | - I guess the backstory is relevant
02:15:26.900 | because in that moment, it was the darkest time in my life.
02:15:32.480 | I was ending a 13 year marriage, I was leaving my religion,
02:15:35.360 | I sold Braintree and I was battling depression
02:15:38.520 | where I was just at the end.
02:15:40.900 | And I got invited to go to Tanzania
02:15:45.820 | as part of a group that was raising money
02:15:47.100 | to build clean water wells.
02:15:49.400 | And I had made some money from Braintree
02:15:51.560 | and so I was able to donate $25,000.
02:15:54.440 | And it was the first time I had ever had money
02:15:57.480 | to donate outside of paying tithing in my religion.
02:16:01.720 | And it was such a phenomenal experience
02:16:04.400 | to contribute something meaningful
02:16:08.600 | to someone else in that form.
02:16:11.260 | And as part of this process,
02:16:12.720 | we were gonna climb the mountain.
02:16:13.800 | And so we went there and we saw the clean water wells
02:16:16.600 | we were building, we spoke to the people there
02:16:18.760 | and it was very energizing.
02:16:20.520 | And then we climbed Kilimanjaro
02:16:21.920 | and I came down with a stomach flu on day three
02:16:26.920 | and I also had altitude sickness,
02:16:30.720 | but I became so sick that on day four,
02:16:33.080 | we were summiting on day five,
02:16:34.780 | I came into the base camp at 15,000 feet,
02:16:38.460 | just going to the bathroom on myself
02:16:42.820 | and like falling all over.
02:16:44.520 | I was just a disaster, I was so sick.
02:16:47.160 | - So stomach flu and altitude sickness.
02:16:49.520 | - Yeah, and I just was destroyed from the situation.
02:16:55.440 | - Plus psychologically one of the lowest points
02:17:00.680 | you could go through. - Yeah, and I think
02:17:01.600 | that was probably a big contributor.
02:17:02.720 | I was just smoked as a human, just absolutely done.
02:17:06.600 | And I had three young children.
02:17:07.760 | And so I was trying to reconcile,
02:17:09.320 | like, whether I live or not
02:17:12.960 | is not my decision by itself.
02:17:14.920 | I'm now intertwined with these three little people
02:17:19.600 | and I have an obligation whether I like it or not,
02:17:22.960 | I need to be there.
02:17:24.480 | And so it did, it felt like I was just stuck
02:17:26.440 | in a straitjacket.
02:17:27.680 | And I had to decide whether I was going to summit
02:17:32.240 | the next day with the team.
02:17:34.120 | And it was a difficult decision
02:17:36.600 | because once you start hiking,
02:17:38.180 | there's no way to get off the mountain.
02:17:40.800 | And midnight came and our guide came in
02:17:43.960 | and he said, "Where are you at?"
02:17:44.960 | And I said, "I think I'm okay, I think I can try."
02:17:48.200 | And so we went.
02:17:49.740 | And so from midnight to, I made it to the summit at 5 a.m.
02:17:56.840 | It was one of the most transformational moments
02:17:59.600 | of my existence.
02:18:00.960 | And the mountain became my problem.
02:18:05.960 | It became everything that I was struggling with.
02:18:09.880 | And when I started hiking, it was,
02:18:12.520 | the pain got so ferocious
02:18:17.520 | that I turned my music to Eminem.
02:18:25.040 | And it was, Eminem was, he was the only person
02:18:28.640 | in existence that spoke to my soul.
02:18:31.160 | And it was something about his anger and his vibrancy
02:18:36.160 | and his multidimensionality. He was the only person
02:18:39.120 | who I could turn on and I could just say,
02:18:40.520 | "Ah, I feel some relief."
02:18:42.760 | I turned on Eminem and I made it to the summit
02:18:46.800 | after five hours. But just 100 yards from the top,
02:18:51.040 | I was with my guide Ike and I started getting very dizzy
02:18:54.480 | and I felt like I was gonna fall backwards
02:18:57.000 | off this cliff area we were on.
02:18:58.560 | I was like, "This is dangerous."
02:19:00.480 | And he said, "Look, Brian, I know where you're at.
02:19:03.600 | "I know where you're at.
02:19:06.160 | "And I can tell you, you've got it in you.
02:19:08.360 | "So I want you to look up, take a step, take a breath,
02:19:13.360 | "and then look up, take a breath and take a step."
02:19:16.560 | And I did, and I made it.
02:19:19.280 | And so I got there and I just, I sat down with him
02:19:22.280 | at the top, I just cried like a baby.
02:19:24.240 | - Broke down.
02:19:25.080 | - Yeah, I just lost it.
02:19:26.720 | And so he'd let me do my thing.
02:19:29.560 | And then we pulled out the pulse oximeter
02:19:31.960 | and he measured my blood oxygen levels.
02:19:33.240 | And it was like 50 something percent
02:19:36.080 | and it was in the danger zone.
02:19:36.960 | So he looked at it and I think he was like really alarmed
02:19:39.560 | that I was in this situation.
02:19:41.400 | And so he said, "We can't get a helicopter here
02:19:44.760 | "and we can't get you to emergency evacuated.
02:19:46.100 | "You've got to go down.
02:19:46.940 | "You've got to hike down to 15,000 feet to get base camp."
02:19:49.720 | And so we went down the mountain.
02:19:53.080 | I got back down to base camp.
02:19:55.040 | And again, that was pretty difficult.
02:19:57.600 | And then they put me on a stretcher,
02:19:59.200 | this metal stretcher with this one wheel
02:20:01.040 | and a team of six people wheeled me down the mountain.
02:20:04.680 | And it was pretty torturous.
02:20:06.440 | I'm very appreciative they did.
02:20:07.600 | Also the trail was very bumpy.
02:20:09.080 | So they'd go over the big rocks.
02:20:10.240 | And so my head would just slam against this metal thing
02:20:14.320 | for hours.
02:20:15.160 | And so I just felt awful.
02:20:15.980 | Plus I'd get my head slammed every couple of seconds.
02:20:18.880 | So the whole experience was really a life changing moment.
02:20:22.080 | And that was the demarcation of me
02:20:25.040 | basically building a new life.
02:20:26.680 | Basically I said, "I'm going to reconstruct Brian.
02:20:31.680 | "My understanding of reality, my existential realities,
02:20:34.940 | "what I want to go after."
02:20:36.760 | And I try, I mean, as much as that's possible as a human,
02:20:39.680 | but that's when I set out to rebuild everything.
02:20:44.400 | - Was it the struggle of that?
02:20:46.040 | I mean, there's also just like the romantic, poetic,
02:20:50.840 | it's a fricking mountain.
02:20:52.320 | As a man in pain, psychological and physical,
02:20:58.040 | struggling up a mountain, but it's just struggle.
02:21:01.440 | Just in the face of,
02:21:04.000 | just pushing through in the face of hardship or nature too,
02:21:09.800 | something much bigger than you.
02:21:11.360 | Is that, was that the thing that just clicked?
02:21:14.880 | - For me, it felt like I was just locked in with reality
02:21:18.000 | and it was a death match.
02:21:21.320 | It was in that moment, one of us is going to die.
02:21:24.080 | - So you were pondering death, like not surviving.
02:21:27.520 | - Yeah, and it was, and that was the moment.
02:21:29.920 | And it was, the summit to me was,
02:21:32.280 | I'm going to come out on top and I can do this.
02:21:35.960 | And giving in was, it's like, I'm just done.
02:21:40.680 | And so it did, I locked in and that's why,
02:21:45.280 | mountains are magical to me.
02:21:47.640 | I didn't expect that, I didn't design that,
02:21:51.280 | I didn't know that was going to be the case.
02:21:54.080 | It would not have been something I would have anticipated.
02:21:56.520 | - But you were not the same man afterwards.
02:22:00.360 | Is there advice you can give to young people today
02:22:06.360 | that look at your story,
02:22:07.420 | that's successful in many dimensions?
02:22:09.400 | Advice you can give to them
02:22:12.080 | about how to be successful in their career,
02:22:14.080 | successful in life, whatever path they choose?
02:22:17.860 | - Yes, I would say, listen to advice
02:22:22.600 | and see it for what it is, a mirror of that person.
02:22:27.440 | And then map and know that your future
02:22:31.960 | is going to be in a zeroth-principles land.
02:22:34.840 | And so what you're hearing today is a representation
02:22:38.120 | of what may have been the right principles
02:22:39.840 | to build upon previously,
02:22:41.380 | but they're likely depreciating very fast.
02:22:45.680 | And so I am a strong proponent that
02:22:49.640 | people ask for advice, but they don't take advice.
02:22:54.700 | - So how do you take advice properly?
02:22:59.340 | - It's in the careful examination of the advice.
02:23:03.360 | It's actually, the person makes a statement
02:23:06.600 | about a given thing somebody should follow.
02:23:09.140 | The value is not doing that,
02:23:10.680 | the value is understanding the assumption stack they built,
02:23:13.480 | the assumption and knowledge stack they built
02:23:15.320 | around that body of knowledge.
02:23:18.080 | That's the value, it's not doing what they say.
02:23:20.780 | - Considering the advice,
02:23:23.120 | but digging deeper to understand the assumption stack,
02:23:27.920 | like the full person,
02:23:29.560 | I mean, this is deep empathy, essentially,
02:23:32.440 | to understand the journey of the person
02:23:34.200 | that arrived at the advice.
02:23:36.320 | And the advice is just the tip of the iceberg.
02:23:38.640 | - That's right.
02:23:39.480 | - The advice alone is not the thing that gives you the value.
02:23:41.000 | - That's right.
02:23:42.000 | - It could be the right thing to do,
02:23:43.520 | it could be the complete wrong thing to do,
02:23:45.040 | depending on the assumption stack.
02:23:46.960 | So you need to investigate the whole thing.
02:23:49.480 | Is there some, have there been people in your startup,
02:23:54.040 | in your business journey,
02:23:55.440 | who have served that role of advice giver,
02:24:00.520 | that has been helpful?
02:24:02.600 | Or do you feel like your journey was a lonely path,
02:24:08.480 | or was it one that was, of course,
02:24:11.560 | we're all born and die alone.
02:24:14.100 | But, (laughs)
02:24:17.320 | do you fundamentally remember the experiences,
02:24:21.360 | ones where you leaned on people
02:24:23.960 | at a particular moment in time that changed everything?
02:24:26.920 | - Yeah, the most significant moments of my memory,
02:24:30.840 | for example, like on Kilimanjaro,
02:24:33.040 | when Ike, some person I'd never met in Tanzania,
02:24:38.160 | was able to, in that moment, apparently see my soul,
02:24:41.800 | when I was in this death match with reality.
02:24:44.240 | - Yes.
02:24:45.080 | - And he gave me the instructions, look up, step.
02:24:48.680 | And so, there's magical people in my life
02:24:53.680 | that have done things like that.
02:24:56.620 | And I suspect they probably don't know.
02:25:00.780 | I probably should be better at identifying those things.
02:25:06.600 | But yeah, hopefully the,
02:25:07.860 | I suppose like the wisdom I would aspire to
02:25:18.960 | is to have the awareness and the empathy
02:25:22.920 | to be that for other people.
02:25:24.880 | Not a retail
02:25:29.480 | advertiser of advice, of tricks for life,
02:25:35.840 | but deeply meaningful and empathetic
02:25:40.480 | in a one-on-one context with people
02:25:42.360 | where it really could make a difference.
02:25:44.380 | - Yeah, I actually kind of experience that,
02:25:47.400 | I think about that sometimes.
02:25:48.880 | You know, you have like an 18-year-old kid come up to you.
02:25:51.680 | It's not always obvious, it's not always easy
02:25:57.800 | to really listen to them.
02:26:00.120 | Like not the facts, but like see who that person is.
02:26:04.600 | - I think people say that about being a parent is,
02:26:08.040 | you know, you wanna consider that
02:26:10.760 | you don't wanna be the authority figure
02:26:14.200 | in a sense that you really wanna consider
02:26:16.040 | that there's a special, unique human being there
02:26:18.700 | with a unique brain that may be brilliant
02:26:23.540 | in ways that you are not understanding
02:26:25.520 | that you'll never be.
02:26:26.840 | And really try to hear that.
02:26:29.360 | So when giving advice, there's something to that.
02:26:31.640 | And so both sides should be deeply empathetic
02:26:34.000 | about the assumption stack.
02:26:35.440 | I love that terminology.
02:26:38.800 | What do you think is the meaning of this whole thing?
02:26:41.680 | Of life.
02:26:42.520 | Why the hell are we here, Brian Johnson?
02:26:46.080 | We've been talking about brains and studying brains
02:26:48.320 | and you had this very eloquent way of describing life
02:26:51.440 | on Earth as an optimization problem
02:26:54.600 | of the cost of intelligence going to zero.
02:26:59.400 | At first through the evolutionary process
02:27:01.760 | and then eventually through building,
02:27:04.120 | through our technology building
02:27:07.280 | more and more intelligent systems.
02:27:09.280 | You ever ask yourself why it's doing that?
02:27:13.600 | - Yeah, I think the answer to this question,
02:27:17.440 | again, the information value is more in the mirror
02:27:21.640 | it provides of that person, which is a representation
02:27:25.680 | of the technological, social, political context of the time.
02:27:30.120 | So if you asked this question a hundred years ago,
02:27:32.240 | you would get a certain answer that reflects
02:27:34.160 | that time period.
02:27:35.040 | Same thing would be true of a thousand years ago.
02:27:36.960 | It's rare.
02:27:38.000 | It's difficult for a person to pull themselves
02:27:40.920 | out of their contextual awareness.
02:27:42.120 | - Yeah, very true.
02:27:43.080 | - And offer a truly original response.
02:27:45.240 | And so knowing that I am contextually influenced
02:27:48.520 | by the situation, that I am a mirror for our reality,
02:27:52.440 | I would say that in this moment,
02:27:54.240 | I think the real game going on
02:28:00.160 | is that evolution built a system of scaffolding intelligence
02:28:05.160 | that produced us.
02:28:10.600 | We are now building intelligence systems
02:28:13.880 | that are scaffolding higher-dimensional intelligence,
02:28:18.880 | that's developing more robust systems of intelligence.
02:28:28.040 | And in that process, with the cost going to zero,
02:28:32.680 | then the meaning of life becomes goal alignment,
02:28:37.680 | which is the negotiation of our conscious
02:28:41.520 | and unconscious existence.
02:28:43.680 | And then I'd say the third thing is,
02:28:45.040 | if we're thinking that we wanna be explorers,
02:28:47.800 | our technological progress is getting to a point
02:28:52.960 | where we could aspirationally say,
02:28:57.240 | we want to figure out what is really going on,
02:29:01.240 | really going on.
02:29:02.640 | Because does any of this really make sense?
02:29:06.600 | Now we may be 100, 200, 500, 1000 years away
02:29:09.200 | from being able to poke our way out of whatever is going on.
02:29:14.200 | But it's interesting that we could even state an aspiration
02:29:20.120 | to say, we wanna poke at this question.
02:29:22.960 | But I'd say in this moment of time,
02:29:26.000 | the meaning of life is that we can build a future state
02:29:30.640 | of existence that is more fantastic
02:29:35.160 | than anything we could ever imagine.
02:29:37.080 | - The striving for something more amazing.
02:29:41.920 | - And that defies expectations,
02:29:47.120 | that we would consider bewildering, and all those things.
02:29:54.120 | And I guess the last thing,
02:29:57.280 | if there are multiple meanings of life,
02:29:59.080 | it would be infinite games.
02:30:00.440 | James Carse wrote the book, "Finite and Infinite Games."
02:30:04.080 | The only game to play right now is to keep playing the game.
02:30:08.560 | And so this goes back to the algorithm,
02:30:11.400 | the Lex algorithm of diet soda and brisket
02:30:14.600 | and pursuing the passion.
02:30:16.200 | What I'm suggesting is there's a moment here
02:30:19.280 | where we can contemplate playing infinite games.
02:30:22.520 | Therefore, it may make sense to err on the side
02:30:25.800 | of making sure one is in a situation
02:30:27.720 | to be playing infinite games if that opportunity arises.
02:30:31.200 | So it's just the landscape of possibilities
02:30:33.800 | changing very, very fast.
02:30:35.280 | And therefore our old algorithms
02:30:36.960 | of how we might assess risk
02:30:38.400 | and what things we might pursue
02:30:39.240 | and why, those assumptions may fall away very quickly.
02:30:42.620 | - Well, I think I speak for a lot of people
02:30:46.040 | when I say that the game you, Mr. Brian Johnson,
02:30:50.480 | have been playing is quite incredible.
02:30:53.160 | Thank you so much for talking today.
02:30:54.200 | - Thanks, Lex.
02:30:55.040 | - Thanks for listening to this conversation
02:30:57.640 | with Brian Johnson and thank you to Four Sigmatic,
02:31:01.360 | NetSuite, Grammarly, and ExpressVPN.
02:31:04.960 | Check them out in the description to support this podcast.
02:31:08.320 | And now let me leave you with some words
02:31:10.080 | from Diane Ackerman.
02:31:11.960 | Our brain is a crowded chemistry lab
02:31:15.080 | bustling with nonstop neural conversations.
02:31:18.720 | Thank you for listening and hope to see you next time.
02:31:21.560 | (upbeat music)
02:31:24.140 | (upbeat music)