Bryan Johnson: Kernel Brain-Computer Interfaces | Lex Fridman Podcast #186
Chapters
0:00 Introduction
1:13 Kernel Flow demo
10:25 The future of brain-computer interfaces
43:54 Existential risk
49:33 Overcoming depression
64:52 Zeroth principles thinking
73:05 Engineering consciousness
79:19 Privacy
83:48 Neuralink
93:27 Braintree and Venmo
109:10 Eating one meal a day
115:22 Sleep
135:04 Climbing Mount Kilimanjaro
142:02 Advice for young people
146:38 Meaning of life
00:00:00.000 |
The following is a conversation with Bryan Johnson, 00:00:02.560 |
founder of Kernel, a company that has developed devices 00:00:09.360 |
And previously, he was the founder of Braintree. 00:00:19.720 |
This episode is sponsored by Four Sigmatic, NetSuite, Grammarly, and ExpressVPN. 00:00:24.280 |
Check them out in the description to support this podcast. 00:00:27.760 |
As a side note, let me say that this was a fun 00:00:35.920 |
as you can see if you watch the video version 00:00:39.520 |
And there was an Ubuntu Linux machine sitting next to me, 00:00:44.840 |
The whole thing gave me hope that the mystery 00:00:47.840 |
of the human mind will be unlocked in the coming decades, 00:00:51.480 |
as we begin to measure signals from the brain 00:00:55.400 |
To understand the mind, we either have to build it 00:01:02.200 |
Thanks to Bryan and the rest of the Kernel team 00:01:08.640 |
and here is my conversation with Bryan Johnson. 00:01:18.840 |
And then I will proceed to tell you a few jokes. 00:01:22.040 |
So we have two incredible pieces of technology 00:01:25.760 |
and a machine running Ubuntu 20.04 in front of us. 00:01:34.080 |
And they will place it on our heads for proper alignment. 00:02:53.320 |
- Yeah, that is definitely a brain interface. 00:03:32.300 |
We've been working on a device that can read your mind, 00:03:46.520 |
- If I'm seeing the muscle activation correctly 00:03:52.460 |
on your lips, you're not gonna do well on this. 00:04:39.920 |
- We're getting activation patterns of your entire cortex. 00:04:52.840 |
- Lex, what do scholars eat when they're hungry? 00:05:17.720 |
So it's similar to how we wear wearables on the wrist 00:05:38.280 |
And so you'll see in the reconstructions we do for you, 00:05:41.160 |
you'll see your activation patterns in your brain 00:05:43.080 |
throughout this entire time we were wearing it. 00:05:50.320 |
And so we're moving towards a real-time feed 00:05:56.920 |
- So there's a bunch of things that are in contact 00:06:07.120 |
- There's 52 modules and each module has one laser 00:06:13.700 |
And the sensors fire in about 100 picoseconds. 00:06:18.480 |
And then the photons scatter and absorb in your brain. 00:06:25.080 |
then a few come back out and we sense those photons 00:06:27.920 |
and then we do the reconstruction for the activity. 00:06:30.360 |
Overall, there's about a thousand plus channels 00:06:35.640 |
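A rough sketch of how the module and channel bookkeeping described above could be organized; the per-module detector count and the neighbor pairing used here are illustrative assumptions, not Kernel's actual specification, chosen only so the total lands in the "thousand plus channels" range.

```python
from dataclasses import dataclass
from itertools import product

# Assumed illustrative numbers: the transcript states 52 modules, one laser per
# module, and "a thousand plus channels" overall; the detector count per module
# and which source-detector pairs count as channels are guesses for illustration.
N_MODULES = 52
DETECTORS_PER_MODULE = 6      # hypothetical
NEIGHBORS_PER_MODULE = 3      # hypothetical coupling to adjacent modules

@dataclass(frozen=True)
class Channel:
    source_module: int        # module whose laser fired
    detector_module: int      # module whose detector caught the returning photons
    detector_index: int

def enumerate_channels() -> list[Channel]:
    """List source-detector pairs within a module and with assumed neighbor modules."""
    channels = []
    for m in range(N_MODULES):
        partners = [m] + [(m + k + 1) % N_MODULES for k in range(NEIGHBORS_PER_MODULE)]
        for p, d in product(partners, range(DETECTORS_PER_MODULE)):
            channels.append(Channel(source_module=m, detector_module=p, detector_index=d))
    return channels

if __name__ == "__main__":
    chans = enumerate_channels()
    print(f"{len(chans)} channels")   # 52 * 4 * 6 = 1248, in the "thousand plus" range
```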
- How difficult is it to make it as comfortable as it is? 00:06:48.320 |
I would not think it would be comfortable, but it is. 00:06:54.200 |
So people are accustomed to being in big systems like fMRI 00:07:09.240 |
And so, yes, I agree that this is a convenient option 00:07:20.200 |
whether we wanna be at home or in a business setting, 00:07:22.660 |
it's the freedom to record your brain activity wherever you are 00:07:29.000 |
- Yeah, but sort of from an engineering perspective, 00:07:34.960 |
and they're kinda, there's like a rubber band thing 00:07:41.700 |
So we built this version of the mechanical design 00:07:55.840 |
Okay, maybe I feel much better about my head now. 00:08:08.120 |
interesting that you could say while it's on our heads. 00:08:22.520 |
to the side measuring stuff and collecting data. 00:08:26.200 |
I mean, I just, I feel like much more important now 00:08:43.080 |
because it's like accurately listening to me, Ubuntu. 00:08:49.120 |
I hadn't thought about that, of feeling understood. 00:08:54.240 |
- Heard, yeah, heard deeply by the mechanical system 00:09:09.520 |
and depth of understanding of this software system, 00:09:14.600 |
And now you're able to communicate with this system 00:09:20.760 |
- Yeah, I mean, I guess what's interesting about this is 00:09:28.760 |
Most people have intuitions about brain interfaces 00:09:34.800 |
or typing or changing the channel or skipping a song. 00:09:54.920 |
emerge around that of what to do with those numbers. 00:09:57.760 |
- So before you tell me about the possibilities, 00:10:03.880 |
but I'm being told that for a bunch of reasons, 00:10:08.880 |
just because we probably wanna keep the data small 00:10:11.240 |
and visualize it nicely for the final product, 00:10:17.960 |
So Bryan, thank you so much for this experience 00:10:28.920 |
Can you maybe speak to what kind of opportunities 00:10:38.360 |
What comes to mind when you put that on your head? 00:10:41.560 |
What does it mean to you and what possibilities emerge 00:10:49.480 |
- Well, for me, I'm really excited by the possibility 00:11:03.320 |
such that data can be used to create products 00:11:07.360 |
So that to me is a really exciting possibility. 00:11:09.760 |
Even just like a Fitbit that measures, I don't know, 00:11:20.560 |
the resolution of that information is very crude, 00:11:25.520 |
of just building a dataset coming in a clean way 00:11:44.680 |
in the sense that it feels like the full richness 00:11:56.080 |
I mean, there's a, I can't quite put it into words, 00:12:02.880 |
this is not some kind of joke about me being a robot, 00:12:32.720 |
into that world, but perhaps recording your brain 00:12:41.160 |
much more personal recording of that information 00:12:44.720 |
than a picture that would allow you to step back 00:12:48.040 |
into that, where you were in that particular moment 00:12:51.800 |
in history and then map out a certain trajectory 00:12:58.520 |
That could open up all kinds of applications. 00:13:02.640 |
but honestly, to me, the exciting thing is just being heard. 00:13:10.920 |
- What I heard you say is you have an entirety 00:13:13.820 |
of lived experience, some of which you can communicate 00:13:21.440 |
which cannot be captured in those communication modalities. 00:13:27.640 |
both the things you can try to articulate in words, 00:13:31.520 |
using one word, for example, to communicate focus, 00:13:34.560 |
when it really may be represented in a 20 dimensional space 00:13:42.560 |
So it's a closer representation to the entirety 00:13:46.200 |
of your experience captured in a dynamic fashion 00:13:56.440 |
That was the feeling, and it felt like the future. 00:14:17.940 |
people should listen to it, Huberman Lab Podcast. 00:14:28.440 |
and I'm a data person, machine learning person. 00:14:43.520 |
and all the different metrics that come from that 00:14:52.000 |
is how this could be turned into a tool for science. 00:14:59.760 |
how am I doing my personal metrics of health, 00:15:04.040 |
but doing larger scale studies of human behavior and so on. 00:15:07.760 |
So like data not at the scale of an individual, 00:15:28.320 |
- If you think about that second thing you said 00:15:40.600 |
like we, the human race has done a pretty good job 00:15:43.240 |
figuring out how to quantify the things around us 00:15:46.120 |
from distant stars to calories and steps and our genome. 00:15:51.120 |
So we can measure and quantify pretty much everything 00:16:08.880 |
but we haven't done this at population scale. 00:16:14.840 |
or human cognition is probably the single law, 00:16:35.180 |
And if you, when I think about it through that frame, 00:16:42.260 |
It's almost like we live in this wild, wild west 00:16:46.140 |
of unquantified communications within ourselves 00:16:53.380 |
I mean, for example, I know if I buy an appliance 00:16:58.420 |
I don't need to look at the measurements on the appliance 00:17:03.040 |
That's an engineered system of appliance manufacturing 00:17:07.120 |
Everyone's agreed upon engineering standards. 00:17:10.640 |
And we don't have engineering standards around cognition. 00:17:26.560 |
politics, economics, education, all the above. 00:17:29.600 |
And so to me, the most significant contribution 00:17:37.900 |
the introduction of the formal engineering of cognition 00:17:53.120 |
as being in a crude way, reduced down to like tweets 00:17:58.920 |
So it's a very hardcore, many scale compression 00:18:10.240 |
I think the first step is to get the brain data. 00:18:19.160 |
to sort of interpreting that data in terms of cognition? 00:18:24.440 |
to start collecting data at scale from the brain, 00:18:27.760 |
and then we start to really be able to take little steps 00:18:43.080 |
but we don't understand most of what makes up cognition. 00:18:47.280 |
- This has been one of the most significant challenges 00:18:51.240 |
And kernel wouldn't exist if I wasn't able to fund it 00:18:55.720 |
Because when I engage in conversations with investors, 00:18:59.240 |
the immediate thought is what is the killer app? 00:19:04.960 |
That's what they're looking at; they're looking to de-risk. 00:19:04.960 |
there was no known path to even build a technology 00:19:26.920 |
we could commence having conversations with investors 00:19:33.240 |
So I funded the first $53 million for the company 00:19:38.960 |
the first one we did, I spoke to 228 investors. 00:20:12.680 |
If it meets the criteria of being a mass market device, 00:20:42.720 |
who do experiments of certain data collection parameters, 00:20:48.000 |
These are all critical elements that ignite discovery. 00:21:03.480 |
of how do we try to generate 20 years of value discovery 00:21:28.600 |
it would be one alien civilization in that equation. 00:21:31.240 |
So meaning like this is in search of an application 00:21:37.360 |
We need to come up with a better term than killer app. 00:21:46.680 |
It's some very inspiringly impactful application. 00:21:57.140 |
I dislike the chosen words in capturing the concept. 00:22:12.720 |
especially when you're talking about software 00:22:14.600 |
and hardware and artificial intelligence applications, 00:22:19.020 |
I actually regret now having called attention 00:22:24.200 |
because it's something I would not normally do. 00:23:15.840 |
we flag it and say, human intuition alert, stop it. 00:23:20.480 |
And so we really want to focus on the algorithm 00:23:23.120 |
of there's a natural process of human discovery 00:23:30.480 |
and you give people the opportunity to play around with it 00:23:35.440 |
we are thinking that is a much better system of discovery 00:23:45.920 |
where I was speaking to this one young associate professor 00:23:50.960 |
hey, we have these five data streams that we're pulling off. 00:23:57.120 |
what weighted value do you add to each data source? 00:23:59.360 |
Like which one do you think is gonna be valuable 00:24:03.160 |
And he said, I don't care, just give me the data. 00:24:05.760 |
All I care about is my machine learning model. 00:24:08.080 |
But importantly, he did not have a theory of mind. 00:24:20.800 |
that certain people are devaluing human intuitions 00:24:40.380 |
make the product such that the collection of data 00:24:52.880 |
- Our objective is to create the most valuable 00:25:01.360 |
And with that, then applying all the best tools 00:25:15.880 |
But yes, our objective is really to systematize 00:25:19.240 |
because we can't put definite timeframes on discovery. 00:25:30.120 |
And so we really need to figure out how to... 00:25:36.880 |
That's why most of the time it just languishes 00:25:46.060 |
and make this mainstream in the coming years. 00:25:54.020 |
for millions of people to put this on their head 00:26:04.740 |
Is it going to be people just kind of organically, 00:26:08.080 |
or is there going to be an Angry Birds-style application 00:26:08.080 |
when I started wearing a wearable on my wrist 00:26:43.180 |
They told me, for example, if I eat close to bedtime, 00:26:52.540 |
you have all these follow-on consequences in life. 00:26:54.420 |
And so it opened up this window of understanding of myself 00:26:59.420 |
that I cannot self-introspect and deduce these things. 00:27:03.180 |
This is information that was available to be acquired, 00:27:07.100 |
I would have to get an expensive sleep study, 00:27:10.480 |
and that's not good enough to run all my trials. 00:27:17.300 |
and now you're applying it to the entire cortex of the brain, 00:27:21.660 |
and you say, what kind of information could we acquire? 00:27:25.060 |
It opens up a whole new universe of possibilities. 00:27:28.400 |
For example, we did this internal study at Kernel 00:27:32.580 |
and we were measuring the cognitive effects of sleep. 00:27:41.780 |
we performed four cognitive tasks over 13 sessions. 00:27:45.780 |
And we focused on reaction time, impulse control, 00:27:48.640 |
short-term memory, and then a resting state task. 00:28:17.100 |
whether or not I would be able to resist temptation 00:28:27.940 |
on your entire cortex, you can control the settings, 00:28:30.880 |
I think there's probably a large number of things 00:28:39.860 |
I just, for example, like when you read news, 00:28:44.820 |
- Like when you use social media, when you use news, 00:29:15.540 |
and place to do it is when you're behind a desk, 00:29:45.120 |
to the actual physical, meatspace effects on our body. 00:30:11.340 |
is going to be up to eight hours of deep work, 00:30:27.240 |
the ups and downs of that as you're doing programming, 00:30:29.440 |
as you're doing thinking about particular problems, 00:30:32.580 |
you're trying to visualize things in your mind, 00:30:58.100 |
I started meditating using his app, 00:31:09.880 |
you're removing all the noise from your head, 00:31:12.620 |
and you're very much, it's an active process of, 00:31:15.940 |
active noise removal, active noise canceling, 00:31:21.220 |
And it'd be interesting to see what is going on in the mind 00:31:29.220 |
- And in all of your examples, it's interesting that 00:31:32.300 |
everyone who's designed an experience for you, 00:31:35.380 |
so whether it be the meditation app or the deep work, 00:31:47.020 |
- Now, what if we expanded the number of knowns by 10X, 00:31:55.500 |
So it'd be, and so this is the dimensionality 00:32:00.300 |
is that people will be able to use this quantification, 00:32:04.620 |
use this information to build more effective products. 00:32:09.260 |
And this is, I'm not talking about better products 00:32:13.980 |
I'm talking about our focus is helping people, 00:32:20.340 |
and this quantification, and then to engage with others 00:32:26.060 |
That the objective is betterment across ourselves, 00:32:36.100 |
Like if you're building a podcast listening app, 00:32:38.020 |
it would be nice to know data about the listener 00:32:44.540 |
It's like really dumb, just very simple applications 00:32:48.380 |
that could just improve the quality of the experience 00:32:51.980 |
- I'm imagining if you have your neuron, this is Lex, 00:32:56.660 |
and there's a statistical representation of you, 00:33:02.540 |
"Lex, your best to engage with this meditation exercise 00:33:10.980 |
"At this time of day, after eating this kind of food 00:33:13.940 |
"or not eating, fasting, with this level of blood glucose 00:33:33.260 |
And so the question is, how much do we really know 00:33:38.500 |
And I would venture to guess in my life experience, 00:33:41.220 |
my self-awareness captures an extremely small percent 00:33:50.500 |
- Well, in some sense, the data would help encourage you 00:33:55.700 |
not just because you trust everything the data is saying, 00:33:58.600 |
but it'll give you a prod to start investigating. 00:34:12.300 |
it's probably important to do without the data, 00:34:37.380 |
- This is a safe space. - You're in a safe space. 00:34:40.420 |
- No, I definitely have much less self-control at night 00:35:11.420 |
I'm not gonna go on a whole rant about nutrition science, 00:35:18.380 |
but nutrition science is a very difficult field of study 00:35:28.460 |
And it's so difficult from a scientific perspective 00:35:32.780 |
that you have to be almost like a scientist of one. 00:35:39.060 |
That's the best way to understand what works for you or not. 00:35:41.740 |
And I don't understand why, 'cause it sounds unhealthy, 00:35:44.580 |
but eating only meat always makes me feel good. 00:35:49.740 |
And I don't have any allergies, any of that kind of stuff. 00:35:54.560 |
where if he deviates a little bit from the carnivore diet, 00:36:24.180 |
And I think that repeats itself in all kinds of experiences. 00:36:39.780 |
the impact it has on my mind and the clarity of mind 00:36:45.820 |
and all those kinds of things, I feel really good. 00:36:48.140 |
And to be able to concretely express that through data 00:36:53.820 |
It would be a nice reminder, almost like a statement. 00:37:09.100 |
that make me feel really good and make me feel not good. 00:37:20.140 |
from being responsible for constructing my diet. 00:37:24.640 |
where I now track over 200 biomarkers every 90 days. 00:37:30.740 |
the things you would expect like cholesterol, 00:37:39.380 |
And then I let that data generate the shopping list. 00:37:42.080 |
And so I never actually ask my mind what it wants. 00:37:45.580 |
It's entirely what my body is reporting that it wants. 00:37:48.340 |
And so I call this goal alignment within Bryan. 00:37:55.220 |
And so I'm asking my liver, how are you doing? 00:38:02.220 |
and I only eat those foods until my next testing round. 00:38:06.140 |
And that has changed my life more than I think anything else 00:38:17.940 |
it led me astray because like you were saying, 00:38:22.460 |
and it navigates the dozens of different dietary regimens 00:38:32.740 |
in certain contextual settings, but it's not N of one. 00:38:35.680 |
And like you're saying, this dietary really is an N of one. 00:38:39.380 |
What people have published scientifically, of course, 00:38:46.420 |
but it changes when you get to an N of one level. 00:38:48.540 |
And so that's what gets me excited about brain interfaces 00:38:54.600 |
where I can stop asking my conscious mind for its advice 00:39:03.540 |
And I've never had better health markers in my life 00:39:13.900 |
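A toy sketch of the "let the measurements generate the shopping list" loop described above; the marker names, thresholds, and food choices are invented placeholders for illustration, not Bryan Johnson's actual protocol.

```python
# A toy mapping from biomarker readings to a shopping list, illustrating the idea
# of letting measurements (not cravings) pick the foods. Markers, thresholds, and
# foods below are hypothetical placeholders.
MARKER_RULES = {
    # marker: (threshold, foods to emphasize if the reading is flagged)
    "ldl_cholesterol_mg_dl": (100, ["oats", "walnuts", "olive oil"]),
    "hs_crp_mg_l":           (1.0, ["broccoli", "blueberries", "salmon"]),
    "ferritin_low_ng_ml":    (30,  ["lentils", "spinach"]),  # treated as a floor, not a ceiling
}

def shopping_list(readings: dict[str, float]) -> list[str]:
    """Turn one 90-day testing round into a food list, ignoring preference entirely."""
    basket: list[str] = []
    for marker, (threshold, foods) in MARKER_RULES.items():
        value = readings.get(marker)
        if value is None:
            continue
        flagged = value < threshold if marker.endswith("_low_ng_ml") else value > threshold
        if flagged:
            basket.extend(foods)
    return sorted(set(basket))

print(shopping_list({"ldl_cholesterol_mg_dl": 128, "hs_crp_mg_l": 0.6, "ferritin_low_ng_ml": 22}))
# -> ['lentils', 'oats', 'olive oil', 'spinach', 'walnuts']
```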
is such a sort of engineering way of phrasing 00:39:26.120 |
By the way, testing round, what does that look like? 00:39:39.020 |
I do a food inflammation, a diet-induced inflammation. 00:39:45.440 |
So foods that produce inflammatory reactions. 00:39:52.140 |
I do, yeah, there's several micronutrient tests 00:39:57.480 |
to see how I'm looking at the various nutrients. 00:39:59.440 |
- What about like self-report of like how you feel? 00:40:09.200 |
you still exist within your conscious mind, right? 00:40:12.560 |
So that lived experience is of a lot of value. 00:40:17.960 |
- I do a temporal sampling over some duration of time. 00:40:20.960 |
So I'll think through how I feel over a week, 00:40:27.580 |
if I'm at the grocery store in front of a cereal box 00:40:30.960 |
Captain Crunch is probably the right thing for me today 00:40:33.320 |
'cause I'm feeling like I need a little fun in my life. 00:40:38.880 |
then I smooth out the function of my natural oscillations 00:40:53.360 |
if you're looking at health over a 90-day period of time, 00:41:02.400 |
this is how I'm doing and this is what I want. 00:41:04.640 |
And so it really is an accounting system for everybody. 00:41:09.000 |
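A minimal sketch of what smoothing out those day-to-day oscillations over a 90-day accounting window could look like, assuming made-up daily 1-10 self-report ratings and a simple trailing average.

```python
import random

def moving_average(values: list[float], window: int = 7) -> list[float]:
    """Smooth a daily self-report series with a simple trailing mean."""
    smoothed = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

# Hypothetical 90 days of 1-10 "how do I feel" ratings.
random.seed(0)
daily = [random.randint(4, 9) for _ in range(90)]
weekly_trend = moving_average(daily, window=7)
print(f"raw days 1-7: {daily[:7]}")
print(f"smoothed day 7: {weekly_trend[6]:.2f}")
```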
if you think about the future of being human, 00:41:13.760 |
there's two things I think that are really going on. 00:41:25.680 |
over a certain design, over a certain timeframe. 00:41:35.600 |
We are now designing our own intelligence systems. 00:41:45.080 |
- Design, manufacturing, distribution of intelligent 00:41:54.520 |
And evolution is doing the design, manufacturing, 00:41:59.840 |
And now we are doing the design, manufacturing, 00:42:06.560 |
That's a very nice way of looking at life on Earth. 00:42:23.360 |
our existence becomes a goal alignment problem, 00:42:45.280 |
it's not just my conscious mind, which is opining, 00:42:50.160 |
And there's a whole bunch of more voices involved. 00:43:00.000 |
the things that we spend high energy on today 00:43:08.600 |
the terms and conditions of intelligent life. 00:43:11.160 |
Now we say conscious existence because we're biased 00:43:15.040 |
but it will be the largest computational exercise in history 00:43:18.640 |
'cause you're now doing goal alignment with planet Earth, 00:43:24.040 |
within all the intelligent agents we're building, 00:43:29.400 |
You basically have trillions and trillions of agents 00:43:32.000 |
working on the negotiation of goal alignment. 00:43:50.160 |
We're negotiating the terms and conditions of existence. 00:43:55.040 |
- Do you worry about the survival of this process? 00:43:59.280 |
That life as we know it on Earth comes to an end 00:44:09.600 |
something happens where all of that intelligence 00:44:13.400 |
is thrown in the trash by something like nuclear war 00:44:17.600 |
or development of AGI systems that are very dumb. 00:44:20.800 |
Not AGI, I guess, but AI systems, the paperclip thing, 00:44:25.360 |
en masse is dumb but has unintended consequences 00:44:32.320 |
- I mean, it's unsurprising that a new thing comes into 00:44:45.480 |
Our amygdala fires up and says, "Scary, foreign. 00:44:53.400 |
And so it makes sense from a biological perspective 00:45:00.940 |
What I don't think has been properly weighted with that 00:45:15.160 |
that has been able to look out over their expected lifetime 00:45:18.880 |
and see there is a real possibility of evolving 00:45:25.880 |
So different that it would be totally unrecognizable 00:45:35.920 |
We can't, you can't look in the sky and see that. 00:45:38.160 |
Thing that is shining, we're gonna go up there. 00:45:40.920 |
You cannot even create an aspirational statement about it. 00:45:45.920 |
And instead, we've had this knee jerk reaction of fear 00:45:53.720 |
But in my estimation, this should be the defining aspiration 00:45:58.720 |
of all intelligent life on Earth that we would aspire. 00:46:04.800 |
That basically every generation surveys the landscape 00:46:11.200 |
and other contextual situation that they're in. 00:46:21.920 |
We should carefully think this thing through, 00:46:24.200 |
not just of mitigating the things that'll wipe us out, 00:46:31.760 |
even though it's within this realm of possibilities. 00:46:46.120 |
I mean, there's kind of a fog through which you can't see. 00:47:05.560 |
and become much more like collective intelligences, 00:47:18.880 |
And so it's almost, I mean, it almost feels like 00:47:33.040 |
how much that actually changed people or societies. 00:47:54.760 |
like plugging our brains into this whole thing, 00:48:01.320 |
And it seems like it's a fog, you don't know. 00:48:09.000 |
whatever comes to be could destroy the world. 00:48:14.040 |
But it also could transform in ways that creates 00:48:24.520 |
that's unlike anything we've ever experienced. 00:48:27.360 |
It might involve dissolution of ego and consciousness 00:48:32.640 |
It might be more, that might be a certain kind of death, 00:48:38.920 |
But the experience might be really exciting and enriching. 00:48:47.040 |
a bunch of sort of hypothetical questions of, 00:48:50.480 |
would it be more fulfilling to live in a virtual world? 00:49:14.080 |
but we still want to live in the physical world, 00:49:16.040 |
have friendships and relationships in the physical world. 00:50:28.200 |
because it's so much the in-group, out-group thing. 00:50:33.820 |
it was not only that I desperately wanted lights out forever, 00:50:37.320 |
it was that I couldn't have lights out forever. 00:50:45.480 |
that would either penalize or reward you for your behaviors. 00:50:50.480 |
And so it was almost like this indescribable hopelessness 00:51:08.600 |
like, yes, I have no remorse for lights being out, 00:51:13.300 |
and actually wanted more than anything in the entire world. 00:51:16.360 |
There are other times where I'm looking out at the future 00:51:21.240 |
and I say, this is an opportunity for future, 00:51:42.600 |
I don't know what it's like to be in your head, 00:51:45.740 |
but in my head, when I wake up in the morning, 00:51:48.960 |
I don't say, good morning, Bryan, I'm so happy to see you. 00:51:52.880 |
Like, I'm sure you're just gonna be beautiful to me today. 00:51:59.120 |
You're not gonna repeat that list to me 400 times. 00:52:06.040 |
You're just gonna just help me along all day long. 00:52:08.880 |
I mean, it's a brutal environment in my brain. 00:52:11.800 |
And we've just become normalized to this environment 00:52:15.560 |
that we just accept that this is what it means to be human. 00:52:20.560 |
if we try to muster as much soberness as we can 00:52:24.080 |
about the realities of being human, it's brutal. 00:52:28.240 |
And so, am I sad that the brain may be off one day? 00:52:42.440 |
And this is why, again, I don't trust my conscious mind. 00:52:51.520 |
And then I figured out it was not a real reality. 00:53:06.400 |
and how my brain is distorting reality all the time. 00:53:13.600 |
I don't trust realities that are given to me. 00:53:15.840 |
And so, to try to make a decision on what I value 00:53:22.000 |
- So, not fully listening to the conscious mind 00:53:31.040 |
but allowing you to go up and down as it does, 00:53:35.320 |
- Yes, I assume that whatever my conscious mind 00:53:43.240 |
And I just need to figure out where it's wrong, 00:53:46.920 |
and then try to correct for it as best I can. 00:53:55.600 |
- Is there something you can say by way of advice 00:54:00.880 |
when the conscious mind serves up something that, 00:54:08.960 |
like how in your own life you've overcome that, 00:54:11.080 |
and others who are experienced in that can overcome it? 00:54:33.040 |
about suggestion of the hopelessness of life, 00:55:01.720 |
to be able to say, "Thank you, you're not real. 00:55:08.640 |
- And so I'm in a situation where for whatever reason 00:55:21.240 |
And when I was trying to solve my depression, 00:55:25.920 |
I tried it systematically and nothing would fix it. 00:55:29.360 |
And so this is what gives me hope with brain interfaces. 00:55:32.320 |
For example, could I have numbers on my brain? 00:55:55.060 |
yeah, it opens up the possibility of really helping 00:56:03.480 |
the ways, the ups and downs of those dark moments. 00:56:18.280 |
it's almost like a chemistry thing, a biochemistry thing, 00:56:26.080 |
I'll look at like this cup and I'll be overcome with joy 00:56:34.120 |
Like I actually think my biochemistry is such 00:56:44.680 |
Like it's, and it's not a rational thing at all. 00:56:56.760 |
the meditative experience will allow you to sort of, 00:57:01.120 |
like the movement of your hand as deeply joyful 00:57:10.040 |
and I'll just be like, "Fuck, life is awesome." 00:57:24.440 |
There's no rational, it doesn't fit with the rest of my life. 00:57:28.120 |
I have all this shit, I'm always late to stuff. 00:57:34.320 |
like really self-critical about everything I do. 00:57:39.840 |
But there's this engine of joy for life outside of all that. 00:57:45.200 |
And the flip side of that is what depression probably is, 00:57:53.480 |
'cause I bet you that feeling of the cup being amazing 00:58:02.720 |
you're in a desert and it's a drink of water. 00:58:09.320 |
it would be nice to understand where that's coming from. 00:58:13.440 |
To be able to understand how you hit those lows 00:58:27.800 |
Maybe it could be just like basic habits that you engage in 00:58:35.600 |
- And this goes back to the discussion we're having 00:58:41.160 |
the largest input of raw material into society. 00:59:10.920 |
compound or problems that we're experiencing? 00:59:30.200 |
if we can assign some numbers to these things 00:59:40.160 |
in how we conduct our lives and how we build society, 00:59:42.800 |
it might be the thing that enables us to scaffold. 01:00:05.320 |
We are the one part of this intelligence infrastructure 01:00:22.640 |
but the wildest experience, which is psychedelics. 01:00:33.440 |
which is exciting from a scientific perspective 01:00:42.400 |
And how can data about this help us understand it? 01:01:04.080 |
that are convinced that they've actually met the elves. 01:01:17.280 |
- Yeah, I think they're very critical as friends. 01:01:30.160 |
So there's a bunch of different versions of trolls. 01:01:38.640 |
And there's trolls that just enjoy your destruction. 01:01:42.640 |
And I think they're the ones that care for you. 01:01:53.200 |
- Yeah, a bit of a, and the whole point is on psychedelics 01:01:58.880 |
this is where the brain data versus word data fails, 01:02:08.200 |
Most people that, you can be poetic and so on, 01:02:16.160 |
- To me, what baselines this conversation is, 01:02:18.920 |
imagine if we were interested in the health of your heart 01:02:23.040 |
and we started and said, okay, Lex, self-introspect, 01:02:31.680 |
and you think, feels all right, like things feel okay. 01:02:41.000 |
And you're like, well, actually what I'd really like you 01:02:49.360 |
And there's like five to 10 studies you would do. 01:02:53.240 |
They would then give you this report and say, 01:03:03.120 |
and maybe I'll put you on a statin, et cetera. 01:03:08.560 |
You would think the cardiologist is out of their mind 01:03:11.560 |
if they just gave you a bottle of statins based upon, 01:03:14.320 |
you're like, well, I think something's kind of wrong. 01:03:16.240 |
And they're just kind of experiment and see what happens. 01:03:20.080 |
But that's what we do with our mental health today. 01:03:27.440 |
to have, again, to be able to measure the brain 01:03:31.540 |
and then to measure during a psychedelic experience 01:03:37.160 |
you now have a quantification of what's going on. 01:03:41.760 |
what molecule is appropriate at what dosages, 01:03:45.080 |
at what frequency, in what contextual environment? 01:03:47.760 |
What happens when I have this diet with this molecule, 01:03:57.780 |
what we could potentially do with psychedelics 01:04:05.260 |
And it may improve the outcomes people experience, 01:04:11.600 |
And so that's what I hope we are able to achieve. 01:04:24.120 |
when we talk about things related to the mind, 01:04:32.020 |
because we can't talk about a marker in the brain. 01:04:45.480 |
instead of the modalities being the thing we talk about. 01:04:47.680 |
Meditation just does good things in a crude fashion. 01:05:09.520 |
First principles are an understanding of system laws. 01:05:14.480 |
So if you take, for example, like in Sherlock Holmes, 01:05:18.840 |
So he says, "Once you've eliminated the impossible, 01:05:23.840 |
"anything that remains, however improbable, is true." 01:05:32.200 |
Dirk Gently, the holistic detective by Douglas Adams, says, 01:05:38.240 |
So when someone says from a first principles perspective, 01:05:42.320 |
and they're trying to assume the fewest number of things 01:06:01.620 |
that would maximally increase the probability 01:06:03.600 |
that the human race thrives beyond what we can even imagine. 01:06:07.160 |
And I found that in my conversations with others, 01:06:10.640 |
in the books I read, in my own deliberations, 01:06:20.600 |
yeah, I didn't feel like the future could be deduced 01:06:36.400 |
- I think it's my favorite book I've ever read. 01:06:38.240 |
- It's also a really interesting number, zero. 01:06:44.280 |
I didn't realize that it caused a revolution in philosophy 01:07:05.120 |
a representation of a zero principle thinking, 01:07:12.260 |
And so when you talk about what kind of ideas 01:07:28.040 |
I was wanting to find a quantitative structure 01:07:31.960 |
on how to think about these zeroth principles. 01:07:40.800 |
And so now it's a staple as part of how I think 01:07:51.260 |
Essentially trying to identify what is impossible 01:08:00.960 |
is most of society tells you the range of things 01:08:08.000 |
I mean, that's the whole process of this kind of thinking 01:08:17.800 |
trying to draw the lines of what is actually impossible 01:08:22.200 |
because very few things are actually impossible. 01:08:29.440 |
Like it's the Joe Rogan is entirely possible. 01:08:45.360 |
- Yeah, life constraints favor first principles thinking 01:09:02.560 |
And so in a society constrained by resources, 01:09:14.520 |
But the reason why I think zero principle thinking 01:09:16.800 |
should be a staple of our shared cognitive infrastructure 01:09:21.800 |
is if you look to the history of the past couple of thousand 01:09:29.960 |
we subjectively try to assess what is a zero level idea. 01:09:33.840 |
And we say, how many have occurred on what time scales 01:09:37.900 |
and what were the contextual settings for it? 01:09:55.720 |
playing Go with AlphaGo being from another dimension. 01:10:04.360 |
has an attribute of introducing zero-like insights, 01:10:09.360 |
then if you say, what is going to be the occurrence 01:10:26.360 |
this computational intelligence throughout society 01:10:29.000 |
that the manufacturing design and distribution 01:10:31.120 |
of intelligence is now heading towards zero, 01:10:33.800 |
you have an increased number of zeros being produced 01:10:37.060 |
with a tight connection between human and computers. 01:10:43.320 |
we cannot predict the future with first principle thinking. 01:10:47.520 |
We can't, that cannot be our imagination set. 01:10:55.400 |
that basically the future of our conscious existence, 01:11:06.960 |
you're referring to basically a truly revolutionary idea. 01:11:11.500 |
- Yeah, something that is currently not a building block 01:11:23.100 |
Yeah, it's currently not manifest in what we acknowledge. 01:11:28.520 |
- So zero principle thinking is playing with ideas 01:11:32.440 |
that are so revolutionary that we can't even clearly reason 01:11:37.440 |
about the consequences once those ideas come to be. 01:11:55.040 |
That basically building upon what Newton had done 01:11:59.080 |
and said, yes, also, and it just changed the fabric 01:12:19.060 |
And so to your point, there's this question of, 01:12:37.460 |
And first principles and zero principle thinking 01:12:45.100 |
try to create probabilities for these things. 01:12:50.660 |
if they were just part of our standard thought processes, 01:12:58.340 |
in what we do individually, collectively as a society, 01:13:05.700 |
- Yeah, I've been engaged in that kind of thinking 01:13:16.200 |
I think it's possible in the language that we're using here. 01:13:19.240 |
And it's very difficult to reason about a world 01:13:21.720 |
when inklings of consciousness can be engineered 01:13:35.600 |
I believe a good step towards engineering consciousness 01:13:39.100 |
is creating the illusion of consciousness. 01:14:00.720 |
but I think that's what we kind of do to each other. 01:14:21.240 |
I create my consciousness by having interacted with you. 01:14:40.840 |
really wanna believe that we possess this thing 01:14:52.700 |
and they're feeding this subjective experience to us, 01:15:02.180 |
that we construct to make social communication 01:15:10.600 |
you can create some very fulfilling experiences in software. 01:15:14.840 |
And so that to me is a compelling space of ideas to explore. 01:15:19.600 |
And I think going back to our experience together 01:15:23.280 |
you could imagine if we get to a certain level of maturity. 01:15:33.140 |
That has a certain amount of information transfer rate 01:15:39.300 |
And so in our communication with people via email 01:15:42.780 |
we've taken the bandwidth of human interaction, 01:15:46.900 |
the information transfer rate, and we've reduced it. 01:15:53.020 |
There's a lot more opportunity for misunderstanding. 01:15:59.420 |
And if we add brain interfaces to the equation, 01:16:20.520 |
I can imagine what it might be like to be Lex 01:16:31.560 |
that Lex is experiencing as he looks at this cup, 01:16:41.880 |
which is entirely unique from how I experienced joy, 01:16:46.080 |
that we're having some kind of similar experience. 01:16:50.120 |
we do consciousness engineering today in everything we do. 01:16:52.820 |
When we talk to each other, when we're building products 01:17:23.440 |
- Yeah, and it's funny you focus on human to human, 01:17:40.560 |
Right now we're putting humans as the central node. 01:17:44.560 |
What if we gave GPT-3 a bunch of human brains 01:17:48.640 |
and said, "Hey, GPT-3, learn some manners when you speak 01:17:56.560 |
and see how they respond so you can be polite 01:18:01.760 |
and so that you can be conversationally appropriate." 01:18:04.680 |
But to inverse it, to give our machines a training set 01:18:24.540 |
teach it some, have it read the founding documents 01:19:14.200 |
- Yeah, they don't want to manipulate each other for sure. 01:19:20.880 |
- So that's, I mean, you kind of spoke to ethics. 01:19:48.200 |
There's a leap of faith, there's a leap of trust 01:19:55.360 |
And then the challenge is when you're in the digital space, 01:20:22.160 |
- I think we got off to a wrong start with the internet, 01:20:48.120 |
And you don't need to tell them what you're doing with it. 01:20:53.880 |
but the game is who can acquire the most information 01:21:09.800 |
the individual always has control over their data. 01:21:17.560 |
but they can just go out and grab as much as they want. 01:21:20.240 |
So for example, when your brain data was recorded today, 01:21:27.760 |
And so it's individual consent, it's individual control, 01:21:33.320 |
But it has to be based upon some clear rules of play. 01:21:43.800 |
So everybody knows, what does control look like? 01:21:48.920 |
- Yeah, delete it and to know who it's being shared with 01:21:53.240 |
We haven't reached that level of sophistication 01:21:55.800 |
with our products of, if you say, for example, 01:21:59.840 |
hey, Spotify, please give me a customized playlist 01:22:15.280 |
We haven't gotten there to that level of sophistication, 01:22:16.960 |
but these are ideas we need to start talking about 01:22:19.360 |
of how would you actually structure permissions? 01:22:23.080 |
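A hypothetical sketch of how such a permission grant could be structured, in the spirit of the Spotify example above; the fields, scopes, and behavior are assumptions for illustration, not an existing standard or product.

```python
# A rough sketch of a user-controlled permission grant for brain (or other
# personal) data. Every field and scope name here is hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class DataGrant:
    grantee: str              # e.g. "spotify"
    scopes: set[str]          # e.g. {"attention_summary"}; raw data deliberately not included
    purpose: str              # what the grantee may use it for
    expires_at: datetime
    revoked: bool = False

    def allows(self, scope: str, now: datetime | None = None) -> bool:
        """The individual keeps control: a grant works only while unexpired and unrevoked."""
        now = now or datetime.now(timezone.utc)
        return not self.revoked and now < self.expires_at and scope in self.scopes

grant = DataGrant(
    grantee="spotify",
    scopes={"attention_summary"},
    purpose="customize a playlist for my current state",
    expires_at=datetime.now(timezone.utc) + timedelta(hours=1),
)
print(grant.allows("attention_summary"))  # True
grant.revoked = True                      # the user withdraws consent
print(grant.allows("attention_summary"))  # False
```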
- And I think it creates a much more stable set 01:22:26.280 |
for society to build where we understand the rules of play 01:22:31.120 |
and people aren't vulnerable to being taken advantage of. 01:22:33.880 |
It's not fair for an individual to be taken advantage of 01:22:39.760 |
without their awareness, with some other practice 01:22:42.240 |
that some company is doing for their sole benefit. 01:22:44.640 |
And so hopefully we are going through a process now 01:22:56.160 |
these are fundamentals we need to have in place. 01:23:01.280 |
- It's kind of fun to think about like in Chrome, 01:23:05.000 |
when you install an extension or like install an app, 01:23:07.800 |
it asks you like what permissions you're willing to give 01:23:11.680 |
it says like you can have access to my brain data. 01:23:15.120 |
- I mean, it's not unimaginable in the future 01:23:20.400 |
that the big technology companies have built a business 01:23:26.200 |
that they can then create a model of you 01:23:29.480 |
And so it's not unimaginable that you will create 01:23:33.920 |
a more reliable predictor of you than they could. 01:23:40.240 |
And you're the one that gets to negotiate that with them 01:23:44.000 |
But it's not unimaginable that might be the case. 01:23:55.400 |
that has, that's also excited about the brain. 01:23:59.200 |
So it'd be interesting to hear your kind of opinions 01:24:01.480 |
about a very different approach that's invasive, 01:24:05.520 |
that implants a data collection device in the brain. 01:24:10.240 |
between Kernel and Neuralink in the approaches 01:24:17.480 |
- Elon and I spoke about this a lot early on. 01:24:23.040 |
and he had an interest in brain interfaces as well. 01:24:39.160 |
at this very early time where it wasn't certain 01:24:51.080 |
and then the technological choice of doing that. 01:24:55.000 |
our starting point was looking at invasive technologies. 01:24:58.240 |
And I was building invasive technology at the time. 01:25:04.920 |
A little less than a year after Elon and I were engaged, 01:25:11.480 |
And we had this neuroscientist come to Kernel. 01:25:16.040 |
he had been doing neurosurgery for 30 years, 01:25:17.680 |
one of the most respected neuroscientists in the US. 01:25:23.520 |
And at the very end of our three hour conversation, 01:25:30.240 |
"a new technology comes along that changes everything." 01:25:40.800 |
I thought, 'cause I had spoken to Bob Greenberg 01:25:44.560 |
who had built Second Sight, first on the optic nerve 01:25:48.240 |
and then he did an array on the visual cortex. 01:25:53.240 |
And then I also became friendly with NeuroPace 01:25:57.240 |
who does the implants for seizure detection and remediation. 01:26:08.960 |
an implantable device through for a 15 year run. 01:26:18.960 |
And I really didn't want to build invasive technology. 01:26:22.260 |
It was the only thing that appeared to be possible. 01:26:31.920 |
"Is there anything here that again has the characteristics 01:26:37.220 |
"it could be low cost, it could be accessible. 01:26:55.100 |
and they're now, you experienced one of them today. 01:27:02.560 |
So we're trying to figure out what value is really there. 01:27:04.640 |
But I'd say it's really too early to express confidence. 01:27:19.240 |
- Yeah, timescales are really important here. 01:27:21.480 |
Because if you look at the, like on the invasive side, 01:27:27.000 |
of less invasive techniques to get at the neuron firings, 01:27:47.480 |
It may be the case that Neuralink has properly chosen 01:27:53.940 |
And it's also possible that the paths we've chosen 01:27:55.720 |
for non-invasive fall short for a variety of reasons. 01:28:00.640 |
And so right now, the two technologies we chose, 01:28:03.240 |
the analogy I'd give you to create a baseline 01:28:11.360 |
the internet became useful when people could do 01:28:15.800 |
And then the paid, and then as bandwidth increased, 01:28:22.920 |
And so if you say what Kernel Flow is going to give you 01:28:46.140 |
And Neuralink is gonna give you a circle on the screen 01:29:06.280 |
were basically the answer for the next seven years. 01:29:20.880 |
- It's kind of fascinating to think about that. 01:29:23.520 |
You don't, it's very true that you don't know. 01:29:36.240 |
we will look back and maybe not even remember one of them. 01:29:50.040 |
It's like you're marching ahead into the darkness, 01:30:00.520 |
anything that's off the shelf right now is inadequate. 01:30:12.880 |
The room-sized machines are on their own path. 01:30:29.600 |
And that's kind of what it takes to have a go at this. 01:30:34.920 |
I mean, at Kernel, we are from the photon and the atom 01:30:50.240 |
And that's what it takes to build next generation, 01:30:53.200 |
to make an attempt at breaking into brain interfaces. 01:30:57.560 |
And so we'll see over the next couple of years, 01:31:01.680 |
or whether something else comes along in seven to 10 years, 01:31:04.440 |
which is the right thing that brings it mainstream. 01:31:11.080 |
or a fellow traveler along the path of uncertainty, 01:31:21.800 |
is how many companies are going to be invited 01:31:30.520 |
Because if you think the hardware just starts the process. 01:31:39.400 |
you say, okay, now I can get measurements on the body. 01:31:47.760 |
And they were in the market for a certain duration of time 01:31:50.120 |
and Google bought them for two and a half billion dollars. 01:31:55.600 |
There weren't people building on top of the Fitbit device. 01:32:07.120 |
you have value in the device that someone buys, 01:32:09.160 |
but also you have everyone who's building on top of it. 01:32:13.440 |
And then you have additional data streams that come in, 01:32:17.480 |
And so if you say, if you look at the hardware 01:32:27.360 |
five or 10% of the value of the overall ecosystem. 01:32:33.280 |
the mainstream adoption of quantifying the brain. 01:33:02.040 |
built upon this ecosystem we've created to better your life. 01:33:21.520 |
the most exhilarating opportunity we could ever pursue 01:33:26.520 |
- You founded the payment system Braintree in 2007 01:33:35.160 |
And then in that same year was acquired by PayPal, 01:33:45.400 |
and the challenge of building an online payment system 01:33:48.120 |
and just building a large successful business in general? 01:34:02.720 |
And I came back to the US and I was shocked by the opulence 01:34:09.480 |
And I just thought this is, I couldn't believe it. 01:34:12.040 |
And I decided I wanted to try to spend my life 01:34:20.360 |
versus making money and whatever the case may be 01:34:28.680 |
by the age of 30 to never have to work again. 01:34:42.480 |
And so in that process, I started a few companies, 01:34:49.800 |
In one of the endeavors, I was up to my eyeballs in debt. 01:34:54.080 |
and I needed a part-time job to pay my bills. 01:35:02.880 |
where I was living, the 50 richest people in Utah. 01:35:05.760 |
And I emailed each one of their assistants and said, 01:35:10.440 |
I'll do anything, I'll just wanna, I'm entrepreneurial. 01:35:17.760 |
And then I interviewed at a few dozen places. 01:35:28.040 |
that I saw this job posting for credit card sales 01:35:52.000 |
But it's kind of like, if you could fog a mirror, 01:35:55.000 |
like come and do this because it's 100% commission. 01:35:57.240 |
And so I started walking up and down the street 01:35:59.200 |
in my community, selling credit card processing. 01:36:02.440 |
And so what you learn immediately in doing that is 01:36:07.680 |
first of all, the business owner is typically there. 01:36:21.800 |
And so you have to overcome the initial get out. 01:36:27.360 |
when you say the word credit card processing, 01:36:31.240 |
because I have been taken advantage of dozens of times 01:36:41.080 |
'cause I was still working on my other startup 01:36:45.960 |
And so I figured out that the industry really was 01:36:55.040 |
Basically people promising things that were not reality. 01:36:58.480 |
And so I'd walk into a business, I'd say, look, 01:37:08.400 |
And then he'd usually crack a smile and say, okay, 01:37:12.640 |
And so I'd sit down, I just opened my book and I'd say, 01:37:22.640 |
It's like, you're gonna process your credit card. 01:37:26.360 |
You're gonna have someone who answers the call 01:37:28.280 |
when someone asks and just like the basic, like you're okay. 01:37:32.800 |
And then of course I went to the next business 01:37:51.080 |
It's very well, very well strategized and executed. 01:38:13.240 |
I would just make $62,000 a month of income passively 01:38:16.880 |
with these merchants processing credit cards. 01:38:20.520 |
And so that's when I thought I'm gonna create a company. 01:38:53.540 |
And so they just dealt with software they didn't like. 01:38:55.060 |
And so with Braintree, I thought the entry point 01:38:58.040 |
is to build software that engineers will love. 01:39:00.400 |
And if we can find the entry point via software 01:39:02.160 |
and make it easy and beautiful and just a magical experience 01:39:05.400 |
and then provide customer service on top of that, 01:39:08.720 |
What I was really going after though, it was PayPal. 01:39:11.560 |
They were the only company in payments making money 01:39:14.080 |
because they had the relationship with eBay early on. 01:39:22.240 |
They'd fund their account with their checking account 01:39:25.480 |
And then when they'd use PayPal to pay a merchant, 01:39:31.000 |
versus if you have coming from a credit card, 01:39:44.920 |
And so I knew PayPal really was the model to replicate, 01:39:48.320 |
but a bunch of companies had tried to do that. 01:39:50.400 |
They tried to come in and build a two-sided marketplace. 01:39:52.580 |
So get consumers to fund the checking account 01:39:56.640 |
but they'd all failed because building a two-sided 01:40:04.520 |
and get the best merchants in the whole world 01:40:32.360 |
So four years in our customer base included Uber, 01:40:47.200 |
some of the fastest growing tech companies in the world. 01:40:52.080 |
and they had done a remarkable job in building product. 01:40:55.080 |
There's then something very counterintuitive, 01:40:56.840 |
which is make public your private financial transactions 01:40:59.440 |
with people previously thought were something 01:41:11.000 |
'cause now people could fund their Venmo account 01:41:12.480 |
with their checking account, keep money in the account. 01:41:15.000 |
And then you could just plug Venmo in as a form of payment. 01:41:19.920 |
that we were getting the best merchants in the world. 01:41:25.480 |
They were both the up and coming millennials at the time 01:41:30.480 |
And so they came in and offered us an attractive number. 01:41:40.960 |
It wasn't to try to climb the Forbes billionaire list. 01:41:44.160 |
It was, the objective was I want to earn enough money 01:41:48.760 |
so that I can basically dedicate my attention 01:41:52.640 |
to doing something that could potentially be useful 01:41:58.800 |
And more importantly, that could be considered 01:42:01.880 |
to be valuable from the vantage point of 2050, 2100 and 2500. 01:42:06.880 |
So thinking about it on a few hundred year timescale. 01:42:13.120 |
And there was a certain amount of money I needed to do that. 01:42:17.040 |
So I didn't require the permission of anybody to do that. 01:42:21.000 |
And so that, what PayPal offered was sufficient 01:42:23.080 |
for me to get that amount of money to basically have a go. 01:42:25.760 |
And that's when I set off to survey everything 01:42:36.040 |
what one thing could I do that would actually have 01:42:42.840 |
And so it took me a little while to arrive at brain interfaces, 01:42:45.320 |
but payments in themselves are revolutionary technologies 01:42:54.640 |
Like let's not sort of, let's not forget that too easily. 01:43:08.160 |
who are now fascinated with the space of cryptocurrency. 01:43:13.680 |
And where payments are very much connected to this, 01:43:21.760 |
they also kind of connect that to not just purely 01:43:29.160 |
And they see Bitcoin as a way, almost as activism, 01:43:34.160 |
almost as a way to resist the corruption of centralized, 01:43:42.800 |
decentralizing control, whether that's Bitcoin 01:43:45.480 |
or other cryptocurrencies, they see that's one possible way 01:43:54.560 |
that are corrupt or are not respectful of human rights 01:43:59.780 |
What's your sense, just all your expertise with payments 01:44:09.720 |
for the future of Bitcoin or other cryptocurrencies 01:44:12.480 |
in the positive impact it may have on the world? 01:44:15.800 |
- Yeah, and to be clear, my communication wasn't meant 01:44:20.120 |
to minimize payments or to denigrate it in any way. 01:44:29.360 |
it was an algorithm of what could I individually do? 01:44:35.260 |
So there are things that exist that have a lot of potential 01:44:39.640 |
that can be done, and then there's a filtering 01:44:43.160 |
of how many people are qualified to do this given thing. 01:44:48.200 |
that can be done of, okay, given the number of qualified 01:44:50.280 |
people, will somebody be a unique out performer 01:44:55.280 |
of that group to make something truly impossible 01:44:57.960 |
to be something done that otherwise couldn't get done? 01:45:04.460 |
- And some of that has to do with you being very, 01:45:09.640 |
but some of that is just like what you sense, 01:45:13.640 |
like part of that equation is how much passion you sense 01:45:16.560 |
within yourself to be able to drive that through, 01:45:19.100 |
to discover the impossibilities and make them possible. 01:45:23.940 |
I think we were the first company to integrate Coinbase 01:45:26.880 |
into our, I think we were the first payments company 01:45:30.080 |
to formally incorporate crypto, if I'm not mistaken. 01:45:37.600 |
is a place where you can trade cryptocurrencies. 01:45:39.640 |
- Yeah, which was one of the only places you could. 01:45:52.400 |
I concur with the statement you made of the potential 01:45:57.400 |
of the principles underlying cryptocurrencies. 01:46:04.200 |
And that many of the things that they're building 01:46:16.400 |
and equally applicable to how the brain interacts 01:46:20.840 |
and how we would imagine doing goal alignment with people. 01:46:25.600 |
So to me, it's a continuous spectrum of possibility. 01:46:32.240 |
And I think it just is basically a scaffolding layer 01:46:39.800 |
I think we, at Kernel, we will benefit greatly 01:46:44.560 |
from the progress being made in cryptocurrency 01:46:47.120 |
because it will be a similar technology stack 01:46:49.000 |
we will want to use for many things we want to accomplish. 01:46:55.080 |
and think it could greatly enhance brain interfaces 01:46:58.840 |
and the value of the brain interface ecosystem. 01:47:01.200 |
- I mean, is there something you could say about, 01:47:02.920 |
first of all, bullish on cryptocurrency versus fiat money? 01:47:08.760 |
cryptocurrency will be embraced by governments 01:47:28.700 |
looking at foods in certain biochemical states. 01:47:34.760 |
And then I choose based upon those momentary windows. 01:47:45.620 |
is based upon human conscious decision-making 01:47:48.360 |
and politics and power and this whole mess of things. 01:47:55.340 |
of cryptocurrency is it's methodical, it's structured. 01:48:04.740 |
which I think again is the right starting point 01:48:07.620 |
for how we think about building next generation 01:48:13.500 |
And that's why I think it's much broader than money. 01:48:18.380 |
is the demotion of the conscious mind as well. 01:48:25.260 |
it's like giving less priority to the ups and downs 01:48:29.700 |
of any one particular human mind, in this case your own, 01:48:33.020 |
and giving more power to the sort of data-driven. 01:48:41.220 |
That cryptocurrency is a version of what I would call 01:48:51.380 |
It is an introduction of an autonomous system 01:48:54.980 |
of value exchange and the process of value creation 01:49:11.820 |
if we could just linger on that topic a little bit. 01:49:15.460 |
We already talked about your blog post of I fired myself, 01:49:23.180 |
Bryan who is too willing to, not making good decisions 01:49:34.860 |
Basically you were like pigging out at night. 01:50:00.100 |
that I'm able to be much smarter about my eating decisions 01:50:03.660 |
in the morning and the afternoon than I am at night. 01:50:08.180 |
why not eat that one meal a day in the morning? 01:50:40.220 |
- My current protocol is basically the result 01:50:44.220 |
of thousands of experiments and decision-making. 01:50:53.460 |
then I measure again, and then I'm measuring all the time. 01:51:12.440 |
And so, for example, recently, in the past two weeks, 01:51:14.900 |
my resting heart rate has been at 42 when I sleep. 01:51:24.100 |
And I wake up in the morning feeling more energized 01:51:37.600 |
creates enough distance between that completed eating 01:51:43.860 |
almost no digestion processes going on in my body, 01:51:47.320 |
therefore my resting heart rate goes very low. 01:51:52.400 |
And so basically I've been trying to optimize 01:51:54.340 |
the entirety of what I eat to my sleep quality. 01:51:59.480 |
feeds into my willpower, so it creates this virtuous cycle. 01:52:02.540 |
And so at 8:30, what I do is I eat what I call super veggie, 01:52:06.420 |
which is, it's a pudding of 250 grams of broccoli, 01:52:13.300 |
that I eat what I call nutty pudding, which is-- 01:52:20.380 |
Like a veggie mix, whatever thing, like a blender? 01:52:23.500 |
- Yeah, you can be made in a high-speed blender. 01:52:25.640 |
But basically I eat the same thing every day, 01:52:43.640 |
- Did I have a third taste? - Does it taste good? 01:52:55.300 |
Today it was kale and spinach and sweet potato. 01:53:09.100 |
So what I'm trying to do is create the perfect diet 01:53:17.180 |
- You're like, one of the things you're really tracking. 01:53:18.380 |
I mean, can you, well, I have a million questions, 01:53:21.060 |
but 20 supplements, like what kind would you say 01:53:35.700 |
Like if you don't actually wanna think about shit, 01:53:38.620 |
And then fish oil, and that's it, that's all I take. 01:53:41.620 |
- Yeah, you know, Alfred North Whitehead said, 01:54:00.900 |
And that I want that system to be scalable to anybody, 01:54:05.500 |
And right now it's expensive for me to do it, 01:54:23.180 |
So it's devices on the outside and inside your body 01:54:28.300 |
and then creating closed loop systems for that to happen. 01:54:30.780 |
- Yeah, so right now you're doing the data collection 01:54:35.860 |
It'd be much better if you just did the data, 01:54:40.820 |
and you can outsource that to another scientist 01:54:46.080 |
- That's right, because every time I spend time 01:54:48.240 |
thinking about this or executing, spending time on it, 01:54:55.360 |
And so we just all have the budget of our capacity 01:55:05.600 |
And so, yeah, hopefully what I'm doing is really, 01:55:07.840 |
it serves as a model that others can also build on. 01:55:12.520 |
is hopefully people can then take it and improve upon it. 01:55:23.000 |
- Can you maybe elaborate on the sleep thing? 01:55:27.360 |
And why, presumably, what does good sleep mean to you? 01:55:49.440 |
I mean, it's magical what it does if you're well-rested 01:56:05.480 |
looking at like 15-minute increments on time of day 01:56:22.220 |
is based upon how much deep sleep I got the night before. 01:56:30.320 |
And so I think the way I'd summarize this is, 01:56:36.280 |
we tell stories, for example, of entrepreneurship, 01:56:58.720 |
And so the new mythology is going to be the exact opposite. 01:57:03.480 |
- Yeah, by the way, just to sort of maybe push back 01:57:22.440 |
And sometimes doing things that are out of the ordinary, 01:57:29.320 |
for certain periods of time in lieu of your passions 01:57:34.320 |
is a signal to yourself that you're throwing everything away. 01:58:10.240 |
to where you don't just give in to the madness 01:58:31.660 |
So you have a fixed foundation where the diet is fixed, 01:58:35.340 |
where the sleep is fixed, and that all of that is optimal. 01:58:46.000 |
that requires real discipline and forming habits. 01:58:50.920 |
There's some aspect to which some of the best days 01:58:58.520 |
And I don't, I'm not too willing to let go of things 01:59:03.520 |
that empirically worked for things that work in theory. 01:59:33.000 |
I think you should also be a scholar of your own body. 01:59:38.200 |
I'm not so sure that a full night's sleep is great for me. 02:00:02.400 |
and all those kinds of, I mean, that all maps 02:00:08.840 |
- Here's a data point for your consideration. 02:00:19.640 |
- And it now appears as if we will be able to replace 02:00:38.920 |
You can lose your hand and your arm and a leg, 02:00:52.200 |
of whether you're going to sleep under the desk or not 02:00:59.360 |
there's a cost benefit trade-off of what's going on, 02:01:02.760 |
what's happening to your brain in that situation. 02:01:09.240 |
It's the most valuable organ in our existence. 02:01:23.440 |
And to me, then if you say that you're trying to, 02:01:36.420 |
the game is very soon going to become different 02:01:47.720 |
the health status of our brain above all things. 02:01:54.000 |
Everything you're saying is true, but we die. 02:02:01.200 |
And I'm one of those people that I would rather die 02:02:14.880 |
there's a lot of things that you can reasonably say, 02:02:19.080 |
that can prevent you, that becomes conservative, 02:02:21.640 |
that can prevent you from fully embracing life. 02:02:24.480 |
I think ultimately, you can be very intelligent 02:02:44.600 |
if you go out by yourself and play, you're going to die. 02:02:47.680 |
Get run over by a car, come to a slow or a sudden end. 02:02:51.260 |
And I'm more a supporter of just go out there. 02:03:04.880 |
in long-term optimization and short-term freedom. 02:03:10.880 |
For me, for a programmer, for a programming mind, 02:03:19.300 |
to not over-optimize and thereby be overly cautious, 02:03:27.480 |
And the ultimate thing I'm trying to optimize for, 02:03:30.360 |
it's funny you said sleep and all those kinds of things. 02:03:40.600 |
but I think I tend to want to minimize stress, 02:03:49.320 |
from you sleeping, all those kinds of things. 02:03:53.760 |
to be too strict with myself, then the stress goes up 02:04:03.640 |
there's so many variables in an objective function 02:04:11.280 |
is a good thing to inject in there every once in a while 02:04:14.040 |
for somebody who's trying to optimize everything. 02:04:19.120 |
it's exactly like you said, you're just a scientist, 02:04:21.400 |
I'm a scientist of myself, you're a scientist of yourself. 02:04:29.560 |
and I pigged out last night on some brisket in LA 02:04:42.040 |
- What is the nature of your regret on the brisket? 02:04:45.080 |
Is it, do you wish you hadn't eaten it entirely? 02:04:49.800 |
Is it that you wish you hadn't eaten as much as you did? 02:04:58.480 |
if we wanna be specific, I drank way too much diet soda. 02:05:04.420 |
My biggest regret is having drank so much diet soda. 02:05:08.120 |
That's the thing that really was the problem. 02:05:12.140 |
'cause I was programming and then I was editing. 02:05:15.740 |
and then I'd have to get up to go pee a few times, 02:05:22.340 |
but it's so many, it's like the little things. 02:05:25.980 |
I know if I just eat, I drink a little bit of water 02:05:31.780 |
all of us have perfect days that we know diet-wise 02:05:36.580 |
and so on that's good to follow, you feel good. 02:05:56.780 |
and all of that combines to create a mess of a day. 02:06:02.460 |
But some of that chaos, you have to be okay with it, 02:06:05.020 |
but some of it I wish was a little bit more optimal. 02:06:11.020 |
are quite interesting as an experiment to try. 02:06:14.340 |
Can you elaborate, are you eating once a day? 02:06:24.380 |
you spoke, it's funny, you spoke about the metrics of sleep, 02:06:43.240 |
So how does that affect your mind and your body 02:06:47.180 |
So not just sleep, but actual mental performance. 02:06:50.060 |
- As you were explaining your objective function of, 02:06:53.620 |
for example, in the criteria you were including, 02:06:59.780 |
like you like feeling like you're living life, 02:07:04.660 |
that sometimes you want to disregard certain rules 02:07:17.820 |
maybe the experience is a bit more complicated, 02:07:19.380 |
but it's in this idea you have, this is a version of you. 02:07:22.580 |
And the reason why I maintain the schedule I do 02:07:29.660 |
I would like to live a life where I care more 02:07:46.660 |
And so therefore the only thing I really care about 02:07:51.180 |
on this optimization is trying to see past myself, 02:07:56.460 |
past my limitations, using zeroth-principles thinking, 02:07:59.760 |
pull myself out of this contextual mesh we're in right now 02:08:13.340 |
And I find that if I were to hang out with diet soda Lex 02:08:18.340 |
and diet soda Bryan were to play along with that 02:08:24.980 |
and my deep sleep were to get crushed as a result, 02:08:28.540 |
my mind would not be on what matters in 100 years 02:08:34.340 |
I would be, you know, I'd be in a different state. 02:08:41.160 |
It's what you and I have chosen to think about. 02:08:47.520 |
And this is why I'm saying that no generation of humans 02:08:51.780 |
has ever been afforded the opportunity to look 02:08:55.100 |
at their lifespan and contemplate that they will 02:09:03.220 |
an evolved form of consciousness that is undeniable. 02:09:06.900 |
They would fall into zero category of potential. 02:09:11.020 |
That to me is the most exciting thing in existence. 02:09:14.140 |
And I would not trade any momentary neurochemical state 02:09:34.340 |
I just looked up William Wallace's speech in Braveheart. 02:09:34.340 |
Fight and you may die, run and you'll live at least a while 02:09:53.740 |
just one chance, picture Mel Gibson saying this, 02:09:59.120 |
that they may take our lives with growing excitement, 02:10:06.440 |
I get excited every time I see that in the movie. 02:10:10.920 |
- Do you think they were tracking their sleep? 02:10:41.640 |
let's say from a first principles perspective, 02:10:47.520 |
I experienced a certain neurotransmitter state 02:10:59.440 |
And so if you as an engineer of consciousness, 02:11:04.480 |
That's just triggering certain chemical reactions. 02:11:08.280 |
And so it doesn't mean they have to be mutually exclusive. 02:11:15.840 |
And I think that's the potential of where we're going 02:11:32.160 |
that also come along with the illusion of free will 02:11:41.940 |
I spent, so I still am, but I lived in Cambridge at MIT. 02:11:41.940 |
I felt like home in the space of ideas with the colleagues 02:11:54.380 |
But there is something about the constraints, 02:12:00.320 |
how much they valued also kind of material success, 02:12:06.840 |
When I showed up to Texas, it felt like I belong. 02:12:13.220 |
But that's my neurochemistry, whatever the hell that is, 02:12:15.860 |
whatever, maybe it probably is rooted in the fact 02:12:15.860 |
I love the dogmatic authoritarianism of diet, 02:12:37.740 |
of like the same habit, exactly the habit you have. 02:12:41.340 |
I think that's actually when bodies perform optimally, 02:12:45.580 |
So balancing those two, I think if I have the data, 02:12:48.880 |
every once in a while, party with some wild people, 02:13:00.040 |
I'd rather have the data that tells me to do it. 02:13:03.160 |
But in general, you're able to eating once a day, 02:13:09.000 |
Like that's a concern that people have is like, 02:13:11.700 |
does your energy wane, all those kinds of things. 02:13:15.180 |
Do you find that it's, especially 'cause it's unique, 02:13:21.060 |
So you find that you're able to have a clear mind, 02:13:23.900 |
a focus and just physically and mentally throughout? 02:13:36.020 |
like oftentimes I feel like I'm looking through a telescope 02:13:49.840 |
at the thing you're trying to find, but it's fleeting. 02:14:04.880 |
and elusive and it requires a sensitivity to thinking 02:14:09.880 |
and a sensitivity to maneuver through these things. 02:14:16.560 |
If I concede to a world where I'm on my phone texting, 02:14:29.280 |
and I'm also feeling terrible from the last night. 02:14:34.860 |
and the quality of my thoughts goes to a zero. 02:14:38.700 |
I'm a functional person to respond to basic level things, 02:14:42.980 |
but I don't feel like I am doing anything interesting. 02:14:49.820 |
because that's what thinking deeply feels like, 02:14:55.820 |
And you're right, all those other distractions 02:15:26.900 |
because in that moment, it was the darkest time in my life. 02:15:32.480 |
I was ending a 13-year marriage, I was leaving my religion, 02:15:32.480 |
I sold Braintree and I was battling depression 02:15:54.440 |
And it was the first time I had ever had money 02:15:57.480 |
to donate outside of paying tithing in my religion. 02:16:13.800 |
And so we went there and we saw the clean water wells 02:16:16.600 |
we were building, we spoke to the people there 02:16:21.920 |
and I came down with a stomach flu on day three 02:16:49.520 |
- Yeah, and I just was destroyed from the situation. 02:16:55.440 |
- Plus psychologically one of the lowest points 02:17:02.720 |
I was just smoked as a human, just absolutely done. 02:17:14.920 |
I'm now intertwined with these three little people 02:17:19.600 |
and I have an obligation whether I like it or not, 02:17:27.680 |
And I had to decide whether I was going to summit 02:17:44.960 |
And I said, "I think I'm okay, I think I can try." 02:17:49.740 |
And so from midnight to, I made it to the summit at 5 a.m. 02:17:56.840 |
It was one of the most transformational moments 02:18:05.960 |
It became everything that I was struggling with. 02:18:12.520 |
the pain got so ferocious that it was kind of like this, 02:18:17.520 |
it became so ferocious that I turned my music to Eminem. 02:18:25.040 |
And it was, Eminem was, he was the only person 02:18:31.160 |
And it was something about his anger and his vibrancy 02:18:36.160 |
and his multidimensional, he's the only person 02:18:42.760 |
I turned on Eminem and I made it to the summit 02:18:46.800 |
after five hours, but just 100 yards from the top. 02:18:51.040 |
I was with my guide Ike and I started getting very dizzy 02:19:00.480 |
And he said, "Look, Brian, I know where you're at. 02:19:08.360 |
"So I want you to look up, take a step, take a breath, 02:19:13.360 |
"and then look up, take a breath and take a step." 02:19:19.280 |
And so I got there and I just, I sat down with him 02:19:36.960 |
So he looked at it and I think he was like really alarmed 02:19:41.400 |
And so he said, "We can't get a helicopter here 02:19:44.760 |
"and we can't get you to emergency evacuated. 02:19:46.940 |
"You've got to hike down to 15,000 feet to get base camp." 02:20:01.040 |
and a team of six people wheeled me down the mountain. 02:20:10.240 |
And so my head would just slam against this metal thing 02:20:15.980 |
Plus I'd get my head slammed every couple of seconds. 02:20:18.880 |
So the whole experience was really a life changing moment. 02:20:26.680 |
Basically I said, "I'm going to reconstruct Bryan. 02:20:26.680 |
"My understanding of reality, my existential realities, 02:20:36.760 |
And I try, I mean, as much as that's possible as a human, 02:20:39.680 |
but that's when I set out to rebuild everything. 02:20:46.040 |
I mean, there's also just like the romantic, poetic, 02:20:52.320 |
As a man in pain, psychological and physical, 02:20:58.040 |
struggling up a mountain, but it's just struggle. 02:21:04.000 |
just pushing through in the face of hardship or nature too, 02:21:11.360 |
Is that, was that the thing that just clicked? 02:21:14.880 |
- For me, it felt like I was just locked in with reality 02:21:21.320 |
It was in that moment, one of us is going to die. 02:21:24.080 |
- So you were pondering death, like not surviving. 02:21:32.280 |
I'm going to come out on top and I can do this. 02:21:54.080 |
It would not have been something I would have anticipated. 02:22:00.360 |
Is there advice you can give to young people today 02:22:14.080 |
successful in life, whatever path they choose? 02:22:22.600 |
and see it for what it is, a mirror of that person. 02:22:34.840 |
And so what you're hearing today is a representation 02:22:49.640 |
people ask for advice, but they don't take advice. 02:22:59.340 |
- It's in the careful examination of the advice. 02:23:10.680 |
the value is understanding the assumption stack they built, 02:23:13.480 |
the assumption and knowledge stack they built 02:23:18.080 |
That's the value, it's not doing what they say. 02:23:23.120 |
but digging deeper to understand the assumption stack, 02:23:36.320 |
And the advice is just the tip of the iceberg. 02:23:39.480 |
- The journey is not the thing that gives you. 02:23:49.480 |
Is there some, are there been people in your startup, 02:24:02.600 |
Or do you feel like your journey felt like a lonely path, 02:24:17.320 |
do you fundamentally remember the experiences, 02:24:23.960 |
at a particular moment in time that changed everything? 02:24:26.920 |
- Yeah, the most significant moments of my memory, 02:24:33.040 |
when Ike, some person I'd never met in Tanzania, 02:24:38.160 |
was able to, in that moment, apparently see my soul, 02:24:45.080 |
- And he gave me the instructions, look up, step. 02:25:00.780 |
I probably should be better at identifying those things. 02:25:48.880 |
You know, you have like an 18-year-old kid come up to you. 02:25:51.680 |
It's not always obvious, it's not always easy 02:26:00.120 |
Like not the facts, but like see who that person is. 02:26:04.600 |
- I think people say that about being a parent is, 02:26:16.040 |
that there's a special, unique human being there 02:26:29.360 |
So when giving advice, there's something to that. 02:26:31.640 |
And so both sides should be deeply empathetic 02:26:38.800 |
What do you think is the meaning of this whole thing? 02:26:46.080 |
We've been talking about brains and studying brains 02:26:48.320 |
and you had this very eloquent way of describing life 02:27:17.440 |
again, the information value is more in the mirror 02:27:21.640 |
it provides of that person, which is a representation 02:27:25.680 |
of the technological, social, political context of the time. 02:27:30.120 |
So if you asked this question a hundred years ago, 02:27:35.040 |
Same thing would be true of a thousand years ago. 02:27:38.000 |
It's difficult for a person to pull themselves 02:27:45.240 |
And so knowing that I am contextually influenced 02:27:48.520 |
by the situation, that I am a mirror for our reality, 02:28:00.160 |
is that evolution built a system of scaffolding intelligence 02:28:13.880 |
that are scaffolding higher dimensional intelligence. 02:28:18.880 |
That's developing more robust systems of intelligence. 02:28:28.040 |
And in doing that process, with the cost going to zero, 02:28:32.680 |
then the meaning of life becomes goal alignment, 02:28:47.800 |
is our technological progress is getting to a point 02:28:57.240 |
we want to figure out what is really going on, 02:29:09.200 |
from being able to poke our way out of whatever is going on. 02:29:14.200 |
But it's interesting that we could even state an aspiration 02:29:26.000 |
the meaning of life is that we can build a future state 02:29:47.120 |
that we would consider bewildering and all the things 02:30:00.440 |
James Carse wrote the book, "Finite and Infinite Games." 02:30:04.080 |
The only game to play right now is to keep playing the game. 02:30:11.400 |
of the Lex algorithm of diet soda and brisket 02:30:19.280 |
where we can contemplate playing infinite games. 02:30:22.520 |
Therefore, it may make sense to err on the side 02:30:27.720 |
to be playing infinite games if that opportunity arises. 02:30:39.240 |
and why those assumptions may fall away very quickly. 02:30:46.040 |
when I say that the game you, Mr. Bryan Johnson, 02:30:57.640 |
with Bryan Johnson and thank you to Four Sigmatic, 02:31:04.960 |
Check them out in the description to support this podcast. 02:31:18.720 |
Thank you for listening and hope to see you next time.