If You Could Live Forever, Would You? (Ben Goertzel) | AI Podcast Clips with Lex Fridman
Chapters
0:00 Would you live forever
4:23 Human vs AI
7:53 The Meaning of Life
00:00:00.000 |
So if you could live forever, would you live forever? 00:00:07.740 |
My goal with longevity research is to abolish the plague of involuntary death. 00:00:14.600 |
I don't think people should die unless they choose to die. 00:00:18.640 |
If I had to choose forced immortality versus dying, I would choose forced immortality. 00:00:26.480 |
On the other hand, if I had the choice of immortality with the choice of suicide whenever 00:00:32.020 |
I felt like it, of course I would take that instead. 00:00:36.800 |
There's no reason you should have forced immortality. 00:00:38.880 |
You should be able to live until you get sick of living. 00:00:44.240 |
And that will seem insanely obvious to everyone 50 years from now. 00:00:50.960 |
People who thought death gives meaning to life so we should all die, they will look 00:00:55.220 |
at that 50 years from now the way we now look at the Anabaptists in the year 1000 who gave 00:01:01.240 |
away all their possessions and went on top of the mountain for Jesus to come and bring them to heaven. 00:01:08.400 |
It's ridiculous that people think death is good because you gain more wisdom as you approach death. 00:01:19.120 |
I mean, I'm 53, and the fact that I might have only a few more decades left, it does make me reflect on things. 00:01:29.540 |
It does give me a deeper understanding of many things. 00:01:35.320 |
You could get a deep understanding in a lot of different ways. 00:01:39.340 |
We're going to abolish pain and that's even more amazing than abolishing death. 00:01:45.460 |
Once we get a little better at neuroscience, we'll be able to go in and adjust the brain so that pain doesn't hurt so much. 00:01:52.040 |
And people will say that's bad because there's so much beauty in overcoming pain and suffering. 00:01:58.560 |
Well, sure, and there's beauty in overcoming torture too. 00:02:02.480 |
And some people like to cut themselves, but not many. 00:02:05.160 |
That's interesting, but to push back again, this is the Russian side of me, I do romanticize suffering. 00:02:12.320 |
It's not obvious, I mean, the way you put it, it seems very logical. 00:02:16.680 |
It's almost absurd to romanticize suffering or pain or death. 00:02:21.200 |
But to me, a world without suffering, without pain, without death, it's not obvious what that world would look like. 00:02:28.840 |
Well, then you can stay in the people zoo with the people torturing each other. 00:02:32.680 |
No, but what I'm saying is, I guess what I'm trying to say, I don't know if I was presented 00:02:39.320 |
with that choice, what I would choose, because to me-- 00:02:42.440 |
No, this is a subtler, it's a subtler matter, and I've posed it in this conversation in 00:02:54.320 |
So I think the way you should think about it is what if there's a little dial on the 00:02:59.880 |
side of your head, and you could turn how much pain hurts. 00:03:07.040 |
Turn it up to 11, like in Spinal Tap, if you want, maybe through an actual spinal tap. 00:03:12.000 |
So I mean, would you opt to have that dial there or not? 00:03:17.240 |
The question isn't whether you would turn the pain down to zero all the time. 00:03:24.400 |
My guess is that in some dark moment of your life, you would choose to have the dial implanted. 00:03:30.680 |
Just to confess a small thing, don't ask me why, but I'm doing this physical challenge 00:03:36.920 |
currently where I'm doing 680 push-ups and pull-ups a day, and my shoulder currently hurts. 00:03:51.560 |
Certainly right now, if you gave me a dial, I would turn that sucker to zero as quickly as possible. 00:03:58.080 |
But I think the whole point of this journey is, I don't know. 00:04:07.680 |
So the question is, am I somehow twisted because I created some kind of narrative for myself 00:04:14.560 |
so that I can deal with the injustice and the suffering in the world? 00:04:21.040 |
Or is this actually going to be a source of happiness for me? 00:04:24.160 |
Well, this, to an extent, is a research question that humanity will undertake, right? 00:04:30.960 |
Human beings do have a particular biological makeup, which sort of implies a certain probability 00:04:39.920 |
distribution over motivational systems, right? 00:04:47.120 |
Now, the question is, how flexibly can that morph as society and technology change, right? 00:04:56.240 |
So if we're given that dial, and we're given a society in which, say, we don't have to 00:05:03.240 |
work for a living, and in which there's an ambient, decentralized, benevolent AI network 00:05:08.880 |
that will warn us when we're about to hurt ourselves, if we're in a different context, 00:05:14.400 |
can we, consistently with being genuinely and fully human, get into 00:05:21.120 |
a state of consciousness where we just want to keep the pain dial turned all the way down, 00:05:27.960 |
and yet we're leading very rewarding and fulfilling lives, right? 00:05:31.440 |
Now, I suspect the answer is yes, we can do that, but I don't know that-- 00:05:39.720 |
I'm more confident that we could create a non-human AGI system which just didn't need suffering. 00:05:50.360 |
And I think that AGI system will be fundamentally healthier and more benevolent than human beings. 00:05:57.000 |
So I think it might or might not be true that humans need a certain element of suffering 00:06:02.400 |
to be satisfied humans, consistent with the human physiology. 00:06:06.680 |
If it is true, that's one of the things that makes us fucked and disqualified to be the super AGI. 00:06:16.840 |
This is the nature of the human motivational system: we seem to gravitate towards 00:06:24.760 |
situations where the best thing in the large scale is not the best thing in the small scale. 00:06:35.320 |
So we gravitate towards subjective value judgments where to gratify ourselves in the large, we have to ungratify ourselves in the small. 00:06:46.560 |
There's a theory of music which says the key to musical aesthetics is the surprising fulfillment 00:06:54.620 |
You want something that will fulfill the expectations enlisted in the prior part of the music, but 00:06:59.080 |
in a way with a bit of a twist that surprises you. 00:07:02.280 |
And that's true not only in out-there music like my own or that of Zappa or Steve Vai 00:07:08.800 |
or Buckethead or Krzysztof Penderecki or something. 00:07:15.240 |
It's not there in elevator music so much, and that's why it's boring, right? 00:07:20.260 |
But wrapped up in there is we want to hurt a little bit so that we can feel the pain go away. 00:07:28.920 |
We want to be a little confused by what's coming next. 00:07:32.980 |
So then when the thing that comes next actually makes sense, it's so satisfying, right? 00:07:37.000 |
It's the surprising fulfillment of expectations, is that what you said? 00:07:42.200 |
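The "surprising fulfillment of expectations" idea has a rough information-theoretic reading: a listener carries a predictive model of the music, and a good twist is a transition with high surprisal that still resolves into low-surprisal continuations. The sketch below is purely my illustration of that reading (a bigram note model over an invented melody), not anything stated in the conversation.

```python
# Toy model: surprisal of note transitions under a listener's bigram
# expectations. Higher surprisal = a bigger "twist" for the listener.
from collections import Counter, defaultdict
import math

def bigram_model(training_melody):
    """Estimate P(next_note | current_note) from a training melody."""
    counts = defaultdict(Counter)
    for a, b in zip(training_melody, training_melody[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

def surprisal(model, melody, floor=1e-3):
    """-log2 P of each transition; unseen transitions get a small floor."""
    return [-math.log2(model.get(a, {}).get(b, floor))
            for a, b in zip(melody, melody[1:])]

training = ["C", "D", "E", "C", "D", "E", "C", "D", "E", "F", "E", "D", "C"]
model = bigram_model(training)

# An expected continuation has low surprisal; an unexpected leap is high.
print(surprisal(model, ["C", "D", "E"]))
print(surprisal(model, ["C", "A"]))
```

In this framing, elevator music keeps surprisal near zero everywhere, which is exactly why it is boring.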
I know we've been skirting around a little bit, but if I were to ask you the most ridiculous 00:07:45.680 |
big question of what is the meaning of life, what would your answer be? 00:08:03.520 |
I mean, that's the basis of everything if you want the number one value. 00:08:06.920 |
On the other hand, I'm unsatisfied with a static joy that doesn't progress, perhaps 00:08:13.720 |
because of some element of human perversity. 00:08:17.380 |
But the idea of something that grows and becomes more and more and better and better in some sense appeals to me. 00:08:24.160 |
But I also sort of like the idea of individuality, that as a distinct system, I have some agency. 00:08:31.720 |
So there's some nexus of causality within this system rather than the causality being 00:08:37.640 |
wholly evenly distributed over the joyous growing mass. 00:08:41.120 |
So you start with joy, growth, and choice as three basic values. 00:08:46.120 |
Those three things could continue indefinitely. 00:08:52.920 |
Is there some aspect of something you've called, which I like, super longevity that you find 00:09:00.480 |
exciting research-wise? Are there ideas in that space? 00:09:05.760 |
I think, yeah, in terms of the meaning of life, this really ties into that. 00:09:11.920 |
Because for us as humans, probably the way to get the most joy, growth, and choice is 00:09:19.920 |
transhumanism and to go beyond the human form that we have right now. 00:09:25.760 |
And I think the human body is great, and by no means do any of us maximize the potential 00:09:32.200 |
for joy, growth, and choice immanent in our human bodies. 00:09:35.760 |
On the other hand, it's clear that other configurations of matter could manifest even greater amounts 00:09:42.320 |
of joy, growth, and choice than humans do, maybe even finding ways to go beyond the realm 00:09:52.200 |
So I think in a practical sense, much of the meaning I see in human life is to create something better than ourselves. 00:10:02.800 |
But certainly that's not all of it for me in a practical sense. 00:10:06.440 |
I have four kids and a granddaughter and many friends and parents and family, and I just enjoy everyday life. 00:10:19.200 |
I mean, I've always loved nature; when I could live near it, I'd spend a bunch of time 00:10:24.480 |
out in the forest and on the water every day and so forth. 00:10:28.160 |
So I mean, enjoying the pleasant moment is part of it. 00:10:32.320 |
But the growth and choice aspect are severely limited by our human biology. 00:10:39.200 |
In particular, dying seems to inhibit your potential for personal growth considerably. 00:10:46.680 |
I mean, there's some element of life after death perhaps, but even if there is, why not 00:10:52.480 |
also continue going in this biological realm, right? 00:10:57.280 |
In super longevity, I mean, we haven't yet cured aging. 00:11:05.640 |
Certainly there's very interesting progress all around. 00:11:09.000 |
I mean, CRISPR and gene editing can be an incredible tool. 00:11:14.680 |
And I mean, right now, stem cells could potentially prolong life a lot. 00:11:20.440 |
Like if you got injections of stem cells for every tissue of your body, injected 00:11:26.680 |
into every tissue, and you could just have replacement of your old cells with new cells produced 00:11:33.360 |
by those stem cells, I mean, that could be highly impactful at prolonging life. 00:11:38.440 |
Now we just need slightly better technology for having them grow, right? 00:11:42.600 |
So using machine learning to guide procedures for stem cell differentiation and transdifferentiation, 00:11:49.960 |
it's kind of nitty gritty, but I mean, that's quite interesting. 00:11:53.920 |
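One common shape for "machine learning guiding a lab procedure" is a closed loop: a surrogate model proposes protocol parameters, the experiment is run, and the result updates the model. The sketch below is entirely my own illustration of that loop under invented assumptions: `run_experiment` is a stand-in for a real differentiation assay, and the two protocol parameters (a dose and a duration) and their optimum are made up.

```python
# Hypothetical sketch of surrogate-guided protocol search; all names,
# parameters, and the simulated assay are invented for illustration.
import random

def run_experiment(protocol):
    """Stand-in for a wet-lab assay: returns a differentiation yield in
    [0, 1]. The invented optimum is a mid-range dose, longer duration."""
    dose, days = protocol
    return max(0.0, 1.0 - (dose - 0.4) ** 2 - 0.3 * (1.0 - days / 10.0))

def predict(history, protocol, k=3):
    """k-nearest-neighbour surrogate: average yield of the k most
    similar protocols tried so far."""
    dist = lambda h: (h[0][0] - protocol[0]) ** 2 + ((h[0][1] - protocol[1]) / 10) ** 2
    nearest = sorted(history, key=dist)[:k]
    return sum(y for _, y in nearest) / len(nearest)

random.seed(0)
# Seed with a few random protocols, then let the surrogate pick each next one.
history = []
for _ in range(5):
    p = (random.random(), random.randint(1, 10))
    history.append((p, run_experiment(p)))

for _ in range(20):
    candidates = [(random.random(), random.randint(1, 10)) for _ in range(50)]
    best = max(candidates, key=lambda p: predict(history, p))
    history.append((best, run_experiment(best)))

best_protocol, best_yield = max(history, key=lambda h: h[1])
print(best_protocol, round(best_yield, 3))
```

A real pipeline would replace the simulated assay with automated experiments and the k-NN surrogate with a stronger model, but the loop structure is the same.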
So I think there's a lot of different things being done to help with prolongation of human life. 00:12:04.800 |
So for example, the extracellular matrix, which is the mesh of proteins in between 00:12:10.160 |
the cells in your body, gets stiffer and stiffer as you get older. 00:12:14.680 |
And the extracellular matrix transmits information electrically, mechanically, and to some extent chemically. 00:12:22.640 |
So there's all this transmission through the parts of the body, but the stiffer the 00:12:27.040 |
extracellular matrix gets, the less the transmission happens, which makes the different organs 00:12:32.740 |
of your body less well coordinated as you get older. 00:12:34.680 |
So my friend Christian Schafmeister, at my alma mater, the great 00:12:40.600 |
Temple University, has a potential solution to this, where he 00:12:45.920 |
has these novel molecules called spiroligomers, which are like polymers that are not organic. 00:12:51.640 |
They're specially designed polymers so that you can algorithmically predict exactly what shape they will take. 00:12:58.780 |
So he designed these molecular scissors made of spiroligomers that you could eat, and 00:13:03.800 |
they would cut through the glucosepane and other cross-links between the proteins in your extracellular matrix. 00:13:10.040 |
But to make that technology really work and be mature would take several years of work. 00:13:13.960 |
As far as I know, no one's funding it at the moment. 00:13:17.320 |
So there's so many different ways that technology could be used to prolong longevity. 00:13:22.420 |
What we really need is an integrated database of all biological knowledge about 00:13:26.840 |
human beings and model organisms, like hopefully a massively distributed OpenCog Bio-Atomspace. 00:13:35.480 |
We need that data to be opened up in a suitably privacy-protecting way. 00:13:40.500 |
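The "integrated database of all biological knowledge" can be pictured as a typed knowledge graph linking genes, pathways, and phenotypes across sources. The sketch below is my own toy illustration of that picture, not the actual OpenCog Atomspace API, and the example facts are simplified placeholders.

```python
# Illustrative knowledge-graph sketch (not the OpenCog Bio-Atomspace API):
# typed edges between biological entities, with a simple two-hop query.
from collections import defaultdict

class BioGraph:
    def __init__(self):
        self.edges = defaultdict(set)   # (subject, relation) -> set of objects

    def add(self, subject, relation, obj):
        self.edges[(subject, relation)].add(obj)

    def query(self, subject, relation):
        return sorted(self.edges[(subject, relation)])

    def two_hop(self, subject, rel1, rel2):
        """Chain two relations, e.g. gene -> pathway -> phenotype."""
        return sorted({o2 for o1 in self.edges[(subject, rel1)]
                          for o2 in self.edges[(o1, rel2)]})

g = BioGraph()
# Simplified placeholder facts, as if merged from different sources:
g.add("FOXO3", "participates_in", "insulin_signaling")
g.add("SIRT1", "participates_in", "insulin_signaling")
g.add("insulin_signaling", "associated_with", "longevity")

print(g.query("FOXO3", "participates_in"))                        # ['insulin_signaling']
print(g.two_hop("FOXO3", "participates_in", "associated_with"))   # ['longevity']
```

The point of integrating the silos is exactly that queries like `two_hop` can cross datasets that today live in separate databases.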
We need massive funding into machine learning, AGI, proto-AGI, and statistical research aimed 00:13:46.600 |
at solving biology, both molecular biology and human biology, based on this massive dataset. 00:13:54.400 |
And then we need regulators not to stop people from trying radical therapies on themselves 00:14:00.840 |
if they so wish, as well as better cloud-based platforms for automated experimentation on model organisms. 00:14:13.120 |
Look, after the last financial crisis, Obama, who I generally like pretty well, 00:14:18.560 |
gave $4 trillion to large banks and insurance companies. 00:14:22.840 |
Now in this COVID crisis, trillions are being spent to help everyday people and small businesses. 00:14:29.460 |
In the end, we'll probably find many more trillions are being given to large banks as well. 00:14:35.840 |
Could the world put $10 trillion into making a massive holistic bio-AI and bio-simulation? 00:14:46.000 |
We could put $10 trillion into that without even screwing up the world economy too badly, just as, in 00:14:49.880 |
the end, COVID and the last financial crisis won't screw it up so badly. 00:14:57.480 |
Instead, all this research is siloed inside a few big companies and government agencies. 00:15:05.200 |
Most of the data that comes from our individual bodies, personally, that could feed this AI 00:15:10.600 |
to solve aging and death, most of that data is sitting in some hospital's database doing nothing.