Ray Kurzweil: Singularity, Superintelligence, and Immortality | Lex Fridman Podcast #321
Chapters
0:00 Introduction
1:06 Turing test
14:51 Brain–computer interfaces
26:31 Singularity
32:51 Virtual reality
35:31 Evolution of information processing
41:57 Automation
51:57 Nanotechnology
53:51 Nuclear war
55:57 Uploading minds
1:03:38 How to think
1:10:08 Digital afterlife
1:19:28 Intelligent alien life
1:22:18 Simulation hypothesis
1:26:31 Mortality
1:34:10 Meaning of life
and it's just very hard to imagine what that will be like. 00:00:10.820 |
- The following is a conversation with Ray Kurzweil, 00:00:24.320 |
predicting that exponentially improving technologies 00:00:29.880 |
beyond which super intelligent artificial intelligence 00:00:33.480 |
will transform our world in nearly unimaginable ways. 00:00:38.360 |
18 years ago, in the book "The Singularity Is Near," 00:00:41.280 |
he predicted that the onset of the singularity 00:00:47.360 |
He still holds to this prediction and estimate. 00:00:50.800 |
In fact, he's working on a new book on this topic 00:01:05.380 |
In your 2005 book titled "The Singularity is Near," 00:01:10.960 |
you predicted that the singularity will happen in 2045. 00:01:17.640 |
do you still estimate that the singularity will happen 00:01:24.960 |
the technological singularity, and when will it happen? 00:01:27.760 |
- Singularity is where computers really change our view 00:01:35.860 |
But we're getting close to some salient things 00:01:59.680 |
But Stanford got very alarmed at my prediction about 2029. 00:02:12.960 |
- And then you repeated the prediction in 2005. 00:02:26.600 |
So people gave different predictions and they took a poll. 00:02:30.840 |
It was really the first time that AI experts worldwide 00:02:55.840 |
that the poll of AI experts has come down over the years. 00:03:07.080 |
assesses different types of experts on the future. 00:03:11.560 |
They again assessed what AI experts then felt. 00:03:37.960 |
I haven't changed at all, I've stayed with 2029. 00:03:46.880 |
- So Alan Turing formulated the Turing test and-- 00:03:50.980 |
- Right, now, what he actually said about it was very little. 00:03:58.060 |
there's like a few lines that talk about the Turing test. 00:04:04.440 |
And it really wasn't very clear how to administer it. 00:04:11.840 |
And he said if they did it in like 15 minutes, 00:04:25.540 |
I mean, you can talk to it and have a conversation with it. 00:04:35.360 |
There's some problems with large language models, 00:04:39.600 |
But some people are convinced by the Turing test. 00:04:56.000 |
It's not necessarily clear what the implications are. 00:05:00.880 |
Anyway, I believe 2029, that's six, seven years from now, 00:05:05.880 |
we'll have something that passes the Turing test 00:05:12.480 |
meaning it goes for hours, not just a few minutes. 00:05:23.180 |
of the Turing test, so what does that look like? 00:05:25.420 |
- Basically, it's just to assess it over several hours 00:05:28.580 |
and also have a human judge that's fairly sophisticated 00:05:39.220 |
If you take somebody who's not that sophisticated 00:05:47.260 |
they may not really assess various aspects of it. 00:05:52.100 |
- So you really want the human to challenge the system. 00:06:10.180 |
that would involve assessing chains of reasoning. 00:06:19.420 |
If you talk to them, they actually can talk to you 00:06:24.860 |
But it's somebody that would really convince you 00:06:34.800 |
but it would really convince you that it's human. 00:06:44.560 |
You can read conversations and they appear pretty good. 00:06:55.000 |
You can ask, "How many legs do 10 elephants have?" 00:06:58.200 |
And they'll tell you, "Well, okay, each elephant 00:07:00.600 |
"has four legs and it's 10 elephants, so it's 40 legs." 00:07:07.960 |
And they don't seem to understand the question. 00:07:15.880 |
I mean, how advanced a human do you want it to be? 00:07:21.840 |
to do multi-chain reasoning, to be able to take a few facts 00:07:40.520 |
but it's something where it really would convince you 00:07:45.600 |
- Is your intuition that large language models 00:07:56.760 |
- No, I think it will be a large language model, 00:07:58.680 |
but they have to go beyond what they're doing now. 00:08:05.360 |
And another key issue is if somebody actually passes 00:08:10.000 |
the Turing test validly, I would believe they're conscious. 00:08:17.400 |
but we don't really believe that it's conscious. 00:08:26.680 |
But I don't believe that of large language models today. 00:08:40.720 |
- I mean, consciousness is not something that's scientific. 00:08:55.500 |
When you go outside of shared human assumption, 00:09:08.680 |
And would a machine that acts just like a human 00:09:22.700 |
I can't take an entity and prove that it's conscious. 00:09:25.480 |
There's nothing that you can do that would indicate that. 00:09:30.360 |
- It's like saying a piece of art is beautiful. 00:09:35.000 |
Multiple people can experience a piece of art as beautiful, 00:09:41.320 |
- But it's also an extremely important issue. 00:10:05.920 |
It's not scientific, and therefore we should dismiss it. 00:10:08.400 |
And any talk about consciousness is just not to be believed. 00:10:13.400 |
But when he actually engaged with somebody who was conscious, 00:10:34.040 |
- But that's true of a lot of people as well. 00:10:59.400 |
It's not very interesting, but that's a consciousness. 00:11:07.240 |
a more interesting decision, still not at human levels, 00:11:10.440 |
but it's also conscious and at a higher level 00:11:16.000 |
There's many different views of what consciousness is. 00:11:24.600 |
it's not scientific, but in issues of philosophy, 00:11:29.600 |
things like ethics start to enter the picture. 00:11:32.640 |
Do you think there would be, we would start contending 00:11:58.480 |
It has a different, I mean, a computer that's conscious 00:12:03.480 |
has a little bit different connotations than a human. 00:12:15.580 |
We're in an entity that does not last forever. 00:12:22.080 |
Now, actually, a significant portion of humans still exist 00:12:29.240 |
but anybody who is over a certain age doesn't exist anymore. 00:12:42.280 |
and a copy of it could be stored and you could recreate it. 00:12:52.880 |
You could eliminate its memory and have it go over again. 00:12:55.800 |
I mean, it has a different kind of connotation 00:13:01.760 |
- Well, perhaps it can do the same thing with humans. 00:13:04.360 |
It's just that we don't know how to do that yet. 00:13:06.840 |
It's possible that we figure out all of these things 00:13:10.740 |
but that doesn't mean the machine isn't conscious. 00:13:15.440 |
- I mean, if you look at the way people react, 00:13:17.600 |
say C-3PO or other machines that are conscious in movies, 00:13:22.600 |
they don't actually present how it's conscious, 00:13:30.080 |
and people will believe that they are conscious 00:13:46.480 |
to start to consider the role of AI in this world. 00:14:04.520 |
because most people have not taken that position. 00:14:25.840 |
more and more people will accept that they're conscious. 00:14:46.220 |
- And so that takes us one step closer to the singularity. 00:14:59.820 |
which is where we do our thinking, to computers. 00:15:04.820 |
And I mean, just as this actually gains a lot 00:15:21.860 |
- If you're just listening to this, by the way, 00:15:24.420 |
Ray's holding up the all-powerful smartphone. 00:15:29.420 |
- So we're gonna do that directly from our brains. 00:15:35.060 |
These already have amplified our intelligence. 00:15:37.740 |
I'm already much smarter than I would otherwise be 00:15:45.960 |
there was no way to get information from computers. 00:15:52.040 |
I actually would go to a library, find a book, 00:15:55.360 |
find the page that had the information I wanted, 00:16:04.320 |
was a roll of quarters where I could feed the copier. 00:16:08.440 |
So we're already greatly advanced that we have these things. 00:16:19.680 |
I've actually never lost it, but you have to find it, 00:16:30.100 |
if someone would just listen to your conversation 00:17:03.560 |
But another step is to actually go inside your brain. 00:17:15.240 |
They actually don't have the amount of bandwidth 00:17:21.920 |
So if it actually would connect to your neocortex, 00:18:07.480 |
and just do what we can do by using this machine. 00:18:29.240 |
I mean, they're gonna get permission for this, 00:18:31.980 |
because there are a lot of people who absolutely need it, 00:18:38.400 |
who have ideas, and they cannot move their muscles, 00:18:56.600 |
turn this into something that would be like we have a phone, 00:19:09.440 |
would not require this low-bandwidth mechanism of language. 00:19:17.280 |
although we do know that computers can share information 00:19:34.200 |
it actually can manipulate different parameters. 00:19:39.060 |
So we talk about these large language models. 00:19:57.620 |
and that we would not see AI really being effective 00:20:06.400 |
that were like 10 billion bytes, didn't work very well. 00:20:23.500 |
Well, what if you had something that had one byte, 00:20:33.940 |
and so you put in something that would detect its trunk. 00:20:39.140 |
If it doesn't have a trunk, it's not an elephant. 00:20:46.320 |
Really wouldn't be able to tell what a trunk is, 00:20:50.620 |
- And maybe other things other than elephants have trunks. 00:20:58.820 |
but you know, plus how do you define a trunk? 00:21:06.380 |
- So these things have 100 billion parameters, 00:21:08.740 |
so they're able to deal with very complex issues. 00:21:14.000 |
- Human beings actually have a little bit more than that, 00:21:20.860 |
If we were able to connect this to our neocortex, 00:21:25.860 |
we would basically add more of these abilities 00:21:45.260 |
- So you think that there will be a merger in the '30s, 00:21:57.720 |
And the AI brain is really an emulation of human beings. 00:22:30.660 |
it starts adding features we might not otherwise have, 00:22:38.580 |
like look up thousands of Wikipedia articles in one take. 00:22:48.140 |
where it can simulate many different things at once. 00:22:53.140 |
We already had one example of simulated biology, 00:23:11.140 |
But they were able to simulate what each example 00:23:17.780 |
and they were able to simulate that quite reliably. 00:23:31.060 |
And they did, and talk about doing it quickly, 00:23:38.060 |
to simulate billions of different mRNA sequences? 00:23:47.320 |
And one of the reasons that people didn't like vaccines 00:24:02.900 |
So they figured, okay, it took 10 months to create this. 00:24:08.100 |
And we also will be able to ultimately do the tests 00:24:14.180 |
- Oh, 'cause we can simulate how the body will respond to it. 00:24:19.140 |
'cause the body has a lot of different elements, 00:24:27.520 |
So ultimately, we could create it in a few days, 00:24:30.240 |
and then test it in a few days, and it would be done. 00:24:34.020 |
And we can do that with every type of medical insufficiency 00:24:47.620 |
supplements, drugs, for recreation, for health, 00:24:53.900 |
for performance, for productivity, all that kind of stuff. 00:24:58.060 |
'Cause I mean, right now, we have a very inefficient way 00:25:14.860 |
And we'll get to the point where we can test it out 00:25:30.060 |
this primitive building block of life, which is a protein, 00:25:36.100 |
- It's pretty remarkable that they can actually predict 00:25:42.020 |
But they did it with the same type of neural net, 00:25:54.420 |
- They took that same thing and just changed the rules 00:25:58.900 |
And within a couple of days, it now played a master level 00:26:05.820 |
And the same thing then worked for AlphaFold, 00:26:14.780 |
I mean, human beings could do, the best humans 00:26:25.820 |
And after a few takes, it ultimately did just about 100%. 00:26:30.820 |
- Do you still think the singularity will happen in 2045? 00:26:39.020 |
- You know, once we can amplify our brain with computers 00:26:49.780 |
That's another whole theme, which is the exponential growth 00:26:54.940 |
- Yeah, so looking at price performance of computation 00:26:59.780 |
- Right, so that starts with the very first computer 00:27:02.940 |
actually created by a German during World War II. 00:27:05.620 |
You might have thought that that might be significant, 00:27:09.420 |
but actually the Germans didn't think computers 00:27:12.860 |
were significant, and they completely rejected it. 00:27:22.220 |
with the X-axis being the year from 1935 to 2025, 00:27:27.220 |
and on the Y-axis in log scale is computation per second 00:27:32.340 |
per constant dollar, so dollars normalized for inflation. 00:27:47.740 |
and it cracked the German code and enabled the British 00:27:52.740 |
to win the Battle of Britain, which otherwise 00:27:56.340 |
absolutely would not have happened if they hadn't 00:28:00.780 |
But that's an exponential graph, so a straight line 00:28:15.220 |
and this happened shortly before the pandemic, 00:28:18.280 |
people saying, well, they call it Moore's Law, 00:28:20.660 |
which is not correct, because it's not all Intel. 00:28:25.540 |
In fact, this started decades before Intel was even created. 00:28:29.700 |
It wasn't with transistors formed into a grid. 00:28:34.140 |
- So it's not just transistor count or transistor size. 00:28:43.580 |
to individual transistors, and then to integrated circuits. 00:29:02.940 |
but a few years ago, they stopped making the fastest chips. 00:29:07.200 |
But if you take the fastest chip of any technology 00:29:19.820 |
- So you don't think Moore's Law, broadly defined, is dead? 00:29:31.420 |
because it has nothing to do with Moore or with Intel. 00:29:34.780 |
But yes, the exponential growth of computing is continuing 00:29:41.620 |
and has never stopped. - From various sources. 00:29:50.700 |
And if you continue that out, along with software gains, 00:30:03.600 |
from software gains, you multiply by the computer gains, 00:30:15.900 |
And that actually expands roughly twice a year. 00:30:22.880 |
- So we're looking at a plot from 2010 to 2022. 00:30:27.880 |
On the x-axis is the publication date of the model, 00:30:31.440 |
and perhaps sometimes the actual paper associated with it. 00:30:34.260 |
And on the y-axis is training compute in FLOPs. 00:30:40.300 |
And so basically this is looking at the increase 00:30:43.900 |
in the, not transistors, but the computational power 00:30:51.540 |
- Yes, the computational power that created these models. 00:30:57.620 |
- Which is even faster than the transistor trend. 00:31:01.260 |
Actually, since it goes faster than the amount of cost, 00:31:06.920 |
this has actually become a greater investment 00:31:21.520 |
And it's just very hard to imagine what that will be like. 00:31:25.080 |
- And that's the singularity, what we can't even imagine. 00:31:28.440 |
- Right, that's why we call it the singularity. 00:31:51.160 |
that what the singularity actually feels like 00:31:58.280 |
with exponentially increasing cognitive capabilities 00:32:05.460 |
and we almost, because everything's moving so quickly, 00:32:08.460 |
aren't really able to introspect that our life has changed? 00:32:13.700 |
- Yeah, but I mean, we will have that much greater capacity 00:32:17.460 |
to understand things, so we should be able to look back. 00:32:23.220 |
- But we will need people, basically like you and me, 00:32:29.260 |
- But we might be distracted by all the other sources 00:32:32.260 |
of entertainment and fun because the exponential power 00:32:59.340 |
or will most of our advancement be in physical reality? 00:33:02.060 |
- Well, that's a little bit like Second Life, 00:33:04.700 |
although the Second Life actually didn't work very well 00:33:07.100 |
because it couldn't actually handle too many people. 00:33:09.580 |
And I don't think the metaverse has come to being. 00:33:16.860 |
It won't necessarily be from that one company. 00:33:23.700 |
but yes, we're gonna live increasingly online, 00:33:31.540 |
- Do you think it's possible that given this merger 00:33:42.660 |
Most of our life, we fall in love, we make friends, 00:33:46.300 |
we come up with ideas, we do collaborations, we have fun. 00:33:49.420 |
- I actually know somebody who's marrying somebody 00:33:52.940 |
I think they just met her briefly before the wedding, 00:33:57.700 |
but she actually fell in love with this other person 00:34:10.300 |
- That's a beautiful story, but do you think that story 00:34:13.140 |
is one that might be experienced as opposed to by 00:34:18.460 |
but instead by hundreds of millions of people? 00:34:27.220 |
And if anybody can do it, then it's really not 00:34:33.640 |
So I think more and more people will do that. 00:34:37.500 |
- But that's turning our back on our entire history 00:34:41.620 |
Or the old days, we used to fall in love by holding hands 00:34:50.780 |
- Actually, I have five patents on where you can hold hands, 00:34:58.740 |
So the touch, the sense, it's all just senses. 00:35:02.860 |
- Yeah, I mean, touch is, it's not just that you're touching 00:35:06.420 |
someone or not, there's a whole way of doing it, 00:35:21.620 |
- I have certain worries about the future, but not-- 00:35:39.500 |
Can you maybe talk through some of those stages, 00:35:42.820 |
from the physics and chemistry to DNA and brains, 00:35:54.180 |
So physics and chemistry, that's how we started. 00:35:57.720 |
- So from the very beginning of the universe-- 00:36:02.180 |
- We had lots of electrons and various things 00:36:09.060 |
many billions of years, kind of jumping ahead here 00:36:16.020 |
where we have things like love and creativity. 00:36:19.300 |
It's really quite remarkable that that happens. 00:36:21.820 |
But finally, physics and chemistry created biology and DNA, 00:36:27.860 |
and now you had actually one type of molecule 00:36:33.460 |
that described the cutting edge of this process. 00:36:37.000 |
And we go from physics and chemistry to biology. 00:36:47.120 |
I mean, not everything that's created by biology 00:36:51.460 |
has a brain, but eventually brains came along. 00:36:56.460 |
- And all of this is happening faster and faster. 00:37:04.540 |
Another key thing is actually not just brains, 00:37:18.080 |
Elephants have a bigger brain, whales have a bigger brain, 00:37:37.920 |
that's useful for puzzle solving in the physical reality. 00:37:41.360 |
- So I could think, I could look at a tree and go, 00:37:46.280 |
and eliminate the leaves and carve a tip on it 00:37:51.840 |
And you can't do that if you don't have a thumb. 00:38:33.320 |
And we're putting that into our human technology. 00:38:37.920 |
- So create the technology inspired by our own intelligence 00:38:48.680 |
Or do you ultimately see it as fundamentally-- 00:38:50.520 |
- And we ride along, but a lot of people don't see that. 00:38:52.840 |
They say, well, you got humans and you got machines 00:38:56.240 |
and there's no way we can ultimately compete with humans. 00:39:02.240 |
Lee Sedol, who's like the best Go player in the world, 00:39:16.680 |
But now a machine can actually go way beyond him. 00:39:22.440 |
And so he says, well, there's no point playing it anymore. 00:39:25.120 |
- That may be more true for games than it is for life. 00:39:28.960 |
I think there's a lot of benefit to working together 00:39:38.000 |
is it more likely that we merge with AI or AI replaces us? 00:39:43.000 |
- A lot of people just think computers come along 00:39:48.400 |
We can't really compete and that's the end of it. 00:40:36.840 |
even as a percentage of the population, has gone way up. 00:40:41.680 |
- We're looking at the X-axis year from 1774 to 2024 00:40:46.240 |
and on the Y-axis, personal income per capita 00:40:49.600 |
in constant dollars and it's growing super linearly. 00:41:06.520 |
is because we've basically enhanced our own capabilities 00:41:13.400 |
That's a key way in which we're gonna be able 00:41:18.640 |
by increasing the number of different parameters 00:41:28.600 |
to be able to get a glimpse preview of your upcoming book, 00:41:37.280 |
And one of the themes outside of just discussing 00:41:41.880 |
the increasing exponential growth of technology, 00:41:44.720 |
one of the themes is that things are getting better 00:41:53.680 |
So one of the things you're saying is with jobs. 00:42:01.040 |
especially powerful AI, will get rid of jobs. 00:42:16.680 |
And so the question is, will this time be different? 00:42:24.480 |
And it really has to do with how quickly we can merge 00:42:34.920 |
and maybe it's overcome some of its key problems, 00:42:38.640 |
and we really haven't enhanced human intelligence, 00:42:45.640 |
But I mean, that's why we create technologies, 00:42:58.800 |
We're not just gonna sit here with 300 million 00:43:10.940 |
Because that's useful, but we can multiply that by 10, 00:43:22.280 |
And you might think, well, what's the point of doing that? 00:43:28.700 |
It's like asking somebody that's never heard music, 00:43:36.580 |
I mean, you can't appreciate it until you've created it. 00:43:39.920 |
- There's some worry that there'll be a wealth disparity. 00:43:46.860 |
- Class or wealth disparity, only the rich people will be, 00:43:51.560 |
basically, the rich people will first have access 00:43:54.280 |
to this kind of thing, and then because of this kind 00:43:56.940 |
of thing, because the ability to merge will get richer, 00:44:05.180 |
I mean, there's like four billion cell phones 00:44:17.540 |
So you had to have some wealth in order to afford them. 00:44:26.480 |
So you can only afford these things if you're wealthy 00:44:31.260 |
at a point where they really don't work very well. 00:44:34.060 |
- So achieving scale and making it inexpensive 00:44:43.560 |
So these are not totally cheap, but they're pretty cheap. 00:44:46.960 |
I mean, you can get them for a few hundred dollars. 00:44:52.140 |
- Especially given the kind of things it provides for you. 00:44:57.100 |
that have very little, but they have a smartphone. 00:45:03.840 |
- I mean, I see homeless people have their own cell phones. 00:45:07.640 |
- Yeah, so your sense is any kind of advanced technology 00:45:13.760 |
- Right, it ultimately becomes cheap and will be affordable. 00:45:21.040 |
to put something in my brain to connect to computers 00:45:39.640 |
- So in which other ways, as you outline your book, 00:45:54.400 |
like even if you look at extreme poverty, for example. 00:46:00.880 |
taken on extreme poverty, and the people were asked, 00:46:16.720 |
If you're watching this or listening to this, 00:46:27.080 |
88% thought it had gotten worse or remained the same. 00:46:39.480 |
- So only 1% of people got the right optimistic estimate 00:46:47.480 |
and it's true of almost everything you look at. 00:46:51.080 |
You don't wanna go back 100 years or 50 years. 00:47:01.000 |
- So literacy rate increasing over the past few centuries 00:47:07.920 |
nearly to 100% across many of the nations in the world. 00:47:28.160 |
particularly as we get into more advanced stages 00:47:33.400 |
- For life expectancy, these trends are the same 00:47:48.360 |
which might bring up some sort of controversial issues, 00:47:59.560 |
- Exactly, and somebody might represent democracy, 00:48:03.240 |
and go backwards, but we basically had no democracies 00:48:13.880 |
which in the scale of human history isn't that long. 00:48:30.640 |
and having their ideas, having their beliefs, 00:48:51.360 |
being, for example, being against any kind of stuff 00:49:15.160 |
- I mean, I do have some concerns about this, 00:49:27.940 |
Spreading misinformation on social networks is one of them, 00:50:03.560 |
We'd get under our desks and put our hands behind our heads 00:50:32.120 |
I mean, we have viruses that are hard to spread, 00:50:51.580 |
that would be very easy to spread and very dangerous, 00:51:02.060 |
without people noticing, 'cause people could get it, 00:51:04.620 |
they'd have no symptoms, and then everybody would get it, 00:51:08.340 |
and then symptoms would occur maybe a month later. 00:51:11.820 |
So I mean, and that actually doesn't occur normally, 00:51:17.820 |
because if we were to have a problem with that, 00:51:29.020 |
that we don't have viruses that can spread easily 00:51:33.180 |
and kill us, because otherwise we wouldn't exist. 00:51:39.060 |
They want to spread and keep the host alive somewhat. 00:51:44.100 |
So you can describe various dangers with biology. 00:51:47.240 |
Also nanotechnology, which we actually haven't experienced 00:51:53.500 |
yet, but there are people that are creating nanotechnology, 00:52:07.540 |
What's exciting, what's terrifying about nanobots? 00:52:19.020 |
and you need a small entity that can actually get in there 00:52:28.180 |
to connect our brains to AI within ourselves, 00:52:41.820 |
And that's key, actually, 'cause a lot of the things 00:52:45.740 |
like Neuralink are really not high-bandwidth yet. 00:52:49.020 |
- So nanobots is the way you achieve high-bandwidth. 00:52:52.700 |
How much intelligence would those nanobots have? 00:53:19.820 |
- Well, I mean, there's the gray goo challenge. 00:53:23.580 |
- If you have a nanobot that wanted to create 00:53:37.520 |
and was able to operate in a natural environment, 00:53:55.460 |
- I'd love to hear your opinion about the 21st century 00:54:01.840 |
and whether you think we might destroy ourselves. 00:54:11.760 |
that we could have a hot war with nuclear powers involved 00:54:16.760 |
and the tensions building and the seeming forgetting 00:54:23.320 |
of how terrifying and destructive nuclear weapons are. 00:54:55.360 |
with this one nuclear power plant that's been taken over, 00:55:06.920 |
a lot of people worrying that that's gonna happen. 00:55:22.400 |
- We've never had another one go off through anger. 00:55:40.660 |
and ultimately, superintelligent AI will help us 00:55:47.720 |
- I think so, but we do have to be mindful of these dangers. 00:55:52.440 |
And there are other dangers besides nuclear weapons. 00:56:01.100 |
will we be able to upload our mind in a computer 00:56:11.660 |
So copy our mind into a computer and leave the body behind? 00:56:15.320 |
- Let me describe one thing I've already done with my father. 00:56:25.420 |
This is public, came out, I think, six years ago, 00:56:35.740 |
is still on the market, it would read 200,000 books, 00:56:40.740 |
and then find the one sentence in 200,000 books 00:56:52.740 |
and you get the best answer in 200,000 books. 00:56:56.220 |
But I was also able to take it and not go through 00:57:02.260 |
200,000 books, but go through a book that I put together, 00:57:07.060 |
which is basically everything my father had written. 00:57:09.700 |
So everything he had written, I had gathered, 00:57:20.220 |
Now, I didn't think this actually would work that well, 00:57:23.340 |
because stuff he had written was stuff about how to lay out, 00:57:28.340 |
I mean, he directed choral groups and music groups, 00:57:35.860 |
and he would be laying out how the people should, 00:57:49.620 |
and all kinds of things that really didn't seem 00:57:55.100 |
And yet, when you ask a question, it would go through it, 00:58:00.780 |
and it would actually give you a very good answer. 00:58:04.700 |
So I said, "Well, who's the most interesting composer?" 00:58:09.500 |
And he would go on about how Brahms was fabulous, 00:58:13.200 |
and talk about the importance of music education. 00:58:17.980 |
- So you could have essentially a question and answer, 00:58:22.980 |
which was actually more interesting than talking to him, 00:58:25.900 |
because if you talked to him, he'd be concerned about 00:58:50.280 |
You know, you get used to missing somebody after 52 years, 00:58:57.920 |
and I didn't really have intelligent conversations with him 00:59:11.880 |
about different things like music and other things. 00:59:19.840 |
- What did you learn about life from your father? 00:59:59.840 |
- I mean, I got involved with technology at like age five. 01:00:11.320 |
I remember, this actually happened with my grandmother. 01:00:20.080 |
and she wrote a book, "One Life is Not Enough," 01:00:23.040 |
which is actually a good title for a book I might write. 01:00:33.800 |
So my mother's mother's mother created the school in 1868, 01:00:47.020 |
and you were lucky enough to get an education at all, 01:00:52.920 |
and many people didn't have any education as a girl. 01:01:03.520 |
and the book was about the history of the school 01:01:14.000 |
I was not so interested in the story of the school, 01:01:19.000 |
but I was totally amazed with this manual typewriter. 01:01:44.820 |
But in magic, if somebody actually knows how it works, 01:01:57.880 |
I didn't have that word when I was five or six. 01:02:08.800 |
and then went around, collected little pieces 01:02:12.600 |
of mechanical objects from bicycles, from broken radios. 01:02:19.400 |
This was an era where you would allow five or six-year-olds 01:02:37.360 |
And I actually remember talking to these very old girls, 01:02:42.760 |
and telling them, "If I could just figure this out, 01:02:50.080 |
And they said, "Well, you have quite an imagination." 01:03:09.920 |
And all of it was controlled through one control box. 01:03:15.840 |
And it was a big hit in my third grade class. 01:03:27.680 |
where I won the Westinghouse Science Talent Search. 01:03:37.480 |
- You've talked about how you use lucid dreaming 01:03:42.400 |
to think, to come up with ideas as a source of creativity. 01:04:07.080 |
- Well, I mean, sometimes I will think through in a dream 01:04:12.320 |
But I think the key issue that I would tell younger people 01:04:25.080 |
that what you're trying to create already exists. 01:04:39.220 |
You paint a world that you would like to exist, 01:04:42.780 |
you think it exists, and reverse engineer that. 01:04:46.820 |
you're giving a speech about how you created this. 01:04:51.780 |
as to how you would create it in order to make it work. 01:05:10.640 |
we're trying to invent, I would use the present tense 01:05:42.940 |
I mean, that's really why I got into futurism. 01:05:54.300 |
It's really to figure out when things are feasible. 01:06:11.140 |
In fact, they did create GPT-2, which didn't work. 01:06:26.280 |
- So futurism, in some sense, is a study of timing, 01:06:31.220 |
trying to understand how the world will evolve 01:06:34.340 |
and when will the capacity for certain ideas emerge. 01:06:42.540 |
But really, its original purpose was to time my products. 01:06:53.840 |
because OCR doesn't require a lot of computation. 01:07:02.780 |
- Yeah, so we were able to do that in the '70s, 01:07:06.540 |
and I waited 'til the '80s to address speech recognition, 01:07:21.340 |
- And that's how you've developed that brain power 01:07:35.300 |
because looking at what things will be like in the future 01:07:40.300 |
reflects such dramatic changes in how humans will live. 01:07:51.260 |
- So you developed that muscle of predicting the future, 01:07:56.260 |
and then apply it broadly, and start to discuss 01:08:02.280 |
how it changes the world of human life on Earth. 01:08:11.620 |
to question assumptions that limit human imagination 01:08:22.820 |
- Well, it's good that you picked that quote, 01:08:24.580 |
because I think that does symbolize what Danielle is about. 01:08:32.840 |
- I mean, we see that when people can go beyond 01:08:38.660 |
the current realm and create something that's new. 01:08:49.980 |
and it did require changes in the way people work. 01:08:53.200 |
- Is there practical advice you give in the book 01:08:57.860 |
about what each of us can do to be a Danielle? 01:09:02.040 |
- Well, she looks at the situation and tries to imagine 01:09:19.700 |
so she can communicate these ideas to other people. 01:09:24.700 |
- And there's practical advice of learning to program 01:09:27.620 |
and recording your life and things of this nature. 01:09:42.220 |
how young people can actually change the world 01:09:56.780 |
that your mind, your body can change the world. 01:10:02.780 |
- And not letting anyone else tell you otherwise. 01:10:08.940 |
When we upload, the story you told about your dad 01:10:15.340 |
we're talking about uploading your mind to the computer. 01:10:25.620 |
We'll have avatars that mimic increasingly better and better 01:10:29.820 |
our behavior, our appearance, all that kind of stuff. 01:10:36.780 |
- Yes, I mean, we need some information about them. 01:11:00.860 |
Now, you could do a Frederick Kurzweil Turing test. 01:11:25.540 |
- Yeah, well, I think that would be the goal. 01:11:28.340 |
- And that's the connection we have with loved ones. 01:11:30.340 |
It's not really based on very strict definition of truth. 01:11:46.740 |
- So do you think we'll have a world of replicants, 01:11:49.900 |
of copies, there'll be a bunch of Ray Kurzweils. 01:12:00.020 |
And you, the original copy, wouldn't even know about it. 01:12:10.020 |
first of all, do you think that world is feasible? 01:12:13.340 |
And do you think there's ethical challenges there? 01:12:18.020 |
with Ray Kurzweil and you not knowing about it? 01:12:38.740 |
but if somebody hanging out with you, a replicant of you. 01:12:43.740 |
- Well, I think I would start, it sounds exciting, 01:12:46.780 |
but then what if they start doing better than me 01:12:55.020 |
And then, because they may be an imperfect copy 01:13:00.020 |
or they may be more social or all these kinds of things. 01:13:10.260 |
Maybe they're a copy of the best version of me 01:13:13.220 |
- Yeah, but if you hang out with a replicant of me 01:13:20.220 |
I'd feel proud of that person 'cause it was based on me. 01:13:24.940 |
- So, but it is a kind of death of this version of you. 01:13:40.260 |
that they've done even more than you were able to do. 01:14:07.400 |
they're not necessarily gonna have your rights. 01:14:10.320 |
And if a replicant occurs to somebody who's already dead, 01:14:21.160 |
Do they have all the agreements that they had? 01:14:25.780 |
- I think you're gonna have to have laws that say, yes. 01:14:30.140 |
There has to be, if you wanna create a replicant, 01:14:33.180 |
they have to have all the same rights as human rights. 01:15:10.620 |
and then all the other ones kind of share the rights. 01:15:13.300 |
Yeah, I just don't think that's very difficult 01:15:17.340 |
to conceive for us humans, the idea that this country-- 01:15:20.420 |
- Well, you create a replicant that has certain, 01:15:26.820 |
including my wife, who would like to get back her father. 01:15:30.580 |
And she doesn't worry about who has rights to what. 01:15:36.620 |
She would have somebody that she could visit with 01:15:42.500 |
And she wouldn't care about any of these other rights. 01:15:49.300 |
- What does your wife think about multiple Ray Kurzweils? 01:15:58.260 |
- I think ultimately that's an important question, 01:16:09.980 |
So the loved ones really are the key determinant, 01:16:24.140 |
and we have to contend with that idea with AI. 01:16:30.260 |
I mean, I talk to people who really miss people who are gone 01:16:52.860 |
the more we're able to reconstruct that person 01:17:10.240 |
In fact, anything that's data is always collected. 01:17:38.300 |
I mean, you can create somebody that's just like them 01:17:51.700 |
in some sense I would continue on if I continued tweeting. 01:17:57.600 |
- Yeah, well, I mean, that's one of the advantages 01:18:13.180 |
do you hope humans will become a multi-planetary species? 01:18:17.380 |
You've talked about the phases, the six epochs, 01:18:20.000 |
and one of them is reaching out into the stars in part. 01:18:23.580 |
- Yes, but the kind of attempts we're making now 01:18:41.100 |
- Yeah, and we're also putting out other human beings, 01:18:59.240 |
it's where we can spread our superintelligence 01:19:13.840 |
- I mean, we would send intelligent masses of nanobots 01:19:13.840 |
- Do you think there's intelligent alien civilizations 01:19:58.980 |
And one gives you thousands of advanced civilizations 01:20:20.200 |
Because we've gone from where the fastest way 01:20:24.900 |
I could send a message to somebody was with a pony, 01:20:28.540 |
which was what, like a century and a half ago? 01:20:42.760 |
you're gonna have an absolutely fantastic amount 01:20:50.440 |
- Yeah, the speed and the scale of information transfer 01:20:53.360 |
is just growing exponentially, in a blink of an eye. 01:21:11.780 |
by maybe millions of years, which isn't that much. 01:21:27.540 |
if 200 or 300 years is enough to go from a pony 01:21:27.540 |
to a fantastic amount of civilization, we would see that. 01:21:35.100 |
So of other civilizations that have occurred, 01:21:55.280 |
But we don't see anything doing galaxy-wide engineering. 01:22:00.120 |
So either they don't exist, or this very universe 01:22:10.940 |
- Well, that's another explanation that, yes, 01:22:14.920 |
you've got some teenage kids in another civilization. 01:22:19.180 |
- Do you find compelling the simulation hypothesis 01:22:22.320 |
as a thought experiment, that we're living in a simulation? 01:22:29.440 |
so we are an example in a computational world. 01:22:44.860 |
but it nonetheless is taking place in a computational world, 01:23:05.680 |
- Well, then it's the teenager that makes the video game. 01:23:14.720 |
cognitive capability have strived to understand ourselves, 01:23:37.720 |
We started out with lots of particles going around, 01:23:42.840 |
and there's nothing that represents love and creativity. 01:23:56.880 |
and that has to do actually with consciousness, 01:24:00.000 |
because you can't have love without consciousness. 01:24:29.200 |
And I think they've identified three of them. 01:24:39.080 |
that you can make deals with, and he gets angry, 01:24:50.440 |
as a symbol of love and peace and harmony and so forth. 01:25:03.640 |
not as a person in the sky that you can make deals with. 01:25:08.120 |
- It's whatever the magic that goes from basic elements 01:25:19.200 |
extremely beautiful and powerful is cellular automata, 01:25:23.600 |
Do you think whatever the heck happens in cellular automata 01:25:27.720 |
where interesting, complicated objects emerge, 01:25:33.480 |
The emergence of love in this seemingly primitive universe? 01:25:54.880 |
about cellular automata is it's primitive building blocks, 01:26:17.520 |
We went through all the six phases of reality. 01:26:41.320 |
of being able to expand human life quickly enough 01:27:01.480 |
particularly with, for example, doing simulated biology. 01:27:08.880 |
say, by the end of this decade, and that's my goal. 01:27:12.800 |
- Do you hope to achieve longevity escape velocity? 01:27:22.960 |
I can't really come on your program saying, I've done it. 01:27:26.080 |
I've achieved immortality, because it's never forever. 01:27:35.280 |
- But we'd like to actually advance human life expectancy, 01:27:42.480 |
every year, and I think we can get there within, 01:27:53.600 |
"The Nine Steps to Living Well Forever," your book. 01:28:02.320 |
- Yeah, I mean, we live in a body that doesn't last forever. 01:28:11.160 |
and we're discovering things, I think, that will extend it. 01:28:22.080 |
Went to Mexico 40 years ago, developed salmonella. 01:28:41.800 |
an autoimmune disorder that destroys your pancreas. 01:28:48.720 |
'cause type two diabetes, your pancreas works fine, 01:28:51.800 |
but your cells don't absorb the insulin well. 01:28:57.060 |
The pancreatitis I had partially damaged my pancreas, 01:29:04.560 |
but it was a one-time thing, it didn't continue, 01:29:29.360 |
just tinkering with my own body to keep it going. 01:29:34.240 |
So I do think I'll last 'til the end of this decade, 01:29:37.720 |
and I think we'll achieve longevity escape velocity. 01:30:14.780 |
- Is there a case, this is a difficult question, 01:30:18.400 |
but is there a case to be made against living forever 01:30:23.400 |
that a finite life, that mortality is a feature, not a bug, 01:30:29.640 |
that living a shorter, so dying makes ice cream 01:30:34.640 |
taste delicious, makes life intensely beautiful 01:31:03.860 |
So I mean, death is not something to celebrate, 01:31:10.640 |
but we've lived in a world where people just accept this. 01:31:15.980 |
Well, life is short, you see it all the time on TV, 01:31:18.340 |
oh, life's short, you have to take advantage of it. 01:31:22.700 |
that you could actually go beyond normal lifetimes. 01:31:27.940 |
But anytime we talk about death or a death of a person, 01:31:35.420 |
If you have somebody that lives to 100 years old, 01:31:47.620 |
In fact, these kinds of trends are gonna provide 01:31:52.000 |
greater and greater opportunity for everybody, 01:32:03.060 |
that will look back and remember Ray Kurzweil version zero. 01:32:17.020 |
In a "Hitchhiker's Guide to the Galaxy" summary 01:32:20.460 |
of Ray Kurzweil, what do you hope your legacy is? 01:32:23.160 |
- Well, I mean, I do hope to be around, so that's-- 01:32:28.060 |
Do you think you'll be the same person around? 01:32:32.140 |
- I mean, am I the same person I was when I was 20 or 10? 01:32:37.020 |
- You would be the same person in that same way, 01:32:41.560 |
All we have of that, all you have of that person 01:32:46.900 |
is your memories, which are probably distorted in some way. 01:32:55.860 |
Depending on your psyche, you might focus on the bad parts, 01:33:01.260 |
- Right, but I mean, I'd still have a relationship 01:33:10.820 |
- How will you and the other super intelligent AIs 01:33:17.740 |
What do you hope to be remembered by, this version of you, 01:33:25.660 |
- Well, I think it's expressed well in my books, 01:33:28.220 |
trying to create some new realities that people will accept. 01:33:32.620 |
I mean, that's something that gives me great pleasure 01:33:36.060 |
and greater insight into what makes humans valuable. 01:33:45.340 |
I'm not the only person who's tempted to comment on that. 01:34:09.100 |
- So you asked your dad about the meaning of life 01:34:15.100 |
and he said, "Love, let me ask you the same question. 01:34:22.840 |
"This beautiful journey that we're on in phase four, 01:34:35.500 |
- Well, I think I'd give the same answer as my father. 01:34:54.260 |
Well, I think that's a beautiful way to end it. 01:35:01.120 |
Thank you for dreaming about a beautiful future 01:35:12.220 |
- It was my pleasure and you have some great insights 01:35:25.580 |
please check out our sponsors in the description. 01:35:37.580 |
"that is the dominant factor in society today. 01:35:41.000 |
"No sensible decision could be made any longer 01:35:43.780 |
"without taking into account not only the world as it is, 01:35:55.300 |
"must take on a science fictional way of thinking." 01:35:58.540 |
Thank you for listening and hope to see you next time.