Elon Musk: So You're Saying There's a Chance - Neuralink and Merging with AI



00:00:00.000 | There's a tremendous amount of good that Neuralink can do in solving critical damage to the brain
00:00:08.480 | or the spinal cord.
00:00:10.120 | There's a lot that can be done to improve quality of life of individuals, and those
00:00:15.080 | will be steps along the way.
00:00:17.080 | And then ultimately, it's intended to address the existential risk associated with digital
00:00:24.840 | superintelligence.
00:00:25.840 | Like, we will not be able to be smarter than a digital supercomputer.
00:00:33.920 | So therefore, if you cannot beat them, join them.
00:00:37.480 | And at least we'll have that option.
00:00:39.920 | So you have hope that Neuralink will be able to be a kind of connection to allow us to
00:00:47.880 | merge, to ride the wave of the improving AI systems?
00:00:52.080 | I think the chance is above 0%.
00:00:55.200 | So it's non-zero.
00:00:56.200 | There's a chance.
00:00:57.200 | Have you seen Dumb and Dumber?
00:00:59.200 | So I'm saying there's a chance.
00:01:00.200 | He's saying one in a billion or one in a million, whatever it was in Dumb and Dumber.
00:01:07.840 | You know, it went from maybe one in a million, and it's improving; maybe it'll be one in a thousand,
00:01:11.720 | and then one in a hundred, then one in ten.
00:01:13.520 | It depends on the rate of improvement of Neuralink and how fast we're able to make progress.
00:01:19.960 | Well, I've talked to a few folks here that are quite brilliant engineers, so I'm excited.
00:01:25.080 | It's important that Neuralink solve this problem sooner rather than later, because the point
00:01:29.360 | at which we have digital superintelligence, that's when we pass the singularity and things
00:01:33.800 | become just very uncertain.
00:01:34.800 | It doesn't mean that they're necessarily bad or good, but the point at which we pass the
00:01:37.960 | singularity, things become extremely unstable.
00:01:40.280 | So we want to have a human brain interface before the singularity, or at least not long
00:01:45.960 | after it, to minimize existential risk for humanity and consciousness as we know it.
00:01:51.440 | Thank you.
00:01:52.440 | [end of transcript]