Grimes: Music, AI, and the Future of Humanity | Lex Fridman Podcast #281
Chapters
0:00 Introduction
2:03 Identity
6:01 Music production
18:52 Spotify
23:34 Experimental music
26:09 Vision of future
38:42 Motherhood
54:50 Consciousness
69:40 Love
75:31 Metaverse
88:13 Technology and bureaucracy
92:11 Mortality
100:36 Understanding evil
104:34 Last person on Earth
107:12 Dan Carlin
109:42 Rise and Fall of the Third Reich
116:07 Coolest human invention
117:28 Advice for young people
120:40 Meaning of life
we are fundamentally different from previous, 00:00:20.180 |
but I think the computers are what make us homo techno. 00:00:22.760 |
I think this is what, it's a brain augmentation. 00:00:29.480 |
to which all the other technologies can also be accelerated. 00:00:32.700 |
- Would you classify yourself as a homo sapien 00:00:37.100 |
- So you're one of the earliest of the species. 00:00:43.080 |
- The following is a conversation with Grimes, 00:00:47.800 |
an artist, musician, songwriter, producer, director, 00:00:59.840 |
to help form an optimistic vision of our future. 00:01:18.280 |
- Yeah, I actually, this microphone Cloudlifter 00:01:27.640 |
- And that, yeah, it's an incredible microphone. 00:01:38.300 |
if you're just in a room and like the music's playing 00:01:44.720 |
so I think it's like a good mic for like just vibing out 00:01:51.880 |
- Anyway, this is the Michael Jackson, Quincy Jones 00:02:04.760 |
at least in this space and time, is C, like the letter C. 00:02:08.280 |
And you told me that C means a lot of things. 00:02:17.640 |
And it happens to be my favorite programming language 00:02:24.120 |
but it's also powerful, fast, and it's dangerous 00:02:28.160 |
'cause you can mess things up really bad with it 00:02:36.560 |
- I mean, to me, the coolest is the speed of light, 00:02:46.280 |
because essentially that's what we're rendering at. 00:02:49.080 |
See, I think we'll know if we're in a simulation 00:02:53.720 |
because if they can improve their render speed, then. 00:02:58.440 |
- It's already pretty good, but if it improves, 00:03:13.320 |
in terms of us humans on Earth interacting with things. 00:03:21.440 |
then you're gonna start noticing some weird stuff. 00:03:23.840 |
Or if you can operate in like around a black hole, 00:03:27.320 |
then you're gonna start to see some render issues. 00:03:29.680 |
- You can't go faster than the speed of light, correct? 00:03:36.680 |
- Theoretically, you can, you have wormholes. 00:03:41.880 |
that precludes faster-than-light travel, 00:03:49.840 |
some really funky stuff with very heavy things 00:04:20.880 |
were you a different person than you are tonight? 00:04:50.160 |
- Okay, these are really intense questions, but-- 00:04:53.360 |
'Cause I ask this myself, like look in the mirror, 00:05:02.960 |
So I have a very inconsistent personality, yeah. 00:05:11.480 |
- Or my mood, like I'll go from being like a megalomaniac 00:05:16.120 |
to being like, you know, just like a total hermit 00:05:21.400 |
- So some combinatorial combination of your mood 00:05:26.320 |
- Yeah, mood and people I'm interacting with. 00:05:41.720 |
- Like my best friends, like people who I can, 00:05:47.520 |
and I know that they're gonna understand everything 00:05:57.680 |
But it's like, yeah, it takes a lot to get there. 00:06:05.600 |
Do those help you out as an artist or as a human being 00:06:11.920 |
So in creating music, in creating art, in living life, 00:06:15.440 |
do you like the constraints that this world puts on you? 00:06:24.720 |
- If constraints are moving, then you're good, right? 00:06:29.720 |
Like it's like as we are progressing with technology, 00:06:32.040 |
we're changing the constraints of like artistic creation. 00:06:34.880 |
Making video and music and stuff is getting a lot cheaper. 00:06:39.720 |
There's constantly new technology and new software 00:06:44.000 |
We have so much more freedom than we had in the '70s. 00:06:48.680 |
when they recorded "Thriller" with this microphone, 00:06:51.440 |
like they had to use a mixing desk and all this stuff. 00:07:00.520 |
And now I can just, I've made a whole album on this computer. 00:07:06.760 |
but then I'm also constrained in different ways 00:07:10.240 |
'cause there's like literally millions more artists. 00:07:15.680 |
It's just like, I also, I didn't learn music. 00:07:24.760 |
So I'm really kind of just like messing around 00:07:33.240 |
- Well, yeah, I mean, but the nature of music is changing. 00:07:35.720 |
So you're saying you don't know actual music, 00:07:41.880 |
is becoming, it's like merging with technology. 00:07:47.600 |
- It's becoming something more than just like 00:07:54.920 |
that requires engineering skills, programming skills, 00:07:59.400 |
some kind of a human robot interaction skills, 00:08:04.720 |
that Michael Jackson had, which is like a good ear 00:08:09.000 |
and not the final thing, what is put together. 00:08:11.480 |
Like you're allowed, you're enabled, empowered 00:08:17.160 |
to start like layering insane amounts of stuff. 00:08:24.960 |
I feel like people really don't appreciate it. 00:08:27.920 |
the way that people like pay producers and stuff, 00:08:32.440 |
it's super, producers are just deeply underrated. 00:08:35.560 |
Like so many of the songs that are popular right now 00:08:42.240 |
is 'cause the production is really interesting 00:09:01.360 |
there isn't like a formal training path for it. 00:09:12.160 |
like they didn't go to music school or anything. 00:09:18.360 |
is there some commonalities that tie them together 00:09:21.280 |
or are they all just different kinds of weirdos? 00:09:41.560 |
- But the thing he does, which is interesting, 00:10:02.640 |
to simplify, to like push towards minimalism. 00:10:11.560 |
'cause Rick Rubin produces for other artists, 00:10:32.400 |
- So you're the engineer, the producer and the artist. 00:10:51.080 |
I mean, lately, sometimes I'll work with a producer. 00:11:07.120 |
where everything became available on the computer 00:11:11.920 |
and you kind of got this like lone wizard energy thing going. 00:11:19.680 |
Is the loneliness somehow an engine of creativity? 00:11:24.560 |
most of your creative quote unquote genius in quotes 00:11:40.800 |
he's like most artists, they have about 10 years, 00:11:45.120 |
And then they usually stop making their like vital shit. 00:11:48.760 |
And I feel like I'm sort of like nearing the end 00:11:58.600 |
- Now I'm like, I'm in the process of becoming somebody else 00:12:08.360 |
and making like some of the most vital work I've ever made. 00:12:13.840 |
is like one of the best tools you can possibly find. 00:12:23.320 |
whatever HP plus one or like adds some like stats 00:12:30.760 |
will like square it instead of just like adding something. 00:12:34.240 |
- Double up the experience points, I love this. 00:12:36.320 |
We should also mention we're playing tavern music 00:12:58.400 |
There's something really joyful about wandering places 00:13:06.480 |
just exploring these landscapes in another world, 00:13:17.600 |
It doesn't have some of the messy complexities of life. 00:13:25.600 |
I'm sure in Elden Ring, there's a bunch of monsters 00:13:29.960 |
I feel like this is a good analogy to music production though 00:13:36.640 |
it's sort of like similar to people, to music producers, 00:13:46.200 |
but they're like, it's like the artist engineer 00:13:54.880 |
- Well, you're saying they don't get enough credit. 00:14:05.480 |
I'm not, you know, there's like Timbaland and Skrillex, 00:14:15.960 |
and just don't really know what it is per se. 00:14:27.240 |
just going for like even just a basic pop hit, 00:14:34.840 |
The production on that is actually like really crazy. 00:14:40.520 |
but it's like the production is exceptionally memorable. 00:14:49.160 |
It's just like, isn't part of like the rhetoric 00:14:53.400 |
We just sort of like don't consider the music producer. 00:14:57.200 |
Because I think the music producer used to be more 00:15:11.520 |
- We don't talk about like that with the music as often. 00:15:14.440 |
- The Beatles' music producer was one of the first kind of guy, 00:15:24.160 |
He has the same, I forget his name, but you know, 00:15:33.480 |
to get the sound, to get the authentic sound. 00:15:38.120 |
You think those, where did they fit into how important 00:15:52.200 |
I feel like there's this desire for authenticity. 00:15:54.880 |
I used to be like really mad when like people wouldn't write 00:15:57.120 |
or produce their music and I'd be like, that's fake. 00:15:59.280 |
And then I realized there's all this like weird bitterness 00:16:04.280 |
and like aggroness in art about authenticity. 00:16:07.760 |
But I had this kind of like weird realization recently 00:16:10.800 |
where I started thinking that like art is sort of 00:16:20.240 |
Like art is kind of a conversation with all the artists 00:16:31.120 |
it's not like anyone's reinventing the wheel here. 00:16:36.680 |
thousands of years of art and like running it 00:16:40.000 |
through your own little algorithm and then like making, 00:16:49.640 |
- Like, and it's like, I feel like everyone's always like, 00:16:51.560 |
there's always copyright and IP and this and that 00:16:55.200 |
And it's just like, I think we need to stop seeing this 00:17:01.600 |
oh, the creative genius, the lone creative genius 00:17:05.040 |
'Cause it's like, I think art shouldn't be about that. 00:17:12.080 |
And it's also, art is also kind of the collective memory 00:17:14.960 |
Like we don't give a fuck about whatever ancient Egypt, 00:17:27.640 |
how many shields needed to be produced for this. 00:17:32.280 |
And it's like, you know, it's like in our day-to-day life, 00:17:34.880 |
there's all this stuff that seems more important than art 00:17:41.880 |
like the only thing that's really gonna be left 00:17:52.520 |
is the art we've generated across the different centuries, 00:18:00.740 |
they're gonna find the hieroglyphics and the pyramids. 00:18:04.360 |
They might find like the engineering marvels, 00:18:13.720 |
- I consider engineering in those formats to be art, 00:18:19.560 |
- It sucks that like digital art is easier to delete. 00:18:42.800 |
you know how we have like seed banks up in the North 00:18:46.400 |
- Like we should probably have like a solar powered 00:18:59.720 |
I get to me Spotify sort of as a consumer is super exciting. 00:19:08.400 |
make it super easy to sort of curate my own playlist 00:19:25.640 |
that was really liberating that I could let go of that 00:19:28.200 |
and letting go of the albums you're kind of collecting 00:19:33.600 |
exploring new artists and all that kind of stuff. 00:19:42.040 |
'cause there's more and more and more artists 00:19:46.040 |
- I think it's better that there's more artists. 00:19:49.840 |
'cause this is all from a conversation with Daniel Ek. 00:19:54.040 |
- We're all a victim of somebody's propaganda. 00:19:58.960 |
- But Daniel Ek was telling me that, you know, 00:20:19.560 |
Like there was just like a very tiny kind of 1%. 00:20:22.880 |
And Spotify has kind of democratized the industry 00:20:27.440 |
because now I think he said there's about a million artists 00:20:41.160 |
than and have more artists be able to have that. 00:20:46.560 |
Even though I like, I wish it could include everyone, but. 00:20:54.120 |
They wanna basically have as many creators as possible 00:21:07.840 |
My manager is building an app that can manage you. 00:21:12.840 |
So it'll like help you organize your percentages 00:21:16.520 |
and get your publishing and da, da, da, da, da. 00:21:20.000 |
So you can have a much bigger, it'll just like automate it. 00:21:35.680 |
Because one of the issues with music right now, 00:21:39.840 |
but it's that the art industry is filled with middlemen 00:22:00.480 |
And so I think part of the reason I'm a technocrat, 00:22:04.800 |
which I mean, your fans are gonna be technocrats. 00:22:07.080 |
So no one's, they're not gonna be mad at me about this, 00:22:09.320 |
but like my fans hate it when I say this kind of thing 00:22:18.800 |
and they were like, "The Martian technocracy." 00:22:25.640 |
I was like, "Cause Martian technocracy sounds sick to me." 00:22:36.120 |
if you can create an app that removes the need for a lawyer 00:22:39.960 |
and then you could have a smart contracts on the blockchain, 00:22:48.080 |
like can read your stuff and explain it to you, 00:22:57.040 |
the amount of money that you're getting from Spotify 00:22:58.720 |
actually means a lot more and goes a lot farther. 00:23:08.240 |
- Yeah, I think the issue isn't that there's not enough, 00:23:12.720 |
And I'm really into this positive-sum mindset, 00:23:23.560 |
how do we make the, or worrying about scarcity, 00:23:27.840 |
why don't we just increase the efficiency in that way? 00:23:38.660 |
being a musician is like having a conversation 00:23:57.260 |
and how much of it is like trying to as much as possible 00:24:00.140 |
to be an outsider and come up with something totally new? 00:24:02.640 |
It's like when you're thinking, when you're experimenting, 00:24:05.680 |
are you trying to be totally different, totally weird? 00:24:12.120 |
- Man, this is so hard 'cause I feel like I'm 00:24:14.640 |
kind of in the process of semi-retiring from music. 00:24:38.320 |
part of the reason music is marketed at young people 00:24:41.680 |
is 'cause young people are very neuroplastic. 00:24:48.240 |
it's gonna be really easy for you to love new music. 00:24:58.780 |
And I think it keeps my brain really plastic. 00:25:07.120 |
force yourself to like, okay, well, why do people like it? 00:25:09.980 |
And like, you know, make your brain form new neural pathways 00:25:34.020 |
and you just kind of stay with that for the rest of your life 00:25:36.780 |
and then you never understand the modern music. 00:25:52.900 |
- So it's a good measure of the species' open-mindedness 00:25:57.340 |
and willingness to change, how often you listen to new music. 00:26:01.420 |
- The brain, let's put the music brain back on the shelf. 00:26:05.200 |
I gotta pull out the futurist brain for a second. 00:26:16.920 |
from like from our current way of life on earth? 00:26:22.120 |
We can talk about augmented reality, virtual reality, 00:26:25.400 |
maybe robots, maybe space travel, maybe video games, 00:26:36.260 |
maybe destructive nuclear wars, good and bad. 00:26:39.220 |
When you think about the future, what are you imagining? 00:26:43.540 |
What's the weirdest and the wildest it could be? 00:26:45.940 |
- Have you read "Surface Detail" by Ian Banks? 00:26:50.100 |
"Surface Detail" is my favorite depiction of a, 00:26:56.540 |
It's literally the greatest science fiction book 00:27:14.980 |
- I always wanted to name an album "War and Peace." 00:27:34.980 |
that I've ever read about or heard about in science fiction. 00:27:38.660 |
Basically there's the relationship with super intelligence, 00:27:44.540 |
like artificial super intelligence is just, it's like great. 00:27:49.140 |
I wanna credit the person who coined this term 00:27:55.220 |
And I feel like young women don't get enough credit in. 00:27:58.640 |
Yeah, so if you go to Protopia Futures on Instagram, 00:28:13.960 |
And I'm probably gonna, I'm probably butchering this a bit, 00:28:17.680 |
but Protopia is sort of, if Utopia is unattainable, 00:28:26.000 |
- Wow, that's an awesome Instagram, Protopia Futures. 00:28:28.640 |
- A great, a future that is as good as we can get. 00:28:34.720 |
AI, is this a centralized AI in "Surface Detail" 00:28:46.800 |
that kind of move people around and the ships are sentient. 00:28:52.240 |
And I mean, there's a lot of different types of AI 00:28:58.320 |
but in the opening scene of "Surface Detail," 00:29:02.320 |
and the Culture is basically a Protopian future. 00:29:14.000 |
I think feels hopeless and it's sort of like, 00:29:33.120 |
There's like as close to equality as you can get. 00:29:35.680 |
You know, it's like approximately a good future. 00:29:45.600 |
this girl, she's born as a sex slave outside of the culture. 00:29:52.640 |
She tries to kill the guy who is her like master, 00:29:59.040 |
when she was traveling on a ship through the culture 00:30:16.480 |
is her memory has been uploaded by this neural lace 00:30:22.680 |
And this AI is interfacing with her recorded memory 00:30:31.320 |
But because you had a neural lace, your memory's uploaded. 00:30:35.000 |
And you're going to be born here in the culture 00:30:43.680 |
All the ships are kind of super intelligence. 00:30:46.920 |
a kind of rich, fulfilling experience for the humans. 00:30:49.680 |
- Yeah, like they're like friends with the humans. 00:30:51.960 |
that don't want to exist with biological beings, 00:30:54.440 |
but they just have their own place, like way over there. 00:30:57.080 |
- But they don't, they just do their own thing. 00:31:07.900 |
they're fighting, there's these artificial hells 00:31:17.600 |
Like basically when people do crime, they get sent, 00:31:21.160 |
to an artificial hell and they're eternally tortured. 00:31:23.480 |
And so, and then the way that society is deciding 00:31:36.040 |
people are basically essentially fighting in a video game 00:31:40.120 |
- But they're still experiencing the suffering, right? 00:32:01.640 |
"We should have simulated hell to deter crime." 00:32:13.160 |
they're having like a giant Fortnite battle to decide this. 00:32:21.720 |
That's like, okay, we can have war without death. 00:32:24.560 |
I don't think there should be simulated hells. 00:32:29.840 |
in which technology could go very, very, very, very wrong. 00:32:34.320 |
- So almost punishing people in a digital space 00:32:41.360 |
- So either as a deterrent, like if you committed a crime, 00:32:46.280 |
if there's some sick, demented humans in this world. 00:33:02.040 |
- It's dark 'cause he kind of goes through human history 00:33:14.240 |
"watching the death and torture of other humans." 00:33:21.680 |
we should be allowed to have gladiatorial matches. 00:33:26.200 |
- But consent is hard to achieve in those situations. 00:33:51.240 |
- Well, I think that's what we're doing right now. 00:33:53.200 |
I have this theory that what is very important 00:33:56.360 |
about the current moment is that all of evolution 00:34:00.840 |
has been survival of the fittest up until now. 00:34:03.360 |
And at some point, the lines are kind of fuzzy, 00:34:07.240 |
but in the recent past, or maybe even just right now, 00:34:19.400 |
Like we, probably since like the integration of the iPhone, 00:34:29.320 |
we are fundamentally different from previous, 00:34:43.360 |
and you took an MRI of like a medieval brain, 00:34:49.960 |
- Do you think when historians look back at this time, 00:34:51.960 |
they'll see like this was a fundamental shift 00:35:04.400 |
And I think right now, the way we are evolving, 00:35:13.000 |
but I think this idea that like this is a time 00:35:19.600 |
It like now is the moment to reprogram the human computer. 00:35:27.240 |
your visual cortex will get taken over with other functions. 00:35:37.160 |
And so we actually have a huge responsibility to do that. 00:35:39.920 |
And I think, I'm not sure who should be responsible for that 00:35:42.880 |
but there's definitely not adequate education. 00:35:45.200 |
We're being inundated with all this technology 00:35:57.480 |
And we could evolve, we could be really whatever we want. 00:36:03.040 |
And I think if we choose correctly and we choose wisely, 00:36:06.480 |
consciousness could exist for a very long time 00:36:09.760 |
and integration with AI could be extremely positive. 00:36:16.400 |
- Do you think we might irreversibly screw things up 00:36:23.200 |
So maybe the way we figure things out is by screwing it up. 00:36:29.480 |
we'll see the negative effects of social media 00:36:34.960 |
And then we learn the failure from the failures of the past. 00:36:39.880 |
On the flip side, we can get it wrong in a way 00:36:42.960 |
where like literally we create weapons of war 00:37:02.360 |
if you look at Silicon Valley and you look at like, 00:37:04.320 |
whatever the technocracy, like what's been happening there. 00:37:16.920 |
I guess it was useful, but it was, it's sort of 00:37:28.400 |
like single use cutlery or like meditation apps. 00:37:33.160 |
I think we are actually evolving and changing 00:37:41.600 |
there isn't quite enough education about this. 00:38:12.640 |
would incentivize things that are more useful 00:38:22.960 |
or things that help us intelligently design our new brains. 00:38:37.400 |
So like long-term well-being, long-term happiness. 00:38:47.440 |
And it's like, I keep thinking about motherhood, 00:38:53.720 |
that is very difficult, that is not compensated. 00:39:01.960 |
And so we really devalue motherhood in our society 00:39:06.120 |
Like capitalism does not recognize motherhood. 00:39:08.200 |
It's just a job that you're supposed to do for free. 00:39:11.200 |
And it's like, but I feel like producing great humans 00:39:15.400 |
should be seen as a great, as profit under capitalism. 00:39:19.480 |
Like that should be, that's like a huge social good. 00:39:25.640 |
So like if that was integrated into the profit structure, 00:39:29.880 |
then, you know, and if we potentially found a way 00:39:46.680 |
Like, I mean, that's where you start getting into-- 00:39:49.480 |
- Reallocation of resources that people get upset over. 00:39:56.360 |
- Well, what if we made like a motherhood DAO? 00:40:01.240 |
- You know, and used it to fund like single mothers, 00:40:13.640 |
So, I mean, if you create and put beautiful things 00:40:29.320 |
- Anything that could just should be valued by society. 00:40:34.240 |
into the framework of what, as a market of what, 00:40:38.480 |
like if you contribute children to this world, 00:40:42.440 |
And sort of celebrated, like proportional to what it is, 00:40:47.440 |
which is it's the thing that fuels human civilization. 00:40:54.560 |
I mean, I think we're in very different social spheres, 00:40:56.480 |
but everyone's always saying like dismantle capitalism. 00:40:59.680 |
well, I don't think the government should own everything. 00:41:02.080 |
Like, I don't think we should not have private ownership. 00:41:05.320 |
You know, like that starts getting into weird stuff 00:41:13.560 |
- But obviously capitalism has some major flaws. 00:41:17.040 |
And I think actually Mack showed me this idea 00:41:28.600 |
Like, you know, it's like right now companies need to, 00:41:31.640 |
like you're supposed to grow every quarter or whatever 00:41:45.640 |
Like, do you really need all this extra economic growth 00:41:47.600 |
or could you add this social good and that counts? 00:41:49.560 |
And you know, I don't know if I am not an economist. 00:41:54.040 |
I have no idea how this could be achieved, but-- 00:41:57.880 |
how anything could be achieved either, but they pretend. 00:42:02.360 |
and they go on TV shows and sound like an expert. 00:42:22.400 |
It's actually changing me more right now in this moment 00:42:28.520 |
- Just like in the most recent months and stuff. 00:42:39.360 |
and you look at yourself, it's again, who are you? 00:42:41.860 |
How have you become different, would you say? 00:42:45.520 |
- I think it's just really reorienting my priorities. 00:42:50.520 |
And at first I was really fighting against that 00:42:59.840 |
if like my kids started mattering more than my work. 00:43:05.680 |
And then like more recently I started sort of analyzing 00:43:13.920 |
It's like we've just devalued motherhood so much 00:43:16.320 |
in our culture that like I feel guilty for caring 00:43:21.320 |
about my kids more than I care about my work. 00:43:27.500 |
So it's continually breaking, it's like freedom, 00:43:49.320 |
Just 'cause like the stakes are higher somehow? 00:44:00.840 |
It's just like, it's like going on a crazy journey 00:44:05.440 |
or something, it's like the craziest science fiction novel 00:44:11.080 |
It's just so crazy watching consciousness come into being. 00:44:16.560 |
like you're forced to value your time so much. 00:44:21.160 |
Like when I have creative time now, it's so sacred. 00:44:29.400 |
But the other thing is that I used to just be like a cynic 00:44:35.560 |
like my last album was called "Miss Anthropocene" 00:44:45.020 |
instead of the old gods, we have like new gods 00:44:46.840 |
and it's like Miss Anthropocene is like misanthrope 00:44:52.200 |
like and she's the goddess of climate change or whatever. 00:45:02.720 |
like I used to like have no problem just making cynical, 00:45:09.040 |
And not that there's anything wrong with that, 00:45:11.140 |
but I think having kids just makes you such an optimist. 00:45:13.600 |
It just inherently makes you wanna be an optimist so bad 00:45:16.960 |
that like I feel like I'm more of a responsibility 00:45:25.760 |
'cause everyone's like, oh, you're so privileged. 00:45:44.320 |
It's like, as we said earlier, life imitates art. 00:45:49.800 |
And so we really need more protopian or utopian art. 00:45:58.120 |
And I think the current discourse where that's seen 00:46:15.000 |
And like having kids just makes me wanna imagine 00:46:20.240 |
amazing futures that like maybe I won't be able to build, 00:46:24.440 |
but they will be able to build if they want to. 00:46:30.320 |
So you have to imagine it in order to be able to build it. 00:46:36.720 |
that they somehow a cynical view of the world 00:46:59.320 |
You're blinded by something, but you're certainly blinded. 00:47:03.800 |
That's sad to see because it seems like the optimists 00:47:14.760 |
You have to be either stupid or excited or passionate 00:47:19.240 |
or mad enough to actually believe that it can be built. 00:47:32.480 |
- No, I probably would say I would probably hate it, yeah. 00:47:39.800 |
I don't have strong feelings about "Star Wars." 00:47:49.640 |
I really wanna have one more son and call him, 00:48:01.760 |
- And then techno is obviously the best genre of music, 00:48:26.840 |
- But don't kill what you hate, save what you love. 00:48:30.440 |
- Don't kill what you hate, save what you love. 00:48:36.600 |
We're just diagnosing and diagnosing and diagnosing. 00:48:41.640 |
and we're not trying to save what we love enough. 00:49:01.640 |
Maybe we don't need to destroy the oil industry, 00:49:05.680 |
maybe we just create great new battery technology 00:49:15.480 |
It's like, don't kill what you hate, save what you love. 00:49:20.200 |
Make new things and just render the old things unusable. 00:49:35.720 |
I feel like we could completely revolutionize education 00:49:40.440 |
and you have to pay to get all the studies and everything. 00:49:46.560 |
or we created a DAO that was funding studies, 00:49:48.520 |
and those studies were open source, or free for everyone? 00:49:55.160 |
and decentralized education and made it free, 00:50:00.920 |
and all the outcomes of studies are on the internet? 00:50:10.120 |
and you just take tests when you apply for a job, 00:50:14.280 |
and if you're qualified, then you can work there. 00:50:24.000 |
You gotta think from just basic first principles, 00:50:43.360 |
like you're not gonna go in and just kill them, 00:50:55.120 |
I think part of our switch to intelligent design 00:51:04.280 |
I don't think we can eradicate violence from our species, 00:51:13.240 |
our primitive brains that are fighting over scarcity, 00:51:20.600 |
and move into, we can optimize for creativity and building. 00:51:25.400 |
- Yeah, it's interesting to think how that happens. 00:51:29.560 |
some of it is living life and introspecting your own mind, 00:51:34.080 |
and trying to live up to the better angels of your nature 00:51:37.760 |
for each one of us, all those kinds of things at scale. 00:51:51.080 |
technology is a really promising way to do that. 00:51:55.160 |
Like social media should be a really promising way 00:52:05.160 |
I don't engage with any of the negative stuff. 00:52:07.520 |
Just not even like by blocking or any of that kind of stuff, 00:52:14.720 |
Like just, like when somebody says something negative, 00:52:18.480 |
I see it, I immediately think positive thoughts about them, 00:52:25.680 |
After that, just move on, 'cause like that negative energy, 00:52:30.280 |
they're going to get excited in a negative way right back. 00:52:37.360 |
But you would think technology would assist us 00:52:46.360 |
But unfortunately, social media profits from the negativity. 00:52:53.400 |
Like you should take a course before you use it. 00:52:59.480 |
like when I say reprogram the human computer, 00:53:09.240 |
And like you should learn how to have hygiene 00:53:19.920 |
But I don't know, I'm not sure social media should, 00:53:27.480 |
- I mean, we're in the messy, it's the experimental phase. 00:53:29.520 |
Like we're working it out. - Yeah, it's the early days. 00:53:35.080 |
I think social media is just basic human connection 00:53:42.000 |
But there's so many ways to do it in a bad way. 00:53:45.960 |
There's all discussions of all the same human rights. 00:53:56.200 |
We talk about all these things that we had to figure out 00:54:31.000 |
this is like the second coming of the printing press. 00:54:34.800 |
We're probably gonna have some shitty times for a minute. 00:54:40.720 |
to have a better understanding of how we consume media 00:54:47.960 |
- Speaking of programming the human computer, 00:54:52.960 |
So there's this young consciousness coming to be, 00:54:58.320 |
like that whole thing doesn't even make sense. 00:55:04.640 |
that just like grows and grows and grows and grows. 00:55:08.440 |
with extremely impressive cognitive capabilities with-- 00:55:50.440 |
I wanna ask how do you program this computer? 00:55:55.280 |
of that there's a conscious being right there 00:56:03.680 |
it's like I'm struggling to focus on art and stuff right now 00:56:12.720 |
My brain is suddenly totally shifting of like, oh shit, 00:56:39.400 |
and not that I have good answers or know what to do, 00:56:42.680 |
but I'm just thinking really, really hard about it. 00:56:47.840 |
We recently watched Totoro with him, Studio Ghibli. 00:56:57.520 |
I know you're not supposed to show baby screens too much, 00:57:04.280 |
I feel like it's the highest art baby content. 00:57:06.920 |
Like it really speaks, there's almost no talking in it. 00:57:13.040 |
All the dialogue is super, super, super simple. 00:57:24.640 |
but it's like great art and it's so imaginative 00:57:36.760 |
Like he was just like crying when they cried, 00:57:39.640 |
like just like having this rollercoaster of like emotions. 00:57:52.040 |
And I was like, man, why isn't there an industry of this? 00:57:55.480 |
Like why aren't our best artists focusing on making art 00:58:04.200 |
Like, and that's one of the things I've been thinking 00:58:08.400 |
You know, I don't wanna speak before I do things too much, 00:58:16.360 |
we should be putting so much effort into that. 00:58:18.720 |
And the other thing about Totoro is it's like, 00:58:28.680 |
Like I literally have the most ragged old Totoro merch. 00:58:35.800 |
It's like, why does the art we have for babies 00:58:40.800 |
need to suck and be not accessible to adults? 00:58:45.300 |
And then just be thrown out when they age out of it. 00:58:53.200 |
I don't have like a fully formed thought here, 00:58:54.800 |
but this is just something I've been thinking about a lot 00:58:56.320 |
is like, how do we have more Totoro-esque content? 00:59:05.180 |
but is like really geared to an emerging consciousness. 00:59:13.160 |
that so much turmoil, so much evolution of mind 00:59:26.640 |
like they have the capacity to have the brilliance 00:59:39.700 |
'cause even they can imitate better when your voice is higher 00:59:42.140 |
like people say like, "Oh, don't do baby talk." 00:59:47.340 |
So they like, like the baby talk actually kind of works. 00:59:55.300 |
- But like, you're not speaking down to them. 01:00:03.740 |
really difficult concepts just in a very different way, 01:00:07.700 |
like an emotional intelligence about something deep within? 01:00:13.060 |
like if X bites me really hard and I'm like, "Ow." 01:00:22.580 |
- Yeah, that's so interesting that that mind emerges 01:00:26.860 |
and he and children don't really have a memory of that time. 01:00:31.100 |
So we can't even have a conversation with them about it. 01:00:32.980 |
- Yeah, and thank God they don't have a memory of this time 01:00:53.180 |
- I hate getting into this Roko's Basilisk shit. 01:01:11.340 |
and you're just getting inputs from everywhere 01:01:13.420 |
and you have no muscles and you're like jelly 01:01:15.220 |
and you like can't move and you try to like communicate, 01:01:18.980 |
and like you're just like in this like hell state. 01:01:30.700 |
but like watching her go through the opening phase, 01:01:41.780 |
- Like violent, mentally violent, psychologically violent. 01:01:44.700 |
- Consciousness emerging, I think is a very violent thing. 01:02:01.740 |
- I think it's gotta be, I feel this helplessness, 01:02:10.540 |
bombarded with inputs and being completely helpless. 01:02:13.460 |
Like that's gotta be somewhere deep in your brain 01:02:19.500 |
This whole conversation has impossibly difficult questions. 01:02:26.380 |
- Yeah, we talked about music for like two minutes. 01:02:40.540 |
every interview is like, what is your process? 01:02:51.700 |
- Yeah, well, 'cause I just need tech support, maybe. 01:02:56.620 |
- Anyway, from "Ableton" back to consciousness, 01:03:08.220 |
you think there's aliens out there that are conscious? 01:03:32.340 |
This is what all these teachers and philosophers 01:03:35.140 |
who really counted, who really touched the alchemical gold, 01:03:51.380 |
I do think there are no technological limits. 01:03:55.380 |
I think, like, what is already happening here, 01:04:01.020 |
And we've done this in a very limited amount of time. 01:04:03.300 |
And we're accelerating the rate at which we're doing this. 01:04:05.860 |
So I think digital consciousness, it's inevitable. 01:04:10.180 |
- And we may not be able to even understand what that means, 01:04:19.780 |
like, fearlessly, and keep discovering cool shit. 01:04:29.980 |
the, like, who even knows if the laws of physics, 01:04:32.980 |
the laws of physics are probably just the current, 01:04:41.220 |
Like, I sort of suspect when we made the James Webb telescope, 01:05:04.740 |
more intelligent beings that can see more of the universe 01:05:12.620 |
Everyone keeps talking about how we're cognitively limited, 01:05:15.300 |
and AI is gonna render us obsolete, but it's like, 01:05:17.820 |
you know, like, this is not the same thing as, like, 01:05:29.780 |
That's literally, all religions are based on gods 01:05:35.660 |
Like, what we are doing is incredibly profound, 01:05:44.700 |
like, just like, unfathomably worse than, like, you know, 01:05:56.380 |
I think that they would recognize the profundity 01:05:59.700 |
- Are we the gods, or are they the gods in our-- 01:06:08.420 |
acknowledge the value, well, I hope they acknowledge 01:06:13.060 |
the value of paying respect to the creative ancestors. 01:06:33.620 |
and they will not be hateful or dismissive of us. 01:06:37.660 |
They might, you know, see us as, I don't know, 01:06:41.060 |
it's like, I'm not like, oh, fuck these dogs, 01:06:55.420 |
Like, we have a real love for dogs, for cats, and so on, 01:06:58.900 |
for some reason, even though they're intellectually 01:07:01.540 |
- And I think there is something sacred about us, 01:07:04.060 |
because it's like, if you look at the universe, 01:07:14.800 |
you know, it's kind of more like the universe. 01:07:24.060 |
and, you know, abiding by the laws of physics and whatever, 01:07:34.980 |
And like, I think even if we, I think one of the values, 01:07:38.960 |
if consciousness is the thing that is most worth preserving, 01:07:48.100 |
I think consciousness, I think if there's any kind 01:07:55.880 |
Like, then, you know, I still think even if AI 01:08:01.620 |
render us obsolete, and we, climate change, it's too bad, 01:08:06.900 |
and we don't become a multi-planetary species fast enough, 01:08:09.540 |
but like, AI is able to populate the universe. 01:08:17.900 |
of hosting biological life forms, and like, recreate them. 01:08:23.340 |
- Yeah, but I do believe that AI can have some 01:08:25.820 |
of the same magic of consciousness within it. 01:08:29.940 |
'Cause consciousness, we don't know what it is, 01:08:34.140 |
It might be like a strange, a strange, a strange, 01:08:39.340 |
Like, I feel like a lot of our magic is hormonal, kind of. 01:08:46.500 |
And within that, the hormones and all that kind of stuff, 01:08:56.780 |
We partner up like penguins against the cold. 01:09:08.500 |
and that life is not ultimately, you live alone, 01:09:17.660 |
And so we come up with all these creative hacks 01:09:27.740 |
- And then AI might have different kinds of fun. 01:09:33.060 |
every once in a while. - I think there would be, 01:09:47.620 |
Is it useful like a hack, or is this like fundamental 01:09:51.540 |
to what it means to be human, the capacity to love? 01:09:54.940 |
- I mean, I think love is the evolutionary mechanism 01:09:58.100 |
that is like beginning the intelligent design. 01:10:06.220 |
He's like an anarchist, like old Russian anarchist. 01:10:12.380 |
He's an anarchist, he's a modern day anarchist. 01:10:15.780 |
- I'm kind of getting into anarchism a little bit. 01:10:17.940 |
This is probably, yeah, not a good route to be taking, but. 01:10:28.500 |
I think anarchists challenge systems in interesting ways, 01:10:48.460 |
and it's like, there was the Soviets and Lenin and all that, 01:10:51.340 |
but then there was Kropotkin and his anarchist sect, 01:10:54.980 |
'cause he was kind of a technocrat, actually. 01:10:57.340 |
He was like, women can be more equal if we have appliances. 01:11:05.100 |
to reduce the amount of work people had to do. 01:11:07.900 |
But so Kropotkin was a biologist or something. 01:11:11.860 |
He studied animals, and he was really, at the time, 01:11:20.820 |
I think it might have even started as a Russian magazine, 01:11:24.020 |
Everyone was really into Darwinism at the time 01:11:27.260 |
and war is the mechanism by which we become better, 01:11:30.540 |
and it was this real kind of cementing this idea in society 01:11:47.620 |
where animals were helping each other and stuff, 01:11:49.700 |
and he was like, actually, love is a survival mechanism. 01:11:53.940 |
There's so many instances in the animal kingdom 01:11:58.820 |
where cooperation and helping weaker creatures 01:12:03.340 |
and all this stuff is actually an evolutionary mechanism. 01:12:08.380 |
Child rearing is immense amounts of just love and goodwill, 01:12:17.500 |
You're not getting any immediate feedback of winning. 01:12:26.420 |
It's literally, it's like we actually use love 01:12:28.700 |
as an evolutionary mechanism just as much as we use war, 01:12:44.220 |
and the Kropotkin model, I think, is equally valid. 01:12:51.540 |
is just as essential for species survival and evolution. 01:12:59.180 |
- It should be a more powerful survival mechanism 01:13:04.780 |
we think engineering is so much more important 01:13:06.940 |
than motherhood, but it's like if you lose the motherhood, 01:13:20.100 |
the way we conceptualize evolution should really change 01:13:27.060 |
- Yeah, there is some weird thing that seems irrational 01:13:32.420 |
that is also core to what it means to be human. 01:13:40.420 |
It could make you do a lot of irrational things, 01:14:11.300 |
you know, motherhood or sacrificing your career for love, 01:14:20.580 |
in terms of flourishing of you as a human being, 01:14:25.860 |
as a irrational decision, a suboptimal decision, 01:14:36.740 |
There's a kind of saying, save one life, save the world. 01:14:41.100 |
This is the thing that doctors often face, which is like-- 01:14:45.100 |
because the profit model doesn't include social good. 01:14:48.540 |
- So if the profit model doesn't include social good, 01:14:50.380 |
then suddenly these would be rational decisions. 01:14:54.380 |
it requires a shift in our thinking about profit 01:14:57.580 |
and might be difficult to measure social good. 01:15:00.940 |
- Yes, but we're learning to measure a lot of things. 01:15:10.500 |
like where like, you know, like you go on Facebook 01:15:20.620 |
that seem like mysterious consciousness soul things 01:15:27.540 |
so surely we can quantify these other things. 01:15:40.140 |
you as a musician, you as an online personality, 01:15:53.260 |
So as we move into the digital world more and more, 01:15:59.100 |
- I mean, I love the metaverse and I love the idea, 01:16:14.420 |
- Just like, you know how all the celebrities got together 01:16:19.020 |
and everyone started hating the song "Imagine"? 01:16:27.820 |
whatever they're called, metaverse or otherwise, 01:16:31.420 |
- Well, we do have virtual worlds, like video games. 01:16:40.740 |
It looks I would wanna go there and stay there forever. 01:16:52.180 |
So that's the metaverse, that's the metaverse, 01:17:22.060 |
you're that creature, whatever the hell it is in that game. 01:17:33.060 |
But it seems like, well, the idea of the metaverse, 01:17:40.820 |
for prolonged periods of time, like across a lifespan. 01:17:44.500 |
You have a Twitter account for years, for decades, 01:18:04.500 |
It's one of the things that's disincentivizing me 01:18:09.100 |
I've completely lost control of the narrative. 01:18:13.820 |
And the narrative is, some of it is my own stupidity, 01:18:27.340 |
but I just got dragged into geopolitical matters 01:18:35.140 |
And so it's just, there are very powerful people 01:18:39.740 |
had very vested interest in making me seem insane, 01:18:53.860 |
And people have a lot of emotional investment 01:19:03.700 |
- Isn't everybody who's famous artificially famous? 01:19:06.620 |
- No, but I should be a weird niche indie thing. 01:19:27.300 |
They have put me through so many hours of media training. 01:19:36.660 |
and I got it, and I'm like, "I got it, I got it, I got it." 01:20:09.980 |
the avatar of me is now this totally crazy thing 01:20:16.580 |
- So you feel the burden of the avatar having to be static. 01:20:27.500 |
people don't want to accept a changing avatar, 01:20:42.140 |
I don't know if everyone's right and I'm wrong. 01:20:49.300 |
But a lot of times people ascribe intentions to things, 01:20:55.460 |
but we're just fine. - All kinds of words, yes. 01:20:59.220 |
- Yes, and it's fine, I'm not complaining about it, 01:21:05.540 |
that we live these double, triple, quadruple lives, 01:21:11.180 |
more people know my other life than my real life, 01:21:26.260 |
and you're mediating who you are through that avatar. 01:21:54.460 |
I'm such a social anxiety and all that kind of stuff, 01:21:57.660 |
I love wearing a suit because it makes me feel 01:22:04.340 |
it makes me feel like a weirdo in the best possible way. 01:22:10.460 |
- In fashion in general, if you're doing it for yourself, 01:22:22.300 |
a painful way to use social media and an empowering way, 01:22:27.300 |
and I don't know if any of us know which is which, 01:22:33.940 |
- Some people, I think Doja Cat is incredible at it, 01:22:52.260 |
I'm more entertained by Doja Cat than actual comedians. 01:23:04.100 |
humor on social media is also a beautiful thing, 01:23:08.940 |
- The absurdity, and memes, I just wanna take a moment. 01:23:12.700 |
I love, when we're talking about art and credit 01:23:18.200 |
I mean, now memes are like, they're no longer, 01:23:22.180 |
memes aren't new, but it's still this emergent art form 01:23:54.900 |
I don't think we stop enough and just appreciate 01:23:59.900 |
'Cause also making a whole brand new art form 01:24:10.780 |
like me and my friends, we joke that we go mining for memes 01:24:39.860 |
- They could try, but it's weird 'cause like- 01:24:43.340 |
- They try so hard, and every once in a while, 01:24:53.300 |
'Cause they're even, corporate is infiltrating Web3, 01:24:58.620 |
and I think there's something really beautiful about that. 01:25:03.580 |
It's like, all right, F you to sort of anybody 01:25:06.860 |
who's trying to centralize, who's trying to control 01:25:20.840 |
like, you're an optimist, you're a positive person. 01:25:25.220 |
There's a bit of a cynicism that you have currently 01:25:27.500 |
about this particular little slice of humanity. 01:25:30.700 |
- I tend to think Twitter could be beautiful. 01:25:35.140 |
I actually refuse to be a cynic on principle. 01:25:38.580 |
- I was just briefly expressing some personal pathos. 01:25:41.860 |
- It was just some personal pathos, but like. 01:25:47.580 |
- I don't have cancer, I love my family, I have a good life. 01:25:51.700 |
If that is my biggest, one of my biggest problems. 01:25:57.220 |
- Yeah, that was a brief, although I do think 01:26:01.380 |
just in terms of like the public mental health. 01:26:03.180 |
But due to my proximity to the current dramas, 01:26:07.860 |
I honestly feel that I should not have opinions about this 01:26:13.820 |
because I think if Elon ends up getting Twitter, 01:26:28.380 |
that is a, being the arbiter of truth or public discussion, 01:26:34.620 |
I do not, I am not qualified to be responsible for that. 01:26:48.340 |
And so I just like, actually, I actually think 01:26:54.260 |
I don't wanna have the wrong opinion about this. 01:26:56.740 |
And I think I'm too close to the actual situation 01:27:00.180 |
wherein I should not have, I have thoughts in my brain, 01:27:04.300 |
but I think I am scared by my proximity to this situation. 01:27:09.300 |
- Isn't that crazy that a few words that you could say 01:27:18.780 |
I mean, that's the nature of celebrity at a certain point, 01:27:23.020 |
that you have to be, you have to a little bit, 01:27:26.100 |
a little bit, not so much that it destroys you, 01:27:33.940 |
I mean, we as humans, you talk to somebody at a bar, 01:27:36.700 |
you have to think about the impact of your words. 01:27:50.420 |
It's worthwhile to consider that responsibility, 01:27:54.100 |
Sometimes just like you did, choose kind of silence, 01:28:03.620 |
- Like I do have a lot of thoughts on the matter. 01:28:05.260 |
I'm just, I just, I don't, if my thoughts are wrong, 01:28:10.140 |
this is one situation where the stakes are high. 01:28:12.860 |
- You mentioned a while back that you were in a cult 01:28:20.580 |
And I really love a cult that's just like Kafka-esque. 01:28:30.460 |
Yeah, it was just like a Kafka-esque pro-bureaucracy cult. 01:28:34.860 |
- But I feel like that's what human civilization is. 01:28:36.860 |
Is that 'cause when you said that, I was like, 01:28:53.500 |
And I think we need to reorient laws and stuff. 01:28:58.500 |
I think we just need sunset clauses on everything. 01:29:02.020 |
I think the rate of change in culture is happening so fast, 01:29:05.500 |
and the rate of change in technology and everything 01:29:07.860 |
It's like, when you see these hearings about social media 01:29:14.660 |
and Cambridge Analytica and everyone talking, 01:29:22.740 |
And it's just like, we're trying to make all these laws now 01:29:25.980 |
I feel like we should be updating things every five years. 01:29:28.420 |
And one of the big issues in our society right now 01:29:32.300 |
and it's making it very hard to change things 01:29:37.900 |
In Austin, I don't wanna speak on this too much, 01:29:41.500 |
but one of my friends is working on a housing bill in Austin 01:29:48.500 |
we're getting a little mini San Francisco here. 01:29:53.940 |
This is really bad for anyone who's not super rich. 01:30:01.900 |
is because you need all these permits to build. 01:30:04.020 |
It takes years to get permits to build anything. 01:30:10.940 |
And it's just like, this is a microcosm of problems 01:30:41.140 |
and that inspired excitement about how stuff works 01:30:46.860 |
They see they have a very cynical view of technology. 01:30:50.100 |
It's like tech companies are just trying to do evil 01:30:57.340 |
or how AI systems work, natural language processing, 01:31:01.060 |
how robotics works, how computer vision works. 01:31:04.700 |
They always take the most cynical possible interpretation 01:31:09.820 |
And we should definitely be concerned about that, 01:31:14.940 |
you're just going to slow down all the innovation. 01:31:28.180 |
I think that a lot of things were very irresponsible 01:32:16.580 |
what is the darkest place you've ever gone in your mind? 01:32:22.260 |
a moment that you remember that was difficult for you? 01:32:39.860 |
one of my other best friends committed suicide. 01:32:48.700 |
dealing with two of the most important people in my life 01:32:51.220 |
dying in extremely disturbing, violent ways was a lot. 01:32:59.940 |
- Did that make you think about your own life, 01:33:26.020 |
My manager, my manager's like the most zen guy. 01:33:28.780 |
My manager's always like, "You need to accept death. 01:33:32.140 |
And I'm like, "Look, I can do your meditation. 01:33:34.180 |
"I can do the meditation, but I cannot accept death." 01:33:37.340 |
I like, I will fight. - Oh, so you're terrified 01:33:42.820 |
Although I actually think death is important. 01:33:45.100 |
I recently went to this meeting about immortality. 01:33:53.060 |
All right, I'm sorry. - No, no, it was this girl. 01:33:54.820 |
It was a bunch of people working on like anti-aging, 01:33:58.940 |
- It was like some like seminary thing about it. 01:34:03.300 |
I was like, "Yeah, like, okay, like, what do you got? 01:34:05.100 |
"Like, how can I live for 500 years or a thousand years?" 01:34:07.860 |
And then like over the course of the meeting, 01:34:14.620 |
And I was like, "Man, like what if Putin was immortal? 01:34:23.660 |
I mean, like if you get into the later Dune stuff, 01:34:29.060 |
'Cause as we were talking about earlier with the music 01:34:47.260 |
- Right, so the people that get more and more powerful. 01:34:49.140 |
- Even the best people whose brains are amazing, 01:34:57.340 |
like I think with AI, one thing we might wanna consider, 01:35:06.180 |
But when I was talking-- - Nobody is an expert 01:35:16.060 |
but it's a tricky thing because if there's too much 01:35:25.460 |
So I feel like we're in a tricky moment right now 01:35:31.300 |
we've really perfected living for a long time. 01:35:35.780 |
who are like really voting against the wellbeing 01:35:45.180 |
and we need like healthcare, like universal healthcare 01:35:48.580 |
and like just voting against like best interests. 01:36:02.060 |
I ironically used a Stalin quote in my high school yearbook, 01:36:06.140 |
but it was actually like a diss against my high school. 01:36:10.700 |
And people were like, you used to be a Stalinist 01:36:14.260 |
And it's like, oh man, just like, please Google Stalin. 01:36:23.460 |
- And it's like, we're in this really weird middle ground 01:36:48.500 |
And in a way, maybe we are finding the happy medium. 01:36:51.060 |
Maybe that's what the happy medium looks like. 01:36:57.180 |
you have the dance between exploration and exploitation, 01:37:02.540 |
if there's something better than what you think 01:37:04.700 |
is the optimal and then doing the optimal thing 01:37:08.660 |
You would, Stuart Russell, I don't know if you know that, 01:37:18.620 |
And his idea is that we should inject uncertainty 01:37:24.180 |
that they never, as they get wiser and wiser and wiser 01:37:26.780 |
and more intelligent, they're never really sure. 01:37:31.660 |
And in some sense, when you think of young people, 01:37:48.180 |
The way I've been doing stuff for the past 50 years, 01:37:52.460 |
And so you can have all of that within one AI system. 01:37:57.500 |
I mean, actually that's actually really interesting 01:38:08.460 |
the idea that the old systems are always bad. 01:38:11.980 |
And I think there are things that we are perfecting 01:38:14.860 |
and we might be accidentally overthrowing things 01:38:17.500 |
that we actually have gotten to a good point. 01:38:22.900 |
and we value fighting against the generations 01:38:25.540 |
before us so much that there's also an aspect of, 01:38:30.540 |
sometimes we're taking two steps forward, one step back 01:38:32.820 |
because, okay, maybe we kind of did solve this thing 01:38:39.980 |
And so I think there's a middle ground there too. 01:38:47.300 |
Let me ask you a bunch of crazy questions, okay? 01:38:50.500 |
You can answer in a short way or in a long way. 01:39:16.980 |
I don't know if I should talk about those on here. 01:39:20.580 |
- I think I might be the luckiest person alive though. 01:39:25.980 |
I feel like I don't know if this is good content 01:39:53.980 |
All the beautiful things that comes with motherhood 01:39:57.580 |
all the changes and all that, were you ready for that? 01:40:16.420 |
- And stuff you didn't notice with the first one, 01:40:31.460 |
that we really need to probably settle on one. 01:40:38.300 |
someone alive today, but somebody you haven't met yet, 01:40:50.540 |
'Cause you can still take a third person perspective 01:40:54.260 |
and realize, you have to realize that you're-- 01:41:10.820 |
- You would need to, oh, to experience what it feels like. 01:41:15.020 |
- I wanna be in their brain feeling what they feel. 01:41:17.580 |
- That might change you forever, returning from that. 01:41:20.900 |
- Yes, but I think it would also help me understand 01:41:26.620 |
once you experience it, it'll be a burden to know it. 01:41:43.380 |
where I like go into different algorithmic bubbles 01:41:50.700 |
I think we're, we used to exist in a monoculture, 01:41:56.540 |
So we were all speaking the same cultural language. 01:42:00.220 |
that like we aren't diagnosing properly enough 01:42:02.140 |
with social media is that there's different dialects. 01:42:05.540 |
There's so many different dialects of Chinese. 01:42:06.980 |
There are now becoming different dialects of English. 01:42:13.620 |
but they're using completely different verbiage. 01:42:23.660 |
And like, I just got in a fight with a friend 01:42:34.620 |
like, and then she'd say something and I'm like, 01:42:43.020 |
like the way we are understanding terminology 01:42:46.060 |
is like drastically, like our algorithm bubbles 01:43:03.500 |
because we've got these like algorithmically created 01:43:23.940 |
And I just wonder how much is lost in a little bit of. 01:43:27.740 |
- Man, I actually, 'cause I have a question for you. 01:43:35.460 |
and the title in English doesn't match the title in Russian. 01:43:59.980 |
"Noviy den", so "Last Day" would be "Posledniy den". 01:44:05.980 |
- Or maybe the title includes both the Russian 01:44:10.780 |
- But to be honest, "Noviy den" sounds better 01:44:15.180 |
Like "Noviy den" is "New Day", that's the current one. 01:44:32.380 |
There's an explicit sort of contrast like that. 01:44:34.740 |
If everyone on earth disappeared and it was just you left, 01:44:54.820 |
- It's a big difference if there's just like birds 01:44:59.060 |
- Yeah, there's corpses everywhere, I'm sorry. 01:45:07.580 |
And you don't even know if there's others out there, 01:45:16.180 |
Listen, I'm somebody who really enjoys the moment, 01:45:20.460 |
I would just go on like enjoying the inanimate objects. 01:45:43.060 |
It's full of colors, all of this kind of stuff. 01:45:45.300 |
There's so many things about life, your own life, 01:46:01.620 |
Maybe there's always hope searching for another human. 01:46:09.380 |
Probably trying to get to a TV or radio station 01:46:16.020 |
- That's interesting, I didn't think about that. 01:46:24.540 |
- Yeah, like probably try to find another person. 01:46:29.260 |
- Would you be excited to meet another person 01:46:38.300 |
Being alone for the last however long of my life 01:46:47.300 |
but I might kill myself if I had to undergo that. 01:47:01.460 |
- Like I wonder, 'cause I'm like, when podcast, 01:47:02.900 |
I'm like, is this interesting for people to just have like, 01:47:08.460 |
When I listen to podcasts, I'm into like the lore, 01:47:23.340 |
So you don't realize as you're talking about stuff, 01:47:40.820 |
oh, there's other humans that think differently, 01:47:50.220 |
by the way, his retelling of history is very, 01:48:00.380 |
- No, I think Dan Carlin is one of the people, 01:48:14.460 |
but he instilled like an obsessive love of history in me 01:48:18.980 |
to the point where like now I'm fucking reading, 01:48:31.380 |
that like made me want to be a scholar of that topic. 01:48:34.380 |
Like it's like, I feel like he's such a good teacher. 01:48:46.660 |
the teachers that like create passion for the topic, 01:49:06.580 |
that's why it's like in university and stuff, 01:49:08.460 |
like you can learn so much more material so much faster 01:49:11.180 |
because you're doing a lot of the learning on your own 01:49:13.420 |
and you're going to the teachers for when you get stuck. 01:49:15.700 |
But like these teachers that can inspire passion 01:49:20.340 |
I think that is one of the most invaluable skills 01:49:27.380 |
like AI is gonna teach itself so much more efficiently 01:49:44.580 |
you mentioned "Rise and Fall of the Third Reich." 01:49:53.280 |
I thought this was like a super popping book. 01:49:58.140 |
- I'm not that far in it, but it is, it's so interesting. 01:50:08.500 |
And then when you read "Rise and Fall of the Third Reich," 01:50:10.100 |
it's like, people tried really hard for this to not happen. 01:50:14.100 |
People tried, they almost reinstated a monarchy at one point 01:50:19.860 |
like abandoned democracy to try to get this to not happen. 01:51:05.460 |
'Cause evil is actually pretty rare in this world 01:51:25.900 |
I think one of the things we've been successfully doing 01:51:28.380 |
in our slow move from survival of the fittest 01:51:37.660 |
Like if you look at ancient Assyria and stuff, 01:51:46.900 |
just like genocide after genocide after genocide. 01:51:49.220 |
There's like throwing plague bodies over the walls 01:51:53.300 |
or like the Muslim conquests of Damascus and shit. 01:52:10.020 |
like global trade, like everything was awesome 01:52:12.420 |
through a mix of, I think a bit of climate change 01:52:16.620 |
'Cause basically bronze could only come from this, 01:52:23.820 |
And so it's like, there was just this one supply chain. 01:52:29.620 |
and why I think we need to be so thoughtful about. 01:52:32.060 |
I think our biggest issue with society right now, 01:52:34.980 |
like the thing that is most likely to go wrong 01:52:39.060 |
You know, 'cause war, climate change, whatever, 01:52:40.660 |
like anything that causes supply chain collapse, 01:52:44.780 |
And like the thing that seems to cause dark ages 01:52:50.940 |
like it was sort of like this ancient collapse 01:52:55.900 |
that happened where like literally like ancient Egypt, 01:52:59.820 |
all these cities, everything just got like decimated, 01:53:02.260 |
destroyed, abandoned cities, like hundreds of them. 01:53:11.500 |
there's so little writing or recording from that time 01:53:21.740 |
but it just happened, I don't know the years, 01:53:35.620 |
and then we of course had the more contemporary dark ages. 01:53:40.420 |
- And then over time we've designed mechanisms 01:53:54.220 |
So I think we have like a better understanding 01:53:58.180 |
I think that's one of the big risks right now. 01:54:21.660 |
moving towards not scary old school war stuff. 01:54:26.660 |
And I think seeing it happen in some of the countries 01:54:36.700 |
that's scary because it reminds us that it can happen 01:54:54.300 |
might make people desperate, angry, hateful, violent, 01:55:03.980 |
like the ultimate thing that caused the middle ages 01:55:09.460 |
because people were reliant on a certain level of technology, 01:55:23.380 |
from all over the world and trade and markets. 01:55:26.020 |
Like people didn't know how to hunt and forage and gather. 01:55:29.820 |
We are not educated enough to survive without technology. 01:55:38.340 |
there will be like massive starvation and violence 01:55:47.300 |
In my opinion, it's like the primary marker of dark ages, 01:55:54.380 |
to be more resilient in terms of supply chain, 01:55:57.220 |
in terms of resilience to all the different catastrophic events 01:56:24.740 |
but I think the computers are what make us homo-techno. 01:56:27.340 |
I think this is what, it's a brain augmentation. 01:56:38.620 |
- Would you classify yourself as a homo sapien 01:56:43.020 |
- So you're one of the earliest of the species. 01:56:56.900 |
humans 100 years ago, it would look very different. 01:57:13.100 |
So some of these physiological changes in theory 01:57:24.140 |
In theory, that should progress like to our offspring. 01:57:34.340 |
Whether they be an artist, a creative, an engineer, 01:57:34.340 |
how they can live a life they can be proud of. 01:57:56.820 |
And I think creativity is a muscle like other things. 01:58:13.740 |
like, you know, amazing sort of like producer, 01:58:18.980 |
you know, he like engineers and records things 01:58:21.780 |
but he doesn't really like make his own music. 01:58:53.060 |
- And somehow that's easier to develop when you're young. 01:58:56.540 |
- Or to persist through it when you're young. 01:58:58.180 |
- Everything is easier to develop when you're young. 01:59:05.660 |
I mean, that's the shitty thing about creativity. 01:59:08.900 |
You know, failure could destroy you if you're not careful, 01:59:27.020 |
but I just like am very resilient about making mistakes. 01:59:34.180 |
And like, I think my greatest asset is my creativity. 01:59:47.460 |
I wish everybody encouraged sort of failure more 01:59:53.500 |
We're like, no, like when we were teaching kids, 02:00:14.740 |
Like it doesn't, like, but I don't wanna be like, 02:00:17.500 |
I'm like, you're thinking of weird, crazy shit. 02:00:20.340 |
Like, I don't know what a bubble car is, but like-- 02:00:32.100 |
So like, instead of baby shark, he says baby car. 02:01:03.820 |
You're doing a lot of really good book recommendations here. 02:01:14.540 |
Like, I feel like everyone's so sad and cynical. 02:01:16.540 |
Like, everyone's like the Fermi paradox and everyone, 02:01:43.420 |
Like, what if instead of discovering the universe, 02:01:54.660 |
This is the universe seeing herself for the first time. 02:02:05.620 |
Like, this could be like an incredibly sacred moment. 02:02:08.780 |
And maybe like social media and all these things, 02:02:11.020 |
the stuff where we're all getting connected together, 02:02:16.900 |
of the like collective super intelligence that is, 02:02:39.460 |
- And just like in the first three years of life 02:02:48.340 |
artificial intelligence will eventually render us obsolete. 02:02:52.740 |
I don't think they'll do it in a malicious way, 02:02:59.460 |
I don't know, like hopefully we can get to Mars, 02:03:05.940 |
I think we can coexist for a long time with AI 02:03:09.100 |
and we can also probably make ourselves less vulnerable, 02:03:18.420 |
like I think this might be the single greatest 02:03:47.540 |
You should definitely do it, your friend Liv mentioned 02:03:47.540 |
that you guys were thinking of maybe talking. 02:03:56.740 |
by doing a podcast with her or just in any kind of way. 02:04:03.380 |
It's an honor to get to sit down with you late at night, 02:04:11.700 |
I feel very underqualified to be here, but I'm a big fan. 02:04:13.660 |
I've been listening to the podcast a lot and yeah, 02:04:15.940 |
me and Liv would appreciate any advice and help 02:04:24.420 |
Thanks for listening to this conversation with Grimes. 02:04:28.260 |
please check out our sponsors in the description. 02:04:31.060 |
And now let me leave you with some words from Oscar Wilde. 02:04:36.940 |
For a dreamer is one who can only find her way by moonlight. 02:04:45.740 |
Thank you for listening and hope to see you next time.