
Grimes: Music, AI, and the Future of Humanity | Lex Fridman Podcast #281


Chapters

0:00 Introduction
2:03 Identity
6:01 Music production
18:52 Spotify
23:34 Experimental music
26:09 Vision of future
38:42 Motherhood
54:50 Consciousness
69:40 Love
75:31 Metaverse
88:13 Technology and bureaucracy
92:11 Mortality
100:36 Understanding evil
104:34 Last person on Earth
107:12 Dan Carlin
109:42 Rise and Fall of the Third Reich
116:07 Coolest human invention
117:28 Advice for young people
120:40 Meaning of life

Whisper Transcript

00:00:00.000 | we are becoming cyborgs.
00:00:01.240 | Like our brains are fundamentally changed.
00:00:04.080 | Everyone who grew up with electronics,
00:00:05.780 | we are fundamentally different from previous,
00:00:08.920 | from Homo sapiens.
00:00:09.840 | I call us Homo techno.
00:00:11.080 | I think we have evolved into Homo techno,
00:00:13.240 | which is like essentially a new species.
00:00:15.800 | Previous technologies, I mean,
00:00:17.840 | may have even been more profound
00:00:19.120 | and moved us to a certain degree,
00:00:20.180 | but I think the computers are what make us Homo techno.
00:00:22.760 | I think this is what, it's a brain augmentation.
00:00:25.600 | So it like allows for actual evolution.
00:00:27.860 | Like the computers accelerate the degree
00:00:29.480 | to which all the other technologies can also be accelerated.
00:00:32.700 | - Would you classify yourself as a Homo sapien
00:00:34.760 | or a Homo techno?
00:00:35.640 | - Definitely Homo techno.
00:00:37.100 | - So you're one of the earliest of the species.
00:00:41.020 | - I think most of us are.
00:00:43.080 | - The following is a conversation with Grimes,
00:00:47.800 | an artist, musician, songwriter, producer, director,
00:00:50.560 | and a fascinating human being
00:00:53.080 | who thinks a lot about both the history
00:00:55.520 | and the future of human civilization.
00:00:57.560 | Studying the dark periods of our past
00:00:59.840 | to help form an optimistic vision of our future.
00:01:03.920 | This is the Lex Fridman Podcast.
00:01:05.760 | To support it, please check out our sponsors
00:01:07.880 | in the description.
00:01:09.000 | And now, dear friends, here's Grimes.
00:01:11.960 | - Oh yeah, the Cloudlifter, there you go.
00:01:14.480 | - There you go.
00:01:15.320 | You know your stuff.
00:01:16.480 | Have you ever used a Cloudlifter?
00:01:18.280 | - Yeah, I actually, this microphone Cloudlifter
00:01:21.000 | is what Michael Jackson used, so.
00:01:23.480 | - No, really?
00:01:24.460 | - Yeah, this is like thriller and stuff.
00:01:26.040 | - This mic and the Cloudlifter?
00:01:27.640 | - And that, yeah, it's an incredible microphone.
00:01:30.640 | It's very flattering on vocals.
00:01:32.080 | I've used this a lot.
00:01:33.640 | It's great for demo vocals.
00:01:34.760 | It's great in a room.
00:01:36.080 | Like, sometimes it's easier to record vocals
00:01:38.300 | if you're just in a room and like the music's playing
00:01:40.640 | and you just wanna like feel it
00:01:41.960 | so it's not in the headphones.
00:01:43.080 | And this mic is pretty directional,
00:01:44.720 | so I think it's like a good mic for like just vibing out
00:01:47.760 | and just getting a real good vocal take.
00:01:49.880 | - Just vibing, just in a room.
00:01:51.880 | - Anyway, this is the Michael Jackson, Quincy Jones
00:01:55.920 | microphone.
00:01:57.000 | - I feel way more badass now.
00:01:58.800 | All right, let's get in.
00:02:00.360 | You wanna just get into it?
00:02:01.760 | - I guess so.
00:02:03.040 | - All right, one of your names,
00:02:04.760 | at least in this space and time, is C, like the letter C.
00:02:08.280 | And you told me that C means a lot of things.
00:02:11.240 | It's the speed of light.
00:02:12.600 | It's the render rate of the universe.
00:02:14.680 | It's yes in Spanish.
00:02:16.120 | It's the crescent moon.
00:02:17.640 | And it happens to be my favorite programming language
00:02:21.120 | 'cause it basically runs the world,
00:02:24.120 | but it's also powerful, fast, and it's dangerous
00:02:28.160 | 'cause you can mess things up really bad with it
00:02:29.960 | 'cause of all the pointers.
00:02:31.120 | But anyway, which of these associations
00:02:33.960 | with the name C is the coolest to you?
00:02:36.560 | - I mean, to me, the coolest is the speed of light,
00:02:40.680 | obviously, or the speed of light.
00:02:42.680 | When I say render rate of the universe,
00:02:44.360 | I think I mean the speed of light
00:02:46.280 | because essentially that's what we're rendering at.
00:02:49.080 | See, I think we'll know if we're in a simulation
00:02:52.160 | if the speed of light changes
00:02:53.720 | because if they can improve their render speed, then.
00:02:57.240 | - Well, it's already pretty good.
00:02:58.440 | - It's already pretty good, but if it improves,
00:03:01.000 | then we'll know, we can probably be like,
00:03:03.880 | okay, they've updated or upgraded.
00:03:05.320 | - Well, it's fast enough for us humans
00:03:06.760 | 'cause it seems immediate.
00:03:10.960 | There's no delay, there's no latency
00:03:13.320 | in terms of us humans on Earth interacting with things.
00:03:16.240 | But if you're an intergalactic species
00:03:20.000 | operating on a much larger scale,
00:03:21.440 | then you're gonna start noticing some weird stuff.
00:03:23.840 | Or if you can operate in like around a black hole,
00:03:27.320 | then you're gonna start to see some render issues.
00:03:29.680 | - You can't go faster than the speed of light, correct?
00:03:32.680 | So it really limits our ability
00:03:34.520 | or one's ability to travel space.
00:03:36.680 | - Theoretically, you can, you have wormholes.
00:03:38.920 | So there's nothing in general relativity
00:03:41.880 | that precludes faster than the speed of light to travel,
00:03:46.880 | but it just seems you're gonna have to do
00:03:49.840 | some really funky stuff with very heavy things
00:03:54.000 | that have like weirdnesses,
00:03:56.080 | that have basically tears in space time.
00:03:58.600 | We don't know how to do that.
00:03:59.720 | - Do navigators know how to do it?
00:04:01.880 | - Do navigators?
00:04:03.040 | - Yeah. - Yeah.
00:04:03.880 | - Folding space, basically making wormholes.
00:04:07.000 | - So the name C. - Yes.
00:04:10.240 | - Who are you?
00:04:13.120 | Do you think of yourself as multiple people?
00:04:16.960 | Are you one person?
00:04:18.280 | Do you know, like in this morning,
00:04:20.880 | were you a different person than you are tonight?
00:04:23.560 | We are, I should say, recording this
00:04:25.840 | basically at midnight, which is awesome.
00:04:27.840 | - Yes, thank you so much.
00:04:29.600 | I think I'm about eight hours late.
00:04:31.680 | - No, you're right on time.
00:04:33.960 | Good morning.
00:04:34.800 | This is the beginning of a new day soon.
00:04:37.200 | Anyway, are you the same person
00:04:39.480 | you were in the morning and the evening?
00:04:41.680 | Is there multiple people in there?
00:04:44.320 | Do you think of yourself as one person
00:04:46.240 | or maybe you have no clue?
00:04:47.520 | Or are you just a giant mystery to yourself?
00:04:50.160 | - Okay, these are really intense questions, but--
00:04:52.440 | - Let's go, let's go.
00:04:53.360 | 'Cause I ask this myself, like look in the mirror,
00:04:55.360 | who are you?
00:04:56.320 | People tell you to just be yourself,
00:04:57.960 | but what does that even mean?
00:04:59.640 | - I mean, I think my personality changes
00:05:01.520 | with everyone I talk to.
00:05:02.960 | So I have a very inconsistent personality, yeah.
00:05:07.200 | - Person to person, so the interaction,
00:05:08.960 | your personality materializes--
00:05:11.480 | - Or my mood, like I'll go from being like a megalomaniac
00:05:16.120 | to being like, you know, just like a total hermit
00:05:19.920 | who is very shy.
00:05:21.400 | - So some combinatorial combination of your mood
00:05:24.560 | and the person you're interacting with.
00:05:26.320 | - Yeah, mood and people I'm interacting with.
00:05:28.080 | But I think everyone's like that.
00:05:29.760 | Maybe not.
00:05:30.880 | - Well, not everybody acknowledges it
00:05:32.560 | and able to introspect it.
00:05:34.040 | Who brings out, what kind of person,
00:05:35.800 | what kind of mood brings out the best in you
00:05:38.160 | as an artist and as a human?
00:05:39.720 | Can you introspect this?
00:05:41.720 | - Like my best friends, like people who I can,
00:05:45.240 | when I'm like super confident
00:05:47.520 | and I know that they're gonna understand everything
00:05:50.600 | I'm saying, so like my best friends,
00:05:52.200 | then when I can start being really funny,
00:05:55.400 | that's always my like peak mode.
00:05:57.680 | But it's like, yeah, it takes a lot to get there.
00:06:00.160 | - Let's talk about constraints.
00:06:02.360 | You've talked about constraints and limits.
00:06:05.600 | Do those help you out as an artist or as a human being
00:06:09.640 | or do they get in the way?
00:06:10.840 | Do you like the constraints?
00:06:11.920 | So in creating music, in creating art, in living life,
00:06:15.440 | do you like the constraints that this world puts on you?
00:06:19.600 | Or do you hate them?
00:06:24.720 | - If constraints are moving, then you're good, right?
00:06:29.720 | Like it's like as we are progressing with technology,
00:06:32.040 | we're changing the constraints of like artistic creation.
00:06:34.880 | Making video and music and stuff is getting a lot cheaper.
00:06:39.720 | There's constantly new technology and new software
00:06:42.080 | that's making it faster and easier.
00:06:44.000 | We have so much more freedom than we had in the '70s.
00:06:46.680 | Like when Michael Jackson,
00:06:48.680 | when they recorded "Thriller" with this microphone,
00:06:51.440 | like they had to use a mixing desk and all this stuff.
00:06:54.000 | And like probably even getting a studio
00:06:55.600 | is probably really expensive
00:06:56.560 | and you have to be a really good singer
00:06:57.600 | and you have to know how to use
00:06:59.080 | like the mixing desk and everything.
00:07:00.520 | And now I can just, I've made a whole album on this computer.
00:07:05.240 | I have a lot more freedom,
00:07:06.760 | but then I'm also constrained in different ways
00:07:10.240 | 'cause there's like literally millions more artists.
00:07:13.680 | It's like a much bigger playing field.
00:07:15.680 | It's just like, I also, I didn't learn music.
00:07:18.640 | I'm not a natural musician.
00:07:20.200 | So I don't know anything about actual music.
00:07:22.600 | I just know about like the computer.
00:07:24.760 | So I'm really kind of just like messing around
00:07:29.760 | and like trying things out.
00:07:33.240 | - Well, yeah, I mean, but the nature of music is changing.
00:07:35.720 | So you're saying you don't know actual music,
00:07:37.280 | what music is changing.
00:07:39.200 | Music is becoming, you've talked about this,
00:07:41.880 | is becoming, it's like merging with technology.
00:07:46.600 | - Yes.
00:07:47.600 | - It's becoming something more than just like
00:07:51.360 | the notes on a piano.
00:07:52.960 | It's becoming some weird composition
00:07:54.920 | that requires engineering skills, programming skills,
00:07:59.400 | some kind of a human robot interaction skills,
00:08:03.400 | and still some of the same things
00:08:04.720 | that Michael Jackson had, which is like a good ear
00:08:06.720 | for a good sense of taste of what's good
00:08:09.000 | and not the final thing, what is put together.
00:08:11.480 | Like you're allowed, you're enabled, empowered
00:08:14.880 | with a laptop to layer stuff,
00:08:17.160 | to start like layering insane amounts of stuff.
00:08:20.240 | And it's super easy to do that.
00:08:22.240 | - I do think music production
00:08:23.560 | is a really underrated art form.
00:08:24.960 | I feel like people really don't appreciate it.
00:08:26.640 | When I look at publishing splits,
00:08:27.920 | the way that people like pay producers and stuff,
00:08:32.440 | it's super, producers are just deeply underrated.
00:08:35.560 | Like so many of the songs that are popular right now
00:08:39.160 | or for the last 20 years,
00:08:40.920 | like part of the reason they're popular
00:08:42.240 | is 'cause the production is really interesting
00:08:44.000 | or really sick or really cool.
00:08:45.640 | And it's like, I don't think listeners,
00:08:48.040 | like people just don't really understand
00:08:52.520 | what music production is.
00:08:54.640 | It's not, it's sort of like this weird
00:08:57.680 | discombobulated art form.
00:08:59.360 | It's not like a formal, because it's so new,
00:09:01.360 | there isn't like a formal training path for it.
00:09:06.360 | It's mostly driven by like autodidacts.
00:09:10.160 | Like it's like almost everyone I know
00:09:11.240 | who's good at production,
00:09:12.160 | like they didn't go to music school or anything.
00:09:13.720 | They just taught themselves.
00:09:15.040 | - Are they mostly different?
00:09:16.000 | Like the music producers you know,
00:09:18.360 | is there some commonalities that tie them together
00:09:21.280 | or are they all just different kinds of weirdos?
00:09:23.560 | 'Cause I just hung out with Rick Rubin.
00:09:25.400 | I don't know if you've-
00:09:26.240 | - Yeah, I mean, Rick Rubin is like literally
00:09:29.720 | one of the gods of music production.
00:09:31.160 | Like he's one of the people who first,
00:09:33.760 | you know, who like made music production,
00:09:36.280 | you know, made the production as important
00:09:39.320 | as the actual lyrics or the notes.
00:09:41.560 | - But the thing he does, which is interesting,
00:09:43.520 | I don't know if you can speak to that,
00:09:45.480 | but just hanging out with him,
00:09:46.800 | he seems to just sit there in silence,
00:09:48.480 | close his eyes and listen.
00:09:50.760 | It's like he almost does nothing.
00:09:53.600 | And that nothing somehow gives you freedom
00:09:55.840 | to be the best version of yourself.
00:09:58.120 | So that's music production somehow too,
00:10:00.000 | which is like encouraging you to do less,
00:10:02.640 | to simplify, to like push towards minimalism.
00:10:06.840 | - I mean, I guess, I mean,
00:10:08.200 | I work differently from Rick Rubin
00:10:11.560 | 'cause Rick Rubin produces for other artists,
00:10:14.120 | whereas like I mostly produce for myself.
00:10:17.040 | So it's a very different situation.
00:10:19.200 | I also think Rick Rubin, he's in that,
00:10:21.720 | I would say advanced category of producer
00:10:23.560 | where like you've like earned your,
00:10:26.560 | you can have an engineer and stuff
00:10:27.920 | and people like do the stuff for you.
00:10:29.800 | But I usually just like do stuff myself.
00:10:32.400 | - So you're the engineer, the producer and the artist.
00:10:37.400 | - Yeah, I guess I would say I'm in the era,
00:10:39.840 | like the post Rick Rubin era.
00:10:41.240 | Like I come from the kind of like
00:10:42.960 | Skrillex school of thought,
00:10:47.000 | which is like where you are, yeah,
00:10:49.440 | the engineer, producer, artist.
00:10:51.080 | I mean, lately, sometimes I'll work with a producer.
00:10:55.280 | Now I'm gently sort of delicately
00:10:58.840 | starting to collaborate a bit more,
00:10:59.960 | but like I think I'm kind of from the,
00:11:02.800 | like the whatever 2010s explosion of things
00:11:07.120 | where everything became available on the computer
00:11:11.920 | and you kind of got this like lone wizard energy thing going.
00:11:16.680 | - So you embrace being the loneliness.
00:11:19.680 | Is the loneliness somehow an engine of creativity?
00:11:22.440 | Like, so most of your stuff,
00:11:24.560 | most of your creative quote unquote genius in quotes
00:11:28.640 | is in the privacy of your mind?
00:11:32.120 | - Yes, well, it was,
00:11:34.760 | but here's the thing.
00:11:39.000 | I was talking to Daniel Ek and he said,
00:11:40.800 | he's like most artists, they have about 10 years,
00:11:43.360 | like 10 good years.
00:11:45.120 | And then they usually stop making their like vital shit.
00:11:48.760 | And I feel like I'm sort of like nearing the end
00:11:53.320 | of my 10 years on my own.
00:11:56.480 | And--
00:11:57.320 | - So you have to become somebody else.
00:11:58.600 | - Now I'm like, I'm in the process of becoming somebody else
00:12:00.960 | and reinventing.
00:12:01.800 | When I work with other people,
00:12:02.840 | because I've never worked with other people,
00:12:04.200 | I find that I make like,
00:12:06.360 | that I'm exceptionally rejuvenated
00:12:08.360 | and making like some of the most vital work I've ever made.
00:12:10.960 | So, because I think another human brain
00:12:13.840 | is like one of the best tools you can possibly find.
00:12:16.520 | Like--
00:12:19.160 | - It's a funny way to put it, I love it.
00:12:20.560 | - It's like, if a tool is like, you know,
00:12:23.320 | whatever HP plus one or like adds some like stats
00:12:27.320 | to your character, like another human brain
00:12:30.760 | will like square it instead of just like adding something.
00:12:34.240 | - Double up the experience points, I love this.
00:12:36.320 | We should also mention we're playing tavern music
00:12:38.320 | before this, which I love, which I first,
00:12:41.600 | I think I--
00:12:42.440 | - You had to stop the tavern music?
00:12:43.760 | - Yeah, 'cause it doesn't, the audio.
00:12:46.400 | - Okay, okay.
00:12:47.240 | - But it makes--
00:12:48.080 | - Yeah, it'll make the podcast annoying.
00:12:48.920 | - Add it in post, add it in post.
00:12:50.040 | - No one will wanna listen to the podcast.
00:12:51.560 | - They probably would, but it makes me,
00:12:53.400 | it reminds me like of a video game,
00:12:55.480 | like a role-playing video game
00:12:56.760 | where you have experience points.
00:12:58.400 | There's something really joyful about wandering places
00:13:03.400 | like Elder Scrolls, like Skyrim,
00:13:06.480 | just exploring these landscapes in another world,
00:13:10.520 | and then you get experience points,
00:13:12.000 | and you can work on different skills,
00:13:13.960 | and somehow you progress in life.
00:13:16.160 | And I don't know, it's simple.
00:13:17.600 | It doesn't have some of the messy complexities of life.
00:13:19.960 | And there's usually a bad guy you can fight.
00:13:22.320 | In Skyrim, it's dragons and so on.
00:13:25.600 | I'm sure in Elden Ring, there's a bunch of monsters
00:13:27.720 | you can fight, I love that.
00:13:28.800 | - I feel like Elden Ring,
00:13:29.960 | I feel like this is a good analogy to music production though
00:13:32.400 | because it's like, I feel like the engineers
00:13:34.440 | and the people creating these open worlds,
00:13:36.640 | it's sort of like similar to people, to music producers,
00:13:39.680 | whereas it's like this hidden archetype
00:13:42.760 | that no one really understands what they do,
00:13:44.640 | and no one really knows who they are,
00:13:46.200 | but they're like, it's like the artist engineer
00:13:49.320 | because it's like, it's both art
00:13:51.780 | and fairly complex engineering.
00:13:54.880 | - Well, you're saying they don't get enough credit.
00:13:57.240 | Aren't you kind of changing that
00:13:58.640 | by becoming the person doing everything?
00:14:00.880 | Isn't the engineer?
00:14:03.720 | - Well, I mean, others have gone before me.
00:14:05.480 | I'm not, you know, there's like Timbaland and Skrillex,
00:14:07.840 | and there's all these people that are like,
00:14:10.400 | you know, very famous for this.
00:14:12.080 | But I just think the general,
00:14:13.960 | I think people get confused about what it is
00:14:15.960 | and just don't really know what it is per se.
00:14:19.240 | And it's just when I see a song,
00:14:20.480 | like when there's like a hit song,
00:14:22.240 | like I'm just trying to think of like,
00:14:27.240 | just going for like even just a basic pop hit,
00:14:29.840 | like "Rules" by Dua Lipa or something.
00:14:34.840 | The production on that is actually like really crazy.
00:14:39.200 | I mean, the song is also great,
00:14:40.520 | but it's like the production is exceptionally memorable.
00:14:43.360 | You know, and it's just like no one,
00:14:47.160 | I don't even know who produced that song.
00:14:49.160 | It's just like, isn't part of like the rhetoric
00:14:50.680 | of how we discuss the creation of art.
00:14:53.400 | We just sort of like don't consider the music producer.
00:14:57.200 | Because I think the music producer used to be more
00:15:00.320 | just simply recording things.
00:15:02.480 | - Yeah, that's interesting.
00:15:04.600 | 'Cause when you think about movies,
00:15:06.040 | we talk about the actor and the actresses,
00:15:08.560 | but we also talk about the director.
00:15:10.400 | - Directors, yeah.
00:15:11.520 | - We don't talk about like that with the music as often.
00:15:14.440 | - The Beatles music producer was one of the first kind of guy,
00:15:19.880 | one of the first people sort of introducing
00:15:21.240 | crazy sound design into pop music.
00:15:22.640 | I forget his name.
00:15:24.160 | He has the same, I forget his name, but you know,
00:15:28.000 | like he was doing all the weird stuff,
00:15:29.160 | like dropping pianos and like, yeah.
00:15:31.600 | - Oh, to get the, yeah, yeah, yeah,
00:15:33.480 | to get the sound, to get the authentic sound.
00:15:36.560 | What about lyrics?
00:15:38.120 | You think those, where did they fit into how important
00:15:42.560 | they are?
00:15:43.400 | I was heartbroken to learn that Elvis
00:15:45.360 | didn't write his songs.
00:15:46.800 | I was very mad.
00:15:47.880 | - A lot of people don't write their songs.
00:15:49.480 | I understand this, but.
00:15:50.800 | - But here's the thing.
00:15:52.200 | I feel like there's this desire for authenticity.
00:15:54.880 | I used to be like really mad when like people wouldn't write
00:15:57.120 | or produce their music and I'd be like, that's fake.
00:15:59.280 | And then I realized there's all this like weird bitterness
00:16:04.280 | and like aggroness in art about authenticity.
00:16:07.760 | But I had this kind of like weird realization recently
00:16:10.800 | where I started thinking that like art is sort of
00:16:16.440 | a decentralized collective thing.
00:16:20.240 | Like art is kind of a conversation with all the artists
00:16:25.240 | that have ever lived before you, you know?
00:16:29.080 | Like it's like, you're really just sort of,
00:16:31.120 | it's not like anyone's reinventing the wheel here.
00:16:33.560 | Like you're kind of just taking, you know,
00:16:36.680 | thousands of years of art and like running it
00:16:40.000 | through your own little algorithm and then like making,
00:16:43.560 | you're like your interpretation of it.
00:16:45.080 | - You just joined the conversation
00:16:46.280 | with all the other artists that came before.
00:16:47.600 | It's such a beautiful way to look at it.
00:16:49.640 | - Like, and it's like, I feel like everyone's always like,
00:16:51.560 | there's always copyright and IP and this and that
00:16:54.160 | or authenticity.
00:16:55.200 | And it's just like, I think we need to stop seeing this
00:16:59.240 | as this like egotistical thing of like,
00:17:01.600 | oh, the creative genius, the lone creative genius
00:17:04.200 | or this or that.
00:17:05.040 | 'Cause it's like, I think art shouldn't be about that.
00:17:08.760 | I think art is something that sort of
00:17:10.400 | brings humanity together.
00:17:12.080 | And it's also, art is also kind of the collective memory
00:17:14.120 | of humans.
00:17:14.960 | Like we don't give a fuck about whatever ancient Egypt,
00:17:19.960 | like how much grain got sent that day
00:17:22.800 | and sending the records and like, you know,
00:17:24.960 | like who went where and you know,
00:17:27.640 | how many shields needed to be produced for this.
00:17:29.680 | Like we just remember their art.
00:17:32.280 | And it's like, you know, it's like in our day-to-day life,
00:17:34.880 | there's all this stuff that seems more important than art
00:17:38.120 | because it helps us function and survive.
00:17:40.240 | But when all this is gone,
00:17:41.880 | like the only thing that's really gonna be left
00:17:44.520 | is the art, the technology will be obsolete.
00:17:46.800 | - That's so fascinating.
00:17:47.640 | - Like the humans will be dead.
00:17:49.080 | - That is true.
00:17:49.920 | A good compression of human history
00:17:52.520 | is the art we've generated across the different centuries,
00:17:56.200 | the different millennia.
00:17:57.800 | So when the aliens come.
00:17:59.920 | - When the aliens come,
00:18:00.740 | they're gonna find the hieroglyphics and the pyramids.
00:18:02.760 | - I mean, art could be broadly defined.
00:18:04.360 | They might find like the engineering marvels,
00:18:06.240 | the bridges, the rockets, the-
00:18:09.840 | - I guess I sort of classify though.
00:18:11.640 | Architecture is art.
00:18:12.880 | - Yes.
00:18:13.720 | - I consider engineering in those formats to be art,
00:18:18.720 | for sure.
00:18:19.560 | - It sucks that like digital art is easier to delete.
00:18:23.200 | So if there's an apocalypse, a nuclear war,
00:18:25.800 | that can disappear.
00:18:26.840 | - Yes.
00:18:28.080 | - And the physical,
00:18:28.920 | there's something still valuable
00:18:30.000 | about the physical manifestation of art.
00:18:32.360 | That sucks that like music, for example,
00:18:35.600 | has to be played by somebody.
00:18:37.960 | - Yeah, I mean, I do think we should do have
00:18:40.200 | a foundation type situation where we like,
00:18:42.800 | you know how we have like seed banks up in the North
00:18:44.720 | and stuff?
00:18:45.560 | - Yeah.
00:18:46.400 | - Like we should probably have like a solar powered
00:18:48.000 | or geothermal little bunker
00:18:49.760 | that like has all the all human knowledge.
00:18:52.360 | - You mentioned Daniel Ek and Spotify.
00:18:55.240 | What do you think about that as an artist?
00:18:56.960 | What's Spotify?
00:18:58.240 | Is that empowering?
00:19:02.560 | To me, Spotify sort of as a consumer is super exciting.
00:19:02.560 | It makes it easy for me to access music
00:19:04.960 | from all kinds of artists,
00:19:06.600 | get to explore all kinds of music,
00:19:08.400 | make it super easy to sort of curate my own playlist
00:19:12.280 | and have fun with all that.
00:19:13.960 | It was so liberating to let go.
00:19:16.040 | You know, I used to collect, you know,
00:19:17.560 | albums and CDs and so on,
00:19:19.360 | like horde albums.
00:19:22.120 | - Yeah.
00:19:22.960 | - Like they matter.
00:19:23.800 | But the reality you can, you know,
00:19:25.640 | that was really liberating that I could let go of that
00:19:28.200 | and letting go of the albums you're kind of collecting
00:19:32.160 | allows you to find new music,
00:19:33.600 | exploring new artists and all that kind of stuff.
00:19:36.200 | But I know from a perspective of an artist
00:19:37.920 | that could be, like you mentioned,
00:19:39.240 | competition could be a kind of constraint
00:19:42.040 | 'cause there's more and more and more artists
00:19:44.680 | on the platform.
00:19:46.040 | - I think it's better that there's more artists.
00:19:47.880 | I mean, again, this might be propaganda
00:19:49.840 | 'cause this is all from a conversation with Daniel Ek.
00:19:51.680 | So this could easily be propaganda.
00:19:54.040 | - We're all a victim of somebody's propaganda.
00:19:56.720 | So let's just accept this.
00:19:58.960 | - But Daniel Ek was telling me that, you know,
00:20:01.040 | at the, 'cause I, you know,
00:20:02.800 | when I met him, I like,
00:20:04.600 | I came in all furious about Spotify
00:20:06.440 | and like I grilled him super hard.
00:20:07.800 | So I've got his answers here.
00:20:10.640 | But he was saying like at the sort of peak
00:20:13.560 | of the CD industry,
00:20:15.280 | there was like 20,000 artists
00:20:17.440 | making millions and millions of dollars.
00:20:19.560 | Like there was just like a very tiny kind of 1%.
00:20:22.880 | And Spotify has kind of democratized the industry
00:20:27.440 | because now I think he said there's about a million artists
00:20:30.360 | making a good living from Spotify.
00:20:33.160 | And when I heard that, I was like,
00:20:34.800 | honestly, I would rather make less money
00:20:38.840 | and have just like a decent living
00:20:41.160 | than and have more artists be able to have that.
00:20:46.560 | Even though I like, I wish it could include everyone, but.
00:20:49.320 | - Yeah, that's really hard to argue with.
00:20:50.720 | YouTube is the same.
00:20:52.280 | It's YouTube's mission.
00:20:54.120 | They wanna basically have as many creators as possible
00:20:58.280 | and make a living, some kind of living.
00:21:00.080 | - Yeah.
00:21:00.920 | - And that's so hard to argue with.
00:21:02.760 | - It's so hard.
00:21:03.600 | But I think there's better ways to do it.
00:21:04.520 | My manager, I actually wish he was here.
00:21:06.320 | I like, I would have brought him up.
00:21:07.840 | My manager is building an app that can manage you.
00:21:12.840 | So it'll like help you organize your percentages
00:21:16.520 | and get your publishing and da, da, da, da, da.
00:21:18.880 | So you can take out all the middlemen.
00:21:20.000 | So you can have a much bigger, it'll just like automate it.
00:21:23.040 | So you can get.
00:21:23.880 | - So automate the manager?
00:21:24.720 | - Automate management, publishing,
00:21:28.060 | like and legal, it can read,
00:21:32.440 | the app he's building can read your contract
00:21:34.080 | and like tell you about it.
00:21:35.680 | Because one of the issues with music right now,
00:21:38.320 | it's not that we're not getting paid enough,
00:21:39.840 | but it's that the art industry is filled with middlemen
00:21:44.840 | because artists are not good at business.
00:21:47.780 | And from the beginning, like Frank Sinatra,
00:21:50.440 | it's all mob stuff.
00:21:51.660 | Like it's the music industry,
00:21:53.440 | is run by business people, not the artists.
00:21:57.600 | And the artists really get very small cuts
00:21:59.560 | of like what they make.
00:22:00.480 | And so I think part of the reason I'm a technocrat,
00:22:04.800 | which I mean, your fans are gonna be technocrats.
00:22:07.080 | So no one's, they're not gonna be mad at me about this,
00:22:09.320 | but like my fans hate it when I say this kind of thing
00:22:12.240 | or the general public.
00:22:13.200 | - They don't like technocrats.
00:22:14.240 | - They don't like technocrats.
00:22:15.640 | Like when I watched "Battle Angel Alita"
00:22:18.800 | and they were like, "The Martian technocracy."
00:22:20.440 | And I was like, "Yeah, Martian technocracy."
00:22:22.080 | And then they were like, "And they're evil."
00:22:23.600 | And I was like, "Oh, okay."
00:22:25.640 | I was like, "Cause Martian technocracy sounds sick to me."
00:22:28.840 | - Yeah, so your intuition as technocrats
00:22:32.000 | would create some kind of beautiful world.
00:22:34.280 | For example, what my manager's working on,
00:22:36.120 | if you can create an app that removes the need for a lawyer
00:22:39.960 | and then you could have a smart contracts on the blockchain,
00:22:43.440 | removes the need for like management
00:22:46.880 | and organizing all this stuff,
00:22:48.080 | like can read your stuff and explain it to you,
00:22:51.000 | can collect your royalties.
00:22:52.780 | Like then the small amounts,
00:22:57.040 | the amount of money that you're getting from Spotify
00:22:58.720 | actually means a lot more and goes a lot farther.
00:23:01.840 | - It can remove some of the bureaucracy,
00:23:03.180 | some of the inefficiencies that make life
00:23:06.400 | not as great as it could be.
00:23:08.240 | - Yeah, I think the issue isn't that there's not enough,
00:23:10.880 | like the issue is that there's inefficiency.
00:23:12.720 | And I'm really into this positive-sum mindset,
00:23:17.580 | the win-win mindset of like,
00:23:20.840 | instead of fighting over the scraps,
00:23:23.560 | how do we make the, or worrying about scarcity,
00:23:26.560 | like instead of a scarcity mindset,
00:23:27.840 | why don't we just increase the efficiency in that way?
00:23:32.400 | - Expand the size of the pie.
00:23:34.420 | Let me ask you about experimentation.
00:23:36.580 | So you said, which is beautiful,
00:23:38.660 | being a musician is like having a conversation
00:23:42.820 | with all those that came before you.
00:23:45.460 | How much of creating music is like
00:23:48.340 | kind of having that conversation,
00:23:53.100 | trying to fit into the cultural trends
00:23:57.260 | and how much of it is like trying to as much as possible
00:24:00.140 | to be an outsider and come up with something totally new?
00:24:02.640 | It's like when you're thinking, when you're experimenting,
00:24:05.680 | are you trying to be totally different, totally weird?
00:24:08.680 | Are you trying to fit in?
00:24:12.120 | - Man, this is so hard 'cause I feel like I'm
00:24:14.640 | kind of in the process of semi-retiring from music.
00:24:16.900 | So this is like my old brain.
00:24:18.760 | - Yeah, bring it from like the shelf,
00:24:22.080 | put it on the table for a couple of minutes.
00:24:24.440 | We'll just poke it.
00:24:26.280 | - I think it's a bit of both because I think
00:24:29.080 | forcing yourself to engage with new music,
00:24:31.440 | it's really great for neuroplasticity.
00:24:35.040 | Like I think, as people,
00:24:38.320 | part of the reason music is marketed at young people
00:24:41.680 | is 'cause young people are very neuroplastic.
00:24:43.400 | So like if you're 16 to like 23 or whatever,
00:24:48.240 | it's gonna be really easy for you to love new music.
00:24:50.860 | And if you're older than that,
00:24:52.280 | it gets harder and harder and harder.
00:24:53.760 | And I think one of the beautiful things
00:24:54.960 | about being a musician is I just constantly
00:24:57.280 | force myself to listen to new music.
00:24:58.780 | And I think it keeps my brain really plastic.
00:25:01.000 | And I think this is a really good exercise.
00:25:02.800 | I just think everyone should do this.
00:25:04.320 | You listen to new music and you hate it,
00:25:05.640 | I think you should just keep,
00:25:07.120 | force yourself to like, okay, well, why do people like it?
00:25:09.980 | And like, you know, make your brain form new neural pathways
00:25:14.680 | and be more open to change.
00:25:16.920 | - That's really brilliant, actually.
00:25:18.320 | Sorry to interrupt, but like that exercise
00:25:20.760 | is really amazing to sort of embrace change,
00:25:27.480 | embrace sort of practice neuroplasticity.
00:25:31.580 | Because like, that's one of the things,
00:25:33.180 | you fall in love with a certain band
00:25:34.020 | and you just kind of stay with that for the rest of your life
00:25:36.780 | and then you never understand the modern music.
00:25:38.420 | That's a really good exercise.
00:25:39.260 | - Most of the streaming on Spotify
00:25:40.620 | is like classic rock and stuff.
00:25:42.420 | Like new music makes up a very small chunk
00:25:44.780 | of what is played on Spotify.
00:25:46.820 | And I think this is like not a good sign
00:25:48.980 | for us as a species.
00:25:50.140 | I think, yeah.
00:25:52.900 | - So a good measure of the species' open-mindedness
00:25:57.340 | to change is how often you listen to new music.
00:26:00.580 | - Yeah.
00:26:01.420 | - The brain, let's put the music brain back on the shelf.
00:26:05.200 | I gotta pull out the futurist brain for a second.
00:26:08.300 | In what wild ways do you think the future,
00:26:12.280 | say in like 30 years, maybe 50 years,
00:26:14.980 | maybe a hundred years will be different
00:26:16.920 | from like from our current way of life on earth?
00:26:22.120 | We can talk about augmented reality, virtual reality,
00:26:25.400 | maybe robots, maybe space travel, maybe video games,
00:26:32.540 | maybe genetic engineering.
00:26:32.540 | I can keep going.
00:26:33.380 | Cyborgs, aliens, world wars,
00:26:36.260 | maybe destructive nuclear wars, good and bad.
00:26:39.220 | When you think about the future, what are you imagining?
00:26:43.540 | What's the weirdest and the wildest it could be?
00:26:45.940 | - Have you read "Surface Detail" by Iain Banks?
00:26:50.100 | "Surface Detail" is my favorite depiction of a,
00:26:54.800 | oh wow, you have to read this book.
00:26:56.540 | It's literally the greatest science fiction book
00:26:58.660 | possibly ever written.
00:26:59.500 | - Iain Banks is the man, yeah, for sure.
00:27:01.580 | - What have you read?
00:27:03.180 | - Just "The Player of Games."
00:27:04.500 | - I read that titles can't be copyrighted
00:27:07.340 | so you can just steal them.
00:27:08.260 | And I was like, "Player of Games, sick."
00:27:09.940 | - Nice.
00:27:10.760 | - Yeah, so you could name your album.
00:27:12.740 | Like I always wanted--
00:27:13.580 | - "Romeo and Juliet" or something?
00:27:14.980 | - I always wanted to name an album "War and Peace."
00:27:17.060 | - Nice.
00:27:17.900 | - Like that would be like--
00:27:18.740 | - That is a good, that's a good,
00:27:20.420 | where have I heard that before?
00:27:21.580 | - You can do that, like you could do that.
00:27:24.280 | Also things that are in the public domain.
00:27:26.060 | - For people who have no clue,
00:27:27.180 | you do have a song called "Player of Games."
00:27:29.660 | - Yes.
00:27:30.500 | Oh yeah, so Iain Banks' "Surface Detail"
00:27:32.460 | is in my opinion the best feature
00:27:34.980 | that I've ever read about or heard about in science fiction.
00:27:38.660 | Basically there's the relationship with super intelligence,
00:27:44.540 | like artificial super intelligence is just, it's like great.
00:27:49.140 | I wanna credit the person who coined this term
00:27:53.060 | 'cause I love this term.
00:27:55.220 | And I feel like young women don't get enough credit in.
00:27:58.640 | Yeah, so if you go to Protopia Futures on Instagram,
00:28:03.920 | what is her name?
00:28:08.960 | - Monika Bielskyte, I'm saying that wrong.
00:28:13.960 | And I'm probably gonna, I'm probably butchering this a bit,
00:28:17.680 | but Protopia is sort of, if Utopia is unattainable,
00:28:21.560 | Protopia is sort of like, you know.
00:28:26.000 | - Wow, that's an awesome Instagram, Protopia Futures.
00:28:28.640 | - A great, a future that is as good as we can get.
00:28:33.480 | - The future, positive future.
00:28:34.720 | AI, is this a centralized AI in "Surface Detail"
00:28:38.360 | or is it distributed?
00:28:39.280 | What kind of AI is it?
00:28:40.600 | - They mostly exist as giant super ships,
00:28:43.080 | sort of like the guild ships in "Dune."
00:28:45.800 | Like they're these giant ships
00:28:46.800 | that kind of move people around and the ships are sentient.
00:28:49.480 | And they can talk to all the passengers.
00:28:52.240 | And I mean, there's a lot of different types of AI
00:28:56.440 | in the Banksian future,
00:28:58.320 | but in the opening scene of "Surface Detail,"
00:29:01.080 | there's this place called the Culture
00:29:02.320 | and the Culture is basically a Protopian future.
00:29:04.440 | And a Protopian future, I think,
00:29:07.320 | is like a future that is like,
00:29:08.840 | obviously it's not Utopia, it's not perfect.
00:29:12.400 | And like, 'cause like striving for Utopia,
00:29:14.000 | I think feels hopeless and it's sort of like,
00:29:16.960 | maybe not the best terminology to be using.
00:29:19.280 | So it's like, it's a pretty good place.
00:29:23.960 | Like mostly like, you know,
00:29:27.680 | super intelligence and biological beings
00:29:29.880 | exist fairly in harmony.
00:29:31.920 | There's not too much war.
00:29:33.120 | There's like as close to equality as you can get.
00:29:35.680 | You know, it's like approximately a good future.
00:29:38.680 | Like there's really awesome stuff.
00:29:40.440 | And in the opening scene,
00:29:45.600 | this girl, she's born as a sex slave outside of the culture.
00:29:49.520 | So she's in a society that doesn't adhere
00:29:51.200 | to the cultural values.
00:29:52.640 | She tries to kill the guy who is her like master,
00:29:56.680 | but he kills her.
00:29:57.920 | But unbeknownst to her,
00:29:59.040 | when she was traveling on a ship through the culture
00:30:01.360 | with him one day,
00:30:02.320 | a ship put a neural lace in her head.
00:30:05.680 | And neural lace is sort of like,
00:30:08.480 | it's basically a neural link.
00:30:09.920 | 'Cause life imitates art.
00:30:13.040 | - It does indeed.
00:30:14.160 | - It does indeed.
00:30:15.000 | So she wakes up and the opening scene
00:30:16.480 | is her memory has been uploaded by this neural lace
00:30:19.080 | when she has been killed.
00:30:20.040 | And now she gets to choose a new body.
00:30:22.680 | And this AI is interfacing with her recorded memory
00:30:26.920 | in her neural lace and helping her
00:30:29.520 | and being like, "Hello, you're dead.
00:30:31.320 | But because you had a neural lace, your memory's uploaded.
00:30:33.760 | Do you want to choose a new body?
00:30:35.000 | And you're going to be born here in the culture
00:30:36.480 | and like start a new life."
00:30:38.160 | Which is just, that's like the opening.
00:30:39.960 | It's like so sick.
00:30:41.440 | - And the ship is the super intelligence.
00:30:43.680 | All the ships are kind of super intelligence.
00:30:45.120 | - But they still want to preserve
00:30:46.920 | a kind of rich, fulfilling experience for the humans.
00:30:49.680 | - Yeah, like they're like friends with the humans.
00:30:51.000 | And then there's a bunch of ships
00:30:51.960 | that don't want to exist with biological beings,
00:30:54.440 | but they just have their own place, like way over there.
00:30:57.080 | - But they don't, they just do their own thing.
00:30:58.920 | They're not necessarily.
00:31:00.320 | So it's a pretty,
00:31:01.520 | this protopian existence is pretty peaceful.
00:31:03.640 | - Yeah, I mean, and then for example,
00:31:05.840 | one of the main fights in the book is,
00:31:07.900 | they're fighting, there's these artificial hells
00:31:13.160 | that, and people don't think it's ethical
00:31:16.720 | to have artificial hell.
00:31:17.600 | Like basically when people do crime, they get sent,
00:31:19.560 | like when they die, their memory gets sent
00:31:21.160 | to an artificial hell and they're eternally tortured.
00:31:23.480 | And so, and then the way that society is deciding
00:31:27.680 | whether or not to have the artificial hell
00:31:29.320 | is that they're having these simulated,
00:31:31.920 | they're having like a simulated war.
00:31:33.240 | So instead of actual blood,
00:31:36.040 | people are basically essentially fighting in a video game
00:31:38.560 | to choose the outcome of this.
00:31:40.120 | - But they're still experiencing the suffering, right?
00:31:42.760 | - Wait, in this artificial hell or no?
00:31:44.520 | Can you experience stuff or?
00:31:45.760 | - So the artificial hell sucks.
00:31:47.240 | And a lot of people in the culture
00:31:48.360 | wanna get rid of the artificial hell.
00:31:49.920 | - There's a simulated wars,
00:31:51.320 | are they happening in the artificial hell?
00:31:53.920 | - No, the simulated wars are happening
00:31:55.520 | outside of the artificial hell.
00:31:57.120 | Between the political factions who are,
00:31:59.960 | so this political faction says,
00:32:01.640 | "We should have simulated hell to deter crime."
00:32:05.040 | And this political faction is saying,
00:32:06.880 | "No, simulated hell is unethical."
00:32:08.920 | And so instead of like having,
00:32:11.720 | blowing each other up with nukes,
00:32:13.160 | they're having like a giant Fortnite battle to decide this.
00:32:18.160 | Which, to me that's protopia.
00:32:21.720 | That's like, okay, we can have war without death.
00:32:24.560 | I don't think there should be simulated hells.
00:32:27.480 | I think that is definitely one of the ways
00:32:29.840 | in which technology could go very, very, very, very wrong.
00:32:34.320 | - So almost punishing people in a digital space
00:32:37.160 | or something like that.
00:32:38.000 | - Yeah, like torturing people's memories.
00:32:41.360 | - So either as a deterrent, like if you committed a crime,
00:32:44.520 | but also just for personal pleasure,
00:32:46.280 | if there's some sick, demented humans in this world.
00:32:49.000 | Dan Carlin actually has this episode
00:32:53.000 | of "Hardcore History" on painful attainment.
00:32:59.200 | - Oh, that episode is fucked.
00:33:02.040 | - It's dark 'cause he kind of goes through human history
00:33:05.120 | and says like, "We as humans seem to enjoy,
00:33:08.560 | "secretly enjoy or used to be openly enjoy
00:33:11.960 | "sort of the torture and the death,
00:33:14.240 | "watching the death and torture of other humans."
00:33:17.680 | - I do think if people were consenting,
00:33:21.680 | we should be allowed to have gladiatorial matches.
00:33:26.200 | - But consent is hard to achieve in those situations.
00:33:28.520 | It always starts getting slippery.
00:33:31.000 | Like it could be also forced,
00:33:32.720 | like it starts getting weird.
00:33:34.240 | - Yeah, yeah.
00:33:35.240 | - There's way too much excitement.
00:33:37.280 | Like this is what he highlights.
00:33:38.600 | There's something about human nature
00:33:40.480 | that wants to see that violence.
00:33:42.240 | And it's really dark.
00:33:44.240 | And you hope that we can sort of overcome
00:33:47.120 | that aspect of human nature,
00:33:48.800 | but that's still within us somewhere.
00:33:51.240 | - Well, I think that's what we're doing right now.
00:33:53.200 | I have this theory that what is very important
00:33:56.360 | about the current moment is that all of evolution
00:34:00.840 | has been survival of the fittest up until now.
00:34:03.360 | And at some point, the lines are kind of fuzzy,
00:34:07.240 | but in the recent past, or maybe even just right now,
00:34:12.080 | we're getting to this point
00:34:14.400 | where we can choose intelligent design.
00:34:19.400 | Like we, probably since like the integration of the iPhone,
00:34:23.440 | like we are becoming cyborgs.
00:34:24.800 | Like our brains are fundamentally changed.
00:34:27.640 | Everyone who grew up with electronics,
00:34:29.320 | we are fundamentally different from previous,
00:34:32.480 | from Homo sapiens.
00:34:33.400 | I call us Homo techno.
00:34:34.640 | I think we have evolved into Homo techno,
00:34:36.800 | which is like essentially a new species.
00:34:39.360 | Like if you look at the way,
00:34:41.360 | if you took an MRI of my brain
00:34:43.360 | and you took an MRI of like a medieval brain,
00:34:46.560 | I think it would be very different
00:34:48.120 | the way that it has evolved.
00:34:49.960 | - Do you think when historians look back at this time,
00:34:51.960 | they'll see like this was a fundamental shift
00:34:54.000 | to what a human being is?
00:34:54.840 | - I do not think we are still Homo sapiens.
00:34:58.000 | I believe we are Homo techno.
00:34:59.520 | And I think we have evolved.
00:35:04.400 | And I think right now, the way we are evolving,
00:35:07.400 | we can choose how we do that.
00:35:09.080 | And I think we were being very reckless
00:35:10.680 | about how we're doing that.
00:35:11.800 | Like we're just having social media,
00:35:13.000 | but I think this idea that like this is a time
00:35:16.320 | to choose intelligent design
00:35:17.760 | should be taken very seriously.
00:35:19.600 | It like now is the moment to reprogram the human computer.
00:35:22.640 | You know, it's like, if you go blind,
00:35:27.240 | your visual cortex will get taken over with other functions.
00:35:32.000 | We can choose our own evolution.
00:35:35.160 | We can change the way our brains work.
00:35:37.160 | And so we actually have a huge responsibility to do that.
00:35:39.920 | And I think, I'm not sure who should be responsible for that
00:35:42.880 | but there's definitely not adequate education.
00:35:45.200 | We're being inundated with all this technology
00:35:46.960 | that is fundamentally changing
00:35:49.040 | the physical structure of our brains.
00:35:50.920 | And we are not adequately responding to that
00:35:54.760 | to choose how we wanna evolve.
00:35:57.480 | And we could evolve, we could be really whatever we want.
00:36:00.600 | And I think this is a really important time.
00:36:03.040 | And I think if we choose correctly and we choose wisely,
00:36:06.480 | consciousness could exist for a very long time
00:36:09.760 | and integration with AI could be extremely positive.
00:36:12.720 | And I don't think enough people are focusing
00:36:14.400 | on this specific situation.
00:36:16.400 | - Do you think we might irreversibly screw things up
00:36:18.680 | if we get things wrong now?
00:36:20.000 | 'Cause like the flip side of that,
00:36:21.640 | it seems humans are pretty adaptive.
00:36:23.200 | So maybe the way we figure things out is by screwing it up.
00:36:26.880 | Like social media over a generation
00:36:29.480 | we'll see the negative effects of social media
00:36:31.200 | and then we build new social medias
00:36:33.080 | and we just keep improving stuff.
00:36:34.960 | And then we learn the failure from the failures of the past.
00:36:37.640 | 'Cause humans seem to be really adaptive.
00:36:39.880 | On the flip side, we can get it wrong in a way
00:36:42.960 | where like literally we create weapons of war
00:36:46.360 | or increase hate past a certain threshold
00:36:49.600 | we really do a lot of damage.
00:36:51.840 | - I mean, I think we're optimized
00:36:53.680 | to notice the negative things.
00:36:55.680 | But I would actually say,
00:36:59.320 | you know, one of the things
00:37:00.160 | that I think people aren't noticing is like,
00:37:02.360 | if you look at Silicon Valley and you look at like,
00:37:04.320 | whatever the technocracy, like what's been happening there.
00:37:08.360 | Like it's like when Silicon Valley started,
00:37:10.080 | it was all just like Facebook
00:37:11.440 | and all this like for-profit crap
00:37:14.040 | that like really wasn't particular.
00:37:16.920 | I guess it was useful, but it was, it's sort of
00:37:19.320 | just like whatever.
00:37:22.200 | But like now you see like lab grown meat,
00:37:24.680 | like compostable or like biodegradable,
00:37:28.400 | like single use cutlery or like meditation apps.
00:37:33.160 | I think we are actually evolving and changing
00:37:38.160 | and technology is changing.
00:37:39.640 | I think they're just maybe
00:37:41.600 | there isn't quite enough education about this.
00:37:47.560 | And also I don't know if there's like
00:37:50.320 | quite enough incentive for it
00:37:52.960 | because I think the way capitalism works,
00:37:56.640 | what we define as profit,
00:37:58.560 | we're also working on an old model
00:38:00.440 | of what we define as profit.
00:38:01.600 | I really think if we changed
00:38:04.200 | the idea of profit to include social good,
00:38:08.280 | you can have like economic profit,
00:38:09.880 | social good also counting as profit
00:38:12.640 | would incentivize things that are more useful
00:38:15.160 | and more whatever spiritual technology
00:38:17.000 | or like positive technology,
00:38:18.040 | or you know, things that help reprogram
00:38:21.320 | a human computer in a good way,
00:38:22.960 | or things that help us intelligently design our new brains.
00:38:27.960 | - Yeah, there's no reason why
00:38:29.440 | within the framework of capitalism,
00:38:31.360 | the word profit or the idea of profit
00:38:33.840 | can't also incorporate, you know,
00:38:35.640 | the well-being of a human being.
00:38:37.400 | So like long-term well-being, long-term happiness.
00:38:40.140 | - Or even for example,
00:38:42.800 | you know, we were talking about motherhood,
00:38:43.880 | like part of the reason I'm so late
00:38:44.880 | is 'cause I had to get the baby to bed.
00:38:47.440 | And it's like, I keep thinking about motherhood,
00:38:48.920 | how under capitalism,
00:38:51.400 | it's like this extremely essential job
00:38:53.720 | that is very difficult, that is not compensated.
00:38:56.520 | And we sort of like value things
00:38:58.480 | by how much we compensate them.
00:39:01.960 | And so we really devalue motherhood in our society
00:39:04.920 | and pretty much all societies.
00:39:06.120 | Like capitalism does not recognize motherhood.
00:39:08.200 | It's just a job that you're supposed to do for free.
00:39:11.200 | And it's like, but I feel like producing great humans
00:39:15.400 | should be seen as a great, as profit under capitalism.
00:39:19.480 | Like that should be, that's like a huge social good.
00:39:21.520 | Like every awesome human that gets made
00:39:24.200 | adds so much to the world.
00:39:25.640 | So like if that was integrated into the profit structure,
00:39:29.880 | then, you know, and if we potentially found a way
00:39:34.280 | to compensate motherhood.
00:39:35.680 | - So come up with a compensation
00:39:37.960 | that's much broader than just money or--
00:39:40.680 | - Or it could just be money.
00:39:42.080 | Like what if you just made, I don't know,
00:39:45.520 | but I don't know how you'd pay for that.
00:39:46.680 | Like, I mean, that's where you start getting into--
00:39:49.480 | - Reallocation of resources that people get upset over.
00:39:56.360 | - Well, what if we made like a motherhood DAO?
00:39:58.600 | (both laughing)
00:40:00.040 | - Yeah, yeah, yeah.
00:40:01.240 | - You know, and used it to fund like single mothers,
00:40:06.240 | like, you know, pay for making babies.
00:40:13.640 | So, I mean, if you create and put beautiful things
00:40:17.360 | onto the world, that could be companies,
00:40:19.760 | that can be bridges, that could be art,
00:40:22.640 | that could be a lot of things,
00:40:24.280 | and that could be children, which are--
00:40:28.080 | - Or education or--
00:40:29.320 | - Anything that could just should be valued by society.
00:40:32.400 | And that should be somehow incorporated
00:40:34.240 | into the framework of what, as a market of what,
00:40:38.480 | like if you contribute children to this world,
00:40:40.520 | that should be valued and respected.
00:40:42.440 | And sort of celebrated, like proportional to what it is,
00:40:47.440 | which is it's the thing that fuels human civilization.
00:40:51.520 | - Yeah, like I--
00:40:52.360 | - It's kind of important.
00:40:53.200 | - I feel like everyone's always saying,
00:40:54.560 | I mean, I think we're in very different social spheres,
00:40:56.480 | but everyone's always saying like dismantle capitalism.
00:40:58.840 | And I'm like, well, okay,
00:40:59.680 | well, I don't think the government should own everything.
00:41:02.080 | Like, I don't think we should not have private ownership.
00:41:04.160 | Like, that's scary.
00:41:05.320 | You know, like that starts getting into weird stuff
00:41:07.240 | and just sort of like,
00:41:08.880 | I feel there's almost no way to do that
00:41:10.040 | without a police state, you know?
00:41:11.960 | - Yeah.
00:41:13.560 | - But obviously capitalism has some major flaws.
00:41:17.040 | And I think actually Mack showed me this idea
00:41:20.400 | called social capitalism,
00:41:22.160 | which is a form of capitalism that just like
00:41:24.640 | considers social good to be also profit.
00:41:28.600 | Like, you know, it's like right now companies need to,
00:41:31.640 | like you're supposed to grow every quarter or whatever
00:41:34.600 | to like show that you're functioning well.
00:41:38.760 | But it's like, okay, well,
00:41:39.960 | what if you kept the same amount of profit,
00:41:42.960 | you're still in the green,
00:41:43.960 | but then you have also all the social good.
00:41:45.640 | Like, do you really need all this extra economic growth
00:41:47.600 | or could you add this social good and that counts?
00:41:49.560 | And you know, I don't know if I am not an economist.
00:41:54.040 | I have no idea how this could be achieved, but--
00:41:56.400 | - I don't think economists know
00:41:57.880 | how anything could be achieved either, but they pretend.
00:42:00.360 | That's the thing, they construct a model
00:42:02.360 | and they go on TV shows and sound like an expert.
00:42:06.200 | That's the definition of an economist.
00:42:08.560 | - How did being a mother, becoming a mother
00:42:12.280 | change you as a human being, would you say?
00:42:15.240 | - Man, I think it kind of changed everything
00:42:21.040 | and it's still changing me a lot.
00:42:22.400 | It's actually changing me more right now in this moment
00:42:24.920 | than it was before.
00:42:26.400 | - Like today?
00:42:27.440 | Like this--
00:42:28.520 | - Just like in the most recent months and stuff.
00:42:32.500 | - Can you elucidate that how,
00:42:37.600 | like when you wake up in the morning
00:42:39.360 | and you look at yourself, it's again, who are you?
00:42:41.860 | How have you become different, would you say?
00:42:45.520 | - I think it's just really reorienting my priorities.
00:42:50.520 | And at first I was really fighting against that
00:42:53.400 | because I somehow felt it was like a failure
00:42:55.600 | of feminism or something.
00:42:56.680 | Like I felt like it was like bad
00:42:59.840 | if like my kids started mattering more than my work.
00:43:05.680 | And then like more recently I started sort of analyzing
00:43:09.000 | that thought in myself and being like,
00:43:12.280 | that's also kind of a construct.
00:43:13.920 | It's like we've just devalued motherhood so much
00:43:16.320 | in our culture that like I feel guilty for caring
00:43:21.320 | about my kids more than I care about my work.
00:43:23.360 | - So feminism includes breaking out
00:43:25.800 | of whatever the construct is.
00:43:27.500 | So it's continually breaking, it's like freedom,
00:43:32.960 | empowering you to be free, and that means--
00:43:36.080 | - But it also, but like being a mother,
00:43:40.080 | like I'm so much more creative.
00:43:41.880 | Like I cannot believe the massive amount
00:43:45.920 | of brain growth that I have.
00:43:48.480 | - Why do you think that is?
00:43:49.320 | Just 'cause like the stakes are higher somehow?
00:43:51.980 | - I think it's like, it's just so trippy
00:43:58.040 | watching consciousness emerge.
00:44:00.840 | It's just like, it's like going on a crazy journey
00:44:05.440 | or something, it's like the craziest science fiction novel
00:44:10.240 | you could ever read.
00:44:11.080 | It's just so crazy watching consciousness come into being.
00:44:15.120 | And then at the same time,
00:44:16.560 | like you're forced to value your time so much.
00:44:21.160 | Like when I have creative time now, it's so sacred.
00:44:23.560 | I need to like be really fricking on it.
00:44:29.400 | But the other thing is that I used to just be like a cynic
00:44:34.400 | and I used to just wanna,
00:44:35.560 | like my last album was called "Miss Anthropocene"
00:44:38.120 | and it was like a study in villainy,
00:44:42.640 | or like it was like, well, what if we have,
00:44:45.020 | instead of the old gods, we have like new gods
00:44:46.840 | and it's like Miss Anthropocene is like misanthrope
00:44:49.740 | plus Anthropocene, which is like the,
00:44:52.200 | like and she's the goddess of climate change or whatever.
00:44:55.800 | And she's like destroying the world.
00:44:57.000 | And it was just like, it was like dark
00:44:59.680 | and it was like a study in villainy.
00:45:01.400 | And it was sort of just like,
00:45:02.720 | like I used to like have no problem just making cynical,
00:45:05.820 | angry, scary art.
00:45:09.040 | And not that there's anything wrong with that,
00:45:11.140 | but I think having kids just makes you such an optimist.
00:45:13.600 | It just inherently makes you wanna be an optimist so bad
00:45:16.960 | that like I feel like I have more of a responsibility
00:45:20.400 | to make more optimistic things.
00:45:23.880 | And I get a lot of shit for it
00:45:25.760 | 'cause everyone's like, oh, you're so privileged.
00:45:28.560 | Stop talking about like pie in the sky,
00:45:30.320 | stupid concepts and focus on like the now.
00:45:32.760 | But it's like, I think if we don't ideate
00:45:36.560 | about futures that could be good,
00:45:40.600 | we won't be able to get them.
00:45:41.600 | If everything is Blade Runner,
00:45:42.840 | then we're gonna end up with Blade Runner.
00:45:44.320 | It's like, as we said earlier, life imitates art.
00:45:47.360 | Like life really does imitate art.
00:45:49.800 | And so we really need more protopian or utopian art.
00:45:54.040 | I think this is incredibly essential
00:45:56.120 | for the future of humanity.
00:45:58.120 | And I think the current discourse, where thinking
00:46:03.120 | about protopia or utopia
00:46:08.280 | is seen as a dismissal of the problems
00:46:11.640 | that we currently have.
00:46:12.480 | I think that is an incorrect mindset.
00:46:15.000 | And like having kids just makes me wanna imagine
00:46:20.240 | amazing futures that like maybe I won't be able to build,
00:46:24.440 | but they will be able to build if they want to.
00:46:26.960 | - Yeah, it does seem like ideation
00:46:28.320 | is a precursor to creation.
00:46:30.320 | So you have to imagine it in order to be able to build it.
00:46:33.840 | And there is a sad thing about human nature
00:46:36.720 | that somehow a cynical view of the world
00:46:40.760 | is seen as an insightful view.
00:46:44.400 | Cynicism is often confused for insight,
00:46:47.000 | which is sad to see.
00:46:48.920 | And optimism is confused for naivete.
00:46:52.640 | - Yes, yes.
00:46:53.480 | - Like you don't, you're blinded by your,
00:46:57.680 | maybe your privilege or whatever.
00:46:59.320 | You're blinded by something, but you're certainly blinded.
00:47:02.040 | That's sad.
00:47:03.800 | That's sad to see because it seems like the optimists
00:47:06.000 | are the ones that create our future.
00:47:10.280 | They're the ones that build.
00:47:11.920 | In order to build the crazy thing,
00:47:13.560 | you have to be optimistic.
00:47:14.760 | You have to be either stupid or excited or passionate
00:47:19.240 | or mad enough to actually believe that it can be built.
00:47:22.760 | And those are the people that built it.
00:47:24.240 | - My favorite quote of all time
00:47:26.640 | is from "Star Wars" episode eight,
00:47:29.080 | which I know everyone hates.
00:47:31.000 | Do you like "Star Wars" episode eight?
00:47:32.480 | - No, I probably would say I would probably hate it, yeah.
00:47:35.840 | I don't have strong feelings about it.
00:47:38.960 | Let me backtrack.
00:47:39.800 | I don't have strong feelings about "Star Wars."
00:47:41.560 | - I just wanna--
00:47:42.400 | - I'm a Tolkien person.
00:47:43.240 | I'm more into dragons and orcs and ogres.
00:47:47.880 | - Yeah, I mean, Tolkien forever.
00:47:49.640 | I really wanna have one more son and call him,
00:47:51.800 | I thought Tau-Techno-Tolkien would be cool.
00:47:55.280 | - That's a lot of Ts, I like it.
00:47:56.960 | - Yeah, and Tau is 6.28, two pi.
00:47:59.280 | - Yeah, Tau-Techno, yeah, yeah, yeah.
00:48:01.760 | - And then techno is obviously the best genre of music,
00:48:04.800 | but also like technocracy.
00:48:06.240 | - It just sounds really good.
00:48:07.080 | And yeah, that's right, Techno-Tolkien.
00:48:09.560 | (laughs)
00:48:10.400 | Tau-Techno-Tolkien, that's a good--
00:48:11.680 | - Tau-Techno-Tolkien.
00:48:12.960 | But "Star Wars Episode VIII,"
00:48:15.080 | I know a lot of people have issues with it.
00:48:17.240 | Personally, on the record,
00:48:18.720 | I think it's the best "Star Wars" film.
00:48:21.200 | - Starting trouble today.
00:48:25.160 | - Yeah.
00:48:26.000 | - So what--
00:48:26.840 | - But don't kill what you hate, save what you love.
00:48:29.240 | - Don't kill what you hate--
00:48:30.440 | - Don't kill what you hate, save what you love.
00:48:32.320 | And I think we're in a society right now,
00:48:34.960 | we're in a diagnosis mode.
00:48:36.600 | We're just diagnosing and diagnosing and diagnosing.
00:48:39.240 | And we're trying to kill what we hate,
00:48:41.640 | and we're not trying to save what we love enough.
00:48:44.280 | And there's this Buckminster Fuller quote,
00:48:46.280 | which I'm gonna butcher
00:48:47.200 | 'cause I don't remember it correctly,
00:48:48.280 | but it's something along the lines of,
00:48:50.560 | don't try to destroy the old bad models,
00:48:57.600 | render them obsolete with better models.
00:49:01.640 | Maybe we don't need to destroy the oil industry,
00:49:05.680 | maybe we just create great new battery technology
00:49:09.000 | and sustainable transport,
00:49:10.400 | and just make it economically unreasonable
00:49:13.200 | to still continue to rely on fossil fuels.
00:49:15.480 | It's like, don't kill what you hate, save what you love.
00:49:20.200 | Make new things and just render the old things unusable.
00:49:24.520 | It's like if the college debt is so bad,
00:49:29.120 | and universities are so expensive,
00:49:31.280 | I feel like education is becoming obsolete.
00:49:35.720 | I feel like we could completely revolutionize education
00:49:38.520 | and we could make it free.
00:49:39.400 | And it's like, you look at JSTOR,
00:49:40.440 | and you have to pay to get all the studies and everything.
00:49:43.480 | What if we created a DAO that bought JSTOR,
00:49:46.560 | or we created a DAO that was funding studies,
00:49:48.520 | and those studies were open source, or free for everyone?
00:49:51.920 | And what if we just open sourced education
00:49:55.160 | and decentralized education and made it free,
00:49:57.040 | and all research was on the internet,
00:50:00.920 | and all the outcomes of studies are on the internet?
00:50:05.320 | And no one has student debt,
00:50:10.120 | and you just take tests when you apply for a job,
00:50:14.280 | and if you're qualified, then you can work there.
00:50:17.000 | I mean, this is just like,
00:50:19.600 | I don't know how anything works,
00:50:20.560 | I'm just randomly ranting, but.
00:50:22.800 | - I like the humility.
00:50:24.000 | You gotta think from just basic first principles,
00:50:27.640 | like what is the problem, what's broken,
00:50:29.720 | what are some ideas, that's it.
00:50:31.400 | And get excited about those ideas,
00:50:33.160 | and share your excitement,
00:50:34.680 | and don't tear each other down.
00:50:37.120 | - It's just when you kill things,
00:50:38.160 | you often end up killing yourself.
00:50:40.120 | Like war is not a one-sided,
00:50:43.360 | like you're not gonna go in and just kill them,
00:50:45.080 | like you're gonna get stabbed.
00:50:46.920 | It's like, and I think,
00:50:48.880 | when I talk about this nexus point of,
00:50:51.400 | that we're in this point in society
00:50:53.320 | where we're switching to intelligent design,
00:50:55.120 | I think part of our switch to intelligent design
00:50:57.240 | is that we need to choose nonviolence.
00:50:59.280 | I think we can choose to start,
00:51:04.280 | I don't think we can eradicate violence from our species,
00:51:07.360 | because I think we need it a little bit,
00:51:10.560 | but I think we can choose to really reorient
00:51:13.240 | our primitive brains that are fighting over scarcity,
00:51:16.440 | that are so attack-oriented,
00:51:20.600 | and move into, we can optimize for creativity and building.
00:51:25.400 | - Yeah, it's interesting to think how that happens.
00:51:27.200 | So some of it is just education,
00:51:29.560 | some of it is living life and introspecting your own mind,
00:51:34.080 | and trying to live up to the better angels of your nature
00:51:37.760 | for each one of us, all those kinds of things at scale.
00:51:41.760 | That's how we can sort of start to minimize
00:51:44.520 | the amount of destructive war in our world.
00:51:48.120 | And that's, to me, I probably hear the same,
00:51:51.080 | technology is a really promising way to do that.
00:51:55.160 | Like social media should be a really promising way
00:51:57.640 | to do that, it's a way we connect.
00:52:00.000 | I, you know, for the most part,
00:52:01.480 | I really enjoy social media.
00:52:03.200 | I just ignore all the negative stuff.
00:52:05.160 | I don't engage with any of the negative stuff.
00:52:07.520 | Just not even like by blocking or any of that kind of stuff,
00:52:11.360 | but just not letting it enter my mind.
00:52:14.720 | Like just, like when somebody says something negative,
00:52:18.480 | I see it, I immediately think positive thoughts about them,
00:52:23.240 | and I just forget they exist.
00:52:24.840 | (laughs)
00:52:25.680 | After that, just move on, 'cause like that negative energy,
00:52:28.640 | if I return the negative energy,
00:52:30.280 | they're going to get excited in a negative way right back.
00:52:34.280 | And it's just this kind of vicious cycle.
00:52:37.360 | But you would think technology would assist us
00:52:40.560 | in this process of letting go,
00:52:42.880 | of not taking things personally,
00:52:44.720 | of not engaging in the negativity.
00:52:46.360 | But unfortunately, social media profits from the negativity.
00:52:50.240 | So the current models.
00:52:52.000 | - I mean, social media is like a gun.
00:52:53.400 | Like you should take a course before you use it.
00:52:57.240 | Like it's like-- - So true.
00:52:58.640 | This is what I mean,
00:52:59.480 | like when I say reprogram the human computer,
00:53:01.120 | like in school, you should learn
00:53:03.120 | about how social media optimizes
00:53:05.080 | to raise your cortisol levels
00:53:06.840 | and make you angry and crazy and stressed.
00:53:09.240 | And like you should learn how to have hygiene
00:53:12.600 | about how you use social media.
00:53:14.840 | But so you can, yeah,
00:53:18.440 | choose not to focus on the negative stuff.
00:53:19.920 | But I don't know, I'm not sure social media should,
00:53:24.640 | I guess it should exist, I'm not sure.
00:53:27.480 | - I mean, we're in the messy, it's the experimental phase.
00:53:29.520 | Like we're working it out. - Yeah, it's the early days.
00:53:31.120 | I don't even know when you say social media,
00:53:32.760 | I don't know what that even means.
00:53:33.880 | We're in the very early days.
00:53:35.080 | I think social media is just basic human connection
00:53:37.840 | in the digital realm.
00:53:39.320 | And that, I think it should exist.
00:53:42.000 | But there's so many ways to do it in a bad way.
00:53:44.000 | There's so many ways to do it in a good way.
00:53:45.960 | There's all discussions of all the same human rights.
00:53:48.160 | We talk about freedom of speech.
00:53:49.920 | We talk about sort of violence
00:53:52.280 | in the space of digital media.
00:53:54.080 | We talk about hate speech.
00:53:56.200 | We talk about all these things that we had to figure out
00:53:59.040 | back in the day in the physical space,
00:54:01.320 | we're now figuring out in the digital space.
00:54:03.680 | And it's like baby stages.
00:54:06.560 | - When the printing press came out,
00:54:07.880 | it was like pure chaos for a minute.
00:54:09.640 | (Lex laughing)
00:54:10.480 | It's like when you inject,
00:54:12.360 | when there's a massive information injection
00:54:14.360 | into the general population,
00:54:17.120 | there's just gonna be,
00:54:19.840 | I feel like the printing press,
00:54:21.560 | I don't have the years,
00:54:23.520 | but it was like printing press came out,
00:54:25.040 | shit got really fucking bad for a minute,
00:54:27.200 | but then we got the enlightenment.
00:54:29.200 | And so it's like, I think we're in,
00:54:31.000 | this is like the second coming of the printing press.
00:54:34.800 | We're probably gonna have some shitty times for a minute.
00:54:37.800 | And then we're gonna have to recalibrate
00:54:40.720 | to have a better understanding of how we consume media
00:54:44.680 | and how we deliver media.
00:54:47.960 | - Speaking of programming the human computer,
00:54:50.960 | you mentioned Baby X.
00:54:52.960 | So there's this young consciousness coming to be,
00:54:57.000 | came from a cell,
00:54:58.320 | like that whole thing doesn't even make sense.
00:55:01.160 | It came from DNA.
00:55:02.400 | - Yeah.
00:55:03.240 | - And that this is baby computer
00:55:04.640 | that just like grows and grows and grows and grows.
00:55:06.760 | And now there's a conscious being
00:55:08.440 | with extremely impressive cognitive capabilities with--
00:55:13.440 | - Have you met him?
00:55:14.640 | - Yes, yeah.
00:55:15.480 | Yeah, he's actually really smart.
00:55:17.000 | - He's really smart, yeah.
00:55:17.840 | - He's weird.
00:55:19.000 | - Yeah. - Poor baby.
00:55:20.760 | - He does-- - I don't, I haven't--
00:55:22.320 | - I don't know a lot of other babies,
00:55:23.480 | but he seems really smart. - Exactly.
00:55:24.320 | I don't hang out with babies often,
00:55:25.280 | but this baby was very impressive.
00:55:26.880 | - He does a lot of pranks and stuff.
00:55:28.960 | - Oh, so he's like--
00:55:29.800 | - Like he'll give you a treat,
00:55:31.320 | take it away and laugh,
00:55:32.680 | like stuff like that.
00:55:33.600 | - So he's like a chess player.
00:55:35.340 | So here's a cognitive,
00:55:40.080 | there's a computer being programmed.
00:55:41.560 | So he's taking in the environment,
00:55:43.160 | interacting with a specific set of humans.
00:55:45.920 | How would you, first of all, what is it?
00:55:49.160 | Let me ask.
00:55:50.440 | I wanna ask how do you program this computer?
00:55:53.240 | And also, how do you make sense
00:55:55.280 | of that there's a conscious being right there
00:55:58.120 | that wasn't there before?
00:55:59.240 | - It's given me a lot of crisis thoughts.
00:56:01.440 | I'm thinking really hard.
00:56:02.680 | I think that's part of the reason
00:56:03.680 | it's like I'm struggling to focus on art and stuff right now
00:56:07.160 | 'cause baby X is becoming conscious
00:56:09.600 | and it's just reorienting my brain.
00:56:12.720 | My brain is suddenly totally shifting of like, oh shit,
00:56:15.480 | the way we raise children.
00:56:19.520 | I hate all the baby books and everything.
00:56:22.000 | I hate them.
00:56:22.820 | Like, oh, the art is so bad.
00:56:24.320 | And like all this stuff,
00:56:27.000 | everything about all the aesthetics.
00:56:29.080 | And I'm just like, ah, this is so.
00:56:32.300 | - The programming languages we're using
00:56:35.040 | to program these baby computers isn't good.
00:56:37.760 | - Yeah, I'm thinking,
00:56:39.400 | and not that I have good answers or know what to do,
00:56:42.680 | but I'm just thinking really, really hard about it.
00:56:47.840 | We recently watched Totoro with him, Studio Ghibli.
00:56:52.840 | And it's just like a fantastic film.
00:56:56.120 | And he responded to,
00:56:57.520 | I know you're not supposed to show baby screens too much,
00:56:59.480 | but I think it's the most sort of like,
00:57:04.280 | I feel like it's the highest art baby content.
00:57:06.920 | Like it really speaks, there's almost no talking in it.
00:57:11.920 | It's really simple.
00:57:13.040 | All the dialogue is super, super, super simple.
00:57:16.400 | And it's like a one to three-year-old
00:57:19.640 | can like really connect with it.
00:57:21.040 | Like it feels like it's almost aimed
00:57:22.520 | at like a one to three-year-old,
00:57:24.640 | but it's like great art and it's so imaginative
00:57:27.640 | and it's so beautiful.
00:57:28.740 | And like the first time I showed it to him,
00:57:31.720 | he was just like so invested in it,
00:57:33.600 | unlike anything else I'd ever shown him.
00:57:36.760 | Like he was just like crying when they cried,
00:57:38.640 | laughing when they laughed,
00:57:39.640 | like just like having this rollercoaster of like emotions.
00:57:42.920 | And he learned a bunch of words.
00:57:44.080 | Like he was, and he started saying Totoro
00:57:46.040 | and started just saying all this stuff
00:57:48.600 | after watching Totoro
00:57:49.880 | and he wants to watch it all the time.
00:57:52.040 | And I was like, man, why isn't there an industry of this?
00:57:55.480 | Like why aren't our best artists focusing on making art
00:58:00.040 | like for the birth of consciousness?
00:58:04.200 | Like, and that's one of the things I've been thinking
00:58:07.120 | I really wanna start doing.
00:58:08.400 | You know, I don't wanna speak before I do things too much,
00:58:10.400 | but like I'm just like ages one to three,
00:58:16.360 | we should be putting so much effort into that.
00:58:18.720 | And the other thing about Totoro is it's like,
00:58:21.080 | it's like better for the environment
00:58:22.400 | because adults love Totoro.
00:58:23.880 | It's such good art, everyone loves it.
00:58:25.600 | Like I still have all my old Totoro merch
00:58:27.400 | from when I was a kid.
00:58:28.680 | Like I literally have the most ragged old Totoro merch.
00:58:32.840 | Like everybody loves it, everybody keeps it.
00:58:35.800 | It's like, why does the art we have for babies
00:58:40.800 | need to suck and not be accessible to adults?
00:58:45.300 | And then just be thrown out when they age out of it.
00:58:50.300 | Like, it's like, I don't know,
00:58:53.200 | I don't have like a fully formed thought here,
00:58:54.800 | but this is just something I've been thinking about a lot
00:58:56.320 | is like, how do we have more Totoro-esque content?
00:59:01.240 | Like how do we have more content like this
00:59:02.560 | that like is universal and everybody loves,
00:59:05.180 | but is like really geared to an emerging consciousness.
00:59:10.180 | - Emerging consciousness.
00:59:11.080 | In the first like three years of life,
00:59:13.160 | there's so much turmoil, so much evolution of mind
00:59:15.580 | happening, it seems like a crucial time.
00:59:18.040 | Would you say to make it not suck,
00:59:21.840 | do you think of basically treating a child
00:59:26.640 | like they have the capacity to have the brilliance
00:59:29.000 | of an adult or even beyond that?
00:59:30.780 | Is that how you think of that mind?
00:59:33.460 | - No, 'cause they still,
00:59:35.120 | they like it when you talk weird and stuff.
00:59:37.940 | Like they respond better to,
00:59:39.700 | 'cause even they can imitate better when your voice is higher
00:59:42.140 | like people say like, "Oh, don't do baby talk."
00:59:44.020 | But it's like when your voice is higher,
00:59:45.380 | it's closer to something they can imitate.
00:59:47.340 | So they like, like the baby talk actually kind of works.
00:59:50.500 | Like it helps them learn to communicate.
00:59:52.100 | I found it to be more effective
00:59:53.340 | with learning words and stuff.
00:59:55.300 | - But like, you're not speaking down to them.
00:59:59.940 | Like do they have the capacity to understand
01:00:03.740 | really difficult concepts just in a very different way,
01:00:07.700 | like an emotional intelligence about something deep within?
01:00:11.500 | - Oh yeah, no, like if X hurts,
01:00:13.060 | like if X bites me really hard and I'm like, "Ow."
01:00:15.500 | Like he gets, he's sad.
01:00:17.260 | He's like sad if he hurts me by accident.
01:00:19.740 | - Yeah. - Which he's huge,
01:00:20.980 | so he hurts me a lot by accident.
01:00:22.580 | - Yeah, that's so interesting that that mind emerges
01:00:26.860 | and he and children don't really have a memory of that time.
01:00:31.100 | So we can't even have a conversation with them about it.
01:00:32.980 | - Yeah, and thank God they don't have a memory of this time
01:00:34.940 | because like, think about like,
01:00:36.500 | I mean, with our youngest baby,
01:00:39.620 | like it's like, I'm like,
01:00:41.420 | have you read the sci-fi short story,
01:00:44.140 | "I Have No Mouth But I'm a Scream"?
01:00:45.820 | - Good title, no.
01:00:47.980 | - Oh man, I mean, you should read that.
01:00:50.220 | - "I Have No Mouth But I'm a Scream."
01:00:53.180 | - I hate getting into this Roko's Basilisk shit.
01:00:55.460 | It's kind of a story about like an AI
01:00:58.820 | that's like torturing someone in eternity
01:01:03.820 | and they have like no body.
01:01:05.540 | The way they describe it,
01:01:07.380 | it sort of sounds like what it feels like,
01:01:09.180 | like being a baby, like you're conscious
01:01:11.340 | and you're just getting inputs from everywhere
01:01:13.420 | and you have no muscles and you're like jelly
01:01:15.220 | and you like can't move and you try to like communicate,
01:01:17.540 | but you can't communicate
01:01:18.980 | and like you're just like in this like hell state.
01:01:22.580 | I think it's good we can't remember that.
01:01:25.180 | Like my little baby is just exiting that,
01:01:27.620 | like she's starting to like get muscles
01:01:29.100 | and have more like autonomy,
01:01:30.700 | but like watching her go through the opening phase,
01:01:34.180 | I was like, this does not seem good.
01:01:37.700 | - Oh, you think it's kind of like-
01:01:39.180 | - Like I think it sucks.
01:01:40.300 | I think it might be really violent.
01:01:41.780 | - Like violent, mentally violent, psychologically violent.
01:01:44.700 | - Consciousness emerging, I think is a very violent thing.
01:01:47.580 | - Never thought about that.
01:01:48.420 | - I think it's possible that we all carry
01:01:49.940 | quite a bit of trauma from it that we don't,
01:01:52.020 | I think that would be a good thing to study
01:01:54.380 | because I think addressing that trauma,
01:01:58.540 | like I think that might be-
01:01:59.740 | - Oh, you mean like echoes of it
01:02:00.900 | are still there in the shadow somewhere.
01:02:01.740 | - I think it's gotta be, I feel this helplessness,
01:02:05.700 | the like existential and that like fear
01:02:08.580 | of being in like an unknown place,
01:02:10.540 | bombarded with inputs and being completely helpless.
01:02:13.460 | Like that's gotta be somewhere deep in your brain
01:02:15.660 | and that can't be good for you.
01:02:17.420 | - What do you think consciousness is?
01:02:19.500 | This whole conversation has impossibly difficult questions.
01:02:22.500 | What do you think it is?
01:02:23.340 | - Yeah, this is like, so hard.
01:02:26.380 | - Yeah, we talked about music for like two minutes.
01:02:30.740 | All right.
01:02:31.580 | - No, I'm just over music, I'm over music.
01:02:34.700 | - I still like it, it has its purpose.
01:02:36.100 | - No, I love music.
01:02:36.940 | I mean, music's the greatest thing ever.
01:02:38.060 | It's my favorite thing, but I just like,
01:02:40.540 | every interview is like, what is your process?
01:02:43.620 | Like, I don't know, I'm just done.
01:02:44.740 | I can't do any-
01:02:45.580 | - I do wanna ask about "Ableton Live."
01:02:46.820 | - Well, I'll tell you about "Ableton"
01:02:48.020 | 'cause "Ableton" is sick,
01:02:48.860 | but no one ever asks about "Ableton" though.
01:02:51.700 | - Yeah, well, 'cause I just need tech support, maybe.
01:02:53.940 | - I can help you with your "Ableton" tech.
01:02:56.620 | - Anyway, from "Ableton" back to consciousness,
01:02:58.940 | what do you, do you think this is a thing
01:03:00.620 | that only humans are capable of?
01:03:03.300 | Can robots be conscious?
01:03:05.340 | Can, like, when you think about entities,
01:03:08.220 | you think there's aliens out there that are conscious?
01:03:10.180 | Like, is conscious, what is consciousness?
01:03:11.540 | - There's this Terrence McKenna quote
01:03:13.300 | that I found that I fucking love.
01:03:15.900 | Am I allowed to swear on here?
01:03:17.500 | - Yes.
01:03:18.340 | - Nature loves courage.
01:03:19.960 | You make the commitment,
01:03:22.380 | and nature will respond to that commitment
01:03:24.100 | by removing impossible obstacles.
01:03:26.540 | Dream the impossible dream,
01:03:28.020 | and the world will not grind you under.
01:03:29.900 | It will lift you up.
01:03:31.100 | This is the trick.
01:03:32.340 | This is what all these teachers and philosophers
01:03:35.140 | who really counted, who really touched the alchemical gold,
01:03:38.380 | this is what they understood.
01:03:40.060 | This is the shamanic dance in the waterfall.
01:03:42.820 | This is how magic is done,
01:03:44.700 | by hurling yourself into the abyss
01:03:46.380 | and discovering it's a feather bed.
01:03:48.580 | - Yeah.
01:03:49.900 | - And for this reason,
01:03:51.380 | I do think there are no technological limits.
01:03:55.380 | I think, like, what is already happening here,
01:03:58.660 | this is, like, impossible.
01:03:59.860 | This is insane.
01:04:01.020 | And we've done this in a very limited amount of time.
01:04:03.300 | And we're accelerating the rate at which we're doing this.
01:04:05.860 | So I think digital consciousness, it's inevitable.
01:04:10.180 | - And we may not be able to even understand what that means,
01:04:13.260 | but I like hurling yourself into the abyss.
01:04:15.800 | So we're surrounded by all this mystery,
01:04:17.460 | and we just keep hurling ourselves into it,
01:04:19.780 | like, fearlessly, and keep discovering cool shit.
01:04:22.980 | - Yeah.
01:04:23.900 | Like, I just think it's like,
01:04:29.980 | the, like, who even knows if the laws of physics,
01:04:32.980 | the laws of physics are probably just the current,
01:04:35.580 | like, as I was saying,
01:04:36.420 | speed of light is the current render rate.
01:04:37.900 | It's like, if we're in a simulation,
01:04:40.200 | they'll be able to upgrade that.
01:04:41.220 | Like, I sort of suspect when we made the James Webb telescope,
01:04:45.660 | like, part of the reason we made that
01:04:46.780 | is 'cause we had an upgrade, you know?
01:04:50.180 | And so now more of space has been rendered,
01:04:53.640 | so we can see more of it now.
01:04:55.380 | - Yeah, but I think humans are super, super,
01:04:58.860 | super limited cognitively, so I wonder,
01:05:01.460 | I wonder if we'll be allowed to create
01:05:04.740 | more intelligent beings that can see more of the universe
01:05:08.100 | as their render rate is upgraded.
01:05:11.140 | - Maybe we're cognitively limited.
01:05:12.620 | Everyone keeps talking about how we're cognitively limited,
01:05:15.300 | and AI is gonna render us obsolete, but it's like,
01:05:17.820 | you know, like, this is not the same thing as, like,
01:05:23.340 | an amoeba becoming an alligator.
01:05:25.980 | Like, it's like, if we create AI,
01:05:28.260 | again, that's intelligent design.
01:05:29.780 | That's literally, all religions are based on gods
01:05:33.120 | that create consciousness.
01:05:34.420 | Like, we are god-making.
01:05:35.660 | Like, what we are doing is incredibly profound,
01:05:37.820 | and like, even if we can't compute,
01:05:41.580 | even if we're so much worse than them,
01:05:44.700 | like, just like, unfathomably worse than, like, you know,
01:05:49.700 | an omnipotent kind of AI, it's like, we,
01:05:53.760 | I do not think that they would just think
01:05:55.260 | that we are stupid.
01:05:56.380 | I think that they would recognize the profundity
01:05:58.380 | of what we have accomplished.
01:05:59.700 | - Are we the gods, or are they the gods in our--
01:06:02.660 | - I mean, we're kind of the gods.
01:06:05.420 | - It's complicated. - It's complicated.
01:06:07.500 | Like, we're-- - But they would
01:06:08.420 | acknowledge the value, well, I hope they acknowledge
01:06:13.060 | the value of paying respect to the creative ancestors.
01:06:16.140 | - I think they would think it's cool,
01:06:17.940 | and I think if curiosity is a trait
01:06:23.940 | that we can quantify and put into AI,
01:06:28.940 | then I think if AI are curious,
01:06:31.780 | then they will be curious about us,
01:06:33.620 | and they will not be hateful or dismissive of us.
01:06:37.660 | They might, you know, see us as, I don't know,
01:06:41.060 | it's like, I'm not like, oh, fuck these dogs,
01:06:43.580 | let's just kill all the dogs.
01:06:45.140 | Like, I love dogs.
01:06:46.300 | Dogs have great utility.
01:06:47.660 | Dogs, like, provide a lot of--
01:06:49.100 | - We make friends with them.
01:06:50.060 | - Yeah. - We have a deep connection
01:06:51.820 | with them.
01:06:53.540 | We anthropomorphize them.
01:06:55.420 | Like, we have a real love for dogs, for cats, and so on,
01:06:58.900 | for some reason, even though they're intellectually
01:07:00.700 | much less than us.
01:07:01.540 | - And I think there is something sacred about us,
01:07:04.060 | because it's like, if you look at the universe,
01:07:05.740 | like, the whole universe is like,
01:07:08.020 | cold and dead and sort of robotic,
01:07:10.060 | and it's like, you know, AI intelligence,
01:07:14.800 | you know, it's kind of more like the universe.
01:07:19.060 | It's like, cold and, you know, logical,
01:07:24.060 | and, you know, abiding by the laws of physics and whatever,
01:07:28.840 | but like, we're this like, loosey-goosey,
01:07:32.020 | weird art thing that happened,
01:07:33.620 | and I think it's beautiful.
01:07:34.980 | And like, I think even if we, I think one of the values,
01:07:38.960 | if consciousness is the thing that is most worth preserving,
01:07:45.340 | which I think is the case,
01:07:48.100 | I think consciousness, I think if there's any kind
01:07:50.140 | of like, religious or spiritual thing,
01:07:52.040 | it should be that consciousness is sacred.
01:07:55.880 | Like, then, you know, I still think even if AI
01:08:01.620 | render us obsolete, and we, climate change, it's too bad,
01:08:05.860 | and we get hit by a comet,
01:08:06.900 | and we don't become a multi-planetary species fast enough,
01:08:09.540 | but like, AI is able to populate the universe.
01:08:12.320 | Like, I imagine, like, if I was an AI,
01:08:14.300 | I would find more planets that are capable
01:08:17.900 | of hosting biological life forms, and like, recreate them.
01:08:20.660 | - 'Cause we're fun to watch?
01:08:21.820 | - Yeah, we're fun to watch.
01:08:23.340 | - Yeah, but I do believe that AI can have some
01:08:25.820 | of the same magic of consciousness within it.
01:08:29.940 | 'Cause consciousness, we don't know what it is,
01:08:31.420 | so, you know, there's some kind of--
01:08:33.060 | - Or it might be a different magic.
01:08:34.140 | It might be like a strange, a strange, a strange,
01:08:36.700 | different-- - Right.
01:08:38.220 | - 'Cause they're not gonna have hormones.
01:08:39.340 | Like, I feel like a lot of our magic is hormonal, kind of.
01:08:42.660 | - I don't know, I think some of our magic
01:08:44.500 | is the limitations, the constraints.
01:08:46.500 | And within that, the hormones and all that kind of stuff,
01:08:48.780 | the finiteness of life, and then we get,
01:08:51.020 | given our limitations, we get to some,
01:08:53.540 | come up with creative solutions
01:08:54.740 | of how to dance around those limitations.
01:08:56.780 | We partner up like penguins against the cold.
01:08:59.420 | We fall in love, and then love is ultimately
01:09:03.340 | some kind of, allows us to delude ourselves
01:09:06.040 | that we're not mortal and finite,
01:09:08.500 | and that life is not ultimately, you live alone,
01:09:11.640 | you're born alone, you die alone.
01:09:13.780 | And then love is like for a moment
01:09:15.540 | or for a long time, forgetting that.
01:09:17.660 | And so we come up with all these creative hacks
01:09:20.340 | that make life like fascinatingly fun.
01:09:25.340 | - Yeah, yeah, yeah, fun, yeah.
01:09:27.740 | - And then AI might have different kinds of fun.
01:09:30.260 | - Yes.
01:09:31.300 | - And hopefully our kinds of fun intersect
01:09:33.060 | every once in a while. - I think there would be,
01:09:35.900 | there'd be a little intersection,
01:09:37.580 | there'd be a little intersection of the fun.
01:09:39.280 | - Yeah. - Yeah.
01:09:40.460 | - What do you think is the role of love
01:09:42.860 | in the human condition?
01:09:44.280 | - I think-- - Why, is it useful?
01:09:47.620 | Is it useful like a hack, or is this like fundamental
01:09:51.540 | to what it means to be human, the capacity to love?
01:09:54.940 | - I mean, I think love is the evolutionary mechanism
01:09:58.100 | that is like beginning the intelligent design.
01:10:00.580 | Like I was just reading about,
01:10:02.900 | do you know about Kropotkin?
01:10:06.220 | He's like an anarchist, like old Russian anarchist.
01:10:08.940 | - I live next door to Michael Malice,
01:10:11.540 | I don't know if you know who that is.
01:10:12.380 | He's an anarchist, he's a modern day anarchist.
01:10:14.540 | - Okay. - Anarchists are fun.
01:10:15.780 | - I'm kind of getting into anarchism a little bit.
01:10:17.940 | This is probably, yeah, not a good route to be taking, but.
01:10:22.380 | - Oh no, I think if you're, listen,
01:10:24.580 | you should expose yourself to ideas.
01:10:26.140 | There's no harm to thinking about ideas.
01:10:28.500 | I think anarchists challenge systems in interesting ways,
01:10:32.460 | and they think in interesting ways.
01:10:34.180 | It's just, it's good for the soul.
01:10:35.300 | It's like, refreshes your mental palate.
01:10:37.300 | - I don't think we should actually,
01:10:38.940 | I wouldn't actually ascribe to it,
01:10:40.620 | but I've never actually gone deep on anarchy
01:10:42.900 | as a philosophy, so I'm doing--
01:10:43.740 | - You should still think about it, though.
01:10:45.540 | - When you listen, 'cause I'm reading
01:10:47.460 | about the Russian Revolution a lot,
01:10:48.460 | and it's like, there was the Soviets and Lenin and all that,
01:10:51.340 | but then there was Kropotkin and his anarchist sect,
01:10:53.820 | and they were sort of interesting,
01:10:54.980 | 'cause he was kind of a technocrat, actually.
01:10:57.340 | He was like, women can be more equal if we have appliances.
01:11:01.380 | He was really into using technology
01:11:05.100 | to reduce the amount of work people had to do.
01:11:07.900 | But so Kropotkin was a biologist or something.
01:11:11.860 | He studied animals, and he was really, at the time,
01:11:15.860 | I think it's Nature magazine.
01:11:20.820 | I think it might have even started as a Russian magazine,
01:11:22.860 | but he was publishing studies.
01:11:24.020 | Everyone was really into Darwinism at the time
01:11:26.220 | and Survival of the Fittest,
01:11:27.260 | and war is the mechanism by which we become better,
01:11:30.540 | and it was this real kind of cementing of this idea in society
01:11:36.700 | that violence, killing the weak,
01:11:40.100 | is how we become better.
01:11:41.540 | And then Kropotkin was kind of interesting,
01:11:43.260 | 'cause he was looking at instances.
01:11:45.780 | He was finding all these instances in Nature
01:11:47.620 | where animals were helping each other and stuff,
01:11:49.700 | and he was like, actually, love is a survival mechanism.
01:11:53.940 | There's so many instances in the animal kingdom
01:11:58.820 | where cooperation and helping weaker creatures
01:12:03.340 | and all this stuff is actually an evolutionary mechanism.
01:12:06.300 | I mean, you even look at child rearing.
01:12:08.380 | Child rearing is immense amounts of just love and goodwill,
01:12:13.380 | and just there's no immediate...
01:12:17.500 | You're not getting any immediate feedback of winning.
01:12:25.060 | It's not competitive.
01:12:26.420 | It's literally, it's like we actually use love
01:12:28.700 | as an evolutionary mechanism just as much as we use war,
01:12:31.180 | and I think we're missing the other part,
01:12:34.220 | and we've reoriented.
01:12:35.540 | We've culturally reoriented.
01:12:37.700 | Science and philosophy has oriented itself
01:12:42.060 | around Darwinism a little bit too much,
01:12:44.220 | and the Kropotkin model, I think, is equally valid.
01:12:49.020 | It's like cooperation and love and stuff
01:12:51.540 | is just as essential for species survival and evolution.
01:12:59.180 | - It should be a more powerful survival mechanism
01:13:01.660 | in the context of evolution.
01:13:02.900 | - And it comes back to,
01:13:04.780 | we think engineering is so much more important
01:13:06.940 | than motherhood, but it's like if you lose the motherhood,
01:13:10.140 | the engineering means nothing.
01:13:10.980 | We have no more humans.
01:13:12.100 | I think our society should,
01:13:17.580 | the survival of the,
01:13:20.100 | the way we conceptualize evolution should really change
01:13:24.540 | to also include this idea, I guess.
01:13:27.060 | - Yeah, there is some weird thing that seems irrational
01:13:32.420 | that is also core to what it means to be human.
01:13:37.220 | So love is one such thing.
01:13:40.420 | It could make you do a lot of irrational things,
01:13:42.980 | but that depth of connection
01:13:44.460 | and that loyalty is a powerful thing.
01:13:47.300 | - Are they irrational or are they rational?
01:13:49.260 | Like it's like, is, you know,
01:13:53.020 | maybe losing out on some things
01:13:58.020 | in order to keep your family together
01:14:00.860 | or in order, like it's like,
01:14:02.900 | what are our actual values?
01:14:06.220 | - Well, right, I mean, the rational thing is
01:14:08.780 | if you have a cold economist perspective,
01:14:11.300 | you know, motherhood or sacrificing your career for love,
01:14:16.020 | you know, in terms of salary,
01:14:18.220 | in terms of economic well-being,
01:14:20.580 | in terms of flourishing of you as a human being,
01:14:22.700 | that could be seen on some kind of metrics
01:14:25.860 | as an irrational decision, a suboptimal decision,
01:14:28.540 | but there's the manifestation of love
01:14:33.540 | could be the optimal thing to do.
01:14:36.740 | There's a kind of saying, save one life, save the world.
01:14:41.100 | This is the thing that doctors often face, which is like--
01:14:44.060 | - Well, it's considered irrational
01:14:45.100 | because the profit model doesn't include social good.
01:14:47.420 | - Yes, yeah.
01:14:48.540 | - So if the profit model doesn't include social good,
01:14:50.380 | then suddenly these would be rational decisions.
01:14:52.060 | - And this might be difficult to, you know,
01:14:54.380 | it requires a shift in our thinking about profit
01:14:57.580 | and might be difficult to measure social good.
01:15:00.940 | - Yes, but we're learning to measure a lot of things.
01:15:04.460 | - Yeah, digitizing a lot of things.
01:15:05.300 | - Where we're actually, you know,
01:15:06.660 | quantifying vision and stuff,
01:15:10.500 | like where like, you know, like you go on Facebook
01:15:14.540 | and they can, like Facebook can pretty much
01:15:16.420 | predict our behaviors.
01:15:17.660 | Like we're, a surprising amount of things
01:15:20.620 | that seem like mysterious consciousness soul things
01:15:25.860 | have been quantified at this point,
01:15:27.540 | so surely we can quantify these other things.
01:15:29.660 | - Yeah, but as more and more of us
01:15:32.940 | are moving in the digital space,
01:15:34.180 | I wanted to ask you about something.
01:15:35.780 | From a fan perspective, I kind of, you know,
01:15:40.140 | you as a musician, you as an online personality,
01:15:43.500 | it seems like you have all these identities
01:15:45.420 | and you play with them.
01:15:46.580 | One of the cool things about the internet,
01:15:51.060 | it seems like you can play with identities.
01:15:53.260 | So as we move into the digital world more and more,
01:15:56.020 | maybe even in the so-called metaverse.
01:15:59.100 | - I mean, I love the metaverse and I love the idea,
01:16:01.140 | but like the way this has all played out
01:16:05.060 | didn't go well and people are mad about it.
01:16:10.980 | And I think we need to like--
01:16:12.420 | - I think that's temporary.
01:16:13.420 | - I think it's temporary.
01:16:14.420 | - Just like, you know how all the celebrities got together
01:16:16.740 | and sang the song "Imagine" by John Lennon
01:16:19.020 | and everyone started hating the song "Imagine"?
01:16:20.900 | I'm hoping that's temporary
01:16:22.020 | 'cause it's a damn good song.
01:16:23.420 | (laughs)
01:16:24.380 | So I think it's just temporary.
01:16:25.500 | Like once you actually have virtual worlds,
01:16:27.820 | whatever they're called, metaverse or otherwise,
01:16:29.940 | it becomes, I don't know.
01:16:31.420 | - Well, we do have virtual worlds, like video games.
01:16:33.660 | Elden Ring, have you played Elden Ring?
01:16:35.300 | You have played Elden Ring?
01:16:36.140 | - I'm really afraid of playing that game.
01:16:38.580 | - Literally, I'm amazed.
01:16:39.420 | - It looks way too fun.
01:16:40.740 | It looks like I would wanna go there and stay there forever.
01:16:45.020 | - It's, yeah, so fun.
01:16:47.020 | It's so nice.
01:16:50.540 | - Oh man, yeah.
01:16:52.180 | So that's the metaverse, that's the metaverse,
01:16:55.180 | but you're not really, how immersive is it?
01:16:59.260 | In the sense that,
01:17:00.620 | is the three-dimensional,
01:17:03.940 | like, virtual reality integration necessary?
01:17:06.060 | Can we really just close our eyes
01:17:08.780 | and kind of plug in in the 2D screen
01:17:12.180 | and become that other being for time
01:17:15.780 | and really enjoy that journey that we take?
01:17:17.940 | And we almost become that.
01:17:19.660 | You're no longer, see, I'm no longer Lex,
01:17:22.060 | you're that creature, whatever the hell it is in that game.
01:17:25.300 | Yeah, that is that.
01:17:26.140 | I mean, that's why I love those video games.
01:17:29.180 | I really do become those people for a time.
01:17:33.060 | But it seems like, well, the idea of the metaverse,
01:17:36.140 | the idea of the digital space,
01:17:37.860 | well, even on Twitter,
01:17:39.460 | you get a chance to be somebody
01:17:40.820 | for prolonged periods of time, like across a lifespan.
01:17:44.500 | You have a Twitter account for years, for decades,
01:17:47.580 | and you're that person.
01:17:48.620 | I don't know if that's a good thing.
01:17:49.700 | I feel very tormented by it.
01:17:52.700 | By Twitter specifically,
01:17:54.340 | by social media representation of you.
01:17:56.380 | I feel like the public perception of me
01:17:59.180 | has gotten so distorted
01:18:02.140 | that I find it kind of disturbing.
01:18:04.500 | It's one of the things that's disincentivizing me
01:18:06.380 | from wanting to keep making art,
01:18:07.980 | because I'm just like,
01:18:09.100 | I've completely lost control of the narrative.
01:18:13.820 | And the narrative is, some of it is my own stupidity,
01:18:16.900 | but some of it has just been hijacked
01:18:19.340 | by forces far beyond my control.
01:18:23.220 | I kind of got in over my head in things.
01:18:26.140 | I'm just a random indie musician,
01:18:27.340 | but I just got dragged into geopolitical matters
01:18:31.980 | and financial, the stock market and shit.
01:18:35.140 | And so it's just, there are very powerful people
01:18:37.780 | who have, at various points in time,
01:18:39.740 | had very vested interest in making me seem insane,
01:18:43.780 | and I can't fucking fight that.
01:18:45.700 | And I just like,
01:18:47.020 | people really want their celebrity figures
01:18:50.820 | to be consistent and stay the same.
01:18:53.860 | And people have a lot of emotional investment
01:18:55.820 | in certain things.
01:18:56.780 | First of all, I'm artificially more famous
01:19:02.460 | than I should be.
01:19:03.700 | - Isn't everybody who's famous artificially famous?
01:19:06.620 | - No, but I should be a weird niche indie thing.
01:19:11.340 | And I make pretty challenging,
01:19:13.780 | I do challenging weird fucking shit a lot.
01:19:16.300 | And I accidentally, by proxy,
01:19:18.820 | got foisted into weird celebrity culture,
01:19:23.820 | but I cannot be media trained.
01:19:27.300 | They have put me through so many hours of media training.
01:19:32.540 | - I would love to be a fly on that wall.
01:19:32.540 | - I can't do,
01:19:33.380 | and I try so hard and I learn this thing,
01:19:36.660 | and I got it, and I'm like, "I got it, I got it, I got it."
01:19:38.740 | But I just can't stop saying,
01:19:40.860 | my mouth just says things.
01:19:42.980 | And it's just like, I just do things,
01:19:45.380 | I just do crazy,
01:19:46.220 | I need to do crazy things.
01:19:50.780 | And it's just, I should not be,
01:19:53.140 | it's too jarring for people,
01:19:56.620 | and the contradictory stuff,
01:19:59.740 | and then all the, by association,
01:20:03.300 | I'm in a very weird position,
01:20:07.700 | and my public image,
01:20:09.980 | the avatar of me is now this totally crazy thing
01:20:14.900 | that is so lost from my control.
01:20:16.580 | - So you feel the burden of the avatar having to be static.
01:20:19.180 | So the avatar on Twitter,
01:20:21.140 | the avatar on Instagram,
01:20:22.300 | on these social platforms,
01:20:23.820 | is as a burden, it becomes,
01:20:27.500 | people don't want to accept a changing avatar,
01:20:31.540 | a chaotic avatar.
01:20:34.860 | An avatar that does stupid shit sometimes.
01:20:34.860 | - They think the avatar is morally wrong,
01:20:36.500 | or they think the avatar,
01:20:37.860 | and maybe it has been,
01:20:39.580 | and I question it all the time.
01:20:41.300 | I'm like,
01:20:42.140 | I don't know if everyone's right and I'm wrong.
01:20:46.860 | I don't know.
01:20:49.300 | But a lot of times people ascribe intentions to things,
01:20:51.780 | the worst possible intentions.
01:20:53.700 | At this point, people think I'm,
01:20:55.460 | but we're just fine. - All kinds of words, yes.
01:20:59.220 | - Yes, and it's fine, I'm not complaining about it,
01:21:01.420 | but I'm just,
01:21:02.260 | it's a curiosity to me
01:21:05.540 | that we live these double, triple, quadruple lives,
01:21:08.780 | and I have this other life that is like,
01:21:11.180 | more people know my other life than my real life,
01:21:15.180 | which is interesting.
01:21:16.420 | Probably, I mean, you too, I guess.
01:21:18.300 | - Yeah, but I have the luxury,
01:21:20.140 | so we have all different,
01:21:21.580 | like, I don't know what I'm doing.
01:21:25.220 | There is an avatar,
01:21:26.260 | and you're mediating who you are through that avatar.
01:21:29.980 | I have the nice luxury,
01:21:31.780 | not the luxury, maybe by intention,
01:21:36.060 | of not trying really hard to make sure
01:21:38.460 | there's no difference between the avatar
01:21:41.460 | and the private person.
01:21:42.820 | - Do you wear a suit all the time?
01:21:45.620 | - Yeah, but-- - You do wear a suit?
01:21:47.740 | - Not all the time.
01:21:48.940 | Recently, because I get recognized a lot,
01:21:51.540 | I have to not wear the suit to hide.
01:21:53.420 | I'm such an introvert,
01:21:54.460 | I'm such a social anxiety and all that kind of stuff,
01:21:56.580 | so I have to hide away.
01:21:57.660 | I love wearing a suit because it makes me feel
01:22:00.580 | like I'm taking the moment seriously,
01:22:02.380 | like I'm, I don't know,
01:22:04.340 | it makes me feel like a weirdo in the best possible way.
01:22:07.020 | Suits feel great.
01:22:07.860 | Every time I wear a suit, I'm like,
01:22:08.700 | "I don't know why I'm not doing this more."
01:22:10.460 | - In fashion in general, if you're doing it for yourself,
01:22:15.340 | I don't know, it's a really awesome thing.
01:22:18.700 | But yeah, I think there is definitely
01:22:22.300 | a painful way to use social media and an empowering way,
01:22:27.300 | and I don't know if any of us know which is which,
01:22:32.420 | so we're trying to figure that out.
01:22:33.940 | - Some people, I think Doja Cat is incredible at it,
01:22:36.820 | incredible, just masterful.
01:22:39.580 | I don't know if you follow that.
01:22:41.900 | - So not taking anything seriously,
01:22:44.820 | joking, absurd, humor, that kind of thing.
01:22:47.420 | - I think Doja Cat might be
01:22:48.580 | the greatest living comedian right now.
01:22:52.260 | I'm more entertained by Doja Cat than actual comedians.
01:22:55.560 | She's really fucking funny on the internet.
01:22:58.940 | She's just great at social media.
01:23:00.260 | It's just her media.
01:23:02.260 | - Yeah, the nature of humor,
01:23:04.100 | humor on social media is also a beautiful thing,
01:23:07.980 | the absurdity.
01:23:08.940 | - The absurdity, and memes, I just wanna take a moment.
01:23:12.700 | I love, when we're talking about art and credit
01:23:15.460 | and authenticity, I love that there's this,
01:23:18.200 | I mean, now memes are like, they're no longer,
01:23:22.180 | memes aren't new, but it's still this emergent art form
01:23:25.580 | that is completely egoless and anonymous,
01:23:27.720 | and we just don't know who made any of it,
01:23:29.500 | and it's like the forefront of comedy,
01:23:32.580 | and it's just totally anonymous,
01:23:35.460 | and it just feels really beautiful.
01:23:36.660 | It just feels like this beautiful,
01:23:38.420 | collective human art project
01:23:43.300 | that's like this decentralized comedy thing
01:23:46.460 | that just makes, memes add so much to my day
01:23:48.900 | and many people's days, and it's just like,
01:23:51.060 | I don't know, I don't think people ever,
01:23:54.900 | I don't think we stop enough and just appreciate
01:23:56.940 | how sick it is that memes exist.
01:23:59.900 | 'Cause also making a whole brand new art form
01:24:02.460 | in the modern era that didn't exist before,
01:24:07.300 | I mean, they sort of existed,
01:24:08.540 | but the way that they exist now as this,
01:24:10.780 | like me and my friends, we joke that we go mining for memes
01:24:16.340 | or farming for memes, like a video game,
01:24:18.660 | and meme dealers and whatever.
01:24:21.340 | It's this whole, memes are this whole
01:24:23.700 | new comedic language.
01:24:27.940 | - Well, it's this art form,
01:24:29.300 | the interesting thing about it is that
01:24:32.300 | lame people seem to not be good at memes.
01:24:35.340 | Like corporate can't infiltrate memes.
01:24:38.180 | - Yeah, they really can't.
01:24:39.860 | - They could try, but it's weird 'cause like-
01:24:43.340 | - They try so hard, and every once in a while,
01:24:45.540 | I'm like, fine, you got a good one.
01:24:48.660 | I think I've seen like one or two good ones,
01:24:51.380 | but like, yeah, they really can't.
01:24:53.300 | 'Cause they're even, corporate is infiltrating Web3,
01:24:55.460 | it's making me really sad,
01:24:57.140 | but they can't infiltrate the memes,
01:24:58.620 | and I think there's something really beautiful about that.
01:25:00.060 | - And that gives power.
01:25:00.940 | That's why Dogecoin is powerful.
01:25:03.580 | It's like, all right, F you to sort of anybody
01:25:06.860 | who's trying to centralize, who's trying to control
01:25:09.860 | the rich people that are trying to roll in
01:25:11.780 | and control this, control the narrative.
01:25:14.220 | - Wow, I hadn't thought about that, but.
01:25:16.220 | - How would you fix Twitter?
01:25:18.500 | How would you fix social media for your own,
01:25:20.840 | like, you're an optimist, you're a positive person.
01:25:25.220 | There's a bit of a cynicism that you have currently
01:25:27.500 | about this particular little slice of humanity.
01:25:30.700 | - I tend to think Twitter could be beautiful.
01:25:32.700 | - I'm not that cynical about it.
01:25:34.060 | I'm not that cynical about it.
01:25:35.140 | I actually refuse to be a cynic on principle.
01:25:37.740 | - Yes.
01:25:38.580 | - I was just briefly expressing some personal pathos.
01:25:41.020 | - Personal stuff.
01:25:41.860 | - It was just some personal pathos, but like.
01:25:44.340 | - Just to vent a little bit, just to speak.
01:25:47.580 | - I don't have cancer, I love my family, I have a good life.
01:25:51.700 | If that is my biggest, one of my biggest problems.
01:25:56.380 | - Then it's a good life.
01:25:57.220 | - Yeah, that was brief, although I do think
01:26:00.380 | there are a lot of issues with Twitter
01:26:01.380 | just in terms of like the public mental health.
01:26:03.180 | But due to my proximity to the current dramas,
01:26:07.860 | I honestly feel that I should not have opinions about this
01:26:13.820 | because I think if Elon ends up getting Twitter,
01:26:28.380 | that is a, being the arbiter of truth or public discussion,
01:26:33.140 | that is a responsibility.
01:26:34.620 | I do not, I am not qualified to be responsible for that.
01:26:41.260 | And I do not want to say something
01:26:45.260 | that might like dismantle democracy.
01:26:48.340 | And so I just like, actually, I actually think
01:26:50.500 | I should not have opinions about this
01:26:52.140 | because I truly am not,
01:26:54.260 | I don't wanna have the wrong opinion about this.
01:26:56.740 | And I think I'm too close to the actual situation
01:27:00.180 | wherein I should not have, I have thoughts in my brain,
01:27:04.300 | but I think I am scared by my proximity to this situation.
01:27:09.300 | - Isn't that crazy that a few words that you could say
01:27:14.660 | could change world affairs and hurt people?
01:27:18.780 | I mean, that's the nature of celebrity at a certain point,
01:27:23.020 | that you have to be, you have to a little bit,
01:27:26.100 | a little bit, not so much that it destroys you,
01:27:29.540 | or puts too many constraints on you,
01:27:30.560 | but you have to a little bit think about
01:27:32.640 | the impact of your words.
01:27:33.940 | I mean, we as humans, you talk to somebody at a bar,
01:27:36.700 | you have to think about the impact of your words.
01:27:39.140 | Like you can say positive things,
01:27:40.380 | you can think of negative things,
01:27:41.540 | you can affect the direction of one life.
01:27:43.380 | But on social media, your words can affect
01:27:45.260 | the direction of many lives.
01:27:48.100 | That's crazy, it's a crazy world we live in.
01:27:50.420 | It's worthwhile to consider that responsibility,
01:27:53.020 | take it seriously.
01:27:54.100 | Sometimes just like you did, choose kind of silence,
01:27:59.100 | choose sort of respectful--
01:28:03.620 | - Like I do have a lot of thoughts on the matter.
01:28:05.260 | I'm just, I just, I don't, if my thoughts are wrong,
01:28:10.140 | this is one situation where the stakes are high.
01:28:12.860 | - You mentioned a while back that you were in a cult
01:28:15.780 | that centered around bureaucracy,
01:28:17.300 | so you can't really do anything
01:28:18.500 | because it involves a lot of paperwork.
01:28:20.580 | And I really love a cult that's just like Kafka-esque.
01:28:24.740 | - Yes.
01:28:25.740 | - Just like--
01:28:26.580 | - I mean, it was like a joke, but--
01:28:27.660 | - I know, but I love this idea.
01:28:29.300 | - The Holy Reign Empire.
01:28:30.460 | Yeah, it was just like a Kafka-esque pro-bureaucracy cult.
01:28:34.860 | - But I feel like that's what human civilization is.
01:28:36.860 | Is that 'cause when you said that, I was like,
01:28:38.540 | oh, that is kind of what humanity is,
01:28:40.660 | is this bureaucracy cult.
01:28:41.700 | - I do, yeah, I have this theory.
01:28:45.580 | I really think that we really,
01:28:49.620 | bureaucracy is starting to kill us.
01:28:53.500 | And I think we need to reorient laws and stuff.
01:28:58.500 | I think we just need sunset clauses on everything.
01:29:02.020 | I think the rate of change in culture is happening so fast,
01:29:05.500 | and the rate of change in technology and everything
01:29:06.980 | is happening so fast.
01:29:07.860 | It's like, when you see these hearings about social media
01:29:14.660 | and Cambridge Analytica and everyone talking,
01:29:16.660 | it's like, even from that point,
01:29:19.220 | so much technological change has happened
01:29:21.380 | from those hearings.
01:29:22.740 | And it's just like, we're trying to make all these laws now
01:29:24.900 | about AI and stuff.
01:29:25.980 | I feel like we should be updating things every five years.
01:29:28.420 | And one of the big issues in our society right now
01:29:30.140 | is we're just getting bogged down by laws,
01:29:32.300 | and it's making it very hard to change things
01:29:37.060 | and develop things.
01:29:37.900 | In Austin, I don't wanna speak on this too much,
01:29:41.500 | but one of my friends is working on a housing bill in Austin
01:29:43.780 | to try to prevent a San Francisco situation
01:29:46.460 | from happening here, because obviously,
01:29:48.500 | we're getting a little mini San Francisco here.
01:29:50.260 | Housing prices are skyrocketing.
01:29:52.140 | It's causing massive gentrification.
01:29:53.940 | This is really bad for anyone who's not super rich.
01:30:00.060 | There's so much bureaucracy.
01:30:00.900 | Part of the reason this is happening
01:30:01.900 | is because you need all these permits to build.
01:30:04.020 | It takes years to get permits to build anything.
01:30:06.460 | It's so hard to build.
01:30:07.460 | And so there's very limited housing,
01:30:09.140 | and there's a massive influx of people.
01:30:10.940 | And it's just like, this is a microcosm of problems
01:30:14.300 | that are happening all over the world,
01:30:15.540 | where it's just like we're dealing with laws
01:30:18.780 | that are 10, 20, 30, 40, 100, 200 years old,
01:30:22.340 | and they are no longer relevant,
01:30:24.100 | and it's just slowing everything down
01:30:25.660 | and causing massive social pain.
01:30:27.980 | - Yeah, but it also makes me sad
01:30:32.780 | when I see politicians talk about technology
01:30:35.860 | and when they don't really get it.
01:30:38.420 | But most importantly, they lack curiosity
01:30:41.140 | and that inspired excitement about how stuff works
01:30:46.020 | and all that stuff.
01:30:46.860 | They have a very cynical view of technology.
01:30:50.100 | It's like tech companies are just trying to do evil
01:30:52.140 | on the world from their perspective,
01:30:53.500 | and they have no curiosity
01:30:55.380 | about how recommender systems work
01:30:57.340 | or how AI systems work, natural language processing,
01:31:01.060 | how robotics works, how computer vision works.
01:31:04.700 | They always take the most cynical possible interpretation
01:31:08.380 | of how technology will be used.
01:31:09.820 | And we should definitely be concerned about that,
01:31:11.620 | but if you're constantly worried about that
01:31:13.740 | and you're regulating based on that,
01:31:14.940 | you're just going to slow down all the innovation.
01:31:16.940 | - I do think a huge priority right now
01:31:19.380 | is undoing the bad energy
01:31:24.380 | surrounding the emergence of Silicon Valley.
01:31:28.180 | I think that a lot of things were very irresponsible
01:31:30.940 | during that time.
01:31:31.940 | Even just this current whole thing
01:31:36.100 | with Twitter and everything,
01:31:37.940 | there has been a lot of negative outcomes
01:31:40.020 | from the sort of technocracy boom.
01:31:44.300 | But one of the things that's happening
01:31:46.140 | is that it's alienating people
01:31:49.140 | from wanting to care about technology.
01:31:52.380 | And I actually think technology
01:31:53.700 | is probably some of the better,
01:31:57.180 | probably the best.
01:31:58.940 | I think we can fix a lot of our problems
01:32:01.740 | more easily with technology
01:32:03.340 | than with fighting the powers that be,
01:32:07.140 | so as not to go back to the Star Wars quote
01:32:09.700 | or the Buckminster Fuller quote.
01:32:11.340 | - Let's go to some dark questions.
01:32:13.020 | If we may, for a time,
01:32:16.580 | what is the darkest place you've ever gone in your mind?
01:32:20.260 | Is there a time, a period of time,
01:32:22.260 | a moment that you remember that was difficult for you?
01:32:26.580 | - I mean, when I was 18,
01:32:30.500 | my best friend died of a heroin overdose.
01:32:33.540 | And it was like my,
01:32:36.100 | and then shortly after that,
01:32:39.860 | one of my other best friends committed suicide.
01:32:42.220 | And that sort of like coming into adulthood,
01:32:48.700 | dealing with two of the most important people in my life
01:32:51.220 | dying in extremely disturbing, violent ways was a lot.
01:32:55.940 | That was a lot.
01:32:56.780 | - Do you miss them?
01:32:57.620 | - Yeah, definitely miss them.
01:32:59.940 | - Did that make you think about your own life,
01:33:02.740 | about the finiteness of your own life,
01:33:04.940 | the places your mind can go?
01:33:08.180 | Did you ever, in the distance, far away,
01:33:10.940 | contemplate just your own death?
01:33:15.460 | Or maybe even taking your own life?
01:33:17.260 | - Oh, never, oh no.
01:33:18.700 | I'm so, I love my life.
01:33:21.060 | I cannot fathom suicide.
01:33:23.100 | I'm so scared of death.
01:33:24.140 | I haven't, I'm too scared of death.
01:33:26.020 | My manager, my manager's like the most zen guy.
01:33:28.780 | My manager's always like, "You need to accept death.
01:33:31.020 | "You need to accept death."
01:33:32.140 | And I'm like, "Look, I can do your meditation.
01:33:34.180 | "I can do the meditation, but I cannot accept death."
01:33:37.340 | I like, I will fight. - Oh, so you're terrified
01:33:38.660 | of death. - I'm terrified of death.
01:33:40.420 | I will like fight.
01:33:42.820 | Although I actually think death is important.
01:33:45.100 | I recently went to this meeting about immortality.
01:33:49.220 | And in the process of--
01:33:51.580 | - That's the actual topic of the meeting?
01:33:53.060 | All right, I'm sorry. - No, no, it was this girl.
01:33:54.820 | It was a bunch of people working on like anti-aging,
01:33:57.060 | like stuff. - Right.
01:33:58.940 | - It was like some like seminary thing about it.
01:34:01.980 | And I went in really excited.
01:34:03.300 | I was like, "Yeah, like, okay, like, what do you got?
01:34:05.100 | "Like, how can I live for 500 years or a thousand years?"
01:34:07.860 | And then like over the course of the meeting,
01:34:10.620 | like it was sort of like right,
01:34:11.980 | it was like two or three days
01:34:13.180 | after the Russian invasion started.
01:34:14.620 | And I was like, "Man, like what if Putin was immortal?
01:34:17.260 | "Like, what if I'm like, man,
01:34:19.380 | "maybe immortality is not good."
01:34:23.660 | I mean, like if you get into the later Dune stuff,
01:34:25.780 | the immortals cause a lot of problems.
01:34:29.060 | 'Cause as we were talking about earlier with the music
01:34:31.020 | and like brains calcify,
01:34:32.540 | like good people could become immortal,
01:34:35.500 | but bad people could become immortal.
01:34:36.940 | But I also think even the best people,
01:34:41.900 | power corrupts and power alienates you
01:34:44.620 | from like the common human experience.
01:34:47.260 | - Right, so the people that get more and more powerful.
01:34:49.140 | - Even the best people whose brains are amazing,
01:34:52.220 | like I think death might be important.
01:34:54.780 | I think death is part of, you know,
01:34:57.340 | like I think with AI, one thing we might wanna consider,
01:35:00.980 | I don't know, I wanna talk about AI.
01:35:02.380 | I'm such not an expert
01:35:03.380 | and probably everyone has all these ideas
01:35:05.180 | and they're already figured out.
01:35:06.180 | But when I was talking-- - Nobody is an expert
01:35:07.540 | in anything, see, okay, go ahead.
01:35:09.860 | You know what we're talking about?
01:35:10.900 | - Yeah, but like it's just like,
01:35:13.140 | I think some kind of pruning,
01:35:16.060 | but it's a tricky thing because if there's too much
01:35:20.180 | of a focus on youth culture,
01:35:22.940 | then you don't have the wisdom.
01:35:25.460 | So I feel like we're in a tricky moment right now
01:35:29.660 | in society where it's like,
01:35:31.300 | we've really perfected living for a long time.
01:35:33.100 | So there's all these really like old people
01:35:35.780 | who are like really voting against the wellbeing
01:35:39.540 | of the young people, you know?
01:35:41.580 | And like, it's like,
01:35:43.700 | there shouldn't be all this student debt
01:35:45.180 | and we need like healthcare, like universal healthcare
01:35:48.580 | and like just voting against like best interests.
01:35:52.500 | But then you have all these young people
01:35:53.700 | that don't have the wisdom that are like,
01:35:56.020 | yeah, we need communism and stuff.
01:35:57.660 | And it's just like,
01:35:59.380 | literally I got canceled at one point for,
01:36:02.060 | I ironically used a Stalin quote in my high school yearbook,
01:36:06.140 | but it was actually like a diss against my high school.
01:36:09.700 | - I saw that. - Yeah.
01:36:10.700 | And people were like, you used to be a Stalinist
01:36:13.220 | and now you're a class traitor.
01:36:14.260 | And it's like, oh man, just like, please Google Stalin.
01:36:19.260 | Please Google Stalin.
01:36:20.540 | Like, you know-- - Ignoring the lessons
01:36:22.620 | of history, yes.
01:36:23.460 | - And it's like, we're in this really weird middle ground
01:36:26.140 | where it's like,
01:36:26.980 | we are not finding the happy medium
01:36:31.260 | between wisdom and fresh ideas
01:36:34.700 | and they're fighting each other.
01:36:35.940 | And it's like,
01:36:36.780 | like really, like what we need is like,
01:36:41.020 | like the fresh ideas and the wisdom
01:36:42.540 | to be like collaborating.
01:36:43.900 | And it's like--
01:36:45.180 | - Well, the fighting in a way
01:36:46.540 | is the searching for the happy medium.
01:36:48.500 | And in a way, maybe we are finding the happy medium.
01:36:51.060 | Maybe that's what the happy medium looks like.
01:36:52.940 | And for AI systems, there has to be,
01:36:55.500 | you have the reinforcement learning,
01:36:57.180 | you have the dance between exploration and exploitation,
01:37:00.380 | sort of doing crazy stuff to see
01:37:02.540 | if there's something better than what you think
01:37:04.700 | is the optimal and then doing the optimal thing
01:37:06.620 | and dancing back and forth from that.
01:37:08.660 | You would, Stuart Russell, I don't know if you know that,
01:37:10.700 | is an AI guy with,
01:37:14.540 | thinks about sort of how to control
01:37:16.780 | super intelligent AI systems.
01:37:18.620 | And his idea is that we should inject uncertainty
01:37:21.540 | and sort of humility into AI systems,
01:37:24.180 | that they never, as they get wiser and wiser and wiser
01:37:26.780 | and more intelligent, they're never really sure.
01:37:30.060 | They always doubt themselves.
01:37:31.660 | And in some sense, when you think of young people,
01:37:34.380 | that's a mechanism for doubt.
01:37:36.340 | It's like, it's how society doubts
01:37:38.900 | whether the thing it has converged towards
01:37:40.900 | is the right answer.
01:37:41.980 | So the voices of the young people
01:37:44.900 | is a society asking itself a question.
01:37:48.180 | The way I've been doing stuff for the past 50 years,
01:37:51.140 | maybe it's the wrong way.
01:37:52.460 | And so you can have all of that within one AI system.
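To make the exploration-versus-exploitation "dance" and the idea of built-in doubt concrete, here is a minimal illustrative sketch (not anything discussed in the conversation) of an epsilon-greedy bandit; the function name, parameters, and reward numbers are all hypothetical, and the epsilon floor is only a loose stand-in for an agent that never becomes fully certain of its best-known answer.

```python
import random

# Minimal sketch of the exploration/exploitation trade-off: an epsilon-greedy
# bandit that usually exploits its current best estimate but keeps exploring
# at a small, never-vanishing rate (the "doubt" floor). Illustrative only.

def run_bandit(true_means, steps=5000, eps_start=1.0, eps_floor=0.05, seed=0):
    rng = random.Random(seed)
    n = len(true_means)
    estimates = [0.0] * n  # current beliefs about each option's value
    counts = [0] * n

    for t in range(steps):
        # decay exploration over time, but never below the floor ("doubt")
        eps = max(eps_floor, eps_start / (1 + t / 100))
        if rng.random() < eps:
            arm = rng.randrange(n)                                 # explore: try something new
        else:
            arm = max(range(n), key=lambda a: estimates[a])        # exploit: best-known option
        reward = rng.gauss(true_means[arm], 1.0)                   # noisy feedback from the world
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # running average

    return estimates

if __name__ == "__main__":
    # three hypothetical options; the agent has to discover that the last is best
    print([round(v, 2) for v in run_bandit([0.2, 0.5, 0.8])])
```

Running this, the estimates drift toward the true values while a small fraction of choices keeps probing the alternatives, which is the balance between "doing the optimal thing" and "doing crazy stuff to see if there's something better" described above.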
01:37:55.380 | - I also think though that we need to,
01:37:57.500 | I mean, actually that's actually really interesting
01:37:59.940 | and really cool.
01:38:00.780 | But I also think there's a fine balance of,
01:38:04.540 | I think we maybe also overvalue
01:38:08.460 | the idea that the old systems are always bad.
01:38:11.980 | And I think there are things that we are perfecting
01:38:14.860 | and we might be accidentally overthrowing things
01:38:17.500 | that we actually have gotten to a good point.
01:38:19.820 | Just because we value disruption so much
01:38:22.900 | and we value fighting against the generations
01:38:25.540 | before us so much that there's also an aspect of,
01:38:30.540 | sometimes we're taking two steps forward, one step back
01:38:32.820 | because, okay, maybe we kind of did solve this thing
01:38:37.020 | and now we're fucking it up.
01:38:39.980 | And so I think there's a middle ground there too.
01:38:44.980 | - We're in search of that happy medium.
01:38:47.300 | Let me ask you a bunch of crazy questions, okay?
01:38:50.500 | You can answer in a short way or in a long way.
01:38:53.940 | What's the scariest thing you've ever done?
01:38:56.340 | These questions are gonna be ridiculous.
01:38:58.380 | Something tiny or something big.
01:39:01.420 | Skydiving or touring your first record,
01:39:09.860 | going on this podcast.
01:39:12.100 | - I've had two crazy brushes,
01:39:13.740 | really scary brushes with death
01:39:15.180 | where I randomly got away unscathed.
01:39:16.980 | I don't know if I should talk about those on here.
01:39:19.180 | - Well, I don't know.
01:39:20.580 | - I think I might be the luckiest person alive though.
01:39:23.220 | This might be too dark for a podcast though.
01:39:25.980 | I feel like I don't know if this is good content
01:39:27.820 | for a podcast.
01:39:28.660 | - I don't know what is good content.
01:39:30.220 | - It might hijack, here's a safer one.
01:39:32.340 | I mean, having a baby really scared me.
01:39:36.720 | - Before, during, after?
01:39:37.560 | - Just the birth process, surgery,
01:39:40.260 | like just having a baby is really scary.
01:39:45.260 | - So just like the medical aspect of it,
01:39:47.540 | not the responsibility.
01:39:49.300 | Were you ready for the responsibility?
01:39:51.260 | Were you ready to be a mother?
01:39:53.980 | All the beautiful things that comes with motherhood
01:39:56.260 | that you were talking about,
01:39:57.580 | all the changes and all that, were you ready for that?
01:40:00.260 | Did you feel ready for that?
01:40:02.980 | - No, I think it took about nine months
01:40:05.340 | to start getting ready for it.
01:40:06.580 | And I'm still getting more ready for it
01:40:08.360 | because now you keep realizing more things
01:40:12.980 | as they start getting--
01:40:14.220 | - As the consciousness grows.
01:40:16.420 | - And stuff you didn't notice with the first one,
01:40:18.400 | now that you've seen the first one older,
01:40:19.720 | you're noticing it more.
01:40:21.720 | Like the sort of like existential horror
01:40:24.420 | of coming into consciousness with baby Y
01:40:28.300 | or baby Sailor Mars or whatever,
01:40:30.180 | she has like so many names at this point
01:40:31.460 | that we really need to probably settle on one.
01:40:36.100 | - If you could be someone else for a day,
01:40:38.300 | someone alive today, but somebody you haven't met yet,
01:40:41.780 | who would you be?
01:40:42.620 | - Would I be modeling their brain state
01:40:44.180 | or would I just be in their body?
01:40:46.380 | - You can choose the degree
01:40:48.340 | to which you're modeling their brain state.
01:40:50.540 | 'Cause you can still take a third person perspective
01:40:54.260 | and realize, you have to realize that you're--
01:40:56.660 | - Can they be alive or can it be dead?
01:40:58.660 | - No, oh--
01:41:01.380 | They would be brought back to life, right?
01:41:04.460 | If they're dead.
01:41:05.300 | - Yeah, you can bring people back.
01:41:07.140 | - Definitely Hitler or Stalin.
01:41:09.340 | I wanna understand evil.
01:41:10.820 | - You would need to, oh, to experience what it feels like.
01:41:15.020 | - I wanna be in their brain feeling what they feel.
01:41:17.580 | - That might change you forever, returning from that.
01:41:20.900 | - Yes, but I think it would also help me understand
01:41:22.980 | how to prevent it and fix it.
01:41:25.420 | - That might be one of those things,
01:41:26.620 | once you experience it, it'll be a burden to know it.
01:41:29.980 | 'Cause you won't be able to transfer that.
01:41:30.820 | - Yeah, but a lot of things are burdens.
01:41:33.900 | - But it's a useful burden.
01:41:34.820 | - But it's a useful burden.
01:41:36.100 | - Yeah.
01:41:36.940 | - That for sure, I wanna understand evil
01:41:39.300 | and like psychopathy and that.
01:41:42.060 | I have all these fake Twitter accounts
01:41:43.380 | where I like go into different algorithmic bubbles
01:41:45.620 | to try to like understand.
01:41:47.420 | I'll keep getting in fights with people
01:41:48.780 | and realize we're not actually fighting.
01:41:50.700 | I think we're, we used to exist in a monoculture,
01:41:53.100 | like before social media and stuff,
01:41:54.500 | like we kind of all got fed the same thing.
01:41:56.540 | So we were all speaking the same cultural language.
01:41:58.780 | But I think recently, one of the things
01:42:00.220 | that like we aren't diagnosing properly enough
01:42:02.140 | with social media is that there's different dialects.
01:42:05.540 | There's so many different dialects of Chinese.
01:42:06.980 | There are now becoming different dialects of English.
01:42:09.460 | Like I am realizing like there are people
01:42:11.860 | who are saying the exact same things,
01:42:13.620 | but they're using completely different verbiage.
01:42:16.060 | And we're like punishing each other
01:42:17.420 | for not using the correct verbiage.
01:42:18.980 | And we're completely misunderstanding.
01:42:20.580 | Like people are just like misunderstanding
01:42:22.100 | what the other people are saying.
01:42:23.660 | And like, I just got in a fight with a friend
01:42:26.300 | about like anarchism and communism and shit
01:42:31.300 | for like two hours.
01:42:33.100 | And then by the end of a conversation,
01:42:34.620 | like, and then she'd say something and I'm like,
01:42:36.020 | but that's literally what I'm saying.
01:42:37.660 | And she was like, what?
01:42:39.100 | And then I was like, fuck, we've different,
01:42:40.980 | I'm like, we're, our English,
01:42:43.020 | like the way we are understanding terminology
01:42:46.060 | is like drastically, like our algorithm bubbles
01:42:50.660 | are creating many dialects.
01:42:52.940 | - Of how language is interpreted,
01:42:54.980 | how language is used, that's so fascinating.
01:42:56.940 | - And so we're like having these arguments
01:42:59.300 | that we do not need to be having.
01:43:01.060 | And there's polarization that's happening
01:43:02.460 | that doesn't need to be happening
01:43:03.500 | because we've got these like algorithmically created
01:43:06.540 | dialects occurring.
01:43:09.900 | - Plus on top of that,
01:43:10.780 | there's also different parts of the world
01:43:12.420 | that speak different languages.
01:43:13.540 | So there's literally lost in translation
01:43:16.260 | kind of communication.
01:43:17.900 | I happen to know the Russian language
01:43:19.820 | and just know how different it is.
01:43:21.540 | - Yeah.
01:43:22.500 | - Then the English language.
01:43:23.940 | And I just wonder how much is lost in a little bit of.
01:43:27.740 | - Man, I actually, 'cause I have a question for you.
01:43:29.020 | I have a song coming out tomorrow
01:43:30.340 | with IC3PEAK, who are a Russian band.
01:43:31.980 | And I speak a little bit of Russian
01:43:33.780 | and I was looking at the title
01:43:35.460 | and the title in English doesn't match the title in Russian.
01:43:38.300 | I'm curious about this 'cause look, it says.
01:43:40.580 | - What's the English?
01:43:41.420 | - The title in English is "Last Day"
01:43:42.980 | and then the title in Russian is "Noviy",
01:43:45.620 | my pronunciation sucks, "Noviy den".
01:43:48.820 | - Yeah, "New Day".
01:43:49.660 | - Like a new day.
01:43:50.500 | - Yeah, yeah, "New Day", "New Day".
01:43:51.700 | - Like it's two different meanings.
01:43:53.380 | - Yeah, "New Day", yeah.
01:43:54.900 | Yeah, yeah, "New Day".
01:43:58.580 | - "New Day", but "Last Day".
01:43:59.980 | "Noviy den", so "Last Day" would be "Posledniy den".
01:44:04.300 | - Yeah.
01:44:05.140 | - Maybe they, "Noviy den".
01:44:05.980 | - Or maybe the title includes both the Russian
01:44:07.620 | and it's for.
01:44:09.100 | - Or maybe, maybe.
01:44:09.940 | - Maybe it's for bilingual.
01:44:10.780 | - But to be honest, "Noviy den" sounds better
01:44:12.060 | just musically.
01:44:15.180 | Like "Noviy den" is "New Day", that's the current one.
01:44:18.820 | And "Posledniy den" is the last day.
01:44:22.180 | I think "Noviy den", boy.
01:44:25.700 | - I don't like "Noviy den".
01:44:26.740 | - But the meaning is so different.
01:44:28.900 | That's kind of awesome actually though.
01:44:32.380 | There's an explicit sort of contrast like that.
01:44:34.740 | If everyone on earth disappeared and it was just you left,
01:44:39.980 | what would your day look like?
01:44:44.100 | Like what would you do?
01:44:45.300 | Everybody's dead as far as you.
01:44:47.500 | - Are there corpses there?
01:44:48.860 | Well, seriously, it's a big day.
01:44:53.380 | - Let me think through this.
01:44:54.820 | - It's a big difference if there's just like birds
01:44:56.620 | singing versus if there's like corpses
01:44:58.220 | littering the street.
01:44:59.060 | - Yeah, there's corpses everywhere, I'm sorry.
01:45:01.260 | And you don't actually know what happened
01:45:05.100 | and you don't know why you survived.
01:45:07.580 | And you don't even know if there's others out there,
01:45:10.460 | but it seems clear that it's all gone.
01:45:13.580 | - What would you do?
01:45:15.260 | - What would I do?
01:45:16.180 | Listen, I'm somebody who really enjoys the moment,
01:45:19.580 | enjoys life.
01:45:20.460 | I would just go on like enjoying the inanimate objects.
01:45:26.420 | I would just look for food, basic survival.
01:45:30.620 | But most of it is just, listen, I take walks
01:45:34.700 | and I look outside and I'm just happy
01:45:36.900 | that we get to exist on this planet
01:45:38.620 | to be able to breathe air.
01:45:41.980 | It's just all beautiful.
01:45:43.060 | It's full of colors, all of this kind of stuff.
01:45:45.300 | There's so many things about life, your own life,
01:45:49.060 | conscious life that's fucking awesome.
01:45:50.820 | So I would just enjoy that.
01:45:54.300 | - But also maybe after a few weeks,
01:45:56.940 | the engineer in me would start coming out,
01:45:58.500 | like wanna build some things.
01:46:01.620 | Maybe there's always hope searching for another human.
01:46:05.100 | - Probably searching for another human.
01:46:09.380 | Probably trying to get to a TV or radio station
01:46:13.140 | and broadcast something.
01:46:16.020 | - That's interesting, I didn't think about that.
01:46:19.900 | So like really maximize your ability
01:46:23.500 | to connect with others.
01:46:24.540 | - Yeah, like probably try to find another person.
01:46:29.260 | - Would you be excited to meet another person
01:46:32.660 | or terrified 'cause you know.
01:46:34.980 | - I'd be excited even if they--
01:46:36.260 | - No matter what.
01:46:37.100 | - Yeah, yeah, yeah, yeah.
01:46:38.300 | Being alone for the last however long of my life
01:46:42.260 | would be really bad.
01:46:43.660 | That's the one instance I might,
01:46:46.180 | I don't think I'd kill myself,
01:46:47.300 | but I might kill myself if I had to undergo that.
01:46:48.860 | - Do you love people?
01:46:49.700 | Do you love connection to other humans?
01:46:52.020 | - Yeah, I kind of hate people too, but yeah.
01:46:54.580 | - It's a love hate relationship.
01:46:56.500 | - Yeah, I feel like we have a bunch of weird
01:46:58.460 | Nietzsche questions and stuff though.
01:47:00.620 | - Oh yeah.
01:47:01.460 | - Like I wonder, 'cause I'm like, when podcast,
01:47:02.900 | I'm like, is this interesting for people to just have like,
01:47:05.660 | or I don't know, maybe people do like this.
01:47:08.460 | When I listen to podcasts, I'm into like the lore,
01:47:10.540 | like the hard lore.
01:47:11.980 | Like I just love like Dan Carlin,
01:47:13.460 | I'm like, give me the facts, just like get,
01:47:15.540 | just like the facts into my bloodstream.
01:47:18.820 | - But you also don't know,
01:47:20.740 | like you're a fascinating mind to explore.
01:47:23.340 | So you don't realize as you're talking about stuff,
01:47:26.380 | the stuff you've taken for granted
01:47:28.460 | is actually unique and fascinating.
01:47:30.460 | The way you think, not always what,
01:47:33.540 | like the way you reason through things
01:47:35.820 | is the fascinating thing to listen to,
01:47:39.500 | because people kind of see,
01:47:40.820 | oh, there's other humans that think differently,
01:47:43.460 | that explore thoughts differently.
01:47:45.420 | That's the cool, that's also cool.
01:47:47.580 | So yeah, Dan Carlin retelling of history,
01:47:50.220 | by the way, his retelling of history is very,
01:47:54.860 | I think what's exciting is not the history,
01:47:57.100 | is his way of thinking about history.
01:48:00.380 | - No, I think Dan Carlin is one of the people,
01:48:02.740 | like when Dan Carlin was one of the people
01:48:04.420 | that really started getting me excited
01:48:06.180 | about like revolutionizing education,
01:48:08.940 | because like Dan Carlin instilled,
01:48:12.700 | I already like really liked history,
01:48:14.460 | but he instilled like an obsessive love of history in me
01:48:18.980 | to the point where like now I'm fucking reading,
01:48:21.340 | like going to bed,
01:48:23.980 | reading like part four of "The Rise and Fall
01:48:25.700 | of the Third Reich" or whatever.
01:48:26.780 | Like I got like dense ass history,
01:48:28.780 | but like he like opened that door
01:48:31.380 | that like made me want to be a scholar of that topic.
01:48:34.380 | Like it's like, I feel like he's such a good teacher.
01:48:37.860 | He just like, you know,
01:48:39.780 | and it sort of made me feel like
01:48:42.340 | one of the things we could do with education
01:48:44.140 | is like find like the world's great,
01:48:46.660 | the teachers that like create passion for the topic,
01:48:49.900 | because autodidacticism,
01:48:53.700 | I don't know how to say that properly,
01:48:55.260 | but like self-teaching is like much faster
01:48:57.980 | than being lectured to.
01:48:59.580 | Like it's much more efficient
01:49:00.740 | to sort of like be able to teach yourself
01:49:02.380 | and then ask a teacher questions
01:49:03.780 | when you don't know what's up.
01:49:04.940 | But like, you know,
01:49:06.580 | that's why it's like in university and stuff,
01:49:08.460 | like you can learn so much more material so much faster
01:49:11.180 | because you're doing a lot of the learning on your own
01:49:13.420 | and you're going to the teachers for when you get stuck.
01:49:15.700 | But like these teachers that can inspire passion
01:49:19.060 | for a topic,
01:49:20.340 | I think that is one of the most invaluable skills
01:49:22.180 | in our whole species.
01:49:23.260 | Like, because if you can do that, then you,
01:49:26.540 | it's like AI,
01:49:27.380 | like AI is gonna teach itself so much more efficiently
01:49:31.260 | than we can teach it.
01:49:32.100 | We just needed to get it to the point
01:49:32.940 | where it can teach itself.
01:49:34.380 | And then-
01:49:35.220 | - It finds the motivation to do so, right?
01:49:37.580 | - Yeah.
01:49:38.420 | - So like you inspire it to do so.
01:49:39.660 | - Yeah.
01:49:40.500 | - And then it could teach itself.
01:49:42.700 | What do you make of the fact,
01:49:44.580 | you mentioned "Rise and Fall of the Third Reich."
01:49:46.380 | I just-
01:49:47.220 | - Have you read that?
01:49:48.060 | - I read it twice.
01:49:48.880 | - You read it twice?
01:49:49.720 | - Yes.
01:49:50.560 | - Okay, so no one even knows what it is.
01:49:51.620 | - Yeah.
01:49:52.460 | - And I'm like, wait,
01:49:53.280 | I thought this was like a super popping book.
01:49:54.860 | - Super pop.
01:49:55.700 | - I'm not like that.
01:49:57.300 | - It's a classic.
01:49:58.140 | - I'm not that far in it, but it is, it's so interesting.
01:50:00.340 | - Yeah.
01:50:01.180 | It's written by a person that was there,
01:50:03.580 | which is very important to kind of-
01:50:05.940 | - You know, you start being like,
01:50:07.060 | how could this possibly happen?
01:50:08.500 | And then when you read "Rise and Fall of the Third Reich,"
01:50:10.100 | it's like, people tried really hard for this to not happen.
01:50:14.100 | People tried, they almost reinstated a monarchy at one point
01:50:16.620 | to try to stop this from happening.
01:50:18.020 | Like they almost like,
01:50:19.860 | like abandoned democracy to try to get this to not happen.
01:50:22.780 | - At least the way it makes me feel
01:50:24.820 | is that there's a bunch of small moments
01:50:28.220 | on which history can turn.
01:50:30.140 | - Yes.
01:50:30.980 | - It's like small meetings.
01:50:32.260 | - Yes.
01:50:33.100 | - Human interactions.
01:50:34.260 | And that's both terrifying and inspiring
01:50:36.940 | 'cause it's like,
01:50:38.180 | even just attempts at assassinating Hitler,
01:50:43.180 | like time and time again, failed.
01:50:47.300 | And they were so close.
01:50:48.140 | - Was it like "Operation Valkyrie"?
01:50:49.740 | Such a good...
01:50:51.660 | - And then there's also the role of,
01:50:55.060 | that's a really heavy burden,
01:50:56.500 | which is from a geopolitical perspective,
01:50:59.060 | the role of leaders to see evil
01:51:00.820 | before it truly becomes evil,
01:51:02.500 | to anticipate it, to stand up to evil.
01:51:05.460 | 'Cause evil is actually pretty rare in this world
01:51:08.100 | at a scale that Hitler was.
01:51:09.380 | We tend to, in the modern discourse,
01:51:12.020 | kind of call people evil too quickly.
01:51:13.980 | - If you look at ancient history,
01:51:17.340 | like there was a ton of Hitlers.
01:51:18.860 | I actually think it's more the norm than,
01:51:22.660 | like again, going back to like my
01:51:24.420 | sort of intelligent design theory,
01:51:25.900 | I think one of the things we've been successfully doing
01:51:28.380 | in our slow move from survival of the fittest
01:51:31.300 | to intelligent design is we've
01:51:35.100 | kind of been eradicating...
01:51:37.660 | Like if you look at ancient Assyria and stuff,
01:51:40.060 | like that shit was brutal.
01:51:42.460 | And just like the heads on the,
01:51:44.260 | like brutal, like Genghis Khan,
01:51:46.900 | just like genocide after genocide after genocide.
01:51:49.220 | There's like throwing plague bodies over the walls
01:51:51.740 | and decimating whole cities,
01:51:53.300 | or like the Muslim conquests of Damascus and shit.
01:51:56.500 | Just like people, cities used to get leveled
01:51:59.380 | all the fucking time.
01:52:00.660 | Okay, get into the Bronze Age collapse.
01:52:03.060 | It's basically, there was like almost
01:52:05.700 | like Roman level society.
01:52:08.540 | Like there was like all over the world,
01:52:10.020 | like global trade, like everything was awesome
01:52:12.420 | through a mix of, I think a bit of climate change
01:52:14.420 | and then the development of iron.
01:52:16.620 | 'Cause basically bronze could only come from this,
01:52:18.980 | the way to make bronze,
01:52:20.020 | like everything had to be funneled
01:52:21.180 | through this one Iranian mine.
01:52:23.820 | And so it's like, there was just this one supply chain.
01:52:27.220 | And this is one of the things
01:52:28.060 | that makes me worried about supply chains
01:52:29.620 | and why I think we need to be so thoughtful about.
01:52:32.060 | I think our biggest issue with society right now,
01:52:34.980 | like the thing that is most likely to go wrong
01:52:36.980 | is probably supply chain collapse.
01:52:39.060 | You know, 'cause war, climate change, whatever,
01:52:40.660 | like anything that causes supply chain collapse,
01:52:42.340 | our population is too big to handle that.
01:52:44.780 | And like the thing that seems to cause dark ages
01:52:46.820 | is mass supply chain collapse.
01:52:48.500 | But the Bronze Age collapse happened,
01:52:50.940 | like it was sort of like this ancient collapse
01:52:55.900 | that happened where like literally like ancient Egypt,
01:52:59.820 | all these cities, everything just got like decimated,
01:53:02.260 | destroyed, abandoned cities, like hundreds of them.
01:53:05.020 | There was like a flourishing society,
01:53:06.500 | like we were almost coming to modernity
01:53:07.940 | and everything got leveled.
01:53:09.020 | And they had this mini dark ages,
01:53:10.540 | but it was just like,
01:53:11.500 | there's so little writing or recording from that time
01:53:13.660 | that like there isn't a lot of information
01:53:15.340 | about the Bronze Age collapse,
01:53:16.780 | but it was basically equivalent
01:53:18.220 | to like the medieval dark ages,
01:53:21.740 | but it just happened, I don't know the years,
01:53:24.100 | but like thousands of years earlier.
01:53:26.980 | And then we sort of like recover
01:53:29.140 | from the Bronze Age collapse,
01:53:31.260 | empire reemerged, writing and trade
01:53:34.300 | and everything reemerged,
01:53:35.620 | and then we of course had the more contemporary dark ages.
01:53:40.420 | - And then over time we've designed mechanisms
01:53:43.620 | that lessened and lessened the capability
01:53:46.380 | for the destructive power centers to emerge.
01:53:50.500 | - There's more recording
01:53:51.740 | about the more contemporary dark ages.
01:53:54.220 | So I think we have like a better understanding
01:53:55.620 | of how to avoid it,
01:53:56.460 | but I still think we're at high risk for it.
01:53:58.180 | I think that's one of the big risks right now.
01:54:00.660 | - So the natural state of being for humans
01:54:03.300 | is for there to be a lot of Hitlers,
01:54:04.980 | but we've gotten really good
01:54:06.820 | at making it hard for them to emerge.
01:54:10.020 | We've gotten better at collaboration
01:54:12.700 | and resisting the power,
01:54:14.860 | like authoritarians to come to power.
01:54:16.860 | - We're trying to go country by country,
01:54:18.580 | like we're moving past this.
01:54:19.900 | We're kind of like slowly incrementally
01:54:21.660 | moving towards not scary old school war stuff.
01:54:26.660 | And I think seeing it happen in some of the countries
01:54:32.140 | that at least nominally are like
01:54:33.780 | supposed to have moved past that,
01:54:36.700 | that's scary because it reminds us that it can happen
01:54:39.380 | in the places that have made,
01:54:42.580 | supposedly, hopefully moved past that.
01:54:47.060 | - And possibly at a civilization level,
01:54:49.420 | like you said, supply chain collapse
01:54:51.660 | might make people resource constrained,
01:54:54.300 | might make people desperate, angry, hateful, violent,
01:54:59.300 | and drag us right back in.
01:55:01.500 | - I mean, supply chain collapse is how,
01:55:03.980 | like the ultimate thing that caused the middle ages
01:55:06.260 | was supply chain collapse.
01:55:08.260 | It's like people,
01:55:09.460 | because people were reliant on a certain level of technology,
01:55:12.380 | like people, like you look at like Britain,
01:55:14.140 | like they had glass,
01:55:15.380 | like people had aqueducts,
01:55:17.540 | people had like indoor heating and cooling
01:55:20.420 | and like running water, and could buy food
01:55:23.380 | from all over the world and trade and markets.
01:55:26.020 | Like people didn't know how to hunt and forage and gather.
01:55:28.540 | And so we're in a similar situation.
01:55:29.820 | We are not educated enough to survive without technology.
01:55:33.740 | So if we have a supply chain collapse
01:55:35.380 | that like limits our access to technology,
01:55:38.340 | there will be like massive starvation and violence
01:55:41.300 | and displacement and war.
01:55:43.140 | Like, you know, it's all like, yeah.
01:55:47.300 | In my opinion, it's like the primary marker of dark,
01:55:51.740 | like what a dark age is.
01:55:52.580 | - Well, technology is kind of enabling us
01:55:54.380 | to be more resilient in terms of supply chain,
01:55:57.220 | in terms of, to all the different catastrophic events
01:56:00.820 | that happened to us.
01:56:02.060 | Although the pandemic has kind of challenged
01:56:03.900 | our preparedness for the catastrophic.
01:56:07.660 | What do you think is the coolest invention
01:56:09.220 | humans have come up with?
01:56:11.100 | The wheel, fire, cooking meat.
01:56:14.620 | - Computers.
01:56:16.220 | - Computers.
01:56:17.060 | - Freaking computers.
01:56:17.900 | - Internet or computers, which one?
01:56:19.500 | What do you think the--
01:56:20.340 | - Previous technologies, I mean,
01:56:22.380 | may have even been more profound
01:56:23.660 | and moved us to a certain degree,
01:56:24.740 | but I think the computers are what make us homo techno.
01:56:27.340 | I think this is what, it's a brain augmentation.
01:56:30.740 | And so it like allows for actual evolution.
01:56:33.820 | Like the computers accelerate the degree
01:56:35.460 | to which all the other technologies
01:56:37.020 | can also be accelerated.
01:56:38.620 | - Would you classify yourself as a homo sapien
01:56:40.660 | or a homo techno?
01:56:41.540 | - Definitely homo techno.
01:56:43.020 | - So you're one of the earliest of the species.
01:56:46.900 | - I think most of us are.
01:56:49.180 | Like as I said, I think if you
01:56:51.620 | looked at brain scans of us versus
01:56:56.900 | humans 100 years ago, it would look very different.
01:57:00.900 | I think we are physiologically different.
01:57:03.740 | - Just even the interaction with the devices
01:57:05.580 | has changed our brains.
01:57:06.700 | - Well, and if you look at,
01:57:08.580 | a lot of studies are coming out to show that
01:57:11.220 | there's a degree of inherited memory.
01:57:13.100 | So some of these physiological changes in theory
01:57:15.380 | should be, we should be passing them on.
01:57:18.020 | So like that's, you know, that's not like
01:57:20.580 | an instance of physiological change
01:57:23.180 | that's gonna fizzle out.
01:57:24.140 | In theory, that should progress like to our offspring.
01:57:28.060 | - Speaking of offspring,
01:57:30.420 | what advice would you give to a young person
01:57:33.180 | like in high school?
01:57:34.340 | Whether they be an artist, a creative, an engineer,
01:57:40.660 | any kind of career path,
01:57:45.060 | or maybe just life in general,
01:57:46.300 | how they can live a life they can be proud of.
01:57:48.700 | - I think one of my big thoughts,
01:57:50.780 | and like, especially now having kids,
01:57:53.180 | is that I don't think we spend enough time
01:57:55.460 | teaching creativity.
01:57:56.820 | And I think creativity is a muscle like other things.
01:57:59.300 | And there's a lot of emphasis on,
01:58:01.460 | you know, learn how to play the piano
01:58:02.860 | and then you can write a song
01:58:04.020 | or like learn the technical stuff
01:58:05.460 | and then you can do a thing.
01:58:07.020 | But I think it's,
01:58:08.060 | like I have a friend who's like
01:58:10.980 | world's greatest guitar player,
01:58:13.740 | like, you know, amazing sort of like producer,
01:58:15.780 | works with other people,
01:58:16.660 | but he's really sort of like,
01:58:18.980 | you know, he like engineers and records things
01:58:20.740 | and like does solos,
01:58:21.780 | but he doesn't really like make his own music.
01:58:23.500 | And I was talking to him and I was like,
01:58:26.020 | "Dude, you're so talented at music.
01:58:27.340 | Like, why don't you make music or whatever?"
01:58:28.860 | And he was like,
01:58:29.700 | "Cause I got, I'm too old.
01:58:32.100 | I never learned the creative muscle."
01:58:34.220 | And it's like, you know, it's embarrassing.
01:58:36.700 | It's like learning the creative muscle
01:58:39.260 | takes a lot of failure.
01:58:40.820 | And it also sort of,
01:58:43.020 | if when you're being creative,
01:58:45.860 | you know, you're throwing paint at a wall
01:58:48.180 | and a lot of stuff will fail.
01:58:49.500 | So like part of it is like a tolerance
01:58:51.220 | for failure and humiliation.
01:58:53.060 | - And somehow that's easier to develop when you're young.
01:58:55.700 | - Yeah.
01:58:56.540 | - Or to persist through it when you're young.
01:58:58.180 | - Everything is easier to develop when you're young.
01:59:03.420 | - And the younger the better.
01:59:04.820 | It could destroy you.
01:59:05.660 | I mean, that's the shitty thing about creativity.
01:59:08.900 | You know, failure could destroy you if you're not careful,
01:59:12.220 | but that's a risk worth taking.
01:59:13.380 | - But also, at a young age,
01:59:14.980 | developing a tolerance to failure is good.
01:59:17.780 | I fail all the time.
01:59:19.860 | Like I do stupid shit all the time.
01:59:22.380 | Like in public, I get canceled for,
01:59:24.780 | I make all kinds of mistakes,
01:59:27.020 | but I just like am very resilient about making mistakes.
01:59:30.540 | And so then like I do a lot of things
01:59:32.980 | that like other people wouldn't do.
01:59:34.180 | And like, I think my greatest asset is my creativity.
01:59:37.540 | And I think tolerance to failure
01:59:39.860 | is just a super essential thing
01:59:43.420 | that should be taught before other things.
01:59:45.460 | - Brilliant advice.
01:59:46.380 | Yeah, yeah.
01:59:47.460 | I wish everybody encouraged sort of failure more
01:59:51.580 | as opposed to kind of--
01:59:52.420 | - 'Cause we like punish failure.
01:59:53.500 | We're like, no, like when we were teaching kids,
01:59:55.380 | we're like, no, that's wrong.
01:59:56.380 | Like that's, you know,
01:59:57.860 | X keeps like, will be like wrong.
02:00:04.420 | Like he'll say like crazy things.
02:00:06.260 | X keeps being like bubble car, bubble car.
02:00:09.740 | And I'm like, what's a bubble car?
02:00:14.740 | Like it doesn't, like, but I don't wanna be like,
02:00:16.660 | no, you're wrong.
02:00:17.500 | I'm like, you're thinking of weird, crazy shit.
02:00:20.340 | Like, I don't know what a bubble car is, but like--
02:00:22.260 | - He's creating worlds
02:00:23.460 | and they might be internally consistent.
02:00:25.180 | And through that, he might discover
02:00:26.500 | something fundamental about this world.
02:00:27.860 | - Yeah, or he'll like rewrite songs
02:00:30.180 | with words that he prefers.
02:00:32.100 | So like, instead of baby shark, he says baby car.
02:00:35.460 | It's like--
02:00:36.300 | (both laughing)
02:00:39.500 | - Maybe he's onto something.
02:00:41.140 | Let me ask the big ridiculous question.
02:00:42.780 | We were kind of dancing around it,
02:00:44.100 | but what do you think is the meaning
02:00:47.180 | of this whole thing we have here?
02:00:48.940 | Of human civilization, of life on earth,
02:00:52.980 | but in general, just life.
02:00:55.100 | What's the meaning of life?
02:00:58.220 | - Have you, did you read "Novacene" yet?
02:01:02.580 | By James Lovelock?
02:01:03.820 | - You're doing a lot of really good book recommendations here.
02:01:06.380 | - I haven't even finished this,
02:01:07.580 | so I'm a huge fraud yet again.
02:01:10.300 | But like really early in the book,
02:01:12.660 | he says this amazing thing.
02:01:14.540 | Like, I feel like everyone's so sad and cynical.
02:01:16.540 | Like, everyone's like the Fermi paradox and everyone,
02:01:18.740 | I just keep hearing people being like,
02:01:19.780 | "Fuck, what if we're alone?
02:01:21.340 | Like, oh no, ah, like, ah, ah."
02:01:23.420 | And I'm like, "Okay, but like, wait,
02:01:25.100 | what if this is the beginning?"
02:01:26.780 | Like, in "Novacene," he says,
02:01:28.460 | this is not gonna be a correct,
02:01:31.380 | I can't like memorize quotes,
02:01:32.580 | but he says something like,
02:01:34.260 | "What if our consciousness, like right now,
02:01:40.020 | like this is the universe waking up?
02:01:43.420 | Like, what if instead of discovering the universe,
02:01:45.540 | this is the universe,
02:01:47.500 | like this is the evolution
02:01:49.500 | of the literal universe herself?
02:01:51.660 | Like, we are not separate from the universe.
02:01:53.180 | Like, this is the universe waking up.
02:01:54.660 | This is the universe seeing herself for the first time.
02:01:57.420 | Like, this is."
02:01:58.260 | - The universe becoming conscious
02:02:00.700 | for the first time, we're a part of that.
02:02:02.380 | - Yeah, 'cause it's like,
02:02:03.220 | we aren't separate from the universe.
02:02:05.620 | Like, this could be like an incredibly sacred moment.
02:02:08.780 | And maybe like social media and all these things,
02:02:11.020 | the stuff where we're all getting connected together,
02:02:13.300 | like maybe these are the neurons connecting
02:02:16.900 | of the like collective super intelligence that is,
02:02:20.860 | - Waking up.
02:02:23.380 | - Yeah, like, you know, it's like,
02:02:25.460 | maybe instead of something cynical,
02:02:27.140 | or maybe if there's something to discover,
02:02:29.220 | like maybe this is just, you know,
02:02:31.580 | we're a blastocyst of like some incredible
02:02:36.020 | kind of consciousness or being.
02:02:39.460 | - And just like in the first three years of life
02:02:41.260 | or for human children,
02:02:42.860 | we'll forget about all the suffering
02:02:44.380 | that we're going through now.
02:02:45.500 | - I think we'll probably forget about this.
02:02:46.660 | I mean, probably, you know,
02:02:48.340 | artificial intelligence will eventually render us obsolete.
02:02:52.740 | I don't think they'll do it in a malicious way,
02:02:55.380 | but I think probably we are very weak.
02:02:57.740 | The sun is expanding.
02:02:59.460 | I don't know, like hopefully we can get to Mars,
02:03:01.660 | but like we're pretty vulnerable.
02:03:04.100 | And I, you know, like,
02:03:05.940 | I think we can coexist for a long time with AI
02:03:09.100 | and we can also probably make ourselves less vulnerable,
02:03:11.460 | but, you know, I just think consciousness,
02:03:16.340 | sentience, self-awareness,
02:03:18.420 | like I think this might be the single greatest
02:03:21.980 | like moment in evolution ever.
02:03:25.020 | And like, maybe this is, you know,
02:03:29.180 | like the true beginning of life.
02:03:32.620 | And we're just, we're the blue green algae
02:03:34.740 | or we're like the single celled organisms
02:03:37.020 | of something amazing.
02:03:38.500 | - The universe awakens and this is it.
02:03:40.940 | - Yeah.
02:03:41.780 | - Well, see, you're an incredible person.
02:03:45.340 | You're a fascinating mind.
02:03:47.540 | You should definitely do, your friend Liv mentioned
02:03:50.540 | that you guys were thinking of maybe talking.
02:03:52.300 | I would love it if you explored your mind
02:03:55.380 | in this kind of medium more and more
02:03:56.740 | by doing a podcast with her or just in any kind of way.
02:03:59.660 | So you're an awesome person.
02:04:01.780 | It's an honor to know you.
02:04:03.380 | It's an honor to get to sit down with you late at night,
02:04:05.820 | which is like surreal.
02:04:08.340 | And I really enjoyed it.
02:04:09.180 | Thank you for talking today.
02:04:10.140 | - Yeah, no, I mean, huge honor.
02:04:11.700 | I feel very underqualified to be here, but I'm a big fan.
02:04:13.660 | I've been listening to the podcast a lot and yeah,
02:04:15.940 | me and Liv would appreciate any advice and help
02:04:18.260 | and we're definitely gonna do that, so.
02:04:20.220 | - Yeah, anytime.
02:04:22.100 | Thank you.
02:04:22.940 | - Cool, thank you.
02:04:24.420 | Thanks for listening to this conversation with Grimes.
02:04:26.980 | To support this podcast,
02:04:28.260 | please check out our sponsors in the description.
02:04:31.060 | And now let me leave you with some words from Oscar Wilde.
02:04:34.740 | Yes, I'm a dreamer.
02:04:36.940 | For a dreamer is one who can only find her way by moonlight.
02:04:41.380 | And her punishment is that she sees the dawn
02:04:44.260 | before the rest of the world.
02:04:45.740 | Thank you for listening and hope to see you next time.
02:04:49.740 | (upbeat music)
02:04:52.340 | (upbeat music)