
Jaron Lanier: Virtual Reality, Social Media & the Future of Humans and AI | Lex Fridman Podcast #218


Chapters

0:00 Introduction
1:39 What is reality?
5:52 Turing machines
7:10 Simulating our universe
13:25 Video games and other immersive experiences
17:12 Death and consciousness
25:43 Designing human-centric AI
27:17 Empathy with robots
31:09 Social media incentives
43:29 Data dignity
51:01 Jack Dorsey and Twitter
1:02:46 Bitcoin and cryptocurrencies
1:07:26 Government overreach and freedom
1:17:41 GitHub and TikTok
1:19:51 The Autodidactic Universe
1:24:42 Humans and the mystery of music
1:30:53 Defining moments
1:41:39 Mortality
1:43:31 The meaning of life

Whisper Transcript

00:00:00.000 | The following is a conversation with Jaron Lanier,
00:00:03.060 | a computer scientist, visual artist, philosopher,
00:00:06.200 | writer, futurist, musician,
00:00:08.060 | and the founder of the field of virtual reality.
00:00:11.660 | To support this podcast,
00:00:12.780 | please check out our sponsors in the description.
00:00:15.860 | As a side note, you may know that Jaron
00:00:18.060 | is a staunch critic of social media platforms.
00:00:20.700 | He and I agree on many aspects of this,
00:00:23.780 | except perhaps I am more optimistic
00:00:26.340 | about it being possible to build better platforms,
00:00:29.700 | and better artificial intelligence systems
00:00:32.220 | that put long-term interests
00:00:33.900 | and happiness of human beings first.
00:00:36.660 | Let me also say a general comment about these conversations.
00:00:40.260 | I try to make sure I prepare well,
00:00:42.500 | remove my ego from the picture,
00:00:44.420 | and focus on making the other person shine
00:00:47.200 | as we try to explore the most beautiful
00:00:49.140 | and insightful ideas in their mind.
00:00:51.820 | This can be challenging when the ideas
00:00:54.020 | that are close to my heart are being criticized.
00:00:57.180 | In those cases, I do offer a little pushback,
00:00:59.940 | but respectfully, and then move on,
00:01:02.700 | trying to have the other person come out
00:01:04.300 | looking wiser in the exchange.
00:01:06.620 | I think there's no such thing as winning in conversations,
00:01:09.980 | nor in life.
00:01:11.580 | My goal is to learn and to have fun.
00:01:14.020 | I ask that you don't see my approach
00:01:15.840 | to these conversations as weakness.
00:01:17.980 | It is not.
00:01:19.220 | It is my attempt at showing respect
00:01:21.500 | and love for the other person.
00:01:24.120 | That said, I also often just do a bad job of talking,
00:01:28.520 | but you probably already knew that.
00:01:30.820 | So please give me a pass on that as well.
00:01:33.460 | This is the Lex Fridman Podcast,
00:01:35.540 | and here is my conversation with Jaron Lanier.
00:01:38.640 | You're considered the founding father of virtual reality.
00:01:43.160 | Do you think we will one day spend most
00:01:47.300 | or all of our lives in virtual reality worlds?
00:01:52.420 | - I have always found the very most valuable moment
00:01:56.380 | in virtual reality to be the moment
00:01:58.320 | when you take off the headset and your senses are refreshed
00:02:01.200 | and you perceive physicality afresh,
00:02:05.140 | as if you were a newborn baby,
00:02:07.000 | but with a little more experience.
00:02:08.940 | So you can really notice just how incredibly strange
00:02:13.720 | and delicate and peculiar and impossible the real world is.
00:02:20.580 | - So the magic is, and perhaps forever will be,
00:02:23.940 | in the physical world.
00:02:25.340 | - Well, that's my take on it.
00:02:26.900 | That's just me.
00:02:27.780 | I mean, I think I don't get to tell everybody else
00:02:31.140 | how to think or how to experience virtual reality,
00:02:33.740 | and at this point, there have been multiple generations
00:02:36.780 | of younger people who've come along and liberated me
00:02:41.160 | from having to worry about these things.
00:02:43.340 | But I should say also, even in what some,
00:02:47.540 | well, I called it mixed reality back in the day,
00:02:49.580 | and these days it's called augmented reality,
00:02:52.780 | but with something like a HoloLens,
00:02:54.620 | even then, like one of my favorite things
00:02:57.580 | is to augment a forest,
00:02:58.960 | not because I think the forest needs augmentation,
00:03:01.300 | but when you look at the augmentation next to a real tree,
00:03:05.100 | the real tree just pops out as being astounding.
00:03:08.260 | It's interactive, it's changing slightly all the time
00:03:13.180 | if you pay attention, and it's hard to pay attention
00:03:15.500 | to that, but when you compare it to virtual reality,
00:03:17.500 | all of a sudden you do.
00:03:19.420 | And even in practical applications,
00:03:21.980 | my favorite early application of virtual reality,
00:03:25.620 | which we prototyped going back to the '80s
00:03:28.140 | when I was working with Dr. Joe Rosen at Stanford Med
00:03:31.460 | near where we are now,
00:03:33.580 | we made the first surgical simulator,
00:03:35.740 | and to go from the fake anatomy of the simulation,
00:03:40.740 | which is incredibly valuable for many things,
00:03:43.380 | for designing procedures, for training,
00:03:45.020 | for all kinds of things, then to go to the real person,
00:03:47.660 | boy, it's really something.
00:03:50.340 | Surgeons really get woken up by that transition.
00:03:53.300 | It's very cool.
00:03:54.140 | So I think the transition is actually more valuable
00:03:56.180 | than the simulation.
00:03:57.480 | - That's fascinating.
00:03:59.380 | I never really thought about that.
00:04:01.660 | It's almost, it's like traveling elsewhere
00:04:05.620 | in the physical space can help you appreciate
00:04:08.220 | how much you value your home once you return.
00:04:11.900 | - Well, that's how I take it.
00:04:13.580 | I mean, once again,
00:04:15.220 | people have different attitudes towards it.
00:04:17.580 | All are welcome.
00:04:18.740 | - What do you think is the difference
00:04:20.380 | between the virtual world and the physical meat space world
00:04:23.780 | that you are still drawn, for you personally,
00:04:26.780 | still drawn to the physical world?
00:04:28.900 | Like there clearly then is a distinction.
00:04:31.620 | Is there some fundamental distinction
00:04:33.220 | or is it the peculiarities of the current set of technology?
00:04:37.540 | - In terms of the kind of virtual reality that we have now,
00:04:41.460 | it's made of software and software is terrible stuff.
00:04:45.980 | Software is always the slave of its own history,
00:04:50.340 | its own legacy.
00:04:52.020 | It's always infinitely arbitrarily messy and arbitrary.
00:04:57.020 | Working with it brings out a certain kind
00:05:00.220 | of nerdy personality in people, or at least in me,
00:05:03.460 | which I'm not that fond of.
00:05:05.900 | And there are all kinds of things
00:05:07.260 | about software I don't like.
00:05:09.100 | And so that's different from the physical world.
00:05:11.580 | It's not something we understand, as you just pointed out.
00:05:14.980 | On the other hand, I'm a little mystified
00:05:17.340 | when people ask me,
00:05:18.180 | "Well, do you think the universe is a computer?"
00:05:21.540 | And I have to say, "Well, I mean,
00:05:23.540 | "what on earth could you possibly mean
00:05:26.180 | "if you say it isn't a computer?"
00:05:27.980 | If it isn't a computer,
00:05:30.060 | it wouldn't follow principles consistently
00:05:33.940 | and it wouldn't be intelligible
00:05:35.660 | 'cause what else is a computer ultimately?
00:05:37.860 | And we have physics, we have technology,
00:05:41.220 | so we can do technology, so we can program it.
00:05:43.700 | So I mean, of course it's some kind of computer,
00:05:45.700 | but I think trying to understand it as a Turing machine
00:05:48.980 | is probably a foolish approach.
00:05:52.820 | - Right, that's the question,
00:05:54.460 | whether it performs, this computer we call the universe,
00:05:58.820 | performs the kind of computation that can be modeled
00:06:01.140 | as a universal Turing machine,
00:06:03.940 | or is it something much more fancy, so fancy, in fact,
00:06:08.500 | that it may be beyond our cognitive capabilities
00:06:10.980 | to understand?
00:06:12.660 | - Turing machines are kind of,
00:06:14.660 | I'd call them teases in a way,
00:06:18.660 | 'cause if you have an infinitely smart programmer
00:06:23.180 | with an infinite amount of time,
00:06:24.660 | an infinite amount of memory,
00:06:25.900 | and an infinite clock speed, then they're universal,
00:06:29.900 | but that cannot exist, so they're not universal in practice.
00:06:33.140 | And they actually are, in practice,
00:06:36.260 | a very particular sort of machine
00:06:38.260 | within the constraints,
00:06:40.660 | within the conservation principles of any reality
00:06:44.020 | that's worth being in, probably.
00:06:46.380 | And so, I think universality of a particular model
00:06:51.380 | is probably a deceptive way to think,
00:06:58.540 | even though at some sort of limit,
00:07:00.740 | of course, something like that's gotta be true
00:07:05.060 | at some sort of high enough limit,
00:07:07.380 | but it's just not accessible to us, so what's the point?
00:07:10.460 | - Well, to me, the question of whether we're living
00:07:12.940 | inside a computer or a simulation
00:07:15.460 | is interesting in the following way.
00:07:17.260 | There's a technical question is here.
00:07:20.900 | How difficult is it to build a machine,
00:07:25.820 | not that simulates the universe,
00:07:28.340 | but that makes it sufficiently realistic
00:07:31.460 | that we wouldn't know the difference,
00:07:33.420 | or better yet, sufficiently realistic
00:07:36.180 | that we would kinda know the difference,
00:07:37.820 | but we would prefer to stay in the virtual world anyway.
00:07:40.420 | I wanna give you a few different answers.
00:07:42.460 | I wanna give you the one that I think
00:07:43.980 | has the most practical importance to human beings right now,
00:07:47.460 | which is that there's a kind of an assertion
00:07:51.580 | sort of built into the way the question's usually asked
00:07:54.220 | that I think is false, which is a suggestion
00:07:57.300 | that people have a fixed level of ability
00:08:00.220 | to perceive reality in a given way.
00:08:03.100 | And actually, people are always learning, evolving,
00:08:08.140 | forming themselves.
00:08:09.140 | We're fluid, too.
00:08:10.380 | We're also programmable, self-programmable,
00:08:13.700 | changing, adapting.
00:08:15.300 | And so my favorite way to get at this
00:08:18.780 | is to talk about the history of other media.
00:08:20.980 | So for instance, there was a peer-reviewed paper
00:08:23.580 | that showed that an early wire recorder
00:08:26.420 | playing back an opera singer behind a curtain
00:08:28.620 | was indistinguishable from a real opera singer.
00:08:31.300 | And so now, of course, to us,
00:08:32.420 | it would not only be distinguishable,
00:08:34.180 | but it would be very blatant
00:08:35.500 | 'cause the recording would be horrible.
00:08:37.620 | But to the people at the time,
00:08:39.180 | without the experience of it, it seemed plausible.
00:08:43.760 | There was an early demonstration
00:08:46.300 | of extremely crude video teleconferencing
00:08:49.580 | between New York and DC in the '30s, I think so,
00:08:54.320 | that people viewed as being absolutely realistic
00:08:56.420 | and indistinguishable, which to us would be horrible.
00:08:59.820 | And there are many other examples.
00:09:00.940 | Another one, one of my favorite ones,
00:09:02.220 | is in the Civil War era,
00:09:04.160 | there were itinerant photographers
00:09:06.120 | who collected photographs of people
00:09:07.780 | who just looked kind of like a few archetypes.
00:09:10.660 | So you could buy a photo of somebody
00:09:12.460 | who looked kind of like your loved one
00:09:14.360 | (laughs)
00:09:15.500 | to remind you of that person,
00:09:17.060 | 'cause actually photographing them was inconceivable,
00:09:20.520 | and hiring a painter was too expensive,
00:09:22.380 | and you didn't have any way for the painter
00:09:23.900 | to represent them remotely anyway.
00:09:25.460 | How would they even know what they looked like?
00:09:27.660 | So these are all great examples
00:09:29.580 | of how in the early days of different media,
00:09:32.300 | we perceived the media as being really great,
00:09:34.300 | but then we evolved through the experience of the media.
00:09:37.780 | This gets back to what I was saying.
00:09:38.820 | Maybe the greatest gift of photography
00:09:40.780 | is that we can see the flaws in a photograph
00:09:42.660 | and appreciate reality more.
00:09:44.500 | Maybe the greatest gift of audio recording
00:09:46.720 | is that we can distinguish that opera singer now
00:09:49.940 | from that recording of the opera singer
00:09:51.980 | on the horrible wire recorder.
00:09:53.740 | So we shouldn't limit ourselves
00:09:57.460 | by some assumption of stasis that's incorrect.
00:10:01.260 | So that's the first thing, that's my first answer,
00:10:03.820 | which is I think the most important one.
00:10:05.260 | Now, of course, somebody might come back and say,
00:10:07.260 | oh, but technology can go so far,
00:10:09.340 | there must be some point at which it would surpass.
00:10:11.500 | That's a different question.
00:10:12.580 | I think that's also an interesting question,
00:10:14.700 | but I think the answer I just gave you
00:10:16.020 | is actually the more important answer
00:10:17.560 | to the more important question.
00:10:18.900 | - That's profound, yeah.
00:10:20.260 | But can you, the second question,
00:10:23.140 | which you're now making me realize is way different.
00:10:26.820 | Is it possible to create worlds
00:10:28.500 | in which people would want to stay instead of the real world?
00:10:32.900 | - Well.
00:10:33.860 | - Like en masse, like large numbers of people.
00:10:38.220 | - What I hope is, as I said before,
00:10:41.340 | I hope that the experience of virtual worlds
00:10:44.260 | helps people appreciate this physical world we have
00:10:49.260 | and feel tender towards it
00:10:51.700 | and keep it from getting too fucked up.
00:10:54.540 | That's my hope.
00:10:55.680 | - Do you see all technology in that way?
00:10:58.700 | So basically technology helps us appreciate
00:11:02.780 | the more sort of technology-free aspect of life.
00:11:06.260 | - Well, media technology.
00:11:10.340 | You can stretch that.
00:11:13.540 | Let me say, I could definitely play McLuhan
00:11:17.300 | and turn this into a general theory.
00:11:19.020 | It's totally doable.
00:11:20.020 | The program you just described is totally doable.
00:11:23.140 | In fact, I will psychically predict
00:11:25.060 | that if you did the research,
00:11:26.060 | you could find 20 PhD theses that do that already.
00:11:29.220 | I don't know, but they might exist.
00:11:31.280 | But I don't know how much value there is
00:11:34.860 | in pushing a particular idea that far.
00:11:38.740 | Claiming that reality isn't a computer in some sense
00:11:41.580 | seems incoherent to me 'cause we can program it.
00:11:44.860 | We have technology.
00:11:46.180 | It seems to obey physical laws.
00:11:48.780 | What more do you want from it to be a computer?
00:11:50.660 | I mean, it's a computer of some kind.
00:11:52.140 | We don't know exactly what kind.
00:11:53.460 | We might not know how to think about it.
00:11:54.800 | We're working on it.
00:11:55.820 | - Sorry to interrupt, but you're absolutely right.
00:11:59.260 | That's my fascination with the AI as well
00:12:01.860 | is it helps in the case of AI,
00:12:05.160 | I see as a set of techniques
00:12:07.220 | that help us understand ourselves, understand us humans.
00:12:10.100 | In the same way, virtual reality,
00:12:12.340 | and you're putting it brilliantly,
00:12:14.620 | it's a way to help us understand reality.
00:12:17.900 | - Sure.
00:12:19.780 | - Appreciate and open our eyes more richly to reality.
00:12:23.740 | - That's certainly how I see it.
00:12:26.100 | And I wish people who become incredibly fascinated,
00:12:29.900 | who go down the rabbit hole of the different fascinations
00:12:33.940 | with whether we're in a simulation or not,
00:12:35.900 | or there's a whole world of variations on that.
00:12:39.360 | I wish they'd step back and think about
00:12:41.940 | their own motivations and exactly what they mean.
00:12:44.980 | And I think the danger with these things is,
00:12:50.060 | so if you say, is the universe
00:12:54.380 | some kind of computer broadly,
00:12:56.380 | it has to be 'cause it's not coherent to say that it isn't.
00:12:59.860 | On the other hand, to say that that means
00:13:02.260 | you know anything about what kind of computer,
00:13:05.180 | that's something very different.
00:13:06.380 | And the same thing is true for the brain,
00:13:07.940 | the same thing is true for anything
00:13:10.500 | where you might use computational metaphors.
00:13:12.220 | We have to have a bit of modesty about where we stand.
00:13:15.020 | And the problem I have with these framings of computation
00:13:19.420 | as these ultimate cosmic questions
00:13:21.140 | is that it has a way of getting people
00:13:23.380 | to pretend they know more than they do.
00:13:25.380 | - Can you maybe, this is a therapy session,
00:13:28.260 | psychoanalyze me for a second.
00:13:30.420 | I really like the Elder Scrolls series,
00:13:32.300 | it's a role-playing game, Skyrim, for example.
00:13:36.860 | Why do I enjoy so deeply just walking around that world
00:13:41.860 | and then there's people you could talk to
00:13:45.180 | and you can just like, it's an escape,
00:13:48.100 | but my life is awesome, I'm truly happy,
00:13:51.340 | but I also am happy with the music that's playing
00:13:55.180 | and the mountains and carrying around a sword
00:13:59.220 | and just that, I don't know what that is.
00:14:02.420 | It's very pleasant though to go there
00:14:04.660 | and I miss it sometimes.
00:14:06.580 | - I think it's wonderful to love artistic creations,
00:14:11.580 | it's wonderful to love contact with other people,
00:14:16.020 | it's wonderful to love play and ongoing,
00:14:19.900 | evolving meaning and patterns with other people.
00:14:24.180 | I think it's a good thing.
00:14:26.940 | I'm not anti-tech and I'm certainly not anti-digital tech,
00:14:34.420 | I'm anti, as everybody knows by now,
00:14:37.260 | I think the manipulative economy of social media
00:14:41.220 | is making everybody nuts and all that,
00:14:42.420 | so I'm anti that stuff, but the core of it,
00:14:45.340 | of course I worked for many, many years
00:14:47.620 | on trying to make that stuff happen
00:14:49.180 | because I think it can be beautiful.
00:14:51.180 | Why not?
00:14:54.340 | And by the way, there's a thing about humans,
00:14:59.180 | which is we're problematic.
00:15:03.900 | Any kind of social interaction with other people
00:15:07.860 | is gonna have its problems.
00:15:10.140 | People are political and tricky,
00:15:13.980 | and I love classical music,
00:15:16.220 | but when you actually go to a classical music thing
00:15:18.540 | and it turns out, oh actually,
00:15:19.700 | this is like a backroom power deal kind of place
00:15:21.980 | and a big status ritual as well,
00:15:24.140 | and that's kind of not as fun,
00:15:26.900 | that's part of the package.
00:15:28.980 | And the thing is, it's always going to be,
00:15:30.660 | there's always gonna be a mix of things.
00:15:33.140 | I don't think the search for purity
00:15:38.780 | is gonna get you anywhere, so I'm not worried about that.
00:15:42.300 | I worry about the really bad cases
00:15:44.500 | where we're making ourselves crazy or cruel enough
00:15:48.220 | that we might not survive,
00:15:49.260 | and I think the social media criticism rises to that level,
00:15:53.460 | but I'm glad you enjoy it.
00:15:54.900 | I think it's great.
00:15:55.900 | - And I like that you basically say
00:15:59.060 | that every experience is both beauty and darkness,
00:16:02.180 | as in with classical music.
00:16:03.620 | I also play classical piano, so I appreciate it very much.
00:16:07.180 | But it's interesting.
00:16:09.020 | And even the darkness,
00:16:10.180 | in "A Man's Search for Meaning" with Viktor Frankl
00:16:12.780 | in the concentration camps,
00:16:15.780 | even there, there's opportunity to discover beauty.
00:16:19.220 | And so it's, that's the interesting thing about humans,
00:16:25.060 | is the capacity to discover beauty
00:16:27.900 | in the darkest of moments,
00:16:29.060 | but there's always the dark parts too.
00:16:31.580 | - Well, I mean, it's,
00:16:34.220 | our situation is structurally difficult.
00:16:36.980 | We are--
00:16:37.820 | - Structurally difficult.
00:16:40.660 | - No, it is, it's true.
00:16:42.140 | We perceive socially, we depend on each other
00:16:44.780 | for our sense of place and perception of the world.
00:16:49.780 | I mean, we're dependent on each other,
00:16:52.340 | and yet there's also a degree in which we're inevitably,
00:16:56.580 | we inevitably let each other down.
00:16:59.980 | We are set up to be competitive as well as supportive.
00:17:05.140 | I mean, it's just,
00:17:06.540 | our fundamental situation is complicated and challenging,
00:17:10.620 | and I wouldn't have it any other way.
00:17:13.540 | - Okay, let's talk about one of the most challenging things.
00:17:17.020 | One of the things I unfortunately am very afraid of,
00:17:20.820 | being human, allegedly.
00:17:23.420 | You wrote an essay on death and consciousness,
00:17:26.300 | in which you write a note,
00:17:28.300 | "Certainly the fear of death
00:17:29.940 | "has been one of the greatest driving forces
00:17:31.940 | "in the history of thought
00:17:33.380 | "and in the formation of the character of civilization,
00:17:37.300 | "and yet it is under-acknowledged.
00:17:39.740 | "The great book on the subject,
00:17:41.300 | "The Denial of Death by Ernest Becker,
00:17:43.140 | "deserves a reconsideration."
00:17:45.460 | I'm Russian, so I have to ask you about this.
00:17:49.820 | What's the role of death in life?
00:17:51.740 | - See, you would have enjoyed coming to our house,
00:17:54.660 | 'cause my wife is Russian,
00:17:57.180 | and we also have, - Awesome.
00:17:58.660 | - We have a piano of such spectacular qualities,
00:18:01.420 | you wouldn't, you would have freaked out.
00:18:03.700 | But anyway, we'll let all that go.
00:18:07.260 | So the context in which, I remember that essay,
00:18:12.020 | sort of, this was from maybe the '90s or something,
00:18:15.060 | and I used to publish in a journal
00:18:18.940 | called the Journal of Consciousness Studies,
00:18:20.580 | 'cause I was interested in these endless debates
00:18:24.220 | about consciousness and science,
00:18:26.460 | which certainly continue today.
00:18:30.700 | And I was interested in how
00:18:36.980 | the fear of death and the denial of death
00:18:40.420 | played into different philosophical approaches
00:18:43.140 | to consciousness.
00:18:44.780 | Because I think on the one hand,
00:18:49.780 | the sort of sentimental school of dualism,
00:18:58.540 | meaning the feeling that there's something apart
00:19:00.780 | from the physical brain,
00:19:02.180 | some kind of soul or something else,
00:19:05.060 | is obviously motivated in a sense by a hope
00:19:08.020 | that whatever that is will survive death and continue,
00:19:11.620 | and that's a very core aspect
00:19:14.260 | of a lot of the world religions.
00:19:15.460 | Not all of them, not really, but most of them.
00:19:19.420 | The thing I noticed is that the opposite of those,
00:19:26.260 | which might be the sort of hardcore,
00:19:28.980 | no, the brain's a computer and that's it,
00:19:31.380 | in a sense, we're motivated in the same way
00:19:36.340 | with a remarkably similar chain of arguments,
00:19:40.740 | which is, no, the brain's a computer
00:19:43.740 | and I'm gonna figure it out in my lifetime
00:19:45.620 | and upload myself and I'll live forever. (laughs)
00:19:49.580 | - That's interesting.
00:19:50.580 | Yeah, that's like the implied thought, right?
00:19:53.580 | - Yeah, and so it's kind of this,
00:19:55.940 | in a funny way, it's the same thing.
00:19:58.580 | It's peculiar to notice that these people
00:20:03.580 | who would appear to be opposites in character
00:20:09.620 | and cultural references and in their ideas
00:20:14.380 | actually are remarkably similar.
00:20:16.660 | And to an incredible degree,
00:20:20.340 | this sort of hardcore computationalist idea
00:20:24.620 | about the brain has turned into medieval Christianity.
00:20:29.620 | There are the people who are afraid
00:20:31.460 | that if you have the wrong thought,
00:20:32.540 | you'll piss off the super AIs of the future
00:20:34.740 | who will come back and zap you and all that stuff.
00:20:38.460 | It's really turned into medieval Christianity
00:20:41.740 | all over again.
00:20:43.140 | - This is so the Ernest Becker's idea
00:20:44.860 | that the fear of death is the worm at the core,
00:20:49.620 | which is like, that's the core motivator
00:20:53.700 | of everything we see humans have created.
00:20:56.860 | The question is if that fear of mortality
00:20:59.180 | is somehow core, is like a prerequisite.
00:21:02.200 | - So you just moved across this vast cultural chasm
00:21:08.340 | that separates me from most of my colleagues in a way,
00:21:13.220 | and I can't answer what you just said on the level
00:21:15.500 | without this huge deconstruction.
00:21:17.740 | - Yes. - Should I do it?
00:21:18.940 | - Yes, what's the chasm?
00:21:20.260 | - Okay.
00:21:21.340 | - Let us travel across this vast chasm.
00:21:23.140 | - Okay, I don't believe in AI.
00:21:25.020 | I don't think there's any AI.
00:21:26.220 | There's just algorithms, we make them, we control them.
00:21:28.420 | Now, they're tools, they're not creatures.
00:21:30.740 | Now, this is something that rubs a lot of people
00:21:33.300 | the wrong way, and don't I know it.
00:21:36.140 | When I was young, my main mentor was Marvin Minsky,
00:21:39.460 | who's the principal author of the computer
00:21:43.380 | as creature rhetoric that we still use.
00:21:47.020 | He was the first person to have the idea at all,
00:21:48.820 | but he certainly populated the AI culture
00:21:52.940 | with most of its tropes, I would say,
00:21:55.380 | 'cause a lot of the stuff people will say,
00:21:56.900 | "Oh, did you hear this new idea about AI?"
00:21:58.620 | And I'm like, "Yeah, I heard it in 1978, sure.
00:22:00.580 | "Yeah, I remember that."
00:22:01.940 | So Marvin was really the person,
00:22:03.660 | and Marvin and I used to argue all the time about this stuff
00:22:08.540 | 'cause I always rejected it, and of all of his,
00:22:16.060 | I wasn't formally his student,
00:22:17.780 | but I worked for him as a researcher,
00:22:19.780 | but of all of his students and student-like people
00:22:23.660 | of his young adoptees, I think I was the one
00:22:27.540 | who argued with him about this stuff in particular,
00:22:30.100 | and he loved it.
00:22:31.220 | - Yeah, I would've loved to hear that conversation.
00:22:33.180 | - It was fun.
00:22:34.140 | - Did you ever converge to a place?
00:22:36.700 | - Oh, no, no.
00:22:37.540 | So the very last time I saw him, he was quite frail,
00:22:40.260 | and I was in Boston, and I was going to the old house
00:22:45.180 | in Brookline, his amazing house,
00:22:47.380 | and one of our mutual friends said,
00:22:49.020 | "Hey, listen, Marvin's so frail.
00:22:51.940 | "Don't do the argument with him.
00:22:53.980 | "Don't argue about AI."
00:22:55.580 | - Yeah.
00:22:56.420 | - And so I said, "But Marvin loves that."
00:22:58.860 | And so I showed up, and he was frail,
00:23:01.540 | and he looked up, and he said, "Are you ready to argue?"
00:23:04.460 | (all laughing)
00:23:07.060 | - He's such an amazing person for that.
00:23:10.260 | - So it's hard to summarize this
00:23:13.900 | 'cause it's decades of stuff.
00:23:16.220 | The first thing to say is that nobody can claim
00:23:19.700 | absolute knowledge about whether somebody
00:23:23.140 | or something else is conscious or not.
00:23:25.820 | This is all a matter of faith, and in fact,
00:23:28.680 | I think the whole idea of faith needs to be updated,
00:23:32.900 | so it's not about God,
00:23:34.060 | but it's just about stuff in the universe.
00:23:36.180 | We have faith in each other being conscious,
00:23:39.380 | and then I used to frame this as a thing
00:23:42.540 | called the circle of empathy in my old papers,
00:23:45.340 | and then it turned into a thing
00:23:47.980 | for the animal rights movement, too.
00:23:49.140 | I noticed Peter Singer using it.
00:23:50.460 | I don't know if it was coincident,
00:23:52.500 | but anyway, there's this idea
00:23:54.500 | that you draw a circle around yourself,
00:23:56.140 | and the stuff inside is more like you,
00:23:58.260 | might be conscious, might be deserving of your empathy,
00:24:00.700 | of your consideration,
00:24:02.180 | and the stuff outside the circle isn't.
00:24:04.300 | And outside the circle might be a rock,
00:24:08.500 | or (laughs) I don't know.
00:24:12.120 | - And that circle is fundamentally based on faith.
00:24:15.480 | - Well, no, no. - Your faith
00:24:16.320 | in what is and what isn't.
00:24:18.000 | - The thing about the circle is it can't be pure faith.
00:24:21.440 | It's also a pragmatic decision,
00:24:23.880 | and this is where things get complicated.
00:24:26.040 | If you try to make it too big,
00:24:27.920 | you suffer from incompetence.
00:24:29.920 | If you say, "I don't wanna kill a bacteria.
00:24:33.280 | "I will not brush my teeth."
00:24:34.600 | I don't know, what do you do?
00:24:36.000 | There's a competence question
00:24:39.160 | where you do have to draw the line.
00:24:41.040 | People who make it too small become cruel.
00:24:44.440 | People are so clannish and political
00:24:46.440 | and so worried about themselves ending up
00:24:48.640 | on the bottom of society that they are always ready
00:24:52.400 | to gang up on some designated group,
00:24:54.240 | and so there's always these people
00:24:55.360 | who are being tried,
00:24:56.280 | we're always trying to shove somebody out of the circle.
00:24:58.840 | And so-- - So aren't you shoving AI
00:25:00.720 | outside the circle?
00:25:01.560 | - Well, give me a second. - All right.
00:25:02.760 | - So there's a pragmatic consideration here.
00:25:05.840 | And so, and the biggest questions are probably fetuses
00:25:10.600 | and animals lately, but AI is getting there.
00:25:13.400 | Now, with AI, I think,
00:25:16.920 | and I've had this discussion so many times,
00:25:21.560 | people say, "But aren't you afraid if you exclude AI,
00:25:23.680 | "you'd be cruel to some consciousness?"
00:25:26.320 | And then I would say, "Well, if you include AI,
00:25:29.440 | "you exclude yourself from being able
00:25:33.280 | "to be a good engineer or designer,
00:25:35.800 | "and so you're facing incompetence immediately."
00:25:38.760 | So I really think we need to subordinate algorithms
00:25:41.440 | and be much more skeptical of them.
00:25:43.600 | - Your intuition, you speak about this brilliantly
00:25:45.920 | with social media, how things can go wrong.
00:25:49.000 | Isn't it possible to design systems
00:25:54.000 | that show compassion, not to manipulate you,
00:25:57.760 | but give you control and make your life better
00:26:01.240 | if you so choose to?
00:26:02.760 | Grow together with systems,
00:26:04.040 | and the way we grow with dogs and cats,
00:26:06.360 | with pets, with significant others,
00:26:08.520 | in that way, they grow to become better people.
00:26:11.520 | I don't understand why that's fundamentally not possible.
00:26:14.440 | You're saying oftentimes you get into trouble
00:26:18.080 | by thinking you know what's good for people.
00:26:20.200 | - Well, look, there's this question
00:26:22.960 | of what frame we're speaking in.
00:26:25.600 | Do you know who Alan Watts was?
00:26:27.680 | So Alan Watts once said, "Morality is like gravity,
00:26:32.040 | "that in some absolute cosmic sense,
00:26:34.400 | "there can't be morality because at some point
00:26:36.360 | "it all becomes relative and who are we anyway?
00:26:39.240 | "Morality is relative to us tiny creatures."
00:26:42.120 | But here on Earth, we're with each other,
00:26:45.560 | this is our frame, and morality is a very real thing.
00:26:47.920 | Same thing with gravity.
00:26:48.760 | At some point, you get into interstellar space
00:26:52.120 | and you might not feel much of it,
00:26:54.000 | but here we are on Earth.
00:26:55.200 | And I think in the same sense,
00:26:57.000 | I think this identification with a frame
00:27:02.200 | that's quite remote cannot be separated
00:27:06.600 | from a feeling of wanting to feel sort of separate from
00:27:10.360 | and superior to other people or something like that.
00:27:12.920 | There's an impulse behind it that I really have to reject.
00:27:16.120 | And we're just not competent yet
00:27:18.440 | to talk about these kinds of absolutes.
00:27:21.000 | - Okay, so I agree with you that a lot of technologies
00:27:24.400 | sort of lack this basic respect,
00:27:27.000 | understanding, and love for humanity.
00:27:29.200 | There's a separation there.
00:27:30.680 | The thing I'd like to push back against,
00:27:32.440 | it's not that you disagree,
00:27:33.640 | but I believe you can create technologies
00:27:36.240 | and you can create a new kind of technologist, engineer,
00:27:41.160 | that does build systems that respect humanity,
00:27:44.560 | not just respect, but admire humanity,
00:27:46.920 | that have empathy for common humans, have compassion.
00:27:50.600 | - So, I mean, no, no, no.
00:27:52.600 | I think, yeah, I mean, I think musical instruments
00:27:57.320 | are a great example of that.
00:27:58.840 | Musical instruments are technologies
00:28:00.280 | that help people connect in fantastic ways.
00:28:02.280 | And that's a great example.
00:28:05.760 | My invention or design during the pandemic period
00:28:11.320 | was this thing called Together Mode,
00:28:12.520 | where people see themselves seated
00:28:14.120 | sort of in a classroom or a theater instead of in squares.
00:28:19.120 | And it allows them to semi-consciously perform to each other
00:28:25.320 | as if they have proper eye contact,
00:28:29.480 | as if they're paying attention to each other non-verbally.
00:28:31.640 | And weirdly, that turns out to work.
00:28:34.040 | And so it promotes empathy so far as I can tell.
00:28:37.000 | I hope it is of some use to somebody.
00:28:39.360 | The AI idea isn't really new.
00:28:43.120 | I would say it was born with Adam Smith's "Invisible Hand,"
00:28:47.160 | with this idea that we build this algorithmic thing
00:28:49.720 | and it gets a bit beyond us,
00:28:51.880 | and then we think it must be smarter than us.
00:28:53.840 | And the thing about the "Invisible Hand"
00:28:55.680 | is absolutely everybody has some line they draw
00:28:59.040 | where they say, "No, no, no,
00:29:00.000 | "we're gonna take control of this thing."
00:29:01.680 | They might have different lines,
00:29:03.200 | they might care about different things,
00:29:04.280 | but everybody ultimately became a Keynesian
00:29:06.880 | 'cause it just didn't work.
00:29:07.960 | It really wasn't that smart.
00:29:09.240 | It was sometimes smart and sometimes it failed.
00:29:11.600 | And so if you really, people who really, really, really
00:29:16.600 | wanna believe in the "Invisible Hand" is infinitely smart,
00:29:20.920 | screw up their economies terribly.
00:29:22.760 | You have to recognize the economy as a subservient tool.
00:29:27.440 | Everybody does when it's to their advantage.
00:29:29.920 | They might not when it's not to their advantage.
00:29:31.760 | That's kind of an interesting game that happens.
00:29:34.280 | But the thing is, it's just like that with our algorithms.
00:29:37.080 | Like you can have a sort of a Chicago economic philosophy
00:29:42.080 | about your computers.
00:29:45.880 | Say, "No, no, no, my thing's come alive.
00:29:47.240 | "It's smarter than anything."
00:29:49.120 | - I think that there is a deep loneliness within all of us.
00:29:52.880 | This is what we seek.
00:29:54.000 | We seek love from each other.
00:29:56.360 | I think AI can help us connect deeper.
00:29:59.640 | Like this is what you criticize social media for.
00:30:02.360 | I think there's much better ways of doing social media
00:30:04.920 | that doesn't lead to manipulation,
00:30:06.640 | but instead leads to deeper connection between humans,
00:30:09.600 | leads to you becoming a better human being.
00:30:12.000 | And what that requires is some agency on the part of AI
00:30:15.200 | to be almost like a therapist, I mean, a companion.
00:30:18.680 | It's not telling you what's right.
00:30:22.080 | It's not guiding you as if it's an all-knowing thing.
00:30:25.400 | It's just another companion that you can leave at any time.
00:30:28.840 | You have complete transparency and control over.
00:30:31.960 | There's a lot of mechanisms that you can have
00:30:34.320 | that are counter to how current social media operates
00:30:38.920 | that I think is subservient to humans,
00:30:41.560 | or no, deeply respects human beings,
00:30:46.120 | and is empathetic to their experience
00:30:47.800 | and all those kinds of things.
00:30:48.920 | I think it's possible to create AI systems like that.
00:30:51.680 | And I think they need, I mean, that's a technical discussion
00:30:54.680 | of whether they need to have something
00:30:56.680 | that looks like AI versus algorithms,
00:31:03.680 | something that has an identity,
00:31:05.680 | something that has a personality,
00:31:07.960 | all those kinds of things.
00:31:09.160 | AI systems, and you've spoken extensively
00:31:11.560 | how AI systems manipulate you within social networks.
00:31:16.560 | And that's the biggest problem.
00:31:19.680 | It isn't necessarily that there's advertisements,
00:31:24.680 | that social networks present you with advertisements
00:31:29.440 | that then get you to buy stuff.
00:31:31.240 | That's not the biggest problem.
00:31:32.400 | The biggest problem is they then manipulate you.
00:31:35.320 | They alter your human nature to get you to buy stuff,
00:31:41.360 | or to get you to do whatever the advertiser wants.
00:31:46.520 | Maybe you can correct me.
00:31:47.560 | - Yeah, I don't see it quite that way,
00:31:49.920 | but we can work with that as an approximation.
00:31:52.160 | - Sure, so my--
00:31:53.240 | - I think the actual thing is even sort of more ridiculous
00:31:55.960 | and stupider than that, but that's okay.
00:31:58.240 | - So my question is, let's not use the word AI,
00:32:02.520 | but how do we fix it?
00:32:05.400 | - Oh, fixing social media.
00:32:06.840 | That diverts us into this whole other field,
00:32:10.600 | in my view, which is economics,
00:32:12.640 | which I always thought was really boring,
00:32:14.280 | but we have no choice but to turn into economists
00:32:16.480 | who wanna fix this problem,
00:32:17.960 | 'cause it's all about incentives.
00:32:19.880 | But I've been around this thing since it started,
00:32:24.280 | and I've been in the meetings
00:32:27.240 | where the social media companies sell themselves
00:32:31.600 | to the people who put the most money into them,
00:32:33.880 | which are usually the big advertising holding companies
00:32:36.280 | and whatnot, and there's this idea
00:32:39.120 | that I think is kind of a fiction,
00:32:41.400 | and maybe it's even been recognized as that by everybody,
00:32:45.120 | that the algorithm will get really good
00:32:48.360 | at getting people to buy something,
00:32:49.760 | 'cause I think people have looked at their returns
00:32:52.120 | and looked at what happens,
00:32:53.200 | and everybody recognizes it's not exactly right.
00:32:56.320 | It's more like a cognitive access blackmail payment
00:33:01.320 | at this point.
00:33:03.560 | Like, just to be connected, you're paying the money.
00:33:06.040 | It's not so much that the persuasion algorithms.
00:33:08.640 | So Stanford renamed its program,
00:33:10.440 | but it used to be called Engage Persuade.
00:33:12.280 | The Engage part works.
00:33:13.560 | The Persuade part is iffy,
00:33:15.840 | but the thing is that once people are engaged,
00:33:18.920 | in order for you to exist as a business,
00:33:20.720 | in order for you to be known at all,
00:33:21.960 | you have to put money into the--
00:33:23.160 | - Oh, that's dark.
00:33:24.480 | - Oh, no, that's not--
00:33:25.320 | - It doesn't work, but they have to--
00:33:27.080 | - But they're still, it's a giant cognitive access
00:33:30.680 | blackmail scheme at this point.
00:33:32.560 | So, because the science behind the Persuade part,
00:33:35.280 | it's not entirely a failure,
00:33:39.640 | but it's not what,
00:33:42.800 | we play make believe that it works more than it does.
00:33:47.000 | The damage doesn't come, honestly, as I've said in my books,
00:33:51.320 | I'm not anti-advertising.
00:33:53.440 | I actually think advertising can be demeaning and annoying
00:33:58.280 | and banal and ridiculous
00:34:01.400 | and take up a lot of our time with stupid stuff.
00:34:04.320 | Like, there's a lot of ways to criticize advertising
00:34:07.040 | that's accurate, and it can also lie
00:34:10.360 | and all kinds of things.
00:34:11.520 | However, if I look at the biggest picture,
00:34:14.000 | I think advertising, at least as it was understood
00:34:17.160 | before social media, helped bring people into modernity
00:34:20.360 | in a way that overall actually did benefit people overall.
00:34:24.640 | And you might say, am I contradicting myself
00:34:27.600 | because I was saying you shouldn't manipulate people?
00:34:29.160 | Yeah, I am, probably here.
00:34:30.520 | I mean, I'm not pretending to have this perfect airtight
00:34:33.720 | worldview without some contradictions.
00:34:35.520 | I think there's a bit of a contradiction there, so.
00:34:37.960 | - Well, looking at the long arc of history,
00:34:39.400 | I think advertising has, in some parts, benefited society
00:34:43.920 | because it funded some efforts that perhaps benefited society.
00:34:46.720 | - Yeah, I mean, I think there's a thing where,
00:34:51.120 | sometimes I think it's actually been of some use.
00:34:53.960 | Now, where the damage comes is a different thing, though.
00:34:58.960 | Social media, algorithms on social media
00:35:03.400 | have to work on feedback loops
00:35:04.840 | where they present you with stimulus,
00:35:06.880 | they have to see if you respond to the stimulus.
00:35:09.080 | Now, the problem is that the measurement mechanism
00:35:12.560 | for telling if you respond in the engagement feedback loop
00:35:16.520 | is very, very crude.
00:35:17.680 | It's things like whether you click more
00:35:19.600 | or occasionally if you're staring at the screen more,
00:35:21.680 | if there's a forward-facing camera that's activated,
00:35:23.960 | but typically there isn't.
00:35:25.600 | So you have this incredibly crude
00:35:27.080 | back channel of information.
00:35:29.000 | And so it's crude enough that it only catches
00:35:32.680 | sort of the more dramatic responses from you,
00:35:35.800 | and those are the fight or flight responses.
00:35:37.760 | Those are the things where you get scared or pissed off
00:35:40.200 | or aggressive or horny.
00:35:43.480 | These are these ancient,
00:35:44.920 | the sort of what are sometimes called
00:35:46.200 | the lizard brain circuits or whatever,
00:35:48.080 | these fast response, old, old, old evolutionary business
00:35:54.160 | circuits that we have that are helpful in survival
00:35:58.320 | once in a while, but are not us at our best.
00:36:00.600 | They're not who we wanna be,
00:36:01.680 | they're not how we relate to each other.
00:36:03.520 | They're this old business.
00:36:05.120 | So then just when you're engaged using those
00:36:07.560 | intrinsically, totally aside from whatever the topic is,
00:36:11.080 | you start to get incrementally just a little bit
00:36:13.960 | more paranoid, xenophobic, aggressive,
00:36:16.920 | you get a little stupid and you become a jerk.
00:36:21.000 | And it happens slowly.
00:36:22.440 | It's not like everybody's instantly transformed,
00:36:26.120 | but it does kind of happen progressively
00:36:28.040 | where people who get hooked kind of get drawn
00:36:30.320 | more and more into this pattern of being at their worst.
00:36:33.680 | - Would you say that people are able to,
00:36:35.800 | when they get hooked in this way,
00:36:37.520 | look back at themselves from 30 days ago
00:36:40.360 | and say, "I am less happy with who I am now,"
00:36:45.160 | or, "I'm not happy with who I am now
00:36:47.120 | versus who I was 30 days ago."
00:36:48.800 | Are they able to self-reflect when you take yourself
00:36:51.440 | outside of the lizard brain?
00:36:52.640 | - Sometimes.
00:36:54.160 | I wrote a book about people, suggesting people
00:36:56.720 | take a break from their social media to see what happens
00:36:58.920 | and maybe even, actually the title of the book
00:37:01.800 | was just "Arguments to Delete Your Account."
00:37:04.200 | - Yeah, 10 arguments. - 10 arguments.
00:37:06.480 | Although I always said, "I don't know that you should.
00:37:08.480 | "I can give you the arguments, it's up to you."
00:37:10.320 | I'm always very clear about that.
00:37:11.800 | But I get, I don't have a social media account, obviously,
00:37:15.600 | and it's not that easy for people to reach me.
00:37:18.920 | They have to search out an old-fashioned email address
00:37:21.320 | on a super crappy, antiquated website.
00:37:23.960 | It's actually a bit, I don't make it easy.
00:37:26.160 | And even with that, I get this huge flood of mail
00:37:28.720 | from people who say, "Oh, I quit my social media.
00:37:30.560 | "I'm doing so much better.
00:37:31.400 | "I can't believe how bad it was."
00:37:33.280 | But the thing is, what's for me a huge flood of mail
00:37:36.080 | would be an imperceptible trickle
00:37:37.720 | from the perspective of Facebook, right?
00:37:39.960 | And so I think it's rare for somebody to look at themselves
00:37:44.560 | and say, "Oh boy, I sure screwed myself over."
00:37:46.560 | (laughs)
00:37:47.400 | - Well, I-- - It's a really hard thing
00:37:48.720 | to ask of somebody.
00:37:49.600 | None of us find that easy, right?
00:37:51.280 | - Well, the reason I-- - It's just hard.
00:37:52.600 | - The reason I ask this is,
00:37:54.320 | is it possible to design social media systems
00:37:58.160 | that optimize for some longer-term metrics
00:38:01.200 | of you being happy with yourself?
00:38:04.560 | - Well, see-- - Personal growth.
00:38:05.400 | - I don't think you should try to engineer
00:38:07.160 | personal growth or happiness.
00:38:08.360 | I think what you should do is design a system
00:38:10.760 | that's just respectful of the people
00:38:12.680 | and subordinates itself to the people
00:38:14.720 | and doesn't have perverse incentives.
00:38:16.800 | And then at least there's a chance
00:38:18.200 | of something decent happening.
00:38:19.040 | - But you have to recommend stuff, right?
00:38:22.080 | So you're saying, like, be respectful.
00:38:24.440 | What does that actually mean engineering-wise?
00:38:26.240 | - People, yeah, curation.
00:38:27.720 | People have to be responsible.
00:38:30.280 | Algorithms shouldn't be recommending.
00:38:31.680 | Algorithms don't understand enough to recommend.
00:38:33.480 | Algorithms are crap in this era.
00:38:35.320 | I mean, I'm sorry, they are.
00:38:37.040 | And I'm not saying this as somebody
00:38:38.400 | as a critic from the outside.
00:38:39.400 | I'm in the middle of it.
00:38:40.240 | I know what they can do.
00:38:41.120 | I know the math.
00:38:41.960 | I know what the corpora are.
00:38:43.440 | I know the best ones.
00:38:46.960 | Our office is funding GPT-3 and all these things
00:38:49.840 | that are at the edge of what's possible.
00:38:53.480 | And they do not have yet.
00:38:57.440 | I mean, it still is statistical emergent pseudo-semantics.
00:39:02.120 | It doesn't actually have deep representation
00:39:04.160 | emerging of anything.
00:39:05.120 | It's just not, like, I mean,
00:39:06.560 | I'm speaking the truth here and you know it.
00:39:08.600 | - Well, let me push back on this.
00:39:10.800 | There's several truths here.
00:39:13.080 | So you're speaking to the way
00:39:15.040 | certain companies operate currently.
00:39:17.040 | I don't think it's outside the realm
00:39:18.880 | of what's technically feasible to do.
00:39:21.760 | There's just not incentive.
00:39:22.760 | Like, companies are not, why fix this thing?
00:39:26.120 | I am aware that, for example,
00:39:29.080 | the YouTube search and discovery
00:39:31.120 | has been very helpful to me.
00:39:33.480 | And there's a huge number of,
00:39:35.920 | there's so many videos that it's nice
00:39:38.440 | to have a little bit of help.
00:39:39.760 | - Have you done-- - But I'm still in control.
00:39:41.480 | - Let me ask you something.
00:39:42.320 | Have you done the experiment of letting YouTube
00:39:45.120 | recommend videos to you,
00:39:46.480 | either starting from a absolutely anonymous random place
00:39:50.440 | where it doesn't know who you are,
00:39:51.520 | or from knowing who you or somebody else is,
00:39:53.520 | and then going 15 or 20 hops?
00:39:55.360 | Have you ever done that and just let it go,
00:39:57.560 | top video recommend and then just go 20 hops?
00:40:00.200 | - No, I've not.
00:40:01.040 | - I've done that many times now.
00:40:03.160 | I have, because of how large YouTube is
00:40:06.000 | and how widely it's used,
00:40:07.600 | it's very hard to get to enough scale
00:40:10.760 | to get a statistically solid result on this.
00:40:14.360 | I've done it with high school kids,
00:40:15.880 | with dozens of kids doing it at a time.
00:40:18.400 | Every time I've done an experiment,
00:40:20.240 | the majority of times, after about 17 or 18 hops,
00:40:23.520 | you end up in really weird, paranoid, bizarre territory.
00:40:27.280 | Because ultimately, that is the stuff
00:40:29.560 | the algorithm rewards the most,
00:40:30.960 | because of the feedback crudeness
00:40:32.440 | I was just talking about.
00:40:34.040 | So, I'm not saying that the video
00:40:36.840 | never recommends something cool.
00:40:38.360 | I'm saying that its fundamental core
00:40:40.480 | is one that promotes a paranoid style,
00:40:43.720 | that promotes increasing irritability,
00:40:46.360 | that promotes xenophobia, promotes fear, anger,
00:40:50.120 | promotes selfishness, promotes separation between people.
00:40:54.320 | And I would, the thing is,
00:40:55.720 | it's very hard to do this work solidly.
00:40:58.200 | Many have repeated this experiment,
00:40:59.960 | and yet, it still is kind of anecdotal.
00:41:02.200 | I'd like to do a large citizen science thing sometime
00:41:05.560 | and do it, but then I think the problem with that
00:41:07.320 | is YouTube would detect it and then change it.
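
[For concreteness, a minimal sketch of the hop experiment described above: start from a seed video, repeatedly take the top recommendation, and record the chain. The `get_top_recommendation` helper is hypothetical — it stands in for however the #1 recommendation is obtained (manual clicking, a scraper, logged data) and is not a real YouTube API call.]

```python
from typing import Callable, List

def follow_recommendations(
    seed_video_id: str,
    get_top_recommendation: Callable[[str], str],  # hypothetical fetcher
    hops: int = 20,
) -> List[str]:
    """Return the chain of video IDs reached by always taking the top pick."""
    chain = [seed_video_id]
    current = seed_video_id
    for _ in range(hops):
        current = get_top_recommendation(current)
        chain.append(current)
    return chain

# Usage with a stubbed fetcher so the sketch runs standalone:
if __name__ == "__main__":
    fake_graph = {"seed": "a", "a": "b", "b": "c", "c": "c"}
    chain = follow_recommendations("seed", lambda v: fake_graph.get(v, v), hops=5)
    print(chain)  # ['seed', 'a', 'b', 'c', 'c', 'c']
```
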
00:41:09.480 | - Well, yes, I definitely,
00:41:10.640 | I love that kind of stuff in Twitter.
00:41:12.120 | So, Jack Dorsey has spoken about
00:41:14.160 | doing healthy conversations on Twitter,
00:41:16.800 | or optimizing for healthy conversations.
00:41:19.120 | What that requires within Twitter
00:41:20.680 | are most likely citizen experiments
00:41:24.040 | of what healthy conversations actually look like,
00:41:27.040 | and how do you incentivize those healthy conversations?
00:41:30.040 | You're describing what often happens
00:41:33.040 | and what is currently happening.
00:41:34.720 | What I'd like to argue is it's possible
00:41:36.760 | to strive for healthy conversations,
00:41:39.760 | not in a dogmatic way of saying,
00:41:42.760 | I know what healthy conversations are,
00:41:44.680 | and I will tell you.
00:41:45.520 | - I think one way to do this
00:41:46.920 | is to try to look around at social,
00:41:49.880 | maybe not things that are officially social media,
00:41:51.880 | but things where people are together online
00:41:54.040 | and see which ones have more healthy conversations,
00:41:56.800 | even if it's hard to be completely objective
00:42:00.600 | in that measurement, you can kind of,
00:42:02.160 | at least crudely disagree.
00:42:03.000 | - You could do subjective annotation,
00:42:05.880 | like have a large crowd-sourced annotation.
00:42:07.760 | - Yeah, one that I've been really interested in is GitHub,
00:42:10.840 | 'cause it could change, I'm not saying it'll always be,
00:42:16.080 | but for the most part,
00:42:17.920 | GitHub has had a relatively quite low poison quotient,
00:42:22.280 | and I think there's a few things about GitHub
00:42:25.360 | that are interesting.
00:42:26.640 | One thing about it is that people have a stake in it.
00:42:29.560 | It's not just empty status games.
00:42:31.880 | There's actual code, or there's actual stuff being done,
00:42:35.160 | and I think as soon as you have
00:42:36.520 | a real-world stake in something,
00:42:38.280 | you have a motivation to not screw up that thing,
00:42:42.680 | and I think that that's often missing,
00:42:45.520 | that there's no incentive for the person
00:42:48.240 | to really preserve something
00:42:49.640 | if they get a little bit of attention
00:42:51.560 | from dumping on somebody's TikTok or something.
00:42:56.000 | They don't pay any price for it,
00:42:56.960 | but you have to kind of get decent with people
00:43:00.760 | when you have a shared stake, a little secret,
00:43:03.160 | so GitHub does a bit of that.
00:43:05.160 | - GitHub is wonderful, yes,
00:43:08.640 | but I'm tempted to play the Jaron back at you,
00:43:13.320 | which is that GitHub is currently amazing,
00:43:16.440 | but the thing is, if you have a stake,
00:43:18.400 | then if it's a social media platform,
00:43:20.480 | they can use the fact that you have a stake
00:43:22.520 | to manipulate you because you wanna preserve the stake.
00:43:25.280 | - Right, well, this is why,
00:43:27.400 | this gets us into the economics,
00:43:29.240 | so there's this thing called data dignity
00:43:30.880 | that I've been studying for a long time.
00:43:33.000 | I wrote a book about an earlier version of it
00:43:34.920 | called "Who Owns the Future?"
00:43:37.000 | And the basic idea of it is that,
00:43:39.960 | once again, this is a 30-year conversation.
00:43:43.440 | - It's a fascinating topic.
00:43:44.280 | - Let me do the fastest version of this I can do.
00:43:46.800 | The fastest way I know how to do this
00:43:48.800 | is to compare two futures, all right?
00:43:51.960 | So future one is then the normative one,
00:43:55.600 | the one we're building right now,
00:43:56.880 | and future two is gonna be data dignity, okay?
00:44:00.240 | And I'm gonna use a particular population.
00:44:03.120 | I live on the Hill in Berkeley,
00:44:05.320 | and one of the features about the Hill
00:44:06.960 | is that as the climate changes, we might burn down
00:44:09.360 | and all lose our houses or die or something.
00:44:11.480 | Like, it's dangerous, you know, and it didn't used to be.
00:44:14.240 | And so who keeps us alive?
00:44:17.000 | Well, the city does.
00:44:18.360 | The city does some things.
00:44:19.520 | The electric company, kind of, sort of,
00:44:21.440 | maybe, hopefully, better.
00:44:23.440 | Individual people who own property
00:44:26.000 | take care of their property.
00:44:26.960 | That's all nice, but there's this other middle layer,
00:44:29.280 | which is fascinating to me,
00:44:30.960 | which is that the groundskeepers
00:44:33.560 | who work up and down that Hill,
00:44:35.320 | many of whom are not legally here,
00:44:38.680 | many of whom don't speak English,
00:44:40.520 | cooperate with each other to make sure trees don't touch
00:44:44.320 | to transfer fire easily from lot to lot.
00:44:46.640 | They have this whole little web that's keeping us safe.
00:44:49.160 | I didn't know about this at first.
00:44:50.480 | I just started talking to them
00:44:52.480 | 'cause they were out there during the pandemic.
00:44:54.320 | And so I'd try to just see who are these people?
00:44:56.800 | Who are these people who are keeping us alive?
00:44:59.320 | Now, I wanna talk about the two different fates
00:45:01.480 | for those people under Future One and Future Two.
00:45:04.880 | Future One, some weird, like, kindergarten paint job van
00:45:09.880 | with all these, like, cameras and weird things,
00:45:11.960 | drives up, observes what the gardeners
00:45:13.760 | and groundskeepers are doing.
00:45:15.600 | A few years later, some amazing robots
00:45:18.200 | that can shimmy up trees and all this
00:45:20.040 | show up, all those people are out of work,
00:45:21.600 | and there are these robots doing the thing.
00:45:23.160 | And the robots are good, and they can scale to more land.
00:45:26.400 | And they're actually good,
00:45:28.480 | but then there are all these people out of work.
00:45:29.840 | And these people have lost dignity.
00:45:31.360 | They don't know what they're gonna do.
00:45:32.960 | And then some people say,
00:45:34.280 | "Well, they go on basic income, whatever.
00:45:35.760 | They become wards of the state."
00:45:39.040 | My problem with that solution is every time in history
00:45:42.620 | that you've had some centralized thing
00:45:44.360 | that's doling out the benefits,
00:45:45.640 | that thing gets seized by people
00:45:47.280 | because it's too centralized and it gets seized.
00:45:49.560 | That's happened to every communist experiment I can find.
00:45:52.360 | So I think that turns into a poor future
00:45:56.040 | that will be unstable.
00:45:57.720 | I don't think people will feel good in it.
00:45:59.240 | I think it'll be a political disaster
00:46:01.460 | with a sequence of people seizing
00:46:03.080 | this central source of the basic income.
00:46:06.800 | And you'll say, "Oh, no, an algorithm can do it."
00:46:08.400 | Then people will seize the algorithm.
00:46:09.720 | They'll seize control.
00:46:11.240 | - Unless the algorithm is decentralized
00:46:13.560 | and it's impossible to seize the control.
00:46:15.600 | - Yeah, but-- - Very difficult.
00:46:18.160 | - 60-something people own a quarter of all the Bitcoin.
00:46:22.440 | Like the things that we think are decentralized
00:46:24.160 | are not decentralized.
00:46:25.960 | So let's go to future two.
00:46:27.840 | Future two, the gardeners, the groundskeepers,
00:46:32.480 | see that van with all the cameras
00:46:33.680 | and the kindergarten paint job,
00:46:35.440 | and they say, "Hey, the robots are coming.
00:46:37.680 | We're gonna form a data union."
00:46:39.000 | And amazingly, California has a little baby data union law
00:46:43.360 | emerging on the books.
00:46:44.200 | - Interesting, that's interesting.
00:46:45.680 | - And so what they say is,
00:46:48.520 | "We're gonna form a data union,
00:46:52.600 | and we're gonna, not only are we gonna sell our data
00:46:55.880 | to this place, but we're gonna make it better
00:46:57.280 | than it would have been if they were just grabbing it
00:46:58.840 | without our cooperation.
00:47:00.160 | And we're gonna improve it.
00:47:01.840 | We're gonna make the robots more effective.
00:47:03.400 | We're gonna make them better,
00:47:04.240 | and we're gonna be proud of it.
00:47:05.360 | We're gonna become a new class of experts
00:47:08.480 | that are respected."
00:47:09.920 | And then here's the interesting,
00:47:11.760 | there's two things that are different about that world
00:47:14.560 | from future one.
00:47:15.680 | One thing, of course, the people have more pride.
00:47:17.680 | They have more sense of ownership, of agency,
00:47:22.280 | but what the robots do changes.
00:47:27.280 | Instead of just like this functional,
00:47:30.040 | like, we'll figure out how to keep the neighborhood
00:47:31.640 | from burning down, you have this whole creative community
00:47:35.400 | that wasn't there before thinking,
00:47:36.560 | well, how can we make these robots better
00:47:38.080 | so we can keep on earning money?
00:47:39.760 | There'll be waves of creative groundskeeping
00:47:44.360 | with spiral pumpkin patches and waves of cultural things.
00:47:48.080 | There'll be new ideas like,
00:47:49.560 | wow, I wonder if we could do something
00:47:51.680 | about climate change mitigation with how we do this.
00:47:54.560 | What about fresh water?
00:47:56.520 | Can we make the food healthier?
00:47:59.280 | What about, all of a sudden,
00:48:00.560 | there'll be this whole creative community on the case.
00:48:03.400 | And isn't it nicer to have a high-tech future
00:48:06.240 | with more creative classes
00:48:07.680 | than one with more dependent classes?
00:48:09.320 | Isn't that a better future?
00:48:10.600 | But, but, but, but, but,
00:48:12.600 | future one and future two have the same robots
00:48:16.520 | and the same algorithms.
00:48:17.680 | There's no technological difference.
00:48:19.480 | There's only a human difference.
00:48:21.720 | And that second future, future two, that's data dignity.
00:48:24.440 | - The economy that you're, I mean,
00:48:27.360 | the game theory here is on the humans,
00:48:29.320 | and then the technology is just the tools
00:48:31.840 | that enable both possibilities.
00:48:32.680 | - Yeah, you know, I mean, I think you can believe in AI
00:48:36.480 | and be in future two.
00:48:37.680 | I just think it's a little harder.
00:48:39.480 | You have to do more contortions.
00:48:42.600 | It's possible.
00:48:43.440 | - So in the case of social media,
00:48:46.080 | what does data dignity look like?
00:48:49.200 | Is it people getting paid for their data?
00:48:51.480 | - Yeah, I think what should happen is in the future,
00:48:55.360 | there should be massive data unions
00:48:58.320 | for people putting content into the system.
00:49:04.000 | And those data unions should smooth out the results
00:49:06.440 | a little bit so it's not winner-take-all.
00:49:08.640 | But at the same time, and people have to pay for it too.
00:49:11.680 | They have to pay for Facebook the way they pay for Netflix
00:49:14.920 | with an allowance for the poor.
00:49:17.440 | There has to be a way out too.
00:49:20.320 | But the thing is, people do pay for Netflix.
00:49:22.240 | It's a going concern.
00:49:23.480 | People pay for Xbox and PlayStation.
00:49:26.280 | Like people, there's enough people to pay
00:49:28.200 | for stuff they want.
00:49:29.040 | This could happen too.
00:49:29.880 | It's just that this precedent started
00:49:31.360 | that moved it in the wrong direction.
00:49:33.120 | And then what has to happen,
00:49:35.080 | the economy's a measuring device.
00:49:38.040 | So if it's an honest measuring device,
00:49:40.920 | the outcomes for people form a normal distribution,
00:49:44.320 | a bell curve.
00:49:45.440 | And then so there should be a few people
00:49:47.000 | who do really well, a lot of people who do okay.
00:49:49.400 | And then we should have an expanding economy
00:49:51.480 | reflecting more and more creativity and expertise
00:49:54.680 | flowing through the network.
00:49:56.360 | And that expanding economy moves the result
00:49:58.680 | just a bit forward.
00:49:59.520 | So more people are getting money out of it
00:50:01.720 | than are putting money into it.
00:50:02.920 | So it gradually expands the economy and lifts all boats.
00:50:05.600 | And the society has to support the lower wing
00:50:09.400 | of the bell curve too, but not universal basic income.
00:50:12.080 | It has to be for the, you know,
00:50:14.280 | 'cause if it's an honest economy,
00:50:16.720 | there will be that lower wing
00:50:19.200 | and we have to support those people.
00:50:20.800 | There has to be a safety net.
00:50:22.840 | But see what I believe, I'm not gonna talk about AI,
00:50:26.040 | but I will say that I think there'll be more
00:50:29.120 | and more algorithms that are useful.
00:50:31.160 | And so I don't think everybody's gonna be supplying data
00:50:34.800 | to groundskeeping robots,
00:50:36.160 | nor do I think everybody's gonna make their living
00:50:38.040 | with TikTok videos.
00:50:38.880 | I think in both cases, there'll be a rather small contingent
00:50:42.840 | that do well enough at either of those things.
00:50:45.280 | But I think there might be many, many, many,
00:50:48.080 | many of those niches that start to evolve
00:50:50.000 | as there are more and more algorithms,
00:50:51.160 | more and more robots.
00:50:52.200 | And it's that large number
00:50:54.680 | that will create the economic potential
00:50:56.680 | for a very large part of society
00:50:58.640 | to become members of new creative classes.
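To make the data-union payout idea above a bit more concrete, here is a minimal Python sketch of the "smooth out the results so it's not winner-take-all" step. The member names, amounts, and the simple blending rule are illustrative assumptions, not anything specified in the conversation.

```python
# A minimal sketch, assuming a union that collects per-member earnings from
# an algorithm or platform and redistributes them with mild smoothing,
# so the outcome stays market-shaped but not winner-take-all.

def smooth_payouts(raw_earnings: dict[str, float], smoothing: float = 0.3) -> dict[str, float]:
    """Blend each member's raw earnings with the union-wide average.

    smoothing = 0.0 -> pure market outcome; smoothing = 1.0 -> fully flattened.
    The total paid out is unchanged; only its distribution shifts.
    """
    average = sum(raw_earnings.values()) / len(raw_earnings)
    return {
        member: (1 - smoothing) * earned + smoothing * average
        for member, earned in raw_earnings.items()
    }

# Hypothetical example: three groundskeepers supplying data to tree-trimming robots.
raw = {"ana": 900.0, "bo": 250.0, "cruz": 50.0}
print(smooth_payouts(raw))  # high earners still earn more, but the floor rises
```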
00:51:01.640 | - Do you think it's possible to create a social network
00:51:06.320 | that competes with Twitter and Facebook
00:51:07.960 | that's large and centralized in this way?
00:51:10.080 | Not centralized, sort of large, large.
00:51:12.200 | - How do we get there? All right,
00:51:13.120 | so I gotta tell you,
00:51:16.600 | how to get from where we are to anything kind of in the zone
00:51:19.440 | of what I'm talking about is challenging.
00:51:22.400 | I know some of the people who run,
00:51:25.960 | like I know Jack Dorsey,
00:51:27.720 | I view Jack as somebody who's actually,
00:51:32.440 | I think he's really striving and searching
00:51:37.000 | and trying to find a way to make it better,
00:51:39.160 | but it's kind of like,
00:51:42.400 | it's very hard to do it while in flight.
00:51:44.240 | And he's under enormous business pressure too.
00:51:46.520 | - So Jack Dorsey to me is a fascinating study
00:51:49.640 | because I think his mind is in a lot of good places.
00:51:52.680 | He's a good human being,
00:51:54.560 | but there's a big Titanic ship
00:51:56.480 | that's already moving in one direction.
00:51:57.960 | It's hard to know what to do with it.
00:51:59.200 | - I think that's the story of Twitter.
00:52:00.920 | I think that's the story of Twitter.
00:52:02.720 | One of the things that I observed
00:52:04.240 | is that if you just wanna look at the human side,
00:52:06.560 | meaning like how are people being changed?
00:52:08.760 | How do they feel?
00:52:09.600 | What does the culture like?
00:52:11.480 | Almost all of the social media platforms that get big
00:52:15.920 | have an initial sort of honeymoon period
00:52:18.040 | where they're actually kind of sweet and cute.
00:52:20.320 | Like if you look at the early years of Twitter,
00:52:22.320 | it was really sweet and cute,
00:52:23.760 | but also look at Snap, TikTok.
00:52:27.480 | And then what happens is as they scale
00:52:30.400 | and the algorithms become more influential
00:52:32.720 | instead of just the early people,
00:52:34.080 | when it gets big enough that it's the algorithm running it,
00:52:36.920 | then you start to see the rise of the paranoid style
00:52:39.600 | and then they start to get dark.
00:52:40.760 | And we've seen that shift in TikTok rather recently.
00:52:43.800 | - But I feel like that scaling reveals the flaws
00:52:48.720 | within the incentives.
00:52:50.160 | - I feel like I'm torturing you.
00:52:52.880 | I'm sorry.
00:52:53.720 | - No, it's not torturing.
00:52:54.560 | No, because I have hope for the world with humans
00:52:59.560 | and I have hope for a lot of things that humans create,
00:53:02.800 | including technology.
00:53:04.320 | And I just, I feel it is possible
00:53:06.800 | to create social media platforms
00:53:08.960 | that incentivize different things than the current.
00:53:13.360 | I think the current incentivization
00:53:15.760 | is around like the dumbest possible thing
00:53:18.040 | that was invented like 20 years ago, however long.
00:53:21.720 | And it just works and so nobody's changing it.
00:53:24.080 | I just think that there could be a lot of innovation
00:53:26.600 | for more, see, you kind of push back this idea
00:53:29.480 | that we can't know what long-term growth or happiness is.
00:53:33.640 | If you give control to people
00:53:35.600 | to define what their long-term happiness and goals are,
00:53:39.360 | then that optimization can happen
00:53:42.440 | for each of those individual people.
00:53:44.240 | - Well, I mean, imagine a future
00:53:50.240 | where probably a lot of people would love
00:53:55.240 | to make their living doing TikTok dance videos,
00:53:59.920 | but people recognize generally
00:54:02.080 | that's kind of hard to get into.
00:54:03.880 | Nonetheless, dance crews have an experience
00:54:07.600 | that's very similar to programmers
00:54:09.640 | working together on GitHub.
00:54:10.920 | So the future is like a cross between TikTok and GitHub
00:54:14.120 | and they get together and they have their,
00:54:17.440 | they have rights.
00:54:18.280 | They're negotiating, they're negotiating for returns.
00:54:21.360 | They join different artist societies
00:54:23.680 | in order to soften the blow of the randomness
00:54:26.880 | of who gets the network effect benefit
00:54:29.200 | 'cause nobody can know that.
00:54:31.040 | And I think an individual person
00:54:35.520 | might join a thousand different data unions
00:54:37.800 | in the course of their lives, or maybe even 10,000.
00:54:40.240 | I don't know, but the point is
00:54:41.560 | that we'll have like these very hedged,
00:54:44.200 | distributed portfolios of different data unions
00:54:46.760 | we're part of.
00:54:47.880 | And some of them might just trickle in a little money
00:54:50.160 | for nonsense stuff where we're contributing
00:54:52.880 | to health studies or something.
00:54:54.960 | But I think people will find their way.
00:54:57.000 | They'll find their way to the right GitHub-like community
00:55:00.360 | in which they find their value
00:55:02.680 | in the context of supplying inputs and data
00:55:06.360 | and taste and correctives and all of this
00:55:09.880 | into the algorithms and the robots of the future.
00:55:14.760 | - And that is a way to resist
00:55:17.040 | the lizard brain-based funding mechanisms.
00:55:22.040 | - It's an alternate economic system
00:55:25.280 | that rewards productivity, creativity,
00:55:28.800 | value as perceived by others.
00:55:30.400 | It's a genuine market.
00:55:31.480 | It's not doled out from a center.
00:55:33.080 | There's not some communist person deciding who's valuable.
00:55:36.320 | It's actual market.
00:55:37.480 | And the money is made by supporting that
00:55:43.320 | instead of just grabbing people's attention
00:55:46.400 | in the cheapest possible way,
00:55:47.640 | which is definitely how you get the lizard brain.
00:55:49.840 | - Yeah.
00:55:50.840 | Okay, so we're finally at the agreement.
00:55:53.160 | (laughing)
00:55:55.720 | But I just think that...
00:55:57.760 | (laughing)
00:55:59.280 | So yeah, I'll tell you how I think to fix social media.
00:56:03.080 | There's a few things.
00:56:05.480 | So one, I think people should have complete control
00:56:08.000 | over their data and transparency of what that data is
00:56:11.720 | and how it's being used if they do hand over the control.
00:56:14.760 | Another thing, they should be able to delete,
00:56:16.520 | walk away with their data at any moment, easy,
00:56:19.720 | like with a single click of a button, maybe two buttons.
00:56:22.160 | I don't know.
00:56:23.000 | Just easily walk away with their data.
00:56:25.200 | The other is control of the algorithm,
00:56:28.120 | individualized control of the algorithm for them.
00:56:31.240 | So each one has their own algorithm.
00:56:33.480 | Each person has their own algorithm.
00:56:34.880 | They get to be the decider of what they see in this world.
00:56:39.120 | And to me, that's, I guess, fundamentally decentralized
00:56:43.640 | in terms of the key decisions being made.
00:56:46.120 | But if that's made transparent,
00:56:47.480 | I feel like people will choose that system
00:56:50.120 | over Twitter of today, over Facebook of today,
00:56:53.560 | when they have the ability to walk away,
00:56:55.280 | to control their data,
00:56:56.720 | and to control the kinds of thing they see.
00:56:59.040 | Now, let's walk away from the term AI.
00:57:01.640 | You're right.
00:57:03.000 | In this case, you have full control of the algorithms
00:57:07.080 | that help you if you want to use their help.
00:57:10.400 | But you can also say F you to those algorithms
00:57:12.880 | and just consume the raw, beautiful waterfall
00:57:17.000 | of the internet.
00:57:19.440 | I think that, to me, that's not only fixes social media,
00:57:23.520 | but I think it would make a lot more money.
00:57:25.400 | So I would like to challenge the idea.
00:57:27.000 | I know you're not presenting that,
00:57:28.360 | but that the only way to make a ton of money
00:57:31.840 | is to operate the way Facebook does.
00:57:33.840 | I think you can make more money by giving people control.
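A rough sketch of the individualized-algorithm idea just described, assuming each user owns and can edit their own ranking function, or switch it off entirely for the raw chronological feed. The post fields and default weights below are hypothetical, chosen only to show the shape of such a design.

```python
# A minimal sketch: the user's feed algorithm is a plain, inspectable object
# they control, not a hidden engagement optimizer.

import time
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    timestamp: float        # seconds since epoch
    from_followed: bool     # whether the viewer follows the author

@dataclass
class UserAlgorithm:
    # The user edits these weights directly; nothing is hidden from them.
    weights: dict = field(default_factory=lambda: {"recency": 1.0, "followed": 2.0})
    enabled: bool = True    # False -> raw chronological "waterfall"

    def score(self, post: Post, now: float) -> float:
        hours_old = (now - post.timestamp) / 3600
        return (self.weights["recency"] * -hours_old
                + self.weights["followed"] * float(post.from_followed))

    def rank(self, posts: list[Post], now: float) -> list[Post]:
        if not self.enabled:
            return sorted(posts, key=lambda p: p.timestamp, reverse=True)
        return sorted(posts, key=lambda p: self.score(p, now), reverse=True)

# Hypothetical usage:
now = time.time()
posts = [Post("alice", "long essay", now - 7200, True), Post("bob", "ad", now - 60, False)]
feed = UserAlgorithm().rank(posts, now)
```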
00:57:37.680 | - Yeah, I mean, I certainly believe that.
00:57:39.880 | We're definitely in the territory
00:57:41.600 | of wholehearted agreement here.
00:57:45.240 | I do want to caution against one thing,
00:57:48.600 | which is making a future that benefits programmers
00:57:52.560 | versus people, like this idea
00:57:53.760 | that people are in control of their data.
00:57:55.280 | So years ago, I co-founded an advisory board for the EU
00:57:59.800 | with a guy named Giovanni Buttarelli who passed away.
00:58:02.120 | It's one of the reasons I wanted to mention it.
00:58:03.480 | A remarkable guy who'd been,
00:58:06.120 | he was originally a prosecutor
00:58:07.720 | who was throwing mafioso in jail in Sicily.
00:58:12.040 | So he was like this intense guy who was like,
00:58:15.000 | I've dealt with death threats,
00:58:17.160 | Mark Zuckerberg doesn't scare me, whatever.
00:58:19.000 | So we worked on this path of saying,
00:58:22.200 | let's make it all about transparency and consent.
00:58:24.240 | And it was one of the threads that led to this huge
00:58:26.960 | data privacy and protection framework in Europe
00:58:32.520 | called the GDPR.
00:58:34.000 | And so therefore, we've been able to have empirical feedback
00:58:38.280 | on how that goes.
00:58:39.120 | And the problem is that most people actually get stymied
00:58:44.000 | by the complexity of that kind of management.
00:58:46.880 | They have trouble, and reasonably so.
00:58:49.720 | I don't, I'm like a techie.
00:58:51.520 | I can go in and I can figure out what's going on.
00:58:54.480 | But most people really do.
00:58:56.800 | And so there's a problem that it differentially benefits
00:59:02.800 | those who kind of have a technical mindset
00:59:05.520 | and can go in and sort of have a feeling
00:59:07.240 | for how this stuff works.
00:59:09.000 | I kind of still want to come back to incentives.
00:59:11.520 | And so if the incentive for whoever's,
00:59:15.080 | if the commercial incentive is to help the creative people
00:59:17.600 | of the future make more money,
00:59:18.640 | 'cause you get a cut of it,
00:59:20.440 | that's how you grow an economy.
00:59:22.280 | - Not the programmers.
00:59:24.080 | - Well, some of them will be programmers.
00:59:25.640 | It's not anti-programmer.
00:59:26.720 | I'm just saying that it's not only programmers.
00:59:30.840 | - So, yeah, you have to make sure the incentives are right.
00:59:35.640 | I mean, I think control is an interface problem,
00:59:40.520 | where you have to create something
00:59:41.760 | that's compelling to everybody,
00:59:45.120 | to the creatives, to the public.
00:59:48.120 | I mean, there's, I don't know, Creative Commons,
00:59:52.040 | like the licensing.
00:59:53.400 | There's a bunch of legal speak,
00:59:57.320 | just in general, the whole legal profession.
01:00:00.360 | It's nice when it can be simplified
01:00:01.880 | in a way that you can truly, simply understand,
01:00:04.000 | so everybody can simply understand the basics.
01:00:07.880 | In the same way, it should be very simple to understand
01:00:11.600 | how the data is being used
01:00:14.920 | and what data is being used for people.
01:00:17.600 | But then you're arguing that in order for that to happen,
01:00:20.600 | you have to have the incentives aligned.
01:00:22.480 | - I mean, a lot of the reason that money works
01:00:26.640 | is actually information hiding and information loss.
01:00:30.280 | Like, one of the things about money
01:00:32.200 | is a particular dollar you get
01:00:34.400 | might have passed through your enemy's hands
01:00:36.480 | and you don't know it.
01:00:37.800 | But also, I mean, this is what Adam Smith,
01:00:40.400 | if you wanna give the most charitable interpretation possible
01:00:43.520 | to the invisible hand, is what he was saying,
01:00:46.000 | is that there's this whole complicated thing,
01:00:48.600 | and not only do you not need to know about it,
01:00:50.640 | the truth is you'd never be able to follow it if you tried.
01:00:52.880 | And it's like, let the economic incentives
01:00:55.880 | solve for this whole thing.
01:00:57.720 | And that, in a sense, every transaction's like a neuron
01:01:01.840 | in a neural net.
01:01:02.680 | If he'd had that metaphor, he would have used it.
01:01:05.680 | And let the whole thing settle to a solution
01:01:08.080 | and don't worry about it.
01:01:09.760 | I think this idea of having incentives
01:01:13.680 | that reduce complexity for people can be made to work.
01:01:17.360 | And that's an example of an algorithm
01:01:19.240 | that could be manipulative or not,
01:01:20.600 | going back to your question before about,
01:01:22.000 | can you do it in a way that's not manipulative?
01:01:24.480 | And I would say a GitHub-like, if you just have this vision,
01:01:29.280 | GitHub plus TikTok combined, is it possible?
01:01:33.320 | I think it is.
01:01:34.400 | I really think it is. - I'm not gonna be able
01:01:35.640 | to unsee that idea of creatives on TikTok
01:01:39.920 | collaborating in the same way
01:01:41.040 | that people on GitHub collaborate.
01:01:42.760 | - Why not? - I like that kind of version.
01:01:44.520 | - Why not? - I like it, I love it.
01:01:46.400 | - I just, like, right now when people use,
01:01:48.880 | by the way, father of teenage daughter, so.
01:01:51.720 | - It's all about TikTok, right?
01:01:53.560 | - So, you know, when people use TikTok,
01:01:55.600 | there's a lot of, it's kind of funny,
01:01:59.120 | I was gonna say cattiness,
01:02:00.160 | but I was just using the cat as this exemplar
01:02:03.000 | of what we're talking about, so I don't know.
01:02:04.640 | I contradict myself, but anyway,
01:02:06.280 | there's all this cattiness where people are like,
01:02:07.840 | eww, this person's, and I just,
01:02:11.000 | what about people getting together
01:02:13.640 | and kind of saying, okay, we're gonna work on this move,
01:02:16.560 | we're gonna get a better, can we get a better musician?
01:02:18.760 | And they do that, but that's the part
01:02:22.040 | that's kind of off the books right now, you know?
01:02:25.240 | That should be, like, right there,
01:02:26.320 | that should be the center, that's where the,
01:02:28.120 | that's the really best part.
01:02:29.480 | - Well, that's where the invention of Git, period,
01:02:32.040 | the versioning is brilliant, and so some of the things
01:02:35.600 | you're talking about, technology, algorithms,
01:02:38.440 | tools can empower, and that's the thing,
01:02:41.800 | for humans to connect, to collaborate, and so on.
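For readers who have not looked under Git's hood, here is a bare-bones sketch of the versioning idea being praised: content-addressed snapshots chained by parent links, applied to a hypothetical dance routine instead of code. It is a toy model for illustration, not how Git is actually implemented.

```python
# A toy, Git-flavored version history: every snapshot is hashed, and each
# commit points at its parent, so collaborators can build on each other's work.

import hashlib
import json
import time

class TinyRepo:
    def __init__(self):
        self.objects = {}      # commit hash -> snapshot
        self.history = []      # ordered commit hashes

    def commit(self, content: str, author: str, message: str) -> str:
        snapshot = {
            "content": content,
            "author": author,
            "message": message,
            "parent": self.history[-1] if self.history else None,
            "time": time.time(),
        }
        digest = hashlib.sha1(json.dumps(snapshot, sort_keys=True).encode()).hexdigest()
        self.objects[digest] = snapshot
        self.history.append(digest)
        return digest

# Hypothetical creative collaboration, GitHub-style but for a dance crew:
repo = TinyRepo()
repo.commit("8-count intro, spin on beat 5", "ana", "first draft of the routine")
repo.commit("8-count intro, spin on beat 5, freeze on 8", "bo", "add the freeze")
for h in repo.history:
    print(h[:8], repo.objects[h]["author"], "-", repo.objects[h]["message"])
```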
01:02:45.080 | Can we upset more people a little bit?
01:02:48.040 | You already-- - Maybe, we'd have to try.
01:02:50.800 | - No, no, can we, can I ask you to elaborate,
01:02:53.920 | 'cause my intuition was that you would be a supporter
01:02:57.000 | of something like cryptocurrency and Bitcoin,
01:02:59.360 | because it fundamentally emphasizes decentralization.
01:03:03.160 | What do you, so can you elaborate on what--
01:03:05.760 | - Yeah, okay, look-- - Your thoughts on Bitcoin.
01:03:08.160 | - It's kind of funny, I wrote,
01:03:12.960 | I've been advocating some kind of digital currency
01:03:16.600 | for a long time, and when the,
01:03:19.560 | when Bitcoin came out, and the original paper on blockchain,
01:03:25.760 | my heart kind of sank, because I thought,
01:03:29.440 | oh my God, we're applying all of this fancy thought,
01:03:32.800 | and all these very careful distributed security measures
01:03:36.400 | to recreate the gold standard?
01:03:38.600 | Like, it's just so retro, it's so dysfunctional,
01:03:42.080 | it's so useless from an economic point of view.
01:03:44.120 | And then the other thing
01:03:46.440 | is using computational inefficiency at a boundless scale
01:03:50.240 | as your form of security is a crime against the atmosphere,
01:03:54.000 | obviously, a lot of people know that now,
01:03:55.520 | but we knew that at the start.
01:03:57.720 | Like, the thing is, when the first paper came out,
01:03:59.800 | I remember a lot of people saying, oh my God,
01:04:01.520 | if this thing scales, it's a carbon disaster, you know?
01:04:04.560 | And I just like, I'm just mystified,
01:04:09.320 | but that's a different question than when you asked,
01:04:11.480 | can you have a cryptographic currency,
01:04:15.200 | or at least some kind of digital currency
01:04:17.400 | that's of a benefit, and absolutely, like I'm,
01:04:20.360 | and there are people who are trying
01:04:21.520 | to be thoughtful about this, you should,
01:04:23.320 | if you haven't, you should interview
01:04:24.640 | Vitalik Buterin sometime.
01:04:25.960 | - Yeah, I've interviewed him twice.
01:04:27.640 | - Okay, so like, there are people in the community
01:04:30.080 | who are trying to be thoughtful,
01:04:31.040 | and trying to figure out how to do this better.
01:04:33.000 | - It has nice properties, though, right?
01:05:34.400 | So one of the nice properties is that,
01:05:36.160 | like, it's hard for a centralized government to control.
01:04:39.320 | And then the other one, to fix some of the issues
01:04:41.400 | that you're referring to, I'm sort of playing
01:04:42.960 | devil's advocate here, is, you know,
01:04:44.520 | there's lightning network, there's ideas,
01:04:46.480 | how you build stuff on top of Bitcoin,
01:04:50.000 | similar with gold, that allow you to have
01:04:52.240 | this kind of vibrant economy that operates,
01:04:55.240 | not on the blockchain, but outside the blockchain,
01:04:57.440 | and uses Bitcoin for like,
01:05:01.440 | checking the security of those transactions.
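As a concrete picture of the "computational inefficiency as security" point made a moment earlier about proof-of-work: mining is just grinding through hashes until one falls below a target, and the security budget is the energy the whole network is willing to burn. This is a toy sketch, not Bitcoin's actual implementation; the block data and difficulty are made up.

```python
# Toy proof-of-work: expected work doubles with every extra difficulty bit.

import hashlib

def mine(block_data: str, difficulty_bits: int) -> tuple[int, str]:
    """Find a nonce whose SHA-256 hash has `difficulty_bits` leading zero bits."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest
        nonce += 1

nonce, digest = mine("example block", difficulty_bits=20)  # ~a million hashes on average
print(nonce, digest)
```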
01:05:03.360 | - So Bitcoin's not new, it's been around for a while.
01:05:05.920 | I've been watching it closely, I've not seen one example
01:05:10.480 | of it creating economic growth.
01:05:13.000 | There was this obsession with the idea
01:05:14.440 | that government was the problem.
01:05:16.200 | That idea that government's the problem,
01:05:18.520 | let's say government earned that wrath honestly.
01:05:22.880 | Because if you look at some of the things
01:05:25.160 | that governments have done in recent decades,
01:05:27.200 | it's not a pretty story.
01:05:29.000 | Like, after a very small number of people
01:05:33.440 | in the US government decided to bomb and landmine
01:05:37.480 | Southeast Asia, it's hard to come back and say,
01:05:40.440 | oh, government's this great thing.
01:05:41.840 | But then the problem is that this resistance to government
01:05:46.840 | is basically resistance to politics.
01:05:51.040 | It's a way of saying, if I can get rich,
01:05:53.280 | nobody should bother me.
01:05:54.320 | It's a way of not having obligations to others.
01:05:56.960 | And that ultimately is a very suspect motivation.
01:06:00.760 | - But does that mean that the impulse,
01:06:04.240 | that the government should not overreach its power is flawed?
01:06:09.240 | - Well, I mean, what I wanna ask you to do
01:06:12.160 | is to replace the word government with politics.
01:06:15.560 | Like, our politics is people having to deal with each other.
01:06:19.840 | My theory about freedom is that the only authentic
01:06:23.560 | form of freedom is perpetual annoyance.
01:06:26.560 | All right, so annoyance means you're actually dealing
01:06:30.200 | with people 'cause people are annoying.
01:06:31.880 | Perpetual means that that annoyance is survivable,
01:06:34.600 | so it doesn't destroy us all.
01:06:36.240 | So if you have perpetual annoyance, then you have freedom.
01:06:38.680 | - And that's politics.
01:06:39.840 | - That's politics.
01:06:40.680 | If you don't have perpetual annoyance,
01:06:42.920 | something's gone very wrong, and you've suppressed
01:06:45.200 | those people, and it's only temporary,
01:06:46.480 | it's gonna come back and be horrible.
01:06:48.440 | You should seek perpetual annoyance.
01:06:51.080 | I'll invite you to a Berkeley City Council meeting
01:06:52.920 | so you can know what that feels like.
01:06:54.080 | - What perpetual annoyance feels like.
01:06:55.640 | (laughing)
01:06:57.520 | - But anyway, so freedom is being,
01:06:59.720 | the test of freedom is that you're annoyed
01:07:01.520 | by other people.
01:07:02.360 | If you're not, you're not free.
01:07:03.560 | If you're not, you're trapped in some temporary illusion
01:07:06.160 | that's gonna fall apart.
01:07:07.840 | Now, this quest to avoid government is really a quest
01:07:11.360 | to avoid that political feeling, but you have to have it.
01:07:14.200 | You have to deal with it.
01:07:15.560 | And it sucks, but that's the human situation.
01:07:19.360 | That's the human condition.
01:07:20.680 | And this idea that we're gonna have this abstract thing
01:07:22.920 | that protects us from having to deal with each other
01:07:25.280 | is always an illusion.
01:07:26.760 | - The idea, and I apologize, I overstretched the use
01:07:30.800 | of the word government.
01:07:32.320 | The idea is there should be some punishment from the people
01:07:37.320 | when a bureaucracy, when a set of people
01:07:41.120 | or a particular leader, like in an authoritarian regime,
01:07:44.580 | which more than half the world currently lives under,
01:07:47.280 | if they stop representing the people,
01:07:53.800 | it stops being like a Berkeley meeting
01:07:56.680 | and starts being more like a dictatorial kind of situation.
01:08:01.680 | So the point is it's nice to give people,
01:08:06.080 | the populace, in a decentralized way,
01:08:08.960 | the power to resist that kind of
01:08:13.960 | government becoming overly authoritarian.
01:08:15.520 | - Yeah, but people, see this idea that the problem
01:08:18.360 | is always the government being powerful is false.
01:08:21.560 | The problem can also be criminal gangs.
01:08:23.600 | The problem can also be weird cults.
01:08:25.560 | The problem can be abusive clergy.
01:08:30.440 | The problem can be infrastructure that fails.
01:08:35.200 | The problem can be poisoned water.
01:08:37.620 | The problem can be failed electric grids.
01:08:39.960 | The problem can be a crappy education system
01:08:44.960 | that makes the whole society less and less
01:08:49.000 | able to create value.
01:08:51.400 | There are all these other problems that are different
01:08:53.480 | from an overbearing government.
01:08:54.660 | Like you have to keep some sense of perspective
01:08:57.000 | and not be obsessed with only one kind of problem
01:08:59.320 | because then the others will pop up.
01:09:01.240 | - But empirically speaking,
01:09:02.500 | some problems are bigger than others.
01:09:05.320 | So like some groups of people, like governments or gangs
01:09:10.200 | or companies lead to problems more than others.
01:09:12.680 | - Are you a US citizen?
01:09:13.600 | - Yes.
01:09:14.440 | - Has the government ever really been a problem for you?
01:09:16.560 | - Well, okay.
01:09:17.400 | So first of all, I grew up in the Soviet Union
01:09:19.120 | and actually--
01:09:20.640 | - Yeah, my wife did too.
01:09:22.320 | - So I have seen, you know.
01:09:25.540 | - Sure.
01:09:26.920 | - And has the government bothered me?
01:09:28.900 | I would say that that's a really complicated question,
01:09:32.820 | especially because the United States is such,
01:09:35.480 | it's a special place like a lot of other countries.
01:09:39.620 | - My wife's family were refuseniks.
01:09:41.820 | And so we have like a very,
01:09:43.100 | and her dad was sent to the Gulag.
01:09:46.220 | For what it's worth, on my father's side,
01:09:49.260 | all but a few were killed
01:09:51.480 | in a post-Soviet pogrom in Ukraine.
01:09:56.480 | So I--
01:09:58.000 | - I would say, 'cause you did a little
01:10:00.540 | eloquent trick of language where you switched
01:10:03.720 | to the United States to talk about government.
01:10:06.260 | So I believe, unlike my friend, Michael Malice,
01:10:10.600 | who's an anarchist,
01:10:12.080 | I believe government can do a lot of good in the world.
01:10:15.800 | That is exactly what you're saying,
01:10:17.160 | which is it's politics.
01:10:19.720 | The thing that Bitcoin folks and cryptocurrency folks argue
01:10:22.960 | is that one of the big ways
01:10:25.080 | that government can control the populace
01:10:26.840 | is a centralized bank, like control the money.
01:10:30.120 | That was the case in the Soviet Union too.
01:10:32.360 | There's, you know, inflation can really make
01:10:36.120 | poor people suffer.
01:10:38.640 | And so what they argue is this is one way to go around
01:10:43.740 | that power that government has
01:10:46.340 | of controlling the monetary system.
01:10:48.580 | So that's a way to resist.
01:10:50.220 | That's not actually saying government bad.
01:10:53.460 | That's saying some of the ways
01:10:55.700 | that central banks get into trouble
01:10:59.800 | can be resisted through decentralized--
01:11:01.340 | - So let me ask you, on balance today,
01:11:04.140 | in the real world, in terms of actual facts,
01:11:07.760 | do you think cryptocurrencies are doing more
01:11:10.200 | to prop up corrupt, murderous, horrible regimes
01:11:14.020 | or to resist those regimes?
01:11:15.820 | Where do you think the balance is right now?
01:11:17.620 | - I know exactly, having talked to a lot
01:11:20.580 | of cryptocurrency folks, what they would tell me, right?
01:11:23.340 | It's hard, it's, I don't, no, no.
01:11:27.660 | - I'm asking it as a real question.
01:11:29.420 | There's no way to know the answer perfectly.
01:11:30.860 | - There's no way to know the answer perfectly.
01:11:32.740 | - However, I gotta say, if you look at people
01:11:36.500 | who've been able to decode blockchains,
01:11:39.820 | and they do leak a lot of data,
01:11:41.180 | they're not as secure as is widely thought,
01:11:43.660 | there are a lot of unknown Bitcoin whales
01:11:47.380 | from pretty early, and they're huge.
01:11:49.840 | And if you ask, who are these people?
01:11:53.860 | There's evidence that a lot of them are quite,
01:11:57.720 | not the people you'd wanna support, let's say.
01:12:00.340 | And I just don't, like, I think empirically,
01:12:03.900 | this idea that there's some intrinsic way
01:12:07.360 | that bad governments will be disempowered,
01:12:12.360 | and people will be able to resist them more
01:12:16.160 | than new villains, or even villainous governments
01:12:18.960 | will be empowered, there's no basis for that assertion.
01:12:21.920 | It just is kind of circumstantial.
01:12:24.600 | And I think in general, Bitcoin ownership is one thing,
01:12:29.600 | but Bitcoin transactions have tended
01:12:33.600 | to support criminality more than productivity.
01:12:37.440 | - Of course, they would argue that was the story
01:12:39.680 | of its early days, that now more and more Bitcoin
01:12:43.120 | is being used for legitimate transactions.
01:12:46.240 | - I didn't say for legitimate transactions,
01:12:49.320 | I said for economic growth, for creativity.
01:12:52.320 | I think what's happening is people are using it
01:12:57.080 | a little bit for buying, I don't know,
01:12:59.280 | maybe somebody's companies make it available
01:13:02.760 | for this and that, they buy a Tesla with it or something.
01:13:05.600 | Investing in a startup, hard, it might have happened
01:13:11.040 | a little bit, but it's not an engine of productivity,
01:13:13.960 | creativity, and economic growth,
01:13:16.280 | whereas old-fashioned currency still is.
01:13:18.600 | And anyway, look, I think something,
01:13:23.600 | I'm pro the idea of digital currencies.
01:13:28.200 | I am anti the idea of economics wiping out politics
01:13:33.200 | as a result.
01:13:37.760 | I think they have to exist in some balance
01:13:40.040 | to avoid the worst dysfunctions of each.
01:13:42.480 | - In some ways, there's parallels to our discussion
01:13:44.800 | of algorithms and cryptocurrency is you're pro the idea,
01:13:49.800 | but it can be used to manipulate,
01:13:54.440 | you can be used poorly by aforementioned humans.
01:13:59.360 | - Well, I think that you can make better designs
01:14:02.240 | and worse designs.
01:14:03.520 | And I think, and you know, the thing about cryptocurrency
01:14:06.320 | that's so interesting is how many of us are responsible
01:14:11.320 | for the poor designs because we're all so hooked
01:14:15.140 | on that Horatio Alger story, on like,
01:14:17.280 | I'm gonna be the one who gets the viral benefit.
01:14:20.040 | You know, way back when all this stuff was starting,
01:14:22.880 | I remember it would have been in the 80s,
01:14:24.840 | somebody had the idea of using viral as a metaphor
01:14:27.840 | for network effect.
01:14:29.680 | And the whole point was to talk about
01:14:32.300 | how bad network effect was,
01:14:33.680 | that it always created distortions that ruined
01:14:36.800 | the usefulness of economic incentives,
01:14:39.240 | that created dangerous distortions.
01:14:42.520 | Like, but then somehow, even after the pandemic,
01:14:45.640 | we think of viral as this good thing
01:14:47.160 | 'cause we imagine ourselves as the virus, right?
01:14:49.380 | We wanna be on the beneficiary side of it.
01:14:52.180 | But of course, you're not likely to be.
01:14:54.560 | - There is a sense because money is involved,
01:14:57.040 | people are not reasoning clearly always
01:15:01.560 | because they want to be part of that first viral wave
01:15:06.560 | that makes them rich.
01:15:07.640 | And that blinds people from their basic morality.
01:15:11.380 | - I had an interesting conversation.
01:15:13.440 | I don't, I sort of feel like I should respect
01:15:15.640 | some people's privacy,
01:15:16.480 | but some of the initial people who started Bitcoin,
01:15:20.920 | I remember having an argument about like,
01:15:23.520 | it's intrinsically a Ponzi scheme.
01:15:26.580 | Like, the early people have more than the later people.
01:15:29.600 | And the further down the chain you get,
01:15:31.820 | the more you're subject to gambling-like dynamics
01:15:34.920 | where it's more and more random
01:15:36.160 | and more and more subject to weird network effects
01:15:37.920 | and whatnot, unless you're a very small player, perhaps,
01:15:41.400 | and you're just buying something.
01:15:43.080 | But even then you'll be subject to fluctuations
01:15:45.280 | 'cause the whole thing is just kind of,
01:15:47.200 | like as it fluctuates,
01:15:49.080 | it's going to wave around the little people more.
01:15:51.800 | And I remember the conversation turned to gambling
01:15:55.360 | because gambling is a pretty large economic sector.
01:15:58.200 | And it's always struck me as being non-productive.
01:16:01.560 | Like somebody goes to Las Vegas, they lose money.
01:16:03.800 | And so one argument is, well, they got entertainment.
01:16:07.040 | They paid for entertainment as they lost money.
01:16:09.060 | So that's fine.
01:16:10.400 | And Las Vegas does up the losing of money
01:16:13.520 | in an entertaining way, so why not?
01:16:14.880 | It's like going to a show.
01:16:16.120 | So that's one argument.
01:16:17.640 | The argument that was made to me was different from that.
01:16:19.840 | It's that, no, what they're doing
01:16:21.360 | is they're getting a chance to experience hope.
01:16:23.880 | And a lot of people don't get that chance.
01:16:25.520 | And so that's really worth it, even if they're going to lose.
01:16:27.600 | They have that moment of hope
01:16:29.080 | and they need to be able to experience that.
01:16:31.240 | And it was a very interesting argument.
01:16:33.960 | - That's so heartbreaking 'cause I've seen it.
01:16:38.160 | But I've seen that, I have a little bit of that sense.
01:16:41.760 | I've talked to some young people
01:16:43.220 | who invest in cryptocurrency.
01:16:46.000 | And what I see is this hope.
01:16:48.480 | This is the first thing that gave them hope.
01:16:50.400 | And that's so heartbreaking to me
01:16:52.000 | that this is what you've gotten hope from, that so much is invested.
01:16:56.840 | It's like hope from somehow becoming rich
01:17:00.040 | as opposed to something, to me, I apologize,
01:17:02.360 | but money is in the long-term
01:17:04.960 | not going to be a source of that deep meaning.
01:17:07.900 | It's good to have enough money,
01:17:09.800 | but it should not be the source of hope.
01:17:12.120 | And it's heartbreaking to me
01:17:13.160 | how many people it's a source of hope.
01:17:16.160 | - Yeah, you've just described the psychology of virality
01:17:21.160 | or the psychology of trying to base a civilization
01:17:25.720 | on semi-random occurrences of network effect peaks.
01:17:29.640 | And it doesn't really work.
01:17:32.240 | I mean, I think we need to get away from that.
01:17:34.160 | We need to soften those peaks
01:17:36.080 | and except Microsoft, which deserves every penny,
01:17:40.400 | but in every other case.
01:17:41.920 | - Well, you mentioned GitHub.
01:17:43.880 | I think what Microsoft did with GitHub was brilliant.
01:17:46.120 | I was very happy.
01:17:47.760 | Okay, if I can give a, not a critical,
01:17:51.160 | but on Microsoft, 'cause they recently purchased Bethesda.
01:17:56.160 | So Elder Scrolls is in their hands.
01:17:59.920 | I'm watching you, Microsoft,
01:18:01.360 | do not screw up my favorite game.
01:18:03.880 | - Yeah, well, look, I'm not speaking for Microsoft.
01:18:07.040 | I have an explicit arrangement with them
01:18:09.120 | where I don't speak for them, obviously.
01:18:11.320 | Like that should be very clear.
01:18:12.240 | I do not speak for them.
01:18:13.800 | I am not saying, I like them.
01:18:17.440 | I think Satya is amazing.
01:18:19.400 | The term data dignity was coined by Satya.
01:18:23.720 | Like, so, you know, we have, it's kind of extraordinary,
01:18:27.200 | but you know, Microsoft's this giant thing.
01:18:29.440 | It's gonna screw up this or that.
01:18:30.680 | You know, it's not, I don't know.
01:18:33.520 | It's kind of interesting.
01:18:35.040 | I've had a few occasions in my life
01:18:36.800 | to see how things work from the inside of some big thing.
01:18:39.920 | And you know, it's always just people kind of,
01:18:42.640 | it's, I don't know.
01:18:44.720 | There's always like coordination problems.
01:18:48.240 | And there's always.
01:18:49.760 | - Human problems.
01:18:50.720 | - Oh God.
01:18:51.560 | - And there's some good people, there's some bad people.
01:18:52.760 | It's always.
01:18:53.600 | - I hope Microsoft doesn't screw up your game.
01:18:56.320 | - And I hope they bring Clippy back.
01:18:57.960 | You should never kill Clippy.
01:18:59.520 | Bring Clippy back.
01:19:00.360 | - Oh, Clippy.
01:19:01.360 | But Clippy promotes the myth of AI.
01:19:04.120 | - Well, that's why, this is why I think you're wrong.
01:19:06.400 | - How about if we, all right.
01:19:08.040 | Could we bring back Bob instead of Clippy?
01:19:10.280 | - Which one was Bob?
01:19:11.240 | - Oh, Bob was another thing.
01:19:13.200 | Bob was this other screen character
01:19:15.280 | who was supposed to be the voice of AI.
01:19:16.920 | Cortana, Cortana, would Cortana do it for you?
01:19:19.400 | - Cortana is too corporate.
01:19:21.040 | I like it, it's fine.
01:19:24.920 | - There's a woman in Seattle who's like the model
01:19:27.040 | for Cortana, did Cortana's voice.
01:19:28.640 | - The voice?
01:19:29.480 | - There was like.
01:19:30.320 | - No, the voice is great.
01:19:31.200 | - We had her as a, she used to walk around
01:19:34.720 | if you were wearing HoloLens for a bit.
01:19:36.320 | I don't think that's happening anymore.
01:19:38.200 | I think, I don't think you should turn software
01:19:40.360 | into a creature.
01:19:41.200 | - Well, you and I.
01:19:42.040 | - Get a cat, just get a cat.
01:19:43.520 | - You and I, you and I.
01:19:44.520 | Well, get a dog, get a dog.
01:19:46.640 | - Or a dog, yeah.
01:19:48.000 | - Yeah, you're.
01:19:48.840 | - Or a hedgehog.
01:19:49.960 | - A hedgehog.
01:19:50.800 | - Yeah.
01:19:51.920 | - You co-authored a paper, you mentioned Lee Smolin,
01:19:55.640 | titled "The Autodidactic Universe,"
01:20:00.200 | which describes our universe as one
01:20:01.960 | that learns its own physical laws.
01:20:05.320 | That's a trippy and beautiful and powerful idea.
01:20:08.320 | What are, what would you say are the key ideas
01:20:10.920 | in this paper?
01:20:11.760 | - Ah, okay.
01:20:12.880 | Well, I should say that paper reflected work
01:20:16.640 | from last year and the project,
01:20:18.680 | the program has moved quite a lot.
01:20:20.520 | So it's a little, there's a lot of stuff
01:20:22.320 | that's not published that I'm quite excited about.
01:20:24.160 | So I have to kind of keep my frame
01:20:26.640 | in that last year's thing.
01:20:29.120 | So I have to try to be a little careful about that.
01:20:33.680 | We can think about it in a few different ways.
01:20:35.960 | The core of the paper, the technical core of it
01:20:40.480 | is a triple correspondence.
01:20:42.640 | One part of it was already established,
01:20:46.920 | and then another part is in the process.
01:20:49.560 | The part that was established was, of course,
01:20:52.880 | understanding different theories of physics
01:20:55.240 | as matrix models.
01:20:57.000 | The part that was fresher is understanding those
01:21:01.560 | as machine learning systems,
01:21:03.400 | so that we could move fluidly
01:21:04.760 | between these different ways of describing systems.
01:21:07.320 | And the reason to wanna do that
01:21:10.200 | is to just have more tools and more options,
01:21:12.520 | because, well, theoretical physics is really hard,
01:21:17.480 | and a lot of programs have kind of run into a state
01:21:22.480 | where they feel a little stalled, I guess.
01:21:25.640 | I wanna be delicate about this,
01:21:26.720 | 'cause I'm not a physicist.
01:21:27.680 | I'm the computer scientist collaborating.
01:21:29.520 | So I don't mean to diss anybody's--
01:21:32.200 | - So this almost, like, gives a framework
01:21:34.520 | for generating new ideas in physics.
01:21:36.680 | - As we start to publish more about where it's gone,
01:21:39.960 | I think you'll start to see there's tools
01:21:43.440 | and ways of thinking about theories
01:21:46.360 | that I think open up some new paths
01:21:50.480 | that will be of interest.
01:21:52.280 | There's the technical core of it,
01:21:54.920 | which is this idea of a correspondence
01:21:57.280 | to give you more facility.
01:21:58.640 | But then there's also the storytelling part of it.
01:22:01.280 | And this is something,
01:22:02.320 | Lee loves stories and I do.
01:22:06.480 | And the idea here is that a typical way
01:22:11.480 | of thinking about physics
01:22:15.080 | is that there's some kind of starting condition,
01:22:18.200 | and then there's some principle
01:22:19.400 | by which the starting condition evolves.
01:22:22.440 | And the question is like, why the starting condition?
01:22:26.240 | Like, the starting condition has to be kind of,
01:22:31.240 | it has to be fine-tuned, and all these things about it
01:22:33.360 | have to be kind of perfect.
01:22:35.760 | And so we were thinking, well, look,
01:22:37.160 | what if we could push the storytelling
01:22:40.720 | about where the universe comes from much further back
01:22:42.960 | by starting with really simple things that evolve,
01:22:46.080 | and then through that evolution,
01:22:47.240 | explain how things got to be how they are
01:22:48.840 | through very simple principles, right?
01:22:51.200 | And so we've been exploring a variety of ways
01:22:55.080 | to push the start of the storytelling
01:22:57.640 | further and further back,
01:23:00.400 | which is really kind of interesting,
01:23:03.600 | 'cause like,
01:23:06.920 | Lee is sometimes considered
01:23:09.560 | to have a radical quality in the physics world,
01:23:13.760 | but he still is like, no, this is gonna be like
01:23:18.200 | the kind of time we're talking about
01:23:19.800 | in which evolution happens is the same time we're in now.
01:23:22.320 | And we're talking about something that starts
01:23:24.920 | and continues, and I'm like,
01:23:26.040 | well, what if there's some other kind of time
01:23:28.320 | that's time-like, and it sounds like metaphysics,
01:23:31.000 | but there's an ambiguity, you know,
01:23:33.200 | like it has to start from something,
01:23:36.120 | and it's kind of an interesting,
01:23:37.920 | so there's this,
01:23:38.760 | a lot of the math can be thought of either way,
01:23:42.400 | which is kind of interesting.
01:23:44.000 | - So push it so far back that basically all the things
01:23:46.520 | we take for granted in physics start becoming emergent.
01:23:49.600 | - I really wanna emphasize, this is all super baby steps.
01:23:53.400 | I don't wanna over-claim.
01:23:54.440 | It's like, I think a lot of the things we're doing,
01:23:57.400 | we're approaching some old problems
01:23:59.040 | in a pretty fresh way, informed.
01:24:02.240 | There's been a zillion papers about how you can think
01:24:04.600 | of the universe as a big neural net,
01:24:06.160 | or how you can think of different ideas in physics
01:24:09.080 | as being quite similar to, or even equivalent to,
01:24:12.280 | some of the ideas in machine learning.
01:24:14.160 | And that actually works out crazy well.
01:24:18.680 | Like, I mean, that is actually kind of eerie
01:24:21.000 | when you look at it.
01:24:21.840 | Like, there's probably two or three dozen papers
01:24:25.880 | that have this quality, and some of them are just crazy good.
01:24:28.520 | And it's very interesting.
01:24:30.600 | What we're trying to do is take those kinds of observations
01:24:34.080 | and turn them into an actionable framework
01:24:35.840 | where you can then start to do things
01:24:38.760 | with landscapes of theories that you couldn't do before,
01:24:40.520 | and that sort of thing.
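As a very loose cartoon of the correspondence being described, the sketch below treats a small matrix as a stand-in for a "law" and nudges it downhill on a toy cost with gradient descent, so the same object can be read either as a matrix-valued description or as a simple learning system. The cost function and update rule are arbitrary choices for illustration; this is not the construction in the paper itself.

```python
# Illustrative only: a matrix "learns" to satisfy a toy constraint.

import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))            # a toy "law" encoded as a 4x4 matrix

def energy(M: np.ndarray) -> float:
    """Toy cost: how far M @ M.T is from the identity (a stand-in constraint)."""
    return float(np.sum((M @ M.T - np.eye(4)) ** 2))

# The machine-learning reading of the same object: gradient descent on the cost.
lr = 0.002
for _ in range(2000):
    grad = 4 * (M @ M.T - np.eye(4)) @ M   # analytic gradient of the toy cost
    M = M - lr * grad

print(energy(M))                           # small: the "law" has settled into a consistent form
```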
01:24:42.440 | - So in that context, or maybe beyond,
01:24:46.120 | how do you explain us humans?
01:24:47.920 | How unlikely are we, this intelligent civilization?
01:24:50.960 | Or is there a lot of others,
01:24:53.200 | or are we alone in this universe?
01:24:55.040 | - Yeah.
01:24:58.360 | - You seem to appreciate humans very much.
01:25:01.960 | - I've grown fond of us.
01:25:04.720 | (both laughing)
01:25:06.280 | - We're okay.
01:25:07.120 | We have our nice qualities.
01:25:10.840 | - I like that.
01:25:14.560 | I mean, we're kind of weird.
01:25:16.240 | We sprout this hair on our heads,
01:25:17.800 | and then we're, I don't know,
01:25:18.640 | we're sort of weird animals.
01:25:20.280 | - That's a feature, not a bug, I think, the weirdness.
01:25:23.840 | - I hope so.
01:25:24.960 | I hope so.
01:25:25.880 | I think if I'm just gonna answer you in terms of truth,
01:25:35.880 | the first thing I'd say is we're not
01:25:38.640 | in a privileged enough position,
01:25:40.800 | at least as yet, to really know much about who we are,
01:25:45.640 | how we are, what we're really like
01:25:48.360 | in the context of something larger,
01:25:50.200 | what that context is, all that stuff.
01:25:52.640 | We might learn more in the future.
01:25:54.000 | Our descendants might learn more,
01:25:55.200 | but we don't really know very much,
01:25:57.480 | which you can either view as frustrating
01:25:59.440 | or charming like that first year of TikTok or something.
01:26:02.440 | - All roads lead back to TikTok, I like it.
01:26:06.640 | - Well, lately.
01:26:07.480 | There's another level at which I can think about it
01:26:12.120 | where I sometimes think that if you're a person
01:26:17.120 | who is just quiet and you do something
01:26:22.120 | that gets you in touch with the way reality happens,
01:26:25.840 | and for me, it's playing music,
01:26:28.200 | sometimes it seems like you can feel a bit
01:26:31.520 | of how the universe is,
01:26:32.960 | and it feels like there's a lot more going on in it,
01:26:36.160 | and there is a lot more life and a lot more stuff happening
01:26:39.640 | and a lot more stuff flowing through.
01:26:41.480 | I'm not speaking as a scientist now.
01:26:43.000 | This is kind of a more my artist side talking,
01:26:46.240 | and it's, I feel like I'm suddenly
01:26:49.480 | in multiple personalities with you, but.
01:26:51.440 | - Well, Kerouac, Jack Kerouac said
01:26:54.240 | that music is the only truth.
01:26:56.600 | What do you think? It sounds like you might agree, at least in part.
01:27:01.560 | - There's a passage in Kerouac's book, "Dr. Sax,"
01:27:05.640 | where somebody tries to just explain the whole situation
01:27:08.040 | with reality and people in like a paragraph,
01:27:10.160 | and I couldn't reproduce it for you here,
01:27:12.000 | but it's like, yeah, like there are these bulbous things
01:27:15.040 | that walk around and they make these sounds.
01:27:16.520 | You can sort of understand them, but only kind of,
01:27:18.360 | and then there's like this, and it's just like this amazing,
01:27:20.560 | like just really quick, like if some spirit being
01:27:24.600 | or something was gonna show up in our reality
01:27:26.440 | and knew nothing about it,
01:27:27.520 | it's like a little basic intro of like,
01:27:29.520 | okay, here's what's going on here.
01:27:31.280 | An incredible passage.
01:27:32.640 | - Yeah. - Yeah.
01:27:33.880 | - It's like a one or two sentence summary
01:27:36.560 | in "Hitchhiker's Guide to the Galaxy," right,
01:27:38.800 | of what this-- - Mostly harmless.
01:27:41.320 | - Mostly harmless.
01:27:43.000 | - Yeah. - Do you think there's truth
01:27:44.040 | to that, that music somehow connects
01:27:46.280 | to something that words cannot?
01:27:48.960 | - Yeah, music is something that just towers above me.
01:27:52.720 | I don't feel like I have an overview of it.
01:27:57.720 | It's just the reverse.
01:27:58.800 | I don't fully understand it,
01:28:00.720 | because on one level, it's simple.
01:28:02.240 | Like you can say, oh, it's a thing people evolved
01:28:06.080 | to coordinate our brains on a pattern level
01:28:10.120 | or something like that.
01:28:11.920 | There's all these things you can say about music,
01:28:13.840 | which are, you know, some of that's probably true.
01:28:16.920 | It's also, there's kind of like this,
01:28:21.920 | this is the mystery of meaning.
01:28:26.160 | Like there's a way that just,
01:28:29.360 | instead of just being pure abstraction,
01:28:31.320 | music can have like this kind of substantiality to it
01:28:35.320 | that is philosophically impossible.
01:28:38.080 | I don't know what to do with it.
01:28:41.200 | - Yeah, the amount of understanding
01:28:43.360 | I feel I have when I hear the right song
01:28:46.840 | at the right time is not comparable
01:28:49.520 | to anything I can read on Wikipedia.
01:28:52.560 | Anything I can understand, read through in language.
01:28:57.280 | The music does connect us to something.
01:28:59.720 | - There's this thing there.
01:29:00.800 | Yeah, there's some kind of a thing in it.
01:29:04.960 | And I've never ever, I've read across a lot of explanations
01:29:09.840 | from all kinds of interesting people
01:29:12.240 | like that it's some kind of a flow language
01:29:16.480 | between people or between people and how they perceive
01:29:19.000 | and that kind of thing.
01:29:20.960 | And that sort of explanation is fine,
01:29:24.040 | but it's not quite it either.
01:29:26.560 | - Yeah, there's something about music
01:29:29.560 | that makes me believe that panpsychism
01:29:32.320 | could possibly be true,
01:29:34.120 | which is that everything in the universe is conscious.
01:29:36.840 | It makes me think,
01:29:39.600 | makes me humble about how much or how little I understand
01:29:44.600 | about the functions of our universe,
01:29:48.320 | that everything might be conscious.
01:29:50.560 | - Most people interested in theoretical physics
01:29:54.240 | eventually land in panpsychism,
01:29:56.520 | but I'm not one of them.
01:30:00.200 | I still think there's this pragmatic imperative
01:30:05.200 | to treat people as special.
01:30:09.160 | So I will proudly be a dualist.
01:30:11.760 | - With people and cats, people and cats.
01:30:14.920 | - Yeah, I'm not quite sure where to draw the line
01:30:19.280 | or why the line's there or anything like that,
01:30:21.320 | but I don't think I should be required to,
01:30:22.840 | because all the same questions are equally mysterious
01:30:25.240 | with no line.
01:30:26.080 | So I don't feel disadvantaged by that.
01:30:28.560 | So I shall remain a dualist.
01:30:30.480 | But if you listen to anyone trying to explain
01:30:35.480 | where consciousness is in a dualistic sense,
01:30:38.600 | either believing in souls or some special thing
01:30:41.160 | in the brain or something,
01:30:42.360 | you pretty much say, screw this, I'm gonna be a panpsychist.
01:30:45.400 | (laughing)
01:30:47.640 | - Fair enough, well put.
01:30:53.400 | - Are there moments in your life that happened
01:30:55.880 | that were defining, in the way that you hope others,
01:30:59.880 | your daughter might--
01:31:00.720 | - Well, listen, I gotta say,
01:31:02.560 | the moments that defined me were not the good ones.
01:31:06.280 | The moments that defined me were often horrible.
01:31:09.560 | I've had successes, but if you ask what defined me,
01:31:17.400 | my mother's death,
01:31:19.360 | being under the World Trade Center during the attack,
01:31:25.740 | the things that have had the most effect on me
01:31:30.760 | were sort of real-world terrible things,
01:31:35.240 | which I don't wish on young people at all.
01:31:37.580 | And this is the thing that's hard about giving advice
01:31:42.000 | to young people that they have to learn their own lessons
01:31:47.000 | and lessons don't come easily.
01:31:52.940 | And a world which avoids hard lessons
01:31:56.720 | will be a stupid world.
01:31:58.320 | And I don't know what to do with it.
01:32:00.080 | That's a little bundle of truth
01:32:03.000 | that has a bit of a fatalistic quality to it,
01:32:05.120 | but I don't, this is like what I was saying
01:32:07.880 | that freedom equals eternal annoyance.
01:32:10.000 | There's a degree to which honest advice
01:32:16.660 | is not that pleasant to give.
01:32:20.400 | And I don't want young people
01:32:23.080 | to have to know about everything.
01:32:25.680 | I think--
01:32:26.520 | - You don't wanna wish hardship on them.
01:32:27.960 | - Yeah, I think they deserve to have
01:32:30.560 | a little grace period of naivety that's pleasant.
01:32:34.720 | I mean, I do, if it's possible, if it's,
01:32:38.280 | these things are, this is tricky stuff.
01:32:42.560 | I mean, if you,
01:32:45.480 | okay, so let me try a little bit on this advice thing.
01:32:51.020 | I think one thing,
01:32:52.460 | and any serious broad advice will have been given
01:32:56.400 | a thousand times before for a thousand years.
01:32:58.320 | So this, I'm not gonna,
01:33:00.360 | I'm not going to claim originality,
01:33:03.000 | but I think trying to find a way
01:33:06.440 | to really pay attention
01:33:10.600 | to what you're feeling fundamentally,
01:33:13.200 | what your sense of the world is,
01:33:14.740 | what your intuition is,
01:33:15.920 | if you feel like an intuitive person,
01:33:17.800 | what your,
01:33:18.640 | like to try to escape the constant sway
01:33:26.720 | of social perception or manipulation,
01:33:29.280 | whatever you wish, not to escape it entirely,
01:33:31.420 | that would be horrible,
01:33:32.260 | but to find cover from it once in a while,
01:33:37.260 | to find a sense of being anchored in that,
01:33:41.120 | to believe in experience as a real thing.
01:33:44.120 | Believing in experience as a real thing is very dualistic.
01:33:47.200 | That goes with my philosophy of dualism.
01:33:50.800 | I believe there's something magical,
01:33:52.680 | and instead of squirting the magic dust on the programs,
01:33:56.000 | I think experience is something real
01:33:58.280 | and something apart and something mystical.
01:34:00.360 | - Your own personal experience that you just have,
01:34:04.740 | and then you're saying,
01:34:06.140 | silence the rest of the world enough to hear that,
01:34:08.260 | like whatever that magic dust is in that experience.
01:34:11.340 | - Find what is there.
01:34:13.540 | And I think that's one thing.
01:34:18.100 | Another thing is to recognize
01:34:21.180 | that kindness requires genius,
01:34:24.900 | that it's actually really hard,
01:34:27.180 | that facile kindness is not kindness,
01:34:30.020 | and that it'll take you a while to have the skills.
01:34:33.500 | To have kind impulses,
01:34:34.740 | to want to be kind, that you can have right away.
01:34:37.540 | To be effectively kind is hard.
01:34:40.100 | - To be effectively kind, yeah.
01:34:41.860 | - It takes skill, it takes hard lessons.
01:34:46.260 | You'll never be perfect at it.
01:34:52.780 | To the degree you get anywhere with it,
01:34:55.580 | it's the most rewarding thing ever.
01:34:58.600 | - Yeah.
01:34:59.440 | Let's see, what else would I say?
01:35:02.940 | I would say when you're young,
01:35:07.880 | you can be very overwhelmed
01:35:10.320 | by social and interpersonal emotions.
01:35:16.800 | You'll have broken hearts and jealousies.
01:35:20.720 | You'll feel socially down the ladder
01:35:24.200 | instead of up the ladder.
01:35:26.000 | It feels horrible when that happens.
01:35:28.000 | All of these things.
01:35:29.480 | And you have to remember
01:35:31.400 | what a fragile crest all that stuff is.
01:35:35.600 | And it's hard, 'cause right when it's happening,
01:35:37.660 | it's just so intense.
01:35:39.220 | And if I was actually giving this advice to my daughter,
01:35:48.200 | she'd already be out of the room.
01:35:50.120 | (laughing)
01:35:51.640 | So this is for some hypothetical teenager
01:35:55.800 | that doesn't really exist,
01:35:56.640 | that really wants to sit and listen to my voice.
01:35:59.240 | - Or for your daughter 10 years from now.
01:36:01.880 | - Maybe.
01:36:03.360 | - Can I ask you a difficult question?
01:36:06.640 | - Yeah, sure.
01:36:07.480 | - You talked about losing your mom.
01:36:10.760 | - Yeah.
01:36:11.960 | - Do you miss her?
01:36:13.880 | - Yeah, I mean, I still connect to her through music.
01:36:17.680 | She was a young prodigy piano player in Vienna.
01:36:24.880 | And she survived the concentration camp
01:36:27.960 | and then died in a car accident here in the US.
01:36:32.960 | - What music makes you think of her?
01:36:35.640 | Is there a song that connects you?
01:36:38.200 | - Well, she was in Vienna,
01:36:40.680 | so she had the whole Viennese music thing going,
01:36:45.680 | which is this incredible school
01:36:51.440 | of absolute skill and romance bundled together
01:36:56.280 | and wonderful on the piano, especially.
01:36:58.920 | I learned to play some of the Beethoven sonatas for her
01:37:01.920 | and I played them in this exaggerated drippy way,
01:37:04.680 | I remember when I was a kid.
01:37:06.960 | - Exaggerated meaning too full of emotion?
01:37:09.520 | - Yeah, like just like.
01:37:11.560 | - Isn't that the only way to play Beethoven?
01:37:13.240 | I mean, I didn't know there's any other way.
01:37:15.000 | - That's a reasonable question.
01:37:16.320 | I mean, the fashion these days is to be slightly Apollonian
01:37:20.040 | even with Beethoven, but one imagines
01:37:22.800 | that actual Beethoven playing might've been different.
01:37:26.000 | I don't know.
01:37:26.840 | I've gotten to play a few instruments he played
01:37:30.920 | and tried to see if I could feel anything
01:37:32.400 | about how it might've been for him.
01:37:33.640 | I don't know really.
01:37:35.080 | - I was always against the clinical precision
01:37:37.600 | of classical music.
01:37:39.400 | I thought a great piano player should be like in pain,
01:37:47.480 | like emotionally, like truly feel the music
01:37:52.480 | and make it messy sort of.
01:37:57.400 | - Sure.
01:37:58.360 | - Maybe play classical music the way, I don't know,
01:38:00.920 | a blues pianist plays blues.
01:38:03.560 | - It seems like they actually got happier
01:38:07.960 | and I'm not sure if Beethoven got happier.
01:38:10.080 | I think it's a different kind of concept
01:38:13.760 | of the place of music.
01:38:17.080 | I think the blues, the whole African-American tradition
01:38:21.680 | was initially surviving awful, awful circumstances.
01:38:25.560 | So you could say, there was some of that
01:38:27.360 | in the concentration camps and all that too.
01:38:29.560 | And it's not that Beethoven's circumstances were brilliant,
01:38:34.600 | but he kind of also, I don't know, this is hard.
01:38:39.600 | Like, I mean, it would seem that his misery
01:38:42.520 | was somewhat self-imposed maybe through, I don't know.
01:38:45.760 | It's kind of interesting.
01:38:47.240 | I've known some people who loathed Beethoven,
01:38:49.200 | like the late composer Pauline Oliveros,
01:38:52.840 | wonderful modernist composer.
01:38:54.040 | I played in her band for a while and she was like,
01:38:57.080 | "Oh, Beethoven, that's the worst music ever.
01:38:59.080 | "It's like all ego.
01:39:00.400 | "It completely, it turns emotion into your enemy.
01:39:05.400 | "And it's ultimately all about your own self-importance,
01:39:11.920 | "which has to be at the expense of others.
01:39:14.080 | "What else could it be?"
01:39:16.200 | And blah, blah, blah.
01:39:17.520 | So she had, I shouldn't say,
01:39:19.040 | I don't mean to be dismissive,
01:39:20.000 | but I'm just saying like her position on Beethoven
01:39:22.720 | was very negative and very unimpressed,
01:39:25.680 | which is really interesting for me.
01:39:26.880 | - The man or the music?
01:39:28.640 | - I think, I don't know.
01:39:30.200 | I mean, she's not here to speak for herself.
01:39:31.680 | So it's a little hard for me to answer that question.
01:39:34.760 | But it was interesting 'cause I'd always thought
01:39:36.000 | of Beethoven as like, "Whoa, this is like Beethoven
01:39:38.160 | "is like really the dude."
01:39:40.840 | And she's like, "Eh."
01:39:43.320 | Beethoven, Schmadovan, it's like not really happening.
01:39:45.760 | - Yeah, still, even though it's cliche,
01:39:47.440 | I like playing personally just for myself,
01:39:49.720 | Moonlight Sonata.
01:39:50.600 | I mean, I just...
01:39:51.880 | - Moonlight's amazing.
01:39:54.400 | You're talking about comparing the blues
01:40:00.880 | and that sensibility from Europe
01:40:02.840 | is so different in so many ways.
01:40:05.520 | One of the musicians I play with is Jon Batiste,
01:40:08.000 | who has the band on the Colbert show.
01:40:09.760 | And he'll sit there playing jazz
01:40:12.320 | and suddenly go into Moonlight.
01:40:13.440 | He loves Moonlight.
01:40:14.320 | And what's kind of interesting is
01:40:16.920 | he's found a way to do Beethoven.
01:40:21.520 | And by the way, he can really do Beethoven.
01:40:23.440 | Like he went through Juilliard.
01:40:25.920 | And one time he was at my house.
01:40:28.080 | He's like, "Hey, do you have the book of Beethoven's sonatas?
01:40:30.280 | "I wanna find one I haven't played." And I said, "Yeah."
01:40:31.720 | And he sight read through the whole damn thing perfectly.
01:40:34.080 | And I'm like, "Oh God, I just need to get out of here.
01:40:36.520 | "I can't even deal with this."
01:40:38.080 | But anyway,
01:40:41.400 | the thing is he has this way,
01:40:44.840 | with the same persona and the same philosophy,
01:40:46.920 | of moving from the blues into Beethoven.
01:40:49.520 | That's really, really fascinating to me.
01:40:51.640 | It's like, I don't wanna say he plays it
01:40:55.560 | as if it were jazz, but he kind of does.
01:40:58.360 | It's kind of really, and,
01:41:00.560 | while he was sight reading,
01:41:01.400 | he talks like Beethoven's talking to him.
01:41:03.320 | Like he's like, "Oh yeah, here, he's doing this."
01:41:05.440 | I can't do Jon, but you know.
01:41:07.080 | It's like, it's really interesting.
01:41:09.080 | Like it's very different.
01:41:10.080 | For me, I was introduced to Beethoven
01:41:12.560 | as like almost like this godlike figure,
01:41:14.720 | and I presume Pauline was too,
01:41:16.720 | that was really kind of oppressive and hard to deal with.
01:41:18.680 | And for him, it's just like-
01:41:20.200 | - The conversation he's having.
01:41:21.400 | - He's playing James P. Johnson or something.
01:41:23.720 | It's like another musician who did something
01:41:25.280 | and they're talking.
01:41:26.120 | And it's very cool to be around.
01:41:27.960 | It's very kind of freeing to see someone
01:41:31.720 | have that relationship.
01:41:34.640 | - I would love to hear him play Beethoven.
01:41:36.160 | That sounds amazing.
01:41:37.800 | - He's great.
01:41:39.640 | - We talked about Ernest Becker
01:41:41.320 | and how much value he puts on our mortality
01:41:46.680 | and our denial of our mortality.
01:41:49.520 | Do you think about your mortality?
01:41:51.920 | Do you think about your own death?
01:41:53.600 | - You know, what's funny is I used to not be able to,
01:41:56.240 | but as you get older, you just know people who die
01:41:58.480 | and there's all these things
01:41:59.320 | and it just becomes familiar and more ordinary,
01:42:04.000 | which is what it is.
01:42:06.720 | - But are you afraid?
01:42:09.600 | - Sure.
01:42:10.600 | Although less so.
01:42:11.840 | And it's not like I didn't have some kind of insight
01:42:18.400 | or revelation to become less afraid.
01:42:20.240 | I think I just, like I say, it's kind of familiarity.
01:42:25.240 | It's just knowing people who've died.
01:42:27.480 | And I really believe in the future.
01:42:32.080 | I have this optimism that people
01:42:35.280 | or this whole thing of life on earth,
01:42:37.280 | this whole thing we're part of,
01:42:38.200 | I don't know where to draw that circle,
01:42:40.040 | but this thing is going somewhere
01:42:44.120 | and has some kind of value.
01:42:48.240 | And you can't both believe in the future
01:42:51.440 | and wanna live forever.
01:42:52.480 | You have to make room for it.
01:42:53.680 | You know, like you have to,
01:42:55.360 | that optimism has to also come with its own like humility.
01:42:58.680 | You have to make yourself small to believe in the future.
01:43:02.040 | And so it actually in a funny way comforts me.
01:43:07.120 | - Wow, that's powerful.
01:43:08.600 | And optimism requires you to kind of step down after a time.
01:43:15.600 | - Yeah, I mean, that said, life seems kind of short,
01:43:19.760 | but you know, whatever.
01:43:20.920 | - Do you think there's-
01:43:22.720 | - I've tried to find, I can't find the complaint department.
01:43:24.760 | You know, I really want to, I want to bring this up,
01:43:27.080 | but the customer service number never answers
01:43:29.320 | and like the email bounces.
01:43:30.440 | - One way.
01:43:31.280 | - So yeah.
01:43:32.120 | - Do you think there's meaning to it, to life?
01:43:35.240 | - Ah, well, see, meaning's a funny word.
01:43:38.320 | Like we say all these things as if we know what they mean,
01:43:40.480 | but meaning, we don't know what we mean when we say meaning.
01:43:43.200 | Like we obviously do not.
01:43:44.360 | And it's a funny little mystical thing.
01:43:48.520 | I think it ultimately connects to that sense of experience
01:43:51.280 | that dualists tend to believe in.
01:43:54.720 | - I guess there's the why, like if you look up at the stars
01:43:59.040 | and you experience that awe-inspiring, like, joy,
01:44:02.920 | whatever, when you look up at the stars,
01:44:06.320 | I don't know why, but for me, that kind of makes me feel
01:44:10.080 | joyful, maybe a little bit melancholy,
01:44:12.040 | just some weird soup of feelings.
01:44:14.760 | And ultimately the question is like,
01:44:16.680 | why are we here in this vast universe?
01:44:19.760 | That question, why?
01:44:23.600 | Have you been able in some way,
01:44:28.400 | maybe through music, to answer it for yourself?
01:44:31.400 | (silence)
01:44:33.560 | - My impulse is to feel like it's not quite
01:44:41.840 | the right question to ask,
01:44:43.200 | but I feel like going down that path
01:44:46.360 | is just too tedious for the moment
01:44:48.200 | and I don't want to do it, but. (laughs)
01:44:51.600 | - The wrong question.
01:44:54.120 | - Well, just because, I don't know what meaning is.
01:44:58.000 | And I think, I do know that sense of awe.
01:45:01.720 | I grew up in Southern New Mexico and the stars were so vivid.
01:45:06.040 | I've had some weird misfortunes,
01:45:12.640 | but I've had some weird luck also.
01:45:15.960 | One of our near neighbors was the head of optics research
01:45:20.440 | at White Sands and when he was young,
01:45:22.000 | he discovered Pluto, his name was Clyde Tombaugh.
01:45:25.160 | And he taught me how to make telescopes,
01:45:27.680 | grinding mirrors and stuff.
01:45:28.960 | And my dad had also made telescopes when he was a kid,
01:45:31.560 | but Clyde had backyard telescopes
01:45:35.040 | that would put to shame a lot of, (laughs)
01:45:37.320 | I mean, he really, he did his telescopes, you know?
01:45:39.960 | And so I remember he'd let me go and play with them
01:45:43.040 | and just looking at a globular cluster
01:45:46.240 | and you're seeing the actual photons
01:45:47.760 | and with a good telescope, it's really like this object.
01:45:50.120 | Like you can really tell this isn't coming through
01:45:53.240 | some intervening information structure.
01:45:55.360 | This is like the actual photons
01:45:56.960 | and it's really a three-dimensional object.
01:45:59.680 | And you have even a feeling for the vastness of it.
01:46:02.640 | And it's, I don't know, so I definitely,
01:46:07.640 | I was very, very fortunate to have a connection
01:46:11.680 | to the sky that way when I was a kid.
01:46:15.440 | - To have had that experience,
01:46:17.360 | again, the emphasis on experience.
01:46:22.760 | It's kind of funny, I feel like sometimes,
01:46:26.320 | when she was younger,
01:46:28.680 | I took my daughter and her friends to a telescope.
01:46:31.960 | There are a few around here that our kids can go and use
01:46:34.680 | and they would look at Jupiter's moons or something.
01:46:37.320 | I think Galilean moons.
01:46:39.120 | And I don't know if they quite had that
01:46:42.240 | 'cause it's been just too normalized.
01:46:47.120 | And I think maybe when I was growing up,
01:46:50.720 | screens weren't that common yet.
01:46:52.200 | And maybe it's too confusable with the screen.
01:46:55.160 | I don't know.
01:46:56.600 | - Somebody brought it up in conversation with me,
01:47:00.560 | or I saw it somewhere, I don't remember who,
01:47:02.160 | but they kind of posited this idea
01:47:04.520 | that if humans, early humans weren't able to see the stars,
01:47:08.680 | like if Earth's atmosphere was such that it was cloudy,
01:47:12.080 | that we would not have developed human civilization.
01:47:14.920 | There's something about being able to look up
01:47:17.520 | and see a vast universe is like,
01:47:20.600 | that's fundamental to the development of human civilization.
01:47:23.840 | I thought that was a curious kind of thought.
01:47:26.720 | - That reminds me of that old Isaac Asimov story
01:47:30.280 | where there's this planet where they finally get to see
01:47:33.520 | what's in the sky once in a while.
01:47:35.040 | And it turns out they're in the middle of a globular cluster
01:47:36.960 | and there are all these stars.
01:47:38.520 | I forget what happens exactly.
01:47:39.760 | God, that's from when I was the same age as a kid.
01:47:42.040 | I don't really remember.
01:47:43.240 | But yeah, I don't know.
01:47:47.280 | It might be right.
01:47:48.120 | I'm just thinking of all the civilizations
01:47:50.080 | that grew up under clouds.
01:47:51.480 | I mean, like the Vikings needed a special
01:47:56.480 | diffracting piece of mica to navigate
01:47:58.800 | 'cause they could never see the sun.
01:48:00.120 | They had this thing called a sunstone
01:48:01.280 | that they found from this one cave.
01:48:02.960 | Do you know about that?
01:48:04.040 | So they were in this,
01:48:05.160 | they were trying to navigate boats in the North Atlantic
01:48:10.680 | without being able to see the sun 'cause it was cloudy.
01:48:12.600 | And so they used a chunk of mica to diffract it
01:48:19.960 | in order to be able to align where the sun really was
01:48:22.520 | 'cause they couldn't tell by eye and navigate.
01:48:24.680 | So I'm just saying there are a lot of civilizations
01:48:26.960 | that are pretty impressive
01:48:27.800 | that had to deal with a lot of clouds.
01:48:30.000 | The Amazonians invented our agriculture
01:48:34.360 | and they were probably under clouds a lot.
01:48:36.120 | I don't know.
01:48:37.440 | - To me personally, the question of the meaning of life
01:48:41.360 | becomes most vibrant, most apparent
01:48:46.120 | when you look up at the stars
01:48:47.840 | because it makes me feel very small.
01:48:50.960 | - We are small.
01:48:53.400 | - But then you ask, it still feels that we're special.
01:48:59.560 | And then the natural question is like,
01:49:01.640 | well, if we are as special as I think we are,
01:49:05.280 | why the heck are we here in this vast universe?
01:49:08.840 | That ultimately is the question of the meaning of life.
01:49:14.160 | - I mean, look, there's a confusion sometimes
01:49:18.600 | in trying to set up a question or a thought experiment
01:49:23.600 | or something that's defined in terms of a context
01:49:29.920 | to explain something where there is no larger context.
01:49:32.720 | And that's a category error.
01:49:34.200 | If we wanna do it in physics or in computer science,
01:49:40.360 | it's hard to talk about the universe as a Turing machine
01:49:44.000 | because a Turing machine has an external clock
01:49:46.520 | and an observer and input and output.
01:49:48.920 | There's a larger context implied
01:49:50.520 | in order for it to be defined at all.
01:49:52.840 | And so if you're talking about the universe,
01:49:54.160 | you can't talk about it coherently as a Turing machine.
01:49:57.400 | Quantum mechanics is like that.
01:49:58.560 | Quantum mechanics has an external clock
01:50:00.400 | and has some kind of external context
01:50:02.800 | depending on your interpretation
01:50:04.600 | that's either the observer or whatever.
01:50:08.400 | And they're similar that way.
01:50:11.440 | So maybe Turing machines and quantum mechanics
01:50:14.560 | can be better friends or something
01:50:17.200 | because they have a similar setup.
01:50:18.560 | But the thing is, if you have something that's defined
01:50:21.520 | in terms of an outer context,
01:50:23.720 | you can't talk about ultimates with it
01:50:26.200 | because obviously it's not suited for that.
01:50:29.280 | So there's some ideas that are their own context.
01:50:32.200 | General relativity is its own context.
01:50:34.520 | It's different.
01:50:35.360 | That's why it's hard to unify.
01:50:36.720 | And I think the same thing is true
01:50:40.200 | when we talk about these types of questions.
01:50:43.680 | Like meaning is in a context
01:50:48.680 | and to talk about ultimate meaning is therefore a category error.
01:50:52.720 | Or it's not a resolvable way of thinking.
01:50:57.720 | It might be a way of thinking that is experientially
01:51:06.920 | or aesthetically valuable because it is awesome
01:51:11.920 | in the sense of awe-inspiring.
01:51:14.880 | But to try to treat it analytically is not sensible.
01:51:20.440 | - Maybe that's what music and poetry are for.
01:51:22.480 | - Yeah, maybe.
01:51:23.320 | I think so.
01:51:24.160 | I think music actually does escape any particular context.
01:51:27.280 | That's how it feels to me, but I'm not sure about that.
01:51:29.200 | That's once again, crazy artist talking, not scientist.
01:51:33.480 | - Well, you do both masterfully, Jaron.
01:51:37.880 | And like I said, I'm a big fan of everything you've done,
01:51:40.040 | of you as a human being.
01:51:42.080 | I appreciate the fun argument we had today
01:51:46.420 | that will, I'm sure, continue for 30 years
01:51:48.720 | as it did with Marvin Minsky.
01:51:50.680 | Honestly, I deeply appreciate
01:51:53.760 | that you spent your really valuable time with me today.
01:51:55.720 | It was a really great conversation.
01:51:56.920 | Thank you so much.
01:51:58.560 | Thanks for listening to this conversation
01:52:00.160 | with Jaron Lanier.
01:52:01.820 | To support this podcast, please check out our sponsors
01:52:04.560 | in the description.
01:52:06.160 | And now, let me leave you with some words
01:52:08.400 | from Jaron Lanier himself.
01:52:10.000 | A real friendship ought to introduce each person
01:52:13.960 | to unexpected weirdness in the other.
01:52:16.720 | Thank you for listening, and hope to see you next time.
01:52:20.540 | (upbeat music)
01:52:23.120 | (upbeat music)