
We exist inside the story that the brain tells itself (Joscha Bach) | AI Podcast Clips


Whisper Transcript | Transcript Only Page

00:00:00.000 | What is dualism, what is idealism, what is materialism,
00:00:05.000 | what is functionalism, and what connects with you most?
00:00:08.040 | In terms of, 'cause you just mentioned,
00:00:09.280 | there's a reality we don't have access to.
00:00:11.360 | Okay, what does that even mean?
00:00:13.640 | And why don't we get access to it?
00:00:16.520 | Aren't we part of that reality?
00:00:17.720 | Why can't we access it?
00:00:19.680 | - So the particular trajectory that mostly exists
00:00:22.160 | in the West is the result of our indoctrination
00:00:25.640 | by a cult for 2,000 years.
00:00:27.480 | - A cult, which one?
00:00:28.320 | - The Catholic Church, mostly.
00:00:30.480 | And for better or worse, it has created or defined
00:00:34.360 | many of the modes of interaction that we have
00:00:36.240 | that have created this society.
00:00:38.200 | But it has also, in some sense, scarred our rationality.
00:00:42.880 | And the intuition that exists, if you would translate
00:00:47.680 | the mythology of the Catholic church into the modern world
00:00:51.160 | is that the world in which you and me interact
00:00:53.900 | is something like a multiplayer role-playing adventure.
00:00:57.680 | And the money and the objects that we have in this world,
00:01:00.360 | this is all not real.
00:01:01.400 | Or as Eastern philosophers would say, it's maya.
00:01:05.400 | It's just stuff that appears to be meaningful
00:01:08.800 | and this embedding in this meaning,
00:01:11.080 | if you believe in it, is samsara.
00:01:13.320 | It's basically the identification with the needs
00:01:16.360 | of the mundane, secular, everyday existence.
00:01:19.300 | And the Catholics also introduced the notion
00:01:22.920 | of higher meaning, the sacred.
00:01:25.000 | And this existed before, but eventually
00:01:27.520 | the natural shape of God is the platonic form
00:01:30.440 | of the civilization that you're a part of.
00:01:32.120 | It's basically the superorganism that is formed
00:01:33.920 | by the individuals as an intentional agent.
00:01:36.980 | And basically, the Catholics used
00:01:39.720 | a relatively crude mythology to implement software
00:01:43.000 | on the minds of people and get the software synchronized
00:01:45.480 | to make them walk in lockstep.
00:01:46.920 | - To get the software synchronized.
00:01:47.760 | - To basically get this God online
00:01:50.520 | and to make it efficient and effective.
00:01:53.480 | And I think God technically is just a self
00:01:56.480 | that spans multiple brains, as opposed to your self and my self,
00:01:59.280 | which mostly exist on just one brain.
00:02:01.280 | So in some sense, you can construct a self functionally
00:02:04.880 | as a function that is implemented by brains
00:02:07.320 | that exists across brains.
00:02:09.520 | And this is a God with a small g.
00:02:11.320 | - That's one of the, if you look,
00:02:13.200 | Yuval Harari kind of talking about,
00:02:15.280 | this is one of the nice features of our brains,
00:02:18.480 | it seems to, that we can all download
00:02:20.600 | the same piece of software, like God in this case,
00:02:22.760 | and kind of share it.
00:02:24.280 | - Yeah, so basically you give everybody a spec
00:02:26.320 | and the mathematical constraints
00:02:28.600 | that are intrinsic to information processing
00:02:32.400 | make sure that given the same spec,
00:02:34.360 | you come up with a compatible structure.
00:02:36.520 | - Okay, so that's, there's this space of ideas
00:02:38.960 | that we all share and we think that's kind of the mind.
00:02:42.080 | But that's separate from, the idea is,
00:02:46.400 | from Christianity, from religion,
00:02:49.080 | is that there's a separate thing between the mind--
00:02:51.520 | - There is a real world.
00:02:52.360 | And this real world is the world in which God exists.
00:02:55.920 | God is the coder of the multiplayer adventure,
00:02:58.240 | so to speak, and we are all players in this game.
00:03:02.160 | And-- - And that's dualism.
00:03:03.600 | You would say-- - Yes.
00:03:04.600 | But the dualist aspect is because the mental realm
00:03:08.240 | exists in a different implementation
00:03:10.040 | than the physical realm.
00:03:11.640 | And the mental realm is real.
00:03:13.520 | And a lot of people have this intuition
00:03:15.600 | that there is this real room
00:03:16.800 | in which you and me talk and speak right now.
00:03:19.440 | Then comes a layer of physics and abstract rules and so on.
00:03:23.800 | And then comes another real room
00:03:25.160 | where our souls are in our true form,
00:03:28.000 | a thing that gives us phenomenal experience.
00:03:30.000 | And this, of course, is a very confused notion
00:03:32.800 | that you would get.
00:03:34.080 | And it's basically, it's the result of connecting
00:03:37.120 | materialism and idealism in the wrong way.
00:03:41.120 | - So, okay, I apologize, but I think it's really helpful
00:03:44.080 | if we just try to define, try to define terms.
00:03:47.720 | Like, what is dualism, what is idealism,
00:03:49.440 | what is materialism for people that don't know?
00:03:51.320 | - So the idea of dualism in our cultural tradition
00:03:54.360 | is that there are two substances,
00:03:55.880 | a mental substance and a physical substance.
00:03:58.760 | And they interact by different rules.
00:04:01.200 | And the physical world is basically causally closed
00:04:04.320 | and is built on a low-level causal structure.
00:04:07.800 | So there's basically a bottom level
00:04:09.600 | that is causally closed that's entirely mechanical.
00:04:12.480 | And mechanical in the widest sense, so it's computational.
00:04:15.360 | There's basically a physical world
00:04:16.800 | in which information flows around.
00:04:18.640 | And physics describes the laws
00:04:20.280 | of how information flows around in this world.
00:04:22.880 | - Would you compare it to like a computer
00:04:24.760 | where you have hardware and software?
00:04:26.760 | - The computer is a generalization
00:04:28.240 | of information flowing around.
00:04:29.920 | Basically, what you will discover
00:04:32.160 | that there is a universal principle,
00:04:34.040 | you can define this universal machine
00:04:36.600 | that is able to perform all the computations.
00:04:39.320 | So all these machines have the same power.
00:04:41.520 | This means that you can always define
00:04:43.400 | a translation between them,
00:04:44.720 | as long as they have unlimited memory,
00:04:46.600 | to be able to perform each other's computations.
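The equivalence being described here, that universal machines with unlimited memory can perform each other's computations, can be sketched with a toy example. The machine and rules below are illustrative, not from the conversation: a tiny Turing-machine interpreter (one machine) running a rule table (another machine) that increments a binary number on its tape.

```python
# A minimal Turing-machine sketch of the universality idea above:
# one machine (this Python interpreter) simulating another (the rule
# table), given enough tape. Toy example, not from the conversation.

def run_tm(rules, tape, state="carry", blank="0", max_steps=1000):
    """rules maps (state, symbol) -> (new_state, write, move); move is -1 or +1."""
    cells = dict(enumerate(tape))
    head = max(cells)                    # start at the rightmost cell
    for _ in range(max_steps):
        if state == "halt":
            break
        sym = cells.get(head, blank)
        state, write, move = rules[(state, sym)]
        cells[head] = write
        head += move
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1))

# Increment: walk left turning 1s into 0s until a 0 (or blank) becomes 1.
rules = {
    ("carry", "1"): ("carry", "0", -1),
    ("carry", "0"): ("halt", "1", 0),
}
print(run_tm(rules, "1011"))  # prints "1100" (binary 11 + 1 = 12)
```

The translation between machines is just this: the rule table knows nothing about Python, and the interpreter knows nothing about binary arithmetic, yet together they compute it.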
00:04:50.640 | - So would you then say that materialism,
00:04:53.120 | this whole world is just the hardware,
00:04:55.120 | and idealism is this whole world is just the software?
00:04:58.480 | - Not quite.
00:04:59.320 | I think that most idealists
00:05:00.400 | don't have a notion of software yet.
00:05:02.320 | Because software also comes down to information processing.
00:05:05.320 | What you notice is the only thing
00:05:08.120 | that is real to you and me is this experiential world
00:05:10.800 | in which things matter, in which things have taste,
00:05:13.000 | in which things have color, phenomenal content, and so on.
00:05:16.120 | - Oh, there you are bringing up consciousness, okay.
00:05:18.360 | - And this is distinct from the physical world.
00:05:20.520 | In which things have values only in an abstract sense.
00:05:24.880 | And you only look at cold patterns moving around.
00:05:29.240 | So how does anything feel like something?
00:05:31.840 | And this connection between the two things
00:05:33.680 | is very puzzling to a lot of people,
00:05:35.480 | and of course, to many philosophers.
00:05:36.800 | So idealism starts out with the notion
00:05:38.720 | that mind is primary;
00:05:39.680 | materialism, that matter is primary.
00:05:42.520 | And so for the idealist,
00:05:45.120 | the material patterns that we see playing out
00:05:48.320 | are part of the dream that the mind is dreaming.
00:05:50.920 | And we exist in a mind
00:05:53.440 | on a higher plane of existence, if you want.
00:05:55.720 | And for the materialist,
00:05:58.600 | there is only this material thing,
00:06:00.400 | and that generates some models,
00:06:02.360 | and we are the result of these models.
00:06:05.520 | And in some sense,
00:06:06.400 | I don't think that we should understand,
00:06:08.560 | if you understand it properly,
00:06:09.840 | materialism and idealism as a dichotomy,
00:06:13.120 | but as two different aspects of the same thing.
00:06:16.080 | So the weird thing is we don't exist in the physical world.
00:06:18.480 | We do exist inside of a story that the brain tells itself.
00:06:21.640 | - Okay, let me,
00:06:26.160 | let my information processing take that in.
00:06:31.160 | We don't exist in the physical world,
00:06:32.840 | we exist in the narrative.
00:06:34.400 | - Basically, a brain cannot feel anything.
00:06:36.560 | A neuron cannot feel anything.
00:06:37.880 | They're physical things.
00:06:38.720 | Physical systems are unable to experience anything.
00:06:41.440 | But it would be very useful for the brain
00:06:43.360 | or for the organism to know what it would be like
00:06:45.660 | to be a person and to feel something.
00:06:47.840 | So the brain creates a simulacrum of such a person
00:06:51.240 | that it uses to model the interactions of the person.
00:06:53.520 | It's the best model of what that brain,
00:06:55.960 | this organism, thinks it is
00:06:57.440 | in relationship to its environment.
00:06:59.160 | So it creates that model.
00:07:00.280 | It's a story, a multimedia novel
00:07:01.820 | that the brain is continuously writing and updating.
00:07:04.160 | - But you also kind of said that,
00:07:06.320 | you said that we kind of exist in that story.
00:07:09.160 | - In that story, yes.
00:07:10.000 | - In that story.
00:07:11.500 | What is real in any of this?
00:07:15.660 | So like, there's a, again, these terms are,
00:07:20.340 | you kind of said there's a quantum graph.
00:07:22.500 | I mean, what is this whole thing running on then?
00:07:25.420 | Is the story, and is it completely, fundamentally impossible
00:07:29.980 | to get access to it?
00:07:31.260 | Because isn't the story supposed to,
00:07:34.140 | isn't the brain in something,
00:07:37.780 | in existing in some kind of context?
00:07:40.900 | - So what we can identify as computer scientists,
00:07:43.780 | we can engineer systems and test our theories this way
00:07:47.580 | that may have the necessary and sufficient properties
00:07:51.220 | to produce the phenomena that we are observing,
00:07:53.620 | which is there is a self in a virtual world
00:07:56.540 | that is generated in somebody's neocortex
00:07:58.740 | that is contained in the skull of this primate here.
00:08:02.700 | And when I point at this,
00:08:03.860 | this indexicality is of course wrong.
00:08:06.300 | But I do create something that is likely
00:08:08.660 | to give rise to patterns on your retina
00:08:11.860 | that allow you to interpret what I'm saying, right?
00:08:14.380 | But we both know that the world that you and me are seeing
00:08:17.140 | is not the real physical world.
00:08:19.260 | What we are seeing is a virtual reality generated
00:08:21.760 | in your brain to explain the patterns on your retina.
00:08:24.300 | - How close is it to the real world?
00:08:25.900 | That's kind of the question.
00:08:27.800 | You have people like Donald Hoffman
00:08:32.700 | who would say that we're really far away,
00:08:35.220 | that the thing we're seeing, you and I now,
00:08:37.480 | that interface we have, is very far away from anything.
00:08:40.820 | Like we don't even have anything close
00:08:42.940 | like to the sense of what the real world is.
00:08:44.860 | Or is it a very surface piece of architecture?
00:08:48.340 | - Imagine you look at the Mandelbrot fractal, right?
00:08:50.780 | This famous thing that Benoit Mandelbrot discovered.
00:08:54.220 | If you see an overall shape in there, right?
00:08:57.580 | But you know, if you truly understand it,
00:08:59.340 | you know it's two lines of code.
00:09:01.340 | It's basically a series that is being tested
00:09:05.580 | for complex numbers in the complex number plane
00:09:08.480 | for every point.
00:09:09.320 | And for those where the series is diverging,
00:09:12.560 | you paint this black.
00:09:15.400 | And where it's converging, you don't.
00:09:17.840 | And you get the intermediate colors
00:09:21.520 | by checking how far it diverges.
00:09:25.240 | - Yes.
00:09:26.140 | - This gives you this shape of this fractal.
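The "two lines of code" recipe just described can be sketched as an escape-time rendering. A note on the convention: renderings usually paint the points that never diverge (the set's interior) black, and shade the diverging points by how fast they escape; the sketch below follows that convention, and the function names and parameters are illustrative.

```python
# A minimal escape-time sketch of the Mandelbrot recipe described above:
# for each point c in the complex plane, iterate z -> z*z + c and check
# whether the series diverges; shade by how quickly it escapes.

def escape_time(c, max_iter=50):
    """Return iterations until |z| exceeds 2, or max_iter if it never does."""
    z = 0
    for n in range(max_iter):
        if abs(z) > 2.0:
            return n
        z = z * z + c
    return max_iter

def render(width=60, height=24, max_iter=50):
    """ASCII rendering: points that never escape (the set itself) print '#'."""
    palette = " .:-=+*%@"
    rows = []
    for j in range(height):
        im = 1.2 - 2.4 * j / (height - 1)
        row = []
        for i in range(width):
            re = -2.0 + 3.0 * i / (width - 1)
            n = escape_time(complex(re, im), max_iter)
            if n == max_iter:
                row.append("#")          # never diverged: inside the set
            else:
                row.append(palette[n * len(palette) // max_iter])
        rows.append("".join(row))
    return "\n".join(rows)

print(render())
```

The spirals discussed next are exactly the intermediate shading bands this produces: structure an observer sees at one scale, with no spiral anywhere in the generator function itself.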
00:09:28.480 | But imagine you live inside of this fractal
00:09:30.280 | and you don't have access to where you are in the fractal.
00:09:33.220 | Or you have not discovered the generator function even.
00:09:36.600 | Right, so what you see is,
00:09:37.840 | all I can see right now is a spiral.
00:09:39.840 | And this spiral moves a little bit to the right.
00:09:41.620 | Is this an accurate model of reality?
00:09:43.280 | Yes, it is, right?
00:09:44.280 | It is an adequate description.
00:09:46.720 | You know that there is actually no spiral
00:09:49.120 | in the Mandelbrot fractal.
00:09:49.960 | It only appears like this to an observer
00:09:52.560 | that is interpreting things as a two-dimensional space
00:09:55.400 | and then defines certain irregularities in there
00:09:57.960 | at a certain scale that it currently observes.
00:10:00.120 | Because if you zoom in, the spiral might disappear
00:10:02.200 | and turn out to be something different
00:10:03.540 | at a different resolution, right?
00:10:04.900 | - Yes.
00:10:05.740 | - So at this level, you have the spiral.
00:10:06.900 | And then you discover the spiral moves to the right
00:10:08.820 | and at some point it disappears.
00:10:10.260 | So you have a singularity.
00:10:11.580 | At this point, your model is no longer valid.
00:10:13.900 | You cannot predict what happens beyond the singularity.
00:10:16.700 | But you can observe again and you will see
00:10:18.700 | it hit another spiral and at this point it disappeared.
00:10:21.460 | So we now have a second-order law.
00:10:23.500 | And if you make 30 layers of these laws,
00:10:25.660 | then you have a description of the world
00:10:27.460 | that is similar to the one that we come up with
00:10:29.420 | when we describe the reality around us.
00:10:31.100 | It's reasonably predictive.
00:10:32.640 | It does not cut to the core of it.
00:10:34.720 | It does not explain how it's being generated,
00:10:36.680 | how it actually works.
00:10:38.000 | But it's relatively good to explain the universe
00:10:40.720 | that we are entangled with.
00:10:41.560 | - But you don't think the tools of computer science
00:10:43.360 | or the tools of physics could step outside,
00:10:47.080 | see the whole drawing, and get at the basic mechanism
00:10:50.060 | of how the pattern, the spirals, is generated?
00:10:53.520 | - Imagine you would find yourself
00:10:55.200 | embedded into a Mandelbrot fractal
00:10:56.560 | and you try to figure out what works
00:10:57.880 | and you somehow have a Turing machine
00:11:00.200 | with enough memory to think.
00:11:02.300 | And as a result, you come to this idea,
00:11:05.620 | it must be some kind of automaton.
00:11:07.540 | And maybe you just enumerate all the possible automata
00:11:09.900 | until you get to the one that produces your reality.
00:11:12.660 | So you can identify necessary and sufficient conditions.
00:11:15.620 | For instance, we discover that mathematics itself
00:11:18.020 | is the domain of all languages.
00:11:20.340 | And then we see that most of the domains of mathematics
00:11:22.980 | that we have discovered are, in some sense,
00:11:25.420 | describing the same fractals.
00:11:26.700 | This is what category theory is obsessed about,
00:11:28.860 | that you can map these different domains to each other.
00:11:31.280 | So there are not that many fractals.
00:11:33.140 | And some of these have interesting structure
00:11:35.560 | and symmetry breaks.
00:11:37.000 | And so you can discover what region of this global fractal
00:11:41.840 | you might be embedded in from first principles.
00:11:44.400 | But the only way you can get there is from first principles.
00:11:46.720 | So basically, your understanding of the universe
00:11:49.200 | has to start with automata and then number theory
00:11:51.480 | and then spaces and so on.
00:11:53.240 | - Yeah, I think, like Stephen Wolfram still dreams
00:11:56.540 | that he'll be able to arrive at the fundamental rules
00:11:59.660 | of the cellular automata or the generalization
00:12:02.120 | of which is behind our universe.
00:12:04.260 | You've said on this topic,
00:12:08.340 | you said in a recent conversation that, quote,
00:12:11.620 | "Some people think that a simulation can't be conscious
00:12:14.820 | "and only a physical system can,
00:12:16.860 | "but they got it completely backward.
00:12:18.300 | "A physical system cannot be conscious.
00:12:21.080 | "Only a simulation can be conscious.
00:12:23.080 | "Consciousness is a simulated property
00:12:25.040 | "that simulates itself."
00:12:26.380 | Just like you said, the mind is kind of,
00:12:29.900 | we'll call it story, narrative.
00:12:32.580 | There's a simulation.
00:12:33.740 | So our mind is essentially a simulation?
00:12:36.260 | - Usually, I try to use the terminology
00:12:39.580 | so that the mind is basically the principles
00:12:41.920 | that produce the simulation.
00:12:43.100 | It's the software that is implemented by your brain.
00:12:46.100 | And the mind is creating both the universe that we are in
00:12:49.820 | and the self, the idea of a person
00:12:52.680 | that is on the other side of attention
00:12:54.420 | and is embedded in this world.
00:12:56.420 | - Why is that important, that idea of a self?
00:12:59.420 | Why is that an important feature in the simulation?
00:13:02.660 | - It's basically a result of the purpose
00:13:05.520 | that the mind has.
00:13:06.740 | It's a tool for modeling, right?
00:13:08.260 | We are not actually monkeys.
00:13:09.340 | We are side effects of the regulation needs of monkeys.
00:13:13.000 | And what the monkey has to regulate
00:13:15.820 | is the relationship of an organism to an outside world
00:13:20.100 | that is in large part also consisting of other organisms.
00:13:24.340 | And as a result, it basically has regulation targets
00:13:27.340 | that it tries to get to.
00:13:28.740 | These regulation targets start with priors.
00:13:30.700 | They're basically like unconditional reflexes
00:13:32.980 | that we are more or less born with.
00:13:34.460 | And then we can reverse engineer them
00:13:36.220 | to make them more consistent.
00:13:37.860 | And then we get more detailed models
00:13:39.260 | about how the world works and how to interact with it.
00:13:42.140 | And so these priors that you commit to
00:13:44.380 | are largely target values
00:13:46.360 | that our needs should approach, set points.
00:13:49.180 | And this deviation to the set point
00:13:50.780 | creates some urge, some tension.
00:13:53.580 | And we find ourselves living inside of feedback loops,
00:13:56.540 | right?
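The set-point idea just described can be sketched as a toy feedback loop. All names and numbers below are illustrative, not from the conversation: a need has a target value, deviation from it creates an "urge" proportional to the error, and regulation drives the value back toward the set point.

```python
# Illustrative toy only: a homeostatic feedback loop around a set point.
# The deviation from the set point is the "urge" (tension) that drives
# regulation back toward the target.

def regulate(value, set_point, gain=0.5, steps=20):
    """Proportional feedback: each step moves value toward set_point."""
    trajectory = [value]
    for _ in range(steps):
        urge = set_point - value      # tension: how far from the target
        value += gain * urge          # act to reduce the deviation
        trajectory.append(value)
    return trajectory

traj = regulate(value=0.0, set_point=10.0)
print(round(traj[-1], 3))  # prints 10.0: the loop settles at the set point
```

The point of the sketch is only the structure: the organism lives inside many such loops at once, and what it experiences as caring corresponds to the error signals, not to the physical substrate running them.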
00:13:57.380 | Consciousness emerges over dimensions
00:13:58.820 | of disagreements with the universe.
00:14:00.880 | Things that you care about,
00:14:02.660 | things that are not the way they should be,
00:14:04.420 | that you need to regulate.
00:14:05.660 | And so in some sense, the self itself
00:14:07.840 | is the result of all the identifications
00:14:09.740 | that you're having.
00:14:10.580 | An identification is a regulation target
00:14:12.780 | that you're committing to.
00:14:13.900 | It's a dimension that you care about,
00:14:15.700 | that you think is important.
00:14:17.300 | And this is also what locks you in.
00:14:18.660 | If you let go of these commitments,
00:14:21.460 | of these identifications, you get free.
00:14:24.140 | There's nothing that you have to do anymore.
00:14:26.500 | And if you let go of all of them,
00:14:27.740 | you're completely free and you can enter Nirvana
00:14:29.660 | because you're done.
00:14:30.660 | - And actually, this is a good time to pause and say
00:14:34.180 | thank you to a friend of mine, Gustav Söderström,
00:14:37.820 | who introduced me to your work.
00:14:39.260 | I want to give him a shout out.
00:14:40.780 | He's a brilliant guy.
00:14:42.540 | And I think the AI community is actually quite amazing.
00:14:45.540 | And Gustav is a good representative of that.
00:14:47.500 | You are as well.
00:14:48.340 | So I'm glad, first of all,
00:14:50.020 | I'm glad the internet exists and YouTube exists
00:14:51.780 | where I can watch your talks
00:14:54.500 | and then get to your book and study your writing
00:14:57.500 | and think about, you know, that's amazing.
00:14:59.940 | Okay, but you've kind of described
00:15:03.100 | in sort of this emergent phenomenon of consciousness
00:15:05.660 | from the simulation.
00:15:06.800 | So what about the hard problem of consciousness?
00:15:10.420 | Can you just linger on it?
00:15:12.780 | Why does it still feel?
00:15:18.060 | Like, I understand you're kind of,
00:15:19.860 | the self is an important part of the simulation,
00:15:22.340 | but why does the simulation feel like something?
00:15:26.300 | - So if you look at a book by, say, George R.R. Martin
00:15:30.500 | where the characters have plausible psychology
00:15:32.980 | and they stand on a hill
00:15:34.540 | because they want to conquer the city below the hill
00:15:36.460 | and they're done in it
00:15:37.300 | and they look at the color of the sky
00:15:38.660 | and they are apprehensive and feel empowered
00:15:41.900 | and all these things.
00:15:42.740 | Why do they have these emotions?
00:15:43.660 | It's because it's written into the story, right?
00:15:45.940 | And it's written into the story
00:15:46.860 | because it's an adequate model of the person
00:15:48.900 | that predicts what they're going to do next.
00:15:51.660 | And the same thing is true for us.
00:15:53.740 | So it's basically a story that our brain is writing.
00:15:56.060 | It's not written in words.
00:15:57.220 | It's written in perceptual content,
00:16:00.300 | basically multimedia content.
00:16:02.180 | And it's a model of what the person would feel
00:16:05.060 | if it existed.
00:16:06.900 | So it's a virtual person.
00:16:08.860 | And you and me happen to be this virtual person.
00:16:11.100 | So this virtual person gets access to the language center
00:16:14.340 | and talks about the sky being blue.
00:16:16.740 | And this is us.
00:16:18.140 | - But hold on a second.
00:16:19.100 | Do I exist in your simulation?
00:16:22.020 | - You do exist in an almost similar way as me.
00:16:25.740 | So there are internal states that are less accessible
00:16:30.740 | for me that you have and so on.
00:16:34.580 | And my model might not be completely adequate.
00:16:37.060 | There are also things that I might perceive about you
00:16:38.820 | that you don't perceive.
00:16:40.280 | But in some sense, both you and me are some puppets,
00:16:43.140 | two puppets that enact this play in my mind.
00:16:46.540 | And I identify with one of them
00:16:48.460 | because I can control one of the puppets directly.
00:16:51.260 | And with the other one, I can create things in between.
00:16:54.980 | So for instance, we can go in an interaction
00:16:56.900 | that even leads to a coupling to a feedback loop.
00:16:59.420 | So we can sync things together in a certain way
00:17:02.100 | or feel things together.
00:17:03.460 | But this coupling is itself not a physical phenomenon.
00:17:06.260 | It's entirely a software phenomenon.
00:17:08.100 | It's the result of two different implementations
00:17:10.180 | interacting with each other.
00:17:11.300 | - So that's interesting.
00:17:12.260 | So are you suggesting,
00:17:15.220 | like the way you think about it,
00:17:16.860 | is the entirety of existence, the simulation,
00:17:20.000 | and each mind is kind of a little sub-simulation
00:17:24.060 | that like, why don't you,
00:17:27.900 | why doesn't your mind have access
00:17:30.820 | to my mind's full state?
00:17:34.940 | Like--
00:17:36.060 | - For the same reason that my mind
00:17:37.540 | doesn't have access to its own full state.
00:17:40.100 | - So what, I mean--
00:17:43.020 | - There is no trick involved.
00:17:44.540 | So basically when I know something about myself,
00:17:46.700 | it's because I made a model.
00:17:48.260 | So one part of your brain is tasked with modeling
00:17:50.340 | what other parts of your brain are doing.
00:17:52.420 | - Yes, but there seems to be an incredible consistency
00:17:55.900 | about this world in the physical sense,
00:17:58.460 | that there's repeatable experiments and so on.
00:18:00.900 | How does that fit into our silly
00:18:03.980 | descendant-of-apes simulation of the world?
00:18:06.780 | So why is it so repeatable?
00:18:07.820 | Why is everything so repeatable?
00:18:09.460 | And not everything.
00:18:10.760 | There's a lot of fundamental physics experiments
00:18:13.540 | that are repeatable for a long time,
00:18:16.780 | all over the place, and so on.
00:18:19.340 | The laws of physics, how does that fit in?
00:18:21.380 | - It seems that the parts of the world
00:18:23.220 | that are not deterministic are not long lived.
00:18:26.660 | So if you build a system, any kind of automaton,
00:18:30.900 | so if you build simulations of something,
00:18:33.380 | you'll notice that the phenomena that endure
00:18:36.720 | are those that give rise to stable dynamics.
00:18:39.880 | So basically if you see anything that is complex
00:18:41.980 | in the world, it's usually the result of some control,
00:18:44.900 | of some feedback that keeps it stable
00:18:46.780 | around certain attractors.
00:18:48.260 | And the things that are not stable,
00:18:49.740 | that don't give rise to certain harmonic patterns
00:18:52.080 | and so on, they tend to get weeded out over time.
00:18:55.300 | So if we are in a region of the universe
00:18:58.740 | that sustains complexity, which is required
00:19:01.280 | to implement minds like ours,
00:19:04.100 | this is going to be a region of the universe
00:19:05.940 | that is very tightly controlled and controllable.
00:19:08.800 | So it's going to have lots of interesting symmetries
00:19:11.580 | and also symmetry breaks that allow
00:19:13.980 | for the creation of structure.
00:19:15.740 | - But they exist where?
00:19:18.300 | So there's such an interesting idea
00:19:20.220 | that our mind is a simulation that's constructing
00:19:22.000 | the narrative.
00:19:23.300 | My question is, just to try to understand
00:19:27.700 | how that fits with the entirety of the universe.
00:19:31.620 | You're saying that there's a region of this universe
00:19:34.260 | that allows enough complexity to create creatures like us,
00:19:36.980 | but what's the connection between the brain,
00:19:41.480 | the mind, and the broader universe?
00:19:44.300 | Which comes first?
00:19:45.220 | Which is more fundamental?
00:19:46.700 | Is the mind the starting point, the universe is emergent?
00:19:50.300 | Is the universe the starting point, the minds are emergent?
00:19:53.980 | - I think quite clearly the latter.
00:19:55.940 | That's at least a much easier explanation
00:19:57.940 | because it allows us to make causal models.
00:20:00.220 | And I don't see any way to construct an inverse causality.
00:20:03.780 | - So what happens when you die to your mind simulation?
00:20:06.780 | - My implementation ceases.
00:20:09.540 | So basically the thing that implements myself
00:20:12.320 | will no longer be present.
00:20:13.880 | Which means if I'm not implemented
00:20:15.320 | on the minds of other people,
00:20:16.420 | the thing that I identify with.
00:20:18.120 | The weird thing is I don't actually have an identity
00:20:22.620 | beyond the identity that I construct.
00:20:24.740 | If you look at the Dalai Lama,
00:20:26.680 | he identifies as a form of government.
00:20:29.360 | So basically the Dalai Lama gets reborn,
00:20:31.560 | not because he's confused,
00:20:33.440 | but because he is not identifying as a human being.
00:20:38.020 | He runs on a human being.
00:20:39.280 | He's basically a governmental software
00:20:41.760 | that is instantiated in every new generation anew.
00:20:44.840 | So his advisors will pick someone
00:20:46.540 | who does this in the next generation.
00:20:48.340 | So if you identify with this,
00:20:50.320 | you are no longer a human and you don't die in that sense;
00:20:53.640 | what dies is only the body of the human that you run on.
00:20:57.200 | To kill the Dalai Lama, you would have to kill his tradition.
00:21:00.280 | And if we look at ourselves,
00:21:02.400 | we realize that we are to a small part like this,
00:21:04.760 | most of us.
00:21:05.600 | So for instance, if you have children,
00:21:06.520 | you realize something lives on in them.
00:21:09.400 | Or if you spark an idea in the world, something lives on.
00:21:12.040 | Or if you identify with the society around you,
00:21:14.860 | because you are a part of it.
00:21:16.440 | You're not just this human being.
00:21:17.960 | - Yeah, so in a sense, you are kind of like a Dalai Lama
00:21:21.240 | in the sense that you, Joscha Bach,
00:21:24.160 | is just a collection of ideas.
00:21:25.760 | So you have this operating system
00:21:28.200 | on which a bunch of ideas live and interact.
00:21:30.560 | And then once you die,
00:21:31.560 | they kind of, some of them jump off the ship.
00:21:36.080 | - Put it the other way, identity is a software state.
00:21:38.520 | It's a construction.
00:21:39.720 | It's not physically real.
00:21:41.000 | Identity is not a physical concept.
00:21:44.340 | It's basically a representation of different objects
00:21:46.720 | on the same world line.
00:21:48.480 | - But identity lives and dies.
00:21:52.840 | Are you attached?
00:21:53.680 | What's the fundamental thing?
00:21:56.120 | Is it the ideas that come together to form identity?
00:21:59.880 | Or is each individual identity
00:22:01.280 | actually a fundamental thing?
00:22:02.760 | - It's a representation that you can get agency over
00:22:05.120 | if you care.
00:22:05.960 | Basically, you can choose what you identify with
00:22:08.360 | if you want to.
00:22:09.520 | - No, but it just seems, if the mind is not real,
00:22:14.520 | that birth and death are not a crucial part of it.
00:22:21.120 | Well, maybe I'm silly.
00:22:26.400 | Maybe I'm attached to this whole biological organism.
00:22:30.720 | But it seems that the physical,
00:22:33.320 | being a physical object in this world
00:22:36.280 | is an important aspect of birth and death.
00:22:40.240 | Like it feels like it has to be physical to die.
00:22:42.880 | It feels like simulations don't have to die.
00:22:46.400 | - The physics that we experience is not the real physics.
00:22:48.960 | There is no color and sound in the real world.
00:22:51.600 | Color and sound are types of representations
00:22:54.000 | that you get if you want to model reality
00:22:56.720 | with oscillators, right?
00:22:57.680 | So colors and sound in some sense have octaves.
00:23:00.480 | - Yes.
00:23:01.320 | - And it's because they are represented
00:23:02.240 | probably with oscillators, right?
00:23:03.680 | So that's why colors form a circle of hues.
00:23:07.000 | And colors have harmonics, sounds have harmonics
00:23:09.520 | as a result of synchronizing oscillators in the brain, right?
00:23:13.280 | So the world that we subjectively interact with
00:23:15.840 | is fundamentally the result
00:23:18.120 | of the representation mechanisms in our brain.
00:23:20.560 | They are mathematically, to some degree, universal.
00:23:22.600 | There are certain regularities that you can discover
00:23:25.000 | in the patterns and not others.
00:23:26.680 | But the patterns that we get, this is not the real world.
00:23:29.320 | The world that we interact with
00:23:30.480 | is always made of too many parts to count, right?
00:23:33.040 | So when you look at this table and so on,
00:23:35.720 | it's consisting of so many molecules and atoms
00:23:39.000 | that you cannot count them.
00:23:39.920 | So you only look at the aggregate dynamics,
00:23:42.160 | at limit dynamics.
00:23:43.720 | If you had almost infinitely many particles,
00:23:47.560 | what would be the dynamics of the table?
00:23:49.200 | And this is roughly what you get.
00:23:50.400 | So geometry that we are interacting with
00:23:52.800 | is the result of discovering those operators
00:23:54.920 | that work in the limit,
00:23:56.760 | that you get by building an infinite series that converges.
00:24:00.120 | For those parts where it converges, it's geometry.
00:24:02.640 | For those parts where it doesn't converge, it's chaos.
00:24:05.280 | - Right, and then, so all of that is filtered
00:24:07.320 | through the consciousness that's emergent in our narrative.
00:24:12.320 | So the consciousness gives it color,
00:24:14.760 | gives it feeling, gives it flavor.
00:24:16.960 | - So I think the feeling, flavor, and so on,
00:24:20.480 | is given by the relationship that a feature has
00:24:22.640 | to all the other features.
00:24:24.240 | It's basically a giant relational graph
00:24:26.280 | that is our subjective universe.
00:24:28.460 | The color is given by those aspects of the representation,
00:24:31.840 | of this experiential color, where you care,
00:24:35.080 | where you have identifications,
00:24:36.680 | where something means something,
00:24:38.040 | where you are the inside of a feedback loop,
00:24:39.760 | and the dimensions of caring are basically dimensions
00:24:43.160 | of this motivational system that we emerge over.