
Happiness is a cookie that your brain bakes for itself (Joscha Bach) | AI Podcast Clips


Chapters

0:00 Happiness is a cookie
1:46 The cell is a molecular machine
2:40 The cell as a von Neumann probe
3:13 The cellular automata
5:19 God
7:26 Creation

Whisper Transcript

00:00:00.000 | - So do you think suffering is fundamental
00:00:04.760 | to happiness along these lines?
00:00:07.480 | - Suffering is the result of caring
00:00:09.120 | about things that you cannot change.
00:00:11.140 | And if you are able to change what you care about
00:00:13.800 | to those things that you can change, you will not suffer.
00:00:16.040 | - But would you then be able to experience happiness?
00:00:20.160 | - Yes, but happiness itself is not important.
00:00:23.280 | Happiness is like a cookie.
00:00:25.040 | When you are a child, you think cookies are very important
00:00:27.200 | and you want to have all the cookies in the world
00:00:28.760 | and you look forward to being an adult
00:00:30.360 | because then you have as many cookies as you want, right?
00:00:32.800 | - Yes.
00:00:33.640 | - But as an adult, you realize a cookie is a tool.
00:00:35.860 | It's a tool to make you eat vegetables.
00:00:38.120 | And once you eat your vegetables anyway,
00:00:39.600 | you stop eating cookies for the most part
00:00:41.360 | because otherwise you will get diabetes
00:00:43.120 | and will not be around for your kids.
00:00:44.560 | - Yes, but then the cookie, the scarcity of a cookie,
00:00:48.080 | if scarcity is enforced, nevertheless,
00:00:50.400 | so like the pleasure comes from the scarcity.
00:00:52.560 | - Yes, but the happiness is a cookie
00:00:54.840 | that your brain bakes for itself.
00:00:56.800 | It's not made by the environment.
00:00:58.440 | The environment cannot make you happy.
00:00:59.880 | It's your appraisal of the environment
00:01:01.560 | that makes you happy.
00:01:03.040 | And if you can change the appraisal of the environment,
00:01:05.080 | which you can learn to,
00:01:06.000 | then you can create arbitrary states of happiness.
00:01:08.360 | And some meditators fall into this trap.
00:01:10.160 | So they discover the room, this basement room in their brain
00:01:13.200 | where the cookies are made,
00:01:14.520 | and they indulge and stuff themselves.
00:01:16.280 | And after a few months, it gets really old
00:01:18.280 | and the big crisis of meaning comes
00:01:20.400 | because they thought before that their unhappiness
00:01:23.000 | was the result of not being happy enough.
00:01:25.360 | So they fixed this, right?
00:01:26.520 | They can release the neurotransmitters at will
00:01:28.520 | if they train.
00:01:29.720 | And then the crisis of meaning pops up at a deeper layer.
00:01:34.440 | And the question is, why do I live?
00:01:35.680 | How can I make a sustainable civilization
00:01:37.600 | that is meaningful to me?
00:01:38.920 | How can I insert myself into this?
00:01:40.640 | And this was the problem that you couldn't solve
00:01:42.080 | in the first place.
00:01:43.040 | - But at the end of all this,
00:01:47.520 | let me then ask that same question.
00:01:49.080 | What is the answer to that?
00:01:51.800 | What could the possible answer be of the meaning of life?
00:01:55.240 | What could an answer be?
00:01:57.080 | What is it to you?
00:01:58.720 | - I think that if you look at the meaning of life,
00:02:01.120 | you look at what the cell is.
00:02:03.120 | Life is the cell, right?
00:02:04.960 | - The original cell.
00:02:05.800 | - Yes, or this principle, the cell.
00:02:07.880 | It's this self-organizing thing
00:02:10.160 | that can participate in evolution.
00:02:12.200 | In order to make it work, it's a molecular machine.
00:02:14.600 | It needs a self-replicator,
00:02:15.880 | an entropy extractor, and a Turing machine.
00:02:18.380 | If any of these parts is missing,
00:02:19.680 | you don't have a cell and it is not living, right?
00:02:22.040 | And life is basically the emergent complexity
00:02:24.240 | over that principle.
00:02:25.560 | Once you have this intelligent super molecule, the cell,
00:02:29.360 | there is very little that you cannot make it do.
00:02:31.040 | It's probably the optimal computronium,
00:02:33.440 | especially in terms of resilience.
00:02:35.440 | It's very hard to sterilize the planet
00:02:37.240 | once it's infected with life.
00:02:38.680 | - So this active function of these three components
00:02:43.680 | of the super cell,
00:02:45.080 | the cell, is present in the cell,
00:02:47.480 | it's present in us, and it's just--
00:02:49.640 | - We are just an expression of the cell.
00:02:51.240 | It's a certain layer of complexity
00:02:52.840 | in the organization of cells.
00:02:54.840 | So in a way, it's tempting to think of the cell
00:02:57.600 | as a von Neumann probe.
00:02:59.120 | If you want to build intelligence on other planets,
00:03:02.120 | the best way to do this is to infect them with cells.
00:03:05.320 | And wait for long enough, and there's a reasonable chance
00:03:07.800 | the stuff is going to evolve
00:03:09.080 | into an information processing principle
00:03:11.200 | that is general enough to become sentient.
00:03:13.400 | - Well, that idea is very akin to sort of the same dream
00:03:17.820 | and beautiful ideas that are expressed
00:03:19.280 | in cellular automata in their most simple mathematical form.
00:03:22.320 | If you just inject the system with some basic mechanisms
00:03:26.840 | of replication and so on, basic rules,
00:03:29.000 | amazing things would emerge.
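As an aside, the kind of emergence described here can be demonstrated with a one-dimensional cellular automaton. The conversation names no specific rule; the choice of Wolfram's Rule 110 (known to produce complex, even Turing-complete, behavior from a fixed local rule) and the Python sketch below are illustrative assumptions, not something stated by the speakers:

```python
# Minimal elementary cellular automaton: simple local rules, rich emergence.
# Rule 110 is an assumed example; any elementary rule number (0-255) works.

RULE = 110  # bit i of RULE gives the next state for 3-cell neighborhood i


def step(cells):
    """Apply one synchronous update with periodic boundary conditions."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value 0..7
        out.append((RULE >> neighborhood) & 1)
    return out


def run(width=64, steps=30):
    """Seed a single live cell and record the evolution row by row."""
    cells = [0] * width
    cells[width // 2] = 1  # the "injected" seed
    history = [cells]
    for _ in range(steps):
        cells = step(cells)
        history.append(cells)
    return history


if __name__ == "__main__":
    for row in run():
        print("".join("#" if c else "." for c in row))
```

From a single seed cell, the update rule alone generates a growing, intricately structured pattern, which is the sense in which "basic rules" of replication can make "amazing things" emerge.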
00:03:30.880 | - And the cell is able to do something
00:03:32.520 | that James Chardy calls existential design.
00:03:35.760 | He points out that in technical design,
00:03:37.920 | we go from the outside in.
00:03:39.120 | We work in a highly controlled environment
00:03:41.400 | in which everything is deterministic,
00:03:42.800 | like our computers, our labs, or our engineering workshops.
00:03:46.640 | And then we use this determinism
00:03:48.280 | to implement a particular kind of function
00:03:50.640 | that we dream up and that seamlessly interfaces
00:03:53.400 | with all the other deterministic functions
00:03:55.400 | that we already have in our world.
00:03:57.000 | So it's basically from the outside in.
00:03:59.720 | And biological systems design from the inside out:
00:04:02.600 | a seed will become a seedling
00:04:05.160 | by taking some of the relatively unorganized matter
00:04:08.240 | around it and turning it into its own structure,
00:04:11.760 | and thereby subduing the environment.
00:04:13.520 | And cells can cooperate if they can rely on other cells
00:04:16.480 | having a similar organization that is already compatible.
00:04:19.400 | But unless that's there, the cell needs to divide,
00:04:23.400 | to create that structure by itself.
00:04:25.200 | So it's a self-organizing principle
00:04:27.240 | that works on a somewhat chaotic environment.
00:04:29.880 | And the purpose of life, in a sense,
00:04:31.720 | is to produce complexity.
00:04:34.680 | And the complexity allows you
00:04:36.040 | to harvest negentropy gradients
00:04:37.640 | that you couldn't harvest without the complexity.
00:04:40.080 | And in this sense, intelligence and life
00:04:42.120 | are very strongly connected,
00:04:43.500 | because the purpose of intelligence
00:04:45.320 | is to allow control under conditions of complexity.
00:04:48.320 | So basically, you shift the boundary
00:04:50.160 | between the ordered systems into the realm of chaos.
00:04:54.640 | You build bridgeheads into chaos with complexity.
00:04:58.060 | And this is what we are doing.
00:04:59.620 | This is not necessarily a deeper meaning.
00:05:01.440 | I think the meaning is what we have priors for,
00:05:03.400 | what we have evolved for;
00:05:04.640 | outside of those priors, there is no meaning.
00:05:06.240 | Meaning only exists if a mind projects it, right?
00:05:08.600 | - Yeah, the narrative.
00:05:09.440 | - That is probably civilization.
00:05:11.280 | I think that what feels most meaningful to me
00:05:14.160 | is to try to build and maintain a sustainable civilization.
00:05:18.660 | - And taking a slight step outside of that,
00:05:21.180 | we talked about a man with a beard and God.
00:05:25.460 | But something, some mechanism,
00:05:30.460 | perhaps must have planted the seed,
00:05:33.860 | the initial seed of the cell.
00:05:35.340 | Do you think there is a God?
00:05:38.780 | What is a God?
00:05:40.500 | And what would that look like?
00:05:41.940 | - So if there was no spontaneous abiogenesis,
00:05:45.060 | in the sense that the first cell
00:05:46.780 | formed by some happy random accidents
00:05:51.980 | where the molecules just happened to be
00:05:53.420 | in the right constellation relative to each other--
00:05:53.420 | - But there could also be the mechanism
00:05:55.940 | that allows for the random.
00:05:57.660 | I mean, there's like turtles all the way down.
00:06:00.260 | There seems to be,
00:06:01.140 | there has to be a head turtle at the bottom.
00:06:03.580 | - Let's consider something really wild.
00:06:05.260 | Imagine, is it possible that a gas giant
00:06:08.460 | could become intelligent?
00:06:10.260 | What would that involve?
00:06:11.340 | So imagine that you have vortices
00:06:13.380 | that spontaneously emerge on the gas giants,
00:06:15.700 | like big storm systems that endure for thousands of years.
00:06:19.380 | And some of these storm systems
00:06:20.980 | produce electromagnetic fields
00:06:22.220 | because some of the clouds are ferromagnetic or something.
00:06:25.020 | And as a result,
00:06:25.900 | they can change how certain clouds react
00:06:28.260 | rather than other clouds
00:06:29.420 | and thereby produce some self-stabilizing patterns
00:06:32.380 | that eventually lead to regulation feedback loops,
00:06:34.600 | nested feedback loops, and control.
00:06:37.040 | So imagine you have such a thing
00:06:38.740 | that basically has emergent self-sustaining,
00:06:41.020 | self-organizing complexity.
00:06:42.340 | And at some point this wakes up
00:06:43.580 | and realizes, I'm basically Lem's Solaris.
00:06:46.020 | I am a thinking planet,
00:06:47.760 | but I will not replicate
00:06:48.900 | because I cannot recreate the conditions
00:06:50.780 | of my own existence somewhere else.
00:06:53.100 | I'm just basically an intelligence
00:06:55.100 | that has spontaneously formed because it could.
00:06:58.220 | And now it builds a von Neumann probe.
00:07:00.740 | And the best von Neumann probe for such a thing
00:07:02.500 | might be the cell.
00:07:03.660 | So maybe it, because it's very, very clever
00:07:05.660 | and very enduring, creates cells and sends them out.
00:07:08.540 | And one of them has infected our planet.
00:07:10.740 | And I'm not suggesting that this is the case,
00:07:12.380 | but it would be compatible with the panspermia
00:07:14.820 | hypothesis and with my intuition
00:07:17.120 | that abiogenesis is very unlikely.
00:07:19.400 | It's possible, but you probably need
00:07:21.700 | to roll the cosmic dice very often,
00:07:23.580 | maybe more often than there are planetary surfaces.
00:07:25.740 | I don't know.
00:07:26.580 | - So God is just a large enough,
00:07:31.020 | a system that's large enough that allows randomness.
00:07:35.340 | - No, I don't think that God
00:07:36.260 | has anything to do with creation.
00:07:37.820 | I think it's a mistranslation of the Talmud
00:07:40.300 | into the Catholic mythology.
00:07:43.060 | I think that Genesis is actually
00:07:44.340 | the childhood memories of a God.
00:07:46.340 | So the, when--
00:07:47.860 | - Sorry, that Genesis is the--
00:07:49.620 | - The childhood memories of a God.
00:07:50.940 | It's basically a mind that is remembering
00:07:54.180 | how it came into being.
00:07:55.820 | And we typically interpret Genesis
00:07:58.780 | as the creation of a physical universe
00:08:00.420 | by a supernatural being.
00:08:02.260 | And I think when you read it,
00:08:05.800 | there is light and darkness that is being created.
00:08:09.420 | And then you discover sky and ground, create them.
00:08:13.180 | You construct the plants and the animals,
00:08:16.540 | and you give everything their names and so on.
00:08:18.820 | That's basically cognitive development.
00:08:20.180 | It's a sequence of steps that every mind has to go through
00:08:24.740 | when it makes sense of the world.
00:08:25.980 | And when you have children,
00:08:26.800 | you can see how initially they distinguish
00:08:29.100 | light and darkness.
00:08:30.420 | And then they make out directions in it,
00:08:32.340 | and they discover sky and ground,
00:08:33.720 | and they discover the plants and the animals,
00:08:35.580 | and they give everything their name.
00:08:36.580 | And it's a creative process that happens in every mind,
00:08:39.580 | because it's not given, right?
00:08:40.880 | Your mind has to invent these structures
00:08:43.060 | to make sense of the patterns on your retina.
00:08:45.820 | Also, if there was some big nerd who set up a server
00:08:48.660 | and runs this world on it,
00:08:50.580 | this would not create a special relationship
00:08:52.820 | between us and the nerd.
00:08:53.940 | This nerd would not have the magical power
00:08:56.060 | to give meaning to our existence, right?
00:08:58.260 | So this equation of a creator God
00:09:01.300 | with the God of meaning is a sleight of hand.
00:09:04.420 | You shouldn't do it.
00:09:05.860 | The other one that is done in Catholicism
00:09:07.940 | is the equation of the first mover,
00:09:10.100 | the prime mover of Aristotle,
00:09:12.220 | which is basically the automaton that runs the universe.
00:09:15.100 | Aristotle says, "If things are moving,
00:09:17.660 | "and things seem to be moving here,
00:09:19.060 | "something must move them," right?
00:09:20.860 | If something moves them,
00:09:22.120 | something must move the thing that is moving it.
00:09:24.060 | So there must be a prime mover.
00:09:26.140 | This idea to say that this prime mover
00:09:28.140 | is a supernatural being is complete nonsense, right?
00:09:31.220 | It's an automaton in the simplest case.
00:09:34.480 | So we have to explain the enormity
00:09:36.100 | that this automaton exists at all.
00:09:38.260 | But again, we don't have any possibility
00:09:41.220 | to infer anything about its properties
00:09:43.420 | except that it's able to produce change in information, right?
00:09:47.900 | So there needs to be some kind of computational principle.
00:09:50.500 | This is all there is.
00:09:51.660 | But to say this automaton is identical, again,
00:09:54.220 | with the creator, the first cause,
00:09:55.860 | or with the thing that gives meaning to our life
00:09:57.700 | is confusion.
00:10:00.020 | No, I think that what we perceive
00:10:01.660 | is the higher being that we are part of.
00:10:05.100 | And the higher being that we are part of
00:10:06.800 | is the civilization.
00:10:08.380 | It's the thing in which we have a similar relationship
00:10:10.680 | as the cell has to our body.
00:10:12.520 | And we have this prior
00:10:14.820 | because we have evolved to organize in these structures.
00:10:17.620 | So basically, the Christian god in its natural form,
00:10:21.820 | without the mythology, if you undress it,
00:10:24.100 | is basically the platonic form of a civilization.
00:10:26.540 | - Is the ideal, is the--
00:10:30.540 | - Yes, it's this ideal that you try to approximate
00:10:32.660 | when you interact with others,
00:10:34.300 | not based on your incentives,
00:10:35.620 | but on what you think is right.
00:10:37.180 | - Wow, we covered a lot of ground.
00:10:41.740 | And we're left with one of my favorite lines,
00:10:44.380 | and there's many, which is,
00:10:45.960 | "Happiness is a cookie that the brain bakes itself."
00:10:50.960 | It's been a huge honor and a pleasure to talk to you.
00:10:56.260 | I'm sure our paths will cross many times again.
00:10:59.760 | - Joscha, thank you so much for talking today.
00:11:02.020 | Really appreciate it. - Thank you, Lex.
00:11:02.860 | It was so much fun.
00:11:04.660 | I enjoyed it.
00:11:05.780 | - Awesome.
00:11:06.620 | (upbeat music)