
Moore's Law is Not Dead (Jim Keller) | AI Podcast Clips


Chapters

0:20 What Is Moore's Law
1:25 Broader Definition of Moore's Law
14:52 Building Blocks of Mathematics
19:40 Nanowires

Whisper Transcript

00:00:00.000 | For over 50 years now, Moore's Law has served for me and millions of others as an inspiring
00:00:08.760 | beacon of what kind of amazing future brilliant engineers can build.
00:00:14.120 | I'm just making your kids laugh all day today.
00:00:16.600 | That's great.
00:00:18.420 | So first, in your eyes, what is Moore's Law, if you could define for people who don't know?
00:00:25.960 | Well, the simple statement from Gordon Moore was: double the number of transistors
00:00:30.680 | every two years.
00:00:32.920 | Something like that.
00:00:34.200 | And then my operational model is, we increase the performance of computers by 2x every two
00:00:41.880 | or three years.
00:00:43.400 | And it's wiggled around substantially over time.
00:00:46.340 | And also, how we deliver performance has changed.
00:00:50.320 | Right.
00:00:51.320 | But the foundational idea was 2x the transistors every two years.
00:00:57.680 | The current cadence is something like, they call it a shrink factor, like .6 every two
00:01:04.200 | years, which is not .5.
00:01:06.720 | But that's referring strictly, again, to the original definition of just--
00:01:09.840 | Yeah, of transistor count.
00:01:11.320 | A shrink factor, just getting them smaller and smaller and smaller.
00:01:13.920 | Well, it's for a constant chip area.
00:01:16.540 | If you make the transistors smaller by .6, then you get one over .6 more transistors.
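
A quick back-of-the-envelope sketch of that cadence (the 0.6 shrink factor and two-year period are from the conversation; the five-generation horizon is just an illustration):

```python
# Transistor density for a constant chip area under a 0.6 area-shrink
# factor every two years, as described above.
shrink = 0.6           # area scale factor per generation (from the conversation)
years_per_gen = 2

density = 1.0
for gen in range(1, 6):
    density /= shrink  # smaller transistors -> 1/0.6, about 1.67x more per area
    print(f"after {gen * years_per_gen} years: {density:.2f}x transistors")
# After 10 years (5 generations): 1 / 0.6**5, about 12.9x the transistors
```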
00:01:21.980 | So can you linger on it a little longer?
00:01:24.700 | What's a broader-- what do you think should be the broader definition of Moore's Law?
00:01:28.340 | When you mentioned how you think of performance, just broadly, what's a good way to think about
00:01:35.420 | Moore's Law?
00:01:36.420 | Well, first of all, I've been aware of Moore's Law for 30 years.
00:01:42.940 | In which sense?
00:01:43.940 | Well, I've been designing computers for 40.
00:01:47.780 | You're just watching it before your eyes, kind of thing.
00:01:50.620 | Well, around the time I became aware of it, I was also informed that Moore's Law was
00:01:54.580 | going to die in 10 to 15 years.
00:01:56.500 | And I thought that was true at first.
00:01:58.820 | But then after 10 years, it was going to die in 10 to 15 years.
00:02:02.380 | And then at one point, it was going to die in five years.
00:02:04.620 | And then it went back up to 10 years.
00:02:06.180 | And at some point, I decided not to worry about that particular prognostication for
00:02:11.540 | the rest of my life, which is fun.
00:02:14.500 | And then I joined Intel, and everybody said Moore's Law is dead.
00:02:17.660 | And I thought, that's sad, because it's the Moore's Law company.
00:02:20.620 | And it's not dead.
00:02:21.780 | And it's always been going to die.
00:02:24.260 | And humans like these apocalyptic kind of statements, like, we'll run out of food, or we'll run
00:02:30.340 | out of air, or run out of room, or run out of something.
00:02:34.100 | Right.
00:02:35.100 | But it's still incredible that it's lived for as long as it has.
00:02:39.540 | And yes, there's many people who believe now that Moore's Law is dead.
00:02:44.860 | You know, they can join the last 50 years of people who had the same idea.
00:02:48.540 | Yeah, there's a long tradition.
00:02:50.340 | But why do you think, if you can try to understand it, why do you think it's not dead currently?
00:02:58.100 | Well, first, let's just think, people think Moore's Law is one thing.
00:03:02.300 | Transistors get smaller.
00:03:04.020 | But actually, under the sheet, there's literally thousands of innovations.
00:03:07.360 | And almost all those innovations have their own diminishing return curves.
00:03:12.180 | So if you graph it, it looks like a cascade of diminishing return curves.
00:03:15.500 | I don't know what to call that.
00:03:17.560 | But the result is an exponential curve.
00:03:20.420 | Well, at least it has been.
00:03:24.180 | And we keep inventing new things.
00:03:25.660 | So if you're an expert in one of the things on a diminishing return curve, right, and
00:03:30.860 | you can see its plateau, you will probably tell people, well, this is done.
00:03:36.540 | Meanwhile, some other pile of people are doing something different.
00:03:41.220 | So that's just normal.
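
A small sketch of the picture he describes (the logistic S-curves, their timing, and their sizes are invented purely to illustrate how a cascade of diminishing-return curves can add up to an exponential):

```python
import numpy as np

# A cascade of diminishing-return (logistic) curves, each new technology
# ramping up as the previous one plateaus; their sum tracks an exponential.
t = np.linspace(0, 10, 200)
total = np.zeros_like(t)
for k in range(6):                      # six hypothetical innovations
    onset, scale = 1.5 * k, 2.0 ** k    # later curves contribute more
    total += scale / (1 + np.exp(-(t - onset) * 2))

# Each term flattens out on its own, but log2(total) rises roughly
# linearly with t, i.e. the envelope of the cascade looks exponential.
print(np.round(np.log2(total[::40]), 2))
```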
00:03:43.100 | So then there's the observation of how small could a switching device be?
00:03:48.900 | So a modern transistor is something like a thousand by a thousand by a thousand atoms,
00:03:53.500 | right?
00:03:54.860 | And you get quantum effects down around two to ten atoms.
00:03:59.500 | So you can imagine a transistor as small as 10 by 10 by 10.
00:04:03.020 | So that's a million times smaller.
00:04:06.940 | And then the quantum computational people are working away at how to use quantum effects.
00:04:12.300 | So a thousand by a thousand by a thousand atoms.
00:04:18.580 | That's a really clean way of putting it.
00:04:21.500 | Well, the fin of a modern transistor, if you look at the fin, it's like 120 atoms wide.
00:04:26.860 | But we can make that thinner.
00:04:28.180 | And then there's a gate wrapped around it.
00:04:30.460 | And then there's spacing.
00:04:31.460 | There's a whole bunch of geometry.
00:04:33.740 | And a competent transistor designer could count the atoms in every single direction.
00:04:42.820 | Like there's techniques now to already put down atoms in a single atomic layer.
00:04:47.900 | And you can place atoms if you want to.
00:04:50.720 | It's just from a manufacturing process, placing an atom takes 10 minutes.
00:04:56.140 | And you need to put 10 to the 23rd atoms together to make a computer.
00:05:01.340 | It would take a long time.
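
As a rough sanity check on those numbers (the 10-minutes-per-atom rate and the 10^23 count are from the conversation; the arithmetic is just an illustration):

```python
# Back-of-envelope: why atom-by-atom assembly doesn't scale, and how much
# headroom the 1000^3 -> 10^3 atom transistor comparison implies.
atoms_needed = 1e23                    # atoms in a computer (from the conversation)
minutes_per_atom = 10                  # placement time (from the conversation)

years = atoms_needed * minutes_per_atom / (60 * 24 * 365)
print(f"{years:.1e} years")            # ~1.9e18 years, far longer than the
                                       # age of the universe (~1.4e10 years)

volume_ratio = (1000 / 10) ** 3        # per-transistor volume headroom
print(f"{volume_ratio:.0e}x smaller")  # the "million times smaller" above
```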
00:05:03.660 | So the methods are both shrinking things and then coming up with effective ways to control
00:05:10.940 | what's happening.
00:05:13.260 | Manufacture stably and cheaply.
00:05:15.380 | Yeah.
00:05:16.460 | So the innovation stack's pretty broad.
00:05:19.380 | There's equipment, there's optics, there's chemistry, there's physics, there's material
00:05:22.940 | science, there's metallurgy.
00:05:25.940 | There's lots of ideas about when you put different materials together, how do they interact?
00:05:29.300 | Are they stable?
00:05:30.300 | Are they stable over temperature?
00:05:33.740 | Like are they repeatable?
00:05:36.300 | There's literally thousands of technologies involved.
00:05:39.820 | - But just for the shrinking, you don't think we're quite yet close to the fundamental limits
00:05:44.380 | of physics?
00:05:45.380 | - I did a talk on Moore's Law and I asked for a roadmap to a path of 100.
00:05:49.700 | And after two weeks, they said we only got to 50.
00:05:52.780 | - 100 what, sorry?
00:05:54.260 | - 100x shrink.
00:05:55.260 | - About 100x shrink.
00:05:56.260 | We only got to 50.
00:05:57.260 | - To 50. And I said, "Why don't you give it another two weeks?"
00:06:02.540 | Well here's the thing about Moore's Law.
00:06:04.460 | So I believe that the next 10 or 20 years of shrinking is gonna happen.
00:06:11.180 | Now as a computer designer, you have two stances.
00:06:15.780 | You think it's going to shrink, in which case you're designing and thinking about architecture
00:06:20.860 | in a way that you'll use more transistors.
00:06:23.820 | Or conversely, not be swamped by the complexity of all the transistors you get.
00:06:30.940 | You have to have a strategy.
00:06:34.620 | - You're open to the possibility and waiting for the possibility of a whole new army of
00:06:38.920 | transistors ready to work.
00:06:40.660 | - I'm expecting more transistors every two or three years by a number large enough that
00:06:47.700 | how you think about design, how you think about architecture has to change.
00:06:52.060 | Like imagine you build buildings out of bricks and every year the bricks are half the size
00:06:59.300 | or every two years.
00:07:00.700 | Well if you kept building bricks the same way, so many bricks per person per day, the
00:07:06.220 | amount of time to build a building would go up exponentially.
00:07:09.660 | But if you said, "I know that's coming, so now I'm gonna design equipment that moves
00:07:16.140 | bricks faster, uses them better," because maybe you're getting something out of the
00:07:19.420 | smaller bricks: more strength, thinner walls, less material, more efficiency out of that.
00:07:25.180 | So once you have a roadmap with what's gonna happen, transistors, we're gonna get more
00:07:30.140 | of them, then you design all this collateral around it to take advantage of it and also
00:07:35.300 | to cope with it.
00:07:37.460 | That's the thing people don't understand.
00:07:38.940 | If I didn't believe in Moore's Law and then Moore's Law transistors showed up, my design
00:07:43.940 | teams would all have drowned.
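
A toy version of the brick analogy (the halving period is his; the building size and laying rate are made-up illustration numbers):

```python
# If bricks halve in linear size every two years, the same wall needs
# 8x as many bricks each generation; at a fixed laying rate, build time
# grows exponentially unless the equipment improves too.
wall_volume = 1000.0  # arbitrary units (illustration)
brick_side = 1.0      # starting brick size (illustration)
rate = 100.0          # bricks laid per person per day, held fixed

for year in range(0, 9, 2):
    bricks = wall_volume / brick_side ** 3
    print(f"year {year}: {bricks:,.0f} bricks, {bricks / rate:,.0f} days")
    brick_side /= 2   # bricks half the size every two years
```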
00:07:46.540 | - So what's the hardest part of this influx of new transistors?
00:07:52.060 | I mean, even if you just look historically throughout your career, what's the thing,
00:07:59.060 | what fundamentally changes when you add more transistors in the task of designing an architecture?
00:08:06.140 | - There's two constants, right?
00:08:07.380 | One is people don't get smarter.
00:08:10.860 | - By the way, there's some science showing that we do get smarter because of nutrition,
00:08:14.740 | whatever.
00:08:15.740 | - Yeah.
00:08:16.740 | - Sorry to bring that up.
00:08:17.740 | - The Flynn effect.
00:08:18.740 | - Yes.
00:08:19.740 | - Yeah, I'm familiar with it.
00:08:20.740 | Nobody understands it.
00:08:21.740 | Nobody knows if it's still going on.
00:08:22.740 | - Or whether it's real or not.
00:08:23.740 | - Yeah, I sort of--
00:08:24.740 | - Anyway, but not exponentially.
00:08:25.740 | - I would believe for the most part people aren't getting much smarter.
00:08:30.300 | - The evidence doesn't support it.
00:08:31.740 | That's right.
00:08:32.740 | - And then teams can't grow that much.
00:08:35.700 | So human beings, we're really good in teams of 10, up to teams of 100, they can know each
00:08:42.460 | other.
00:08:43.460 | Beyond that, you have to have organizational boundaries.
00:08:45.620 | So those are pretty hard constraints, right?
00:08:49.420 | So then you have to divide and conquer.
00:08:51.260 | Like as the designs get bigger, you have to divide it into pieces.
00:08:55.020 | You know, the power of abstraction layers is really high.
00:08:57.980 | We used to build computers out of transistors.
00:09:00.900 | Now we have a team that turns transistors into logic cells and another team that turns
00:09:04.340 | them into functional units and another one that turns them into computers, right?
00:09:07.900 | So we have abstraction layers in there.
00:09:11.100 | And you have to think about when do you shift gears on that?
00:09:16.100 | We also use faster computers to build faster computers.
00:09:19.060 | So some algorithms run twice as fast on new computers, but a lot of algorithms are N squared.
00:09:25.220 | So a design with twice as many transistors in it might take four times as long to run through the tools.
00:09:31.300 | So you have to refactor the software.
00:09:33.540 | Like simply using faster computers to build bigger computers doesn't work.
00:09:38.980 | So you have to think about all these things.
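
A minimal sketch of that scaling problem (the quadratic pass below is a hypothetical stand-in for an all-pairs check in a design tool):

```python
import time

def quadratic_tool(n):
    """Hypothetical O(n^2) design step: visit every pair of 'transistors'."""
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

for n in (2000, 4000):  # double the design size
    start = time.perf_counter()
    quadratic_tool(n)
    print(f"n={n}: {time.perf_counter() - start:.2f}s")
# Doubling n roughly quadruples the runtime, so a 2x-faster computer
# still loses ground on a 2x-bigger design unless the algorithm is refactored.
```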
00:09:41.020 | So in terms of computing performance and the exciting possibilities that more powerful computers
00:09:45.300 | bring, is shrinking, the thing we've just been talking about,
00:09:51.420 | for you one of the biggest exciting possibilities of advancement in performance?
00:09:56.260 | Or are there other directions that you're interested in?
00:09:59.900 | Like in the direction of sort of enforcing given parallelism or like doing massive parallelism
00:10:06.780 | in terms of many, many CPUs, stacking CPUs on top of each other, that kind of parallelism
00:10:14.420 | or any kind of parallelism?
00:10:15.420 | Well, think about it in a different way.
00:10:17.020 | So old computers, slow computers, you said A equal B plus C times D. Pretty simple, right?
00:10:25.420 | And then we made faster computers with vector units and you can do proper equations and
00:10:31.380 | matrices, right?
00:10:33.340 | And then there are modern AI computations, like convolutional neural networks, where you convolve
00:10:38.980 | one large dataset against another.
00:10:41.980 | And so there's sort of this hierarchy of mathematics, from simple equation to linear equations,
00:10:48.820 | to matrix equations, to deeper kind of computation.
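
A compact illustration of that hierarchy (the NumPy snippet below is just an illustration of the four levels named above):

```python
import numpy as np

# 1. Scalar: A = B + C * D
a = 2.0 + 3.0 * 4.0

# 2. Vector: the same arithmetic over whole arrays at once
v = np.arange(8.0) + 2.0 * np.ones(8)

# 3. Matrix: linear algebra
m = np.random.rand(4, 4) @ np.random.rand(4, 4)

# 4. Convolution: slide one dataset (a kernel) across another (a signal)
signal = np.random.rand(100)
kernel = np.array([0.25, 0.5, 0.25])
conv = np.convolve(signal, kernel, mode="valid")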
00:10:53.620 | And the datasets are getting so big that people are thinking of data as a topology problem.
00:10:59.540 | Data is organized in some immense shape.
00:11:02.860 | And then the computation sort of wants to get data from that immense shape and do some
00:11:08.060 | computation on it.
00:11:10.140 | So what computers have allowed people to do is have algorithms go much, much further.
00:11:17.320 | So that paper you referenced, the Sutton paper, they talked about, like when AI started, it
00:11:24.020 | was apply rule sets to something.
00:11:26.740 | That's a very simple computational situation.
00:11:30.660 | And then when they did the first chess thing, they solved it with deep searches.
00:11:34.740 | So you have a huge database of moves and results and do a deep search, but it's still just a search.
00:11:42.980 | Now we take large numbers of images and we use them to train these weight sets that we
00:11:49.340 | convolve across, a completely different kind of phenomenon.
00:11:53.720 | We call that AI.
00:11:54.720 | And now they're doing the next generation.
00:11:57.260 | And if you look at it, they're going up this mathematical graph, right?
00:12:02.420 | And then computations, both computation and datasets support going up that graph.
00:12:08.300 | - Yeah, the kind of computation that might, I mean, I would argue that all of it is still
00:12:12.460 | a search, right?
00:12:14.900 | Just like you said, a topology problem of datasets, you're searching the datasets for
00:12:20.420 | valuable data.
00:12:21.940 | And also the actual optimization of neural networks is a kind of search for the--
00:12:27.180 | - I don't know, if you had looked at the inner layers of finding a cat, it's not a search.
00:12:34.300 | It's a set of endless projections.
00:12:35.860 | So projection, here's a shadow of this phone, right?
00:12:40.500 | And then you can have a shadow of that on something, a shadow on that of something.
00:12:44.100 | If you look in the layers, you'll see this layer actually describes pointy ears and round
00:12:48.820 | eyedness and fuzziness, but the computation to tease out the attributes is not search.
00:12:57.940 | - Right, I mean--
00:12:58.940 | - Like the inference part might be search, but the training is not search.
00:13:02.740 | And then in deep networks, they look at layers and they don't even know it's represented.
00:13:09.260 | And yet if you take the layers out, it doesn't work.
00:13:11.380 | - Okay, so--
00:13:12.380 | - So I don't think it's search.
00:13:13.380 | - All right, well--
00:13:14.380 | - But you have to talk to a mathematician about what that actually is.
00:13:17.140 | - Well, we could disagree, but it's just semantics, I think.
00:13:22.140 | It's not, but it's certainly not--
00:13:23.860 | - I would say it's absolutely not semantics, but--
00:13:26.740 | - Okay, all right, well, if you wanna go there.
00:13:31.860 | So optimization to me is search, and we're trying to optimize the ability of a neural
00:13:38.060 | network to detect cat ears.
00:13:40.980 | And the difference between chess and the space, the incredibly multi-dimensional, 100,000
00:13:50.620 | dimensional space that neural networks are trying to optimize over is nothing like the
00:13:55.060 | chess board database.
00:13:57.020 | So it's a totally different kind of thing.
00:13:59.260 | Okay, in that sense, you can say--
00:14:00.980 | - Yeah, yeah.
00:14:01.980 | - It loses the meaning.
00:14:02.980 | - I can see how you might say.
00:14:06.020 | The funny thing is, it's the difference between given search space and found search space.
00:14:11.180 | - Right, exactly.
00:14:12.180 | - Yeah, maybe that's a different way to describe it.
00:14:13.180 | - That's a beautiful way to put it.
00:14:14.180 | - Okay.
00:14:15.180 | - But you're saying, what's your sense in terms of the basic mathematical operations
00:14:19.500 | and the architectures, hardware that enables those operations?
00:14:24.700 | Do you see the CPUs of today still being a really core part of executing those mathematical
00:14:31.460 | operations?
00:14:32.460 | - Yes.
00:14:33.460 | Well, the operations continue to be add, subtract, load, store, compare, and branch.
00:14:39.540 | It's remarkable.
00:14:40.960 | So it's interesting that the building blocks of computers are transistors, and under that,
00:14:46.820 | atoms.
00:14:47.820 | So you got atoms, transistors, logic gates, computers, functional units of computers.
00:14:53.180 | The building blocks of mathematics at some level are things like adds and subtracts and
00:14:58.420 | multiplies, but the space mathematics can describe is, I think, essentially infinite.
00:15:06.180 | But the computers that run the algorithms are still doing the same things.
00:15:11.140 | Now, a given algorithm might say, "I need sparse data," or, "I need 32-bit data," or,
00:15:16.980 | "I need a convolution operation that naturally takes 8-bit data, multiplies it, and sums
00:15:24.780 | it up a certain way."
00:15:26.500 | So the data types in TensorFlow imply an optimization set, but when you go right down and look at
00:15:34.380 | the computers, it's AND and OR gates doing adds and multiplies.
00:15:39.180 | That hasn't changed much.
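
A sketch of the 8-bit multiply-accumulate pattern he's describing (plain NumPy for clarity; the int8-inputs-with-wider-accumulator convention is the common one, not something specified in the conversation):

```python
import numpy as np

# One step of an 8-bit convolution: multiply int8 values and accumulate
# in a wider integer so the running sum can't overflow.
x = np.random.randint(-128, 128, size=16, dtype=np.int8)  # activations
w = np.random.randint(-128, 128, size=16, dtype=np.int8)  # weights

acc = np.int32(0)
for xi, wi in zip(x, w):
    acc += np.int32(xi) * np.int32(wi)  # widen before multiply-accumulate
print(acc)
```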
00:15:40.580 | Now, the quantum researchers think they're going to change that radically, and then there's
00:15:45.260 | people who think about analog computing, because you look in the brain and it seems to be more
00:15:48.660 | analog-ish, that maybe there's a way to do that more efficiently.
00:15:54.060 | We have a million X on computation, and I don't know the relationship between,
00:16:04.140 | let's say, computational intensity and the ability to hit mathematical abstractions.
00:16:09.900 | I don't know any ways to describe that, but just like you saw in AI, you went from rule
00:16:16.100 | sets to simple search to complex search to, say, found search.
00:16:21.660 | Those are orders of magnitude more computation to do.
00:16:26.380 | And as we get the next two orders of magnitude, like a friend, Raja Koduri, said, "Every
00:16:31.580 | order of magnitude changes the computation."
00:16:35.540 | - Fundamentally changes what the computation is doing.
00:16:38.140 | - Yeah.
00:16:39.140 | Oh, you know the expression, "A difference in quantity is a difference in kind."
00:16:44.300 | You know, the difference between ant and anthill, right?
00:16:47.860 | Or neuron and brain.
00:16:50.780 | There's this indefinable place where the quantity changed the quality, right?
00:16:57.300 | And we've seen that happen in mathematics multiple times, and my guess is it's gonna
00:17:02.380 | keep happening.
00:17:03.380 | - So, in your sense, if you focus head down on shrinking the transistor...
00:17:08.820 | - Well, it's not just head down.
00:17:10.540 | We're aware of the software stacks that are running the computational loads, and we're
00:17:15.980 | kind of pondering what do you do with a petabyte of memory that wants to be accessed in a sparse
00:17:20.740 | way and have the kind of calculations AI programmers want.
00:17:27.540 | So there's a dialogue interaction, but when you go in the computer chip, you find adders
00:17:34.300 | and subtractors and multipliers.
00:17:37.980 | - So if you zoom out then with, as you mentioned, Rich Sutton, the idea that most of the development
00:17:44.020 | in the last many decades in AI research came from just leveraging computation and just
00:17:50.060 | simple algorithms waiting for the computation to improve.
00:17:54.740 | - Well, software guys have a thing that they call the problem of early optimization.
00:18:01.900 | So you write a big software stack, and if you start optimizing the first thing you write,
00:18:07.180 | the odds of that being the performance limiter is low.
00:18:09.740 | But when you get the whole thing working, can you make it 2x faster by optimizing the
00:18:13.820 | right things?
00:18:14.820 | Sure.
00:18:15.820 | While you're optimizing that, could you have written a new software stack, which would
00:18:19.300 | have been a better choice?
00:18:20.300 | Maybe.
00:18:21.300 | Now you have creative tension.
00:18:25.060 | - But the whole time as you're doing the writing, that's the software we're talking about.
00:18:29.700 | The hardware underneath gets faster and faster.
00:18:31.500 | - Well, this goes back to Moore's Law.
00:18:32.860 | If Moore's Law is going to continue, then your AI research should expect that to show up,
00:18:41.020 | and then you make a slightly different set of choices
00:18:42.780 | than if we've hit the wall, nothing's going to happen,
00:18:46.180 | and from here, it's just us rewriting algorithms.
00:18:50.020 | That's seemed like a failed strategy for the last 30 years of Moore's Law's predicted death.
00:18:54.920 | - So can you just linger on it?
00:18:57.900 | I think you've answered it, but I'll just ask the same dumb question over and over.
00:19:01.780 | So why do you think Moore's Law is not going to die?
00:19:07.580 | Which is the most promising, exciting possibility of why it won't die in the next five, 10 years?
00:19:12.780 | So is it the continued shrinking of the transistor, or is it another S-curve that steps in and
00:19:18.820 | it totally sort of-
00:19:20.180 | - Well, shrinking the transistor is literally thousands of innovations.
00:19:24.540 | - Right.
00:19:25.540 | So there's stacks of S-curves in there.
00:19:28.140 | There's a whole bunch of S-curves just kind of running their course and being reinvented
00:19:33.380 | and new things.
00:19:36.500 | The semiconductor fabricators and technologists have all announced what's called nanowires.
00:19:42.220 | So they took a fin which had a gate around it and turned that into little wires so you
00:19:47.620 | have better control of that and they're smaller.
00:19:50.220 | And then from there, there are some obvious steps about how to shrink that.
00:19:54.460 | The metallurgy around wire stacks and stuff has very obvious abilities to shrink.
00:20:02.220 | And there's a whole combination of things there to do.
00:20:05.820 | - Your sense is that we're going to get a lot of this innovation from just that shrinking.
00:20:10.340 | - Yeah, like a factor of a hundred, it's a lot.
00:20:13.900 | - Yeah, I would say that's incredible.
00:20:17.060 | And it's totally unknown.
00:20:18.060 | - It's only 10 or 15 years.
00:20:19.940 | - Now, you're smart and you might know, but to me it's totally unpredictable what that
00:20:23.460 | hundred X would bring in terms of the nature of the computation that people would be doing.
00:20:29.060 | - Yeah. You're familiar with Bell's Law?
00:20:32.100 | So for a long time it was mainframes, minis, workstation, PC, mobile.
00:20:37.420 | Moore's law drove faster, smaller computers.
00:20:41.060 | And then when we were thinking about Moore's Law, Raja Koduri said every 10x generates
00:20:47.020 | a new computation.
00:20:48.100 | So scalar, vector, matrix, topological computation.
00:20:56.060 | And if you go look at the industry trends, there was mainframes and then mini computers
00:21:00.460 | and then PCs.
00:21:02.220 | And then the internet took off and then we got mobile devices and now we're building
00:21:06.380 | 5G wireless with one millisecond latency.
00:21:09.940 | And people are starting to think about the smart world where everything knows you, recognizes
00:21:15.380 | you, like the transformations are going to be like unpredictable.
00:21:22.280 | - How does it make you feel that you're one of the key architects of this kind of future?
00:21:29.940 | So we're not talking about the architects of the high level, people who build the Angry
00:21:35.900 | Bird apps and Snapchat.
00:21:37.620 | - Angry Bird apps, who knows?
00:21:39.900 | Maybe that's the whole point of the universe.
00:21:41.620 | - I'm going to take a stand against that, and against the attention-distracting nature of mobile phones.
00:21:47.540 | I'll take a stand.
00:21:48.660 | But anyway, in terms of--
00:21:49.860 | - I don't think that matters much.
00:21:53.180 | - The side effects of smartphones or the attention distraction, which part?
00:21:58.380 | - Well, who knows where this is all leading?
00:22:00.980 | It's changing so fast.
00:22:01.980 | - Wait, so back to--
00:22:02.980 | - My parents used to yell at my sisters for hiding in the closet with a wired phone with
00:22:06.220 | a dial on it.
00:22:08.100 | Stop talking to your friends all day.
00:22:09.620 | - Right.
00:22:10.620 | - And now my wife yells at my kids for talking to their friends all day on text.
00:22:14.620 | It looks the same to me.
00:22:16.620 | - It's always, it echoes the same thing.
00:22:18.500 | But you are one of the key people architecting the hardware of this future.
00:22:23.740 | How does that make you feel?
00:22:24.900 | Do you feel responsible?
00:22:28.460 | Do you feel excited?
00:22:30.900 | - So we're in a social context, so there's billions of people on this planet.
00:22:35.740 | There are literally millions of people working on technology.
00:22:39.380 | I feel lucky to be doing what I do and getting paid for it, and there's an interest in it.
00:22:47.660 | But there's so many things going on in parallel.
00:22:50.980 | Like the actions are so unpredictable.
00:22:53.120 | If I wasn't here, somebody else would do it.
00:22:56.500 | The vectors of all these different things are happening all the time.
00:23:01.460 | I'm sure there's some philosopher or meta-philosopher wondering about how we transform
00:23:07.900 | our world.
00:23:11.060 | - So you can't deny the fact that these tools are changing our world.
00:23:18.860 | - That's right.
00:23:19.860 | - So do you think it's changing for the better?
00:23:24.300 | - I read this thing recently, it said the two disciplines with the highest GRE scores
00:23:30.060 | in college are physics and philosophy.
00:23:34.540 | And they're both sort of trying to answer the question, why is there anything?
00:23:38.940 | And the philosophers are on the theological side, and the physicists are obviously on
00:23:45.180 | the material side.
00:23:47.460 | And there's 100 billion galaxies with 100 billion stars.
00:23:51.700 | It seems, well, repetitive at best.
00:23:55.820 | So there's on our way to 10 billion people.
00:24:01.500 | It's hard to say what it's all for, if that's what you're asking.
00:24:04.300 | - Yeah, I guess I am.
00:24:05.300 | - Things do tend toward significant increases in complexity.
00:24:11.300 | And I'm curious about how computation, like our world, our physical world inherently generates
00:24:20.220 | mathematics.
00:24:21.220 | It's kind of obvious, right?
00:24:22.220 | So we have XYZ coordinates; you take a sphere, you make it bigger, you get a surface that
00:24:26.140 | grows by R squared.
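
In symbols, that example is just the familiar surface-area formula:

```latex
A(r) = 4\pi r^2 \quad\Longrightarrow\quad A \propto r^2
```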
00:24:28.900 | Like it generally generates mathematics and the mathematicians and the physicists have
00:24:33.540 | been having a lot of fun talking to each other for years.
00:24:36.100 | And computation has been, let's say, relatively pedestrian.
00:24:40.980 | Like computation in terms of mathematics has been doing binary algebra, while those guys
00:24:47.320 | have been gallivanting through the other realms of possibility, right?
00:24:52.900 | Now recently, the computation lets you do mathematical computations that are sophisticated
00:25:00.460 | enough that nobody understands how the answers came out, right?
00:25:04.980 | - Machine learning.
00:25:05.980 | - Machine learning.
00:25:06.980 | It used to be, you get a data set, you guess at a function, and the function is considered
00:25:12.660 | physics if it's predictive of new functions, new data sets.
00:25:18.180 | In the modern era, you can take a large data set with no intuition about what it is and use machine
00:25:25.340 | learning to find a pattern that has no function, right?
00:25:29.340 | And it can arrive at results that I don't know if they're completely mathematically
00:25:33.940 | describable.
00:25:34.940 | So computation has kind of done something interesting compared to A equal B plus C.
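
A toy contrast of the two regimes he describes (the quadratic "physics" function, the noise level, and the nearest-neighbor stand-in for a learned model are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = 9.8 * x**2 / 2 + rng.normal(0, 0.5, 200)   # data from a guessable closed form

# Old style: guess a functional form, fit its coefficients, keep it if
# it predicts new data.
coeffs = np.polyfit(x, y, deg=2)
print("fitted form:", coeffs)                  # roughly [4.9, 0, 0]

# Modern style: no functional form at all, just a pattern pulled from the
# data (a crude nearest-neighbor lookup standing in for a trained network).
def predict(x_new):
    return y[np.argmin(np.abs(x - x_new))]     # an answer with no equation behind it

print("black-box prediction at 1.5:", predict(1.5))
```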
00:25:40.980 | - Thank you.
00:25:41.980 | - Thank you.