
fast.ai APL study session 7


Chapters

0:00
5:25 Euler's Formula
11:24 Calculus
15:15 Custom Operator
45:34 Adding Matrix to the Search


00:00:00.000 | I don't know.
00:00:27.000 | Hello, everybody.
00:00:28.000 | Hello, everybody.
00:00:29.000 | Hello.
00:00:38.000 | Good morning, Jeremy.
00:00:40.000 | Hey guys.
00:00:44.000 | Doing good. I got a
00:00:48.000 | APL thing just pushed a couple of minutes ago, so let me see.
00:00:57.000 | Notebook 1 of the fast.ai Numerical Linear Algebra course, but APL-ified.
00:01:05.000 | Oh, it's not on the forum yet. Where do I find it?
00:01:08.000 | I just posted in the chat here. I literally pushed it like two minutes ago, so I haven't even had a chance to post it on the forum.
00:01:16.000 | I just got it done.
00:01:18.000 | But I will make a forum post after the call.
00:01:21.000 | Okay.
00:01:33.000 | So this is about the numerical, well, computational linear algebra course, which we did, oh my God, five years ago.
00:01:46.000 | Well, three days short.
00:01:49.000 | I think five, oh my gosh, that's amazing.
00:01:55.000 | I started going through it when you mentioned it just the other day, and it seems like a great course. I don't know why I didn't do it sooner.
00:02:02.000 | Oh, thanks.
00:02:04.000 | Fantastic.
00:02:06.000 | We like to think so.
00:02:09.000 | I mean, it's not as like,
00:02:13.000 | obviously, immediately applicable kind of a thing, but you know, it's interesting.
00:02:18.000 | So, okay, so you've, so you've taken the first notebook from it.
00:02:31.000 | here we go.
00:02:38.000 | APL, look at that.
00:02:41.000 | Get those two, two ones and then I can get the answer using our matrix multiplication.
00:03:00.000 | I see this is like a little just empty thing to fill in the space, I guess.
00:03:05.000 | Yeah, I'm not really sure how to make tables look good in APL.
00:03:10.000 | I cobbled some stuff together, but
00:03:20.000 | one more section.
00:03:23.000 | A couple more sections here.
00:03:26.000 | I got to learn a bunch of new glyphs doing this.
00:03:31.000 | Cool.
00:03:35.000 | Nice using the power operator I'm glad we did that.
00:03:40.000 | Yeah.
00:03:53.000 | And eigenvalues, my goodness.
00:03:57.000 | I don't know how to calculate them in a smart way, but I got them calculated, and hopefully I'll learn a smarter way later in the course. All right. Great.
00:04:08.000 | That's the last of it. This is my favorite bloom filter check.
00:04:15.000 | It's also the only one I know.
00:04:18.000 | Nice job, Isaac. That's cool.
00:04:22.000 | Thank you.
00:04:24.000 | Not going to get any easier.
00:04:28.000 | Well, I mean, in theory, this should be exactly what APL is good at, right. Yes, absolutely.
00:04:36.000 | I mean, the bits we have to start like opening JPEGs and stuff might get complicated, but.
00:04:43.000 | Oh yeah, I spent some time trying to figure out how to open images.
00:04:51.000 | I don't have a super easy way of doing it as far as I could tell so I'll have to take another stab at that later.
00:05:03.000 | And then Molly's posted something.
00:05:08.000 | You can talk Molly you don't have to put stuff in the chat. We like hearing from people.
00:05:13.000 | Oh no, the conversation was already going so I was waiting for it to be over.
00:05:21.000 | Yeah, so a few videos ago, Euler's formula was talked about just a bit, and it was an identity.
00:05:33.000 | Yeah, the Khan Academy one shows like the power series of the... sorry, I forget what it's called.
00:05:44.000 | A series, is that like a Taylor...
00:05:47.000 | Oh yeah, I guess it must be like a, yeah, like a Taylor series. Um, it shows that first for sine, cosine and e, and then it inserts.
00:05:57.000 | One, the exponential. Cool. That one inserts i into it and shows how it's a combination of the sine.
00:06:07.000 | The sine and cosine with an i in it.
00:06:11.000 | Okay.
00:06:12.000 | Yeah.
00:06:13.000 | Thanks.
00:06:14.000 | And then it shows how you can split them up into the one for sine and cosine and how it ends up looking.
00:06:25.000 | Yeah, Euler's formula. Excellent.
00:06:29.000 | Thank you.
00:06:37.000 | Anything else that's come up.
00:06:41.000 | I was just gonna say, Jeremy, I listened to some more of the Array Cast content.
00:06:49.000 | I listened to yours too, which was great.
00:06:55.000 | What shocked me was that how deeply embedded APL is in Wall Street.
00:07:01.000 | Yes, that's amazing. I didn't realize that that was such a long legacy there with trading.
00:07:07.000 | Yes. Yeah, I mean, mainly nowadays it's K and kdb+, which I think most of the big quant hedge fund trading folks use.
00:07:26.000 | But that itself came from A+, which Arthur Whitney built at Morgan Stanley, and I discovered the other day that it actually exists as an open source thing nowadays.
00:07:42.000 | Not exactly.
00:07:45.000 | Yeah, historical interest.
00:07:49.000 | I remember who was talking about.
00:07:54.000 | One of the reasons that the APL community is a bit cloistered, as you said last time, is that a lot of them didn't open source their implementations; it was really built around proprietary applications, which limited the exposure, which is really interesting compared to something like Python.
00:08:15.000 | Yeah, exactly. And it's also interesting how like proprietary trading shops like, you know, secrecy is so important to them, but also like they don't care about following cultural kind of trends and so they do tend to like pick things that are good regardless of whether they're popular.
00:08:39.000 | So, like Jane Street for example uses OCaml, and Morgan Stanley had A+, and lots of them use APL.
00:08:48.000 | Yeah, it is interesting. I know a lot of them have been moving towards using more Python in recent years though.
00:09:00.000 | And I think partly that might be because Python is much better for working with accelerators.
00:09:10.000 | This one with Aaron was like one of my favorite episodes if you haven't seen it.
00:09:16.000 | And this one with Brooke was interesting; it doesn't actually talk about APL that much, but it's just like, you're doing this so that you get us to talk.
00:09:31.000 | Whatever works.
00:09:35.000 | Great.
00:09:39.000 | Oh, so that means that whole discussion about
00:09:49.000 | what's in the chat. Yes.
00:09:52.000 | Isaac's thing you didn't actually say it.
00:09:56.000 | Oh, when did I stop sharing the screen? I don't even remember pressing the stop sharing screen button.
00:10:01.000 | Okay, fine.
00:10:04.000 | Yeah okay so this is K.
00:10:09.000 | And this is A+.
00:10:12.000 | And this is this.
00:10:14.000 | Yeah, public.
00:10:16.000 | The available implementation.
00:10:20.000 | And, yeah, this is the one with Aaron, who built a GPU compiler in APL.
00:10:30.000 | Yeah, this one with Brooke I thought was cool.
00:10:34.000 | Which other ones are good. There's also one
00:10:39.000 | with Eric Iverson which is good.
00:10:42.000 | Yeah, this one is good.
00:10:46.000 | It's a kind of a weird podcast because the first few episodes kind of, like, assume you don't know anything about array programming. So it's like, why would you be listening to an array programming podcast if you didn't know anything about array programming?
00:10:59.000 | Yeah, anyway, I felt like about the point where they talked to Eric and talked about tacit programming was when it started getting interesting.
00:11:12.000 | I think, yeah, I didn't know about it until you mentioned you were going on. Thanks for the recommendations. All right.
00:11:18.000 | No worries.
00:11:23.000 | Okay, should we do some calculus then.
00:11:28.000 | Okay, going to some calculus.
00:11:41.000 | Yeah, so this is where we got to yesterday right we were doing.
00:11:47.000 | Rise over run.
00:11:51.000 | The slope.
00:11:55.000 | So this is a numerical approximation of a derivative.
00:12:01.000 | And it's an approximation because, like, the smaller you make this, the closer you get to the slope at this exact point, but it's never, you know, quite small enough to be perfect.
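[Editor's note: the rise-over-run idea being described can be sketched outside APL; here is a minimal Python analogue, with function and variable names of my own choosing.]

```python
def slope(f, x, dx=0.01):
    """Numerical approximation of the derivative of f at x: rise over run."""
    return (f(x + dx) - f(x)) / dx

f = lambda x: x ** 2   # true derivative of x^2 at 3 is 6
print(slope(f, 3))     # roughly 6.01; shrinking dx gets closer to 6
```

Shrinking dx narrows the gap, but as noted above it never quite reaches the exact slope.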
00:12:16.000 | So yeah, I thought it'd be nice if we could create something that would calculate this for any function, which we can do.
00:12:30.000 | And the way you do it is by creating a custom operator.
00:12:35.000 | So we could create something called gradient.
00:12:51.000 | And we could kind of copy and paste all this.
00:12:56.000 | And let's say we put x on the left so that'll be our alpha.
00:13:05.000 | And our difference will be on the right. So that'll be our Omega.
00:13:12.000 | So that's going to be a gradient of a particular function. Right. So the gradient of f at
00:13:27.000 | three, or whatever that x, which is three, let's write it at three, with a difference of 0.01.
00:13:39.000 | Oh, I've got to run everything.
00:13:49.000 | Didn't know I restarted my notebook.
00:13:52.000 | I'm surprised how long it takes to run, actually.
00:14:01.000 | Okay, it's happening.
00:14:07.000 | There we go.
00:14:10.000 | Okay.
00:14:13.000 | So that number is the same as that number.
00:14:17.000 | So the thing is, we want to replace f with.
00:14:23.000 | Oh, and let's simplify this. We don't need these parentheses, right, because plus happens first.
00:14:32.000 | There we go.
00:14:34.000 | So in order to pass in a function, we can turn this into an operator.
00:14:41.000 | So if you look at the help for an operator,
00:14:51.000 | like star dieresis,
00:14:58.000 | you've got up to five things around it. You've got the two arguments of the function it creates,
00:15:07.000 | and one or two arguments of the functions that you pass to the operator.
00:15:13.000 | So there's five things. So if you want to create a custom operator, this thing's going to be omega.
00:15:19.000 | This thing's going to be alpha. And then there's two more things. This thing is going to be called omega omega.
00:15:25.000 | And this thing is going to be called alpha alpha. So if we replace f with double alpha,
00:15:36.000 | we've now created an operator.
00:15:39.000 | And so that means we now have to tell it what function to take the derivative of.
00:15:52.000 | Oh, omega omega. So I'm going to put it on the right.
00:16:01.000 | OK, what did I do wrong?
00:16:06.000 | I'm not aware of needing to put it in parentheses, but what did I do wrong?
00:16:19.000 | Why you still keep the f?
00:16:23.000 | Why is there an f here? Because I've got to say, what am I taking the gradient of? So I'm taking the gradient of this function.
00:16:31.000 | So that's the thing that omega is going to be replaced with.
00:16:36.000 | So this is kind of where I find the quad operator really nice, because right in the function, you can add your print statements to,
00:16:44.000 | you know, so like take that first, that alpha that it's pointing at and assign that to the quad operator in the function.
00:16:53.000 | And then it'll print out what's there, hopefully. OK, which should run before most everything else.
00:17:01.000 | OK, so it should print that before it fails.
00:17:05.000 | That didn't work. All right, let's try making something simpler.
00:17:18.000 | We're going to create an operator which just calls the function.
00:17:27.000 | OK, so there's the world's dumbest operator. So we should be able to go G of plus, which would be plus, apply that to two.
00:17:47.000 | OK. G plus.
00:18:08.000 | All right, so that did not work how I expected. Does it need to be alpha alpha the other way around?
00:18:15.000 | So normally you do like plus slash. So it goes on the right.
00:18:22.000 | I don't think this is right. OK.
00:18:38.000 | So maybe if we search for this.
00:18:52.000 | I haven't normally found this search very useful, but let's give it a try.
00:18:59.000 | Dops. Yes.
00:19:10.000 | OK, so it does expect to have just alpha alpha if it's monadic.
00:19:18.000 | So that means.
00:19:21.000 | Oh, it goes, I think, the opposite way around the way I expected. All right. So let's change this to alpha alpha.
00:19:29.000 | And that would mean I think it's plus G.
00:19:33.000 | Ah, OK. So I guess that makes sense. Other operators work that way, like plus slash.
00:19:42.000 | Oh, of course they do. Note to self: you're an idiot. All right. Yes.
00:19:53.000 | Yeah, somehow I had it backwards in my head. OK. All that. Fine.
00:20:02.000 | By the way, Isaac, for your flashcards, it occurred to me that a lot of these things don't really make sense as flashcards.
00:20:10.000 | And for those like it occurred to me that something that might be useful is if you.
00:20:24.000 | Added tags to the ones that you want to be exported as cards, then you could go through in your script and just add cards for those that have like a card tag on it or something like that.
00:20:36.000 | That would be a way to avoid having lots of crap you don't need.
00:20:42.000 | Yeah, I quickly learned the shortcut to suspend the card, but that would probably be a better way to do it to not have it generated in the first place.
00:20:51.000 | Yeah.
00:20:58.000 | Great.
00:21:02.000 | Yes, that's a bit. I don't know. It's a bit weird in some ways, but I guess it kind of makes sense. This is how you create an operator. So this is a monadic operator because it only has alpha alpha.
00:21:11.000 | It doesn't have omega omega. And it's a monadic operator that creates a dyadic function because it's got an alpha and omega.
00:21:20.000 | And so I don't think we actually need the parentheses anymore.
00:21:26.000 | Yeah, we don't because operators bind more tightly. So it's as if this is parenthesized. Does that make sense?
00:21:35.000 | So a monadic operator takes stuff from the left. If you give it an alpha alpha, it would take stuff on the left. I mean, I assume we could go omega omega.
00:21:44.000 | Although, as Isaac said, that's not quite what you would expect given how other ones work. Let's see if it works.
00:21:53.000 | No, you can't. OK, so yeah, it goes on the left if you say alpha alpha. And if there's both alpha alpha and omega omega, then you would do both.
00:22:06.000 | OK, so that's our custom.
00:22:11.000 | Derivative.
00:22:14.000 | And that's a numeric approximation of a derivative to be more precise.
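[Editor's note: in Python terms, an APL monadic operator with an alpha-alpha operand is just a higher-order function: it takes a function and returns a new, here dyadic, function. This is a rough sketch of the idea, not Dyalog's actual mechanism.]

```python
def gradient(f):
    """Takes the operand function f (APL's alpha alpha) and returns a dyadic
    function: x on the left (alpha), the step dx on the right (omega)."""
    def derived(x, dx):
        return (f(x + dx) - f(x)) / dx
    return derived

square = lambda x: x ** 2
print(gradient(square)(3, 0.01))   # slope of x^2 near x=3, roughly 6.01
```

As in APL, the operand binds to the operator first, producing the derived function that then takes its own arguments.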
00:22:34.000 | All right, so.
00:22:43.000 | OK, we've got a whole list of operators here.
00:22:51.000 | Wait, so left arrow is considered an operator.
00:22:57.000 | Has anybody figured out what the curly brackets means yet, by the way?
00:23:04.000 | I haven't.
00:23:10.000 | I'll tell you an operator I'd quite like to do is this one, tilde dieresis.
00:23:24.000 | I think it can save us a parenthesis in the one we just did. Correct.
00:23:30.000 | Tilde dieresis. OK, which is a monadic operator. So it's going to take one function on its left and it produces a dyadic function.
00:23:47.000 | Hence, there's the one function on its left and that results in the dyadic function.
00:23:53.000 | It's got a bit of a strange name, commute, but all it does is it takes X and Y and it returns a function that actually calls Y f X rather than X f Y.
00:24:11.000 | So if we do. OK, what's the letter for that?
00:24:29.000 | Yeah. Shift T. And that's called tilde dieresis.
00:24:47.000 | Monadic, shift T, dyadic, shift T. Oh, there is no monadic. OK.
00:24:59.000 | So then there's commute.
00:25:12.000 | And. You could say, yeah, three minus two is that so that would be putting X on the left and Y on the right.
00:25:29.000 | So it's three minus two. But if we do it the other way around, two minus three, we could also write like this, two minus, sorry, three minus.
00:25:44.000 | What was the letter again? T. Yeah. And then commute means switch the order of the arguments.
00:25:53.000 | Does that make sense?
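[Editor's note: commute is easy to mirror in plain Python; a hedged sketch of the same argument-swapping behaviour, with my own names.]

```python
def commute(f):
    """Like APL's tilde dieresis: derive a function that swaps its two arguments."""
    return lambda x, y: f(y, x)

sub = lambda x, y: x - y
print(sub(3, 2))            # 3 - 2, giving 1
print(commute(sub)(3, 2))   # computes 2 - 3 instead, giving -1
```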
00:25:56.000 | Does that flip them around? Oh, just one moment. I don't know what it wants to be.
00:29:18.000 | Sorry, we had a missing computer problem.
00:29:24.000 | So Marty found a link for brackets.
00:29:32.000 | Curly braces. Great. Let's take a look.
00:29:48.000 | I thought they might be optional arguments, but it didn't make sense for results. So it can indicate shy results. And did you find out what a shy result is?
00:29:59.000 | I've heard that word before.
00:30:04.000 | APL shy result.
00:30:26.000 | Ah, OK. By default, functions print the result unless they're shy.
00:30:37.000 | There you go.
00:30:40.000 | OK, so that's an optional argument.
00:30:45.000 | And that's a shy result.
00:30:50.000 | Great. How do you define a shy result in a function?
00:30:54.000 | No idea. OK.
00:31:00.000 | So this is a dyadic tilde dieresis. And so we can now redefine gradient.
00:31:10.000 | Like so.
00:31:16.000 | So because the right hand side is handled first, we can now say and I find it's really helpful to like find a way to say this, which is I would say Omega.
00:31:30.000 | Divided into the right hand side.
00:31:37.000 | So I wouldn't say divide. Commute, I would say divided into like normally there's some way you can like express the idea of these things being backwards in a reasonable math or English expression.
00:31:55.000 | So that does make it a bit more clean, which is nice.
00:32:05.000 | And then there was another version of commute, of tilde dieresis, which is constant.
00:32:15.000 | And so constant just always returns its argument.
00:32:20.000 | So we could create a function called zero.
00:32:29.000 | And so this.
00:32:37.000 | This is a function.
00:32:40.000 | And so we can apply it to anything we like.
00:32:43.000 | And I believe we can even do it diatically.
00:32:48.000 | So that's just a function that returns zero.
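[Editor's note: the constant form can be sketched the same way in Python; names here are my own.]

```python
def constant(v):
    """Like the constant form of tilde dieresis: ignore any arguments, return v."""
    return lambda *args: v

zero = constant(0)
print(zero(5))      # 0, monadic use
print(zero(5, 7))   # 0, dyadic use
```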
00:32:54.000 | And that's it for tilde dieresis.
00:32:59.000 | This form I see a lot.
00:33:02.000 | People use it very frequently in APL.
00:33:18.000 | Tell me, when are they using it? They use it for exactly this kind of purpose. APLers hate parentheses.
00:33:28.000 | And they hate unnecessary symbols,
00:33:35.000 | which I kind of get, like it's.
00:33:38.000 | Certainly by having less stuff to read, I find it easier to read.
00:33:53.000 | You know, the other one, I think we might want to do.
00:34:04.000 | Each which one of these is each does anybody remembers this one.
00:34:09.000 | Yeah, this one.
00:34:12.000 | Okay, so this is just dieresis.
00:34:24.000 | And here it is.
00:34:28.000 | This one.
00:34:48.000 | It's a monadic operator. Oh, this word here means can be either monadic or dyadic.
00:34:55.000 | It's not ambivalent as in I don't care, but it's ambivalent as in either valence.
00:35:18.000 | Valence is handedness.
00:35:29.000 | Yes, this is.
00:35:34.000 | Okay.
00:35:37.000 | This is a list of.
00:35:49.000 | Oops.
00:35:53.000 | Okay, this is a list. This is an array of arrays.
00:35:57.000 | So it's an array with two elements.
00:36:01.000 | And if we try to do plus slash of that.
00:36:07.000 | It's going to get upset because it's trying to do.
00:36:12.000 | It's trying to insert plus between its arguments, which would be the same as typing one, two, three, four, plus five, six, seven.
00:36:26.000 | The each operator takes the previous function, which in this case is itself being defined with an operator.
00:36:36.000 | And it means some and it applies it over each of its arguments.
00:36:41.000 | So plus slash each means apply plus slash to this and then apply plus slash to this.
00:36:54.000 | Thus giving us the results 10 and 18. Does that make sense?
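[Editor's note: the same nested-sum behaviour in plain Python, for comparison; this is just an illustration of the each idea, not the APL evaluation model.]

```python
nested = [[1, 2, 3, 4], [5, 6, 7]]     # an array with two items, each itself an array
sums = [sum(item) for item in nested]  # like plus slash each: sum applied to each item
print(sums)  # [10, 18]
```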
00:37:02.000 | And I think that might work for like.
00:37:14.000 | We're going to create a matrix, which is a two rows by three columns array.
00:37:27.000 | Iota six; rho's got to go between them.
00:37:42.000 | Cool. Thank you.
00:37:52.000 | Okay.
00:37:57.000 | So if I tried to do.
00:38:02.000 | Two, three plus mat.
00:38:07.000 | Something like that can work in NumPy. It would broadcast the.
00:38:15.000 | Maybe like this or broadcast this over each row.
00:38:20.000 | But it doesn't add it. Also, I think can work in J, but I don't think by default it works in APL.
00:38:27.000 | But I think if we say that it applies to each element of mat.
00:38:36.000 | Or each column.
00:38:42.000 | Something like that.
00:38:56.000 | I think the problem is that it's going through each of two.
00:39:00.000 | Three, four separately.
00:39:10.000 | What does this look like plus slash.
00:39:18.000 | Okay, so it doesn't actually work that way correctly on a matrix.
00:39:26.000 | On a matrix.
00:39:35.000 | I think this is.
00:39:37.000 | This might be related to when we were looking at iota before; we were using it to find values in the matrix, and you wouldn't find individual values, you'd find whole rows at a time.
00:39:53.000 | Yeah, I think the issue is it's not going over cells of an array, it's going over items.
00:40:02.000 | So I'm guessing if we did like this.
00:40:27.000 | I'm going to do like.
00:40:42.000 | Oh, not that be.
00:40:46.000 | Okay, yeah, so that's going to go over each of these; it's going to do two plus this and then three plus this.
00:40:56.000 | And I assume there must be some way to make that apply over a rank two array, but I don't know what it is.
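[Editor's note: for comparison, the NumPy-style broadcast being described, one number added to each row of the matrix, can be written by hand in plain Python. This is just an illustration of the intended result, not the APL idiom being searched for.]

```python
mat = [[1, 2, 3],
       [4, 5, 6]]
vec = [2, 3]   # one number per row

# Add vec[i] to every element of row i, mimicking row-wise broadcasting.
added = [[v + x for x in row] for v, row in zip(vec, mat)]
print(added)  # [[3, 4, 5], [7, 8, 9]]
```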
00:41:12.000 | So I guess if anybody figures that out, let us know; otherwise I guess we'll probably come to it at some point.
00:41:22.000 | I put the syntax for defining a shy function in the chat.
00:41:27.000 | Okay.
00:41:28.000 | And I like your structured flow. But shy... this is not shy, let me copy them.
00:41:37.000 | I don't really have much of a flow I gotta be honest.
00:41:41.000 | Okay.
00:41:43.000 | Seems like you're on a roll to me.
00:41:49.000 | Copy this.
00:41:53.000 | That's more like it.
00:41:55.000 | Okay.
00:41:57.000 | So, is this going to be shy?
00:42:04.000 | It's not shy. So what do you actually add to the function that makes it not print out.
00:42:12.000 | Cool.
00:42:13.000 | Thanks.
00:42:16.000 | Yeah.
00:42:20.000 | Not sure when I would want things not to print out but.
00:42:33.000 | Okay, so none of their examples are using matrices.
00:42:37.000 | The only other places can be helpful to look at is to look at the APL wiki.
00:42:49.000 | It's only defined in nested APLs. I think that means things we can have an array in an array.
00:43:18.000 | Okay, I don't know what that means there. Let's see if we can search; trying to search APLcart. It'd be nice if you can search for symbols.
00:43:33.000 | So I assume there's going to be some magic incantation that basically turns a matrix into an array of arrays of rows, and that you would do it, do it that way, I assume.
00:43:48.000 | Okay.
00:43:49.000 | So you can search for an APL symbol on APLcart and it will give you everything. Most of them are new to me, but I mean, that's not a bad idea, to learn how to use this thing, because when I was on the podcast they seemed to think it was a thing worth learning about.
00:44:11.000 | Okay, here it is. Each.
00:44:15.000 | How do I... I see, so typing comma each ensures all elements are vectors, join items.
00:44:24.000 | I see these are like idioms I guess.
00:44:30.000 | Yeah, so, like one that I found while I was working on this computational linear algebra stuff: you can type in, like, calculate the determinant, and they've got a big long thing for that.
00:44:43.000 | And so some of them.
00:44:49.000 | I think for most of them, the ones at the top seem to be simpler than the ones down below. Okay, they are sorted.
00:44:58.000 | And actually this table lives in a text file in a GitHub repository. Here's Conway's Game of Life.
00:45:10.000 | That's great.
00:45:19.000 | I'm gonna be really happy when I can read all this. Yeah, this is intense.
00:45:27.000 | That's cool. I already say one thing that mentions and matrix.
00:45:34.000 | Can you try adding matrix to the search? Okay, that works too.
00:45:43.000 | Pepto diagonal matrices.
00:45:52.000 | I believe he does have additional tags and stuff that's not shown here to help with the searching.
00:46:04.000 | I don't know how good they are.
00:46:08.000 | So if I copy this, it's saying at the top what each thing is: M is a matrix and capital N is a numeric array.
00:46:25.000 | I was just saying it's a numeric array which is a matrix I guess.
00:46:29.000 | So I guess that means in theory, we could type mat here.
00:46:38.000 | We can. Okay.
00:47:07.000 | This is.
00:47:15.000 | Okay, this is the H, which is flipped.
00:47:25.000 | Okay, I'm not going to try to do this just now.
00:47:29.000 | All right.
00:47:37.000 | I see slash bar and slope bar a lot. I didn't know this is called slope.
00:47:43.000 | I always call it backslash. And I have no idea what they are.
00:47:49.000 | So maybe we should learn.
00:47:58.000 | It says an operator. It's a monadic operator.
00:48:06.000 | And we type it.
00:48:11.000 | Oh, the slash. Makes sense.
00:48:18.000 | Slash.
00:48:25.000 | And this is called.
00:48:29.000 | Replicate first. No, it's called reduce first.
00:48:36.000 | Oh, my daughter wants me again. Sorry.
00:49:05.000 | All right.
00:49:07.000 | So if you do the sum, plus slash, on a matrix versus this one, one will sum column-wise and one will sum row-wise.
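[Editor's note: the row-wise versus column-wise distinction between plus slash and plus slash bar can be sketched in plain Python, for comparison only.]

```python
mat = [[1, 2, 3],
       [4, 5, 6]]                              # like 2 3 rho iota 6

row_sums = [sum(row) for row in mat]           # plus slash: reduce along each row
col_sums = [sum(col) for col in zip(*mat)]     # plus slash bar: reduce down each column
print(row_sums)  # [6, 15]
print(col_sums)  # [5, 7, 9]
```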
00:49:19.000 | Okay.
00:49:21.000 | Cool.
00:49:33.000 | And we had a matrix ready to go then.
00:49:51.000 | Okay. And is that literally all it does.
00:49:58.000 | This is okay. So in J.
00:50:03.000 | J has a rank operator, which is actually the double quote sign, where basically you can just always say what axis you want to use, so that would be reduce over the zero axis, and that would be reduce over the one axis.
00:50:21.000 | But I'm not sure if you can do anything quite the same.
00:50:32.000 | There is a thing called rank.
00:50:41.000 | Maybe we should see how this is different.
00:50:45.000 | What's this one here,
00:50:51.000 | which is called.
00:50:53.000 | I assume.
00:51:03.000 | Shift J.
00:51:08.000 | Classic edition.
00:51:14.000 | I have the same like
00:51:19.000 | usual information.
00:51:25.000 | Anyway, it's called rank.
00:51:37.000 | I already forgot what letter I said.
00:51:46.000 | I guess this is called jot dieresis. I didn't see a thing for it.
00:51:53.000 | Monadic.
00:51:56.000 | Rank.
00:52:01.000 | And if I say.
00:52:13.000 | Do it over this axis.
00:52:19.000 | Well, that sure didn't work.
00:52:30.000 | that might be this.
00:52:34.000 | Hey, look, it is the same.
00:52:40.000 | Wait, no, that's the same as sum.
00:52:42.000 | And what if we put a zero.
00:52:51.000 | Okay, I don't know what I'm doing.
00:52:55.000 | Let's come back to that one. Okay, so it sounds like for slash bar there's not much to learn; it's just the same as slash.
00:53:05.000 | But it does it over a different axis.
00:53:10.000 | I assume is going to be the same for backslash bar.
00:53:26.000 | Except they didn't call it backslash they called it slope.
00:53:35.000 | And that's probably going to be back tick backslash I'm guessing.
00:53:42.000 | Nope, that's not right. Okay, so we can specify the axis of plus slash. Yep.
00:53:52.000 | By adding like bracket one, right after the operator.
00:53:58.000 | I believe, and then bracket.
00:54:03.000 | No, square brackets, immediately after the operator, so just like that.
00:54:11.000 | Okay, so does that apply for, like...
00:54:15.000 | Does that apply for like everything or is that just this particular.
00:54:21.000 | I feel like I've seen that mentioned in the docs.
00:54:24.000 | Yeah, it's called a function axis.
00:54:27.000 | Like, can you put it on all functions, or...
00:54:32.000 | I think it's all functions.
00:54:35.000 | Okay, so why on earth do we need slash bar access access.
00:54:49.000 | This is what I'm looking at here.
00:55:01.000 | It must be a monadic primitive mixed function taken from this list. Okay, it can't apply to many things. It's just slash or slope for these.
00:55:20.000 | There's a bunch of
00:55:27.000 | And then you've also got axis with a dyadic operand, and
00:55:37.000 | it must be a dyadic primitive scalar function.
00:55:41.000 | Oh, well, that's good. So any dyadic scalar function primitives go.
00:55:47.000 | Or a mixed function, which I see means one with an operator where they've used one of these. Okay, so it actually does sound like you can do a lot of things with it.
00:55:55.000 | That's good. Yeah, there's a wiki page I put in the chat about the function axis specifically, and it kind of covers both.
00:56:03.000 | I think it's kind of a combination of these two. Okay, cool. I see a few other people have messages in the chat, by the way; is there anything we wanted to talk about or ask about? Anybody want to speak up?
00:56:18.000 | Function axis.
00:56:24.000 | Mine was not specifically related to the content we were discussing, so I didn't want to bring it up. Oh, please do.
00:56:32.000 | We're so like not at all focused Molly, as you'll see. This is just like, yeah, hanging out chatting about whatever seems interesting.
00:56:42.000 | Euler's formula is definitely interesting. Oh yeah, so I first came across Euler's formula in a paper that I was reading on positional embeddings.
00:56:55.000 | So, I had no idea what it was. I actually use this to figure out the fundamentals I was missing.
00:57:03.000 | Yeah, it was rotary positional embeddings, that's what it was called. Okay.
00:57:08.000 | So, for those of you who are interested in transformer models. There's like literally no sense of like the order of things in a vector. And so it's like literally impossible for it to learn anything that requires order and language is ordered.
00:57:25.000 | So we use these things called positional embeddings.
00:57:30.000 | This rotary position.
00:57:32.000 | It's been a while.
00:57:36.000 | Yes, that's it. And then there's a proof. Yeah, this was the actual proof I was going through, trying to figure out what they were talking about.
00:57:45.000 | All right.
00:57:48.000 | Yeah.
00:57:49.000 | And then I was missing a lot of the fundamentals so I just go down to where they're doing the actual proof.
00:57:56.000 | And then you can just click on whatever you do not understand, like the proof for what you don't understand. So if you don't understand complex cosine, like how they're able to get the complex cosine function, you can find yourself clicking
00:58:14.000 | until you come back to where you started.
00:58:18.000 | Somewhere that can happen.
00:58:22.000 | But yeah.
00:58:24.000 | And then I just start looking up the various things they're talking about here until I understand it. So, okay, that's cool.
00:58:34.000 | It's just one way I was able to understand papers, like the math behind papers and stuff.
00:58:40.000 | Thanks.
00:58:42.000 | That's great.
00:58:45.000 | Okay, we might stop now and
00:58:52.000 | probably talk about axis next time.
00:59:08.000 | Axis.
00:59:20.000 | Something's going to be there, put a number.
00:59:27.000 | We can do that next time.
00:59:32.000 | That's going to be super handy.
00:59:34.000 | All right.
00:59:35.000 | Thanks everybody.
00:59:38.000 | See you next time.