I want to introduce you to this course I've been working on. I've just released it, and I wanted to give those of you who subscribe and follow me on Medium or Twitter a chance to get it for free. So for the next three days it is completely free; you just use this code.
But I just want to talk very quickly about what it actually covers. Obviously you can see from the title that it's NLP with Transformers and Python. If we scroll down a little we come to this course overview, and I'll go through it quickly, because it's quite long, we cover a lot of things, and I don't want to take too much of your time.
So the first thing is NLP and Transformers, where I give a quick summary of NLP in general and the history of NLP leading up to Transformers. Then we move into a bit of pre-processing for NLP. This is just the basic stuff; I think the most relevant parts here for Transformers are Unicode normalization, tokenization, and special tokens.
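To give a flavour of why Unicode normalization matters before tokenization, here's a minimal standard-library sketch (this is illustrative, not code from the course; `add_special_tokens` is a hypothetical helper showing the BERT [CLS]/[SEP] convention):

```python
import unicodedata

# Two visually identical strings: one uses the single codepoint é (U+00E9),
# the other uses "e" plus a combining acute accent (U+0301).
composed = "caf\u00e9"
decomposed = "cafe\u0301"
print(composed == decomposed)  # False: the codepoint sequences differ

# NFC normalization canonically composes both to the same form,
# so a tokenizer sees one consistent string either way.
print(unicodedata.normalize("NFC", decomposed) == composed)  # True

def add_special_tokens(tokens):
    # BERT convention: [CLS] marks the sequence start, [SEP] ends a segment.
    # (Hypothetical helper; real tokenizers do this for you.)
    return ["[CLS]"] + tokens + ["[SEP]"]
```

Without a normalization step, the same word can hash or tokenize two different ways depending on how the text was typed, which is exactly the kind of silent bug pre-processing is meant to prevent.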
Then I move through a few lectures on attention: how attention works and the logic behind it. After that comes what I always see as the "hello world" of NLP, sentiment analysis. I think it's a great introduction, and we introduce Transformers in this section. It's worth pointing out as well that I use a lot of different frameworks throughout this course.
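The attention lectures cover this properly; as a rough sketch of the core idea, scaled dot-product attention computes softmax(QK^T / sqrt(d_k)) V, weighting each value by how well its key matches the query. A toy plain-Python version (illustrative only, not course code):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    exps = [math.exp(x - max(xs)) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, on plain lists of vectors."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Output is the weight-averaged mix of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

With one query that matches the first key more strongly, the output leans toward the first value vector, which is the whole trick: the model learns which positions to "pay attention" to.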
So Flair is the very first one; we also use Hugging Face Transformers, which is the primary one we'll be using throughout the course, plus TensorFlow, PyTorch, NLTK, spaCy, and many others as well. So there's a lot in there, and of course a lot of BERT. There are also two projects in the course: the first is sentiment analysis, the second is question answering.
Both of them, I think, are great because they take you all the way through from the very start of a project, getting the data, to actually building your model and applying it to that data. From there we move on to named entity recognition, question answering, and how we measure the performance of our models, which is of course very important. Then we build a full question-answering stack using another library called Haystack, which I think is one of the coolest things in the course, and in NLP in general this sort of stuff is incredibly cool.
Then, like I said, there's that second project, the Q&A project, before we move on to similarity. Similarity is super important in NLP and, I think, probably one of the most promising areas for further research and for the impact it could have on industry. I think it's really a super cool place to be.
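To ground what "similarity" means here: most approaches embed two pieces of text as vectors (with a model like sentence-transformers' BERT variants) and compare them with cosine similarity. A minimal sketch of just the comparison step, with made-up vectors standing in for real embeddings:

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (||a|| * ||b||)
    # 1.0 means same direction (very similar), 0.0 means orthogonal (unrelated).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" (in practice these come from a trained encoder):
query = [0.9, 0.1, 0.3]
doc_similar = [0.8, 0.2, 0.25]
doc_unrelated = [-0.1, 0.9, -0.4]
print(cosine_similarity(query, doc_similar) > cosine_similarity(query, doc_unrelated))
```

In a real similarity or semantic-search setup, you'd score a query embedding against thousands of document embeddings and return the top matches, which is also the retrieval step behind Q&A stacks like Haystack.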
Then finally we move on to fine-tuning. So that's the course in a nutshell. All together there's 11 hours of content, which I think is comparatively long when you look at other NLP courses (we see 11, 10, 10, 3, and 6 hours here), and as far as I'm aware it's the first course on Udemy that focuses on Transformers.
So if you're into NLP, Transformers really are the models you want to be using, so check out the course in the next few days; it's completely free using this code. Thank you for watching, and I hope you enjoy the course.