
Lesson 1 Overview


Chapters

0:00 Introduction
1:23 Teaching
2:17 Paul Lockhart
4:40 The New Electricity
9:00 Intuition
22:58 Suggestions

Whisper Transcript | Transcript Only Page

00:00:00.000 | Hi, and welcome to lesson one of this deep learning MOOC. Thanks for joining us. I'm Jeremy Howard and I'm Rachel Thomas
00:00:08.000 | We're the people who put this together. You'll be seeing my face in front of the camera most of the time
00:00:14.640 | You'll be hearing Rachel's voice, however, and she'll be asking questions that came through a Slack channel, asked by students in the in-person version of this course
00:00:23.040 | So we thought before we started we should tell you a little bit about what to expect and maybe you get to know us a little bit
00:00:29.760 | My background is really in
00:00:31.760 | coding and data. I spent 10 years in management consulting and 10 years running startups, and throughout that time
00:00:40.320 | I've been using data and machine learning to try to solve problems
00:00:43.660 | My background is a lot more academic and theoretical
00:00:47.240 | I have a PhD in math, and then I worked as a quant, and later as a software engineer and data scientist at Uber
00:00:54.960 | One of the most fun and exciting parts of my life was when I spent some time really competing heavily in Kaggle competitions
00:01:01.920 | I was really pleased to win some of those competitions and get to the top of the leaderboard
00:01:06.520 | And I'm hoping to show you guys during this course some of the techniques that I used to do that
00:01:11.720 | I think the techniques that allow you to win Kaggle competitions are the same as the techniques that allow you to get
00:01:17.180 | great results on your own models in solving your own problems
00:01:23.680 | we also both love teaching and so I taught calculus one and two when I was in graduate school and
00:01:30.160 | Then I later left my job as a software engineer to teach full-stack software development to women at Hackbright Academy for a year and a half
00:01:37.200 | I think that's really cool. Rachel was a quant and then worked as both a data scientist and a full-stack engineer at Uber
00:01:43.960 | But she realized that one of the highest leverage things you can do is to teach and it's great fun, too
00:01:50.740 | I feel the same way even when I was running startups. I was creating
00:01:55.280 | course content online. For example, on the left here is an AngularJS tutorial that I originally created for my colleagues at Kaggle
00:02:04.400 | But I recorded it and put it online and it's had over 200,000 views
00:02:08.300 | Makes me feel really good to know that people are learning
00:02:11.980 | From some of the things that I found really helpful myself
00:02:19.080 | So this is a quote from Paul Lockhart, who was
00:02:22.180 | actually working as a primary school math teacher.
00:02:26.000 | He got his PhD in math at Columbia and became a math professor at Brown, and then left to go back to teaching primary school
00:02:32.100 | And he's written a wonderful essay called 'A Mathematician's Lament' about everything that's horrible about how mathematics is taught in the United States
00:02:39.300 | Yeah, I think that essay has been really influential for both Rachel and me
00:02:44.520 | Although Rachel stuck with her math education for decades longer than I did
00:02:49.040 | We both definitely felt like modern mathematics education is not done well
00:02:54.280 | Paul Lockhart uses a wonderful analogy: imagine if, with music,
00:02:59.560 | We didn't allow children to sing or play instruments until they had spent
00:03:04.880 | Years and even decades studying set theory and music notation and could transcribe scales and only then once they were in their 20s
00:03:13.480 | We would let them create music
00:03:15.480 | He says that's exactly what we're doing with mathematics, but that we should let people kind of
00:03:19.880 | Play and create and build patterns with it and something very similar happens with deep learning and how it's taught
00:03:26.480 | In fact, one of my heroes is a guy called David Perkins who at Harvard has created some
00:03:33.540 | Really interesting research about effective educational techniques and he has a very similar analogy to Paul Lockhart
00:03:40.440 | But he talks about baseball: imagine if the way you learned baseball was that you never saw a game of baseball,
00:03:47.360 | but instead you learned about how to stitch a baseball and the physics of a parabola,
00:03:51.920 | and you learned every aspect of baseball, and then after 20 years of study you could be considered good enough to go and actually watch
00:03:58.640 | your first game
00:04:00.040 | We tend to think that this is rather the way that most mathematics, and perhaps particularly deep learning, is really taught
00:04:06.600 | so we
00:04:10.000 | decided when we set up our research lab, fast.ai, that the first thing we would do would be to try and
00:04:15.000 | Fill this need and particularly we decided to focus on deep learning because we both think that deep learning is
00:04:21.920 | The most exciting technology that we have ever seen
00:04:26.160 | We think it's going to be even more transformative than the internet, and so the more people who can participate, the better
00:04:32.880 | Andrew Ng has called it 'the new electricity', kind of to say that it's going to have the impact on society that electricity has had
00:04:41.880 | There are some other kinds of problems we've seen with technical teaching, and I just want to say we're introducing these to tell you
00:04:48.200 | that this course is taught in a very different style,
00:04:50.160 | and so we want to kind of set your expectations ahead of time and motivate why we teach it so differently here
00:04:55.760 | And one is that a lot of the existing deep learning materials are very math-centric, and even as a mathematician and someone who loves math
00:05:04.360 | I found them to be pretty unhelpful for actually building and creating practical applications
00:05:11.040 | In fact every time I see somebody ask on a forum or on Hacker News or whatever
00:05:15.580 | 'What do I need in order to get into deep learning?', a whole bunch of people reply by saying, well, first
00:05:21.280 | You need five years of real analysis and vector analysis
00:05:25.000 | And then you need to study probability and statistics and blah blah blah blah blah and it really comes across to me as
00:05:31.800 | something which is all about being exclusive rather than inclusive. So that's why we have this little
00:05:37.880 | thing: 'making neural nets uncool again' is kind of our
00:05:42.760 | slogan. We're all about not being exclusive, but about making things as simple as possible, but never about
00:05:52.640 | Dumbing it down, right?
00:05:55.280 | another way that
00:05:57.760 | kind of technical education fails is what David Perkins, the Harvard professor Jeremy mentioned a moment ago,
00:06:04.440 | calls 'elementitis', and that's that often math does this so much:
00:06:09.880 | It teaches kind of each separate element and it's only at the end when you've learned all the elements
00:06:15.600 | Needed that you can put them together and see the whole thing and that's kind of what was going on in that baseball analogy
00:06:20.840 | And that happens in a lot of deep learning
00:06:22.440 | It's like, you know, we need to teach you probability theory and we need to teach you information theory and only way later on
00:06:28.200 | Are we going to let you put it together?
00:06:29.440 | You can think of it as being depth first rather than breadth first if you like
00:06:33.560 | So the traditional depth first approach means that you as a student have to trust that at some point all these things are going
00:06:39.760 | To come together and turn into something that's genuinely useful
00:06:43.480 | I think with this breadth first approach you still have to trust but it's different kind of trust
00:06:49.000 | Which is that it's okay that when we first show you an end-to-end process that you don't deeply understand every part
00:06:56.720 | But that you are able to actually do useful things from the very first lesson and that as the lessons go along
00:07:03.980 | you're going to get more and more in-depth understanding of each piece. And two ways that the elementitis or the depth-first
00:07:10.880 | approach fails are: one is motivation.
00:07:13.760 | A lot of students kind of give up because they don't have the motivation of seeing how these are going to fit together.
00:07:18.040 | And then secondly, it's harder because you don't have the context when you're learning all these discrete elements, and you
00:07:24.360 | Can't learn how they're going to fit into the process until later
00:07:27.800 | Right, and in fact this goes together with the idea of using a code-centric
00:07:32.940 | approach rather than a math-centric approach. With a code-centric approach, and looking at the whole game,
00:07:38.080 | That is an end-to-end machine learning process from the very start. It means that you can do experiments
00:07:43.680 | You can actually run experiments and see what goes in and out of each part of the system and build up that intuition
00:07:49.600 | And if this whole game analogy intrigues you, David Perkins has a book called Making Learning Whole, where he goes into a lot more detail
00:07:57.040 | About it. Love that book
00:07:59.040 | So not only are we going to be showing you end-to-end processes from this very first lesson,
00:08:04.960 | but these processes are not going to just end up with good-enough results. Nearly all of the deep learning
00:08:11.680 | educational materials I've seen so far get you to a point where
00:08:15.560 | You can kind of get an okay-ish result now
00:08:19.060 | The whole point of deep learning is that you can get state-of-the-art results and so in the very first piece of code
00:08:24.680 | we're going to run a piece of code which gives you a state-of-the-art result
00:08:28.560 | We know something is a state-of-the-art result if it is better than other approaches that people have tried
00:08:35.600 | The best way to know that is to try things on a Kaggle competition
00:08:39.920 | Having been the president and chief scientist of Kaggle
00:08:42.320 | I saw again and again that every Kaggle competition beat all previous academic state-of-the-art results
00:08:48.300 | So very often in this course
00:08:50.800 | we're actually going to use Kaggle benchmarks and see if we can beat them, because we know if we can, then that's truly a world-class result
00:08:59.240 | So this is actually from a very good book
00:09:04.960 | It's the Ian Goodfellow and Yoshua Bengio Deep Learning book
00:09:08.200 | But it's a very good math book which teaches you the math of deep learning and so in this book when they say
00:09:15.200 | 'Here is how we gain some intuition into how backpropagation through time works', this is how they develop intuition
00:09:22.380 | Rachel is a math PhD. Did you find this helped your intuitions?
00:09:26.240 | We'll have a very different approach to intuition
00:09:29.960 | So this is a good book if you're interested in math and theorems. In this course,
00:09:33.720 | We're really going to be focused on code
00:09:35.720 | In fact, this is what Rachel and I put together when we were trying to explain back prop and specifically
00:09:42.920 | stochastic gradient descent and the use of backprop there: we created a spreadsheet
00:09:47.320 | And we found each time that we taught our students in the in-person course through a spreadsheet
00:09:53.840 | They could see every single piece of what was going on every single intermediate result
00:09:58.760 | And it was very easy for them to experiment with and so one of the unusual things we do is that you'll see that
00:10:04.760 | nearly every major
00:10:07.360 | Idea is presented at some point using a spreadsheet. We present it in many different ways, but spreadsheets
00:10:13.400 | Diagrams and code are three of the key ways that we present these ideas
00:10:18.080 | I believe this is the first deep learning course in the world to implement
00:10:22.440 | a convolutional neural network in an Excel spreadsheet, and also, as you see from this page, not just stochastic gradient descent
00:10:28.640 | but Adagrad, RMSprop, Adam, and even Eve, which just came out a few weeks ago,
00:10:34.240 | all modern examples of
00:10:36.760 | accelerated SGD approaches
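If it helps to see the core idea in code rather than in a spreadsheet, here is a minimal, self-contained sketch of plain stochastic gradient descent fitting a one-variable linear model. This is purely illustrative and is not taken from the course spreadsheets or notebooks.

```python
import numpy as np

# Illustrative sketch only: fit y = a*x + b by stochastic gradient descent,
# updating the parameters after each individual example.
np.random.seed(0)
x = np.random.rand(100)
y = 3.0 * x + 8.0 + 0.1 * np.random.randn(100)   # synthetic data with known slope and intercept

a, b, lr = 0.0, 0.0, 0.1
for epoch in range(30):
    for xi, yi in zip(x, y):
        err = (a * xi + b) - yi        # prediction error for this single example
        a -= lr * 2 * err * xi         # gradient of err**2 with respect to a
        b -= lr * 2 * err              # gradient of err**2 with respect to b

print(a, b)                            # should end up close to 3 and 8
```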
00:10:40.520 | So I think everything you really need to know about the course comes in this very first piece of code that you see
00:10:48.560 | And this very first piece of code that you see you can see that there's a number of things going on
00:10:53.240 | The first is that this piece of code shows not just how to complete a project
00:10:59.720 | but how to get a state-of-the-art result on a project. This particular piece of code gives you 97%
00:11:06.520 | accuracy in determining cats versus dogs
00:11:09.340 | As recently as about five years ago the state-of-the-art for this particular
00:11:14.880 | Problem was about 80% accuracy
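For reference, the code in question looks roughly like the sketch below. This is a paraphrase rather than an exact listing, and it relies on the Vgg16 helper class shipped with the course (vgg16.py), so treat the specific names and arguments here as assumptions; the definitive version is in the lesson 1 notebook.

```python
# Approximate sketch of the lesson 1 code, assuming the course's Vgg16 helper class
from vgg16 import Vgg16

path = "data/dogscats/"                  # train/ and valid/ each contain one folder per class
vgg = Vgg16()                            # loads the pretrained VGG16 model and weights
batches = vgg.get_batches(path + "train", batch_size=64)
val_batches = vgg.get_batches(path + "valid", batch_size=64)
vgg.finetune(batches)                    # swaps the final layer so it predicts our classes
vgg.fit(batches, val_batches, nb_epoch=1)
```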
00:11:18.520 | Um, it's also an example of showing why working with code is so interesting
00:11:25.180 | rather than showing math
00:11:28.000 | What we're showing here is some working code and I'll give an example of what that means you can do
00:11:33.800 | So the code environment that we're working in is something called Jupyter Notebook
00:11:38.140 | And you'll be using this in every single lesson throughout the course, and in Jupyter Notebook, as you can see, we provide you with
00:11:47.600 | prose and information about what's going on and we draw pictures and at any point in time
00:11:53.280 | you can take a look at one of these results, and you can look to see what's going on behind the scenes
00:12:05.360 | So for example in this case
00:12:06.760 | we're running something called vgg.predict and we're getting back some probabilities, and you might wonder, well,
00:12:13.100 | what's vgg.predict actually doing? So at any time you can take anything, put two question marks on the front, and
00:12:19.480 | run that piece of code, and it will actually show you
00:12:23.280 | the full documentation and source code of what you just ran. Now in this case
00:12:28.880 | It's actually running a function that we wrote for you
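In a notebook cell, that looks something like this (assuming the vgg object created earlier in the lesson notebook):

```python
# Prefixing a name with ?? in Jupyter/IPython displays its docstring and full source code
??vgg.predict
```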
00:12:31.920 | One of the other different things about this course is that we're not just showing you how existing libraries work
00:12:38.600 | But every time we found that using somebody else's library takes more than four or five lines of code
00:12:44.660 | We would make sure we found a way to do it easier
00:12:47.860 | So generally speaking we show you how to do things in one line of code, and then you can look behind the scenes and see
00:12:54.040 | what those lines of code are actually doing.
00:12:56.040 | So for example, in this case the predict method is running some other predict method called model.predict, and
00:13:04.280 | so then what I always encourage people to do is to do some experiments. So what does model.predict actually do?
00:13:11.620 | one thing that you can do in Jupyter notebook at any time is to press shift tab a couple of times and
00:13:18.640 | When you press Shift-Tab the first time, it pops up and tells you what
00:13:23.400 | parameters you need to pass this method, and it also tells you what the method actually does
00:13:32.080 | If you press it three times it then gives you additional information about what each of those arguments are and what they're expecting and what it
00:13:39.320 | returns
00:13:40.240 | So it's really nice that using this method you can find out exactly what's going on behind the scenes and do some experiments
00:13:47.840 | And so then, for example, you could find out: okay, well, what is the size and shape of the array
00:13:54.040 | that this thing returns?
00:13:56.040 | What are the first four elements of the classes that are in this
00:14:01.120 | object? And so forth.
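For example, experiments along these lines. Here vgg.model is a standard Keras model, while the imgs batch and the vgg.classes attribute follow the lesson notebook's usage, so treat those two names as assumptions.

```python
preds = vgg.model.predict(imgs)    # imgs: the batch of images loaded earlier in the notebook
print(type(preds), preds.shape)    # what kind of object comes back, and what size and shape is it?
print(preds[:4])                   # peek at the first few rows of predicted probabilities
print(vgg.classes[:4])             # first four class labels, via the course helper's attribute
```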
00:14:05.520 | And this is really the way to use this style of teaching effectively:
00:14:09.400 | It's to have the code in front of you all of the time and in every line look and see what's being passed in
00:14:15.660 | What's coming out? What else could we do with that and then even look at the documentation?
00:14:22.000 | So vgg.model is apparently a keras.models.Sequential. So if we were to just copy that into
00:14:30.580 | Google,
00:14:32.580 | then we can click on the first item and find out exactly what's going on, what is being used here,
00:14:41.640 | what are the other methods that this could take, and then we can try calling some of these other methods and see what kind of results we get, for example something like the sketch below.
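For instance, a couple of standard keras.models.Sequential members you could try from the notebook:

```python
vgg.model.summary()                # prints a layer-by-layer description of the network
print(len(vgg.model.layers))       # how many layers does it have?
print(vgg.model.layers[-1])        # inspect the final layer object
```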
00:14:47.420 | So really, what we're trying to do in the two hours of each lesson
00:14:52.480 | is to give you enough information to get you started with your own experiments
00:14:57.780 | We're not trying to teach you everything and we're certainly not assuming that the lesson can stand alone
00:15:03.760 | And we'll talk more about this in a moment
00:15:05.900 | but the videos are just a small part of the course; the IPython notebooks and the code are a huge resource
00:15:11.140 | And we'll talk about some of the other resources that we have available for you
00:15:14.140 | But the important thing to realize with these
00:15:16.740 | six lines of code is that you can run this for anything, not just for dogs versus cats.
00:15:21.880 | These first six lines of code you learn
00:15:23.860 | will actually, as it says here, work for any image recognition task with any number of categories.
00:15:28.460 | So if you can get this far in today's lesson,
00:15:31.420 | then you've learned to do one of the most important types of computer vision,
00:15:34.940 | which is image classification, for any number of categories, for any type of images
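Concretely, pointing the same lines at your own problem mostly means changing the data path; the helper expects one subfolder per class under train/ and valid/, following the standard Keras directory convention. The folder names below are purely hypothetical.

```python
# Hypothetical example: the same few lines, different data
path = "data/birds_vs_planes/"               # e.g. train/birds, train/planes, valid/birds, valid/planes
vgg = Vgg16()
batches = vgg.get_batches(path + "train", batch_size=64)
val_batches = vgg.get_batches(path + "valid", batch_size=64)
vgg.finetune(batches)                        # adapts the final layer to however many classes you have
vgg.fit(batches, val_batches, nb_epoch=1)
```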
00:15:39.660 | As Rachel said we've actually run this course already
00:15:45.540 | Specifically what you're going to be seeing are the recorded lessons from an in-person course
00:15:50.900 | And we thought it'd be helpful for you to see what some of our students said about that in-person course because it might
00:15:56.740 | Help you to be a more effective learner
00:15:59.300 | And I do want to say again, because this course is taught in such a different way, that
00:16:05.980 | It takes some faith kind of that this new technique is worth trying and kind of sticking with
00:16:12.220 | but you can see that
00:16:15.060 | almost all the students said that the homework assignments were very helpful or
00:16:20.780 | extremely helpful in understanding the material,
00:16:24.420 | and the class resources, which include the wiki, the scripts that we give you, our forums, and our Slack channel,
00:16:31.460 | were very helpful or extremely helpful. And we wanted to mention that because Rachel and I have both been
00:16:38.140 | kind of Coursera addicts in the past and Udacity addicts and
00:16:42.180 | generally speaking we often watch a video at one-and-a-half speed or two-times speed and just zip through them.
00:16:47.920 | This course is not designed to make that possible;
00:16:52.240 | it is designed so that you need to use the homework assignments and the class resources
00:16:57.220 | So as you can see from the people who have already been through this class
00:16:59.900 | They're actually finding that these are really important parts of the overall course
00:17:03.700 | In each video you're kind of seeing an end-to-end process of solving a real problem with deep learning,
00:17:10.860 | and that means that there's not a separate video on, kind of, 'this is everything you need to know about
00:17:15.180 | AWS and your environment' and 'this is everything you need to know about this piece of code',
00:17:20.800 | But rather you're kind of seeing the end-to-end process, but you'll see it again and again throughout the lessons
00:17:26.620 | Now it's okay if you're coming into this course with either a very large amount or a very small amount of data science background
00:17:34.300 | Everybody in the in-person course simply had to have had at least a year of coding experience
00:17:40.500 | Even with that very wide variety of backgrounds,
00:17:43.220 | Nearly everybody said they found the pacing about right for them
00:17:47.900 | And the reason for that, I think, is that we really give people the ability to pick up as much or as little as they want
00:17:54.140 | Through the forums if you want to dig very very deep into advanced topics you can or if absolutely everything is new to you
00:18:03.040 | Then that's fine, too
00:18:04.820 | There'll be more than enough to do just to get through the basic parts of the assignments and of course on the forums
00:18:11.060 | We'd be very happy to help you with all of your questions there
00:18:13.960 | And if you are more advanced we really appreciate your help in adding new material to the wiki
00:18:19.020 | and answering others' questions on the forums. People started their own threads on the forums around kind of related outside topics that they were interested in
00:18:26.880 | There are a lot of different ways to be involved
00:18:29.780 | So here's a couple of quotes we got from people after they completed the in-person course
00:18:34.700 | And this is one that we heard again and again so for example
00:18:38.500 | this person says I personally fell into the habit of watching the lectures too much and
00:18:43.540 | Googling definitions and concepts and so forth too much without running the code at first
00:18:49.180 | I thought that I should read the code quickly and then spend time researching the theory behind it
00:18:53.900 | In retrospect I should have spent the majority of my time on the actual code in the notebooks instead in terms of running it
00:19:01.020 | And seeing what goes into it and seeing what comes out of it and Rachel
00:19:04.900 | I know you've seen similar things in your past teaching experience
00:19:07.760 | I've seen this in teaching full-stack software development to students
00:19:11.940 | And I also know that I've been guilty a bit myself sometimes
00:19:15.140 | And that was that students would sometimes kind of rather than start their project
00:19:19.980 | They would keep doing more and more research reading more and more tutorials and feeling like there's more and more they need to learn before
00:19:25.220 | they can start coding. And two problems with that are: one, and I mean, you want to have some background before you begin,
00:19:31.900 | But there's a point where you just need to start coding
00:19:33.900 | Because you can't know exactly what you're going to need until you start
00:19:37.300 | Coding and building and seeing what errors you get and what things you don't know how to do
00:19:41.740 | And then secondly the test of whether you understand something is whether you can build with it and so kind of reading tutorials
00:19:49.780 | It's very possible to think oh, I understand all this, but it's not till you're writing code yourself
00:19:55.020 | and kind of seeing what your error rates are and what's working and what's not, that you know whether or not you truly
00:20:00.940 | understand something. Yeah, so when I saw students at the study sessions during the week at USF,
00:20:06.860 | I would keep telling them the same thing again
00:20:09.100 | and again: just don't stop and wait till you feel ready to code; start coding now
00:20:13.300 | And it's through that coding experience that you're actually going to figure out what you don't know and what you do know and you'll be
00:20:19.460 | able to develop the intuition by running lots of experiments
00:20:23.500 | This is another interesting quote
00:20:25.500 | from somebody talking about
00:20:27.980 | this learning style. He said it's been very interesting learning from somebody who is an entrepreneur
00:20:33.060 | (that'd be me): a very no-nonsense approach to getting things done, very hands-on, very smart and driven.
00:20:39.900 | Your usual career instructor is quite the opposite.
00:20:42.900 | So it's been refreshing and even somewhat shocking. This is possibly understating things a bit; it can be, in fact.
00:20:49.860 | We heard from quite a few people that at the start it was somewhat shocking to find so many things
00:20:55.140 | taught so quickly, and it kind of
00:20:59.500 | can seem like such a high level. But of course by the end of the seven weeks, and assuming that each week you're putting 10
00:21:06.740 | hours into those weeks, you've actually got
00:21:09.700 | Many many full end-to-end processes under your belt so by the end of it you actually are going to develop a very deep and complete understanding
00:21:19.500 | Yeah, I know after the first lesson
00:21:21.060 | I heard a number of students kind of say things like oh
00:21:23.420 | 'I didn't really get the details from that lesson, and you know,
00:21:26.860 | I feel like I need to spend all this time studying the details.'
00:21:29.700 | And we hadn't taught the details in the first lesson
00:21:32.380 | And the idea is that kind of we went more and more in depth each time
00:21:36.540 | You're seeing this end-to-end process and then kind of as time goes on digging into it more
00:21:41.740 | But even after the first lesson you can apply it you can actually create
00:21:46.500 | World-class image recognition models, and so you can go back to your organization and start trying things
00:21:52.180 | This is something else we encourage people to do try things with your own data and your own problems from the very first lesson
00:21:58.340 | So it's been an interesting experience in every way; even the way we built this course was unusual
00:22:05.440 | For example, I actually wrote most of the material while traveling from the northern tip to the southern tip of Japan. I
00:22:14.700 | coded and wrote in every possible place you can imagine
00:22:19.260 | And this was a really an experiment for me because I studied human learning theory a lot
00:22:25.180 | And I know that in theory human creativity is meant to be better when you have a wider variety of contexts
00:22:32.180 | Interestingly, I actually found I was more productive in that month than I feel I ever have been before and you'll see actually in the
00:22:40.380 | material that you learn we show a lot of
00:22:44.780 | new techniques or different techniques or different ways of thinking about things and I think this kind of
00:22:49.860 | different way of building the course was perhaps what was really helpful in coming up with this kind of more creative approach
00:22:56.140 | So not everybody
00:23:01.700 | in the in-person course was able to put in the at least eight hours a week, but the vast majority were,
00:23:10.060 | and those who didn't
00:23:14.260 | Still completed the course
00:23:15.900 | They just found they didn't necessarily pick everything up the way that they hoped they would
00:23:20.540 | But of course the nice thing is you can always come back to it later
00:23:23.580 | So our suggestion would be now that it's a MOOC now that you don't have to do it every single week
00:23:30.060 | Ideally you will put in the 10 hours a week. Did you want to talk about those 10 hours a little bit Rachel?
00:23:35.860 | Yeah, we wanted to give you kind of some suggestions on how to use that time
00:23:40.660 | So the videos are between two and two and a half hours long
00:23:43.980 | and so
00:23:47.100 | with those videos
00:23:49.260 | You may find it helpful as you review them to use these notes
00:23:53.420 | Yeah, so this is coming from our wiki, wiki.fast.ai. There's a page for each lesson that has
00:23:59.900 | notes
00:24:01.700 | kind of about the lesson. It also has links to other resources
00:24:06.540 | That may be useful to you
00:24:09.340 | These notes are pretty complete. They're not designed to be read entirely independently from the video lesson
00:24:15.380 | But they are something which you can
00:24:17.620 | read on your way to work, maybe, when
00:24:20.660 | it's not convenient to actually watch the lesson
00:24:24.060 | Sorry, I was gonna say we're expecting that you'll watch the lessons more than once, you know
00:24:29.740 | So the first time through you're kind of watching to get maybe a lot of the high-level ideas
00:24:33.980 | Then you'll probably want to read the wiki try out the notebooks and then go back and watch the lesson again
00:24:40.600 | kind of maybe to get more detail. Yeah, I don't think any of our students in the in-person course just watched the lessons once.
00:24:46.940 | They saw them live, of course,
00:24:48.660 | but then they also had the recording from the next day, and I think everybody I've spoken to watched them at least twice
00:24:55.020 | And then of course the other thing you've got is the notebooks. The notebooks, as you see, have quite a lot of prose in them
00:25:01.620 | as well
00:25:03.100 | They've got quite a lot of additional detail that we don't necessarily get into in the video lesson
00:25:08.840 | But most importantly as we described they give you an environment in which you can experiment
00:25:13.780 | In fact, not only do we suggest that you experiment we have a very specific suggestion about how to use these notebooks
00:25:22.080 | Yeah, so we recommend that you
00:25:24.740 | Read through the notebook and then and this is after you've watched the video
00:25:28.540 | At least once and everything makes sense put it aside and try creating a new a new notebook where you go through that process yourself
00:25:36.340 | And so this is from scratch, right? This is like creating your own notebook to test that you can actually build it yourself
00:25:43.460 | We do not want you to just hit shift enter shift enter and run through the existing notebook
00:25:47.940 | Because again, the test of whether you know something is: can you build and code with it yourself?
00:25:54.020 | So if you get stuck you can always then go back and refer to the class notebook and then rather than copying and pasting it
00:26:00.300 | Make sure you do understand what it's saying
00:26:02.900 | Maybe look up some documentation about that concept and then put that notebook aside and see if you can now do it yourself
00:26:09.540 | So in a sense you're plagiarizing a lot from the notebook, but you're plagiarizing in a good way
00:26:15.180 | You know you're plagiarizing not by copying and pasting but by plagiarizing the concepts and making sure that you can recreate them yourself
00:26:23.440 | And then if you have questions,
00:26:26.000 | please ask them. The forums are the first place you should go, and first search to see if someone's already asked your question.
00:26:32.800 | As we said earlier, there's a separate thread for each lesson, and these already
00:26:36.760 | Have tons of helpful questions and answers from the students that took our in-person course
00:26:41.960 | In fact, there's a great quote which we talk about in one of the lessons from the head of Google Brain
00:26:46.600 | Who says that their rule at Google Brain is that if you have a problem you first of all try to fix it yourself for half
00:26:53.360 | an hour
00:26:54.760 | and if after half an hour you can't fix it yourself, you then have to ask somebody. So that ensures that you
00:27:01.680 | Always give it a go yourself and hopefully learn from the experience
00:27:06.520 | But you never waste too much time on something which somebody else can help you with. Yes, it's great advice
00:27:13.880 | So as Rachel said the forums are a really helpful resource and when you go to the forums
00:27:18.540 | You'll find that there's a lot of existing discussions
00:27:21.440 | There's a separate discussion for every lesson, for example, and in each of those discussions
00:27:26.720 | You'll see that there's a summary of the existing discussion at the start
00:27:29.880 | So you may find that what you need is already in the question and answers there
00:27:35.080 | If it's not of course
00:27:36.680 | feel free to add your question, and you'll generally find it's responded to within a small number of hours,
00:27:42.840 | Maybe by Rachel or I or maybe by one of the other students
00:27:46.160 | The other thing you may find helpful is that each lesson has a timeline on the wiki and those hyperlinks are actually hyperlinks
00:27:54.840 | Directly to the part of the video which discusses that topic
00:27:57.960 | So if you're trying to remember how momentum works
00:28:00.960 | You can just click on that link and you'll jump straight to me telling you about momentum
00:28:05.240 | As Rachel said there's also a number of resources available to help you
00:28:12.360 | So this is taken from the front page of our wiki
00:28:15.660 | There's a whole section of tools with links where you can learn more about each of the pieces that we use in the development
00:28:21.820 | Environment and so our goal here is not to be a single source of truth
00:28:25.800 | If somebody else has already done a great job of teaching one of these tools
00:28:29.040 | We'll leave it to them
00:28:30.680 | So we don't attempt to give you a great bash reference or a great numpy reference, because people have already done that
00:28:37.880 | So if you want to learn more about one of these things jump onto the wiki click through here
00:28:42.800 | And you'll find some curated resources that we think are really helpful
00:28:46.220 | So this lesson
00:28:49.840 | You're going to cover a lot of stuff
00:28:52.240 | But these are the four things to keep in mind
00:28:54.440 | By the end of the lesson you want to make sure that you can create an AWS instance
00:28:58.800 | That you can connect to it with SSH
00:29:01.320 | that you can run a Jupyter notebook on it, and that you can run that state-of-the-art custom model code that we showed you earlier.
00:29:08.480 | Those first three things you're going to be doing for every single project in every single lesson,
00:29:14.040 | so you're going to want to be really comfortable doing that. And for those of you who haven't done that before,
00:29:19.040 | It might take you a little while to get the hang of it and maybe a few
00:29:22.520 | unsuccessful attempts first
00:29:25.440 | So this first lesson's unusual in that it's a lot more about
00:29:29.320 | Kind of getting your development environment set up and not as much about deep learning
00:29:33.560 | So indeed if you've got a background in Python and AWS and Linux
00:29:39.240 | You may find this lesson on the easy side, in which case you can zip through it pretty fast
00:29:43.400 | If you don't have a background in these tools today's class may seem really overwhelming
00:29:48.140 | And we don't want you to be discouraged by that, because this is very different from the future lessons
00:29:53.980 | But it's necessary to get your environment set up so that you can be coding throughout the course
00:29:58.120 | Yeah, I mean the folks who didn't have that background in the in-person course, once they actually got through this,
00:30:04.080 | and often it was a lot of work and it was pretty tough,
00:30:06.720 | at the end they finally got to the point where they could say, okay,
00:30:09.600 | I've set up a GPU instance in the cloud
00:30:11.760 | I've set up my development environment and I have trained from scratch a model that can recognize dogs from cats
00:30:18.120 | And it was very very exciting. So if this is hard work for you
00:30:22.240 | Just know that when you get through the other end of it, it's going to be really exciting
00:30:27.120 | So I ask that everyone who's trying to decide if this course is for them
00:30:30.480 | Try at least the first two lessons since the first lesson is so much about setup
00:30:36.040 | Yeah, these lessons,
00:30:38.640 | obviously, build more and more on the techniques we've learned, and so we're going to be using this infrastructure in every lesson
00:30:45.080 | By the time we get to lesson seven, we're going to be looking at some pretty sophisticated and custom neural network architectures
00:30:52.560 | We're going to cover every different type of SGD optimization
00:30:57.040 | We're going to be covering convolutional neural networks and recurrent neural networks. So there's going to be a lot of exciting stuff
00:31:02.360 | and yeah, we really look forward to seeing you on the forums and
00:31:06.680 | Good luck with learning about deep learning