Most Research in Deep Learning is a Total Waste of Time - Jeremy Howard | AI Podcast Clips
- So much of what fast.ai students and researchers do, and the things you teach, are pragmatically minded. So from your experience, what's the difference between theory and practice of deep learning?
- Well, most of the research in the deep learning world is a total waste of time. So that means that they all need to work on the same thing. And the thing they work on, there's nothing to encourage them to work on things that are actually useful.
Whereas the things that really make a difference, like if we can do better at transfer learning, then people can do world-class work with less resources and less data. Or, how do we get more out of the human beings in the loop? Almost nobody works on that, because it's just not a trendy thing right now.
- You know what, somebody was saying that nobody is publishing on active learning, but anybody who actually has to solve a real problem is going to innovate on active learning.
- Yeah, everybody kind of reinvents active learning, because they start labeling things and they think, gosh, this is taking a long time and it's very expensive. Maybe I'll just start labeling those two classes the model is still struggling with. But academia just has no reason to care about practical results.
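The loop practitioners keep reinventing, label a little, train, then spend the remaining labeling budget on whatever the model is least sure about, can be sketched in a few lines. Everything below is an illustrative toy (a one-feature nearest-mean "model" and a stand-in `oracle` labeling function), not anything from fast.ai:

```python
# Toy uncertainty-sampling active learning: label a small seed set,
# train, then query the human only on the most ambiguous points.
import random

def train(labeled):
    # "Model": per-class mean of the 1-D inputs labeled so far.
    means = {}
    for x, y in labeled:
        means.setdefault(y, []).append(x)
    return {y: sum(xs) / len(xs) for y, xs in means.items()}

def uncertainty(model, x):
    # Small gap between the two nearest class means = ambiguous point.
    dists = sorted(abs(x - m) for m in model.values())
    return dists[1] - dists[0] if len(dists) > 1 else dists[0]

def active_learning(pool, oracle, seed_size=4, rounds=3, batch=2):
    random.shuffle(pool)
    labeled = [(x, oracle(x)) for x in pool[:seed_size]]
    unlabeled = pool[seed_size:]
    for _ in range(rounds):
        model = train(labeled)
        # Most ambiguous points first; only those get human labels.
        unlabeled.sort(key=lambda x: uncertainty(model, x))
        for x in unlabeled[:batch]:
            labeled.append((x, oracle(x)))
        unlabeled = unlabeled[batch:]
    return train(labeled)
```

The design choice is the sort key: the labeling budget goes to the points the current model finds hardest to separate, instead of labeling the pool in arbitrary order.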
Like, I've only really ever written one paper. I hate writing papers, and I didn't even write it; it was my colleague, Sebastian Ruder, who actually wrote it. But it was basically introducing successful transfer learning to NLP for the first time.
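The shape of that result, pretrain where data is plentiful, then fine-tune where it is scarce, shows up even on a toy problem. This is purely an illustrative sketch (a one-weight linear model with made-up numbers), not the method from the paper:

```python
# Transfer-learning toy: weights learned on a big "source" task give a
# tiny "target" task a better starting point than random initialization.

def sgd(w, data, epochs, lr=0.1):
    # One-weight linear model y = w * x, squared loss, plain SGD.
    for _ in range(epochs):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x
    return w

# Plentiful source task: y = 3.0 * x. Tiny target task: y = 3.2 * x.
source = [(x / 10, 3.0 * x / 10) for x in range(1, 11)]
target = [(1.0, 3.2), (2.0, 6.4)]

w_pre = sgd(0.0, source, epochs=5)         # "pretrain" on the big dataset
w_transfer = sgd(w_pre, target, epochs=3)  # fine-tune on the small one
w_scratch = sgd(0.0, target, epochs=3)     # same budget, no pretraining
# With an identical target-data budget, the fine-tuned weight ends up
# closer to the target's true slope (3.2) than the from-scratch weight.
```

The gap between `w_transfer` and `w_scratch` is the whole argument: the pretrained weight starts near the answer, so a couple of passes over two examples is enough.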
I thought, I only want to teach people practical stuff, and I think the only practical stuff is transfer learning. But I couldn't find any examples of transfer learning in NLP. And I was shocked to find that as soon as I did it, and the basic prototype took only a couple of days, it worked. And I just thought, well, this is ridiculous. So he kindly offered to write up the results, which ended up at the top computational linguistics conference.
So people do actually care once you do it, but I guess it's difficult for junior researchers. Like, I don't care whether I get citations; there's nothing in my life that makes that important. But I guess they have to pick the kind of safe option, which is, yeah, make a slight improvement on something that everybody's already working on.