What's Next for fastai, with Jeremy Howard
There's a very interesting and informative recent interview with fastai's Jeremy Howard by Sam Charrington of TWIML AI. I first heard it on the TWIML AI Podcast, which you can access here.
It's also available in a slightly longer format as a video if you want to see everyone's smiling faces in the zoom interview.
Jeremy gets into several really interesting topics, which we briefly cover below, expanding on a few of them with some additional pointers. But you should really watch the video, or listen to the podcast, to hear the whole thing.
The interview starts off with some background on Jeremy, and then dives into a conversation focused on the fastai v2 release and its associated book, "Deep Learning for Coders with fastai and PyTorch: AI Applications Without a PhD".
The future of fastai is then discussed. As we alluded to in a recent HTC Seminar Series post featuring an interview with Chris Lattner, Chris's departure from Google has put somewhat of a dent in Jeremy's current thinking on Swift as a possible foundation for a future version of fastai. He also talks a little about the Julia language, and about some recent static typing trends in the Python language community that he's not sure are a good thing.
One really interesting future research direction at fastai discussed in the podcast, and already underway, is self-supervised learning. Self-supervised learning has been explored quite a bit in the NLP world, where it has driven many of the most exciting recent advances, and it is now a hot research topic in computer vision and imaging.
For more information on this topic:
Here's a quick overview of the general concept of Self-Supervised Learning.
Here's a post on 'Self-supervised Learning and Computer Vision' by Jeremy. As you would expect, it's the clearest explanation of the three links I'm providing here for more info.
Here's a medium post on 'Self-Supervised Representation Learning' in image space.
A recent paper on 'Self-supervised Domain Adaptation for Computer Vision Tasks'.
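To give a flavor of the idea, one classic pretext task in vision is predicting which rotation was applied to an image: the labels come for free from the data itself, with no human annotation. Here's a minimal sketch with NumPy (the helper name is our own for illustration, not fastai code):

```python
import numpy as np

def make_rotation_pretext(images, rng):
    """Rotate each image in a batch (N, H, W) by a random multiple of
    90 degrees. Returns (rotated, labels), where labels in 0-3 encode
    the rotation -- 'free' supervision derived from the data itself."""
    ks = rng.integers(0, 4, size=len(images))
    rotated = np.stack([np.rot90(img, k) for img, k in zip(images, ks)])
    return rotated, ks

# A model would first be pretrained to predict `labels` from `rotated`,
# then fine-tuned on the real downstream task with far fewer labels.
rng = np.random.default_rng(0)
images = rng.random((8, 32, 32))
rotated, labels = make_rotation_pretext(images, rng)
```

The network can only solve the pretext task by learning something about image content (edges, object orientation), which is why the learned features transfer to downstream tasks.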
Nbdev is a library that allows you to develop a Python library in Jupyter notebooks. This lets you keep all of your code, tests, and documentation in one place, so you can think of it as a modern realization of much of Donald Knuth's vision for literate programming.
Nbdev provides a lot of useful features, including automatic generation of docs from Jupyter notebooks and a robust two-way sync between notebooks and source code, which lets you use an IDE for code editing and navigation if you wish. It also provides an easy way to host your documentation for free using GitHub Pages.
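To give a flavor of the workflow (a minimal sketch based on the conventions in nbdev's own tutorial): in a notebook, you mark cells for export with a special comment, and nbdev extracts them into your library's .py modules while turning the rest of the notebook into docs and tests:

```python
# In a Jupyter cell, the `#export` comment marks code that
# `nbdev_build_lib` extracts into your library's .py modules.

#export
def say_hello(to):
    "Say hello to somebody"  # docstrings flow into the generated docs
    return f"Hello {to}!"

# Cells without #export serve as examples and tests;
# `nbdev_test_nbs` runs every notebook cell to check them.
assert say_hello("world") == "Hello world!"
```

Because the notebook is the single source of truth, the examples readers see in the docs are the same cells that get executed as tests.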
You can learn all about nbdev here.
Jeremy never ceases to amaze me. We're all eagerly awaiting the upcoming fastai v2 Part 2 series of lectures, hopefully in the not too distant future.
The HTC Education Series: 'Getting Started with Deep Learning' course is based on the fastai v2 course. We include additional material in the HTC course to supplement what is presented in the fastai course.
If you have been taking the course, then after reading Jeremy's post above on 'Self-supervised Learning and Computer Vision', you will realize you already know a lot more about this subject than you might have expected.