### HTC Education Series: Getting Started with Deep Learning - Lesson 6

This week's lesson dives back into the nitty-gritty details of working with deep learning systems using the fastai API. We finish up Chapter 5 of the course book, look more deeply at Softmax, the specifics of how transfer learning works, and fun things you can do with adaptive learning rates to improve system performance.
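Since Softmax is one of the lecture's main topics, here is a minimal NumPy sketch of what it does (my own illustration, not code from the lecture or the fastai library):

```python
import numpy as np

def softmax(logits):
    """Map raw model outputs (logits) to probabilities that sum to 1."""
    shifted = logits - np.max(logits)   # subtract the max for numerical stability
    exps = np.exp(shifted)
    return exps / exps.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs)        # the largest logit gets the largest probability
print(probs.sum())  # probabilities always sum to 1
```

Because the outputs sum to 1, Softmax implicitly assumes exactly one correct answer per input, which is why it has to be swapped out for multi-label problems later in the lecture.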

Then we look at multi-label classification problems (in contrast to the binary classification problems we have examined so far). We show how to work with fastai's DataLoader and Dataset classes, and how to use the DataBlock API with multi-label classification data. We also talk about modifying loss functions to work with multi-label data.
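The key loss-function change for multi-label data is replacing Softmax with a per-label sigmoid plus binary cross-entropy, so each label is an independent yes/no decision. Here is a NumPy sketch of that idea (the example numbers are mine; fastai actually uses PyTorch's `BCEWithLogitsLoss` under the hood rather than this hand-rolled version):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def binary_cross_entropy(logits, targets):
    """Per-label BCE: each output is an independent yes/no decision,
    so one image can carry several labels at once."""
    probs = sigmoid(logits)
    return -(targets * np.log(probs) + (1 - targets) * np.log(1 - probs)).mean()

logits  = np.array([3.0, -2.0, 1.5])   # raw scores for 3 possible labels
targets = np.array([1.0,  0.0, 1.0])   # this image has labels 0 and 2
print(binary_cross_entropy(logits, targets))  # small loss: predictions match
```

Note that unlike Softmax, nothing here forces the per-label probabilities to sum to 1, which is exactly what multi-label data requires.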

We then move on to take a look at deep-learning-based collaborative filtering applications *(like how one might implement a Netflix recommendation system)*, where the deep learning system learns latent factors in the data. The specific example discussed is a recommendation system, but the underlying principles are much more general and can be applied to many different potential application scenarios.
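The core mechanic of latent-factor collaborative filtering is simple: give every user and every item a small learned vector, and predict affinity with a dot product. A tiny NumPy sketch (the factors here are random just to show the mechanics; in a real system they are learned by gradient descent, as the lecture describes):

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_movies, n_factors = 4, 5, 3

# Latent factors: one small vector per user and per movie.
# In a real recommender these are learned parameters, not random numbers.
user_factors  = rng.normal(size=(n_users, n_factors))
movie_factors = rng.normal(size=(n_movies, n_factors))

# Predicted affinity of every user for every movie: one dot product each.
predictions = user_factors @ movie_factors.T
print(predictions.shape)  # (n_users, n_movies)

# Recommend user 0 the movie with the highest predicted score.
print(int(np.argmax(predictions[0])))
```

Training then consists of nudging the factor vectors so the dot products match the ratings you actually observed, and the "latent" dimensions end up encoding things like genre or tone without ever being told to.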

You can also watch this lecture at the fastai course site here. The advantage of this is that you get the course notes, a questionnaire, a transcript of the lecture (available to train your next generative RNN system), etc.

__What is covered in this lecture__

Choosing a correct learning rate is important (you want to train as fast as possible without a learning rate so large that training diverges)

The learning rate finder *(practical)* was only invented in 2015 (fastai was the first API to include it)

Transfer learning *(initially trains just the randomly added new weights in the new output layer)*

Half precision floating point calculations *(16 bit, instead of the normal 32 bit floats)*

*- uses less GPU memory*

*- runs faster*

*- can actually work better (stochastic variations introduced by rounding errors)*
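The memory point is easy to see even on the CPU with NumPy (a rough sketch of my own; on the GPU you would use fastai's mixed-precision support, e.g. `Learner.to_fp16`, rather than anything like this):

```python
import numpy as np

a32 = np.ones(1_000_000, dtype=np.float32)
a16 = a32.astype(np.float16)

# Same values, half the bytes.
print(a32.nbytes, a16.nbytes)

# The price: much coarser rounding. Near 1.0 the gap between
# representable float16 values is about 0.001, so a tiny increment
# rounds away entirely.
print(np.float16(1.0) + np.float16(1e-4))
```

Those rounding errors are the "stochastic variations" mentioned above: they act a bit like noise injected into training, which can sometimes help generalization.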

Multi-label Classification - image can have more than one label associated with it

Pandas (Python library to deal with standard data formats)

Sigmoid - nonlinear function to map a number to be between 0 and 1
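A quick NumPy sketch of the sigmoid's squashing behavior (my own illustration):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

xs = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(sigmoid(xs))   # every output lies strictly between 0 and 1
print(sigmoid(0.0))  # 0.5, the midpoint
```

That guaranteed (0, 1) range is why the sigmoid is the natural choice for turning a raw score into an independent per-label probability in multi-label classification.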

__Additional HTC Course Material__

1: Last week Xander got us pumped up about an exciting new neural net architecture called a GAN (Generative Adversarial Network). Specifically, he detailed how the StyleGAN works.

*GANs are a hot research topic and 3 new ones have probably been invented in the time it took me to type this sentence*.

2: This alternative class of neural networks is called a Variational AutoEncoder (VAE). It is a neural net architecture that learns to compress data without supervision.

__Observations__

1: Classification vs Regression labeling. The *'Regression'* terminology I think is confusing for beginners.

*Historically I think it helped prevent people from seeing important potential applications of neural nets. I certainly saw that to be the case in the 90s (people thought classification was all you could do with these systems). I think it's still confusing as a term today, like in the fastai head-tracking example discussed in this lecture. Don't call it a regression net, just call it a net that outputs 2 floating point numbers (2 numbers that correspond to a 2D position of a point in an image).*

__Don't forget to read the course book__

We finished Chapter 5.

__Need to review something from the previous lessons in the course?__

No problem.

You can access Lesson 1 here.

You can access Lesson 2 here.

You can access Lesson 3 here.

You can access Lesson 4 here.

You can access Lesson 5 here.

You can move on to Lesson 7, the next lesson in the course (when it posts on 11/09/20).
