HTC Seminar #28 - Energy Based Models

This week's HTC Seminar is a presentation by Yann LeCun at ICLR 2020 on energy-based models, titled 'The Future is Self-Supervised'.


Here's an ICLR link to the slides and talk, which ICLR seems to make very hard to view directly on YouTube.


I was first exposed to Yann's 'generative model theory of everything' in the NYU Deep Learning 2020 course lectures.  We'll be presenting some of those lectures in our Generative Model Deep Dive post series.  It's definitely a mind-expanding moment when what he is saying really starts to sink in.

With that in mind, let's check out the reactions from the gang at Machine Learning Street Talk, as they react to and analyze Yann's ICLR presentation.


Observations

1:  The self-supervised learning revolution continues, and here is one of its biggest proponents explaining why we should care about it, if that isn't already obvious from all of the papers published on it over the last year.

2:  Analyzing GAN architectures from this energy-based model viewpoint is fascinating, and it might lead to alternate architecture implementations that achieve the same end goal.  See the sketch below for one way to think about it.
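To make that idea concrete, here is a minimal, hypothetical PyTorch sketch in the spirit of the EBGAN-style margin formulation (not Yann's exact slides): the discriminator is recast as an energy function that is trained to assign low energy to real data and high energy (up to a margin) to generated samples, while the generator is trained to produce samples the energy function scores low.  The network sizes, margin value, and the `train_step` helper are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class EnergyNet(nn.Module):
    """The 'discriminator' viewed as a scalar energy function E(x)."""
    def __init__(self, dim=784):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, 256), nn.ReLU(),
            nn.Linear(256, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)  # one energy value per sample

class Generator(nn.Module):
    """Maps a latent code z to a sample in data space."""
    def __init__(self, z_dim=64, dim=784):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim, 256), nn.ReLU(),
            nn.Linear(256, dim),
        )

    def forward(self, z):
        return self.net(z)

E, G = EnergyNet(), Generator()
opt_E = torch.optim.Adam(E.parameters(), lr=2e-4)
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
margin = 1.0  # illustrative margin for the energy of fake samples

def train_step(real_batch):
    z = torch.randn(real_batch.size(0), 64)
    fake = G(z)

    # Energy function: push real energies down, fake energies up to the margin.
    loss_E = E(real_batch).mean() + torch.relu(margin - E(fake.detach())).mean()
    opt_E.zero_grad(); loss_E.backward(); opt_E.step()

    # Generator: produce samples that the energy function assigns low energy to.
    loss_G = E(G(z)).mean()
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()
    return loss_E.item(), loss_G.item()

# Example usage with random stand-in data:
# losses = train_step(torch.randn(32, 784))
```

The point of the sketch is that the usual real/fake classification loss is just one particular choice of energy shaping; other contrastive or margin-based losses over the same architecture give alternate ways to reach the same end goal.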

3:  Check out these additional resources.

"A Tutorial on Energy-Based Learning' is here.

"Learning Concepts with Energy Functions" is here.

Confused by the above?  Here's Yannic to explain the paper.


