Liquid Neural Networks

 Today's presentation is a talk given by Ramin Hasani of MIT at the MIT Center for Brains, Minds and Machines on 10/8/21. It discusses a new biologically inspired continuous-time neural network architecture.

Abstract: In this talk, we will discuss the nuts and bolts of the novel continuous-time neural network models: Liquid Time-Constant (LTC) Networks. Instead of declaring a learning system's dynamics by implicit nonlinearities, LTCs construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates. LTCs represent dynamical systems with varying (i.e., liquid) time constants, with outputs being computed by numerical differential equation solvers. These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations, and give rise to improved performance on time-series prediction tasks compared to advanced recurrent network models.
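To make the "liquid time constant" idea concrete, here is a minimal sketch of a single LTC-style cell, assuming a sigmoid gate and a simple explicit-Euler solver (the actual LTC networks use learned parameter matrices and more careful fused solvers; the parameter values below are illustrative, not from the talk):

```python
import numpy as np

def ltc_step(x, I, dt, tau, w_x, w_i, b, A):
    """One explicit-Euler step of a liquid time-constant cell.

    Dynamics:  dx/dt = -(1/tau + f) * x + f * A
    where f = sigmoid(w_x * x + w_i * I + b) is a nonlinear gate
    over the state x and input I. Because f multiplies the decay
    term, the effective time constant tau_eff = 1 / (1/tau + f)
    varies with the input -- hence "liquid".
    """
    f = 1.0 / (1.0 + np.exp(-(w_x * x + w_i * I + b)))  # gate in (0, 1)
    dxdt = -(1.0 / tau + f) * x + f * A
    return x + dt * dxdt

# Drive the cell with a step input and integrate numerically.
x, trace = 0.0, []
for t in range(100):
    I = 1.0 if t >= 20 else 0.0          # step input arriving at t = 20
    x = ltc_step(x, I, dt=0.1, tau=1.0, w_x=0.5, w_i=2.0, b=-1.0, A=1.0)
    trace.append(x)
```

Note how the state stays bounded: with `A = 1` and a gate in (0, 1), the fixed point `f * A / (1/tau + f)` is always below `A`, which illustrates the stable, bounded behavior the abstract claims.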

