Intelligence and Generalization
So this video is an exposition in the beginning, followed by a conversation with Francois Chollet about what deep learning does and does not do.
I don't get the 'controversy'. Yes, neural nets do interpolation. We've known that since the late '80s (at least). They do nonlinear function approximation, and in theory they can approximate any continuous function to arbitrary accuracy (the universal approximation theorem).
The reason this is interesting is that real-world data tends to lie on a low-dimensional manifold, which the neural net can learn to model.
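To make the interpolation point concrete, here is a minimal sketch (my own example, not from the video): a one-hidden-layer MLP fit by plain gradient descent to a few samples of sin(x), then queried at a point that lies between the training samples. The architecture, sample count, and learning rate are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse training samples of a nonlinear function
x = np.linspace(0, np.pi, 8).reshape(-1, 1)
y = np.sin(x)

# One hidden layer of 16 tanh units
W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(5000):
    h = np.tanh(x @ W1 + b1)            # hidden activations
    pred = h @ W2 + b2
    err = pred - y
    loss = float((err ** 2).mean())     # mean squared error
    # Backprop through the two layers
    g_pred = 2 * err / len(x)
    g_W2 = h.T @ g_pred; g_b2 = g_pred.sum(0)
    g_h = g_pred @ W2.T
    g_z = g_h * (1 - h ** 2)            # tanh derivative
    g_W1 = x.T @ g_z; g_b1 = g_z.sum(0)
    W2 -= lr * g_W2; b2 -= lr * g_b2
    W1 -= lr * g_W1; b1 -= lr * g_b1

# Query a point that falls between training samples:
# the fitted net interpolates the underlying curve there.
x_new = np.array([[0.6]])
y_hat = np.tanh(x_new @ W1 + b1) @ W2 + b2
```

The point is not the training loop itself but what the fitted function does between the samples: the net's prediction at x = 0.6 tracks the smooth curve implied by the data, which is exactly the interpolative behavior described above.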
Thinking of neural nets as 'running a program' is just a way to mislead yourself. MLST makes that mistake over and over again in the discussions on their videos.
When analyzing an interpolative mapping system, what you care about is the manifold surface it is interpolating, not the mechanism used to build the interpolator.
Onwards.
Observations:
1: What exactly would generalization that is not interpolative even be? Without interpolation, would it still count as generalization?