Lecture

Relative Stability Towards Diffeomorphisms in Deep Nets

Description

This lecture explores relative stability towards diffeomorphisms in deep neural networks, building on the hypothesis that image classification is learnable because classes are invariant to smooth deformations of the input. By sampling typical diffeomorphisms of controlled norm, it examines how a network's stability to smooth input deformations, measured relative to its stability to generic perturbations of the same magnitude, relates to performance. Across architectures ranging from simple CNNs to state-of-the-art models, low relative sensitivity to diffeomorphisms correlates strongly with test accuracy, whereas architectural parameters such as depth or the presence of skip connections matter less. The findings suggest that relative stability to diffeomorphisms is a better predictor of performance than stability to noise alone.
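The relative-sensitivity idea can be made concrete with a small sketch. The snippet below is an illustrative assumption, not the exact construction from the lecture: it builds a random smooth displacement field from low-frequency sine modes (vanishing at the image boundary), applies it by bilinear resampling, and compares a model's response to that deformation against its response to Gaussian noise of matched pixel-space norm. All function names (`smooth_deform`, `relative_sensitivity`) and the mode-amplitude choice are hypothetical.

```python
import numpy as np

def smooth_deform(img, strength=1.0, cutoff=3, rng=None):
    """Warp a square image by a random smooth displacement field.

    The field is a sum of low-frequency sine modes (zero at the boundary),
    applied via bilinear resampling. A simplified stand-in for the
    controlled-norm diffeomorphisms discussed in the lecture.
    """
    if rng is None:
        rng = np.random.default_rng()
    n = img.shape[0]
    y, x = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    u = np.zeros((n, n))
    v = np.zeros((n, n))
    for i in range(1, cutoff + 1):
        for j in range(1, cutoff + 1):
            amp = strength / (i * i + j * j)  # damp high-frequency modes
            basis = np.sin(np.pi * i * x / n) * np.sin(np.pi * j * y / n)
            u += amp * rng.standard_normal() * basis
            v += amp * rng.standard_normal() * basis
    # Bilinear interpolation at the displaced coordinates.
    xs = np.clip(x + u, 0, n - 1)
    ys = np.clip(y + v, 0, n - 1)
    x0 = np.floor(xs).astype(int)
    y0 = np.floor(ys).astype(int)
    x1 = np.minimum(x0 + 1, n - 1)
    y1 = np.minimum(y0 + 1, n - 1)
    wx = xs - x0
    wy = ys - y0
    return ((1 - wy) * ((1 - wx) * img[y0, x0] + wx * img[y0, x1])
            + wy * ((1 - wx) * img[y1, x0] + wx * img[y1, x1]))

def relative_sensitivity(f, imgs, strength=1.0, seed=0):
    """Ratio of mean squared output change under smooth deformations
    to that under Gaussian noise of the same pixel-space norm.
    Low values indicate relative stability to diffeomorphisms."""
    rng = np.random.default_rng(seed)
    num, den = 0.0, 0.0
    for img in imgs:
        deformed = smooth_deform(img, strength, rng=rng)
        delta = deformed - img
        eta = rng.standard_normal(img.shape)
        eta *= np.linalg.norm(delta) / (np.linalg.norm(eta) + 1e-12)
        num += np.sum((f(deformed) - f(img)) ** 2)
        den += np.sum((f(img + eta) - f(img)) ** 2)
    return num / den
```

As a toy check, a 4x4 average-pooling "network" `f = lambda x: x.reshape(4, 4, 4, 4).mean(axis=(1, 3))` on random 16x16 images yields a finite positive ratio; comparing such ratios across models is the kind of measurement the lecture correlates with performance.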

About this result
This page is automatically generated and may contain information that is inaccurate, incomplete, or out of date. Please verify it against EPFL's official sources.