This lecture explores relative stability to diffeomorphisms in deep neural networks, building on the hypothesis that image classification is learnable because classes are invariant under smooth deformations of the input. By sampling typical diffeomorphisms of controlled norm, it examines how a network's stability to smooth input transformations, measured relative to its stability to generic (noise-like) transformations, relates to performance.

Across architectures ranging from simple CNNs to state-of-the-art models, low relative sensitivity to diffeomorphisms emerges as crucial for good performance: relative stability correlates strongly with test error, whereas architectural parameters such as depth or skip connections matter comparatively little. The findings suggest that stability to diffeomorphisms relative to noise is a better predictor of performance than stability to noise alone.
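To make the metric concrete, here is a minimal PyTorch sketch of the relative-sensitivity idea described above. It is an illustration under stated assumptions, not the lecture's exact protocol: the smooth deformation is approximated by upsampling a coarse random displacement field rather than sampling the controlled-norm diffeomorphism ensemble, and the names `smooth_deform` and `relative_sensitivity` are invented for this example.

```python
# A minimal sketch (not the lecture's exact protocol) of relative sensitivity:
# compare a network's output change under a small smooth deformation of the
# input with its change under Gaussian noise of the same pixel norm.
import torch
import torch.nn.functional as F

def smooth_deform(x, amplitude=0.02, grid_size=4):
    """Warp images x of shape (N, C, H, W) with a random smooth displacement
    field, built by upsampling coarse Gaussian noise -- a simple stand-in for
    sampling diffeomorphisms of controlled norm."""
    n, _, h, w = x.shape
    # Coarse random displacements, smoothly interpolated to full resolution.
    coarse = amplitude * torch.randn(n, 2, grid_size, grid_size)
    disp = F.interpolate(coarse, size=(h, w), mode="bicubic", align_corners=False)
    # Identity sampling grid in [-1, 1] coordinates, plus the displacement.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij"
    )
    grid = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(n, -1, -1, -1)
    grid = grid + disp.permute(0, 2, 3, 1)
    return F.grid_sample(x, grid, mode="bilinear", align_corners=False)

@torch.no_grad()
def relative_sensitivity(f, x):
    """R_f = D_f / G_f: sensitivity to smooth deformations divided by
    sensitivity to isotropic noise of matched norm. Smaller values mean the
    network is comparatively more stable to diffeomorphisms."""
    x_diffeo = smooth_deform(x)
    # Gaussian perturbation rescaled to the same per-image norm as the deformation.
    delta = x_diffeo - x
    eta = torch.randn_like(x)
    eta = eta * (delta.flatten(1).norm(dim=1)
                 / eta.flatten(1).norm(dim=1))[:, None, None, None]
    d_f = (f(x_diffeo) - f(x)).pow(2).sum(dim=-1).mean()
    g_f = (f(x + eta) - f(x)).pow(2).sum(dim=-1).mean()
    return (d_f / g_f).item()
```

With a trained classifier `model` and an image batch `x`, `relative_sensitivity(model, x)` returns a scalar; in the lecture's framing, architectures with lower values tend to generalize better. Note that any common normalization of the two sensitivities (e.g. by the typical output variation between inputs) cancels in the ratio, which is why only the ratio is computed here.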