Lecture

Relative Stability Towards Diffeomorphisms in Deep Nets

Description

This lecture explores relative stability towards diffeomorphisms in deep neural networks, proposing that image classification is learnable because classes are invariant to smooth deformations of the input. By studying typical diffeomorphisms of controlled norm, it examines how stability to smooth input transformations, measured relative to stability to generic perturbations of the same size, affects performance. Architectures ranging from simple CNNs to state-of-the-art models are analyzed, showing that low relative sensitivity to diffeomorphisms is crucial for good performance: relative stability correlates strongly with test accuracy, whereas design parameters such as depth or skip connections matter less. The findings suggest that relative stability to diffeomorphisms predicts performance better than stability to noise alone.
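The relative-sensitivity idea above can be sketched numerically: compare how much a network's output moves under a smooth deformation versus under isotropic noise of the same norm. The sketch below is a minimal toy illustration, not the lecture's actual protocol: it uses 1D signals, a sinusoidal displacement field as a stand-in for the typical diffeomorphisms of controlled norm, and a random untrained two-layer network in place of a trained classifier; all function names and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def smooth_deform(x, amplitude=0.02):
    """Warp a 1D signal by a smooth displacement field d(t) = a*n*sin(2*pi*t).
    A stand-in for the random smooth diffeomorphisms studied in the lecture."""
    n = len(x)
    t = np.arange(n) / n
    d = amplitude * n * np.sin(2 * np.pi * t)  # smooth displacement, in samples
    return np.interp(np.arange(n) - d, np.arange(n), x, period=n)

def toy_net(x, W1, W2):
    """A random one-hidden-layer ReLU network standing in for a classifier."""
    return W2 @ np.maximum(W1 @ x, 0.0)

def relative_sensitivity(f, xs, deform, rng):
    """Estimate R_f = E||f(tau x) - f(x)||^2 / E||f(x + eta) - f(x)||^2,
    where eta is isotropic noise matched in norm to tau x - x.
    R_f < 1 means the network is relatively stable to smooth deformations."""
    num, den = 0.0, 0.0
    for x in xs:
        xd = deform(x)
        eta = rng.standard_normal(len(x))
        eta *= np.linalg.norm(xd - x) / np.linalg.norm(eta)  # match perturbation size
        num += np.sum((f(xd) - f(x)) ** 2)
        den += np.sum((f(x + eta) - f(x)) ** 2)
    return num / den

n, hidden, classes = 64, 128, 10
W1 = rng.standard_normal((hidden, n)) / np.sqrt(n)
W2 = rng.standard_normal((classes, hidden)) / np.sqrt(hidden)
xs = [np.cumsum(rng.standard_normal(n)) for _ in range(32)]  # smooth-ish inputs
R = relative_sensitivity(lambda x: toy_net(x, W1, W2), xs, smooth_deform, rng)
print(f"relative sensitivity R_f = {R:.3f}")
```

The lecture's finding is that across architectures, a low value of this ratio on real image data tracks good test performance; a random network like the toy one here gives no such guarantee and serves only to make the measurement concrete.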
