Lecture

Deep Learning: Dimensionality and Data Representation

Description

This lecture explores the loss landscape, performance, and curse of dimensionality in deep learning, focusing on classifying data in high dimensions, the benefits of learning data representations, and stability to smooth deformations. It examines the mechanisms behind deep networks' invariance to deformations, the geometry of loss landscapes, and the phase diagram of deep learning. It also discusses the 'jamming' transition in deep learning, the two limiting algorithms that emerge depending on the number of parameters, and the neural tangent kernel in modern architectures.
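
For concreteness, the neural tangent kernel mentioned above has a standard definition; the following is a minimal sketch, assuming a network with scalar output f(x; θ) and parameter vector θ (notation chosen here for illustration, not taken from the lecture itself):

\[
\Theta(x, x') \;=\; \nabla_\theta f(x;\theta)^\top \, \nabla_\theta f(x';\theta)
\]

In the infinite-width limit, \(\Theta\) stays essentially constant during training, so gradient descent reduces to kernel regression with \(\Theta\); at finite width the kernel evolves and the network learns features, which is presumably the distinction behind the two limiting algorithms referred to above.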
