Ask any question about EPFL courses, lectures, exercises, research, news, etc. or try the example questions below.
DISCLAIMER: The Graph Chatbot is not programmed to provide explicit or categorical answers to your questions. Rather, it transforms your questions into API requests that are distributed across the various IT services officially administered by EPFL. Its sole purpose is to collect and recommend relevant references to content that you can explore to help answer your questions.
Deep learning algorithms are responsible for a technological revolution in a variety of tasks including image recognition or Go playing. Yet, why they work is not understood. Ultimately, they manage to classify data lying in high dimension – a feat generical ...
We present a general theory of Group equivariant Convolutional Neural Networks (G-CNNs) on homogeneous spaces such as Euclidean space and the sphere. Feature maps in these networks represent fields on a homogeneous base space, and layers are equivariant ma ...
Curie's principle states that "when effects show certain asymmetry, this asymmetry must be found in the causes that gave rise to them." We demonstrate that symmetry equivariant neural networks uphold Curie's principle and can be used to articulate many sym ...
A long-standing goal of science is to accurately simulate large molecular systems using quantum mechanics. The poor scaling of current quantum chemistry algorithms on classical computers, however, imposes an effective limit of about a few dozen atoms on tr ...
Neural networks (NNs) have been very successful in a variety of tasks ranging from machine translation to image classification. Despite their success, the reasons for their performance are still not well-understood. This thesis explores two main themes: lo ...
EPFL, 2021
We analyze numerically the training dynamics of deep neural networks (DNN) by using methods developed in statistical physics of glassy systems. The two main issues we address are (1) the complexity of the loss landscape and of the dynamics within it, and ( ...
IOP Publishing Ltd, 2019
Supervised deep learning involves the training of neural networks with a large number N of parameters. For large enough N, in the so-called over-parametrized regime, one can essentially fit the training data points. Sparsity-based arguments would suggest th ...
IOP Publishing Ltd, 2020
Deep learning has been immensely successful at a variety of tasks, ranging from classification to artificial intelligence. Learning corresponds to fitting training data, which is implemented by descending a very high-dimensional loss function. Understandin ...
Two distinct limits for deep learning have been derived as the network width h -> infinity, depending on how the weights of the last layer scale with h. In the neural tangent kernel (NTK) limit, the dynamics becomes linear in the weights and is described b ...
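The two scalings mentioned in this abstract can be illustrated with a minimal sketch: a one-hidden-layer ReLU network with random weights, where only the normalization of the last layer differs. All names and constants here are illustrative, not the paper's code; the point is that at initialization the 1/sqrt(h) (NTK-style) output has O(1) fluctuations while the 1/h (mean-field-style) output concentrates toward zero as h grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def output(x, h, scaling):
    """One-hidden-layer random ReLU network; only the last-layer scaling differs."""
    W = rng.normal(size=(h, x.size))   # hidden-layer weights
    a = rng.normal(size=h)             # last-layer weights
    phi = np.maximum(W @ x, 0.0)       # ReLU features
    if scaling == "ntk":               # f = a . phi / sqrt(h): O(1) output at init
        return a @ phi / np.sqrt(h)
    if scaling == "mean_field":        # f = a . phi / h: output shrinks with h
        return a @ phi / h
    raise ValueError(scaling)

x = rng.normal(size=16)
h = 4096
ntk_std = np.std([output(x, h, "ntk") for _ in range(200)])
mf_std = np.std([output(x, h, "mean_field") for _ in range(200)])
# Across random initializations, the mean-field output's spread is smaller
# than the NTK output's by roughly a factor of sqrt(h).
```

This is only the scaling at initialization; the linearization of the training dynamics in the NTK limit is a separate statement about how the weights move during gradient descent.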
Understanding why deep nets can classify data in large dimensions remains a challenge. It has been proposed that they do so by becoming stable to diffeomorphisms, yet existing empirical measurements suggest that this is often not the case. We revisit this qu ...
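One way such stability can be quantified, sketched here under simplifying assumptions of our own (this is not the paper's protocol), is a relative sensitivity: how much a representation f changes under a small deformation of the input, here the simplest possible diffeomorphism, a one-pixel translation, compared with isotropic noise of the same norm. A ratio below 1 means f is comparatively more stable to the deformation than to generic noise.

```python
import numpy as np

rng = np.random.default_rng(1)

def sensitivity_ratio(f, x, n_noise=200):
    """Sensitivity of f to a 1-pixel translation, relative to
    isotropic noise of the same norm as the translation displacement."""
    d = np.roll(x, 1) - x                                  # deformation displacement
    base = f(x)
    num = np.sum((f(np.roll(x, 1)) - base) ** 2)
    den = 0.0
    for _ in range(n_noise):
        eta = rng.normal(size=x.shape)
        eta *= np.linalg.norm(d) / np.linalg.norm(eta)     # match perturbation norm
        den += np.sum((f(x + eta) - base) ** 2)
    return num / (den / n_noise)

# A smooth input signal and two toy "representations".
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
x = np.sin(t)
identity = lambda z: z
smoothing = lambda z: np.convolve(z, np.ones(5) / 5, mode="same")  # local averaging

r_id = sensitivity_ratio(identity, x)    # exactly 1: the two norms are matched
r_smooth = sensitivity_ratio(smoothing, x)
```

In this toy case the local-averaging map damps noise much more than it damps the smooth translation displacement, so its ratio comes out above 1: matching the abstract's point, smoothing alone does not buy stability to deformations.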