Person

Clément Hongler

2006: B.Sc. Math, EPFL
2008: M.Sc. Math, EPFL
2010: Ph.D. Math, Université de Genève
2010-2014: Ritt Assistant Professor, Columbia University
2014-2018: Tenure-Track Assistant Professor, EPFL
2019-present: Associate Professor, EPFL

Courses taught by this person (3)
MATH-200: Analysis III - complex analysis and vector fields
Learn the basics of vector analysis and complex analysis.
MATH-642: Artificial Life
We will give an overview of the field of Artificial Life (Alife). We study questions such as the emergence of complexity, self-reproduction, and evolution, both through concrete models and through mathematical ...
MATH-434: Lattice models
Lattice models consist of (typically random) objects living on a periodic graph. We will study some models that are mathematically interesting and representative of physical phenomena seen in the real ...
Related publications (19)

Please note that this is not a complete list of this person’s publications. It includes only semantically relevant works. For a full list, please refer to Infoscience.

Conformal Field Theory at the Lattice Level: Discrete Complex Analysis and Virasoro Structure

Clément Hongler

Critical statistical mechanics and Conformal Field Theory (CFT) have been conjecturally connected since the seminal work of Belavin et al. (Nucl Phys B 241(2):333-380, 1984). Both exhibit exactly solvable structures in two dimensions. A long-standing question ( ...
Springer, 2022

Geometry of the Loss Landscape in Overparameterized Neural Networks: Symmetries and Invariances

Wulfram Gerstner, Clément Hongler, Johanni Michael Brea, Francesco Spadaro, Berfin Simsek, Arthur Jacot

We study how permutation symmetries in overparameterized multi-layer neural networks generate 'symmetry-induced' critical points. Assuming a network with $L$ layers of minimal widths $r_1^*, \ldots, r_{L-1}^*$ reaches a zero-loss minimum at $r_1^*! \c ...
2021

Neural Tangent Kernel: Convergence and Generalization in Neural Networks (Invited Paper)

Clément Hongler, Franck Raymond Gabriel, Arthur Jacot

The Neural Tangent Kernel is a new way to understand gradient descent in deep neural networks, connecting them with kernel methods. In this talk, I'll introduce this formalism, give a number of results on the Neural Tangent Kernel, and explain how th ...
Association for Computing Machinery, 2021
