
Publication

# Geometry of the Loss Landscape in Overparameterized Neural Networks: Symmetries and Invariances

Johanni Michael Brea, Wulfram Gerstner, Clément Hongler, Berfin Simsek, Francesco Spadaro

2021

Conference paper


Abstract

We study how permutation symmetries in overparameterized multi-layer neural networks generate 'symmetry-induced' critical points. Assuming a network with $L$ layers of minimal widths $r_1^*, \ldots, r_{L-1}^*$ reaches a zero-loss minimum at $r_1^*! \cdots r_{L-1}^*!$ isolated points that are permutations of one another, we show that adding one extra neuron to each layer is sufficient to connect all these previously discrete minima into a single manifold. For a two-layer overparameterized network of width $r^* + h =: m$, we explicitly describe the manifold of global minima: it consists of $T(r^*, m)$ affine subspaces of dimension at least $h$ that are connected to one another. For a network of width $m$, we identify the number $G(r, m)$ of affine subspaces containing only symmetry-induced critical points that are related to the critical points of a smaller network of width $r < r^*$. Via a combinatorial analysis, we derive closed-form formulas for $T$ and $G$ and show that the number of symmetry-induced critical subspaces dominates the number of affine subspaces forming the global-minima manifold in the mildly overparameterized regime (small $h$), and vice versa in the vastly overparameterized regime ($h \gg r^*$). Our results provide new insights into the minimization of the non-convex loss function of overparameterized neural networks.
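As a small illustration of the permutation count in the abstract (a sketch, not code from the paper; the helper name `num_symmetric_minima` is hypothetical): permuting the neurons within any hidden layer leaves the network function unchanged, so a zero-loss minimum reappears at the product of factorials of the hidden-layer widths.

```python
from math import factorial

def num_symmetric_minima(hidden_widths):
    """Count the isolated zero-loss minima that are permutations of one
    another: the product r_1! * ... * r_{L-1}! over the hidden-layer
    widths, since reordering neurons within a layer (together with their
    incoming and outgoing weights) does not change the network function."""
    count = 1
    for r in hidden_widths:
        count *= factorial(r)
    return count

# A network with hidden widths (3, 2) has 3! * 2! = 12 permuted
# copies of any given zero-loss minimum.
print(num_symmetric_minima([3, 2]))
```

The paper's point is that these $r_1^*! \cdots r_{L-1}^*!$ copies are isolated at minimal width, but become connected into a single manifold once each layer gains one extra neuron.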



Related concepts (15)

Isolated point

In mathematics, a point x is called an isolated point of a subset S (in a topological space X) if x is an element of S and there exists a neighborhood of x that does not contain any other points of S.

Manifold

In mathematics, a manifold is a topological space that locally resembles Euclidean space near each point. More precisely, an n-dimensional manifold, or n-manifold for short, is a topological space with the property that each point has a neighborhood that is homeomorphic to an open subset of n-dimensional Euclidean space.

Affine space

In mathematics, an affine space is a geometric structure that generalizes some of the properties of Euclidean spaces in such a way that these are independent of the concepts of distance and measure of angles, keeping only the properties related to parallelism and ratio of lengths for parallel line segments.

Related publications (15)


Wulfram Gerstner, Julien Mayor

The storage and short-term memory capacities of recurrent neural networks of spiking neurons are investigated. We demonstrate that it is possible to process online many superimposed streams of input. This is despite the fact that the stored information is spread throughout the network. We show that simple output structures are powerful enough to extract the diffuse information from the network. The dimensional blow up, which is crucial in kernel methods, is efficiently achieved by the dynamics of the network itself.

Wulfram Gerstner, Julien Mayor

We investigate the performance of sparsely-connected networks of integrate-and-fire neurons for ultra-short term information processing. We exploit the fact that the population activity of networks with balanced excitation and inhibition can switch from an oscillatory firing regime to a state of asynchronous irregular firing or quiescence depending on the rate of external background spikes. We find that in terms of information buffering the network performs best for a moderate, non-zero, amount of noise. Analogous to the phenomenon of stochastic resonance the performance decreases for higher and lower noise levels. The optimal amount of noise corresponds to the transition zone between a quiescent state and a regime of stochastic dynamics. This provides a potential explanation of the role of non-oscillatory population activity in a simplified model of cortical micro-circuits.

2004

We study the propagation of solitary waves in a discrete excitatory network of integrate-and-fire neurons. We show the existence and the stability of a fast wave and a family of slow waves. Fast waves are similar to those already described in continuum networks. Stable slow waves have not been previously reported in purely excitatory networks, and their propagation is particular to the discrete nature of the network. The robustness of our results is studied in the presence of noise.
