
Lecture: Statistical Physics of Learning

Description

This lecture provides a brief review of 40 years of statistical physics of learning, focusing on the relationship between the structure of neural networks and the results obtained from the statistical physics of disordered systems. It covers topics such as learning algorithms with optimal stability, information storage and retrieval in spin-glass models of neural networks, and the mean field approach to Bayes learning in feed-forward neural networks. The lecture delves into the theoretical foundations and practical implications of statistical mechanics in understanding the learning process in neural networks.
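The "learning algorithms with optimal stability" mentioned above can be sketched with a minimal, minover-style perceptron rule that repeatedly reinforces the least stable pattern. All sizes, seeds, and the stopping criterion below are illustrative assumptions, not material from the lecture:

```python
import numpy as np

# Illustrative sketch (not the lecture's notation): a perceptron storing random
# binary patterns. The stability of pattern mu is kappa_mu = y_mu (w . x_mu)/||w||;
# each step reinforces the pattern with the smallest stability.
rng = np.random.default_rng(0)
N, P = 20, 5                          # input dimension, number of stored patterns
X = rng.choice([-1.0, 1.0], (P, N))   # random binary patterns
y = rng.choice([-1.0, 1.0], P)        # random target outputs

w = np.zeros(N)
for _ in range(500):
    kappa = y * (X @ w)               # un-normalized stabilities
    mu = int(np.argmin(kappa))        # least stable pattern
    if kappa[mu] > 0:
        break                         # all patterns stored with positive margin
    w += y[mu] * X[mu]                # reinforce the weakest pattern

stabilities = y * (X @ w) / np.linalg.norm(w)
```

Running the rule longer would push the minimal stability toward the optimal margin studied in Gardner-style capacity analyses.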

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

In course

PHYS-642: Statistical physics for optimization & learning

This course covers the statistical physics approach to computer science problems, with an emphasis on heuristic and rigorous mathematical techniques, ranging from graph theory to constraint satisfaction.

Instructors (2)

Related concepts (204)

Related lectures (350)

Feedforward neural network

A feedforward neural network (FNN) is one of the two broad types of artificial neural network, characterized by direction of the flow of information between its layers. Its flow is uni-directional, meaning that the information in the model flows in only one direction—forward—from the input nodes, through the hidden nodes (if any) and to the output nodes, without any cycles or loops, in contrast to recurrent neural networks, which have a bi-directional flow.
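The uni-directional flow described above can be sketched as a forward pass through successive layers; the layer sizes, weights, and activation below are arbitrary choices for illustration:

```python
import numpy as np

def forward(x, weights, biases):
    """Single forward pass: information flows input -> hidden -> output,
    with no cycles or loops between layers."""
    a = x
    for W, b in zip(weights, biases):
        a = np.tanh(W @ a + b)  # affine map followed by a nonlinearity
    return a

rng = np.random.default_rng(0)
# A 3-4-2 network: one hidden layer of 4 units (sizes chosen arbitrarily)
weights = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
biases = [np.zeros(4), np.zeros(2)]
out = forward(np.array([1.0, -0.5, 0.2]), weights, biases)
```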

Quantum statistical mechanics

Quantum statistical mechanics is statistical mechanics applied to quantum mechanical systems. In quantum mechanics a statistical ensemble (probability distribution over possible quantum states) is described by a density operator S, which is a non-negative, self-adjoint, trace-class operator of trace 1 on the Hilbert space H describing the quantum system. This can be shown under various mathematical formalisms for quantum mechanics. One such formalism is provided by quantum logic.
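The defining properties of the density operator S (self-adjoint, non-negative, trace 1) can be checked numerically on a toy example; the qubit states and mixing weights below are illustrative:

```python
import numpy as np

# Hypothetical two-level (qubit) ensemble: a density operator built as a
# probabilistic mixture of pure-state projectors.
psi0 = np.array([1.0, 0.0], dtype=complex)                # |0>
psi1 = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)   # |+>
p = [0.7, 0.3]                                            # mixture weights
S = p[0] * np.outer(psi0, psi0.conj()) + p[1] * np.outer(psi1, psi1.conj())

assert np.allclose(S, S.conj().T)               # self-adjoint
assert np.all(np.linalg.eigvalsh(S) >= -1e-12)  # non-negative
assert np.isclose(np.trace(S).real, 1.0)        # trace 1
```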

Recurrent neural network

A recurrent neural network (RNN) is one of the two broad types of artificial neural network, characterized by direction of the flow of information between its layers. In contrast to uni-directional feedforward neural network, it is a bi-directional artificial neural network, meaning that it allows the output from some nodes to affect subsequent input to the same nodes. Their ability to use internal state (memory) to process arbitrary sequences of inputs makes them applicable to tasks such as unsegmented, connected handwriting recognition or speech recognition.
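The internal state (memory) described above amounts to a hidden vector that is fed back at every step; a minimal sketch, with arbitrary sizes and weights:

```python
import numpy as np

def rnn_step(h, x, W_h, W_x, b):
    """One recurrent update: the new hidden state depends on the previous one,
    which is the feedback loop that distinguishes an RNN from a feedforward net."""
    return np.tanh(W_h @ h + W_x @ x + b)

rng = np.random.default_rng(1)
W_h = 0.5 * rng.standard_normal((3, 3))  # hidden-to-hidden weights
W_x = 0.5 * rng.standard_normal((3, 2))  # input-to-hidden weights
b = np.zeros(3)

h = np.zeros(3)
for x in [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]:
    h = rnn_step(h, x, W_h, W_x, b)  # h accumulates information about earlier inputs
```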

Partition function (statistical mechanics)

In physics, a partition function describes the statistical properties of a system in thermodynamic equilibrium. Partition functions are functions of the thermodynamic state variables, such as the temperature and volume. Most of the aggregate thermodynamic variables of the system, such as the total energy, free energy, entropy, and pressure, can be expressed in terms of the partition function or its derivatives. The partition function is dimensionless.
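For a toy two-level system these relations can be computed directly; the energy gap and temperature below are illustrative values:

```python
import numpy as np

# Two-level system with energies 0 and eps, at inverse temperature beta (k_B = 1).
eps, beta = 1.0, 2.0
E = np.array([0.0, eps])
Z = np.sum(np.exp(-beta * E))   # dimensionless partition function
p = np.exp(-beta * E) / Z       # Boltzmann probabilities
U = np.sum(p * E)               # internal (mean) energy
F = -np.log(Z) / beta           # free energy F = -(1/beta) ln Z
S_ent = beta * (U - F)          # entropy, from F = U - TS with k_B = 1

# Cross-check: U also equals -d(ln Z)/d(beta), here via a finite difference.
db = 1e-6
U_fd = -(np.log(np.sum(np.exp(-(beta + db) * E))) - np.log(Z)) / db
assert np.isclose(U, U_fd, atol=1e-4)
```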

Microstate (statistical mechanics)

In statistical mechanics, a microstate is a specific configuration of a system that describes the precise positions and momenta of all the individual particles or components that make up the system. Each microstate has a certain probability of occurring during the course of the system's thermal fluctuations. In contrast, the macrostate of a system refers to its macroscopic properties, such as its temperature, pressure, volume and density.
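The microstate/macrostate distinction is easy to see for a small spin system: each full configuration is a microstate, while the total magnetization (here, the number of "up" spins) labels a macrostate. The system size below is arbitrary:

```python
from itertools import product
from collections import Counter

# Each microstate of N two-state spins is a full configuration; the macrostate
# here is the total number of "up" spins, and several microstates share it.
N = 4
microstates = list(product([0, 1], repeat=N))
multiplicity = Counter(sum(s) for s in microstates)
# 2**N = 16 microstates; macrostate multiplicities follow binomial coefficients:
# {0: 1, 1: 4, 2: 6, 3: 4, 4: 1}
```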

Clustering: k-means (PHYS-467: Machine learning for physicists)

Explains k-means clustering, assigning data points to clusters based on proximity and minimizing squared distances within clusters.
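The assign-then-update loop this lecture describes can be sketched in a few lines; the data, seed, and convergence test below are illustrative choices, not course material:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain k-means: assign each point to its nearest centroid, then move each
    centroid to the mean of its cluster; repeat until the centroids stop moving."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)  # nearest-centroid assignment
        new_centers = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Two well-separated synthetic blobs
X = np.vstack([np.random.default_rng(1).normal(0, 0.1, (20, 2)),
               np.random.default_rng(2).normal(3, 0.1, (20, 2))])
labels, centers = kmeans(X, 2)
```

Each iteration can only decrease the within-cluster sum of squared distances, which is why the loop terminates.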

Landscape and Generalisation in Deep Learning

Explores the challenges and insights of deep learning, focusing on loss landscape, generalization, and feature learning.

Document Analysis: Topic Modeling (DH-406: Machine learning for DH)

Explores document analysis, topic modeling, and generative models for data generation in machine learning.

Neural Networks: Training and Activation (CIVIL-226: Introduction to machine learning for engineers)

Explores neural networks, activation functions, backpropagation, and PyTorch implementation.

Quantum Random Number Generation (PHYS-758: Advanced Course on Quantum Communication)

Explores quantum random number generation, discussing the challenges and implementations of generating good randomness using quantum devices.