Related lectures (31)
Sparse Regression
Covers the concept of sparse regression and the use of Gaussian additive noise in the context of the MAP estimator and regularization.
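
A standard derivation along these lines (assuming a Laplace prior on the weights, which the summary does not specify): with a linear model y = Xw + ε, Gaussian additive noise ε ~ N(0, σ²I), and prior p(w) ∝ exp(−λ‖w‖₁), the MAP estimator reduces to ℓ1-regularized least squares, which promotes sparsity:

    \hat{w}_{\mathrm{MAP}} = \arg\max_w \; p(y \mid X, w)\, p(w)
                           = \arg\min_w \; \frac{1}{2\sigma^2}\,\lVert y - Xw \rVert_2^2 + \lambda\,\lVert w \rVert_1 .
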
Path Integral Representation of Propagators
Covers the path integral representation of propagators for free particles and harmonic oscillators.
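
For reference (a textbook result, not necessarily in the lecture's notation): for a free particle in one dimension the path integral evaluates in closed form, giving the propagator over a time interval T = t_b − t_a:

    K(x_b, t_b; x_a, t_a) = \sqrt{\frac{m}{2\pi i \hbar T}}\; \exp\!\left( \frac{i\, m\, (x_b - x_a)^2}{2 \hbar T} \right).
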
Gradient Descent
Covers gradient descent, a general-purpose iterative algorithm for finding a (local) minimum of a differentiable function.
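
A minimal sketch of the update rule x_{k+1} = x_k − η ∇f(x_k); the function and parameter names below are illustrative, not taken from the lecture:

    import numpy as np

    def gradient_descent(grad, x0, lr=0.1, steps=100):
        # Repeatedly step against the gradient: x <- x - lr * grad(x).
        x = np.asarray(x0, dtype=float)
        for _ in range(steps):
            x = x - lr * grad(x)
        return x

    # Example: f(x) = (x - 3)^2 has gradient 2(x - 3); the iterates approach 3.
    print(gradient_descent(lambda x: 2 * (x - 3), x0=[0.0]))
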
Neural Networks: Training and Activation
Explores neural networks, activation functions, backpropagation, and PyTorch implementation.
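
A minimal PyTorch training step showing how these pieces fit together (layer sizes, loss, and hyperparameters here are illustrative):

    import torch
    import torch.nn as nn

    # Tiny MLP: linear layer, ReLU activation, linear output.
    model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
    loss_fn = nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    x, y = torch.randn(16, 4), torch.randn(16, 1)  # toy batch
    loss = loss_fn(model(x), y)  # forward pass
    optimizer.zero_grad()
    loss.backward()              # backpropagation: gradients of the loss w.r.t. the weights
    optimizer.step()             # gradient step updates the parameters
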
Regularization: Promoting Optimal Solutions
Covers regularization in least-squares problems, promoting optimal solutions while addressing challenges such as non-uniqueness, ill-conditioning, and overfitting.
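
One way to see how regularization addresses all three issues is Tikhonov (ridge) regularization, sketched below with illustrative names; any λ > 0 makes the regularized normal equations uniquely solvable and better conditioned:

    import numpy as np

    def ridge(X, y, lam):
        # Regularized least squares: w = (X^T X + lam * I)^{-1} X^T y.
        d = X.shape[1]
        return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

    X, y = np.random.randn(20, 5), np.random.randn(20)
    w = ridge(X, y, lam=0.1)
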
Quantum Field Theory: Renormalization
Covers the concept of renormalization in quantum field theory, focusing on regularization and the removal of divergences.
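
A schematic example (not necessarily the lecture's): the quadratically divergent one-loop integral can be regulated with a Euclidean momentum cutoff Λ, after which the Λ-dependence is absorbed into a renormalized parameter:

    \int^{\Lambda} \frac{d^4 k_E}{(2\pi)^4}\, \frac{1}{k_E^2 + m^2}
    = \frac{1}{16\pi^2} \left[ \Lambda^2 - m^2 \ln\!\left( 1 + \frac{\Lambda^2}{m^2} \right) \right].
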
Linear Regression: Regularization
Covers linear regression, regularization, and the probabilistic models that generate the labels.
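
In the usual probabilistic reading (an assumption here, since the summary gives no details), labels are generated as y = wᵀx + ε with ε ~ N(0, σ²); adding a Gaussian prior w ~ N(0, τ²I) makes the MAP estimate coincide with ridge regression:

    \hat{w} = \arg\min_w \; \lVert y - Xw \rVert_2^2 + \frac{\sigma^2}{\tau^2}\, \lVert w \rVert_2^2 .
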
Neural Networks: Regularization & Optimization
Explores neural network regularization, optimization, and practical implementation tips.
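
Two common regularizers as they appear in PyTorch (hyperparameters are illustrative): dropout randomly zeroes activations during training, and weight_decay adds an L2 penalty through the optimizer:

    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(8, 1))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)

    model.train()  # dropout active while training
    # ... run the training loop here ...
    model.eval()   # dropout disabled for evaluation
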
Quantum Field Theory: Computing Loops
Covers the computation of loops in quantum field theory, focusing on dimensional regularization and Euclidean space.
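
The workhorse of such computations is the Euclidean master integral in d dimensions; UV divergences appear as poles of the Gamma function as d → 4:

    \int \frac{d^d \ell_E}{(2\pi)^d}\, \frac{1}{(\ell_E^2 + \Delta)^n}
    = \frac{1}{(4\pi)^{d/2}}\, \frac{\Gamma\!\left(n - \frac{d}{2}\right)}{\Gamma(n)}\, \Delta^{\frac{d}{2} - n} .
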
Feynman Rules I: Asymptotic States and Instantons
Covers the Feynman rules, asymptotic states, normal ordering, and instantons.
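
As a one-line illustration of normal ordering (a textbook example, not necessarily the lecture's): moving annihilation operators to the right drops the zero-point term of the harmonic oscillator Hamiltonian:

    H = \hbar\omega \left( a^\dagger a + \tfrac{1}{2} \right)
    \qquad\Longrightarrow\qquad
    {:}H{:} = \hbar\omega\, a^\dagger a .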
