Ask any question about EPFL courses, lectures, exercises, research, news, etc., or try the example questions below.
DISCLAIMER: The Graph Chatbot is not designed to give definitive or categorical answers to your questions. Instead, it transforms your questions into API requests distributed across the various IT services officially administered by EPFL. Its sole purpose is to collect and recommend relevant references to content that you can explore to answer your questions yourself.
Covers gradient descent methods for convex and non-convex problems, including smooth unconstrained convex minimization, maximum likelihood estimation, and examples like ridge regression and image classification.
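As a concrete illustration of the gradient descent and ridge regression topics mentioned above, here is a minimal NumPy sketch, not taken from the lecture material itself: it minimizes the ridge objective with a fixed step size, and the data, step size, and regularization weight are invented for the example.

```python
import numpy as np

# Illustrative sketch only: plain gradient descent on the ridge regression
# objective f(w) = (1/2n)||Xw - y||^2 + (lam/2)||w||^2, with a fixed step size.

def ridge_gradient_descent(X, y, lam=0.1, step=0.1, iters=500):
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / n + lam * w  # gradient of the objective
        w -= step * grad
    return w

# Toy data; the "true" weights below are assumed purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)
print(ridge_gradient_descent(X, y))
```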
Explores the concept of the stationary distribution of a Markov chain, discussing its properties and implications as well as the conditions for positive recurrence.
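To illustrate the stationary-distribution concept, the sketch below (not drawn from the lecture; the transition matrix P is invented) computes the stationary distribution pi of a small finite chain by solving pi P = pi as a left-eigenvector problem.

```python
import numpy as np

# Illustrative sketch: the stationary distribution pi of a finite Markov chain
# with transition matrix P satisfies pi P = pi and sum(pi) = 1. Here pi is
# recovered as the left eigenvector of P associated with eigenvalue 1.

P = np.array([[0.9, 0.1, 0.0],   # example transition matrix (rows sum to 1)
              [0.2, 0.7, 0.1],
              [0.1, 0.3, 0.6]])

eigvals, eigvecs = np.linalg.eig(P.T)        # left eigenvectors of P
idx = np.argmin(np.abs(eigvals - 1.0))       # eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi /= pi.sum()                               # normalise to a probability vector
print(pi, pi @ P)                            # pi @ P equals pi up to rounding
```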
Explores optimization methods, including convexity, gradient descent, and non-convex minimization, with examples like maximum likelihood estimation and ridge regression.
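As a hedged illustration of maximum likelihood estimation in a convex setting, the sketch below fits logistic regression by gradient descent on the average negative log-likelihood; the synthetic data, step size, and choice of logistic regression as the likelihood model are assumptions for the example, not material from the course.

```python
import numpy as np

# Illustrative sketch: maximum likelihood estimation for logistic regression,
# a convex problem, solved with gradient descent on the negative log-likelihood.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic_mle(X, y, step=0.5, iters=1000):
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = sigmoid(X @ w)
        grad = X.T @ (p - y) / n   # gradient of the average negative log-likelihood
        w -= step * grad
    return w

# Synthetic labels drawn from an assumed "true" weight vector, for illustration only.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (sigmoid(X @ np.array([2.0, -1.0])) > rng.uniform(size=200)).astype(float)
print(fit_logistic_mle(X, y))
```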