Augmented Lagrangian Methods for Provable and Scalable Machine Learning
The emergence of Big Data has enabled new research perspectives in the discrete choice community. While the techniques to estimate Machine Learning models on a massive amount of data are well established, these have not yet been fully explored for the esti ...
Many scientific inquiries in natural sciences involve approximating a spherical field, namely a scalar quantity defined over a continuum of directions, from generalised samples of the latter (e.g. directional samples, local averages, etc.). Such an approxim ...
We propose a new non-perturbative method for studying UV complete unitary quantum field theories (QFTs) with a mass gap in general number of spacetime dimensions. The method relies on unitarity formulated as positive semi-definiteness of the matrix of inne ...
We propose a model order reduction approach for non-intrusive surrogate modeling of parametric dynamical systems. The reduced model over the whole parameter space is built by combining surrogates in frequency only, built at few selected values of the param ...
Semidefinite programming (SDP) is a powerful framework from convex optimization that has striking potential for data science applications. This paper develops a provably correct algorithm for solving large SDP problems by economizing on both the storage an ...
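For context, a generic SDP in standard primal form (the notation here is ours, not taken from the excerpt) can be written as

    \min_{X \succeq 0} \; \langle C, X \rangle \quad \text{subject to} \quad \langle A_i, X \rangle = b_i, \quad i = 1, \dots, m,

where the matrix variable X is n-by-n and symmetric, so storing it explicitly already costs O(n^2) memory; storage-efficient solvers for large SDPs aim to avoid ever forming X in full.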
Federated learning is a useful framework for centralized learning from distributed data under practical considerations of heterogeneity, asynchrony, and privacy. Federated architectures are frequently deployed in deep learning settings, which generally giv ...
Popular clustering algorithms based on usual distance functions (e.g., the Euclidean distance) often suffer in high dimension, low sample size (HDLSS) situations, where concentration of pairwise distances and violation of neighborhood structure have advers ...
We present a strikingly simple proof that two rules are sufficient to automate gradient descent: 1) don’t increase the stepsize too fast and 2) don’t overstep the local curvature. No need for functional values, no line search, no information about the func ...
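A minimal sketch of what a stepsize update driven by those two rules might look like; the initial stepsize, growth factor, and curvature constant below are illustrative assumptions rather than the paper's actual choices.

    import numpy as np

    def adaptive_gradient_descent(grad, x0, n_iters=1000):
        # Gradient descent whose stepsize obeys two rules:
        # (1) never grow faster than a fixed factor per iteration, and
        # (2) never exceed a local curvature estimate built from
        #     consecutive gradients (no f-values, no line search).
        x_prev = np.asarray(x0, dtype=float)
        g_prev = grad(x_prev)
        step = 1e-6                       # tiny initial stepsize (assumption)
        x = x_prev - step * g_prev
        for _ in range(n_iters):
            g = grad(x)
            diff_x = np.linalg.norm(x - x_prev)
            diff_g = np.linalg.norm(g - g_prev)
            # Rule 2: cap by local inverse curvature ||x_k - x_{k-1}|| / ||g_k - g_{k-1}||
            curvature_cap = 0.5 * diff_x / diff_g if diff_g > 0 else np.inf
            # Rule 1: cap growth relative to the previous stepsize
            growth_cap = np.sqrt(2.0) * step
            step = min(growth_cap, curvature_cap)
            x_prev, g_prev = x, g
            x = x - step * g
        return x

For a smooth objective such as f(x) = 0.5 * ||A x - b||^2, calling adaptive_gradient_descent(lambda x: A.T @ (A @ x - b), x0) uses gradients only, consistent with the "no functional values, no line search" claim in the excerpt.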
Wasserstein balls, which contain all probability measures within a pre-specified Wasserstein distance to a reference measure, have recently enjoyed wide popularity in the distributionally robust optimization and machine learning communities to formulate an ...
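As a point of reference, and using notation not taken from the excerpt, the Wasserstein ball of radius \varepsilon around a reference measure \hat{P} is commonly written as

    \mathbb{B}_\varepsilon(\hat{P}) = \{ Q \in \mathcal{P}(\Xi) : W(Q, \hat{P}) \le \varepsilon \},

where \mathcal{P}(\Xi) is the set of probability measures on the support set \Xi and W is the Wasserstein distance; distributionally robust formulations then optimise the worst case over all Q in this ball.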
Stochastic optimization is a popular modeling paradigm for decision-making under uncertainty and has a wide spectrum of applications in management science, economics and engineering. However, the stochastic optimization models one faces in practice are int ...