Publication

A first-order primal-dual method with adaptivity to local smoothness

Related publications (143)

Adaptive Stochastic Variance Reduction for Non-convex Finite-Sum Minimization

Volkan Cevher, Kimon Antonakopoulos, Efstratios Panteleimon Skoulakis, Leello Tadesse Dadi, Ali Kavis

We propose an adaptive variance-reduction method, called AdaSpider, for minimization of L-smooth, non-convex functions with a finite-sum structure. In essence, AdaSpider combines an AdaGrad-inspired [Duchi et al., 2011, McMahan & Streeter, 2010], but a fai ...
2022

UNDERGRAD: A Universal Black-Box Optimization Method with Almost Dimension-Free Convergence Rate Guarantees

Volkan Cevher, Kimon Antonakopoulos

Universal methods for optimization are designed to achieve theoretically optimal convergence rates without any prior knowledge of the problem’s regularity parameters or the accuracy of the gradient oracle employed by the optimizer. In this regard, existin ...
2022

Adaptation in Stochastic Algorithms: From Nonsmooth Optimization to Min-Max Problems and Beyond

Ahmet Alacaoglu

Stochastic gradient descent (SGD) and randomized coordinate descent (RCD) are two of the workhorses for training modern automated decision systems. Intriguingly, convergence properties of these methods are not well-established as we move away from the spec ...
EPFL, 2021

Critical Parameters for Scalable Distributed Learning with Large Batches and Asynchronous Updates

Martin Jaggi, Sebastian Urban Stich, Amirkeivan Mohtashami

It has been experimentally observed that the efficiency of distributed training with stochastic gradient descent (SGD) depends decisively on the batch size and—in asynchronous implementations—on the gradient staleness. In particular, it has been observed that the spe ...
2021

Mathematical Foundations of Robust and Distributionally Robust Optimization

Daniel Kuhn, Jianzhe Zhen, Wolfram Wiesemann

Robust and distributionally robust optimization are modeling paradigms for decision-making under uncertainty where the uncertain parameters are only known to reside in an uncertainty set or are governed by any probability distribution from within an ambigu ...
2021

Federated Learning Over Wireless Networks: Convergence Analysis and Resource Allocation

Vincent Gramoli

There is an increasing interest in a fast-growing machine learning technique called Federated Learning (FL), in which the model training is distributed over mobile user equipment (UEs), exploiting UEs' local computation and training data. Despite its advan ...
IEEE, 2021

A Newton Frank-Wolfe method for constrained self-concordant minimization

Volkan Cevher, Quoc Tran Dinh

We develop a new Newton Frank-Wolfe algorithm to solve a class of constrained self-concordant minimization problems using linear minimization oracles (LMO). Unlike L-smooth convex functions, where the Lipschitz continuity of the objective gradient holds gl ...
Springer, 2021

Fast Projection Onto Convex Smooth Constraints

Maryam Kamgarpour, Andreas Krause, Ilnura Usmanova

The Euclidean projection onto a convex set is an important problem that arises in numerous constrained optimization tasks. Unfortunately, in many cases, computing projections is computationally demanding. In this work, we focus on projection problems where ...
PMLR, 2021

On the Equivalence of Youla, System-Level, and Input-Output Parameterizations

Maryam Kamgarpour, Luca Furieri, Yu Zheng

A convex parameterization of internally stabilizing controllers is fundamental for many controller synthesis procedures. The celebrated Youla parameterization relies on a doubly coprime factorization of the system, while the recent system-level and input-o ...
2021

Yapa: Accelerated Proximal Algorithm For Convex Composite Problems

Mireille El Gheche, Giovanni Chierchia

Proximal splitting methods are standard tools for nonsmooth optimization. While primal-dual methods have become very popular in the last decade for their flexibility, primal methods may still be preferred for two reasons: acceleration schemes are more effe ...
IEEE, 2021
