Adaptation in Stochastic Algorithms: From Nonsmooth Optimization to Min-Max Problems and Beyond
Related publications (166)
Despite the widespread use of gradient-based algorithms for optimizing high-dimensional non-convex functions, understanding their ability to find good minima rather than getting trapped in spurious ones remains largely an open problem. Here we fo ...
We introduce a generic two-loop scheme for smooth minimax optimization with strongly-convex-concave objectives. Our approach applies the accelerated proximal point framework (or Catalyst) to the associated dual problem and takes full advantag ...
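To fix ideas, the abstract describes a two-loop structure: a proximal point outer loop on the dual variable, with an existing solver in the inner loop. Below is a minimal, hedged Python sketch of that structure, not the paper's actual algorithm: the names (`grad_x`, `grad_y`, `tau`) are illustrative, the Catalyst acceleration step is omitted, and plain gradient descent-ascent stands in for the inner solver.

```python
import numpy as np

def two_loop_minimax(grad_x, grad_y, x0, y0, tau=1.0,
                     outer_iters=50, inner_iters=200, lr=0.01):
    """Outer loop: proximal point on the dual variable y (acceleration omitted).
    Inner loop: gradient descent-ascent on the regularized subproblem
        min_x max_y f(x, y) - (tau/2) * ||y - y_ref||^2,
    which is strongly-convex-strongly-concave, hence easy for standard solvers.
    """
    x, y = np.array(x0, dtype=float), np.array(y0, dtype=float)
    for _ in range(outer_iters):
        y_ref = y.copy()  # proximal center for this outer iteration
        for _ in range(inner_iters):
            x = x - lr * grad_x(x, y)
            y = y + lr * (grad_y(x, y) - tau * (y - y_ref))
    return x, y
```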
With ever-growing data sizes and the increasing complexity of modern problem formulations, contemporary applications in science and engineering impose heavy computational and storage burdens on optimization algorithms. As a result, there ...
Combining diffusion strategies with complementary properties enables enhanced performance when they can be run simultaneously. In this article, we first propose two schemes for the convex combination of two diffusion strategies, namely, the power-normalize ...
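As an illustration of the idea of convexly combining two adaptive strategies, here is a hedged sketch of the classical combination of two LMS filters with a sigmoid-parametrized mixing weight. It is not the power-normalized scheme the abstract refers to; all names and step sizes are illustrative.

```python
import numpy as np

def convex_combination_lms(X, d, mu_fast=0.05, mu_slow=0.005, mu_a=1.0):
    """Two LMS filters run in parallel; their outputs are mixed by a convex
    weight lam = sigmoid(a), adapted by stochastic gradient on the combined
    squared error so the mixture tracks the currently better component."""
    n, m = X.shape
    w_fast, w_slow, a = np.zeros(m), np.zeros(m), 0.0
    for i in range(n):
        x, d_i = X[i], d[i]
        e_fast = d_i - w_fast @ x
        e_slow = d_i - w_slow @ x
        w_fast += mu_fast * e_fast * x   # fast adaptation, higher steady-state error
        w_slow += mu_slow * e_slow * x   # slow adaptation, lower steady-state error
        lam = 1.0 / (1.0 + np.exp(-a))   # convex mixing weight in (0, 1)
        e = lam * e_fast + (1.0 - lam) * e_slow        # combined-filter error
        a -= mu_a * e * (e_fast - e_slow) * lam * (1.0 - lam)  # descend on e^2/2
    lam = 1.0 / (1.0 + np.exp(-a))
    return lam * w_fast + (1.0 - lam) * w_slow
```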
A broad class of convex optimization problems can be formulated as a semidefinite program (SDP): the minimization of a convex function over the positive-semidefinite cone subject to affine constraints. The majority of classical SDP solvers are designed fo ...
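For context on the SDP template, here is a hedged sketch of one building block that many first-order SDP methods rely on: Euclidean projection onto the positive-semidefinite cone via an eigendecomposition. It is illustrative only and is not taken from the cited solvers.

```python
import numpy as np

def project_psd(A):
    """Project a symmetric matrix onto the positive-semidefinite cone by
    clipping negative eigenvalues: V diag(max(w, 0)) V^T."""
    w, V = np.linalg.eigh(A)
    return (V * np.clip(w, 0.0, None)) @ V.T
```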
We propose two novel conditional gradient-based methods for solving structured stochastic convex optimization problems with a large number of linear constraints. Instances of this template naturally arise from SDP-relaxations of combinatorial problems, whi ...
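As a point of reference for "conditional gradient-based", here is a hedged toy sketch of the vanilla Frank-Wolfe step over the probability simplex. The paper's methods additionally handle stochastic gradients and large numbers of linear constraints, which this sketch does not attempt; `grad` and the step-size schedule are illustrative.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, iters=200):
    """Vanilla conditional gradient: at each step, call a linear minimization
    oracle over the feasible set (here, the simplex: pick the vertex with the
    smallest gradient coordinate) and move toward it."""
    x = np.array(x0, dtype=float)
    for t in range(iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0       # linear minimization oracle on the simplex
        gamma = 2.0 / (t + 2.0)     # standard step-size schedule
        x = (1.0 - gamma) * x + gamma * s
    return x
```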
In this work, we propose a unified theoretical and practical spherical approximation framework for functional inverse problems on the hypersphere. More specifically, we consider recovering spherical fields directly in the continuous domain using functional ...
Optimization algorithms and Monte Carlo sampling algorithms have provided the computational foundations for the rapid growth in applications of statistical machine learning in recent years. There is, however, limited theoretical understanding of the relati ...
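One concrete bridge between the two algorithm families: a gradient step minimizes f, while the same step plus Gaussian noise of matched scale (the unadjusted Langevin algorithm) approximately samples from the density proportional to exp(-f). The sketch below illustrates that connection under assumed names; it is not a construction from the cited work.

```python
import numpy as np

def descend_or_sample(grad_f, x0, lr=0.01, steps=1000, sample=False,
                      rng=np.random.default_rng(0)):
    """sample=False: plain gradient descent on f.
    sample=True: unadjusted Langevin algorithm, x <- x - lr*grad_f(x)
    plus sqrt(2*lr) Gaussian noise, whose iterates approximately sample
    from exp(-f)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad_f(x)
        if sample:
            x = x + np.sqrt(2.0 * lr) * rng.standard_normal(x.shape)
    return x
```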
In this paper, we focus on a theory-practice gap for Adam and its variants (AMSGrad, AdamNC, etc.). In practice, these algorithms are used with a constant first-order moment parameter β₁ (typically between 0.9 and 0.99). In theory, regret guarantees for onl ...
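For reference, a single Adam update with the constant first-order moment parameter β₁ the abstract refers to; the step size and other defaults below are the usual illustrative choices, not values from the paper.

```python
import numpy as np

def adam_step(w, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update at step t >= 1 with constant beta1."""
    m = beta1 * m + (1 - beta1) * g        # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * g * g    # second-moment estimate
    m_hat = m / (1 - beta1 ** t)           # bias corrections
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```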