Harmonic symmetrization of convex sets and of Finsler structures, with applications to Hilbert geometry
We describe the first gradient methods on Riemannian manifolds to achieve accelerated rates in the non-convex case. Under Lipschitz assumptions on the Riemannian gradient and Hessian of the cost function, these methods find approximate first-order critical ...
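As a rough illustration of the setting in this snippet (gradient methods on a Riemannian manifold), the following Python sketch runs plain, non-accelerated Riemannian gradient descent on the unit sphere; the cost function, step size, and normalization retraction are illustrative assumptions and do not reproduce the accelerated method the abstract describes.

import numpy as np

def sphere_rgd(cost_grad, x0, steps=200, lr=0.1):
    """Plain Riemannian gradient descent on the unit sphere S^{n-1}.
    cost_grad(x) returns the Euclidean gradient of the cost at x; it is
    projected onto the tangent space and followed by a normalization
    retraction. Illustrative only, not the accelerated method."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = cost_grad(x)
        rgrad = g - np.dot(g, x) * x      # project onto the tangent space at x
        x = x - lr * rgrad                # tangent step ...
        x = x / np.linalg.norm(x)         # ... then retract back to the sphere
    return x

# Example: leading eigenvector of a symmetric A, i.e. minimize -x^T A x on the sphere.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)); A = (A + A.T) / 2
x_star = sphere_rgd(lambda x: -2 * A @ x, rng.standard_normal(5))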
Covariance operators play a fundamental role in functional data analysis, providing the canonical means to analyse functional variation via the celebrated Karhunen-Loève expansion. These operators may themselves be subject to variation, for instance in con ...
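For concreteness, here is a minimal Python sketch of the empirical Karhunen-Loève expansion for functional data sampled on a common grid: eigendecompose the sample covariance and project the centered curves onto the leading eigenfunctions. The data and truncation level are illustrative assumptions.

import numpy as np

def empirical_kl(X, n_components=3):
    """Empirical Karhunen-Loeve expansion for curves sampled on a common grid.
    X has shape (n_curves, n_grid_points). Returns the mean function, the
    leading eigenvalues/eigenfunctions of the sample covariance, and the scores."""
    mean = X.mean(axis=0)
    Xc = X - mean
    C = Xc.T @ Xc / X.shape[0]            # discretized sample covariance operator
    evals, evecs = np.linalg.eigh(C)
    idx = np.argsort(evals)[::-1][:n_components]
    evals, evecs = evals[idx], evecs[:, idx]
    scores = Xc @ evecs                   # KL coefficients of each curve
    return mean, evals, evecs, scores

# Example: noisy sine curves on a common grid.
t = np.linspace(0, 1, 100)
X = np.sin(2 * np.pi * t) + 0.1 * np.random.default_rng(0).standard_normal((30, 100))
mean, evals, efuncs, scores = empirical_kl(X)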
We consider the problem of finding a saddle point for the convex-concave objective min_x max_y f(x) + ⟨Ax, y⟩ − g*(y), where f is a convex function with locally Lipschitz gradient and g is convex and possibly non-smooth. We propose an ...
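A minimal Python sketch of a fixed-step primal-dual iteration for this saddle-point structure (gradient step on f, proximal step on g*, Chambolle-Pock/Condat-Vũ style); it is a generic baseline under illustrative step sizes, not the specific method proposed in the abstract.

import numpy as np

def primal_dual(grad_f, A, prox_gstar, x0, y0, tau=0.01, sigma=0.01, iters=2000):
    """Fixed-step primal-dual iteration for min_x max_y f(x) + <Ax, y> - g*(y):
    explicit gradient step on f, proximal step on g*, with primal extrapolation."""
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        x_new = x - tau * (grad_f(x) + A.T @ y)
        y = prox_gstar(y + sigma * A @ (2 * x_new - x), sigma)
        x = x_new
    return x, y

# Example: f(x) = 0.5*||x - b||^2 and g = ||.||_1, so g* is the indicator of the
# unit infinity-ball and its prox is a clip; this solves min_x f(x) + ||Ax||_1.
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 5)); b = rng.standard_normal(5)
x, y = primal_dual(lambda x: x - b, A,
                   lambda v, s: np.clip(v, -1.0, 1.0),
                   np.zeros(5), np.zeros(8))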
We present a strikingly simple proof that two rules are sufficient to automate gradient descent: 1) don’t increase the stepsize too fast and 2) don’t overstep the local curvature. No need for functional values, no line search, no information about the func ...
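One way to instantiate the two quoted rules in Python is sketched below: the stepsize is allowed to grow only slowly, and it is simultaneously capped by a local curvature estimate formed from consecutive gradients. The constants and the initial stepsize are illustrative and should not be read as the paper's exact scheme.

import numpy as np

def adaptive_gd(grad, x0, iters=500, lam0=1e-6):
    """Gradient descent whose stepsize (1) is not allowed to grow too fast and
    (2) never oversteps a local curvature estimate built from consecutive
    gradients. Only gradients are used: no function values, no line search."""
    x_prev, g_prev, lam_prev, theta = x0.copy(), grad(x0), lam0, 1e9
    x = x_prev - lam_prev * g_prev
    for _ in range(iters):
        g = grad(x)
        denom = 2.0 * np.linalg.norm(g - g_prev)
        curv_cap = np.linalg.norm(x - x_prev) / denom if denom > 0 else np.inf
        lam = min(np.sqrt(1.0 + theta) * lam_prev, curv_cap)   # rules 1 and 2
        x_prev, g_prev = x, g
        x = x - lam * g
        theta, lam_prev = lam / lam_prev, lam
    return x

# Example on a simple quadratic.
x = adaptive_gd(lambda x: 4.0 * (x - 2.0), np.zeros(3))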
We provide counterexamples to regularity of optimal maps in the classical Monge problem under various assumptions on the initial data. Our construction is based on a variant of the counterexample in [10] to Lipschitz regularity of the monotone optimal map ...
It has been experimentally observed that the efficiency of distributed training with stochastic gradient descent (SGD) depends decisively on the batch size and, in asynchronous implementations, on the gradient staleness. In particular, it has been observed that the spe ...
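To make the staleness effect concrete, the following Python sketch simulates delayed-gradient SGD sequentially: each gradient is computed at the current parameters but only applied a fixed number of steps later. It is a toy single-process simulation under assumed constants, not a distributed implementation.

import numpy as np
from collections import deque

def stale_sgd(grad, x0, staleness=4, steps=200, lr=0.05):
    """Sequential toy model of asynchronous SGD: the update applied at step t
    uses a gradient computed at the parameters from step t - staleness."""
    x = x0.copy()
    pending = deque()
    for _ in range(steps):
        pending.append(grad(x))             # gradient evaluated on current params
        if len(pending) > staleness:
            x = x - lr * pending.popleft()  # applied only 'staleness' steps later
    while pending:
        x = x - lr * pending.popleft()
    return x

x = stale_sgd(lambda x: x - 3.0, np.zeros(2))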
The problem of allocating the closed-loop poles of linear systems in specific regions of the complex plane defined by discrete time-domain requirements is addressed. The resulting non-convex set is inner-approximated by a convex region described with linea ...
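As a small illustration of working with a convex region given by linear constraints, the Python sketch below checks whether the closed-loop poles of a discrete-time system fall inside an intersection of half-planes; the region used here (a polygon inscribed in a disc of radius 0.8) and the matrices are purely illustrative, not the inner approximation constructed in the abstract.

import numpy as np

def poles_in_linear_region(A, B, K, halfplanes):
    """Check whether the poles of the closed loop x+ = (A - B K) x lie in a
    convex region given as half-planes a*Re(z) + b*Im(z) <= c."""
    poles = np.linalg.eigvals(A - B @ K)
    return all(a * z.real + b * z.imag <= c
               for z in poles for (a, b, c) in halfplanes)

# Illustrative region: a regular polygon inscribed in |z| <= 0.8, i.e. a crude
# convex inner approximation of a settling-time requirement.
angles = np.linspace(0, 2 * np.pi, 12, endpoint=False)
region = [(np.cos(t), np.sin(t), 0.8 * np.cos(np.pi / 12)) for t in angles]

A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
K = np.array([[5.0, 6.0]])
ok = poles_in_linear_region(A, B, K, region)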
We consider model order reduction of parameterized Hamiltonian systems describing nondissipative phenomena, like wave-type and transport dominated problems. The development of reduced basis methods for such models is challenged by two main factors: the ric ...
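For orientation, here is a standard Python sketch of building a reduced basis by proper orthogonal decomposition (truncated SVD of solution snapshots). It is not structure-preserving: reduced basis methods for Hamiltonian systems, as in the abstract, additionally need to retain the symplectic structure. The snapshot set is an assumed travelling-wave example, chosen because transport-dominated data is known to compress poorly.

import numpy as np

def pod_basis(snapshots, tol=1e-6):
    """Build a reduced basis from snapshot columns via a truncated SVD,
    keeping enough modes to capture a (1 - tol) fraction of the energy."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(energy, 1.0 - tol)) + 1
    return U[:, :r]                        # columns span the reduced space

# Snapshots of a travelling Gaussian pulse on a grid (transport-dominated).
xgrid = np.linspace(0, 1, 200)
S = np.stack([np.exp(-200 * (xgrid - 0.1 - 0.4 * t)**2)
              for t in np.linspace(0, 1, 50)], axis=1)
V = pod_basis(S)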
This work studies multi-agent sharing optimization problems with the objective function being the sum of smooth local functions plus a convex (possibly non-smooth) function coupling all agents. This scenario arises in many machine learning and engineering ...
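The splitting underlying such sharing problems can be sketched in a few lines of Python: a proximal-gradient loop that takes a gradient step on the sum of the smooth local functions and a proximal step on the non-smooth coupling. This is a centralized, single-node caricature under illustrative data, not the distributed multi-agent algorithm the abstract studies.

import numpy as np

def sharing_prox_grad(grads, prox_g, x0, lr=0.1, iters=500):
    """Proximal gradient for min_x sum_i f_i(x) + g(x): smooth local functions
    enter through their gradients, the coupling g through its prox."""
    x = x0.copy()
    for _ in range(iters):
        g_sum = sum(grad(x) for grad in grads)   # aggregate the smooth part
        x = prox_g(x - lr * g_sum, lr)           # proximal step on the coupling
    return x

# Example: two quadratic "agents" plus an l1 coupling (soft-threshold prox).
grads = [lambda x: x - 1.0, lambda x: 2.0 * (x + 0.5)]
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
x = sharing_prox_grad(grads, lambda v, t: soft(v, 0.1 * t), np.zeros(3))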
While phi-divergences have been extensively studied in convex analysis, their use in optimization problems often remains challenging. In this regard, one of the main shortcomings of existing methods is that the minimization of phi-divergences is usually pe ...
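For reference, a phi-divergence between discrete distributions p and q (with q > 0) is D_phi(p || q) = sum_i q_i * phi(p_i / q_i) for a convex phi with phi(1) = 0; the short Python sketch below evaluates it generically and instantiates the Kullback-Leibler and chi-squared cases. The distributions are illustrative; the optimization machinery discussed in the abstract is not reproduced here.

import numpy as np

def phi_divergence(p, q, phi):
    """Generic phi-divergence D_phi(p || q) = sum_i q_i * phi(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(q * phi(p / q)))

kl   = lambda t: t * np.log(t) - t + 1    # Kullback-Leibler generator
chi2 = lambda t: (t - 1.0) ** 2           # chi-squared generator

p = np.array([0.2, 0.5, 0.3]); q = np.array([0.3, 0.4, 0.3])
d_kl, d_chi2 = phi_divergence(p, q, kl), phi_divergence(p, q, chi2)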