Efficient Greedy Coordinate Descent for Composite Problems
Related publications (35)
Non-convex constrained optimization has become a powerful framework for modeling a wide range of machine learning problems, with applications in k-means clustering, large-scale semidefinite programs (SDPs), and various other tasks. As the perfor ...
One of the main goals of Artificial Intelligence is to develop models capable of providing valuable predictions in real-world environments. In particular, Machine Learning (ML) seeks to design such models by learning from examples coming from this same envi ...
The goal of this thesis is to study continuous-domain inverse problems for the reconstruction of sparse signals and to develop efficient algorithms to solve such problems computationally. The task is to recover a signal of interest as a continuous function ...
We present a strikingly simple proof that two rules are sufficient to automate gradient descent: 1) don’t increase the stepsize too fast and 2) don’t overstep the local curvature. No need for functional values, no line search, no information about the func ...
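As an illustration of the two rules, here is a minimal sketch of an adaptive-stepsize gradient loop; the exact stepsize formula and constants are assumptions standing in for the truncated details of the abstract, with the stepsize capped both by a growth factor relative to the previous step and by a local curvature estimate.

```python
import numpy as np

def adaptive_gradient_descent(grad, x0, n_iters=1000, lam0=1e-6):
    """Gradient descent with an adaptive stepsize (illustrative sketch only).

    The stepsize lam is limited by (1) a cap on how fast it may grow and
    (2) a local curvature estimate ||x_k - x_{k-1}|| / (2 ||g_k - g_{k-1}||);
    the exact constants here are assumptions, not the paper's rule.
    """
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad(x_prev)
    lam_prev, theta_prev = lam0, 1e9
    x = x_prev - lam_prev * g_prev

    for _ in range(n_iters):
        g = grad(x)
        diff_x = np.linalg.norm(x - x_prev)
        diff_g = np.linalg.norm(g - g_prev)
        # Rule 1: don't increase the stepsize too fast.
        growth_cap = np.sqrt(1.0 + theta_prev) * lam_prev
        # Rule 2: don't overstep the local curvature.
        curvature_cap = diff_x / (2.0 * diff_g) if diff_g > 0 else np.inf
        lam = min(growth_cap, curvature_cap)
        x_prev, g_prev = x, g
        x = x - lam * g
        theta_prev, lam_prev = lam / lam_prev, lam
    return x
```

Note that the loop uses only gradient evaluations, consistent with the claim that no functional values or line search are needed.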
We characterize the solution of a broad class of convex optimization problems that address the reconstruction of a function from a finite number of linear measurements. The underlying hypothesis is that the solution is decomposable as a finite sum of compo ...
In this paper we fully describe the trajectory of gradient flow over diagonal linear networks in the limit of vanishing initialisation. We show that the limiting flow successively jumps from one saddle of the training loss to another until reaching the minim ...
A computer-implemented method for reconstructing/recovering high-resolution visible light spectral data at a target resolution d, that comprises obtaining a configuration of a low-resolution multi-channel imaging sensor of resolution p, the configuration ...
We consider the problem of sampling from a density of the form p(x) ∝ exp(-f(x) - g(x)), where f : R^d → R is a smooth function and g : R^d → R is a convex and Lipschitz function. We propose a new algorithm based on the Metropolis-Hastings framework. Under ...
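The specific proposal distribution used by this algorithm is cut off above; for context only, a generic Metropolis-Hastings skeleton for a target known up to a normalizing constant, with a Gaussian random-walk proposal as a stand-in, could look like the following.

```python
import numpy as np

def metropolis_hastings(log_density, x0, n_samples=10_000, step=0.1, rng=None):
    """Generic random-walk Metropolis-Hastings sampler (illustrative only).

    log_density(x) returns log p(x) up to an additive constant, e.g.
    -f(x) - g(x) for a target p(x) proportional to exp(-f(x) - g(x)).
    The Gaussian proposal is a placeholder, not the cited paper's proposal.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    logp_x = log_density(x)
    samples = []
    for _ in range(n_samples):
        y = x + step * rng.standard_normal(x.shape)   # symmetric proposal
        logp_y = log_density(y)
        # Accept with probability min(1, p(y)/p(x)); symmetry cancels the proposal terms.
        if np.log(rng.uniform()) < logp_y - logp_x:
            x, logp_x = y, logp_y
        samples.append(x.copy())
    return np.array(samples)
```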
Inverse reconstruction from images is a central problem in many scientific and engineering disciplines. Recent progress on differentiable rendering has led to methods that can efficiently differentiate the full process of image formation with respect to mi ...
We develop a new Newton Frank-Wolfe algorithm to solve a class of constrained self-concordant minimization problems using linear minimization oracles (LMO). Unlike L-smooth convex functions, where the Lipschitz continuity of the objective gradient holds gl ...
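For reference, a plain Frank-Wolfe iteration built around a linear minimization oracle (LMO) is sketched below; the constraint set (a unit l1 ball) and the 2/(k+2) stepsize are assumptions for illustration, and the Newton-type model of the cited paper is not reproduced.

```python
import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    """LMO for the l1 ball: argmin over ||s||_1 <= radius of <grad, s>."""
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -radius * np.sign(grad[i])
    return s

def frank_wolfe(grad, x0, lmo=lmo_l1_ball, n_iters=200):
    """Vanilla Frank-Wolfe (conditional gradient) with the classical 2/(k+2) stepsize.

    Illustrative sketch only: the cited Newton Frank-Wolfe method replaces
    the plain gradient with a Newton-type model for self-concordant
    objectives, which is not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        s = lmo(grad(x))                 # call the linear minimization oracle
        gamma = 2.0 / (k + 2.0)          # open-loop stepsize
        x = (1.0 - gamma) * x + gamma * s
    return x
```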