Convex Optimization using Sparsified Stochastic Gradient Descent with Memory
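The title refers to SGD with gradient sparsification and an error-feedback memory: only the largest coordinates of each scaled stochastic gradient are applied, and the discarded mass is stored and re-injected at the next step. Below is a minimal sketch of that scheme, assuming top-k sparsification and a toy quadratic objective; the helper names `topk_sparsify` and `sparsified_sgd_with_memory` are illustrative, not taken from the source.

```python
import numpy as np

def topk_sparsify(v, k):
    """Keep the k largest-magnitude entries of v and zero out the rest."""
    out = np.zeros_like(v)
    if k <= 0:
        return out
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def sparsified_sgd_with_memory(grad, x0, lr=0.1, k=2, iters=500, seed=0):
    """SGD where only a sparse part of each scaled gradient is applied;
    the discarded part is kept in a memory vector and added back next step."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    m = np.zeros_like(x)          # memory of unapplied gradient mass
    for _ in range(iters):
        g = grad(x, rng)          # stochastic gradient
        u = m + lr * g            # add back what was previously dropped
        s = topk_sparsify(u, k)   # apply/communicate only k coordinates
        m = u - s                 # remember the rest for later
        x -= s
    return x

# Toy strongly convex problem: f(x) = 0.5 * ||x - b||^2 with noisy gradients.
b = np.array([1.0, -2.0, 3.0, 0.5, -1.5])
grad = lambda x, rng: (x - b) + 0.01 * rng.standard_normal(x.shape)
x_hat = sparsified_sgd_with_memory(grad, np.zeros_like(b))
print(np.round(x_hat, 2))  # should approach b despite the heavy sparsification
```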
We present a framework based on convex optimization and spectral regularization to perform learning when feature observations are multidimensional arrays (tensors). We give a mathematical characterization of spectral penalties for tensors and analyze a uni ...
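A common concrete choice of spectral penalty for tensors is the overlapped nuclear norm, i.e. the sum of nuclear norms of the mode-wise unfoldings. The sketch below assumes that penalty purely as an illustration; the helper names and the low multilinear-rank example are not taken from the abstract.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: move `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def overlapped_nuclear_norm(T):
    """Sum of nuclear norms (singular-value sums) of all mode unfoldings,
    a standard convex spectral penalty encouraging low multilinear rank."""
    return sum(np.linalg.norm(unfold(T, m), ord='nuc') for m in range(T.ndim))

# Example: a 3-way tensor of low multilinear rank receives a small penalty.
rng = np.random.default_rng(0)
A, B, C = rng.standard_normal((4, 2)), rng.standard_normal((5, 2)), rng.standard_normal((6, 2))
core = rng.standard_normal((2, 2, 2))
T = np.einsum('ia,jb,kc,abc->ijk', A, B, C, core)
print(round(overlapped_nuclear_norm(T), 2))
```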
We propose an algorithmic framework for convex minimization problems of a composite function with two terms: a self-concordant function and a possibly nonsmooth regularization term. Our method is a new proximal Newton algorithm that features a local quadra ...
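A hedged sketch of how a proximal Newton step of this kind can look for an l1-regularized problem: the local quadratic model of the smooth term is minimized approximately (here by an inner proximal gradient loop) together with the regularizer, and a damped step of the sort used in self-concordant analyses is taken. The concrete l1/quadratic instance and the function names are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_l1(f_grad, f_hess, x0, lam, outer=20, inner=50):
    """Sketch of a proximal Newton scheme for min f(x) + lam*||x||_1: minimize the
    local quadratic model of f plus the l1 term, then take a damped step."""
    x = x0.copy()
    for _ in range(outer):
        g, H = f_grad(x), f_hess(x)
        L = np.linalg.eigvalsh(H).max()   # Lipschitz constant of the model's gradient
        z = x.copy()
        for _ in range(inner):            # inner proximal gradient on the quadratic model
            model_grad = g + H @ (z - x)
            z = soft_threshold(z - model_grad / L, lam / L)
        d = z - x                         # proximal Newton direction
        lam_k = np.sqrt(d @ H @ d)        # local decrement-like quantity
        alpha = 1.0 / (1.0 + lam_k)       # damped step, as in self-concordant analyses
        x = x + alpha * d
    return x

# Toy example: f(x) = 0.5 * x'Qx - b'x with an l1 penalty.
Q = np.array([[3.0, 0.5], [0.5, 2.0]]); b = np.array([1.0, -1.0])
f_grad = lambda x: Q @ x - b
f_hess = lambda x: Q
print(np.round(prox_newton_l1(f_grad, f_hess, np.zeros(2), lam=0.1), 3))
```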
This paper introduces a set of algorithms for Monte-Carlo Bayesian reinforcement learning. Firstly, Monte-Carlo estimation of upper bounds on the Bayes-optimal value function is employed to construct an optimistic policy. Secondly, gradient-based algorithm ...
We propose a convergence analysis of accelerated forward-backward splitting methods for composite function minimization, when the proximity operator is not available in closed form, and can only be computed up to a certain precision. We prove that the 1/k( ...
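The setting described here, an accelerated forward-backward (FISTA-type) method whose proximity operator is computed only up to a certain precision, can be sketched for a total-variation regularizer: its prox has no closed form and is obtained by a few projected-gradient steps on a dual problem. The TV example and the growing inner-iteration schedule below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def inexact_prox_tv1d(v, lam, inner_iters):
    """Approximate prox of lam*||Dz||_1 (1D total variation) at v, via projected
    gradient on the dual; more inner iterations means higher precision."""
    n = v.size
    D = np.diff(np.eye(n), axis=0)        # finite-difference operator, shape (n-1, n)
    u = np.zeros(n - 1)                   # dual variable, constrained to ||u||_inf <= lam
    step = 0.25                           # safe step for the 1D difference operator
    for _ in range(inner_iters):
        u = np.clip(u - step * D @ (D.T @ u - v), -lam, lam)
    return v - D.T @ u                    # primal prox recovered from the dual point

def inexact_fista(A, b, lam, iters=100):
    """Accelerated forward-backward splitting for 0.5*||Ax-b||^2 + lam*||Dx||_1,
    with the proximity operator computed only approximately (tighter as k grows)."""
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the smooth gradient
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for k in range(1, iters + 1):
        grad = A.T @ (A @ y - b)
        x_new = inexact_prox_tv1d(y - grad / L, lam / L, inner_iters=5 * k)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + (t - 1) / t_new * (x_new - x)   # Nesterov extrapolation
        x, t = x_new, t_new
    return x

# Piecewise-constant signal recovery from noisy linear measurements.
rng = np.random.default_rng(1)
x_true = np.concatenate([np.ones(10), 3 * np.ones(10)])
A = rng.standard_normal((30, 20)); b = A @ x_true + 0.05 * rng.standard_normal(30)
print(np.round(inexact_fista(A, b, lam=1.0), 1))
```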
Machine learning is most often cast as an optimization problem. Ideally, one expects a convex objective function, so that one can rely on efficient convex optimizers with nice guarantees such as the absence of local optima. Yet, non-convexity is very frequent in practice and it may ...
Vision is a natural tool for human-computer interaction, since it provides visual feedback to the user and mimics some human behaviors. It requires however the fast and robust computation of motion primitives, which remains a difficult problem. In this ...
A new decomposition optimization algorithm, called path-following gradient-based decomposition, is proposed to solve separable convex optimization problems. Unlike path-following Newton methods considered in the literature, this algorithm does not require ...
This paper proposes a smoothing technique for nonsmooth convex minimization using self-concordant barriers. To illustrate the main ideas, we compare our technique and the proximity smoothing approach (Nesterov, 2005) via the classical gradient method on both ...
We consider the class of convex minimization problems composed of a self-concordant function, such as the logdet metric, a convex data fidelity term h, and a regularizing, possibly non-smooth, function g. This type of problem has recently attracted ...
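A canonical member of this problem class is sparse inverse-covariance estimation, where the self-concordant part is -log det, the data fidelity is a linear trace term, and g is an l1 penalty. The objective sketch below uses that instance purely as an illustration; it is not claimed to be the paper's example.

```python
import numpy as np

def composite_objective(X, S, lam):
    """F(X) = -log det(X) + <S, X> + lam * ||X||_1: a self-concordant term,
    a convex data-fidelity term, and a non-smooth regularizer (illustrative instance)."""
    sign, logdet = np.linalg.slogdet(X)
    assert sign > 0, "X must be positive definite"
    smooth = -logdet + np.trace(S @ X)      # self-concordant part + data fidelity h
    return smooth + lam * np.abs(X).sum()   # + non-smooth regularizer g

# Empirical covariance of a few Gaussian samples and a candidate precision matrix.
rng = np.random.default_rng(2)
samples = rng.standard_normal((50, 3))
S = np.cov(samples, rowvar=False)
print(round(composite_objective(np.eye(3), S, lam=0.1), 3))
```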
We investigate a compressive sensing system in which the sensors introduce a distortion to the measurements in the form of unknown gains. We focus on blind calibration, using measurements performed on a few unknown (but sparse) signals. We extend our earlier s ...