A Linearly Convergent Proximal Gradient Algorithm for Decentralized Optimization
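The title names a proximal gradient method for decentralized optimization. As a point of reference, below is a minimal sketch of a generic decentralized proximal-gradient iteration (a DGD-style update with a mixing matrix, not necessarily the linearly convergent algorithm the paper proposes); the quadratic local losses, the l1 regularizer, the mixing matrix W, and the step size are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def decentralized_prox_grad(A, b, W, lam=0.1, alpha=0.01, iters=500):
    """Proximal decentralized gradient sketch for
    min_x  sum_i 0.5 * ||A_i x - b_i||^2 + lam * ||x||_1,
    where agent i sees only (A_i, b_i) and mixes iterates with
    neighbors through the (assumed doubly stochastic) matrix W."""
    n, d = len(A), A[0].shape[1]
    X = np.zeros((n, d))                       # one local copy of x per agent
    for _ in range(iters):
        mixed = W @ X                          # consensus / mixing step
        grads = np.stack([A[i].T @ (A[i] @ X[i] - b[i]) for i in range(n)])
        X = soft_threshold(mixed - alpha * grads, alpha * lam)  # prox step
    return X.mean(axis=0)                      # average of local iterates

# Toy instance: 3 agents on a small network, shared sparse ground truth.
rng = np.random.default_rng(0)
x_true = np.array([1.0, 0.0, -2.0, 0.0])
A = [rng.standard_normal((10, 4)) for _ in range(3)]
b = [Ai @ x_true + 0.01 * rng.standard_normal(10) for Ai in A]
W = np.array([[0.5, 0.25, 0.25], [0.25, 0.5, 0.25], [0.25, 0.25, 0.5]])
print(decentralized_prox_grad(A, b, W))
```

This basic DGD-style update only reaches a neighborhood of the solution under a fixed step size; linearly convergent schemes such as PG-EXTRA or NIDS add correction or gradient-tracking terms on top of it, which is the flavor of guarantee the title refers to.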
We propose new regularization models to solve inverse problems encountered in biomedical imaging applications. In formulating mathematical schemes, we base our approach on the sparse signal processing principles that have emerged as a central paradigm in t ...
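To anchor the sparsity principle this abstract invokes, here is a standard ISTA loop for an l1-regularized linear inverse problem, a common baseline in sparse biomedical reconstruction; it is a generic sketch, not the specific regularization models proposed above, and the operator H, data y, and lam are placeholders.

```python
import numpy as np

def ista(H, y, lam=0.05, iters=300):
    """Generic ISTA loop for the sparse linear inverse problem
    min_x 0.5 * ||H x - y||^2 + lam * ||x||_1  (illustrative baseline)."""
    L = np.linalg.norm(H, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(H.shape[1])
    for _ in range(iters):
        grad = H.T @ (H @ x - y)               # gradient of the data-fidelity term
        z = x - grad / L                       # forward (gradient) step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage step
    return x
```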
We develop an effective distributed strategy for seeking the Pareto solution of an aggregate cost consisting of regularized risks. The focus is on stochastic optimization problems where each risk function is expressed as the expectation of some loss functi ...
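To make that setting concrete, here is a hedged sketch of a diffusion-style (adapt-then-combine) stochastic update for an aggregate of regularized quadratic risks; the squared loss, the l2 regularizer, and the combination matrix W are illustrative assumptions, not the strategy developed in the paper.

```python
import numpy as np

def atc_diffusion(data, W, mu=0.05, lam=0.01, iters=2000, seed=0):
    """Adapt-then-combine (ATC) diffusion sketch for an aggregate of
    regularized quadratic risks  sum_k E[(h^T w - y)^2] + lam * ||w||^2.
    Each agent adapts with a stochastic gradient from its own data
    stream, then combines neighbors' iterates through W; all of this
    is illustrative."""
    rng = np.random.default_rng(seed)
    n, d = len(data), data[0][0].shape[1]
    Wk = np.zeros((n, d))                       # one iterate per agent
    for _ in range(iters):
        psi = np.empty_like(Wk)
        for k, (H, y) in enumerate(data):
            i = rng.integers(len(y))            # sample one local datum
            g = 2 * (H[i] @ Wk[k] - y[i]) * H[i] + 2 * lam * Wk[k]
            psi[k] = Wk[k] - mu * g             # adapt: stochastic gradient step
        Wk = W @ psi                            # combine: mix with neighbors
    return Wk.mean(axis=0)
```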
The design and analysis of machine learning algorithms typically consider the problem of learning on a single task, and the nature of learning in such a scenario is well explored. On the other hand, very often the tasks faced by machine learning systems arrive ...
This work presents an algorithmic scheme for solving the infinite-time constrained linear quadratic regulation problem. We employ an accelerated version of a popular proximal gradient scheme, commonly known as the Forward-Backward Splitting (FBS), and prov ...
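For reference, a FISTA-style accelerated forward-backward iteration on a box-constrained quadratic program (the shape a condensed constrained-LQR problem typically takes) looks as follows; the problem data and box constraints are illustrative, and this is a generic sketch rather than the scheme analyzed in the paper.

```python
import numpy as np

def accelerated_fbs(Q, q, lo, hi, iters=200):
    """FISTA-style accelerated forward-backward splitting for the
    box-constrained QP  min_u 0.5 * u^T Q u + q^T u,  lo <= u <= hi.
    Problem data are assumed given; Q must be positive semidefinite."""
    L = np.linalg.norm(Q, 2)                   # Lipschitz constant of the gradient
    u = z = np.zeros_like(q)
    t = 1.0
    for _ in range(iters):
        u_next = np.clip(z - (Q @ z + q) / L, lo, hi)    # forward step + projection
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = u_next + ((t - 1.0) / t_next) * (u_next - u) # Nesterov momentum
        u, t = u_next, t_next
    return u
```

The projection onto the box plays the role of the backward (proximal) step, so each iteration stays as cheap as plain projected gradient while the momentum term improves the convergence rate.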
Institute of Electrical and Electronics Engineers, 2017
Metal cations often play an important role in shaping the three-dimensional structure of peptides. As an example, the model system AcPheAla5LysH+ is investigated in order to fully understand the forces that stabilize its helical structure. In particular, t ...
In modern data-analysis applications, the abundance of data makes extracting meaningful information from it challenging in terms of computation, storage, and interpretability. In this setting, exploiting sparsity in data has been essential to the developm ...
In this work we consider the learning setting where, in addition to the training set, the learner receives a collection of auxiliary hypotheses originating from other tasks. We focus on a broad class of ERM-based linear algorithms that can be instantiated ...
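One common instantiation of ERM with auxiliary hypotheses is ridge regression biased toward a combination of source hypotheses (hypothesis transfer learning). The sketch below assumes that form; the uniform combination weights and the closed-form solve are illustrative choices, not the paper's algorithm.

```python
import numpy as np

def biased_ridge(X, y, sources, lam=1.0, beta=None):
    """Ridge regression biased toward a combination of auxiliary
    hypotheses (hypothesis transfer learning):
    min_w ||X w - y||^2 + lam * ||w - sum_s beta_s w_s||^2.
    Uniform weights beta are an illustrative default."""
    S = np.stack(sources)                      # each row: one source hypothesis
    beta = np.full(len(S), 1.0 / len(S)) if beta is None else np.asarray(beta)
    w0 = beta @ S                              # bias point built from the sources
    d = X.shape[1]
    # Substituting v = w - w0 reduces this to standard ridge regression.
    v = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ (y - X @ w0))
    return w0 + v
```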
In a recent article series, the authors have promoted convex optimization algorithms for radio-interferometric imaging in the framework of compressed sensing, which leverages sparsity regularization priors for the associated inverse problem and defines a m ...
The need for optimal control of processes under a restricted amount of resources renders first-order optimization methods a viable option. Although computationally cheap, these methods typically suffer from slow convergence rates. In this work we discuss t ...
A new decomposition optimization algorithm, called path-following gradient-based decomposition, is proposed to solve separable convex optimization problems. Unlike path-following Newton methods considered in the literature, this algorithm does not require ...
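As a rough illustration of the decomposition idea, the sketch below alternates closed-form local solves of a smoothed Lagrangian with a dual gradient step while shrinking the smoothing parameter along a path; the quadratic local costs, the step-size rule, and the geometric path are assumptions made for illustration, not the paper's path-following scheme.

```python
import numpy as np

def path_following_decomposition(Q, c, A, b, iters=300, t0=1.0, rho=0.97):
    """Schematic smoothing-based dual decomposition for the separable QP
    min sum_i 0.5 * x_i^T Q_i x_i + c_i^T x_i   s.t.  sum_i A_i x_i = b.
    Each agent minimizes its smoothed Lagrangian in closed form, a dual
    gradient step follows, and the smoothing parameter t is driven to
    zero along a geometric path (all parameter choices illustrative)."""
    lam, t = np.zeros(b.shape[0]), t0
    Ls = sum(np.linalg.norm(Ai, 2) ** 2 for Ai in A)   # bounds dual curvature
    for _ in range(iters):
        # Local solves: argmin_x 0.5 x^T Q_i x + (c_i + A_i^T lam)^T x + (t/2)||x||^2
        xs = [np.linalg.solve(Qi + t * np.eye(Qi.shape[0]), -(ci + Ai.T @ lam))
              for Qi, ci, Ai in zip(Q, c, A)]
        resid = sum(Ai @ xi for Ai, xi in zip(A, xs)) - b
        lam = lam + (t / Ls) * resid    # dual ascent; step sized by smoothing level
        t *= rho                        # follow the path: reduce smoothing
    return xs, lam
```

The smoothing term makes each local subproblem strongly convex, so only gradient information is needed on the dual, which matches the abstract's contrast with path-following Newton methods.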