Publication

Convex optimization in sums of Banach spaces

Michaël Unser, Shayan Aziznejad
2021
Journal paper
Abstract

We characterize the solution of a broad class of convex optimization problems that address the reconstruction of a function from a finite number of linear measurements. The underlying hypothesis is that the solution is decomposable as a finite sum of components, where each component belongs to its own prescribed Banach space; moreover, the problem is regularized by penalizing some composite norm of the solution. We establish general conditions for existence and derive the generic parametric representation of the solution components. These representations fall into three categories depending on the underlying regularization norm: (i) a linear expansion in terms of predefined “kernels” when the component space is a reproducing kernel Hilbert space (RKHS), (ii) a non-linear (duality) mapping of a linear combination of measurement functionals when the component Banach space is strictly convex, and (iii) an adaptive expansion in terms of a small number of atoms within a larger dictionary when the component Banach space is not strictly convex. Our approach generalizes and unifies a number of multi-kernel (RKHS) and sparse-dictionary learning techniques for compressed sensing available in the literature. It also yields the natural extension of the classical spline-fitting techniques in (semi-)RKHS to the abstract Banach-space setting.
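Case (i) of the abstract is the classical representer-theorem setting. The following sketch (not from the paper; kernel choice, data, and regularization weight are illustrative assumptions) shows how, for quadratic regularization in a Gaussian RKHS, the minimizer is exactly a linear expansion in kernels centered at the measurement points:

```python
import numpy as np

# Hedged illustration of case (i): for the RKHS-regularized problem
#   min_f  sum_m (y_m - f(x_m))^2 + lam * ||f||_H^2,
# the representer theorem gives f(x) = sum_m c_m k(x, x_m),
# with coefficients c solving the linear system (K + lam*I) c = y.

def gaussian_kernel(x, y, sigma=0.1):
    # Gram matrix of the Gaussian kernel k(x, y) = exp(-(x - y)^2 / (2 sigma^2)).
    return np.exp(-np.subtract.outer(x, y) ** 2 / (2.0 * sigma**2))

def fit_kernel_ridge(x_train, y_train, lam=1e-2, sigma=0.1):
    K = gaussian_kernel(x_train, x_train, sigma)
    return np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)

def predict(x_new, x_train, c, sigma=0.1):
    # Evaluate the kernel expansion f(x) = sum_m c_m k(x, x_m).
    return gaussian_kernel(x_new, x_train, sigma) @ c

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2.0 * np.pi * x) + 0.05 * rng.standard_normal(20)
c = fit_kernel_ridge(x, y)
y_hat = predict(x, x, c)
```

The two non-Hilbertian cases (ii) and (iii) have no such closed-form linear system; they are characterized in the paper via duality mappings and extreme-point (atomic) arguments, respectively.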

Related concepts (41)
Hilbert space
In mathematics, Hilbert spaces (named after David Hilbert) allow the methods of linear algebra and calculus to be generalized from (finite-dimensional) Euclidean vector spaces to spaces that may be infinite-dimensional. Hilbert spaces arise naturally and frequently in mathematics and physics, typically as function spaces. Formally, a Hilbert space is a vector space equipped with an inner product that induces a distance function for which the space is a complete metric space.
Convex optimization
Convex optimization is a subfield of mathematical optimization that studies the problem of minimizing convex functions over convex sets (or, equivalently, maximizing concave functions over convex sets). Many classes of convex optimization problems admit polynomial-time algorithms, whereas mathematical optimization is in general NP-hard.
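A minimal, self-contained illustration of the point above (an assumption-labeled sketch, not tied to the paper): for a convex objective such as least squares, plain gradient descent with a safe step size provably reaches the global minimizer, which we can check against a direct solver:

```python
import numpy as np

# Convex objective f(x) = ||A x - b||^2 on random data (illustrative only).
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 5))
b = rng.standard_normal(30)

# grad f(x) = 2 A^T (A x - b) is Lipschitz with constant L = 2 ||A||_2^2,
# so a fixed step of 1/L guarantees convergence to the global minimum.
L = 2.0 * np.linalg.norm(A, 2) ** 2
x = np.zeros(5)
for _ in range(5000):
    x -= (1.0 / L) * 2.0 * A.T @ (A @ x - b)
```

Because the problem is convex, the iterate matches the closed-form least-squares solution; for non-convex problems no such global guarantee exists, which is the source of the NP-hardness contrast mentioned above.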
Banach algebra
In mathematics, especially functional analysis, a Banach algebra, named after Stefan Banach, is an associative algebra over the real or complex numbers (or over a non-Archimedean complete normed field) that is at the same time a Banach space, that is, a normed space that is complete in the metric induced by the norm. The norm is required to satisfy ‖xy‖ ≤ ‖x‖ ‖y‖ for all x and y, which ensures that the multiplication operation is continuous. A Banach algebra is called unital if it has an identity element for the multiplication whose norm is 1, and commutative if its multiplication is commutative.
Related publications (104)

WILD SOLUTIONS TO SCALAR EULER-LAGRANGE EQUATIONS

Carl Johan Peter Johansson

We study very weak solutions to scalar Euler-Lagrange equations associated with quadratic convex functionals. We investigate whether W^{1,1} solutions are necessarily W^{1,2}, which would make the Nash and Schauder theories applicable. We answer this question positively for a suitable clas ...
American Mathematical Society, 2024

Augmented Lagrangian Methods for Provable and Scalable Machine Learning

Mehmet Fatih Sahin

Non-convex constrained optimization problems have become a powerful framework for modeling a wide range of machine learning problems, with applications in k-means clustering, large-scale semidefinite programs (SDPs), and various other tasks. As the perfor ...
EPFL, 2023

Stability of Image-Reconstruction Algorithms

Michaël Unser, Sebastian Jonas Neumayer, Pol del Aguila Pla

Robustness and stability of image-reconstruction algorithms have recently come under scrutiny. Their importance to medical imaging cannot be overstated. We review the known results for the topical variational regularization strategies (ℓ2 and ℓ1 regulariz ...
2023
Related MOOCs (31)
Introduction to optimization on smooth manifolds: first order methods
Learn to optimize on smooth, nonlinear spaces: Join us to build your foundations (starting at "what is a manifold?") and confidently implement your first algorithm (Riemannian gradient descent).
Algebra (part 1)
A French-language MOOC on linear algebra that is accessible to all, taught rigorously, and requires no prerequisites.