Algorithms and methods for sparse approximation in structured dictionaries
We introduce the Multiplicative Update Selector and Estimator (MUSE) algorithm for sparse approximation in under-determined linear regression problems. Given f = Φα* + μ, the MUSE provably and efficiently finds a k-sparse vector α̂ such that ∥Φα̂ − f∥∞ ≤ ∥ ...
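The abstract is cut off before the update rule itself, so the following is only an illustrative Python sketch of a multiplicative-weights-style selector, not the published MUSE recursion: the function name, the exponentiated update with step eta, and the least-squares refit are all assumptions.

```python
import numpy as np

def muse_like_selector(Phi, f, k, n_iter=100, eta=0.5):
    """Illustrative multiplicative-weights selector (NOT the exact MUSE
    recursion).  Maintains positive weights over dictionary columns,
    boosts columns correlated with the current residual, and keeps the
    k heaviest columns as the candidate support."""
    n, p = Phi.shape
    w = np.ones(p) / p                                  # uniform initial weights
    for _ in range(n_iter):
        support = np.argsort(w)[-k:]                    # k heaviest columns
        coef = np.linalg.lstsq(Phi[:, support], f, rcond=None)[0]
        residual = f - Phi[:, support] @ coef
        gains = np.abs(Phi.T @ residual)                # correlation with residual
        w *= np.exp(eta * gains / (gains.max() + 1e-12))  # multiplicative update
        w /= w.sum()                                    # renormalize to a distribution
    alpha = np.zeros(p)
    alpha[support] = coef
    return alpha
```

Multiplicative updates of this kind keep the weights positive and concentrate mass on consistently correlated atoms, which is the property a selector of this type relies on.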
In this work package (WP), we investigate the possibility of discovering structure within dictionary learning. This could range from exploring groups of atoms that appear in clusters (a form of molecule learning) to learning graphical dependencies across ...
Recovery of sparse signals from linear, dimensionality-reducing measurements broadly falls under two well-known formulations, the synthesis and the analysis formulations à la Elad et al. Recently, Chandrasekaran et al. introduced a new algorithmic sparse recover ...
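For concreteness, the synthesis formulation referred to here is the basis-pursuit program min ∥α∥₁ subject to Φα = f, while the analysis formulation instead penalizes ∥Ωx∥₁ for an analysis operator Ω applied to the signal itself. A minimal synthesis-side sketch, assuming noiseless measurements and using SciPy's linear-programming solver (the function name is ours):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(Phi, f):
    """Synthesis formulation: min ||alpha||_1  s.t.  Phi @ alpha = f,
    solved as a linear program by splitting alpha = u - v, u, v >= 0."""
    n, p = Phi.shape
    c = np.ones(2 * p)                        # objective: sum(u) + sum(v) = ||alpha||_1
    A_eq = np.hstack([Phi, -Phi])             # encodes Phi @ (u - v) = f
    res = linprog(c, A_eq=A_eq, b_eq=f, bounds=[(0, None)] * (2 * p))
    u, v = res.x[:p], res.x[p:]
    return u - v
```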
Sparse methods are widely used in image and audio processing for denoising and classification, but there have been few previous applications to neural signals for brain-computer interfaces (BCIs). We used the dictionary-learning algorithm K-SVD, coupled w ...
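The K-SVD dictionary update itself is standard; as a reference point, here is a hedged sketch of the update for a single atom (variable names are assumptions, and the sparse-coding stage that alternates with it, e.g. OMP, is omitted):

```python
import numpy as np

def ksvd_atom_update(D, X, Y, j):
    """One K-SVD dictionary-update step for atom j.

    D : (n, K) dictionary, X : (K, m) sparse codes, Y : (n, m) signals.
    The atom and its coefficients are refit by a rank-1 SVD of the
    residual restricted to the signals that actually use atom j."""
    users = np.nonzero(X[j, :])[0]            # signals whose code uses atom j
    if users.size == 0:
        return D, X                            # unused atom: nothing to refit
    # Residual with atom j's own contribution added back in.
    E = Y[:, users] - D @ X[:, users] + np.outer(D[:, j], X[j, users])
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    D[:, j] = U[:, 0]                          # best rank-1 atom
    X[j, users] = s[0] * Vt[0, :]              # matching coefficients
    return D, X
```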
Assume a multichannel data matrix which, due to its column-wise dependencies, has a low-rank and joint-sparse representation. Such a matrix has few degrees of freedom. Enormous developments over the last decade in the areas of compressed sensing and low-ran ...
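Both structures admit cheap projections, which is what recovery schemes for such matrices typically alternate between. A sketch of the two projections under the stated model (function names are ours; the full recovery loop the work develops is omitted):

```python
import numpy as np

def project_low_rank(X, r):
    """Nearest rank-r matrix via truncated SVD (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def project_row_sparse(X, s):
    """Joint sparsity across channels: keep the s rows with largest
    l2 norm and zero out the rest."""
    norms = np.linalg.norm(X, axis=1)
    keep = np.argsort(norms)[-s:]
    out = np.zeros_like(X)
    out[keep, :] = X[keep, :]
    return out
```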
The aim of this work package (WP) is to explore approaches to learn structured sparse models, that is sparse models where the sparsity assumption seems not to be sufficient, or when there is hope to exploit some additional knowledge together with the spars ...
We provide two compressive sensing (CS) recovery algorithms based on iterative hard-thresholding. The algorithms, collectively dubbed algebraic pursuits (ALPS), exploit the restricted isometry properties of the CS measurement matrix within the algebra o ...
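The core primitive behind hard-thresholding pursuits of this kind is easy to state; the sketch below shows plain iterative hard thresholding, without the refinements that distinguish ALPS (the step size and iteration count are placeholder assumptions):

```python
import numpy as np

def iht(Phi, y, k, n_iter=200, step=1.0):
    """Plain iterative hard thresholding: a gradient step on
    ||y - Phi x||^2 followed by projection onto k-sparse vectors.
    Convergence typically needs step < 2 / ||Phi||_2^2."""
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        x = x + step * Phi.T @ (y - Phi @ x)   # gradient step
        small = np.argsort(np.abs(x))[:-k]     # indices outside the top-k
        x[small] = 0.0                         # hard threshold to k-sparse
    return x
```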
We develop an efficient learning framework to construct signal dictionaries for sparse representation by selecting the dictionary columns from multiple candidate bases. By sparse, we mean that only a few dictionary elements, compared to the ambient signal ...
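As an illustration of selecting dictionary columns from a pool of candidate bases, a greedy sketch follows; the selection criterion (aggregate correlation with the current residuals) and all names are assumptions, not the learning framework the abstract describes:

```python
import numpy as np

def select_columns(C, Y, k):
    """Greedily pick k columns from candidate pool C (columns drawn
    from several bases) that best explain the training signals Y,
    refitting all coefficients after each pick."""
    chosen = []
    R = Y.copy()                                    # current residuals
    for _ in range(k):
        scores = np.linalg.norm(C.T @ R, axis=1)    # aggregate correlation
        scores[chosen] = -np.inf                    # never re-pick a column
        chosen.append(int(np.argmax(scores)))
        D = C[:, chosen]
        coef = np.linalg.lstsq(D, Y, rcond=None)[0]
        R = Y - D @ coef
    return chosen
```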
This paper presents a new method for learning overcomplete dictionaries adapted to efficient joint representation of stereo images. We first formulate a sparse stereo image model where the multi-view correlation is described by local geometric transforms o ...
In this work we present a new greedy algorithm for sparse approximation called LocOMP. LocOMP is meant to be run on local dictionaries made of atoms with much shorter supports than the signal length. This notably encompasses shift-invariant dictionaries an ...
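For reference, plain OMP is sketched below; per the abstract, LocOMP keeps the same greedy selection but restricts the expensive projection step to atoms whose short supports overlap, which this baseline deliberately does not do:

```python
import numpy as np

def omp(Phi, y, k):
    """Standard orthogonal matching pursuit: greedily select the atom
    most correlated with the residual, then re-project y onto the span
    of all selected atoms."""
    support, x = [], np.zeros(Phi.shape[1])
    r = y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ r))))   # best new atom
        sol = np.linalg.lstsq(Phi[:, support], y, rcond=None)[0]
        r = y - Phi[:, support] @ sol                        # full re-projection
    x[support] = sol
    return x
```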