We introduce the Multiplicative Update Selector and Estimator (MUSE) algorithm for sparse approximation in under-determined linear regression problems. Given f = Φα* + μ, the MUSE provably and efficiently finds a k-sparse vector α̂ such that ∥Φα̂ − f∥∞ ≤ ∥μ∥∞ + O(1/√k), for any k-sparse vector α*, any measurement matrix Φ, and any noise vector μ. We cast the sparse approximation problem as a zero-sum game over a properly chosen new space; this reformulation provides salient computational advantages in recovery. When the measurement matrix Φ provides a stable embedding of sparse vectors (the so-called restricted isometry property in compressive sensing), the MUSE also features guarantees on ∥α* − α̂∥₂. Simulation results demonstrate the scalability and performance of the MUSE in solving sparse approximation problems based on the Dantzig Selector.
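For intuition, the sketch below shows a generic multiplicative-weights loop for the zero-sum sparse-approximation game outlined in the abstract: an adversary maintains weights over the 2m signed rows of Φ, the signal player answers with a one-sparse best response on the ℓ₁ ball, and the average of k such plays is k-sparse. This is an illustrative assumption of how such a scheme can be organized, not the paper's exact MUSE update rule; the function name, step-size choice, and normalization (∥α*∥₁ ≤ 1) are assumptions introduced here.

```python
import numpy as np

def mw_sparse_approx(Phi, f, k, eta=None):
    """Illustrative multiplicative-weights loop for min_alpha max_j |(Phi alpha - f)_j|
    over the l1 ball; a sketch only, not the paper's exact MUSE algorithm."""
    m, n = Phi.shape
    A = np.vstack([Phi, -Phi])          # 2m signed constraint rows
    b = np.concatenate([f, -f])
    if eta is None:
        eta = np.sqrt(np.log(2 * m) / k)  # typical multiplicative-weights rate (assumption)
    w = np.ones(2 * m) / (2 * m)          # adversary's weights over constraints
    alpha = np.zeros(n)
    for _ in range(k):
        c = A.T @ w                       # weighted column correlations
        i = int(np.argmax(np.abs(c)))
        step = np.zeros(n)
        step[i] = -np.sign(c[i])          # one-sparse best response on the l1 ball
        residual = A @ step - b           # signed constraint residuals
        w = w * np.exp(eta * (residual - residual.max()))  # upweight most-violated rows
        w /= w.sum()
        alpha += step
    return alpha / k                      # average of k one-sparse plays: at most k-sparse

# Small usage example on a random Gaussian measurement matrix (illustrative data).
rng = np.random.default_rng(0)
m, n, s = 64, 256, 8
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
alpha_true = np.zeros(n)
alpha_true[rng.choice(n, s, replace=False)] = 1.0 / s
f = Phi @ alpha_true + 0.01 * rng.standard_normal(m)
alpha_hat = mw_sparse_approx(Phi, f, k=32)
print(np.max(np.abs(Phi @ alpha_hat - f)))  # l-infinity residual
```

With a step size on the order of √(log m / k), standard multiplicative-weights arguments give an ℓ∞ residual excess that decays like O(1/√k), which is the flavor of guarantee stated in the abstract; the constants and the precise update in the paper may differ.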