
MATHICSE Technical Report : A fast gradient method for nonnegative sparse regression with self dictionary

Robert Gerhard Jérôme Luce
2016
Report or working paper
Abstract

Nonnegative matrix factorization (NMF) can be computed efficiently under the separability assumption, which asserts that all the columns of the input data matrix belong to the convex cone generated by only a few of its columns. The provably most robust methods to identify these basis columns are based on nonnegative sparse regression and self dictionary, and require the solution of large-scale convex optimization problems. In this paper we study a particular nonnegative sparse regression model with self dictionary. As opposed to previously proposed models, it is a smooth optimization problem where sparsity is enforced through appropriate linear constraints. We show that the Euclidean projection on the set defined by these constraints can be computed efficiently, and propose a fast gradient method to solve our model. We show the effectiveness of the approach compared to state-of-the-art methods on several synthetic data sets and real-world hyperspectral images.
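To make the approach concrete, here is a minimal, hypothetical sketch of a projected fast gradient (Nesterov-type) scheme for a self-dictionary model of the form min_W ||X - XW||_F^2 with W >= 0. Note that the simple nonnegativity projection below is a stand-in for the paper's sparsity-enforcing linear constraints, whose efficient projection is one of the paper's contributions; function and variable names are illustrative.

```python
import numpy as np

def fast_gradient_self_dict(X, n_iter=300):
    """Illustrative projected fast gradient method for the simplified
    self-dictionary model  min_W 0.5 * ||X - X W||_F^2  s.t.  W >= 0.

    The projection onto W >= 0 is a simplified placeholder for the
    paper's linear sparsity constraints."""
    m, n = X.shape
    XtX = X.T @ X
    # Lipschitz constant of the gradient: spectral norm of X^T X
    L = np.linalg.norm(XtX, 2)
    W = np.zeros((n, n))
    Y = W.copy()          # extrapolated point
    t = 1.0               # momentum parameter
    for _ in range(n_iter):
        grad = XtX @ Y - XtX                    # grad of 0.5||X - XW||_F^2
        W_new = np.maximum(Y - grad / L, 0.0)   # gradient step + projection
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2.0
        Y = W_new + ((t - 1.0) / t_new) * (W_new - W)  # Nesterov extrapolation
        W, t = W_new, t_new
    return W
```

In the full model, the basis columns of X would then be identified from the largest diagonal entries of W; here, without the sparsity constraints, W simply fits X within the feasible set.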
