Publication: A Non-Euclidean Gradient Descent Framework for Non-Convex Matrix Factorization

Volkan Cevher, Anastasios Kyrillidis, Ya-Ping Hsieh, Rabeeh Karimi Mahabadi, Yu-Chun Kao

2018

Journal paper

Abstract

We study convex optimization problems that feature low-rank matrix solutions. In such scenarios, non-convex methods offer significant advantages over convex methods due to their lower space complexity as well as faster convergence speed. Moreover, many of these methods feature rigorous approximation guarantees. Non-convex algorithms are simple to analyze and implement, as they perform Euclidean gradient descent on matrix factors. In contrast, this paper derives a non-Euclidean optimization framework in the non-convex setting that takes nonlinear gradient steps on the factors. We prove convergence rates to the global minimum under appropriate assumptions. We provide numerical evidence with Fourier Ptychography and FastText applications, using real data, that shows our approach can significantly enhance solution quality.
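As a rough illustration of the baseline the abstract contrasts against, the sketch below runs plain Euclidean gradient descent on a matrix factor U for a synthetic low-rank recovery problem; the objective, data, and step size are invented for the example, and this is not the paper's non-Euclidean algorithm.

```python
# Minimal sketch (illustrative only, not the paper's method): Euclidean gradient
# descent on a matrix factor U for f(U) = 0.5 * ||U U^T - M||_F^2, where M is a
# synthetic low-rank PSD matrix. The paper replaces the Euclidean step below
# with a nonlinear (non-Euclidean) one.
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 3
U_true = rng.standard_normal((n, r))
M = U_true @ U_true.T                       # ground-truth rank-r PSD matrix

def grad(U):
    # Gradient of 0.5 * ||U U^T - M||_F^2 with respect to the factor U.
    return 2.0 * (U @ U.T - M) @ U

U = 0.1 * rng.standard_normal((n, r))       # small random initialization
step = 1e-3                                 # fixed step size; tuning is problem-dependent
for _ in range(3000):
    U -= step * grad(U)                     # descent on the factor, not on U U^T

print("relative error:", np.linalg.norm(U @ U.T - M) / np.linalg.norm(M))
```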

Related MOOCs (26)

Introduction to optimization on smooth manifolds: first order methods

Learn to optimize on smooth, nonlinear spaces: Join us to build your foundations (starting at "what is a manifold?") and confidently implement your first algorithm (Riemannian gradient descent).
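As a hedged illustration of the algorithm this course builds up to, the sketch below runs Riemannian gradient descent on the unit sphere for a Rayleigh-quotient objective; the matrix and step size are arbitrary choices made for the example.

```python
# Illustrative sketch of Riemannian gradient descent on the unit sphere:
# minimize f(x) = x^T A x subject to ||x|| = 1 (the minimizer is an eigenvector
# of A for its smallest eigenvalue). A and the step size are arbitrary choices.
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((20, 20))
A = (B + B.T) / 2                           # symmetric test matrix

x = rng.standard_normal(20)
x /= np.linalg.norm(x)                      # start on the sphere
step = 0.01
for _ in range(1000):
    egrad = 2 * A @ x                       # Euclidean gradient of x^T A x
    rgrad = egrad - (x @ egrad) * x         # project onto the tangent space at x
    x = x - step * rgrad                    # take the step in the tangent direction
    x /= np.linalg.norm(x)                  # retraction: map back onto the sphere

print("f(x):", x @ A @ x)
print("smallest eigenvalue of A:", np.linalg.eigvalsh(A)[0])
```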

Algebra (part 1)

A French-language MOOC on linear algebra, accessible to everyone, taught rigorously and requiring no prerequisites.

Related concepts (32)

Convex optimization

Convex optimization is a subfield of mathematical optimization that studies the problem of minimizing convex functions over convex sets (or, equivalently, maximizing concave functions over convex sets). Many classes of convex optimization problems admit polynomial-time algorithms, whereas mathematical optimization is in general NP-hard.
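A small concrete instance of this setting, minimizing a convex function over a convex set, is non-negative least squares solved by projected gradient descent; the data below are synthetic.

```python
# Convex optimization example: minimize the convex function f(x) = 0.5*||Ax - b||^2
# over the convex set {x : x >= 0} using projected gradient descent (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)

x = np.zeros(10)
step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant of the gradient
for _ in range(1000):
    g = A.T @ (A @ x - b)                   # gradient of 0.5 * ||Ax - b||^2
    x = np.maximum(x - step * g, 0.0)       # gradient step, then project onto x >= 0

print("objective value:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```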

Online machine learning

In computer science, online machine learning is a method of machine learning in which data becomes available in a sequential order and is used to update the best predictor for future data at each step, as opposed to batch learning techniques, which generate the best predictor by learning on the entire training data set at once. Online learning is a common technique in areas of machine learning where it is computationally infeasible to train over the entire dataset, making out-of-core algorithms necessary.
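The contrast with batch learning can be made concrete with a toy sketch in which a linear model is updated one observation at a time and the full dataset is never stored; the data stream and learning rate are invented for illustration.

```python
# Toy online learning loop: linear regression updated by stochastic gradient
# descent one sample at a time, as opposed to fitting a whole batch at once.
import numpy as np

rng = np.random.default_rng(3)
w_true = np.array([2.0, -1.0, 0.5])         # unknown target weights
w = np.zeros(3)                             # online estimate, refined as data arrives
lr = 0.05

for t in range(5000):                       # samples arrive sequentially
    x = rng.standard_normal(3)
    y = x @ w_true + 0.1 * rng.standard_normal()
    err = x @ w - y
    w -= lr * err * x                       # single-sample gradient step on squared loss

print("learned weights:", np.round(w, 2))
```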

Convex hull algorithms

Algorithms that construct convex hulls of various objects have a broad range of applications in mathematics and computer science. In computational geometry, numerous algorithms are proposed for computing the convex hull of a finite set of points, with various computational complexities. Computing the convex hull means that a non-ambiguous and efficient representation of the required convex shape is constructed. The complexity of the corresponding algorithms is usually estimated in terms of n, the number of input points, and sometimes also in terms of h, the number of points on the convex hull.
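One standard O(n log n) member of this family, Andrew's monotone chain, fits in a few lines and shows concretely what constructing the hull of a planar point set means.

```python
# Andrew's monotone chain: a standard O(n log n) algorithm that returns the
# vertices of the convex hull of a set of 2-D points in counter-clockwise order.
def convex_hull(points):
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # Cross product of vectors OA and OB; positive means a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                           # build the lower hull
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):                 # build the upper hull
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]          # endpoints are shared; drop the duplicates

print(convex_hull([(0, 0), (1, 1), (2, 2), (2, 0), (0, 2), (1, 0)]))
```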

Related publications (48)

Non-convex constrained optimization problems have become a powerful framework for modeling a wide range of machine learning problems, with applications in k-means clustering, large-scale semidefinite programs (SDPs), and various other tasks. As the perfor ...

Within the context of contemporary machine learning problems, the efficiency of the optimization process depends on the properties of the model and the nature of the data available, which poses a significant problem as the complexity of either increases ad infinit ...

We address black-box convex optimization problems, where the objective and constraint functions are not explicitly known but can be sampled within the feasible set. The challenge is thus to generate a sequence of feasible points converging towards an optim ...

2023
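For the black-box setting described in the last snippet above, a common generic device (not necessarily the method of that publication) is a two-point zeroth-order gradient estimate built purely from function samples; the quadratic objective below is synthetic.

```python
# Zeroth-order (derivative-free) sketch for black-box optimization: the gradient
# is estimated from two function evaluations per step. This is a generic
# technique shown for illustration, not the specific method of the publication above.
import numpy as np

rng = np.random.default_rng(4)
Q = np.diag(np.linspace(1.0, 10.0, 5))      # synthetic convex quadratic objective

def f(x):
    return 0.5 * x @ Q @ x                  # only function values are assumed available

x = np.ones(5)
mu, step = 1e-4, 0.01
for _ in range(2000):
    u = rng.standard_normal(5)
    g = (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u   # two-point gradient estimate
    x = x - step * g

print("f(x) after 2000 sample-based steps:", f(x))
```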