This paper proposes a smoothing technique for nonsmooth convex minimization using self-concordant barriers. To illustrate the main ideas, we compare our technique with the proximity smoothing approach of Nesterov (2005) via the classical gradient method on both ...
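To make the comparison concrete, here is a minimal sketch (in Python with NumPy) of the proximity-smoothing baseline only, not the self-concordant-barrier technique proposed in the paper: the ℓ1 term of a toy least-squares problem is replaced by a Huber-type smooth surrogate with parameter mu, and the classical gradient method is applied. The problem data, smoothing parameter, and step-size rule are illustrative assumptions.

import numpy as np

# Sketch of Nesterov-style proximity smoothing (illustrative, not the paper's method):
# each |x_i| in lam*||x||_1 is replaced by the Huber surrogate
#   h_mu(t) = t^2/(2*mu) if |t| <= mu, and |t| - mu/2 otherwise,
# whose gradient is Lipschitz with constant 1/mu, so the classical gradient
# method applies to the smoothed objective 0.5*||Ax - b||^2 + lam*sum_i h_mu(x_i).

def huber_grad(x, mu):
    # Elementwise gradient of the mu-smoothed absolute value.
    return np.clip(x / mu, -1.0, 1.0)

def smoothed_gradient_method(A, b, lam, mu=1e-2, steps=500):
    L = np.linalg.norm(A, 2) ** 2 + lam / mu   # Lipschitz constant of the smoothed gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b) + lam * huber_grad(x, mu)
        x -= grad / L
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
print(smoothed_gradient_method(A, b, lam=0.1))

Smaller mu gives a tighter approximation of the ℓ1 term but a larger Lipschitz constant, and hence smaller steps; that trade-off is the usual motivation for smoothing-parameter schedules.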
Over the past few decades, we have been experiencing a data explosion; massive amounts of data are increasingly collected, and multimedia databases such as YouTube and Flickr are rapidly expanding. At the same time, rapid technological advancements in mobil ...
The convex ℓ1-regularized log det divergence criterion has been shown to produce theoretically consistent graph learning. However, this objective function is challenging since the ℓ1-regularization is nonsmooth, the log det objective is not globally Lipschitz ...
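For reference, the criterion referred to here is usually stated as the ℓ1-regularized log-determinant (graphical lasso) program; the formulation below is the standard textbook form and is given as an assumption about the exact objective behind this snippet:

\min_{\Theta \succ 0} \; -\log\det\Theta + \operatorname{tr}(S\Theta) + \rho\,\|\Theta\|_1,

where S is the empirical covariance matrix, Θ the precision (inverse covariance) matrix to be estimated, and ρ > 0 the regularization weight. The ℓ1 term is nonsmooth at zero entries of Θ, and the gradient of -log det Θ, namely -Θ^{-1}, is not globally Lipschitz over the positive definite cone, which is what makes the problem difficult for standard first-order methods.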
Solving a convex optimization problem within an a priori certified period of time is challenging. This paper discusses the certification of Nesterov’s fast gradient method for problems with a strictly quadratic objective and a feasible set given ...
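As a rough illustration of what such a certification involves (a textbook-style sketch, not the certification procedure developed in the paper), one can bound a priori the number of fast-gradient iterations from the extreme eigenvalues of the quadratic's Hessian and then run the projected fast gradient method for exactly that many steps. The box feasible set, problem data, and the particular iteration bound below are illustrative assumptions.

import numpy as np

# Illustrative only: projected fast gradient method for
#   min 0.5*x'Hx + g'x  subject to  lb <= x <= ub,
# with the a priori bound k >= sqrt(L/mu) * log(L*R0^2/eps), the standard
# estimate for the fast gradient method on strongly convex problems.

def certified_iterations(H, eps, R0):
    eigs = np.linalg.eigvalsh(H)
    mu, L = eigs[0], eigs[-1]
    return int(np.ceil(np.sqrt(L / mu) * np.log(L * R0**2 / eps)))

def fast_gradient_box(H, g, lb, ub, k_max):
    eigs = np.linalg.eigvalsh(H)
    mu, L = eigs[0], eigs[-1]
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    x = y = np.clip(np.zeros_like(g), lb, ub)
    for _ in range(k_max):
        x_new = np.clip(y - (H @ y + g) / L, lb, ub)   # projected gradient step
        y = x_new + beta * (x_new - x)                  # constant-momentum extrapolation
        x = x_new
    return x

H = np.array([[4.0, 1.0], [1.0, 2.0]])
g = np.array([-1.0, -1.0])
k = certified_iterations(H, eps=1e-6, R0=1.0)
print("certified iteration count:", k)
print("solution:", fast_gradient_box(H, g, np.array([0.0, 0.0]), np.array([1.0, 1.0]), k))

The point of certification is that k is computed offline from the problem data alone, so the online solver can be guaranteed to finish within a fixed time budget.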
We propose an algorithmic framework for convex minimization of a composite function with two terms: a self-concordant function and a possibly nonsmooth regularization term. Our method is a new proximal Newton algorithm that features a local quadratic ...
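The following is a minimal sketch of a generic proximal Newton step for a composite objective f(x) + λ‖x‖₁; it does not use the self-concordance-based analysis or step-size rule of the paper, and the smooth test function, inner solver, and iteration counts are illustrative assumptions.

import numpy as np

# Generic proximal Newton sketch for F(x) = f(x) + lam*||x||_1:
# at the current x, solve (inexactly) the scaled proximal subproblem
#   min_d  grad_f(x)'d + 0.5*d'H d + lam*||x + d||_1
# by a few proximal gradient iterations, then take the step x <- x + d.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_step(x, grad, H, lam, inner_iters=50):
    Lh = np.linalg.eigvalsh(H)[-1]          # inner step size from the largest eigenvalue
    z = x.copy()
    for _ in range(inner_iters):
        inner_grad = grad + H @ (z - x)     # gradient of the quadratic model at z
        z = soft_threshold(z - inner_grad / Lh, lam / Lh)
    return z - x                             # proximal Newton direction

# Toy instance: f(x) = 0.5*||Ax - b||^2, so grad = A'(Ax - b) and H = A'A.
rng = np.random.default_rng(1)
A, b, lam = rng.standard_normal((15, 8)), rng.standard_normal(15), 0.1
x = np.zeros(8)
for _ in range(10):
    grad, H = A.T @ (A @ x - b), A.T @ A
    x = x + prox_newton_step(x, grad, H, lam)   # full step; damping/line search omitted
print(x)

In a full method, the step length would typically be damped (for example, based on a proximal Newton decrement) to guarantee global convergence before the fast local phase kicks in.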