The introduction of optimisation problems in which the objective function is a black box, or in which obtaining the gradient is infeasible, has recently raised interest in zeroth-order optimisation methods. As an example, finding adversarial examples for Deep Learning mode ...
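This abstract is truncated, but as a general illustration of what a zeroth-order method looks like (not the specific algorithm studied in the paper), here is a minimal sketch of a two-point random-direction finite-difference gradient estimator. All names, step sizes, and the toy objective are illustrative assumptions.

```python
import numpy as np

def zo_gradient_estimate(f, x, mu=1e-4, n_samples=20, rng=None):
    """Two-point zeroth-order gradient estimate of f at x.

    Uses only function evaluations (no analytic gradient), averaging
    random-direction finite differences. In expectation this gives a
    scaled version of the gradient, so it can replace it in descent methods.
    """
    rng = np.random.default_rng() if rng is None else rng
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.shape)
        u /= np.linalg.norm(u)                       # random unit direction
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / n_samples

# Toy example: minimise a black-box quadratic with plain gradient descent
f = lambda x: np.sum((x - 1.0) ** 2)
x = np.zeros(5)
for _ in range(200):
    x -= 0.1 * zo_gradient_estimate(f, x)            # x drifts toward the minimiser (all ones)
```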
Driven by the need to solve increasingly complex optimization problems in signal processing and machine learning, recent years have seen rising interest in the behavior of gradient-descent-based algorithms in non-convex environments. Most of the works on d ...
We propose a graph signal processing framework to overcome the computational burden of Tensor Robust PCA (TRPCA). Our framework also serves as a convex alternative to graph regularized tensor factorization methods. Our method is based on projecting a tenso ...
We obtain quantitative bounds on the mixing properties of the Hamiltonian Monte Carlo (HMC) algorithm with target distribution in d-dimensional Euclidean space, showing that HMC mixes quickly whenever the target log-distribution is strongly concave and has ...
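For context on the algorithm whose mixing is analysed above, the following is a minimal sketch of a generic Hamiltonian Monte Carlo transition (leapfrog integration plus a Metropolis accept/reject step). It is not the specific construction or parameter regime from the paper; step size, trajectory length, and the toy target are illustrative assumptions.

```python
import numpy as np

def hmc_step(log_p, grad_log_p, x, step_size=0.1, n_leapfrog=20, rng=None):
    """One Hamiltonian Monte Carlo transition targeting the density exp(log_p)."""
    rng = np.random.default_rng() if rng is None else rng
    p0 = rng.standard_normal(x.shape)                 # resample momentum
    x_new, p = x.copy(), p0.copy()

    # Leapfrog integration of the Hamiltonian dynamics
    p += 0.5 * step_size * grad_log_p(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p
        p += step_size * grad_log_p(x_new)
    x_new += step_size * p
    p += 0.5 * step_size * grad_log_p(x_new)

    # Metropolis accept/reject on the Hamiltonian (potential + kinetic energy)
    h_old = -log_p(x) + 0.5 * np.dot(p0, p0)
    h_new = -log_p(x_new) + 0.5 * np.dot(p, p)
    if rng.random() < np.exp(min(0.0, h_old - h_new)):
        return x_new
    return x

# Toy example: strongly log-concave target (standard Gaussian in d dimensions)
d = 10
x = np.zeros(d)
for _ in range(1000):
    x = hmc_step(lambda z: -0.5 * z @ z, lambda z: -z, x)
```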
This work presents a novel methodology for speeding up the assembly of stiffness matrices for laminate composite 3D structures in the context of isogeometric and finite element discretizations. By splitting the involved terms into their in-plane and out-of ...
The ensemble Kalman filter is a computationally efficient technique to solve state and/or parameter estimation problems in the framework of statistical inversion when relying on a Bayesian paradigm. Unfortunately, its cost may become moderately large for sy ...
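As background on the ensemble Kalman filter itself (not the acceleration proposed in the paper), here is a minimal sketch of the standard perturbed-observation analysis step. The linear observation operator H, noise covariance R, and ensemble sizes are illustrative assumptions.

```python
import numpy as np

def enkf_analysis(ensemble, y, H, R, rng=None):
    """Perturbed-observation ensemble Kalman filter analysis step.

    ensemble : (n_members, n_state) forecast ensemble
    y        : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    R        : (n_obs, n_obs) observation noise covariance
    """
    rng = np.random.default_rng() if rng is None else rng
    n_members = ensemble.shape[0]

    # Sample covariance of the forecast ensemble
    X = ensemble - ensemble.mean(axis=0)
    P = X.T @ X / (n_members - 1)

    # Kalman gain K = P H^T (H P H^T + R)^{-1}
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)

    # Update each member with its own perturbed copy of the observation
    perturbed = y + rng.multivariate_normal(np.zeros(len(y)), R, size=n_members)
    return ensemble + (perturbed - ensemble @ H.T) @ K.T

# Toy example: 50-member ensemble, 100-dimensional state, 10 observed components
rng = np.random.default_rng(0)
ens = rng.standard_normal((50, 100))
H = np.eye(10, 100)
analysis = enkf_analysis(ens, np.ones(10), H, 0.1 * np.eye(10), rng)
```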
Part I of this paper developed the exact diffusion algorithm to remove the bias that is characteristic of distributed solutions for deterministic optimization problems. The algorithm was shown to be applicable to the larger set of locally balanced left-sto ...
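The excerpt does not spell out the recursion, so the following is only a sketch of an exact-diffusion-style adapt-correct-combine update as commonly presented in the decentralized-optimization literature; the combination weights, correction term, and step size here are assumptions that should be checked against the paper itself.

```python
import numpy as np

def exact_diffusion_sketch(grads, A, w0, mu=0.05, n_iter=200):
    """Sketch of an adapt-correct-combine recursion for decentralized optimization.

    grads : list of per-agent gradient functions grad_k(w)
    A     : (K, K) doubly stochastic combination matrix (A[l, k] = weight agent k gives agent l)
    w0    : (K, d) initial iterates, one row per agent
    """
    K, _ = w0.shape
    A_bar = 0.5 * (A + np.eye(K))        # averaged combination matrix (assumption)
    w = w0.copy()
    psi_prev = w0.copy()
    for _ in range(n_iter):
        psi = w - mu * np.array([grads[k](w[k]) for k in range(K)])   # adapt (local gradient step)
        phi = psi + w - psi_prev                                      # correct (bias-removal term)
        w = A_bar.T @ phi                                             # combine with neighbors
        psi_prev = psi
    return w

# Toy example: 4 agents, each with a quadratic objective 0.5 * ||w - c_k||^2
c = np.array([[0.0], [1.0], [2.0], [3.0]])
grads = [lambda w, ck=ck: w - ck for ck in c]
A = np.full((4, 4), 0.25)                # fully connected network, uniform weights
w_final = exact_diffusion_sketch(grads, A, np.zeros((4, 1)))
# All agents end up close to the network-wide minimiser mean(c) = 1.5
```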
Despite the vast success neural networks have achieved in different application domains, they have been proven to be vulnerable to adversarial perturbations (small changes in the input), which lead them to produce the wrong output. In this paper, we propos ...
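The method proposed in the paper is not shown here; purely to illustrate the adversarial-perturbation phenomenon the abstract refers to, below is a minimal sketch of the well-known fast gradient sign method applied to a toy logistic-regression classifier. The weights, input, and epsilon are illustrative assumptions.

```python
import numpy as np

def fgsm_perturbation(x, y, w, b, eps=0.1):
    """Fast gradient sign method for a logistic-regression classifier.

    Returns x + eps * sign(grad_x loss): a one-step adversarial perturbation
    that moves the input in the direction that most increases the loss.
    """
    z = w @ x + b
    p = 1.0 / (1.0 + np.exp(-z))         # predicted probability of class 1
    grad_x = (p - y) * w                 # gradient of the cross-entropy loss w.r.t. the input
    return x + eps * np.sign(grad_x)

# Toy example: a correctly classified point becomes misclassified after perturbation
w, b = np.array([2.0, -1.0]), 0.0
x, y = np.array([0.2, 0.1]), 1.0         # logit w @ x + b = 0.3 > 0, so class 1 (correct)
x_adv = fgsm_perturbation(x, y, w, b, eps=0.2)
print(w @ x_adv + b)                     # logit drops below 0, so the prediction flips
```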
This paper considers optimization problems over networks where agents have individual objectives to meet, or individual parameter vectors to estimate, subject to subspace constraints that enforce the objectives across the network to lie in a low-dimensiona ...
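As a simple illustration of what a subspace constraint on the optimization variable means (not the distributed algorithm developed in the paper), here is a sketch of projected gradient descent where every iterate is projected onto the span of an orthonormal basis U; the basis, objective, and step size are illustrative assumptions.

```python
import numpy as np

def projected_gradient(grad, U, w0, mu=0.1, n_iter=500):
    """Gradient descent with iterates projected onto the subspace spanned by U's columns.

    U : (d, r) matrix with orthonormal columns; P = U U^T is the orthogonal projector.
    """
    P = U @ U.T
    w = P @ w0                            # start inside the feasible subspace
    for _ in range(n_iter):
        w = P @ (w - mu * grad(w))        # gradient step, then project back
    return w

# Toy example: minimise ||w - t||^2 subject to w lying in a 2-dimensional subspace of R^5
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((5, 2)))   # orthonormal basis of the subspace
t = rng.standard_normal(5)
w_star = projected_gradient(lambda w: 2 * (w - t), U, np.zeros(5))
# The solution is the orthogonal projection of t onto the subspace: U @ (U.T @ t)
```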
The nonparametric learning of positive-valued functions appears widely in machine learning, especially in the context of estimating intensity functions of point processes. Yet, existing approaches either require computing expensive projections or semidefin ...