In the presence of sparse noise, we propose kernel regression for predicting output vectors that are smooth over a given graph. Sparse noise models training outputs corrupted either by missing samples or by large perturbations. The sparse noise is handled through appropriate use of the ℓ1-norm alongside the ℓ2-norm in a convex cost function. To optimize the cost function, we propose an iteratively reweighted least-squares (IRLS) approach that is amenable to kernel substitution (the kernel trick) owing to the availability of a closed-form solution. Simulations using real-world temperature data show the efficacy of the proposed method, particularly for limited-size training datasets.
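To make the optimization step concrete, the following is a minimal sketch of an IRLS loop for robust kernel regression. It is not the paper's exact formulation: it assumes a scalar output per sample, an RBF kernel, and a simplified cost of the form ||y − Kα||₁ + λ αᵀKα, omitting the graph-smoothness term on vector-valued outputs; the kernel choice, the regularization weight `lam`, and all function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix; an illustrative choice, not mandated by the paper.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def irls_robust_kernel_regression(K, y, lam=1e-2, n_iter=50, eps=1e-6):
    """IRLS sketch for min_a ||y - K a||_1 + lam * a^T K a (simplified cost)."""
    n = K.shape[0]
    # Ordinary kernel ridge solution as initialization.
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    for _ in range(n_iter):
        r = y - K @ alpha                     # current residuals
        w = 1.0 / np.maximum(np.abs(r), eps)  # IRLS weights approximating the l1-norm
        W = np.diag(w)
        # Weighted least-squares update; closed form, so the kernel trick applies directly.
        alpha = np.linalg.solve(K @ W @ K + lam * K + 1e-10 * np.eye(n), K @ W @ y)
    return alpha

# Toy usage: a few training outputs corrupted by large (sparse) perturbations.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
y[[3, 17, 28]] += 5.0                         # sparse outliers
K = rbf_kernel(X, X, gamma=0.5)
alpha = irls_robust_kernel_regression(K, y)
y_hat = K @ alpha
```

The weights 1/max(|r_i|, ε) are the standard IRLS surrogate for the ℓ1-norm: each iteration reduces to a weighted ridge problem with a closed-form solution in α, which is what makes kernel substitution straightforward in this kind of approach.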