Related publications (57)

A Geometric Unification of Distributionally Robust Covariance Estimators: Shrinking the Spectrum by Inflating the Ambiguity Set

Daniel Kuhn, Yves Rychener, Viet Anh Nguyen

The state-of-the-art methods for estimating high-dimensional covariance matrices all shrink the eigenvalues of the sample covariance matrix towards a data-insensitive shrinkage target. The underlying shrinkage transformation is either chosen heuristically ...
2024

Bayes-optimal Learning of Deep Random Networks of Extensive-width

Florent Gérard Krzakala, Lenka Zdeborová, Hugo Chao Cui

We consider the problem of learning a target function corresponding to a deep, extensive-width, non-linear neural network with random Gaussian weights. We consider the asymptotic limit where the number of samples, the input dimension and the network width ...
2023

Local Kernel Regression and Neural Network Approaches to the Conformational Landscapes of Oligopeptides

Michele Ceriotti, Alberto Fabrizio, Benjamin André René Meyer, Edgar Albert Engel, Raimon Fabregat I De Aguilar-Amat, Veronika Juraskova

The application of machine learning to theoretical chemistry has made it possible to combine the accuracy of quantum chemical energetics with the thorough sampling of finite-temperature fluctuations. To reach this goal, a diverse set of methods has been pr ...
2022

Improving sample and feature selection with principal covariates regression

Michele Ceriotti, Edgar Albert Engel, Benjamin Aaron Helfrecht, Sergei Kliavinek

Selecting the most relevant features and samples out of a large set of candidates is a task that occurs very often in the context of automated data analysis, where it improves the computational performance and often the transferability of a model. Here we ...
IOP Publishing Ltd, 2021

Sparsest piecewise-linear regression of one-dimensional data

Michaël Unser, Julien René Pierre Fageot, Thomas Jean Debarre, Quentin Alain Denoyelle

We study the problem of one-dimensional regression of data points with total-variation (TV) regularization (in the sense of measures) on the second derivative, which is known to promote piecewise-linear solutions with few knots. While there are efficient a ...
Elsevier, 2021

Inkjet-printed polymer composites for the detection of volatile organic compounds

Mohammadmahdi Kiaee

Sensors capable of detecting and classifying volatile organic compounds (VOCs) have been gaining more attention with the advent of internet-of-things (IoT) enabled devices and the integration of various sensing elements into hand-held and portable devices. The re ...
EPFL, 2021

Counting People by Estimating People Flows

Pascal Fua, Mathieu Salzmann, Weizhe Liu

Modern methods for counting people in crowded scenes rely on deep networks to estimate people densities in individual images. As such, only very few take advantage of temporal consistency in video sequences, and those that do only impose weak smoothness co ...
2021

Compact atomic descriptors enable accurate predictions via linear models

Kevin Rossi

We probe the accuracy of linear ridge regression employing a three-body local density representation derived from the atomic cluster expansion. We benchmark the accuracy of this framework in the prediction of formation energies and atomic forces in molecul ...
AIP Publishing, 2021

Structure-property maps with Kernel principal covariates regression

Michele Ceriotti, Guillaume André Jean Fraux, Benjamin Aaron Helfrecht

Data analyses based on linear methods constitute the simplest, most robust, and transparent approaches to the automatic processing of large amounts of data for building supervised or unsupervised machine learning models. Principal covariates regression (PC ...
2020

Optimal Convergence for Distributed Learning with Stochastic Gradient Methods and Spectral Algorithms

Volkan Cevher, Junhong Lin

We study generalization properties of distributed algorithms in the setting of nonparametric regression over a reproducing kernel Hilbert space (RKHS). We first investigate distributed stochastic gradient methods (SGM), with mini-batches and multi-passes o ...
2020
