Publication

Local Kernel Regression and Neural Network Approaches to the Conformational Landscapes of Oligopeptides

Related publications (68)

Task-driven neural network models predict neural dynamics of proprioception: Experimental data, activations and predictions of neural network models

Alexander Mathis, Alberto Silvio Chiappa, Alessandro Marin Vargas, Axel Bisi

Here we provide the neural data, activations, and predictions for the best models, together with the result dataframes of our article "Task-driven neural network models predict neural dynamics of proprioception". It contains the behavioral and neural experimental data (cu ...
EPFL Infoscience, 2024

Topics in statistical physics of high-dimensional machine learning

Hugo Chao Cui

In the past few years, Machine Learning (ML) techniques have ushered in a paradigm shift, allowing the harnessing of ever more abundant sources of data to automate complex tasks. The technical workhorse behind these important breakthroughs arguably lies in ...
EPFL, 2024

Random matrix methods for high-dimensional machine learning models

Antoine Philippe Michel Bodin

In the rapidly evolving landscape of machine learning research, neural networks stand out with their ever-expanding number of parameters and reliance on increasingly large datasets. The financial cost and computational resources required for the training p ...
EPFL, 2024

Explainable Fault Diagnosis of Oil-Immersed Transformers: A Glass-Box Model

Yi Zhang, Wenlong Liao, Zhe Yang

Recently, remarkable progress has been made in the application of machine learning (ML) techniques (e.g., neural networks) to transformer fault diagnosis. However, the diagnostic processes employed by these techniques often suffer from a lack of interpreta ...
Piscataway, 2024

Bayes-optimal Learning of Deep Random Networks of Extensive-width

Florent Gérard Krzakala, Lenka Zdeborová, Hugo Chao Cui

We consider the problem of learning a target function corresponding to a deep, extensive-width, non-linear neural network with random Gaussian weights. We consider the asymptotic limit where the number of samples, the input dimension and the network width ...
2023

Fundamental Limits in Statistical Learning Problems: Block Models and Neural Networks

Elisabetta Cornacchia

This thesis focuses on two selected learning problems: 1) statistical inference on graph models, and 2) gradient descent on neural networks, with the common objective of defining and analysing the measures that characterize the fundamental limits. In the ...
EPFL, 2023

From Kernel Methods to Neural Networks: A Unifying Variational Formulation

Michaël Unser

The minimization of a data-fidelity term and an additive regularization functional gives rise to a powerful framework for supervised learning. In this paper, we present a unifying regularization functional that depends on an operator L ...
New York, 2023

Deep Learning Generalization with Limited and Noisy Labels

Mahsa Forouzesh

Deep neural networks have become ubiquitous in today's technological landscape, finding their way in a vast array of applications. Deep supervised learning, which relies on large labeled datasets, has been particularly successful in areas such as image cla ...
EPFL, 2023

Comparing transferability in neural network approaches and linear models for machine-learning interaction potentials

Kevin Rossi

Atomic simulations using machine learning interatomic potentials (MLIPs) have gained a lot of popularity owing to their accuracy in comparison to conventional empirical potentials. However, the transferability of MLIPs to systems outside the training set pose ...
AMER PHYSICAL SOC, 2023

Breaking the Curse of Dimensionality in Deep Neural Networks by Learning Invariant Representations

Leonardo Petrini

Artificial intelligence, particularly the subfield of machine learning, has seen a paradigm shift towards data-driven models that learn from and adapt to data. This has resulted in unprecedented advancements in various domains such as natural language proc ...
EPFL, 2023
