Publication

Rapid Network Adaptation: Learning to Adapt Neural Networks Using Test-Time Feedback

Related publications (49)

Structured cerebellar connectivity supports resilient pattern separation

Ilaria Ricchi

The cerebellum is thought to help detect and correct errors between intended and executed commands (1,2) and is critical for social behaviours, cognition and emotion (3). Computations for motor control must be performed quickly to correct errors in real time an ...
NATURE PORTFOLIO, 2022

Towards Robust Vision Transformer

Shaokai Ye, Yuan He

Recent advances on Vision Transformer (ViT) and its improved variants have shown that self-attention-based networks surpass traditional Convolutional Neural Networks (CNNs) in most vision tasks. However, existing ViTs focus on the standard accuracy and com ...
IEEE COMPUTER SOC, 2022

OptTTA: Learnable Test-Time Augmentation for Source-Free Medical Image Segmentation Under Domain Shift

Jean-Philippe Thiran, Guillaume Marc Georges Vray, Devavrat Tomar

As distribution shifts are inescapable in realistic clinical scenarios due to inconsistencies in imaging protocols, scanner vendors, and across different centers, well-trained deep models incur a domain generalization problem in unseen environments. Despit ...
PMLR, 2022

Biologically plausible unsupervised learning in shallow and deep neural networks

Bernd Albert Illing

The way our brain learns to disentangle complex signals into unambiguous concepts is fascinating but remains largely unknown. There is evidence, however, that hierarchical neural representations play a key role in the cortex. This thesis investigates biolo ...
EPFL, 2021

Loss landscape and symmetries in Neural Networks

Mario Geiger

Neural networks (NNs) have been very successful in a variety of tasks ranging from machine translation to image classification. Despite their success, the reasons for their performance are still not well-understood. This thesis explores two main themes: lo ...
EPFL, 2021

On Vacuous and Non-Vacuous Generalization Bounds for Deep Neural Networks.

Konstantinos Pitas

Deep neural networks have been empirically successful in a variety of tasks, however their theoretical understanding is still poor. In particular, modern deep neural networks have many more parameters than training data. Thus, in principle they should over ...
EPFL, 2020

Spiking Neural Networks Trained With Backpropagation For Low Power Neuromorphic Implementation Of Voice Activity Detection

Milos Cernak, Giorgia Dellaferrera

Recent advances in Voice Activity Detection (VAD) are driven by artificial and recurrent neural networks (RNNs); however, using a VAD system in battery-operated devices requires further power efficiency. This can be achieved by neuromorphic hardware, which ...
IEEE, 2020

Artificial Neural Network Approach to the Analytic Continuation Problem

Oleg Yazyev, Quansheng Wu, Lei Wang, Romain Fournier

Inverse problems are encountered in many domains of physics, with analytic continuation of the imaginary Green's function into the real frequency domain being a particularly important example. However, the analytic continuation problem is ill defined and c ...
AMER PHYSICAL SOC, 2020

Experimental Demonstration of Supervised Learning in Spiking Neural Networks with Phase-Change Memory Synapses

Irem Boybat Kara, Evangelos Eleftheriou, Abu Sebastian

Spiking neural networks (SNN) are computational models inspired by the brain's ability to naturally encode and process information in the time domain. The added temporal dimension is believed to render them more computationally efficient than the conventio ...
2020

Deep Neural Networks With Trainable Activations and Controlled Lipschitz Constant

Michaël Unser, Shayan Aziznejad, Harshit Gupta, Joaquim Gonçalves Garcia Barreto Campos

We introduce a variational framework to learn the activation functions of deep neural networks. Our aim is to increase the capacity of the network while controlling an upper-bound of the actual Lipschitz constant of the input-output relation. To that end, ...
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2020
