# Samuel Pavio Muscinelli

This page is generated automatically and may contain information that is not correct, complete, up to date, or relevant to your search. The same applies to all other pages of this site. Be sure to verify the information against official EPFL sources.


Courses taught by this person

No results

Associated units (3)

Associated research areas (5)

Recurrent neural network

A recurrent neural network (RNN) is an artificial neural network with recurrent connections. A recurrent neural network consists of…
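The defining feature of an RNN, the feedback of the hidden state into the next time step, can be written in a few lines. The following is a minimal illustrative sketch (dimensions, weights, and the `rnn_forward` helper are made up for the example, not taken from any library or paper):

```python
import numpy as np

# Minimal recurrent neural network forward pass (illustrative sketch).
# The hidden state h_t depends on the current input x_t and on the
# previous state h_{t-1} through a recurrent weight matrix W_h.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 5

W_x = rng.normal(scale=0.5, size=(n_hidden, n_in))      # input weights
W_h = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # recurrent weights
b = np.zeros(n_hidden)

def rnn_forward(xs):
    """Run the recurrence h_t = tanh(W_x x_t + W_h h_{t-1} + b)."""
    h = np.zeros(n_hidden)
    states = []
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h + b)
        states.append(h)
    return np.array(states)

inputs = rng.normal(size=(10, n_in))  # a sequence of 10 input vectors
hidden = rnn_forward(inputs)
print(hidden.shape)  # (10, 5)
```

The recurrent term `W_h @ h` is what distinguishes this from a feedforward network: the state at each step carries information about the whole input history.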

Artificial neural network

An artificial neural network is a system whose design was originally schematically inspired by the functioning of biological neurons…

Neural network

A neural network can refer to a neural circuit of biological neurons (sometimes also called a biological neural network), or a network of artificial neurons or nodes in the case of an artificial neural network.

Associated publications (6)


Sylvain Crochet, Vahid Esmaeili, Georgios Foustoukos, Wulfram Gerstner, Yanqi Liu, Alireza Modirshanechi, Samuel Pavio Muscinelli, Anastasiia Oryshchuk, Carl Petersen, Keita Tamura

The neuronal mechanisms generating a delayed motor response initiated by a sensory cue remain elusive. Here, we tracked the precise sequence of cortical activity in mice transforming a brief whisker stimulus into delayed licking using wide-field calcium imaging, multiregion high-density electrophysiology, and time-resolved optogenetic manipulation. Rapid activity evoked by whisker deflection acquired two prominent features for task performance: (1) an enhanced excitation of secondary whisker motor cortex, suggesting its important role connecting whisker sensory processing to lick motor planning; and (2) a transient reduction of activity in orofacial sensorimotor cortex, which contributed to suppressing premature licking. Subsequent widespread cortical activity during the delay period largely correlated with anticipatory movements, but when these were accounted for, a focal sustained activity remained in frontal cortex, which was causally essential for licking in the response period. Our results demonstrate key cortical nodes for motor plan generation and timely execution in delayed goal-directed licking.

2021: Wulfram Gerstner, Samuel Pavio Muscinelli, Tilo Schwalger

While most models of randomly connected neural networks assume single-neuron models with simple dynamics, neurons in the brain exhibit complex intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of single neurons and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons. For the case of two-dimensional rate neurons with strong adaptation, we find that the network exhibits a state of resonant chaos, characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated neurons, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single neurons, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic neural networks.

Author summary: Biological neural networks are formed by a large number of neurons whose interactions can be extremely complex. Such systems have been successfully studied using random network models, in which the interactions among neurons are assumed to be random. However, the dynamics of single units are usually described using over-simplified models, which might not capture several salient features of real neurons. Here, we show how accounting for richer single-neuron dynamics shapes the network dynamics and determines which signals are better transmitted. We focus on adaptation, an important mechanism present in biological neurons that consists in the decrease of their firing rate in response to a sustained stimulus. Our mean-field approach reveals that the presence of adaptation shifts the network into a previously unreported dynamical regime, which we term resonant chaos, in which chaotic activity has a strong oscillatory component. Moreover, we show that this regime is advantageous for the transmission of low-frequency signals. Our work bridges the microscopic dynamics (single neurons) to the macroscopic dynamics (network), and shows how the global signal-transmission properties of the network can be controlled by acting on the single-neuron dynamics. These results pave the way for further developments that include more complex neural mechanisms, and considerably advance our understanding of realistic neural networks.
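The model class the abstract describes, a large random network of rate units with a slow adaptation variable, can be sketched with a short Euler integration. This is a toy illustration in the spirit of that model class, not the paper's mean-field theory or exact equations; all parameters are made up for the example:

```python
import numpy as np

# Toy Euler simulation of a randomly connected rate network whose units
# carry a slow adaptation variable (illustrative, not the paper's model).
rng = np.random.default_rng(1)
N = 200                   # number of units
g = 2.0                   # coupling strength (strong coupling -> chaos)
tau_a, beta = 10.0, 1.0   # adaptation timescale and strength
dt, steps = 0.1, 2000

J = rng.normal(scale=g / np.sqrt(N), size=(N, N))  # random connectivity
x = rng.normal(size=N)    # unit activation variable
a = np.zeros(N)           # adaptation variable (slow negative feedback)

trace = np.empty(steps)
for t in range(steps):
    r = np.tanh(x)                    # firing-rate nonlinearity
    x += dt * (-x - a + J @ r)        # leaky dynamics + recurrent input
    a += dt * (beta * r - a) / tau_a  # adaptation slowly tracks the rate
    trace[t] = r[0]                   # record one unit's activity

print(trace[-5:])  # last few samples of the fluctuating activity
```

Because the adaptation variable `a` feeds back negatively on a slower timescale than `x`, each unit behaves like a damped oscillator, which is the single-neuron property the paper links to oscillation-tinged ("resonant") chaotic network activity.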

2019: Chiara Gastaldi, Wulfram Gerstner, Samuel Pavio Muscinelli

Synaptic changes induced by neural activity need to be consolidated to maintain memory over a timescale of hours. In experiments, synaptic consolidation can be induced by repeating a stimulation protocol several times, and the effectiveness of consolidation depends crucially on the repetition frequency of the stimulations. We address the question: is there an understandable reason why induction protocols with repetitions at some frequency work better than sustained protocols, even though the accumulated stimulation strength might be exactly the same in both cases? In real synapses, plasticity occurs on multiple timescales, from seconds (induction), to several minutes (early phase of long-term potentiation), to hours and days (late phase of synaptic consolidation). We use a simplified mathematical model of just two timescales to elucidate the above question in a purified setting. Our mathematical results show that, even in such a simple model, the repetition frequency of stimulation plays an important role for the successful induction and stabilization of potentiation.
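The effect the abstract describes can be illustrated with a generic two-timescale toy model (this is an illustrative sketch, not the paper's equations): a fast "early" weight component decays quickly, while a slow "late" component consolidates only while the fast one is above a threshold. Spaced repetitions then consolidate more than the same number of stimulations delivered all at once:

```python
import numpy as np

# Generic two-timescale plasticity sketch (illustrative, not the paper's
# model): fast variable w = early potentiation, slow variable z = late
# consolidation, which grows only while w exceeds a threshold theta.
def run(stim_times, dt=0.1, T=600.0, tau_w=20.0, tau_z=200.0, theta=0.5):
    steps = int(round(T / dt))
    stim = np.zeros(steps)
    for t0 in stim_times:                 # each stimulation is a unit pulse
        stim[int(round(t0 / dt))] = 1.0 / dt
    w = z = 0.0
    for t in range(steps):
        w += dt * (-w / tau_w + stim[t])          # fast induction, fast decay
        z += dt * (float(w > theta) - z) / tau_z  # slow, gated consolidation
    return z

# Same number of stimulations, different temporal structure:
spaced = run(stim_times=[0, 60, 120, 180])  # repeated at intervals
massed = run(stim_times=[0, 1, 2, 3])       # delivered back to back
print(spaced > massed)
```

In this sketch, spaced pulses keep `w` above threshold for a longer total time than massed pulses (whose effects largely overlap), so the slow variable `z` ends up larger, mirroring the paper's point that repetition frequency, not just accumulated stimulation strength, matters.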

People conducting similar research (115)
