A correlation-based ("Hebbian") learning rule at the spike level is formulated, mathematically analyzed, and compared with learning in a firing-rate description. As for spike coding, we take advantage of a "learning window" that describes the effect of the timing of pre- and postsynaptic spikes on synaptic weights. A differential equation for the learning dynamics is derived under the assumption that the time scales of learning and spiking dynamics can be separated. Formation of structured synapses is analyzed for a Poissonian neuron model which receives time-dependent stochastic input. It is shown that correlations between input and output spikes tend to stabilize structure formation. With an appropriate choice of parameters, learning leads to an intrinsic normalization of the average weight and the output firing rates. Noise generates diffusion-like spreading of synaptic weights.
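The pair-based rule summarized above can be sketched in a few lines. The exponential window shape and every parameter value below (`A_plus`, `A_minus`, `tau_plus`, `tau_minus`, `eta`) are illustrative assumptions for the sketch, not the paper's actual choices; the small learning rate `eta` stands in for the separation of learning and spiking time scales.

```python
import math

def learning_window(s, A_plus=1.0, A_minus=-1.0, tau_plus=10.0, tau_minus=10.0):
    """Weight change for one pre/post spike pair, s = t_post - t_pre (ms).

    Hypothetical exponential window: pre-before-post (s >= 0) potentiates,
    post-before-pre (s < 0) depresses.
    """
    if s >= 0:
        return A_plus * math.exp(-s / tau_plus)
    return A_minus * math.exp(s / tau_minus)

def weight_update(pre_spikes, post_spikes, eta=0.01):
    """Total weight change: sum the window over all spike pairs.

    The small rate eta keeps learning slow relative to spiking, mirroring
    the time-scale separation assumed in the abstract.
    """
    return eta * sum(learning_window(t_post - t_pre)
                     for t_pre in pre_spikes
                     for t_post in post_spikes)
```

For example, a presynaptic spike at 0 ms followed by a postsynaptic spike at 5 ms yields a positive weight change, while the reversed order yields a negative one.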
Nicolas Frémaux, Wulfram Gerstner, Walter Senn, Eleni Vasilaki
"anti-Hebbian" learning in a model with slowly varying firing rates. For spike-based learning, a strict distinction between Hebbian and "anti-Hebbian" rules is questionable since learning is driven by correlations on the time scale of the learning window. The correlations between presynaptic and postsynaptic firing are evaluated for a piecewise-linear Poisson model and for a noisy spiking neuron model with refractoriness. Whereas a negative integral over the learning window leads to intrinsic rate stabilization, the positive part of the learning window picks up spatial and temporal correlations in the input.
Wulfram Gerstner, Jean-Pascal Théodor Pfister, Taro Toyoizumi
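The sign condition mentioned above, a negative integral over the learning window, can be checked numerically. The window shape and the parameter values here are illustrative assumptions, chosen only so that the depression lobe slightly outweighs the potentiation lobe.

```python
import math

def window(s, A_plus=1.0, A_minus=-1.1, tau_plus=10.0, tau_minus=10.0):
    """Hypothetical exponential learning window, s = t_post - t_pre (ms)."""
    if s >= 0:
        return A_plus * math.exp(-s / tau_plus)
    return A_minus * math.exp(s / tau_minus)

def window_integral(lo=-200.0, hi=200.0, ds=0.01):
    """Midpoint Riemann sum of the window over [lo, hi].

    For exponential lobes the integral is A_plus*tau_plus + A_minus*tau_minus,
    so with the defaults above it comes out negative (10 - 11 = -1), the
    condition the abstract links to intrinsic rate stabilization.
    """
    n = int((hi - lo) / ds)
    return sum(window(lo + (i + 0.5) * ds) for i in range(n)) * ds
```

With `A_minus = -1.0` instead, the two lobes would cancel exactly and the stabilizing drift would vanish, which is why the sign of this integral, not the sign of any single lobe, is the relevant quantity.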