Over the course of a lifetime, the human brain acquires an astonishing amount of semantic knowledge and autobiographical memories, often with an imprinting strong enough to allow detailed information to be recalled many years after the initial learning experience took place. The formation of such long-lasting memories is known to primarily involve the cortex, where it is accompanied by a wave of synaptic growth, pruning, and fine-tuning that stretches across several nights of sleep. This process, broadly referred to as consolidation, gradually stabilizes labile information and moves it into permanent storage. It has a profound impact on connectivity and cognitive function, especially during development. Though the process has been extensively studied in terms of behavior and neuroanatomy, it is still unclear how this interplay between structural adaptation and long-term memory consolidation can be explained from a theoretical and computational perspective.

In this thesis, we take a top-down approach to developing a mathematical model of consolidation and pruning in recurrent neural networks, combining recent techniques from the fields of optimization, machine learning, and statistics. The first part of the thesis treats the problem of maximally noise-robust memory without synaptic resource constraints. Using kernel methods, we derive a compact description of networks with optimal weight configurations. This unifies many of the classical memory models under a common mathematical framework, and formalizes the relationship between active dendritic processing at the single-neuron level and the storage capacity of the circuit as a whole.

In the second part of the thesis, we treat the problem of maximal memory robustness under conditions of sparse connectivity. We combine our unconstrained model with an implicit regularization by endowing the network with bi- and tripartite synapses instead of the usual scalar weights. This allows us to derive a simple synaptic learning rule that simultaneously consolidates memories and prunes weights, while incorporating memory replay, multiplicative homeostatic scaling, and weight-dependent plasticity. We also use the synapse model to derive scaling properties of intrinsic synaptic noise, which we test in a meta-analysis of experimental data on dendritic spine dynamics.

In the concluding sections, we briefly discuss the implications of our results with regard to current memory-inspired machine learning methods, the function of sleep, and the effects of the environment on structural plasticity during development.
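The two modelling ingredients summarized above, margin-based noise robustness and sparsity-inducing regularization, can be illustrated with a generic toy example. The sketch below is not the learning rule derived in the thesis; it is a minimal, hypothetical stand-in that stores random patterns in a recurrent network by enforcing a margin on each neuron's input (robustness) while an L1-style soft-threshold drives weak synapses to zero (pruning). All function names, parameters, and values are illustrative assumptions.

```python
# Minimal toy illustration (not the thesis's actual model): store random
# patterns as fixed points of a recurrent network by enforcing an input
# margin per neuron (noise robustness), while an L1-style soft-threshold
# prunes weak synapses. All parameters here are hypothetical choices.
import numpy as np

rng = np.random.default_rng(0)
N, P = 60, 12                                # neurons, patterns to store
X = rng.choice([-1.0, 1.0], size=(P, N))     # random +/-1 patterns

def train_incoming_weights(i, lam=0.2, eta=0.05, margin=1.0, epochs=300):
    """Learn the incoming weights of neuron i so that every stored pattern
    drives it with at least `margin` of safety, while soft-thresholding
    (an L1 proximal step) shrinks and eventually removes weak synapses."""
    w = np.zeros(N)
    for _ in range(epochs):
        for mu in rng.permutation(P):
            x, y = X[mu], X[mu, i]
            if y * (w @ x) < margin:          # pattern not yet robustly stored
                w += eta * y * x              # margin-perceptron update
                w[i] = 0.0                    # keep no self-connection
        w = np.sign(w) * np.maximum(np.abs(w) - eta * lam, 0.0)  # prune
    return w

# Row i of W holds the incoming weights of neuron i.
W = np.array([train_incoming_weights(i) for i in range(N)])

# One-step recall: stored patterns should be (approximate) fixed points,
# and a fraction of the synapses should have been pruned to exactly zero.
bit_accuracy = np.mean(np.sign(X @ W.T) == X)
sparsity = np.mean(W == 0.0)
print(f"recall accuracy: {bit_accuracy:.3f}, pruned synapses: {sparsity:.2%}")
```

In this toy setting, the same iteration both increases robustness (margin-perceptron updates) and sparsifies the weight matrix (soft-thresholding), mirroring at a cartoon level the idea of a single rule that consolidates and prunes.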