This lecture delves into the storage capacity of associative memory in networks of neurons, focusing on the learning and retention of multiple prototypes. By analyzing the interactions between patterns and neurons, the instructor explains the minimal conditions required for prototypes to be fixed points of the network dynamics. The lecture examines how storing too many prototypes increases the retrieval error rate, and shows that the maximal number of stored prototypes grows in proportion to the number of neurons. It also introduces the concept of memory load and discusses how memories are retrieved without a centralized controller. The presentation further covers random walk models and standard deviations in the context of neuronal dynamics.
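A minimal sketch of the kind of capacity argument the summary alludes to (the notation here is an assumption, not taken from the lecture itself): with Hebbian weights storing P random binary prototypes \(\xi^{\mu}\) in a network of N neurons, the input to neuron i when the network is in state \(\xi^{\nu}\) splits into a signal term and a crosstalk term,

\[
w_{ij} = \frac{1}{N}\sum_{\mu=1}^{P}\xi_i^{\mu}\xi_j^{\mu},
\qquad
h_i = \sum_{j\neq i} w_{ij}\,\xi_j^{\nu}
    \approx \xi_i^{\nu} + \underbrace{\frac{1}{N}\sum_{\mu\neq\nu}\sum_{j\neq i}\xi_i^{\mu}\xi_j^{\mu}\xi_j^{\nu}}_{\text{crosstalk}} .
\]

The crosstalk term behaves like a random walk of roughly \(PN\) steps of size \(1/N\), so its standard deviation is about \(\sqrt{P/N}\). Keeping the probability that it flips the sign of the signal term small therefore requires the memory load \(\alpha = P/N\) to stay below a fixed constant, which is why the number of storable prototypes scales in proportion to the number of neurons.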