This lecture covers the evolution from sparse modeling to sparse communication, focusing on the use of sparse transformations in neural networks for natural language processing tasks. It explores sparse regularization techniques, such as sparse neuron activations and structured sparsity, and their applications in machine-human communication. It then turns to sparsemax, entmax, and mixed distributions, highlighting their role in bridging the gap between discrete and continuous models, and closes with the implications of sparse communication for explainability and emergent communication in machine learning.
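To make the contrast with softmax concrete, here is a minimal NumPy sketch of sparsemax, following the closed-form projection of Martins & Astudillo (2016). It is an illustration written for this summary, not code from the lecture; the function name and the example scores are chosen here for demonstration. The key property it shows is that sparsemax returns a valid probability distribution in which low-scoring entries get exactly zero probability, whereas softmax always assigns every entry some nonzero mass.

```python
import numpy as np

def sparsemax(z):
    """Euclidean projection of a score vector z onto the probability simplex.

    Unlike softmax, the result can contain exact zeros, yielding a sparse
    probability distribution (Martins & Astudillo, 2016).
    """
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]              # scores in descending order
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, len(z) + 1)
    support = 1 + k * z_sorted > cumsum      # which sorted entries stay positive
    k_z = k[support][-1]                     # size of the support
    tau = (cumsum[support][-1] - 1) / k_z    # threshold subtracted from scores
    return np.maximum(z - tau, 0.0)

# Example (illustrative scores): the lowest-scoring class is zeroed out
# entirely, while softmax would still give it nonzero probability.
scores = np.array([1.0, 0.8, -1.0])
print(sparsemax(scores))   # -> [0.6 0.4 0. ]
```

Entmax generalizes this idea with a temperature-like parameter alpha that interpolates between softmax (alpha = 1, fully dense) and sparsemax (alpha = 2), which is how these transformations bridge continuous and discrete behavior in the models discussed above.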