Lecture

Provable Benefits of Overparameterization in Model Compression

Description

This lecture discusses the provable benefits of overparameterization in model compression, focusing on the efficiency of deep neural networks and the process of model pruning. It covers the motivation for efficient deep networks, the principles of model pruning, empirical investigations on CIFAR-10, the relationship between overparameterization and double descent, and the theoretical setup for model compression. The main contribution is a distributional characterization of the trained model, which enables a precise analysis of model compression and demonstrates the benefits of overparameterization. The lecture also works through examples of linear models and random-feature pruning, emphasizing the importance of retraining for improved performance.
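As a rough illustration of the prune-then-retrain pipeline the lecture describes, the sketch below applies magnitude pruning to a plain least-squares linear model and then refits only the surviving weights. This is a minimal, hypothetical example: the function names, the synthetic data-generating setup, and the choice of least squares are assumptions for illustration, not the lecture's own code.

```python
import numpy as np

def magnitude_prune(w, k):
    """Keep the k largest-magnitude weights; zero out the rest."""
    mask = np.zeros_like(w, dtype=bool)
    mask[np.argsort(np.abs(w))[-k:]] = True
    return w * mask, mask

def retrain(X, y, mask):
    """Refit least squares using only the surviving coordinates."""
    w = np.zeros(X.shape[1])
    w[mask] = np.linalg.lstsq(X[:, mask], y, rcond=None)[0]
    return w

# Synthetic sparse regression problem (illustrative setup).
rng = np.random.default_rng(0)
n, d, k = 200, 50, 5
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:k] = rng.standard_normal(k)
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Dense fit -> prune to k weights -> retrain on the pruned support.
w_dense = np.linalg.lstsq(X, y, rcond=None)[0]
w_pruned, mask = magnitude_prune(w_dense, k)
w_retrained = retrain(X, y, mask)
```

Retraining matters because the surviving weights were fit jointly with the pruned ones; refitting them on the reduced support removes that bias, which is the effect the lecture highlights.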

About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.