This lecture delves into the concept of consistency in maximum likelihood estimation (MLE). The instructor examines the mathematical reasoning behind why the likelihood yields sensible estimators and what guarantees those estimators offer. Using entropy and the Kullback-Leibler divergence, the lecture addresses whether the MLE is consistent and whether it attains reasonable mean squared error (MSE) performance.
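As a rough sketch of the kind of argument involved (notation is assumed here, not taken verbatim from the lecture): the MLE maximizes the average log-likelihood, which by the law of large numbers converges to an expectation that splits into an entropy term and a KL divergence term.

```latex
% Sketch of the standard KL-based consistency argument (assumed notation):
% X_1, ..., X_n drawn i.i.d. from p_{theta_0}.
\begin{align}
  \hat\theta_n
    &= \arg\max_{\theta} \; \frac{1}{n}\sum_{i=1}^{n} \log p_\theta(X_i) \\
  \frac{1}{n}\sum_{i=1}^{n} \log p_\theta(X_i)
    &\xrightarrow{\;\text{LLN}\;} \mathbb{E}_{\theta_0}\!\left[\log p_\theta(X)\right]
     = -H(p_{\theta_0}) - D_{\mathrm{KL}}\!\left(p_{\theta_0}\,\middle\|\,p_\theta\right)
\end{align}
```

Since the KL divergence is nonnegative and equals zero only when $p_\theta = p_{\theta_0}$ (given identifiability), the limiting objective is maximized at the true parameter, which is the essence of why the MLE is consistent under suitable regularity conditions.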