This lecture covers the mathematical foundations of parametric inference for regular models, focusing on parameter estimation and decision procedures, the performance of these methods, and the handling of model misspecification. It then turns to functions of random variables, asymptotic approximations, and the main modes of convergence: in probability and in distribution. The instructor discusses the Ky Fan metric, the Delta Method, Slutsky's Theorem, and rates of convergence. The lecture concludes with the Central Limit Theorem, the Berry-Esseen theorem on its rate of convergence, and stronger modes of convergence via results such as Scheffé's Theorem and the Cramér-Wold device.
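As a concrete illustration of the asymptotic tools mentioned above, the following sketch (not from the lecture itself; the choice of distribution and function is a hypothetical example) simulates the Delta Method: if \(\sqrt{n}(\bar{X}_n - \mu) \to N(0, \sigma^2)\), then for a smooth function \(g\), \(\sqrt{n}(g(\bar{X}_n) - g(\mu)) \to N(0, g'(\mu)^2 \sigma^2)\).

```python
import numpy as np

# Delta Method illustration with Exponential(1) samples, so mu = 1 and
# sigma^2 = 1, and g(x) = x^2, so g'(mu) = 2. The Delta Method predicts
# sqrt(n) * (g(Xbar) - g(mu)) is approximately N(0, g'(mu)^2 * sigma^2) = N(0, 4).
rng = np.random.default_rng(0)
n, reps = 2_000, 2_000

# Sample means of n Exponential(1) draws, replicated `reps` times.
xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)

# Scaled error of the transformed estimator g(Xbar) = Xbar^2 around g(mu) = 1.
scaled = np.sqrt(n) * (xbar**2 - 1.0)

# The empirical mean should be near 0 and the standard deviation near 2.
print(scaled.mean(), scaled.std())
```

Running this shows the empirical standard deviation of the scaled errors settling near 2, matching the variance \(g'(\mu)^2 \sigma^2 = 4\) predicted by the Delta Method.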