This lecture covers the variational formulation of information measures, focusing on the Donsker-Varadhan representation. It works through the proof and the mathematical expressions for measuring information content and divergence between probability distributions. The lecture also discusses entropy, mutual information, and the concept of divergence in detail.
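
For reference, a standard statement of the Donsker-Varadhan variational representation of the KL divergence, where the supremum ranges over bounded measurable functions (the exact form and function class used in the lecture may differ):

$$ D(P \,\|\, Q) \;=\; \sup_{f}\Big\{\, \mathbb{E}_{P}[f(X)] \;-\; \log \mathbb{E}_{Q}\big[e^{f(X)}\big] \,\Big\} $$

Any fixed $f$ plugged into the right-hand side yields a lower bound on $D(P \,\|\, Q)$, which is what makes the variational form useful for estimating divergences.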