This lecture covers the theory and algorithms of minimax optimization: weak and strong duality, saddle points, necessary and sufficient conditions for strong duality, Slater's constraint qualification, numerical ε-accuracy, primal-dual gap functions, optimality conditions, and the practical performance of optimization algorithms. The instructor discusses the application of gradient descent-ascent (GDA) and its behavior on simple problems, as well as the challenges posed by nonconvex-concave and nonconvex-nonconcave problems.
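As a concrete illustration of GDA's behavior on a simple problem, the sketch below runs simultaneous GDA on the bilinear objective f(x, y) = xy. This is a hypothetical example chosen for illustration, not necessarily the one used in the lecture: the problem is convex-concave with a unique saddle point at the origin, yet simultaneous GDA with a constant step size spirals away from it.

```python
import numpy as np

# Minimal sketch of simultaneous gradient descent-ascent (GDA) on the
# bilinear saddle-point problem min_x max_y f(x, y) = x * y.
# Illustrative example only; not taken verbatim from the lecture.
# The unique saddle point is (0, 0), but simultaneous GDA with a fixed
# step size moves away from it on this simple convex-concave problem.

def gda(x0=1.0, y0=1.0, eta=0.1, steps=100):
    x, y = x0, y0
    trajectory = [(x, y)]
    for _ in range(steps):
        grad_x = y  # df/dx for f(x, y) = x * y
        grad_y = x  # df/dy
        # Simultaneous update: descent step in x, ascent step in y.
        x, y = x - eta * grad_x, y + eta * grad_y
        trajectory.append((x, y))
    return np.array(trajectory)

traj = gda()
dist = np.linalg.norm(traj, axis=1)
print(f"distance from saddle point: start={dist[0]:.3f}, end={dist[-1]:.3f}")
# The distance grows each step, showing GDA failing to converge here.
```

Each simultaneous step multiplies the distance to the origin by sqrt(1 + eta^2) > 1, so the iterates diverge; this kind of failure on even simple problems is part of why the lecture examines GDA's performance before turning to the harder nonconvex-concave and nonconvex-nonconcave settings.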