This lecture covers the minimization of cost functions, both constrained and unconstrained. Topics include the definition of the subdifferential, subgradient methods, convexity, and iterative optimization, with maximum-likelihood estimation, least-squares estimation, and ridge regression as running examples. The lecture then turns to gradient descent: step-size selection, smooth unconstrained convex minimization, and the method's convergence rate. It closes with geometric interpretations, non-convex minimization and why it is sometimes necessary, the geometric meaning of stationarity, and the assumptions underlying the gradient method.
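To make two of these topics concrete, below is a minimal sketch of gradient descent applied to a ridge regression objective, using a constant step size 1/L derived from the Lipschitz constant of the gradient. The synthetic data, the penalty weight `lam`, and the iteration count are illustrative assumptions, not taken from the lecture itself.

```python
import numpy as np

# Sketch: gradient descent on the ridge regression objective
#   f(x) = (1/2) * ||A x - b||^2 + (lam / 2) * ||x||^2
# All problem data below are assumed for illustration.

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))   # synthetic design matrix (assumed)
b = rng.standard_normal(50)         # synthetic observations (assumed)
lam = 0.1                           # ridge penalty (assumed)

def grad(x):
    # Gradient of the objective: A^T (A x - b) + lam * x
    return A.T @ (A @ x - b) + lam * x

# For this quadratic objective the gradient is Lipschitz with
# constant L = ||A||_2^2 + lam, so a constant step size 1/L is safe.
L = np.linalg.norm(A, 2) ** 2 + lam
step = 1.0 / L

x = np.zeros(10)
for _ in range(500):
    x = x - step * grad(x)

# Sanity check against the closed-form ridge solution
# x* = (A^T A + lam I)^{-1} A^T b.
x_star = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ b)
print("distance to closed-form solution:", np.linalg.norm(x - x_star))
```

Because the objective is smooth and strongly convex, the iterates converge linearly to the closed-form solution; the printed distance should be near zero, illustrating the convergence behavior the lecture analyzes.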