This lecture covers Optimal Control Problems (OCPs), focusing on the Calculus of Variations, Geometric Optimality Conditions, and the Principle of Optimality. It examines necessary conditions for optimality, the existence of optimal controls, and the choice of performance criteria in OCPs. The lecture also discusses the numerical solution of OCPs, including the Hamilton-Jacobi-Bellman equation, Pontryagin's Maximum Principle, and shooting algorithms.
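To make the shooting idea concrete, here is a minimal single-shooting sketch for a toy OCP not taken from the lecture: minimize the integral of u² over [0, 1] subject to ẋ = u, x(0) = 0, x(1) = 1. Pontryagin's Maximum Principle gives the Hamiltonian H = u² + λu, so the minimizing control is u* = -λ/2 and the costate satisfies λ̇ = -∂H/∂x = 0. The shooting method then searches for the initial costate λ(0) that makes the forward-integrated state hit the terminal condition; all function names here are illustrative.

```python
def simulate(lam0, n=1000):
    """Integrate state and costate forward with explicit Euler,
    returning the terminal state x(1)."""
    dt = 1.0 / n
    x, lam = 0.0, lam0
    for _ in range(n):
        u = -lam / 2.0   # minimizing control from the PMP condition
        x += dt * u      # state dynamics: x' = u
        # costate dynamics: lam' = -dH/dx = 0, so lam stays constant
    return x

def shoot(target=1.0, tol=1e-10):
    """Bisect on the unknown initial costate lam(0) until the
    terminal state matches the boundary condition x(1) = target."""
    lo, hi = -10.0, 10.0           # bracket; x(1) is decreasing in lam0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if simulate(mid) > target:
            lo = mid               # overshoot: root lies above mid
        else:
            hi = mid               # undershoot: root lies below mid
    return 0.5 * (lo + hi)

lam0 = shoot()
print(round(lam0, 4))  # analytic solution for this toy problem: -2.0
```

For this problem the boundary-value structure is trivial (the costate is constant), which keeps the sketch short; realistic shooting codes integrate the coupled state-costate system with an ODE solver and use Newton-type root finding on the terminal mismatch.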