This lecture introduces proximal operators, proximal gradient methods, linear minimization oracles, and the conditional gradient method for constrained optimization. It covers the proximal-gradient scheme (ISTA) and the fast proximal-gradient scheme (FISTA), discussing their convergence in both convex and non-convex settings together with the associated iteration complexities. The lecture then turns to stochastic convex composite minimization and the stochastic proximal gradient method, with convergence analysis under various assumptions. Examples such as composite minimization in the non-convex case and phase retrieval illustrate practical applications of the discussed methods.
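As a concrete illustration of the proximal-gradient scheme mentioned above, here is a minimal sketch (not taken from the lecture) of ISTA applied to the lasso problem min_x 0.5‖Ax − b‖² + λ‖x‖₁, where the proximal operator of the ℓ₁ norm is componentwise soft-thresholding. The problem data and parameter choices below are illustrative assumptions:

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1: componentwise soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, b, lam, step, n_iters=500):
    # Proximal-gradient (ISTA) iteration for 0.5*||Ax - b||^2 + lam*||x||_1:
    # a gradient step on the smooth part, then the prox of the nonsmooth part.
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                          # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)   # proximal step
    return x

# Toy instance: recover a sparse vector from noiseless Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[1, 5]] = [2.0, -3.0]
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2   # step 1/L with L = ||A||_2^2
x_hat = ista(A, b, lam=0.1, step=step)
```

With a step size of 1/L, where L is the Lipschitz constant of the smooth part's gradient, each iterate decreases the composite objective; the recovered `x_hat` is close to `x_true` up to the small bias induced by the ℓ₁ penalty.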