This lecture covers the deficiencies of smooth models, sparsity, compressive sensing, atomic norms, and non-smooth minimization via subgradient descent. It begins with statistical learning motivations for non-smooth optimization, linear regression, and practical performance evaluation, then turns to sparse signal models, compressible signals, and structured sparsity. Building on these, it develops gauge functions, atomic norms, and the Lasso optimization problem, and examines the performance of the Lasso, multi-knapsack feasibility problems, and the selection of parameters in optimization problems. The lecture concludes with non-smooth unconstrained convex minimization, subdifferentials, stochastic subgradient methods, and composite convex minimization.
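As a concrete illustration of one topic from the list, the Lasso can be minimized by subgradient descent: at a non-differentiable point of the l1 norm one picks any subgradient, and a diminishing step size is used since subgradient descent is not a descent method. The sketch below is illustrative only, not taken from the lecture; the function name, the 1/(L*sqrt(k)) step-size rule, and the problem sizes in the usage example are assumptions.

```python
import numpy as np

def lasso_subgradient_descent(A, b, lam, iters=500):
    """Minimize f(x) = 0.5*||Ax - b||^2 + lam*||x||_1 by subgradient descent.

    Illustrative sketch: uses the subgradient sign(x) of the l1 norm
    (sign(0) = 0 is a valid choice in the subdifferential [-1, 1] at 0)
    and a diminishing step size scaled by the Lipschitz constant of the
    smooth part.
    """
    x = np.zeros(A.shape[1])
    # Lipschitz constant of the gradient of the smooth part: ||A||_2^2
    L = np.linalg.norm(A, 2) ** 2
    best_x, best_f = x.copy(), np.inf
    for k in range(1, iters + 1):
        # One subgradient of f at x
        g = A.T @ (A @ x - b) + lam * np.sign(x)
        x = x - g / (L * np.sqrt(k))  # diminishing step size 1/(L*sqrt(k))
        f = 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))
        if f < best_f:
            # Track the best iterate: subgradient descent does not
            # decrease the objective monotonically.
            best_f, best_x = f, x.copy()
    return best_x, best_f
```

A typical usage on a synthetic sparse recovery instance: generate a random sensing matrix `A`, a sparse ground truth, observations `b = A @ x_true`, and run the solver with a small regularization weight `lam`.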