Explores gradient descent methods for smooth convex and non-convex problems, covering iterative strategies, convergence rates, and challenges in optimization.
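As a concrete illustration, here is a minimal sketch of the basic iteration x_{k+1} = x_k - α ∇f(x_k) applied to a smooth convex quadratic; the test function, fixed step size, and stopping rule are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

def gradient_descent(grad, x0, step_size=0.1, max_iters=1000, tol=1e-8):
    """Fixed-step gradient descent: x_{k+1} = x_k - step_size * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop once the gradient is nearly zero
            break
        x = x - step_size * g
    return x

# Smooth convex example: f(x) = 0.5 x^T A x - b^T x with A positive definite,
# so the unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

x_star = gradient_descent(lambda x: A @ x - b, x0=np.zeros(2))
print(x_star)  # approaches [0.6, -0.8], the solution of A x = b
```

For a step size small enough relative to the curvature (here, below 2 divided by the largest eigenvalue of A), this iteration converges; on non-convex problems the same update in general only guarantees approach to a stationary point, which is one of the challenges the lecture refers to.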
Covers gradient descent in the scalar case, focusing on finding the minimum of a function by iteratively stepping in the direction of the negative gradient, which in one dimension means stepping against the sign of the derivative.
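A hypothetical one-dimensional example, where the gradient reduces to the derivative f'(x); the test function f(x) = (x - 2)^2 + 1 and the step size below are chosen for illustration:

```python
def scalar_gradient_descent(df, x0, step_size=0.1, max_iters=1000, tol=1e-10):
    """Scalar gradient descent: step against the sign of the derivative."""
    x = x0
    for _ in range(max_iters):
        g = df(x)
        if abs(g) < tol:  # stationary point reached
            break
        x -= step_size * g
    return x

# f(x) = (x - 2)^2 + 1 has derivative f'(x) = 2 (x - 2) and its minimum at x = 2.
print(scalar_gradient_descent(lambda x: 2.0 * (x - 2.0), x0=0.0))  # ~2.0
```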
Introduces Newton's method for solving non-linear equations iteratively, highlighting its fast (locally quadratic) convergence near a simple root but also its failure to converge in some cases.
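A minimal sketch of the scalar iteration x_{k+1} = x_k - f(x_k) / f'(x_k), together with one standard divergence example; the test functions and iteration caps are illustrative assumptions:

```python
import math

def newton(f, df, x0, max_iters=50, tol=1e-12):
    """Newton's method for f(x) = 0: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iters):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfx = df(x)
        if dfx == 0.0:  # flat tangent: the Newton step is undefined
            raise ZeroDivisionError(f"derivative vanished at x = {x}")
        x -= fx / dfx
    return x  # may not have converged within max_iters

# Fast convergence near a simple root: f(x) = x^2 - 2, root sqrt(2).
print(newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0))  # ~1.41421356

# Classic failure: for f(x) = cbrt(x), the Newton map works out to
# x_{k+1} = -2 x_k, so the iterates diverge from any nonzero start.
f = lambda x: math.copysign(abs(x) ** (1.0 / 3.0), x)
df = lambda x: (1.0 / 3.0) * abs(x) ** (-2.0 / 3.0)
print(newton(f, df, x0=1.0, max_iters=10))  # magnitude grows as 2^k
```

The second call illustrates the convergence failure mentioned above: each Newton step overshoots the root at zero and doubles the distance from it.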