This lecture covers Quasi-Newton methods, which replace the Hessian matrix with an approximation in order to provide descent directions when second derivatives are unavailable or too expensive to compute. It introduces the Davidon-Fletcher-Powell (DFP) method and the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm, explaining how they estimate second-derivative information on the fly and update the approximation matrix at each iteration. The lecture concludes by comparing these methods with Gradient Descent and Newton's Method, highlighting the advantages of Quasi-Newton methods for unconstrained optimization.
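As a concrete illustration of the idea described above, here is a minimal sketch of the BFGS update in Python. It maintains an inverse-Hessian approximation `H`, uses it to form a quasi-Newton descent direction, and refreshes it from gradient differences. The test function, the Armijo backtracking parameters, and the safeguard threshold are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

def bfgs(f, grad, x0, max_iter=100, tol=1e-8):
    """Minimal BFGS sketch: no Hessian is ever computed; an inverse-Hessian
    approximation H is built up from gradient differences."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                        # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                       # quasi-Newton descent direction
        # simple Armijo backtracking line search (illustrative parameters)
        t, c = 1.0, 1e-4
        while f(x + t * p) > f(x) + c * t * (g @ p):
            t *= 0.5
        s = t * p                        # step taken
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                    # change in gradient
        sy = y @ s
        if sy > 1e-12:                   # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)   # BFGS inverse-Hessian update
        x, g = x_new, g_new
    return x

# Example: minimize a simple quadratic with minimum at (1, -0.5).
f = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 0.5)])
x_star = bfgs(f, grad, [0.0, 0.0])
```

Only gradient evaluations are needed, which is the practical appeal over Newton's Method noted in the comparison at the end of the lecture.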
This video is available exclusively on Mediaspace for a restricted audience. Please log in to Mediaspace to access it if you have the necessary permissions.
Watch on Mediaspace