This course introduces students to continuous, nonlinear optimization. We study the theory of optimization with continuous variables (with full proofs), and we analyze and implement important algorithms to solve constrained and unconstrained problems.
We develop, analyze, and implement numerical algorithms to solve optimization problems of the form min f(x), where x ranges over a smooth manifold. To this end, we first study differential and Riemannian geometry (with a focus dictated by pragmatic concerns). We also discuss several applications.
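To illustrate the kind of algorithm the course description refers to, here is a minimal sketch of Riemannian gradient descent on the unit sphere, applied to the Rayleigh quotient f(x) = x^T A x. The function name, step size, and iteration count are illustrative assumptions, not course material.

```python
import numpy as np

def riemannian_gradient_descent(A, x0, step=0.1, iters=1000):
    """Minimize f(x) = x^T A x over the unit sphere (illustrative sketch).

    The minimizer is an eigenvector for the smallest eigenvalue of the
    symmetric matrix A.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2 * A @ x                 # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x   # project onto the tangent space at x
        x = x - step * rgrad              # step in the tangent direction
        x = x / np.linalg.norm(x)         # retract back onto the sphere
    return x

if __name__ == "__main__":
    # Diagonal test matrix: smallest eigenvalue is -2, eigenvector e3.
    A = np.diag([3.0, 1.0, -2.0])
    x = riemannian_gradient_descent(A, np.array([1.0, 1.0, 1.0]))
    print(x @ A @ x)  # approaches the smallest eigenvalue of A
```

The tangent-space projection and the renormalization step ("retraction") are exactly the two ingredients that distinguish manifold optimization from unconstrained gradient descent in R^n.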
This course is run as a reading course / working group. We will focus on mathematical aspects of the theory of neural networks, including universal approximation theorems, connections to ODEs and PDEs, and optimization algorithms for neural-network training and their convergence.