This lecture provides a comprehensive review of the linear algebra concepts essential for understanding convex optimization. Topics include vector norms, eigenvalue decomposition, singular value decomposition, matrix norms, and matrix properties such as the trace and determinant. The lecture also covers convex functions, inequalities involving vector norms, and determinant formulas for 2x2 and 3x3 matrices, including the rule of Sarrus. It further explores positive semidefinite and positive definite matrices, principal minors, and the square root of a positive semidefinite matrix. The instructor, Bahar Taskesen, presents examples and proofs to solidify understanding of these fundamental concepts.
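As a brief illustration of two of the topics listed above (not taken from the lecture material itself), the following sketch states the rule of Sarrus for a 3x3 determinant and the standard characterization of a positive semidefinite matrix:

```latex
% Illustrative only: rule of Sarrus for a 3x3 determinant
\[
\det\begin{pmatrix} a & b & c \\ d & e & f \\ g & h & i \end{pmatrix}
  = aei + bfg + cdh - ceg - bdi - afh
\]

% A symmetric matrix A is positive semidefinite (A \succeq 0) iff the
% quadratic form is nonnegative, equivalently iff all eigenvalues are nonnegative
\[
A = A^\top \ \text{is positive semidefinite}
\iff x^\top A x \ge 0 \ \ \forall x \in \mathbb{R}^n
\iff \lambda_i(A) \ge 0 \ \ \forall i.
\]
```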
This video is available exclusively on Mediaspace for a restricted audience. Please log in to Mediaspace to access it if you have the necessary permissions.