This lecture covers the definitions of non-negative definite and positive definite matrices, as well as covariance matrices. It explains how to determine whether a matrix is non-negative definite using both the quadratic-form and spectral definitions. The lecture also discusses the covariance matrix of a random vector, which encodes the variances of its components and the covariances between them. It then establishes the relationship between the two notions: a real symmetric matrix is non-negative definite if and only if it is the covariance matrix of some random vector. Finally, Principal Component Analysis is introduced as a method for optimal linear dimension reduction, emphasizing that the optimal projection is onto the top eigenvectors of the covariance matrix.
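The ideas summarized above can be illustrated with a minimal NumPy sketch (the sample data, dimensions, and tolerances below are invented for illustration, not taken from the lecture): estimate a covariance matrix, verify non-negative definiteness via both the spectral and quadratic-form criteria, and project onto the top eigenvectors as in PCA.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 500 samples of a 3-dimensional random vector X.
A = np.array([[2.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.2, 0.1, 0.1]])
X = rng.normal(size=(500, 3)) @ A

# Sample covariance matrix: real, symmetric, non-negative definite.
S = np.cov(X, rowvar=False)

# Spectral definition: S is non-negative definite iff all eigenvalues >= 0
# (a small tolerance absorbs floating-point rounding).
eigvals, eigvecs = np.linalg.eigh(S)
assert np.all(eigvals >= -1e-10)

# Quadratic-form definition: v^T S v >= 0 for every vector v.
v = rng.normal(size=3)
assert v @ S @ v >= 0

# PCA: project centered data onto the k eigenvectors with the largest
# eigenvalues (eigh returns eigenvalues in ascending order).
k = 2
top = eigvecs[:, -k:]
X_reduced = (X - X.mean(axis=0)) @ top
print(X_reduced.shape)  # (500, 2)
```

Keeping the top eigenvectors maximizes the variance retained by the projection, which is the sense in which PCA is the optimal linear dimension reduction.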