This lecture discusses the joint distribution of Gaussian random vectors, focusing on when a Gaussian random vector admits a joint PDF. The key criterion is that its covariance matrix be positive definite. The lecture first treats a centered Gaussian vector with a diagonal covariance matrix, giving the explicit expression for the joint PDF, which factors as a product of one-dimensional Gaussian densities. It then extends the discussion to Gaussian random vectors with a general positive definite covariance matrix, using the spectral theorem to prove the existence of a joint PDF. The lecture concludes by showing that any centered Gaussian random vector can be written as a square root of its covariance matrix applied to a standard Gaussian vector, which simplifies computations with correlated Gaussian random vectors.
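For reference, a brief sketch of the formulas the summary alludes to; the notation ($K$ for the covariance matrix, $n$ for the dimension) is an assumption here, not necessarily the lecture's own. For a centered Gaussian vector $X = (X_1, \dots, X_n)$ with diagonal covariance $K = \operatorname{diag}(\sigma_1^2, \dots, \sigma_n^2)$, $\sigma_i^2 > 0$, the joint PDF factors as
\[
f_X(x) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma_i^2}} \exp\!\left(-\frac{x_i^2}{2\sigma_i^2}\right), \qquad x \in \mathbb{R}^n.
\]
More generally, if $K$ is positive definite, the spectral theorem gives $K = U \Lambda U^{\mathsf{T}}$ with $U$ orthogonal and $\Lambda$ diagonal with strictly positive entries, and
\[
f_X(x) = \frac{1}{(2\pi)^{n/2}(\det K)^{1/2}} \exp\!\left(-\tfrac{1}{2}\, x^{\mathsf{T}} K^{-1} x\right), \qquad x \in \mathbb{R}^n.
\]
Writing $K^{1/2} = U \Lambda^{1/2} U^{\mathsf{T}}$, one has $X = K^{1/2} Z$ in distribution, where $Z$ is a standard Gaussian vector with i.i.d. $N(0,1)$ components; this is the decomposition that makes computations with correlated Gaussian vectors tractable.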