Orthogonal Projections: Gram-Schmidt Method
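The page itself carries no lecture notes, so as a small illustration of the topic named in the title, here is a minimal NumPy sketch of classical Gram-Schmidt orthonormalization; the function and variable names are illustrative and not taken from the lecture.

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt: orthonormalize the columns of A.

    Returns Q with orthonormal columns and upper-triangular R with A = Q @ R.
    """
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float)
        for i in range(j):
            # Subtract the orthogonal projection of column j onto q_i.
            R[i, j] = Q[:, i] @ A[:, j]
            v = v - R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = gram_schmidt(A)
print(np.allclose(Q @ R, A))             # A = QR
print(np.allclose(Q.T @ Q, np.eye(2)))   # columns of Q are orthonormal
```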
Related lectures (25)
Linear Regression: Absence or Presence of Covariates
Explores linear regression with and without covariates, covering models described by independent distributions and tools such as subspaces and orthogonal projections.
Eigenvalues and Eigenvectors Decomposition
Covers the decomposition of a matrix into its eigenvalues and eigenvectors, the orthogonality of eigenvectors, and the normalization of vectors.
Linear Algebra: Orthogonal Projection and QR Factorization
Explores the Gram-Schmidt process, orthogonal projection, QR factorization, and least squares solutions for linear systems.
Linear Regression: Least Squares Method
Explains the method of least squares in linear regression to find the best-fitting line to a set of data points.
QR Factorization: Least Squares System Resolution
Covers the QR factorization method applied to solving a system of linear equations in the least squares sense; a short sketch of this approach follows the list.
Orthogonal Projections in Linear Algebra
Explores orthogonal projections, orthonormal bases, and QR factorization in linear algebra.
Linear Regression Testing
Explores least squares in linear regression, hypothesis testing, outliers, and model assumptions.
Eigenvalues and Eigenvectors
Explores eigenvalues, eigenvectors, and methods for solving linear systems with a focus on rounding errors and preconditioning matrices.
Subspaces, Spectra, and Projections
Explores subspaces, spectra, and projections in linear algebra, including symmetric matrices and orthogonal projections.
Least Squares Solutions
Explains the concept of least squares solutions and their application in finding the closest solution to a system of equations.
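Several of the lectures above connect QR factorization to least squares problems. As a minimal sketch of that connection, assuming NumPy and using invented example data, an overdetermined system can be solved in the least squares sense through its QR factorization:

```python
import numpy as np

# Least squares via QR: minimize ||Ax - b|| by solving R x = Q^T b.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))   # overdetermined system (more rows than columns)
b = rng.standard_normal(6)

Q, R = np.linalg.qr(A)            # reduced QR factorization, A = QR
x = np.linalg.solve(R, Q.T @ b)   # solve the small triangular system

# The result agrees with NumPy's built-in least squares solver.
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))
```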