Principal Component Analysis: Dimensionality Reduction
Related lectures (31)
PCA: Interactive class
Presents PCA through interactive exercises, emphasizing how to minimize information loss when reducing dimensionality.
Clustering: Unsupervised Learning
Explores dimensionality reduction, clustering algorithms, and the current state of machine learning.
Clustering: K-means & LDA
Covers clustering using K-means and LDA, PCA, K-means properties, Fisher LDA, and spectral clustering.
Unsupervised learning: Eckart-Young-Mirsky theorem and intro to PCA
Introduces the Eckart-Young-Mirsky theorem and PCA for unsupervised learning and data visualization.
Introduction to Machine Learning: Linear Models
Introduces linear models for supervised learning, covering overfitting, regularization, and kernels, with applications in machine learning tasks.
Principal Component Analysis: Dimension Reduction
Explores Principal Component Analysis for dimension reduction in datasets and its implications for supervised learning algorithms.
PCA: Key Concepts
Covers the key concepts of PCA, including reducing data dimensionality and extracting features, with practical exercises.
Machine Learning Fundamentals
Introduces fundamental machine learning concepts, covering regression, classification, dimensionality reduction, and deep generative models.
Principal Component Analysis: Dimensionality Reduction
Explores Principal Component Analysis for dimensionality reduction in machine learning, showcasing its feature extraction and data preprocessing capabilities.
Unsupervised Learning: Principal Component Analysis
Covers unsupervised learning with a focus on Principal Component Analysis and the Singular Value Decomposition.
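Several of the lectures above connect PCA to the Singular Value Decomposition. As a minimal sketch of that connection (not taken from any of the listed lectures; the function name and data are illustrative), PCA can be computed by centering the data and taking the top right singular vectors as the principal components:

```python
import numpy as np

def pca_svd(X, k):
    """Project data onto its top-k principal components via SVD.

    X: (n_samples, n_features) array; k: target dimensionality.
    """
    Xc = X - X.mean(axis=0)                # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                    # top-k right singular vectors
    return Xc @ components.T               # (n_samples, k) projection

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))              # toy data for illustration
Z = pca_svd(X, 2)
print(Z.shape)                             # (100, 2)
```

Because the data are centered before projection, the reduced representation `Z` has zero mean in each retained component, and its columns are ordered by decreasing explained variance.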