Unsupervised learning: Eckart-Young-Mirsky theorem and intro to PCA
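A minimal numerical sketch of the topic this lecture covers, assuming only NumPy: the Eckart-Young-Mirsky theorem states that truncating the SVD of a matrix gives the best rank-k approximation in Frobenius (and spectral) norm, and PCA arises as the special case of applying this to a centered data matrix. Matrix sizes and the rank k below are illustrative choices, not values from the lecture.

    # Sketch: Eckart-Young-Mirsky via truncated SVD, then PCA on centered data.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 20))           # data matrix: n samples x d features (illustrative)

    # Truncated SVD: keep the k largest singular values/vectors.
    k = 5
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # best rank-k approximation (Eckart-Young-Mirsky)

    # The optimal Frobenius error equals the energy in the discarded singular values.
    err = np.linalg.norm(A - A_k, "fro")
    print(np.isclose(err, np.sqrt(np.sum(s[k:] ** 2))))   # True

    # PCA as a special case: center the data; the top-k right singular vectors
    # are the principal directions, and the projections are the k-dim scores.
    A_centered = A - A.mean(axis=0)
    _, s_c, Vt_c = np.linalg.svd(A_centered, full_matrices=False)
    scores = A_centered @ Vt_c[:k].T
    explained_variance = s_c[:k] ** 2 / (A.shape[0] - 1)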
Related lectures (30)
Unsupervised Learning: Clustering & Dimensionality Reduction
Introduces unsupervised learning through clustering with K-means and dimensionality reduction using PCA, along with practical examples.
Unsupervised Learning: Dimensionality Reduction and Clustering
Covers unsupervised learning, focusing on dimensionality reduction and clustering, and explains how these methods find patterns in data without labels.
Dimensionality Reduction: PCA and LDA
Covers dimensionality reduction techniques like PCA and LDA, clustering methods, density estimation, and data representation.
Clustering & Density Estimation
Covers dimensionality reduction, PCA, clustering techniques, and density estimation methods.
Unsupervised Learning: Principal Component Analysis
Covers unsupervised learning with a focus on Principal Component Analysis and the Singular Value Decomposition.
Dimensionality Reduction: PCA & t-SNE
Explores PCA and t-SNE for reducing dimensions and visualizing high-dimensional data effectively.
Clustering & Density Estimation
Covers dimensionality reduction, clustering, and density estimation techniques, including PCA, K-means, GMM, and Mean Shift.
Clustering & Density Estimation
Covers clustering, PCA, LDA, K-means, GMM, KDE, and Mean Shift algorithms for density estimation and clustering.
Principal Component Analysis: Dimensionality Reduction
Covers Principal Component Analysis for dimensionality reduction, exploring its applications, its limitations, and the importance of choosing the right components.
Clustering: Theory and Practice
Covers the theory and practice of clustering algorithms, including PCA, K-means, Fisher LDA, spectral clustering, and dimensionality reduction.