PCA: Directions of Largest Variance
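A minimal sketch of the idea in the title, assuming synthetic data and plain numpy (illustrative only, not material from the lecture itself): PCA finds the directions of largest variance as the leading eigenvectors of the covariance matrix of the centered data, and projecting onto those directions reduces dimensionality.

# Sketch: directions of largest variance via eigendecomposition of the covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))            # 200 samples, 5 features (made-up data)
Xc = X - X.mean(axis=0)                  # center each feature

cov = (Xc.T @ Xc) / (len(Xc) - 1)        # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: symmetric matrix, ascending eigenvalues

order = np.argsort(eigvals)[::-1]        # sort by variance, largest first
components = eigvecs[:, order]           # principal directions as columns
explained_variance = eigvals[order]

# Project onto the two directions of largest variance (dimensionality reduction).
X_2d = Xc @ components[:, :2]
print(explained_variance[:2], X_2d.shape)

Equivalently, the same directions are the right singular vectors of the centered data matrix, which is how library implementations such as scikit-learn's PCA compute them.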
Related lectures (30)
Principal Component Analysis: Dimensionality Reduction
Covers Principal Component Analysis for dimensionality reduction, exploring its applications, limitations, and the importance of choosing the right components.
Principal Component Analysis: Applications and Limitations
Explores the applications and limitations of Principal Component Analysis, including denoising, compression, and regression.
Principal Component Analysis: Olympic Medals & Image Compression
Explores PCA for predicting medal distributions and compressing face images.
Clustering Methods
Covers K-means, hierarchical, and DBSCAN clustering methods with practical examples.
Principal Component Analysis: Dimension Reduction
Covers Principal Component Analysis for dimension reduction in biological data, focusing on visualization and pattern identification.
Unsupervised Learning: Dimensionality Reduction and Clustering
Covers unsupervised learning, focusing on dimensionality reduction and clustering, and explains how these techniques help find patterns in data without labels.
Dimensionality Reduction: PCA & t-SNE
Explores PCA and t-SNE for reducing dimensions and visualizing high-dimensional data effectively.
Dimensionality Reduction: PCA and Autoencoders
Introduces artificial neural networks, CNNs, and dimensionality reduction using PCA and autoencoders.
Unsupervised Learning: Clustering & Dimensionality Reduction
Introduces unsupervised learning through clustering with K-means and dimensionality reduction using PCA, along with practical examples; a minimal sketch of this workflow follows the list.
PCA: Key Concepts
Covers the key concepts of PCA, including reducing data dimensionality and extracting features, with practical exercises.
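Several of the related lectures above pair dimensionality reduction with clustering. A minimal sketch of that workflow, assuming scikit-learn and synthetic blob data (both are illustrative choices, not taken from the lectures): project the data onto two principal components, then run K-means on the reduced representation.

# Sketch: reduce dimensionality with PCA, then cluster with K-means.
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

X, _ = make_blobs(n_samples=300, n_features=10, centers=3, random_state=0)

X_reduced = PCA(n_components=2).fit_transform(X)                        # keep 2 components
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_reduced)

print(X_reduced.shape, labels[:10])

Reducing to two components first makes the clusters easy to visualize; in practice the number of components kept is usually chosen from the explained-variance ratio rather than fixed at two.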