Canonical Correlation Analysis: Linear and Kernel CCA
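The lecture's topic can be illustrated with a minimal numpy sketch of linear CCA — a standard formulation (whitened cross-covariance plus SVD), not necessarily the exact derivation used in the lecture. The `reg` ridge term is an assumption added for numerical stability.

```python
import numpy as np

def inv_sqrt(C):
    # Symmetric inverse square root via eigendecomposition
    w, V = np.linalg.eigh(C)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def linear_cca(X, Y, reg=1e-6):
    # Center each view
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Regularized covariance blocks (reg is a stability assumption)
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    # Singular values of the whitened cross-covariance
    # are the canonical correlations
    M = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
    U, s, Vt = np.linalg.svd(M)
    # Canonical directions mapped back to the original coordinates
    A = inv_sqrt(Cxx) @ U
    B = inv_sqrt(Cyy) @ Vt.T
    return s, A, B
```

When one view is (nearly) a linear function of the other, the leading canonical correlation approaches 1; kernel CCA replaces the covariance blocks with Gram matrices to capture nonlinear dependence.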
Related lectures (31)
Principal Component Analysis: Theory and Applications
Covers the theory and applications of Principal Component Analysis, focusing on dimension reduction and eigenvectors.
Neural Networks Recap: Activation Functions
Covers the basics of neural networks, activation functions, training, image processing, CNNs, regularization, and dimensionality reduction methods.
Diagonalization of Linear Transformations
Explains the diagonalization of linear transformations using eigenvectors and eigenvalues to form a diagonal matrix.
Unsupervised Learning: Dimensionality Reduction and Clustering
Covers unsupervised learning, focusing on dimensionality reduction and clustering, explaining how it helps find patterns in data without labels.
Principal Component Analysis: Introduction
Introduces Principal Component Analysis, focusing on maximizing variance in linear combinations to summarize data effectively.
Dimensionality Reduction: PCA and Autoencoders
Introduces artificial neural networks, CNNs, and dimensionality reduction using PCA and autoencoders.
Unsupervised Learning: Dimensionality Reduction
Explores unsupervised learning techniques for reducing dimensions in data, emphasizing PCA, LDA, and Kernel PCA.
Dimensionality Reduction: PCA & Autoencoders
Explores PCA, Autoencoders, and their applications in dimensionality reduction and data generation.
Non-Negative Definite Matrices and Covariance Matrices
Covers non-negative definite matrices, covariance matrices, and Principal Component Analysis for optimal dimension reduction.
Kernel Ridge Regression: Equivalent Formulations and Representer Theorem
Explores Kernel Ridge Regression, equivalent formulations, Representer Theorem, Kernel trick, and predicting with kernels.
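The Kernel Ridge Regression entry above can be sketched in a few lines of numpy: by the representer theorem the fitted function is a kernel expansion over the training points, so fitting reduces to one linear solve. The RBF kernel and the `gamma`/`lam` values here are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-3, gamma=1.0):
    # Representer theorem: f(x) = sum_i alpha_i k(x, x_i);
    # alpha solves (K + lam * I) alpha = y
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, gamma=1.0):
    # Prediction is the kernel expansion evaluated at the new points
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

The "kernel trick" is visible here: the data only enter through kernel evaluations, so the same code works for any positive-definite kernel.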