Maximum Entropy Modeling: Applications & Inference
Related lectures (30)
Unsupervised Learning: PCA & K-means
Covers unsupervised learning with PCA and K-means for dimensionality reduction and data clustering.
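The PCA-then-K-means pipeline named in this entry can be sketched in a few lines of NumPy (a minimal illustration with invented blob data, not the lecture's own material; the choice of two components and two clusters is an assumption for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two synthetic blobs in 3-D; the third axis carries almost no variance.
X = np.vstack([rng.normal(0, 0.3, (50, 3)),
               rng.normal(0, 0.3, (50, 3)) + [3, 3, 0]])

# PCA: project onto the top-2 eigenvectors of the covariance matrix.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc.T))   # eigh returns ascending order
W = eigvecs[:, ::-1][:, :2]                       # top-2 directions by variance
Z = Xc @ W                                        # 2-D representation

# K-means (k=2): alternate nearest-centroid assignment and centroid updates.
centroids = Z[rng.choice(len(Z), 2, replace=False)]
for _ in range(20):
    labels = np.argmin(np.linalg.norm(Z[:, None] - centroids, axis=2), axis=1)
    centroids = np.array([Z[labels == k].mean(axis=0) for k in range(2)])
```

Dimensionality reduction first, clustering second, is the usual ordering: K-means distances are more meaningful once low-variance axes are discarded.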
Random Variables and Expected Value
Introduces random variables, probability distributions, and expected values through practical examples.
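The expected value of a discrete random variable, as introduced in this lecture, is the probability-weighted sum of its outcomes; a one-line check with a fair die (a standard textbook example, not taken from the lecture):

```python
from fractions import Fraction

# Expected value of a fair six-sided die: E[X] = sum over x of x * P(X = x).
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
ev = sum(x * p for x, p in pmf.items())
print(ev)  # 7/2, i.e. 3.5
```

Using exact fractions avoids any floating-point rounding in the weighted sum.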
Information Theory: Review and Mutual Information
Reviews information measures such as entropy and introduces mutual information as a measure of the information shared between two random variables.
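The quantities this entry reviews can be computed directly from a joint distribution; a small sketch (the binary joint distribution is invented for illustration) using the identity I(X;Y) = H(X) + H(Y) − H(X,Y):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (0 * log 0 := 0)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Joint distribution of two binary variables (rows: X, columns: Y).
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)   # marginals

# Mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = entropy(px) + entropy(py) - entropy(pxy.ravel())
```

For this joint distribution the marginals are uniform (H(X) = H(Y) = 1 bit) while the variables are positively correlated, so the mutual information comes out strictly positive; for an independent joint (the outer product of the marginals) it would be zero.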
Probability Distributions in Environmental Studies
Explores probability distributions for random variables in air pollution and climate change studies, covering descriptive and inferential statistics.
Mutual Information: Understanding Random Variables
Explores mutual information, quantifying relationships between random variables and measuring information gain and statistical dependence.
Probability and Statistics
Delves into probability, statistics, paradoxes, and random variables, showcasing their real-world applications and properties.
Generalization Error
Explores tail bounds, information bounds, and maximal leakage in the context of generalization error.
Principal Components: Properties & Applications
Explores principal components, covariance, correlation, choice, and applications in data analysis.
Textual Data Analysis: Classification & Dimensionality Reduction
Explores textual data classification, focusing on methods like Naive Bayes and dimensionality reduction techniques like Principal Component Analysis.
Dimensionality Reduction: PCA & LDA
Covers PCA and LDA for dimensionality reduction, explaining variance maximization, eigenvector problems, and the benefits of Kernel PCA for nonlinear data.
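The benefit of Kernel PCA for nonlinear data, mentioned in this last entry, can be sketched in NumPy: form the RBF Gram matrix of a dataset that no linear projection can untangle (two concentric rings, invented for the example), double-center it, and eigendecompose. The kernel width gamma = 0.5 is an assumption for this data, not a recommended default:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two concentric rings: not linearly separable, so plain PCA (a linear
# projection) cannot pull them apart, but an RBF feature space can.
theta = rng.uniform(0, 2 * np.pi, 100)
r = np.where(np.arange(100) < 50, 1.0, 3.0)
X = np.c_[r * np.cos(theta), r * np.sin(theta)] + rng.normal(0, 0.05, (100, 2))

# Kernel PCA: eigendecompose the doubly centered RBF Gram matrix.
sq = ((X[:, None] - X[None]) ** 2).sum(-1)   # pairwise squared distances
K = np.exp(-0.5 * sq)                         # RBF kernel, gamma = 0.5 (assumed)
n = len(X)
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J                                # centering in feature space
eigvals, eigvecs = np.linalg.eigh(Kc)         # ascending eigenvalues
z1 = eigvecs[:, -1] * np.sqrt(eigvals[-1])    # first kernel principal component
```

Centering the Gram matrix with J plays the role of mean-subtraction in ordinary PCA, done implicitly in the (never materialized) feature space.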