Mutual Information: Continued
Explores mutual information for quantifying statistical dependence between variables and inferring probability distributions from data.

Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.

Data Compression and Entropy Definition
Explores the concept of entropy as the average number of questions needed to guess a randomly chosen letter in a sequence, emphasizing its enduring relevance in information theory.

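The "average number of questions" view of entropy can be made concrete with a short numeric sketch. The `entropy` helper below is illustrative, not taken from the lecture; it computes Shannon entropy in bits, which equals the average number of optimal yes/no questions for dyadic distributions:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution:
    H(p) = -sum p_i * log2(p_i), skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform 4-letter alphabet needs exactly 2 yes/no questions on average.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A deterministic source needs no questions at all.
print(entropy([1.0]))  # 0.0
```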
Information Measures: Part 1
Covers information measures, tail bounds, sub-Gaussian and sub-Poisson random variables, an independence proof, and conditional expectation.

Information Measures: Part 2
Covers information measures such as entropy, joint entropy, and mutual information in information theory and data processing.

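The mutual information this lecture covers can be sketched directly from its definition. The `mutual_information` helper below is an illustrative assumption (not from the lecture notes): it takes a joint distribution as a matrix, derives the marginals, and sums p(x,y) log2[p(x,y) / (p(x)p(y))]:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as a list of rows:
    I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )."""
    px = [sum(row) for row in joint]          # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]    # marginal of Y (column sums)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Independent variables carry zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# Perfectly correlated fair bits share one full bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0
```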
Decision Trees: Classification
Explores decision trees for classification, entropy, information gain, one-hot encoding, hyperparameter optimization, and random forests.

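Information gain, the splitting criterion named here, is the drop in label entropy after a split. A minimal sketch, assuming hypothetical helpers `entropy` and `information_gain` (not from the lecture materials):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy in bits of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """Entropy reduction from partitioning `labels` into `groups`:
    H(labels) - sum_g (|g|/n) * H(g)."""
    n = len(labels)
    remainder = sum(len(g) / n * entropy(g) for g in groups)
    return entropy(labels) - remainder

labels = ["yes", "yes", "no", "no"]
# A split that separates the classes perfectly recovers the full 1 bit.
print(information_gain(labels, [["yes", "yes"], ["no", "no"]]))  # 1.0
# A split that keeps both classes mixed gains nothing.
print(information_gain(labels, [["yes", "no"], ["yes", "no"]]))  # 0.0
```

A decision tree learner greedily picks, at each node, the feature split maximizing this quantity.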
Information Measures
Covers variational representations and information measures such as entropy and mutual information.