Information Measures: Part 1
Covers information measures, tail bounds, sub-Gaussian and sub-Poisson random variables, an independence proof, and conditional expectation.
Mutual Information: Continued
Explores mutual information for quantifying statistical dependence between variables and inferring probability distributions from data.
Decision Trees: Classification
Explores decision trees for classification, entropy, information gain, one-hot encoding, hyperparameter optimization, and random forests.
Interpretation of Entropy
Explores entropy expressed in bits and its relation to probability distributions, focusing on information gain and loss in various scenarios.
Generalization Error
Discusses mutual information, the data processing inequality, and properties related to leakage in discrete systems.
Information Measures
Covers variational representations of information measures such as entropy and mutual information.
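Several of the lectures above revolve around entropy and mutual information. As a minimal illustrative sketch (not taken from the lectures themselves), both quantities can be computed directly from probability tables:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, from a joint
    distribution given as a nested list joint[x][y]."""
    px = [sum(row) for row in joint]                 # marginal of X
    py = [sum(col) for col in zip(*joint)]           # marginal of Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))                               # 1.0

# Independent variables share zero information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0

# Perfectly correlated bits share one full bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```

The same identity, I(X;Y) = H(X) + H(Y) - H(X,Y), underlies the information-gain criterion used for decision-tree splits in the classification lecture.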