Information Measures: Part 2
Covers information measures like entropy, joint entropy, and mutual information in information theory and data processing.
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and the data processing inequality, along with probability kernels and mutual information.
Neural Networks under SGD
Explores the optimization of neural networks using Stochastic Gradient Descent (SGD) and the relationship between true risk and empirical risk.
Decision Trees: Classification
Explores decision trees for classification, entropy, information gain, one-hot encoding, hyperparameter optimization, and random forests.