Discusses entropy, data compression, and Huffman coding techniques, emphasizing their application to minimizing expected codeword length, and introduces conditional entropy.
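As a concrete illustration of the compression ideas above, here is a minimal Huffman coding sketch in Python (the function name `huffman_code` and its interface are illustrative assumptions, not from the source): it greedily merges the two least-probable subtrees to build an optimal prefix code, whose expected codeword length L satisfies H(p) ≤ L < H(p) + 1.

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman prefix code from a {symbol: probability} map.

    Illustrative sketch: symbols must not themselves be tuples, since
    tuples are used internally to represent internal tree nodes.
    Returns {symbol: bitstring}.
    """
    # Heap entries: (weight, unique tiebreaker, subtree). The tiebreaker
    # keeps comparisons well-defined when weights are equal.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least-probable subtrees into one internal node.
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (left, right)))
        count += 1

    code = {}
    def walk(node, prefix):
        if isinstance(node, tuple):      # internal node: recurse, appending bits
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                            # leaf: record accumulated bitstring
            code[node] = prefix or "0"   # degenerate single-symbol alphabet

    _, _, root = heap[0]
    walk(root, "")
    return code
```

For the dyadic distribution below, the expected codeword length exactly equals the entropy (1.75 bits):

```python
freqs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
print(huffman_code(freqs))  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```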
Covers information measures such as entropy, Kullback-Leibler divergence, and mutual information, along with probability kernels and the data processing inequality.
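A small sketch of these information measures, assuming discrete distributions represented as NumPy arrays (all function names here are illustrative): it computes entropy and Kullback-Leibler divergence directly from their definitions, and obtains mutual information via the identity I(X;Y) = D(p_XY || p_X p_Y).

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits; 0 * log 0 is treated as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    Assumes absolute continuity: q[i] > 0 wherever p[i] > 0.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

def mutual_information(joint):
    """I(X;Y) = D(p_XY || p_X p_Y) for a joint pmf given as a 2-D array."""
    joint = np.asarray(joint, float)
    px = joint.sum(axis=1)  # marginal of X (rows)
    py = joint.sum(axis=0)  # marginal of Y (columns)
    return kl_divergence(joint.ravel(), np.outer(px, py).ravel())
```

For a correlated joint pmf the mutual information is strictly positive, while an independent joint gives exactly zero:

```python
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(entropy(joint.sum(axis=1)))  # H(X) = 1.0 bit
print(mutual_information(joint))   # about 0.278 bits
```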