Source Coding: Compression. Covers entropy, source coding, encoding maps, decodability, prefix-free codes, and the Kraft-McMillan inequality.
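As a small illustration of the topics named above (a sketch of my own, not drawn from the lecture itself), the snippet below computes the entropy of a hypothetical four-symbol source and checks the Kraft-McMillan sum for a candidate set of prefix-free codeword lengths.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kraft_sum(lengths):
    """Kraft-McMillan sum; <= 1 is necessary for a uniquely decodable binary code."""
    return sum(2 ** -l for l in lengths)

# Hypothetical source with four symbols and the prefix-free code {0, 10, 110, 111}.
p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

print(f"H(X) = {entropy(p):.3f} bits")          # 1.750
print(f"Kraft sum = {kraft_sum(lengths):.3f}")  # 1.000, so the lengths are feasible
print(f"Expected length = {sum(pi * l for pi, l in zip(p, lengths)):.3f} bits")
```

For this particular distribution the expected codeword length equals the entropy, which is the best any uniquely decodable code can do.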
Lecture: Shannon. Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
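A minimal numerical companion, assuming the binary symmetric channel as the running example of Shannon's channel-transmission setting (the lecture's actual examples are not quoted here): the capacity of a BSC with crossover probability p is 1 minus the binary entropy of p.

```python
import math

def binary_entropy(p):
    """Binary entropy function H_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H_b(p) of a binary symmetric channel, in bits per channel use."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.05, 0.11, 0.5):
    print(f"p = {p:.2f}: C = {bsc_capacity(p):.3f} bits/use")
```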
Dependence and Correlation. Explores dependence, correlation, and conditional expectations in probability and statistics, highlighting their significance and limitations.
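One standard example of those limitations, sketched here as an assumed illustration rather than the lecture's own: a variable can be fully determined by another yet essentially uncorrelated with it, while its conditional expectation still reveals the dependence.

```python
import numpy as np

rng = np.random.default_rng(0)

# X symmetric about 0, Y = X^2 depends deterministically on X,
# yet corr(X, Y) is (approximately) zero: zero correlation does not imply independence.
x = rng.normal(size=100_000)
y = x ** 2

corr = np.corrcoef(x, y)[0, 1]
print(f"corr(X, Y)   = {corr:+.3f}")             # close to 0
print(f"E[Y]         = {y.mean():.3f}")           # about 1
print(f"E[Y | X > 1] = {y[x > 1].mean():.3f}")    # clearly larger, exposing the dependence
```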
Probability and Statistics. Covers moments, variance, and expected values in probability and statistics, including the distribution of tokens in a product.
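The following is a generic check of the moment identities mentioned above, using a small made-up discrete distribution (the lecture's token example is not reproduced here): the variance is the second moment minus the squared mean, verified empirically by sampling.

```python
import numpy as np

rng = np.random.default_rng(1)

# Empirical check of Var(X) = E[X^2] - (E[X])^2 for a small discrete distribution.
values = np.array([0, 1, 2, 3])
probs = np.array([0.1, 0.4, 0.3, 0.2])

mean = np.sum(values * probs)                # E[X]
second_moment = np.sum(values ** 2 * probs)  # E[X^2]
variance = second_moment - mean ** 2         # Var(X)

samples = rng.choice(values, size=200_000, p=probs)
print(f"E[X]   = {mean:.3f}, empirical {samples.mean():.3f}")
print(f"Var(X) = {variance:.3f}, empirical {samples.var():.3f}")
```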
Channel Coding and BICM (LLRs). Explores channel coding, bit-interleaved coded modulation (BICM), and log-likelihood ratios (LLRs) in wireless communication systems, emphasizing the importance of error detection and correction.
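To make the LLR idea concrete, here is a minimal sketch for the simplest case, BPSK over an AWGN channel with the mapping bit 0 -> +1, bit 1 -> -1 (an assumption of mine; BICM with higher-order constellations computes per-bit LLRs over constellation subsets instead). The per-bit LLR is 2y / sigma^2, and these soft values are what a channel decoder would consume.

```python
import numpy as np

rng = np.random.default_rng(2)

# Per-bit LLRs for BPSK over AWGN: LLR(y) = 2*y / sigma^2,
# i.e. log P(y | b=0) / P(y | b=1) under the mapping b=0 -> +1, b=1 -> -1.
bits = rng.integers(0, 2, size=10)
symbols = 1.0 - 2.0 * bits                    # BPSK mapping: 0 -> +1, 1 -> -1
sigma = 0.8                                   # noise standard deviation (assumed)
received = symbols + sigma * rng.normal(size=bits.size)

llrs = 2.0 * received / sigma ** 2            # soft decoder inputs
hard = (llrs < 0).astype(int)                 # hard decision: negative LLR -> bit 1

print("bits :", bits)
print("LLRs :", np.round(llrs, 2))
print("hard :", hard)
```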