Lecture: Shannon. Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Entropy and Compression I. Explores entropy, lossless compression, and the efficiency of the Shannon-Fano algorithm for data compression (see the sketch after this list).
Data Compression: Entropy Definition. Explores data compression through the definition and types of entropy, with practical examples illustrating its role in efficient information storage and transmission.
Probability and Statistics. Delves into probability, statistics, paradoxes, and random variables, showcasing their real-world applications and properties.
Continuous Random Variables. Explores continuous random variables, density functions, jointly distributed variables, independence, and conditional densities.
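
The entropy and compression lectures above revolve around the same quantity, H(X) = -Σ p(x) log2 p(x). As a rough illustration, not taken from the course materials, the Python sketch below computes the entropy of a hypothetical four-symbol source and compares it with the expected length of a matching prefix code, the lower bound that lossless codes such as Shannon-Fano approach. The symbol probabilities and code lengths are invented for the example.

```python
# Minimal sketch (assumed toy example, not from the lectures): Shannon entropy
# as the lower bound on the average codeword length of any lossless code.
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical four-symbol source used only for illustration.
source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
H = entropy(source.values())
print(f"H(X) = {H:.3f} bits/symbol")  # 1.750 for this source

# A prefix code with lengths 1, 2, 3, 3 (e.g., one a Shannon-Fano split would
# produce here) has expected length L = sum p(x) * len(x) >= H(X); for this
# dyadic source the bound is met with equality.
lengths = {"a": 1, "b": 2, "c": 3, "d": 3}
L = sum(p * lengths[s] for s, p in source.items())
print(f"expected code length L = {L:.3f} bits/symbol")  # also 1.750
```

Because the probabilities here are powers of 1/2, the code lengths can equal -log2 p(x) exactly; for general sources the expected length exceeds H(X) by at most one bit per symbol.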