Interpretation of Entropy: Explores the concept of entropy expressed in bits and its relation to probability distributions, focusing on information gain and loss in various scenarios.
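A minimal sketch of how entropy in bits is computed for a discrete probability distribution (the example distributions are illustrative, not drawn from the lecture):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits.

    Zero-probability terms contribute nothing, by the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # 1.0: a fair coin carries exactly one bit
print(entropy_bits([0.9, 0.1]))  # ~0.469: a biased coin carries less
```

A fair coin attains the maximum of one bit; any bias reduces the uncertainty, which is one way to read entropy as expected information gain.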
Continuous Random Variables: Covers density functions, joint distributions, independence, and conditional densities.
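As a hedged illustration of working with a density function (the exponential density and its rate `lam` are assumptions chosen for the example), the sketch below checks numerically that a density integrates to 1 and reads off a probability as an area under the curve:

```python
import numpy as np

# Exponential density f(x) = lam * exp(-lam * x) on x >= 0 (illustrative choice).
lam = 2.0
x = np.linspace(0.0, 20.0, 200_001)
f = lam * np.exp(-lam * x)
dx = x[1] - x[0]

# A valid density integrates to 1 (up to numerical error).
print(f.sum() * dx)  # ~1.0

# P(0.5 < X < 1.5) is the area under f on that interval.
mask = (x > 0.5) & (x < 1.5)
print(f[mask].sum() * dx)  # ~exp(-1) - exp(-3) ~ 0.318
```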
Shannon's Information Theory: Covers the basics of information theory, focusing on Shannon's communication setting and channel transmission.
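One standard concrete instance of Shannon's channel setting is the binary symmetric channel with crossover probability p, whose capacity is C = 1 - H(p) bits per channel use; a minimal sketch (the crossover values are illustrative):

```python
import math

def binary_entropy(p):
    """H(p) in bits for a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(crossover):
    """Capacity of a binary symmetric channel: C = 1 - H(p) bits per use."""
    return 1.0 - binary_entropy(crossover)

print(bsc_capacity(0.0))   # 1.0: a noiseless channel carries a full bit
print(bsc_capacity(0.11))  # ~0.5: half the bit budget survives the noise
print(bsc_capacity(0.5))   # 0.0: pure noise carries nothing
```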
Probability and Statistics: Covers Simpson's paradox, probability distributions, and real-life examples.
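Simpson's paradox can be reproduced with a tiny invented data set (the counts below echo the classic kidney-stone study and are assumptions for illustration): a treatment can win within every subgroup yet lose once the subgroups are pooled.

```python
# (successes, patients) for treatments A and B within each subgroup.
groups = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

totals = {"A": [0, 0], "B": [0, 0]}
for name, arms in groups.items():
    for arm, (s, n) in arms.items():
        totals[arm][0] += s
        totals[arm][1] += n
    # Within each subgroup, A has the higher success rate.
    print(name, {arm: round(s / n, 2) for arm, (s, n) in arms.items()})

# Pooled over subgroups, B has the higher rate: Simpson's paradox.
print("overall", {arm: round(s / n, 2) for arm, (s, n) in totals.items()})
```

The reversal happens because treatment A is disproportionately assigned to the harder subgroup, so the pooled rates mix unlike case loads.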
Entropy in Neuroscience and Ecology: Delves into entropy in neuroscience data and ecology, exploring the representation of sensory information and the diversity of biological populations.
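In ecology the same entropy formula appears as the Shannon diversity index over species abundances; a small sketch with invented species counts:

```python
import math

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum p_i * ln(p_i), where p_i is the
    relative abundance of species i (ecologists conventionally use ln)."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Two invented communities of 100 individuals each:
even   = [25, 25, 25, 25]  # four equally common species
skewed = [97, 1, 1, 1]     # one dominant species

print(shannon_diversity(even))    # ln(4) ~ 1.386: maximal for four species
print(shannon_diversity(skewed))  # ~0.168: low diversity
```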