Entropy in Neuroscience and Ecology. Delves into entropy in neuroscience data and ecology, exploring the representation of sensory information and the diversity of biological populations.
Quantifying Entropy in Neuroscience Data. Delves into quantifying entropy in neuroscience data, exploring how neuronal activity represents sensory information and the implications of binary digit sequences (see the entropy sketch below).
Information Measures. Covers variational representations of information measures such as entropy and mutual information (see the worked mutual information example below).
Lecture: Shannon. Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Second Principle: Entropy. Explores the second law of thermodynamics, emphasizing entropy's role in system evolution and equilibrium.
Information Measures: Part 1. Covers information measures, tail bounds, sub-Gaussian and sub-Poisson random variables, an independence proof, and conditional expectation.
Information Theory and Coding. Covers source coding, Kraft's inequality, mutual information, the Huffman procedure, and properties of typical sequences (see the Huffman/Kraft sketch below).
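
As a rough illustration of the kind of calculation behind "Quantifying Entropy in Neuroscience Data", the sketch below bins a binary spike sequence into fixed-length words and estimates the Shannon entropy of the empirical word distribution. The spike train, the word length, and the helper `empirical_entropy` are assumptions made for this example, not material from the lecture.

```python
import math
from collections import Counter

def empirical_entropy(symbols):
    """Shannon entropy (in bits) of the empirical distribution of `symbols`."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Illustrative binary spike train (1 = spike, 0 = silence); real data would
# come from recorded neural activity.
spikes = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 0]

# Group the sequence into non-overlapping 2-bit words and estimate the
# entropy of the word distribution, as well as the per-bin entropy.
word_len = 2
words = [tuple(spikes[i:i + word_len])
         for i in range(0, len(spikes) - word_len + 1, word_len)]
print(f"Entropy per {word_len}-bit word: {empirical_entropy(words):.3f} bits")
print(f"Entropy per single bin: {empirical_entropy(spikes):.3f} bits")
```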
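For the "Information Measures" entry, here is a minimal worked example of mutual information computed directly from a joint distribution; the joint probability table over two binary variables is assumed purely for illustration.

```python
import math

# Assumed joint distribution p(x, y) over two binary variables
# (illustrative numbers only).
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Marginals p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )
mi = sum(p * math.log2(p / (px[x] * py[y]))
         for (x, y), p in joint.items() if p > 0)
print(f"I(X;Y) = {mi:.4f} bits")
```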
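For "Information Theory and Coding", a minimal sketch of the Huffman procedure together with a check of Kraft's inequality; the source alphabet, its probabilities, and the helper `huffman_code_lengths` are assumptions made for this example.

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Return codeword lengths produced by the Huffman procedure.

    `probs` maps symbols to probabilities. Lengths are obtained by repeatedly
    merging the two least probable nodes, as in the standard Huffman algorithm.
    """
    # Heap entries: (probability, tiebreak counter, symbols in the subtree).
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in probs}
    counter = len(heap)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        # Every symbol under a merged node gains one more bit in its codeword.
        for s in syms1 + syms2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

# Assumed source distribution (illustrative only).
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
lengths = huffman_code_lengths(probs)

# Kraft's inequality: sum over codewords of 2^(-length) <= 1 for any
# prefix-free (more generally, uniquely decodable) code.
kraft_sum = sum(2 ** -l for l in lengths.values())
avg_len = sum(probs[s] * lengths[s] for s in probs)
entropy = -sum(p * math.log2(p) for p in probs.values())

print("Code lengths:", lengths)
print(f"Kraft sum: {kraft_sum:.3f} (should be <= 1)")
print(f"Average length: {avg_len:.3f} bits, entropy: {entropy:.3f} bits")
```

With the dyadic probabilities chosen here, the average code length equals the source entropy exactly, the boundary case of the source coding bound.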