Data Compression and Entropy: Illustrating Entropy Properties
Related lectures (25)
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
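As a concrete illustration of the idea above (a sketch, not code from the lecture), Shannon entropy H(X) = -Σ p·log2(p) quantifies the uncertainty of a distribution over possible outcomes, in bits:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.
    Terms with p == 0 contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower (about 0.47 bits).
print(entropy([0.9, 0.1]))
```

A uniform distribution over n outcomes gives log2(n) bits, the maximum possible, which matches the intuition that entropy grows with the number of equally likely outcomes.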
Data Compression and Entropy Interpretation
Explores the origins and interpretation of entropy, emphasizing its role in measuring disorder and information content in a system.
Data Compression and Entropy Definition
Explores the concept of entropy as the average number of questions needed to guess a randomly chosen letter in a sequence, emphasizing its enduring relevance in information theory.
Entropy and Disorder: Statistical Interpretation
Explores the statistical interpretation of entropy, showcasing how disorder increases with particle numbers.
Data Compression and Shannon-Fano Algorithm
Explores the Shannon-Fano algorithm for data compression and its efficiency in creating unique binary codes for letters.
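The algorithm named in this entry can be sketched as follows (a minimal illustration, assuming the textbook formulation: sort symbols by frequency, split into two groups of roughly equal total weight, assign 0/1, recurse):

```python
def shannon_fano(freqs):
    """Build Shannon-Fano codes from a symbol -> frequency mapping."""
    symbols = sorted(freqs.items(), key=lambda kv: kv[1], reverse=True)
    codes = {}

    def split(group, prefix):
        if len(group) == 1:
            # A single symbol gets the accumulated prefix as its codeword.
            codes[group[0][0]] = prefix or "0"
            return
        total = sum(f for _, f in group)
        best_diff, cut = None, 1
        # Choose the split point where the two halves' weights are closest.
        for i in range(1, len(group)):
            left = sum(f for _, f in group[:i])
            diff = abs(total - 2 * left)
            if best_diff is None or diff < best_diff:
                best_diff, cut = diff, i
        split(group[:cut], prefix + "0")
        split(group[cut:], prefix + "1")

    split(symbols, "")
    return codes

# More frequent letters receive shorter codewords:
print(shannon_fano({"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}))
```

Because every split assigns disjoint prefixes to the two halves, the resulting code is prefix-free, which is what makes the binary codes uniquely decodable.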
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Information Theory: Entropy and Information Processing
Explores entropy in information theory and its role in data processing and probability distributions.
Stationary Sources: Properties and Entropy
Explores stationary sources, entropy, regularity, and coding efficiency, including a challenging problem with billiard balls.
Entropy and Disorder: Statistical Interpretation
Explores the statistical interpretation of entropy through the Joule expansion example and the calculation of microstate multiplicity.
Sunny Rainy Source: Markov Model
Explores a first-order Markov model using a sunny-rainy source example, demonstrating how past events influence future outcomes.
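A first-order Markov source like the sunny-rainy example can be sketched as below; the transition probabilities here are hypothetical values chosen for illustration, not taken from the lecture:

```python
import random

# Hypothetical transition matrix P(next | current); each row sums to 1.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps, rng=None):
    """Generate a sequence where each day depends only on the previous day,
    which is exactly the first-order Markov property."""
    rng = rng or random.Random(0)
    state, seq = start, [start]
    for _ in range(steps):
        states = list(TRANSITIONS[state])
        weights = list(TRANSITIONS[state].values())
        state = rng.choices(states, weights=weights)[0]
        seq.append(state)
    return seq

print(simulate("sunny", 10))
```

Conditioning on the previous state lowers the source's entropy rate relative to an i.i.d. source with the same marginal distribution, which is why such memory can be exploited for compression.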