Source Coding Theorems: Entropy and Source Models
Related lectures (26)
Mutual Information: Understanding Random Variables
Explores mutual information, quantifying relationships between random variables and measuring information gain and statistical dependence (a computational sketch follows this list).
Information Theory: Source Coding & Channel Coding
Covers the fundamentals of information theory, focusing on source coding and channel coding.
Probability Distributions in Environmental Studies
Explores probability distributions for random variables in air pollution and climate change studies, covering descriptive and inferential statistics.
Entropy Bounds: Conditional Entropy Theorems
Explores entropy bounds, conditional entropy theorems, and the chain rule for entropies, illustrating their application through examples (the chain rule is written out after this list).
Data Compression and Shannon-Fano Algorithm
Explores the Shannon-Fano algorithm for data compression and its efficiency in creating uniquely decodable binary codes for letters (sketched after this list).
Conditional Entropy: Huffman Coding
Explores conditional entropy and Huffman coding as techniques for efficient data compression (see the Huffman sketch after this list).
Random Variables and Expected Value
Introduces random variables, probability distributions, and expected values through practical examples (a worked example follows this list).
Data Compression: Entropy Definition
Explores data compression through the definition and types of entropy, with practical examples illustrating its role in efficient information storage and transmission (see the entropy sketch below).
Entropy and Information Theory
Explores entropy, uncertainty, coding theory, and data compression applications.
Information Theory Basics
Introduces information theory basics, including entropy, independence, and the binary entropy function (see the final sketch below).
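
Hedged sketches of several techniques named above follow; all numbers, alphabets, and helper names are illustrative assumptions, not drawn from the lectures themselves.

First, a minimal sketch of the mutual-information computation, assuming a made-up joint distribution over two binary variables:

import math

# Made-up joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1,
         (1, 0): 0.1, (1, 1): 0.4}

# Marginals p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# I(X;Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
mi = sum(p * math.log2(p / (px[x] * py[y]))
         for (x, y), p in joint.items() if p > 0)
print(f"I(X;Y) = {mi:.3f} bits")  # ~0.278 bits: moderate statistical dependence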
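For reference, the chain rule and conditioning bound covered in the entropy-bounds lecture, in standard notation:

\[
  H(X, Y) = H(X) + H(Y \mid X) \le H(X) + H(Y),
\]

with equality on the right exactly when X and Y are independent, i.e. conditioning cannot increase entropy.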
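A sketch of the Shannon-Fano construction: sort symbols by decreasing probability, cut the list where the two halves' total probabilities are closest, and recurse with 0/1 prefixes. The four-symbol alphabet is illustrative only.

def shannon_fano(symbols):
    """symbols: list of (symbol, probability) pairs -> {symbol: code}."""
    symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)

    def split(items):
        if len(items) == 1:
            return {items[0][0]: ""}
        total = sum(p for _, p in items)
        # Cut where the left half's mass is closest to total / 2.
        best_cut, best_diff, acc = 1, float("inf"), 0.0
        for i, (_, p) in enumerate(items[:-1], start=1):
            acc += p
            if abs(acc - total / 2) < best_diff:
                best_cut, best_diff = i, abs(acc - total / 2)
        codes = {s: "0" + c for s, c in split(items[:best_cut]).items()}
        codes.update({s: "1" + c for s, c in split(items[best_cut:]).items()})
        return codes

    return split(symbols)

print(shannon_fano([("a", 0.4), ("b", 0.3), ("c", 0.2), ("d", 0.1)]))
# {'a': '0', 'b': '10', 'c': '110', 'd': '111'}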
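A sketch of Huffman coding using Python's heapq: repeatedly merge the two lightest subtrees, prefixing their codewords with 0 and 1. The frequency table is a standard textbook-style illustration, not the lecture's data.

import heapq
from itertools import count

def huffman(freqs):
    """freqs: {symbol: frequency} -> {symbol: prefix-free binary code}."""
    tick = count()  # tie-breaker so heapq never compares the code dicts
    heap = [(f, next(tick), {s: ""}) for s, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two lightest subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, next(tick), merged))
    return heap[0][2]

print(huffman({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))
# Frequent symbols get short codes, rare symbols long ones.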
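The expected-value lecture's core formula, E[X] = sum over x of x * p(x), worked for a fair six-sided die (chosen here as a standard illustration):

# E[X] = sum of x * p(x) over all outcomes x.
die = {x: 1 / 6 for x in range(1, 7)}  # fair six-sided die
expected = sum(x * p for x, p in die.items())
print(expected)  # 3.5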
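A sketch of the entropy definition that the data-compression lecture builds on: H(X) = -sum of p(x) log2 p(x), the bits-per-symbol floor for lossless compression of a memoryless source. Both distributions are illustrative.

import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits/symbol: uniform is hardest to compress
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357 bits/symbol: skew helps compression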
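Finally, the binary entropy function from the basics lecture, h(p) = -p log2 p - (1 - p) log2 (1 - p): the entropy of a biased coin, peaking at one bit when p = 0.5.

import math

def h(p):
    """Binary entropy in bits; h(0) = h(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.25, 0.5):
    print(f"h({p}) = {h(p):.4f}")  # 0.4690, 0.8113, 1.0000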