Information Theory: Basics and Applications
Related lectures (28)
Variational Formulation: Information Measures
Explores a variational formulation for measuring information content and divergence between probability distributions.
Information Theory: Review and Mutual Information
Reviews information measures like entropy and introduces mutual information as a measure of information between random variables.
Information in Networked Systems: Functional Representation and Data Compression
Explores traditional information theory, data compression, data transmission, and functional representation lemmas in networked systems.
Data Compression and Shannon-Fano Algorithm
Explores the Shannon-Fano algorithm for data compression and its efficiency in creating unique binary codes for letters.
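The Shannon-Fano construction named in this entry can be sketched in a few lines: sort symbols by probability, then recursively split the list into two halves of near-equal total probability, prefixing `0` to one half and `1` to the other. This is a minimal illustration, not code from the lecture:

```python
from typing import Dict, List, Tuple


def shannon_fano(symbols: List[Tuple[str, float]]) -> Dict[str, str]:
    """Assign binary codewords to (symbol, probability) pairs,
    assumed sorted by decreasing probability.

    A single symbol gets the empty codeword; callers pass the
    whole alphabet, so every leaf ends up with a nonempty code.
    """
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # Find the split point that best balances the two halves.
    acc, split, best = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):
        acc += symbols[i - 1][1]
        diff = abs(total - 2 * acc)
        if diff < best:
            best, split = diff, i
    codes: Dict[str, str] = {}
    for s, c in shannon_fano(symbols[:split]).items():
        codes[s] = "0" + c   # left half gets prefix 0
    for s, c in shannon_fano(symbols[split:]).items():
        codes[s] = "1" + c   # right half gets prefix 1
    return codes
```

On the textbook alphabet `a:0.4, b:0.3, c:0.2, d:0.1` this yields the codewords `0, 10, 110, 111` — more probable letters get shorter codes, the property the description alludes to.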
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and data processing inequality, along with probability kernels and mutual information.
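One of the measures listed above, the Kullback-Leibler divergence, has a short closed form: D(P‖Q) = Σ p(x) log₂(p(x)/q(x)). A minimal sketch (assuming distributions are given as aligned probability lists, which is my convention, not the lecture's):

```python
import math
from typing import Sequence


def kl_divergence(p: Sequence[float], q: Sequence[float]) -> float:
    """D(P||Q) in bits; infinite when q(x) = 0 but p(x) > 0."""
    d = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue          # 0 * log(0/q) is taken as 0
        if qi == 0:
            return math.inf   # P puts mass where Q puts none
        d += pi * math.log2(pi / qi)
    return d
```

D(P‖Q) is zero exactly when P = Q, and is asymmetric: swapping the arguments generally changes the value.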
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Mutual Information and Entropy
Explores the computation of entropy and of mutual information between random variables.
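The quantities in this entry reduce to a few sums over a joint distribution, via the identity I(X;Y) = H(X) + H(Y) − H(X,Y). A minimal sketch, assuming the joint distribution is given as a `{(x, y): probability}` dict (my convention, not the lecture's):

```python
import math
from collections import Counter
from typing import Dict, Iterable, Tuple


def entropy(probs: Iterable[float]) -> float:
    """H = -sum p log2 p, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)


def mutual_information(joint: Dict[Tuple[int, int], float]) -> float:
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint distribution."""
    px: Counter = Counter()
    py: Counter = Counter()
    for (x, y), p in joint.items():   # marginalize over each axis
        px[x] += p
        py[y] += p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())
```

Two fair, independent bits give I(X;Y) = 0; two perfectly correlated fair bits give I(X;Y) = 1 bit, since observing one determines the other.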
Achievable Rate & Capacity
Explores achievable rate, channel capacity, spectral efficiency, and fading channels in wireless communication systems.
Data Compression and Entropy 2: Entropy as 'Question Game'
Explores entropy as a 'question game' to guess letters efficiently and its relation to data compression.
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
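The defining property in this last entry is easy to check mechanically: a code is prefix-free when no codeword is a prefix of another, and any binary prefix-free code satisfies the Kraft inequality Σ 2^(−len(w)) ≤ 1. A minimal sketch (illustrative helper names, not from the lecture):

```python
from typing import List


def is_prefix_free(codewords: List[str]) -> bool:
    """True if no codeword is a prefix of another.

    After lexicographic sorting, any prefix relation must occur
    between adjacent words, so checking neighbors suffices.
    """
    words = sorted(codewords)
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))


def kraft_sum(codewords: List[str]) -> float:
    """Kraft inequality: prefix-free binary codes satisfy sum <= 1."""
    return sum(2.0 ** -len(w) for w in codewords)
```

The code `{0, 10, 110, 111}` is prefix-free with Kraft sum exactly 1 (it is "complete"), whereas `{0, 01}` fails the check because `0` is a prefix of `01` — which is why a prefix-free code can be decoded symbol by symbol without separators.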