Information Theory and Coding
Related lectures (30)
Source Coding and Prefix-Free Codes
Covers source coding, injective codes, prefix-free codes, and Kraft's inequality.
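Prefix-free codes and Kraft's inequality lend themselves to a quick check in code. The sketch below verifies both properties for a small binary code; the code words are chosen here for illustration and are not taken from the lecture.

```python
# Kraft's inequality: any prefix-free binary code with word lengths
# l_1..l_n satisfies sum(2**-l_i) <= 1. Toy code words for illustration.
codewords = ["0", "10", "110", "111"]

def is_prefix_free(words):
    # No code word may be a proper prefix of another.
    return not any(a != b and b.startswith(a) for a in words for b in words)

kraft_sum = sum(2 ** -len(w) for w in codewords)

assert is_prefix_free(codewords)
assert kraft_sum <= 1  # here the sum is exactly 1: the code is complete
```

A Kraft sum of exactly 1, as here, means the code is complete: no additional word can be added without violating the prefix-free property.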
Information Theory: Source Coding & Channel Coding
Covers the fundamentals of information theory, focusing on source coding and channel coding.
Variational Formulation: Information Measures
Explores variational formulation for measuring information content and divergence between probability distributions.
Mutual Information and Entropy
Explores mutual information and entropy calculation between random variables.
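The identity I(X;Y) = H(X) + H(Y) - H(X,Y) makes mutual information straightforward to compute from a joint distribution. A minimal sketch, using a toy perfectly-correlated pair chosen for illustration:

```python
from math import log2
from collections import defaultdict

# Toy joint distribution p(x, y): X and Y are perfectly correlated.
joint = {(0, 0): 0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.5}

def entropy(probs):
    # H = -sum p log2 p, with the convention 0 log 0 = 0.
    return -sum(p * log2(p) for p in probs if p > 0)

# Marginalize to get p(x) and p(y).
px, py = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    px[x] += p
    py[y] += p

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = entropy(px.values()) + entropy(py.values()) - entropy(joint.values())
# For this perfectly correlated pair, I(X;Y) = H(X) = 1 bit
```

Replacing the joint table with an independent one (all four cells 0.25) drives the mutual information to zero, matching the intuition that independent variables share no information.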
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and data processing inequality, along with probability kernels and mutual information.
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
Information Theory and Coding
Covers expected code word length, Huffman procedure, and entropy in coding theory.
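The relationship between expected code word length and entropy can be illustrated with a small Huffman construction. The sketch below (source probabilities chosen for the example) builds an optimal prefix-free code with a heap and compares the expected length L against the entropy H, which satisfy H <= L < H + 1:

```python
import heapq
from math import log2

# Toy dyadic source: all probabilities are powers of 2,
# so Huffman coding achieves the entropy exactly.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Heap entries: (probability, tiebreak counter, partial code table).
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    # Merge the two least probable subtrees, prepending 0/1 bits.
    p1, _, c1 = heapq.heappop(heap)
    p2, _, c2 = heapq.heappop(heap)
    merged = {s: "0" + w for s, w in c1.items()}
    merged.update({s: "1" + w for s, w in c2.items()})
    heapq.heappush(heap, (p1 + p2, counter, merged))
    counter += 1
code = heap[0][2]

L = sum(probs[s] * len(w) for s, w in code.items())   # expected length
H = -sum(p * log2(p) for p in probs.values())          # source entropy
# For this dyadic source, L equals H exactly (1.75 bits)
```

For non-dyadic sources the Huffman code is still optimal among symbol codes, but L strictly exceeds H.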
Information Theory: Channel Capacity and Convex Functions
Explores channel capacity and convex functions in information theory, emphasizing the importance of convexity.
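A standard concrete instance of channel capacity is the binary symmetric channel, whose capacity C = 1 - h(p) follows from the concavity of mutual information in the input distribution. A minimal sketch (the crossover probabilities below are arbitrary illustration values):

```python
from math import log2

def binary_entropy(p):
    # h(p) = -p log2 p - (1-p) log2 (1-p), with h(0) = h(1) = 0.
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    # Capacity of the binary symmetric channel with crossover probability p.
    return 1 - binary_entropy(p)

# Capacity is 1 bit for a noiseless channel (p = 0)
# and 0 for a useless one (p = 0.5).
```

The concavity of h explains why capacity degrades fastest near p = 0: a small amount of noise on a clean channel costs much more capacity than the same increment added to an already noisy one.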
Information Theory: Review and Mutual Information
Reviews information measures like entropy and introduces mutual information as a measure of information between random variables.