Advanced Information Theory: Random Binning
Related lectures (27)
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Compression: Kraft Inequality
Explains compression and the Kraft inequality for codes and sequences (a brief Kraft-inequality check is sketched after this list).
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy (an entropy and Huffman-length sketch follows this list).
Entropy and Algorithms: Applications in Sorting and Weighing
Covers the application of entropy in algorithms, focusing on sorting and decision-making strategies.
Variational Formulation: Information Measures
Explores variational formulation for measuring information content and divergence between probability distributions.
Coding Theorem: Proof and Properties
Covers the proof and properties of the coding theorem, focusing on optimizing the codeword lengths lx and the achievable rate.
Uniform Integrability and Convergence
Explores uniform integrability, convergence theorems, and the importance of bounded sequences in understanding the convergence of random variables.
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
Information Theory: Source Coding, Cryptography, Channel Coding
Covers source coding, cryptography, and channel coding in communication systems, exploring entropy, codes, error channels, and future related courses.
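
As a brief illustration of the prefix-free codes and Kraft inequality mentioned in the lectures above, here is a minimal Python sketch; the codeword set and function names are hypothetical and not taken from the course material. It checks whether a binary code is prefix-free and whether its lengths satisfy the Kraft inequality (sum of D^(-l) at most 1).

```python
# Hypothetical illustration of the Kraft inequality and the prefix-free property.

def kraft_sum(lengths, alphabet_size=2):
    """Sum of D^(-l) over codeword lengths; being <= 1 is necessary for a prefix-free code."""
    return sum(alphabet_size ** (-l) for l in lengths)

def is_prefix_free(codewords):
    """A code is prefix-free if no codeword is a prefix of another codeword."""
    return not any(a != b and b.startswith(a) for a in codewords for b in codewords)

code = ["0", "10", "110", "111"]          # hypothetical binary codeword set
print(kraft_sum([len(c) for c in code]))  # 1.0, so the inequality holds with equality
print(is_prefix_free(code))               # True
```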
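Similarly, the entropy and Huffman coding entry can be illustrated with a short sketch, assuming a hypothetical source distribution: it computes the Shannon entropy H(X) = -sum p log2(p) and the expected codeword length produced by Huffman's algorithm, which here meets the entropy bound exactly because the probabilities are dyadic.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy in bits, a lower bound on the expected codeword length."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths produced by binary Huffman coding (lengths only, no codewords)."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:            # every merge adds one bit to each symbol involved
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

p = [0.5, 0.25, 0.125, 0.125]        # hypothetical source distribution
print(entropy(p))                    # 1.75 bits
print(sum(pi * l for pi, l in zip(p, huffman_lengths(p))))  # 1.75, matches the entropy here
```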