Lecture
Information Theory and Coding
Related lectures (30)
Information Theory and Coding
Covers source coding, Kraft's inequality, mutual information, the Huffman procedure, and properties of typical sequences.
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Source Coding and Prefix-Free Codes
Covers source coding, injective codes, prefix-free codes, and Kraft's inequality.
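Kraft's inequality, mentioned in this lecture, says a prefix-free binary code with codeword lengths l_1, …, l_n exists if and only if the lengths satisfy sum(2^-l_i) ≤ 1. A minimal sketch of that check (the function name and example lengths are illustrative, not from the lecture):

```python
def kraft_sum(lengths, arity=2):
    """Kraft sum for the given codeword lengths: sum(arity ** -l).
    A prefix-free code with these lengths exists iff the sum is <= 1."""
    return sum(arity ** -l for l in lengths)

# Lengths of the prefix-free code {0, 10, 110, 111}: the sum is exactly 1,
# so the inequality holds with equality (a "complete" code).
print(kraft_sum([1, 2, 3, 3]))  # 1.0

# Three codewords of length 1 in a binary alphabet are impossible:
print(kraft_sum([1, 1, 1]))  # 1.5 > 1, no such prefix-free code exists
```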
Information Theory: Source Coding, Cryptography, Channel Coding
Covers source coding, cryptography, and channel coding in communication systems, exploring entropy, codes, error channels, and future related courses.
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
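The prefix-free property this lecture covers is easy to test directly: no codeword may be a prefix of another. A small sketch (a standard sort-and-compare-neighbours check; the helper name is ours):

```python
def is_prefix_free(codewords):
    """True iff no codeword is a prefix of another.
    After lexicographic sorting, any prefix pair must be adjacent,
    so checking neighbouring pairs suffices."""
    words = sorted(codewords)
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))

print(is_prefix_free(["0", "10", "110", "111"]))  # True: instantaneously decodable
print(is_prefix_free(["0", "01", "11"]))          # False: "0" is a prefix of "01"
```

Prefix-free codes are a strict subset of uniquely decodable codes, but they decode symbol by symbol with no lookahead, which is why they dominate in practice.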
Information Theory: Channel Capacity and Convex Functions
Explores channel capacity and convex functions in information theory, emphasizing the importance of convexity.
Information Theory: Entropy and Information Processing
Explores entropy in information theory and its role in data processing and probability distributions.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy.
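The Huffman procedure discussed here builds an optimal prefix-free code by repeatedly merging the two least-probable groups. A minimal sketch that returns only the optimal codeword lengths (the tie-breaking counter and example distribution are our choices, not the lecture's):

```python
import heapq

def huffman_lengths(freqs):
    """Optimal binary codeword lengths via Huffman's procedure.
    freqs: dict mapping symbol -> probability (or weight).
    Each merge of two groups deepens every symbol in them by one level."""
    heap = [(w, i, [s]) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    lengths = dict.fromkeys(freqs, 0)
    tie = len(heap)  # tie-breaker so tuples never compare lists
    while len(heap) > 1:
        w1, _, grp1 = heapq.heappop(heap)
        w2, _, grp2 = heapq.heappop(heap)
        for s in grp1 + grp2:
            lengths[s] += 1
        heapq.heappush(heap, (w1 + w2, tie, grp1 + grp2))
        tie += 1
    return lengths

# Dyadic distribution: the Huffman lengths match -log2(p) exactly,
# so the average length equals the entropy (1.75 bits here).
print(huffman_lengths({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# {'a': 1, 'b': 2, 'c': 3, 'd': 3}
```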
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
Information Theory Basics
Introduces information theory basics, including entropy, independence, and binary entropy function.
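The binary entropy function named in this last lecture, h(p) = -p·log2(p) - (1-p)·log2(1-p), measures the uncertainty of a biased coin flip. A minimal sketch (function name ours; the convention h(0) = h(1) = 0 follows from the limit p·log p → 0):

```python
from math import log2

def binary_entropy(p):
    """Binary entropy h(p) in bits, with the convention h(0) = h(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(binary_entropy(0.5))  # 1.0 bit: a fair coin is maximally uncertain
print(binary_entropy(0.0))  # 0.0 bits: a deterministic outcome carries no surprise
```

h is concave, symmetric about p = 1/2, and peaks there at 1 bit, which is why the fair coin is the unit of information.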