This lecture covers the basics of binary codes, including definitions, examples, and the notion of prefix-free binary codes. It then introduces Shannon's Theorem: for any prefix-free binary code used to represent a given sequence, the entropy of the sequence is a lower bound on the average code length. The lecture also explains Kraft's inequality and gives a detailed proof of Shannon's Theorem, making explicit the relationship between entropy and average code length. Further inequalities linking entropy and code length are discussed, highlighting the limits of lossless data compression.
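
As a minimal illustration of the quantities involved (not taken from the lecture), the Python sketch below checks Kraft's inequality for a hypothetical prefix-free code and compares the entropy of an assumed symbol distribution with the average code length; the codewords and probabilities are assumptions chosen only for the example.

import math

# Hypothetical prefix-free code over symbols a, b, c, d (assumed for illustration).
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

# Assumed symbol probabilities for the example.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Kraft's inequality: for any prefix-free binary code, sum over codewords of 2^(-length) <= 1.
kraft_sum = sum(2 ** -len(w) for w in code.values())
assert kraft_sum <= 1, "codeword lengths violate Kraft's inequality"

# Entropy H = -sum p log2 p, and average code length L = sum p * length.
entropy = -sum(p * math.log2(p) for p in probs.values() if p > 0)
avg_length = sum(probs[s] * len(code[s]) for s in code)

# Shannon's bound for prefix-free codes: H <= L.
print(f"Kraft sum      = {kraft_sum:.3f}")
print(f"Entropy H      = {entropy:.3f} bits/symbol")
print(f"Average length = {avg_length:.3f} bits/symbol")

For this particular choice of probabilities the code is optimal, so entropy and average length coincide at 1.75 bits per symbol; for any other prefix-free code the average length can only be larger.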