This lecture introduces entropy as the average number of yes/no questions needed to guess a randomly chosen letter in a sequence, giving both an intuitive approximation and a formal definition. It examines how entropy depends on the underlying probability distribution, stressing the importance of the formal definition and its application to settings such as stochastic processes. The lecture also extends the definition from the probabilities of individual letters to the probabilities of whole sequences of letters, underscoring the enduring relevance of the entropy formula in information theory.
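As a minimal illustration of the formal definition discussed in the lecture, the sketch below computes the Shannon entropy H = -Σ p_i log2 p_i for a letter distribution; the four-letter distribution used here is hypothetical, chosen only to make the "average number of yes/no questions" interpretation concrete.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: the average number of yes/no
    questions needed to identify one outcome drawn from `probs`."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical letter probabilities (not taken from the lecture).
letters = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

h = entropy(letters.values())
print(f"H = {h} bits per letter")  # H = 1.75 bits per letter
# A uniform distribution over the same four letters would give
# log2(4) = 2 bits, the maximum possible for four outcomes.
```

Under this distribution an optimal questioning strategy ("is it a?", then "is it b?", and so on) asks 1.75 questions on average, matching the entropy; extending the same formula from single letters to blocks of letters is what leads to the entropy of a stochastic process mentioned above.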