This lecture reviews information measures, focusing on entropy and mutual information. It begins by explaining how information is modeled using tools from probability theory, then shows how fundamental limits can be expressed through information measures, that is, as functions of probability distributions, with lossless compression as an example. The lecture then examines entropy, showing that the number of bits needed to represent a discrete random variable is lower-bounded by its entropy. Finally, it introduces mutual information as a measure of the amount of information one random variable contains about another.
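
As a concrete illustration of these two quantities, the following is a minimal sketch (not from the lecture itself) that computes entropy and mutual information in bits for a small, arbitrarily chosen joint distribution; the distribution values and variable names are purely illustrative.

```python
import math

# Illustrative joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Marginal distributions p(x) and p(y), obtained by summing out the other variable.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

def entropy(dist):
    """Shannon entropy H = -sum_p p * log2(p), in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Mutual information I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) ).
mutual_information = sum(
    p * math.log2(p / (p_x[x] * p_y[y]))
    for (x, y), p in p_xy.items() if p > 0
)

print(f"H(X)   = {entropy(p_x):.3f} bits")          # bits needed on average for X
print(f"H(Y)   = {entropy(p_y):.3f} bits")
print(f"I(X;Y) = {mutual_information:.3f} bits")    # information X carries about Y
```

For this toy distribution the marginals are uniform, so H(X) = H(Y) = 1 bit, while I(X;Y) is about 0.278 bits, reflecting the partial dependence between X and Y.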