Discusses entropy, isentropic transformations, and the Clausius and Kelvin prohibitions (the two classical statements of the second law), as well as the Carnot cycle and the efficiency of heat engines operating between two heat sources.
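As a brief illustrative sketch (notation assumed here, not taken from the source), the quantities named above are usually written as

\[
  dS = \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  \oint \frac{\delta Q}{T} \le 0 \quad \text{(Clausius inequality)},
  \qquad
  \eta_{\mathrm{Carnot}} = 1 - \frac{T_c}{T_h},
\]

where $T_h$ and $T_c$ denote the temperatures of the hot and cold sources; a reversible engine operating between two sources attains the Carnot efficiency, and no engine can exceed it.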
Covers information-theoretic measures such as entropy, Kullback-Leibler divergence, and mutual information, along with probability kernels and the data processing inequality.
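As a hedged sketch of the standard definitions (discrete notation assumed, not drawn from the source),

\[
  D_{\mathrm{KL}}(P \,\|\, Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)},
  \qquad
  I(X;Y) = D_{\mathrm{KL}}\!\left(P_{XY} \,\|\, P_X \otimes P_Y\right),
\]

and if $Z$ is obtained from $Y$ through a probability kernel, so that $X \to Y \to Z$ forms a Markov chain, the data processing inequality gives $I(X;Z) \le I(X;Y)$.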