Summary
In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known. Here, information is measured in shannons, nats, or hartleys. The entropy of $Y$ conditioned on $X$ is written as $H(Y \mid X)$.

The conditional entropy of $Y$ given $X$ is defined as
$H(Y \mid X) = -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x,y) \log \frac{p(x,y)}{p(x)},$
where $\mathcal{X}$ and $\mathcal{Y}$ denote the support sets of $X$ and $Y$.

Note: Here, the convention is that the expression $0 \log 0$ should be treated as being equal to zero. This is because $\lim_{\theta \to 0^{+}} \theta \log \theta = 0$.

Intuitively, notice that by definition of expected value and of conditional probability, $H(Y \mid X)$ can be written as $H(Y \mid X) = \mathbb{E}[f(X,Y)]$, where $f$ is defined as $f(x,y) := -\log \frac{p(x,y)}{p(x)} = -\log p(y \mid x)$. One can think of $f$ as associating each pair $(x, y)$ with a quantity measuring the information content of $(Y = y)$ given $(X = x)$. This quantity is directly related to the amount of information needed to describe the event $(Y = y)$ given $(X = x)$. Hence, by computing the expected value of $f$ over all pairs of values $(x, y) \in \mathcal{X} \times \mathcal{Y}$, the conditional entropy $H(Y \mid X)$ measures how much information, on average, is still needed to describe $Y$ once $X$ is known.

Let $H(Y \mid X = x)$ be the entropy of the discrete random variable $Y$ conditioned on the discrete random variable $X$ taking a certain value $x$. Denote the support sets of $X$ and $Y$ by $\mathcal{X}$ and $\mathcal{Y}$, and let $Y$ have probability mass function $p_Y(y)$. The unconditional entropy of $Y$ is calculated as $H(Y) := \mathbb{E}[\operatorname{I}(Y)]$, i.e.
$H(Y) = \sum_{y \in \mathcal{Y}} \Pr(Y = y)\, \operatorname{I}(y) = -\sum_{y \in \mathcal{Y}} p_Y(y) \log_2 p_Y(y),$
where $\operatorname{I}(y)$ is the information content of the outcome of $Y$ taking the value $y$. The entropy of $Y$ conditioned on $X$ taking the value $x$ is defined analogously by conditional expectation:
$H(Y \mid X = x) = -\sum_{y \in \mathcal{Y}} \Pr(Y = y \mid X = x) \log_2 \Pr(Y = y \mid X = x).$

Note that $H(Y \mid X)$ is the result of averaging $H(Y \mid X = x)$ over all possible values $x$ that $X$ may take. Also, if the above sum is taken over a sample $y_1, \dots, y_n$, the expected value $E_X[H(y_1, \dots, y_n \mid X = x)]$ is known in some domains as equivocation.

Given discrete random variables $X$ with image $\mathcal{X}$ and $Y$ with image $\mathcal{Y}$, the conditional entropy of $Y$ given $X$ is defined as the weighted sum of $H(Y \mid X = x)$ for each possible value of $x$, using $p(x)$ as the weights:
$H(Y \mid X) \equiv \sum_{x \in \mathcal{X}} p(x)\, H(Y \mid X = x) = -\sum_{x \in \mathcal{X}} p(x) \sum_{y \in \mathcal{Y}} p(y \mid x) \log p(y \mid x) = -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x,y) \log p(y \mid x).$

$H(Y \mid X) = 0$ if and only if the value of $Y$ is completely determined by the value of $X$. Conversely, $H(Y \mid X) = H(Y)$ if and only if $Y$ and $X$ are independent random variables.

Assume that the combined system determined by two random variables $X$ and $Y$ has joint entropy $H(X,Y)$, that is, we need $H(X,Y)$ bits of information on average to describe its exact state.
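As a concrete illustration of the definitions above, here is a minimal Python sketch that computes $H(Y)$, $H(Y \mid X = x)$, and $H(Y \mid X)$ for two binary random variables; the joint probabilities p_xy are hypothetical values chosen only for this example.

import math

# Hypothetical joint distribution p(x, y) for binary X and Y
# (illustrative values only; they sum to 1).
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

def entropy(probs):
    """Shannon entropy in bits, using the convention 0 * log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginals p(x) and p(y).
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

# Unconditional entropy H(Y).
H_Y = entropy(p_y.values())

# H(Y | X = x): entropy of the conditional pmf p(y | x) = p(x, y) / p(x).
H_Y_given_x = {x: entropy([p_xy[(x, y)] / p_x[x] for y in (0, 1)]) for x in (0, 1)}

# Conditional entropy H(Y | X): average of H(Y | X = x) weighted by p(x).
H_Y_given_X = sum(p_x[x] * H_Y_given_x[x] for x in (0, 1))

print(f"H(Y)   = {H_Y:.3f} bits")        # about 0.971
print(f"H(Y|X) = {H_Y_given_X:.3f} bits")  # about 0.846, never exceeds H(Y)

Running the sketch gives $H(Y) \approx 0.971$ bits and $H(Y \mid X) \approx 0.846$ bits, consistent with the fact that conditioning on $X$ never increases the entropy of $Y$.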
About this result
This page is automatically generated and may contain information that is not correct, complete, up to date, or relevant to your search. The same applies to all other pages on this site. Please make sure to verify the information with EPFL's official sources.
Related publications (5)

Differential Entropy of the Conditional Expectation Under Additive Gaussian Noise

Michael Christoph Gastpar, Alper Köse, Ahmet Arda Atalik

The conditional mean is a fundamental and important quantity whose applications include the theories of estimation and rate-distortion. It is also notoriously difficult to work with. This paper establishes …
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2022

Differential Entropy of the Conditional Expectation under Gaussian Noise

Michael Christoph Gastpar, Alper Köse, Ahmet Arda Atalik

This paper considers an additive Gaussian noise channel with arbitrarily distributed finite variance input signals. It studies the differential entropy of the minimum mean-square error (MMSE) estimator …
IEEE, 2021

Orthogonality of Two-Dimensional Separations Based on Conditional Entropy

Hubert Girault, Mohammad Reza Pourhaghighi, Mohammad Karzand

A new approach to assess the orthogonality of two-dimensional (2-D) separation systems based on conditional entropy is developed. It considers the quantitative distribution of peaks in the entire separation …
American Chemical Society, 2011
Related people (2)
Related concepts (12)
Differential entropy
Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions. Unfortunately, Shannon did not derive this formula, and rather just assumed it was the correct continuous analogue of discrete entropy, but it is not. The actual continuous version of discrete entropy is the limiting density of discrete points (LDDP).
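For reference, the formula in question is the standard definition of differential entropy: for a continuous random variable $X$ with probability density $f$ supported on $\mathcal{X}$,
$h(X) = -\int_{\mathcal{X}} f(x) \log f(x)\, dx.$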
Conditional entropy
In information theory, the conditional entropy describes the amount of information needed to know the behaviour of a random variable $Y$ when another random variable $X$ is known exactly. The conditional entropy of $Y$ given $X$ is written $H(Y \mid X)$; it is also sometimes called the entropy of $Y$ conditioned on $X$. Like the other entropies, it is usually measured in bits. Conditional entropy can be introduced in several ways, either directly from the conditional probabilities or via the joint entropy.
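Written out, the two introductions mentioned above give the same quantity (in the notation of the summary above): directly from the conditional probabilities,
$H(Y \mid X) = \sum_{x \in \mathcal{X}} p(x)\, H(Y \mid X = x) = -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x,y) \log p(y \mid x),$
and, via the joint entropy, the chain rule
$H(Y \mid X) = H(X,Y) - H(X).$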
Joint entropy
In information theory, the joint entropy is a measure of the amount of information contained in a system of two (or more) random variables. Like the other entropies, joint entropy is measured in bits or nats, depending on the base of the logarithm used. If each pair of possible states $(x, y)$ of the random variables $X$ and $Y$ has probability $p(x,y)$, then the joint entropy of $X$ and $Y$ is defined by
$H(X,Y) = -\sum_{x}\sum_{y} p(x,y) \log_2 p(x,y),$
where $\log_2$ is the base-2 logarithm.
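As a quick worked check with the hypothetical joint distribution used in the Python sketch above ($p(0,0)=0.4$, $p(0,1)=0.1$, $p(1,0)=0.2$, $p(1,1)=0.3$):
$H(X,Y) = -(0.4 \log_2 0.4 + 0.1 \log_2 0.1 + 0.2 \log_2 0.2 + 0.3 \log_2 0.3) \approx 1.846 \text{ bits},$
which indeed equals $H(X) + H(Y \mid X) \approx 1 + 0.846$ bits, as the chain rule requires.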
Related courses (7)
COM-621: Advanced Topics in Information Theory
The class will focus on information-theoretic progress of the last decade. Topics include: Network Information Theory; Information Measures: definitions, properties, and applications to probabilistic …
BIO-369: Randomness and information in biological data
Biology is becoming more and more a data science, as illustrated by the explosion of available genome sequences. This course aims to show how we can make sense of such data and harness it in order to …
COM-102: Advanced information, computation, communication II
Text, sound, and images are examples of information sources stored in our computers and/or communicated over the Internet. How do we measure, compress, and protect the information they contain?
Related lectures (58)
Probability theory: Joint marginals and Granger causality
Covers joint marginals and Granger causality in probability theory, explaining their implications for predicting outcomes.
Conditional entropy: Huffman coding
Explores conditional entropy and Huffman coding for efficient data compression techniques.
Quantitative information
Explores the CHSH operator, self-testing, eigenstates, and randomness quantification in quantum systems.