Summary
In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known. Here, information is measured in shannons, nats, or hartleys. The entropy of $Y$ conditioned on $X$ is written as $H(Y \mid X)$.

The conditional entropy of $Y$ given $X$ is defined as
\[ H(Y \mid X) = -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x,y)\, \log \frac{p(x,y)}{p(x)}, \]
where $\mathcal{X}$ and $\mathcal{Y}$ denote the support sets of $X$ and $Y$.

Note: Here, the convention is that the expression $0 \log 0$ should be treated as being equal to zero. This is because $\lim_{\theta \to 0^{+}} \theta \log \theta = 0$.

Intuitively, notice that by definition of expected value and of conditional probability, $H(Y \mid X)$ can be written as $H(Y \mid X) = \mathbb{E}[f(X,Y)]$, where $f$ is defined as $f(x,y) := -\log p(y \mid x)$. One can think of $f$ as associating each pair $(x,y)$ with a quantity measuring the information content of $(Y=y)$ given $(X=x)$. This quantity is directly related to the amount of information needed to describe the event $(Y=y)$ given $(X=x)$. Hence by computing the expected value of $f$ over all pairs of values $(x,y) \in \mathcal{X} \times \mathcal{Y}$, the conditional entropy measures how much information, on average, the variable $X$ encodes about $Y$.

Let $H(Y \mid X=x)$ be the entropy of the discrete random variable $Y$ conditioned on the discrete random variable $X$ taking a certain value $x$. Denote the support sets of $X$ and $Y$ by $\mathcal{X}$ and $\mathcal{Y}$. Let $Y$ have probability mass function $p_Y(y)$. The unconditional entropy of $Y$ is calculated as $H(Y) := \mathbb{E}[\operatorname{I}(Y)]$, i.e.
\[ H(Y) = \sum_{y \in \mathcal{Y}} \Pr(Y=y)\,\operatorname{I}(y) = -\sum_{y \in \mathcal{Y}} p_Y(y) \log_2 p_Y(y), \]
where $\operatorname{I}(y)$ is the information content of the outcome of $Y$ taking the value $y$. The entropy of $Y$ conditioned on $X$ taking the value $x$ is defined analogously by conditional expectation:
\[ H(Y \mid X=x) = -\sum_{y \in \mathcal{Y}} \Pr(Y=y \mid X=x) \log_2 \Pr(Y=y \mid X=x). \]

Note that $H(Y \mid X)$ is the result of averaging $H(Y \mid X=x)$ over all possible values $x$ that $X$ may take. Also, if the above sum is taken over a sample $y_1, \dots, y_n$, the expected value $\mathbb{E}_X[H(y_1, \dots, y_n \mid X=x)]$ is known in some domains as equivocation.

Given discrete random variables $X$ with image $\mathcal{X}$ and $Y$ with image $\mathcal{Y}$, the conditional entropy of $Y$ given $X$ is defined as the weighted sum of $H(Y \mid X=x)$ for each possible value of $x$, using $p(x)$ as the weights:
\[ H(Y \mid X) = \sum_{x \in \mathcal{X}} p(x)\, H(Y \mid X=x) = -\sum_{x \in \mathcal{X}} p(x) \sum_{y \in \mathcal{Y}} p(y \mid x) \log_2 p(y \mid x) = -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x,y) \log_2 p(y \mid x). \]

$H(Y \mid X) = 0$ if and only if the value of $Y$ is completely determined by the value of $X$. Conversely, $H(Y \mid X) = H(Y)$ if and only if $Y$ and $X$ are independent random variables.

Assume that the combined system determined by two random variables $X$ and $Y$ has joint entropy $H(X,Y)$, that is, we need $H(X,Y)$ bits of information on average to describe its exact state.
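As an illustration of these definitions, the following minimal sketch (hypothetical function names, NumPy assumed, joint distribution chosen arbitrarily) computes $H(Y \mid X)$ in bits from a joint pmf using the $0 \log 0 = 0$ convention, and numerically checks the two extreme cases stated above as well as the standard chain rule $H(X,Y) = H(X) + H(Y \mid X)$, which the final paragraph alludes to.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a 1-D pmf, with the convention 0 log 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def conditional_entropy(p_xy):
    """H(Y|X) in bits for a joint pmf given as a 2-D array p_xy[x, y].

    Implements H(Y|X) = -sum_{x,y} p(x,y) log2( p(x,y) / p(x) ),
    summing only over positive entries so that 0 log 0 contributes 0.
    """
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = np.broadcast_to(p_xy.sum(axis=1, keepdims=True), p_xy.shape)  # marginal p(x)
    mask = p_xy > 0
    return -np.sum(p_xy[mask] * np.log2(p_xy[mask] / p_x[mask]))

# Hypothetical joint distribution p(x, y) over a 2x2 alphabet.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

H_Y_given_X = conditional_entropy(p_xy)
H_XY = entropy(p_xy.ravel())          # joint entropy H(X, Y)
H_X = entropy(p_xy.sum(axis=1))       # marginal entropy H(X)

# Chain rule: H(Y|X) = H(X,Y) - H(X)
assert np.isclose(H_Y_given_X, H_XY - H_X)

# Y completely determined by X  ->  H(Y|X) = 0
assert np.isclose(conditional_entropy([[0.5, 0.0], [0.0, 0.5]]), 0.0)

# X and Y independent  ->  H(Y|X) = H(Y)
p_indep = np.outer([0.3, 0.7], [0.5, 0.5])
assert np.isclose(conditional_entropy(p_indep), entropy([0.5, 0.5]))
```

Averaging the per-value entropies, $\sum_x p(x) H(Y \mid X=x)$, would give the same result as the single masked sum used here; the joint-pmf form is used only because it handles the $0 \log 0$ convention in one place.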