Summary
In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in the process. For a stochastic process $X = \{X_n\}$ with a countable index set, the entropy rate is the limit of the joint entropy of the first $n$ members of the process divided by $n$, as $n$ tends to infinity:

$$H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n),$$

when the limit exists. An alternative, related quantity is the limiting conditional entropy:

$$H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1).$$

For strongly stationary stochastic processes, $H(X) = H'(X)$. The entropy rate can be thought of as a general property of stochastic sources; its operational meaning is given by the asymptotic equipartition property. The entropy rate may be used to estimate the complexity of stochastic processes. It appears in diverse applications, from characterizing the complexity of languages and blind source separation to optimizing quantizers and data compression algorithms. For example, a maximum entropy rate criterion may be used for feature selection in machine learning.

Since a stochastic process defined by a Markov chain that is irreducible, aperiodic and positive recurrent has a unique stationary distribution, the entropy rate is independent of the initial distribution. For such a Markov chain defined on a countable number of states, with transition matrix $P$, the entropy rate is given by

$$H(X) = -\sum_{i} \mu_i \sum_{j} P_{ij} \log P_{ij},$$

where $\mu$ is the asymptotic (stationary) distribution of the chain. A simple consequence of the definition is that an i.i.d. stochastic process has an entropy rate equal to the entropy of any individual member of the process: by independence, $H(X_1, \dots, X_n) = n\,H(X_1)$, so the limit is simply $H(X_1)$.
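To make the Markov chain formula concrete, here is a minimal Python sketch (the function name entropy_rate and the two-state transition matrix are illustrative assumptions, not from the source). It recovers the stationary distribution $\mu$ as the left eigenvector of $P$ for eigenvalue 1 and then evaluates $H(X) = -\sum_i \mu_i \sum_j P_{ij} \log_2 P_{ij}$ in bits per step.

import numpy as np

def entropy_rate(P):
    """Entropy rate (bits/step) of a stationary Markov chain with transition matrix P."""
    # Stationary distribution: left eigenvector of P for eigenvalue 1,
    # i.e. an eigenvector of P.T, rescaled to sum to 1.
    eigvals, eigvecs = np.linalg.eig(P.T)
    mu = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    mu = mu / mu.sum()
    # -sum_i mu_i sum_j P_ij log2 P_ij, with the convention 0 * log 0 = 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0.0, P * np.log2(P), 0.0)
    return -np.sum(mu * terms.sum(axis=1))

# Hypothetical two-state chain used purely for illustration.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(entropy_rate(P))  # ~0.553 bits per step

For this example chain the stationary distribution is $\mu = (2/3, 1/3)$, and weighting the two row entropies by $\mu$ gives roughly 0.553 bits per step. Using log base 2 is a convention choice; the natural logarithm would give the rate in nats instead.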
Related lectures (28)
Source Coding Theorem
Explores the source coding theorem, entropy, Huffman coding, and the impact of conditioning on entropy reduction.
Conditional Entropy: The Bit-Flipper Channel
Explores conditional entropy, Huffman coding, and entropy reduction through examples such as the bit-flipper channel.
Information Theory: Source Coding
Covers source coding, typical sequences, stationarity, and efficient coding in information theory.