Rényi entropy
In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions. The Rényi entropy is also important in ecology and statistics as an index of diversity.
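A minimal Python sketch (the function name renyi_entropy and the example distribution are our own, not from the article) showing how a single order parameter recovers the special cases named above:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy of order alpha, in bits (base-2 logarithm)."""
    probs = [p for p in probs if p > 0]
    if alpha == 1:                      # Shannon entropy (limit as alpha -> 1)
        return -sum(p * math.log2(p) for p in probs)
    if math.isinf(alpha):               # min-entropy (limit as alpha -> infinity)
        return -math.log2(max(probs))
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
print(renyi_entropy(p, 0))          # Hartley entropy: log2(4) = 2.0
print(renyi_entropy(p, 1))          # Shannon entropy: 1.75
print(renyi_entropy(p, 2))          # collision entropy
print(renyi_entropy(p, math.inf))   # min-entropy: 1.0
```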
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
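A short Python sketch, assuming a finite joint distribution given as a probability table (the helper name mutual_information is our own), computing I(X;Y) = Σ p(x,y) log2(p(x,y) / (p(x) p(y))):

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint distribution p(x, y),
    given as a 2-D list of probabilities summing to 1."""
    px = [sum(row) for row in joint]        # marginal p(x)
    py = [sum(col) for col in zip(*joint)]  # marginal p(y)
    return sum(pxy * math.log2(pxy / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, pxy in enumerate(row) if pxy > 0)

# Perfectly correlated fair bits share one full bit of information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# Independent fair bits share none.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```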
Gibbs' inequality
In information theory, Gibbs' inequality is a statement about the information entropy of a discrete probability distribution. Several other bounds on the entropy of probability distributions are derived from Gibbs' inequality, including Fano's inequality. It was first presented by J. Willard Gibbs in the 19th century. Suppose that $P = \{p_1, \ldots, p_n\}$ is a discrete probability distribution. Then for any other probability distribution $Q = \{q_1, \ldots, q_n\}$ the following inequality between positive quantities (since the $p_i$ and $q_i$ are between zero and one) holds:

$$-\sum_{i=1}^{n} p_i \log_2 p_i \;\le\; -\sum_{i=1}^{n} p_i \log_2 q_i,$$

with equality if and only if $p_i = q_i$ for all $i$.
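A quick numeric check of the inequality in Python (the distributions and the helper cross_entropy are our own illustrative choices):

```python
import math

def cross_entropy(p, q):
    """H(P, Q) = -sum_i p_i * log2(q_i); equals the entropy H(P) when Q = P."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
# Gibbs' inequality: the entropy of P never exceeds the cross-entropy of P and Q.
assert cross_entropy(p, p) <= cross_entropy(p, q)
print(cross_entropy(p, p), "<=", cross_entropy(p, q))
```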
Inequality (mathematics)
In mathematics, an inequality is a relation which makes a non-equal comparison between two numbers or other mathematical expressions. It is used most often to compare two numbers on the number line by their size. There are several different notations used to represent different kinds of inequalities: The notation a < b means that a is less than b. The notation a > b means that a is greater than b. In either case, a is not equal to b. These relations are known as strict inequalities, meaning that a is strictly less than or strictly greater than b.
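As a minor illustration (our own example, not from the article), Python's comparison operators mirror this notation directly:

```python
a, b = 3, 5
print(a < b)   # True: a is strictly less than b
print(a > b)   # False: a is not greater than b
print(a != b)  # True: in either strict case, a is not equal to b
```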
Quotient space (linear algebra)
In linear algebra, the quotient of a vector space $V$ by a subspace $N$ is a vector space obtained by "collapsing" $N$ to zero. The space obtained is called a quotient space and is denoted $V/N$ (read "$V$ mod $N$" or "$V$ by $N$"). Formally, the construction is as follows. Let $V$ be a vector space over a field $K$, and let $N$ be a subspace of $V$. We define an equivalence relation $\sim$ on $V$ by stating that $x \sim y$ if $x - y \in N$. That is, $x$ is related to $y$ if one can be obtained from the other by adding an element of $N$.
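A small worked example (our own choice of spaces, not from the article) making the construction concrete:

```latex
% Let V = \mathbb{R}^2 and let N = \{(x, 0) : x \in \mathbb{R}\} be the x-axis.
% Two vectors are equivalent iff they differ by an element of N:
(x_1, y_1) \sim (x_2, y_2) \iff (x_1 - x_2,\; y_1 - y_2) \in N \iff y_1 = y_2.
% Each equivalence class is a horizontal line, determined by its y-coordinate,
% so the quotient space is isomorphic to the real line:
V/N \;\cong\; \mathbb{R}, \qquad [(x, y)] \mapsto y.
```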
Peer support
Peer support occurs when people provide knowledge, experience, emotional, social or practical help to each other. It commonly refers to an initiative consisting of trained supporters (although it can be provided by peers without training), and can take a number of forms such as peer mentoring, reflective listening (reflecting content and/or feelings), or counseling. Peer support is also used to refer to initiatives where colleagues, members of self-help organizations and others meet, in person or online, as equals to give each other connection and support on a reciprocal basis.
Coherence length
In physics, coherence length is the propagation distance over which a coherent wave (e.g. an electromagnetic wave) maintains a specified degree of coherence. Wave interference is strong when the paths taken by all of the interfering waves differ by less than the coherence length. A wave with a longer coherence length is closer to a perfect sinusoidal wave. Coherence length is important in holography and telecommunications engineering. This article focuses on the coherence of classical electromagnetic fields.
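A back-of-the-envelope Python sketch (the example sources and numbers are our own illustrations, and the formula is a common order-of-magnitude approximation whose exact prefactor depends on the spectral line shape): coherence length scales as L_c ~ λ²/Δλ.

```python
def coherence_length(wavelength_m, linewidth_m):
    """Rough coherence length estimate L_c ~ lambda^2 / delta_lambda (metres)."""
    return wavelength_m ** 2 / linewidth_m

# A narrow-linewidth laser near 633 nm with ~1 pm spectral width:
print(coherence_length(633e-9, 1e-12))   # ~0.4 m: strong interference over long paths
# A broadband white-light source near 550 nm with ~300 nm bandwidth:
print(coherence_length(550e-9, 300e-9))  # ~1 micrometre: interference only near zero path difference
```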
Chebyshev's inequality
In probability theory, Chebyshev's inequality (also called the Bienaymé–Chebyshev inequality) guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more than a certain distance from the mean. Specifically, no more than 1/k² of the distribution's values can be k or more standard deviations away from the mean (or equivalently, at least 1 − 1/k² of the distribution's values are less than k standard deviations away from the mean).
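A Monte Carlo check in Python (our own choice of distribution, sample size, and k; any distribution with finite mean and variance would do), comparing the observed tail fraction against the 1/k² bound:

```python
import random

random.seed(0)
n, k = 100_000, 2
mu = sigma = 1.0  # exponential(1) has mean 1 and standard deviation 1
samples = [random.expovariate(1.0) for _ in range(n)]

# Fraction of samples at least k standard deviations from the mean.
frac = sum(abs(x - mu) >= k * sigma for x in samples) / n
print(frac, "<=", 1 / k**2)  # ~0.05 observed, versus the Chebyshev bound 0.25
```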
Present
The present is the period of time that is occurring now. The present is contrasted with the past, the period of time that has already occurred, and the future, the period of time that has yet to occur. It is sometimes represented as a hyperplane in space-time, typically called "now", although modern physics demonstrates that such a hyperplane cannot be defined uniquely for observers in relative motion. The present may also be viewed as a duration.
Hartley function
The Hartley function is a measure of uncertainty, introduced by Ralph Hartley in 1928. If a sample is picked uniformly at random from a finite set $A$, the information revealed after the outcome is known is given by the Hartley function

$$H_0(A) := \log_b |A|,$$

where $|A|$ denotes the cardinality of $A$. If the base of the logarithm is 2, then the unit of uncertainty is the shannon (more commonly known as the bit). If it is the natural logarithm, then the unit is the nat. Hartley used a base-ten logarithm, and with this base the unit of information is called the hartley (also known as the ban or dit) in his honor.
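A one-function Python sketch (the helper name hartley and the example set size are our own) evaluating H_0(A) in the three bases just described:

```python
import math

def hartley(cardinality, base=10):
    """Hartley function H_0(A) = log_b |A| for a finite set of the given size."""
    return math.log(cardinality, base)

n = 1000  # |A| = 1000 equally likely outcomes
print(hartley(n, 2))       # in shannons (bits): ~9.97
print(hartley(n, math.e))  # in nats:            ~6.91
print(hartley(n, 10))      # in hartleys:         3.0
```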