Conditional entropy
In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known. Here, information is measured in shannons, nats, or hartleys. The entropy of $Y$ conditioned on $X$ is written as $H(Y \mid X)$. The conditional entropy of $Y$ given $X$ is defined as
$$H(Y \mid X) = -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x, y) \log \frac{p(x, y)}{p(x)},$$
where $\mathcal{X}$ and $\mathcal{Y}$ denote the support sets of $X$ and $Y$. Note: Here, the convention is that the expression $0 \log 0$ should be treated as being equal to zero. This is because $\lim_{\theta \to 0^{+}} \theta \log \theta = 0$.
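As an illustration (not part of the original article), the definition can be evaluated directly from a joint distribution. The sketch below assumes a small dictionary-based joint $p(x, y)$; the helper name and example values are illustrative.

```python
import math

def conditional_entropy(joint, base=2.0):
    """H(Y|X) = -sum_{x,y} p(x,y) * log(p(x,y)/p(x)), in the given base."""
    # Marginal p(x), obtained by summing the joint distribution over y.
    marginal_x = {}
    for (x, _y), p in joint.items():
        marginal_x[x] = marginal_x.get(x, 0.0) + p
    h = 0.0
    for (x, _y), p in joint.items():
        if p > 0.0:  # the 0 log 0 = 0 convention: zero-probability terms drop out
            h -= p * math.log(p / marginal_x[x], base)
    return h

# X observes Y perfectly, so knowing X leaves no uncertainty: H(Y|X) = 0.
# An independent fair pair would instead give H(Y|X) = H(Y) = 1 bit.
joint = {("heads", "heads"): 0.5, ("tails", "tails"): 0.5}
print(conditional_entropy(joint))  # -> 0.0
```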
Rényi entropy
In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions. The Rényi entropy is important in ecology and statistics as an index of diversity.
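The family is parameterized by an order $\alpha \geq 0$, with $H_{\alpha}(X) = \frac{1}{1-\alpha} \log \sum_{i} p_i^{\alpha}$. A minimal sketch of how the special cases fall out (the function name and example distribution are assumptions for illustration, not from the source):

```python
import math

def renyi_entropy(probs, alpha, base=2.0):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha)."""
    if alpha == 1.0:
        # Shannon entropy, the limit of the formula as alpha -> 1.
        return -sum(p * math.log(p, base) for p in probs if p > 0.0)
    if math.isinf(alpha):
        # Min-entropy, the limit as alpha -> infinity.
        return -math.log(max(probs), base)
    s = sum(p ** alpha for p in probs if p > 0.0)
    return math.log(s, base) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
for a in (0.0, 1.0, 2.0, math.inf):
    # Hartley (~1.585), Shannon (1.5), collision (~1.415), min-entropy (1.0):
    # the values decrease as alpha grows for any nonuniform distribution.
    print(a, renyi_entropy(p, a))
```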
Min-entropy
The min-entropy, in information theory, is the smallest of the Rényi family of entropies, corresponding to the most conservative way of measuring the unpredictability of a set of outcomes, as the negative logarithm of the probability of the most likely outcome. The various Rényi entropies are all equal for a uniform distribution, but measure the unpredictability of a nonuniform distribution in different ways.
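To make the "smallest of the family" claim concrete (a standard derivation, not taken from the excerpt): for large $\alpha$ the sum $\sum_{x} p(x)^{\alpha}$ is dominated by its largest term, so
$$H_{\min}(X) = \lim_{\alpha \to \infty} \frac{1}{1-\alpha} \log \sum_{x} p(x)^{\alpha} = -\log \max_{x} p(x).$$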
Serum protein electrophoresis
Serum protein electrophoresis (SPEP or SPE) is a laboratory test that examines specific proteins in the blood called globulins. The most common indications for a serum protein electrophoresis test are to diagnose or monitor multiple myeloma or a monoclonal gammopathy of undetermined significance (MGUS), or to further investigate a discrepancy between a low albumin and a relatively high total protein. Unexplained bone pain, anemia, proteinuria, chronic kidney disease, and hypercalcemia are also signs of multiple myeloma, and indications for SPE.
Separation axiom
In topology and related fields of mathematics, there are several restrictions that one often makes on the kinds of topological spaces that one wishes to consider. Some of these restrictions are given by the separation axioms. These are sometimes called Tychonoff separation axioms, after Andrey Tychonoff. The separation axioms are not fundamental axioms like those of set theory, but rather defining properties which may be specified to distinguish certain types of topological spaces.
Normal space
In topology and related branches of mathematics, a normal space is a topological space X that satisfies Axiom T4: every two disjoint closed sets of X have disjoint open neighborhoods. A normal Hausdorff space is also called a T4 space. These conditions are examples of separation axioms, and their further strengthenings define completely normal Hausdorff spaces, or T5 spaces, and perfectly normal Hausdorff spaces, or T6 spaces. A topological space X is a normal space if, given any disjoint closed sets E and F, there are neighbourhoods U of E and V of F that are also disjoint.
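In symbols (a standard formulation added here for clarity, not from the excerpt), the T4 condition reads:
$$\forall\, E, F \subseteq X \text{ closed with } E \cap F = \emptyset,\ \exists\, U, V \subseteq X \text{ open}:\ E \subseteq U,\ F \subseteq V,\ U \cap V = \emptyset.$$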
Gel electrophoresis
Gel electrophoresis is a method for separation and analysis of biomacromolecules (DNA, RNA, proteins, etc.) and their fragments, based on their size and charge. It is used in clinical chemistry to separate proteins by charge or size (IEF agarose, essentially size independent), and in biochemistry and molecular biology to separate a mixed population of DNA and RNA fragments by length, to estimate the size of DNA and RNA fragments, or to separate proteins by charge.
Gel electrophoresis of proteins
Protein electrophoresis is a method for analysing the proteins in a fluid or an extract. The electrophoresis may be performed with a small volume of sample in a number of alternative ways, with or without a supporting medium, namely agarose or polyacrylamide. Variants of gel electrophoresis include SDS-PAGE, free-flow electrophoresis, electrofocusing, isotachophoresis, affinity electrophoresis, immunoelectrophoresis, counterelectrophoresis, and capillary electrophoresis. Each variant has many subtypes with individual advantages and limitations.
Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable $X$, which takes values in the alphabet $\mathcal{X}$ and is distributed according to $p : \mathcal{X} \to [0, 1]$, the entropy is
$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x),$$
where $\Sigma$ denotes the sum over the variable's possible values. The choice of base for $\log$, the logarithm, varies for different applications. Base 2 gives the unit of bits (or "shannons"), while base $e$ gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys".
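The base only rescales the result by a constant factor. A quick illustration (assumed helper and example distribution, not from the article):

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy H(X) = -sum_x p(x) * log_base p(x), skipping p = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0.0)

p = [0.5, 0.25, 0.25]
print(entropy(p, 2))       # bits / shannons: 1.5
print(entropy(p, math.e))  # nats: ~1.0397 (= 1.5 * ln 2)
print(entropy(p, 10))      # hartleys / bans: ~0.4515 (= 1.5 * log10 2)
```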
Kolmogorov space
In topology and related branches of mathematics, a topological space X is a T0 space or Kolmogorov space (named after Andrey Kolmogorov) if for every pair of distinct points of X, at least one of them has a neighborhood not containing the other. In a T0 space, all points are topologically distinguishable. This condition, called the T0 condition, is the weakest of the separation axioms. Nearly all topological spaces normally studied in mathematics are T0 spaces. In particular, all T1 spaces, i.e. spaces in which, for every pair of distinct points, each has a neighborhood not containing the other, are T0 spaces.
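Written out formally (a standard phrasing of the condition, added for clarity rather than taken from the excerpt):
$$\forall\, x, y \in X,\ x \neq y \implies \exists\, U \subseteq X \text{ open}:\ (x \in U \wedge y \notin U)\ \vee\ (y \in U \wedge x \notin U).$$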