Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other. The concept of mutual information is intimately linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
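The definition above can be made concrete with a small computation. The sketch below (a minimal illustration, not part of the original article) computes I(X;Y) in bits from a finite joint distribution; the function name `mutual_information` and the dict-based representation are assumptions for this example.

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint pmf given as a
    dict {(x, y): p}.  Assumes the probabilities sum to 1."""
    px, py = {}, {}
    for (x, y), p in joint.items():          # marginal distributions
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Two perfectly correlated fair bits: observing Y reveals X exactly,
# so I(X;Y) equals the entropy H(X) = 1 bit.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(joint))  # → 1.0
```

For independent variables the joint pmf factors as p(x, y) = p(x)p(y), every log term vanishes, and the mutual information is 0, matching the intuition that observing one variable then tells us nothing about the other.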
Line (geometry)
In geometry, a line is an infinitely long object with no width, depth, or curvature. Thus, lines are one-dimensional objects, though they may exist embedded in two-, three-, or higher-dimensional spaces. The word line may also refer to a line segment in everyday life, which has two points denoting its ends (endpoints). A line can be referred to by two points that lie on it (e.g., the line through points A and B, written AB) or by a single letter (e.g., ℓ).
Skew lines
In three-dimensional geometry, skew lines are two lines that do not intersect and are not parallel. A simple example of a pair of skew lines is the pair of lines through opposite edges of a regular tetrahedron. Two lines that both lie in the same plane must either cross each other or be parallel, so skew lines can exist only in three or more dimensions. Two lines are skew if and only if they are not coplanar. If four points are chosen at random uniformly within a unit cube, they will almost surely define a pair of skew lines.
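The coplanarity criterion above gives a direct test: two lines are skew exactly when their directions are not parallel and the scalar triple product of the two directions with a connecting vector is nonzero. A minimal sketch (the helper `is_skew` and the point-plus-direction representation are assumptions for this example):

```python
def is_skew(p1, d1, p2, d2, tol=1e-9):
    """Lines in R^3 given as (point, direction) tuples.  Skew iff the
    directions are non-parallel and the lines are non-coplanar, i.e.
    the scalar triple product (d1 x d2) . (p2 - p1) is nonzero."""
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])
    n = cross(d1, d2)
    if all(abs(c) < tol for c in n):        # parallel directions
        return False
    w = tuple(q - p for p, q in zip(p1, p2))
    return abs(sum(a*b for a, b in zip(n, w))) > tol

# Two perpendicular lines offset along the z-axis (like opposite
# edges of a tetrahedron): non-parallel and non-coplanar, so skew.
print(is_skew((0, 0, 0), (1, 0, 0), (0, 0, 1), (0, 1, 0)))  # → True
```

With the z-offset removed, the same two lines intersect at the origin, are coplanar, and the test correctly returns False.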
Frenet–Serret formulas
In differential geometry, the Frenet–Serret formulas describe the kinematic properties of a particle moving along a differentiable curve in three-dimensional Euclidean space ℝ³, or the geometric properties of the curve itself irrespective of any motion. More specifically, the formulas describe the derivatives of the so-called tangent, normal, and binormal unit vectors in terms of each other. The formulas are named after the two French mathematicians who independently discovered them: Jean Frédéric Frenet, in his thesis of 1847, and Joseph Alfred Serret, in 1851.
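For reference, the formulas themselves can be stated for a unit-speed curve, with s the arc length, κ the curvature, τ the torsion, and T, N, B the tangent, normal, and binormal unit vectors:

```latex
\begin{aligned}
\frac{d\mathbf{T}}{ds} &= \kappa \mathbf{N},\\
\frac{d\mathbf{N}}{ds} &= -\kappa \mathbf{T} + \tau \mathbf{B},\\
\frac{d\mathbf{B}}{ds} &= -\tau \mathbf{N}.
\end{aligned}
```

Note the skew-symmetric pattern of the coefficients: each vector's derivative is a combination of the other two, which is what makes (T, N, B) a moving orthonormal frame along the curve.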
Fisher information metric
In information geometry, the Fisher information metric is a particular Riemannian metric which can be defined on a smooth statistical manifold, i.e., a smooth manifold whose points are probability measures defined on a common probability space. It can be used to calculate the informational difference between measurements. The metric is interesting in several respects. By Chentsov's theorem, the Fisher information metric on statistical models is the only Riemannian metric (up to rescaling) that is invariant under sufficient statistics.
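In local coordinates θ = (θ¹, …, θⁿ) on the statistical manifold, with p(x, θ) the density at the point θ, the metric is commonly written as:

```latex
g_{jk}(\theta)
  = \int_X \frac{\partial \log p(x,\theta)}{\partial \theta^j}
           \frac{\partial \log p(x,\theta)}{\partial \theta^k}\,
           p(x,\theta)\, dx
  = \operatorname{E}\!\left[
      \frac{\partial \log p(x,\theta)}{\partial \theta^j}
      \frac{\partial \log p(x,\theta)}{\partial \theta^k}
    \right],
```

that is, the Fisher information matrix of the family, read as a metric tensor that varies from point to point on the manifold.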
Primary progressive aphasia
Primary progressive aphasia (PPA) is a type of neurological syndrome in which language capabilities slowly and progressively become impaired. As with other types of aphasia, the symptoms that accompany PPA depend on what parts of the left hemisphere are significantly damaged. However, unlike most other aphasias, PPA results from continuous deterioration in brain tissue, which leads to early symptoms being far less detrimental than later symptoms. Those with PPA slowly lose the ability to speak, write, read, and generally comprehend language.
Principal ideal domain
In mathematics, a principal ideal domain, or PID, is an integral domain in which every ideal is principal, i.e., can be generated by a single element. More generally, a principal ideal ring is a nonzero commutative ring whose ideals are principal, although some authors (e.g., Bourbaki) refer to PIDs as principal rings. The distinction is that a principal ideal ring may have zero divisors whereas a principal ideal domain cannot.
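The integers ℤ are the classic example of a PID: the ideal generated by two elements a and b, i.e. the set of all integer combinations am + bn, is generated by the single element gcd(a, b). A small numerical illustration of this (the finite ranges are an assumption made just to keep the check finite):

```python
from math import gcd

# In the PID Z, the ideal (a, b) = {a*m + b*n} equals (g) with g = gcd(a, b).
a, b = 12, 18
g = gcd(a, b)  # 6

# Sample of the ideal generated by a and b (finite window of combinations).
combos = {a*m + b*n for m in range(-20, 21) for n in range(-20, 21)}

# Every combination is a multiple of g, and g itself is a combination
# (here 6 = 12*(-1) + 18*1), so the single element g generates the ideal.
print(all(c % g == 0 for c in combos), g in combos)  # → True True
```

In a ring that is not a PID, such as ℤ[x], this fails: the ideal generated by 2 and x contains no single generator.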
Symplectomorphism
In mathematics, a symplectomorphism or symplectic map is an isomorphism in the category of symplectic manifolds. In classical mechanics, a symplectomorphism represents a transformation of phase space that is volume-preserving and preserves the symplectic structure of phase space, and is called a canonical transformation. A diffeomorphism f between two symplectic manifolds (M, ω) and (N, ω′) is called a symplectomorphism if f*ω′ = ω, where f*ω′ is the pullback of ω′ by f. The symplectic diffeomorphisms from M to M are a (pseudo-)group, called the symplectomorphism group (see below).
Volume element
In mathematics, a volume element provides a means for integrating a function with respect to volume in various coordinate systems such as spherical coordinates and cylindrical coordinates. Thus a volume element is an expression of the form dV = ρ(u₁, u₂, u₃) du₁ du₂ du₃, where the uᵢ are the coordinates, so that the volume of any set B can be computed by integrating ρ over B. For example, in spherical coordinates (r, θ, φ), dV = r² sin θ dr dθ dφ, and so ρ = r² sin θ. The notion of a volume element is not limited to three dimensions: in two dimensions it is often known as the area element, and in this setting it is useful for doing surface integrals.
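The spherical-coordinate volume element r² sin θ dr dθ dφ can be checked numerically: integrating it over a ball of radius R should recover 4πR³/3. A minimal sketch (midpoint-rule quadrature; the helper name `sphere_volume` and the grid size are assumptions for this example):

```python
import math

def sphere_volume(R, n=200):
    """Integrate dV = r^2 sin(theta) dr dtheta dphi over the ball
    0 <= r <= R, 0 <= theta <= pi, 0 <= phi < 2*pi, by the midpoint
    rule.  The integrand has no phi dependence, so the phi integral
    contributes a constant factor of 2*pi."""
    dr, dth = R / n, math.pi / n
    s = 0.0
    for i in range(n):
        r = (i + 0.5) * dr
        for j in range(n):
            th = (j + 0.5) * dth
            s += r * r * math.sin(th) * dr * dth
    return 2 * math.pi * s

# Should be close to the exact volume 4*pi/3 ≈ 4.18879 for R = 1.
print(sphere_volume(1.0), 4 / 3 * math.pi)
```

Forgetting the factor ρ = r² sin θ, i.e. integrating dr dθ dφ directly, gives 2π² R instead, which is one way to see that the volume element really does carry geometric content beyond the bare coordinate differentials.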
Statistical distance
In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects, which can be two random variables, two probability distributions or samples, or an individual sample point and a population or a wider sample of points. A distance between populations can be interpreted as measuring the distance between two probability distributions, and hence such distances are essentially measures of distance between probability measures.
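As a concrete instance of a distance between probability distributions, the total variation distance is among the simplest: half the sum of the pointwise differences of two pmfs. A minimal sketch (the helper `total_variation` and the dict representation are assumptions for this example):

```python
def total_variation(p, q):
    """Total variation distance between two pmfs over a common finite
    support, given as dicts: TV(p, q) = 0.5 * sum_x |p(x) - q(x)|.
    It ranges from 0 (identical) to 1 (disjoint supports)."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

p = {"a": 0.5, "b": 0.5}   # fair coin
q = {"a": 0.9, "b": 0.1}   # heavily biased coin
print(total_variation(p, q))  # → 0.4
```

Total variation is a true metric (symmetric, satisfies the triangle inequality), whereas other common statistical "distances" such as the Kullback–Leibler divergence are not symmetric, which is why the broader term statistical distance is used rather than metric.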