Intraclass correlation: In statistics, the intraclass correlation, or the intraclass correlation coefficient (ICC), is a descriptive statistic that can be used when quantitative measurements are made on units that are organized into groups. It describes how strongly units in the same group resemble each other. While it is viewed as a type of correlation, unlike most other correlation measures, it operates on data structured as groups rather than data structured as paired observations.
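As an illustration, the following is a minimal sketch, assuming NumPy and balanced groups, of one common estimator, the one-way random-effects ICC(1), computed from the between-group and within-group mean squares of a one-way ANOVA; other definitions of the ICC exist.

```python
import numpy as np

def icc1(data):
    """One-way random-effects ICC(1) for balanced groups.

    data: array of shape (n_groups, n_per_group); each row is one group.
    """
    n_groups, k = data.shape
    grand_mean = data.mean()
    group_means = data.mean(axis=1)

    # Between-group and within-group mean squares from one-way ANOVA.
    ms_between = k * np.sum((group_means - grand_mean) ** 2) / (n_groups - 1)
    ms_within = np.sum((data - group_means[:, None]) ** 2) / (n_groups * (k - 1))

    # ICC(1): share of total variance attributable to group membership.
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical example: 4 groups of 3 measurements, with strong within-group resemblance.
ratings = np.array([[9.0, 8.5, 9.2],
                    [4.1, 4.4, 3.9],
                    [6.0, 6.3, 5.8],
                    [7.7, 8.0, 7.5]])
print(icc1(ratings))  # close to 1: groups are much more alike internally than across groups
```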
Correlation does not imply causation: The phrase "correlation does not imply causation" refers to the inability to legitimately deduce a cause-and-effect relationship between two events or variables solely on the basis of an observed association or correlation between them. The idea that "correlation implies causation" is an example of a questionable-cause logical fallacy, in which two events occurring together are taken to have established a cause-and-effect relationship. This fallacy is also known by the Latin phrase cum hoc ergo propter hoc ('with this, therefore because of this').
Correlation: In statistics, correlation or dependence is any statistical relationship, whether causal or not, between two random variables or bivariate data. Although in the broadest sense "correlation" may indicate any type of association, in statistics it usually refers to the degree to which a pair of variables are linearly related. Familiar examples of dependent phenomena include the correlation between the heights of parents and their offspring, and the correlation between the price of a good and the quantity consumers are willing to purchase, as depicted in the demand curve.
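To make the "linearly related" point concrete, here is a small sketch, assuming NumPy, showing that the usual Pearson correlation is near ±1 for a linear relationship but can be near 0 for a purely nonlinear dependence:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)

linear = 2.0 * x + rng.normal(scale=0.5, size=x.size)   # linearly related to x
nonlinear = x ** 2                                      # dependent on x, but not linearly

# np.corrcoef returns the matrix of pairwise Pearson correlations.
print(np.corrcoef(x, linear)[0, 1])     # close to +1
print(np.corrcoef(x, nonlinear)[0, 1])  # close to 0, despite the dependence
```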
Cross-correlation: In signal processing, cross-correlation is a measure of similarity of two series as a function of the displacement of one relative to the other. This is also known as a sliding dot product or sliding inner-product. It is commonly used for searching a long signal for a shorter, known feature. It has applications in pattern recognition, single particle analysis, electron tomography, averaging, cryptanalysis, and neurophysiology. The cross-correlation is similar in nature to the convolution of two functions.
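A minimal sketch of the "searching a long signal for a shorter, known feature" use case, assuming NumPy and its np.correlate sliding dot product:

```python
import numpy as np

rng = np.random.default_rng(1)

# A known short feature ("template") buried in a longer noisy signal.
template = np.array([0.0, 1.0, 2.0, 1.0, 0.0, -1.0, -2.0, -1.0])
signal = rng.normal(scale=0.3, size=200)
signal[120:120 + template.size] += template

# Cross-correlation as a sliding dot product of the signal with the template.
xcorr = np.correlate(signal, template, mode="valid")

# The displacement with the largest correlation is the best match for the feature.
print(int(np.argmax(xcorr)))  # ~120, where the template was inserted
```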
Double-slit experiment: In modern physics, the double-slit experiment demonstrates that light and matter can satisfy the seemingly incongruous classical definitions of both waves and particles, which is considered evidence for the fundamentally probabilistic nature of quantum mechanics. This type of experiment was first performed by Thomas Young in 1801 as a demonstration of the wave behavior of visible light. At that time it was thought that light consisted of either waves or particles.
Scaled correlation: In statistics, scaled correlation is a form of correlation coefficient applicable to data with a temporal component, such as time series. It is the average short-term correlation. If the signals have multiple components (slow and fast), the scaled correlation coefficient can be computed for the fast components of the signals alone, ignoring the contributions of the slow components. This filtering-like operation has the advantage of not requiring assumptions about the sinusoidal nature of the signals.
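A minimal sketch of the basic idea, assuming NumPy: split the two signals into short non-overlapping segments, compute the Pearson correlation within each segment, and average the results (the segment length s is a free parameter in this sketch):

```python
import numpy as np

def scaled_correlation(x, y, s):
    """Average of segment-wise Pearson correlations for segment length s."""
    coeffs = []
    for i in range(len(x) // s):
        xs = x[i * s:(i + 1) * s]
        ys = y[i * s:(i + 1) * s]
        # Skip constant segments, where the correlation is undefined.
        if xs.std() > 0 and ys.std() > 0:
            coeffs.append(np.corrcoef(xs, ys)[0, 1])
    return np.mean(coeffs)

# Hypothetical example: the fast components are shared; a slow drift is added only to y.
rng = np.random.default_rng(2)
t = np.arange(2000)
fast = rng.normal(size=t.size)
x = fast
y = fast + np.sin(2 * np.pi * t / 2000)  # slow component, largely ignored at short scales

print(scaled_correlation(x, y, s=50))   # higher than the full-length Pearson correlation
print(np.corrcoef(x, y)[0, 1])          # lowered by the slow drift in y
```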
Pearson correlation coefficient: In statistics, the Pearson correlation coefficient (PCC) is a correlation coefficient that measures linear correlation between two sets of data. It is the ratio between the covariance of two variables and the product of their standard deviations; thus, it is essentially a normalized measurement of the covariance, such that the result always has a value between −1 and 1. As with covariance itself, the measure can only reflect a linear correlation of variables, and ignores many other types of relationships or correlations.
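A short sketch, assuming NumPy, computing the coefficient directly from this definition (covariance divided by the product of the standard deviations) and checking it against np.corrcoef:

```python
import numpy as np

def pearson(x, y):
    """Pearson r: covariance of x and y over the product of their standard deviations."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))
    return cov / (x.std() * y.std())

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.8])

print(pearson(x, y))            # close to +1 for this nearly linear data
print(np.corrcoef(x, y)[0, 1])  # agrees with NumPy's implementation
```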
Correlation coefficient: A correlation coefficient is a numerical measure of some type of correlation, meaning a statistical relationship between two variables. The variables may be two columns of a given data set of observations, often called a sample, or two components of a multivariate random variable with a known distribution. Several types of correlation coefficient exist, each with its own definition and its own range of usability and characteristics. They all assume values in the range from −1 to +1, where ±1 indicates the strongest possible relationship (perfectly positive or perfectly negative association) and 0 indicates no relationship.
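As an illustration of how different coefficient types capture different notions of association, here is a small sketch, assuming SciPy is available, comparing the Pearson and Spearman coefficients on the same monotone but nonlinear data:

```python
import numpy as np
from scipy import stats

x = np.linspace(1.0, 10.0, 50)
y = np.exp(x)  # monotone but strongly nonlinear relationship

pearson_r, _ = stats.pearsonr(x, y)    # linear association
spearman_r, _ = stats.spearmanr(x, y)  # rank (monotone) association

print(round(pearson_r, 3))   # noticeably below 1
print(round(spearman_r, 3))  # exactly 1 for any strictly increasing relationship
```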
Angle: In Euclidean geometry, an angle is the figure formed by two rays, called the sides of the angle, sharing a common endpoint, called the vertex of the angle. Angles formed by two rays are also known as plane angles as they lie in the plane that contains the rays. Angles are also formed by the intersection of two planes; these are called dihedral angles. Two intersecting curves may also define an angle, which is the angle of the rays lying tangent to the respective curves at their point of intersection.
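A small sketch, assuming NumPy, computing a plane angle from the direction vectors of two rays that share a vertex, via the arccosine of their normalized dot product:

```python
import numpy as np

def angle_between(u, v):
    """Angle in radians between two rays given by direction vectors u and v."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip to guard against floating-point values slightly outside [-1, 1].
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Rays along the x-axis and along the line y = x, sharing a vertex at the origin.
print(np.degrees(angle_between([1.0, 0.0], [1.0, 1.0])))  # 45.0
```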
Two-dimensional nuclear magnetic resonance spectroscopy: Two-dimensional nuclear magnetic resonance spectroscopy (2D NMR) is a set of nuclear magnetic resonance spectroscopy (NMR) methods which give data plotted in a space defined by two frequency axes rather than one. Types of 2D NMR include correlation spectroscopy (COSY), J-spectroscopy, exchange spectroscopy (EXSY), and nuclear Overhauser effect spectroscopy (NOESY). Two-dimensional NMR spectra provide more information about a molecule than one-dimensional NMR spectra and are especially useful in determining the structure of a molecule, particularly for molecules that are too complicated to work with using one-dimensional NMR.