Summary
In probability theory and statistics, two real-valued random variables X, Y are said to be uncorrelated if their covariance, cov[X, Y] = E[XY] − E[X]·E[Y], is zero. If two variables are uncorrelated, there is no linear relationship between them.

Uncorrelated random variables have a Pearson correlation coefficient of zero, when the coefficient exists, except in the trivial case when either variable has zero variance (is a constant); in that case the correlation is undefined. In general, uncorrelatedness is not the same as orthogonality, except in the special case where at least one of the two random variables has an expected value of 0. In this case the covariance is the expectation of the product, and X and Y are uncorrelated if and only if E[XY] = 0. If X and Y are independent and have finite second moments, then they are uncorrelated. However, not all uncorrelated variables are independent.

Definition for two real random variables. Two random variables X, Y are called uncorrelated if their covariance cov[X, Y] = E[(X − E[X])(Y − E[Y])] is zero. Formally:

X, Y uncorrelated ⟺ E[XY] = E[X]·E[Y]

Definition for two complex random variables. Two complex random variables Z, W are called uncorrelated if both their covariance and their pseudo-covariance are zero, i.e.

K_ZW = E[(Z − E[Z])·conj(W − E[W])] = 0 and J_ZW = E[(Z − E[Z])·(W − E[W])] = 0

Definition for a set of random variables. A set of two or more random variables X_1, …, X_n is called uncorrelated if each pair of them is uncorrelated. This is equivalent to the requirement that the off-diagonal elements of the autocovariance matrix K_XX of the random vector X = (X_1, …, X_n)^T are all zero. The autocovariance matrix is defined as

K_XX = cov[X, X] = E[(X − E[X])(X − E[X])^T] = E[X X^T] − E[X]·E[X]^T

Example of dependence without correlation. Let X be a random variable that takes the value 0 with probability 1/2 and the value 1 with probability 1/2. Let Y be a random variable, independent of X, that takes the value −1 with probability 1/2 and the value 1 with probability 1/2. Let U be the random variable U = XY. The claim is that U and X have zero covariance (and thus are uncorrelated) but are not independent.

Proof. Since E[U] = E[XY] = E[X]·E[Y] = E[X]·0 = 0, where the second equality holds because X and Y are independent, one gets

cov[U, X] = E[UX] − E[U]·E[X] = E[X^2·Y] − 0·E[X] = E[X^2]·E[Y] = E[X^2]·0 = 0,

where the independence of X and Y is used again (X^2 is a function of X alone). Therefore, U and X are uncorrelated. Independence of U and X would require that Pr(U = a | X = b) = Pr(U = a) for all a and b. This fails, in particular, for a = 1 and b = 0: Pr(U = 1 | X = 0) = 0, whereas Pr(U = 1) = Pr(X = 1, Y = 1) = 1/4. Thus Pr(U = 1 | X = 0) ≠ Pr(U = 1), so U and X are not independent. Q.E.D.
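The example in the proof above can be verified exactly by enumerating the joint distribution. The following minimal Python sketch is an illustration added for clarity (the variable and function names are not from the original page); it computes cov[U, X] and compares Pr(U = 1, X = 0) with Pr(U = 1)·Pr(X = 0):

# Exact check of the example above: X in {0, 1} and Y in {-1, +1}, independent,
# each value with probability 1/2, and U = X * Y.
from itertools import product
from fractions import Fraction

half = Fraction(1, 2)
# Joint distribution of (X, Y): four equally likely outcomes.
outcomes = [((x, y), half * half) for x, y in product([0, 1], [-1, 1])]

def expect(f):
    # Expectation of f(x, y) under the joint distribution.
    return sum(p * f(x, y) for (x, y), p in outcomes)

E_X = expect(lambda x, y: x)
E_U = expect(lambda x, y: x * y)
cov_UX = expect(lambda x, y: (x * y) * x) - E_U * E_X
print("cov[U, X] =", cov_UX)  # 0, so U and X are uncorrelated

# Independence check: Pr(U = 1, X = 0) versus Pr(U = 1) * Pr(X = 0).
p_U1_X0 = sum(p for (x, y), p in outcomes if x * y == 1 and x == 0)
p_U1 = sum(p for (x, y), p in outcomes if x * y == 1)
p_X0 = sum(p for (x, y), p in outcomes if x == 0)
print(p_U1_X0, "vs", p_U1 * p_X0)  # 0 vs 1/8, so U and X are not independent

The covariance comes out exactly zero, while the joint probability Pr(U = 1, X = 0) = 0 differs from the product Pr(U = 1)·Pr(X = 0) = 1/8, matching the argument above.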
About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.
Related courses (5)
ME-422: Multivariable control
This course covers methods for the analysis and control of systems with multiple inputs and outputs, which are ubiquitous in modern technology and industry. Special emphasis will be given to discrete-
MATH-342: Time series
A first course in statistical time series analysis and applications.
MATH-444: Multivariate statistics
Multivariate statistics focusses on inferring the joint distributional properties of several random variables, seen as random vectors, with a main focus on uncovering their underlying dependence structure.
Related lectures (34)
Biclustering & latent variables: statistical analysis of network data
Explores biclustering techniques and latent variables in network data analysis.
Bayesian Inference: Gaussian Variables
Explores Bayesian inference for Gaussian random variables, covering joint distribution, marginal pdfs, and the Bayes classifier.
Optimality and Asymptotics
Explores the optimality of the Least Squares Estimator and its large sample distribution.
Related publications (12)

Information Spectrum Converse for Minimum Entropy Couplings and Functional Representations

Given two jointly distributed random variables (X,Y), a functional representation of X is a random variable Z independent of Y, and a deterministic function g(⋅,⋅) such that X=g(Y,Z). The problem of finding a minimum entropy functional representation is known ...
2023
Related concepts (16)
Pearson correlation coefficient
In statistics, the Pearson correlation coefficient (PCC) is a correlation coefficient that measures linear correlation between two sets of data. It is the ratio between the covariance of two variables and the product of their standard deviations; thus, it is essentially a normalized measurement of the covariance, such that the result always has a value between −1 and 1. As with covariance itself, the measure can only reflect a linear correlation of variables, and ignores many other types of relationships or correlations.
Moment (mathematics)
In mathematics, the moments of a function are certain quantitative measures related to the shape of the function's graph. If the function represents mass density, then the zeroth moment is the total mass, the first moment (normalized by total mass) is the center of mass, and the second moment is the moment of inertia. If the function is a probability distribution, then the first moment is the expected value, the second central moment is the variance, the third standardized moment is the skewness, and the fourth standardized moment is the kurtosis.
Normally distributed and uncorrelated does not imply independent
In probability theory, although simple examples illustrate that linear uncorrelatedness of two random variables does not in general imply their independence, it is sometimes mistakenly thought that it does imply that when the two random variables are normally distributed. This article demonstrates that assumption of normal distributions does not have that consequence, although the multivariate normal distribution, including the bivariate normal distribution, does.
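As a concrete illustration of the last concept above ("Normally distributed and uncorrelated does not imply independent"), the following Python sketch uses one standard construction, a random sign flip of a standard normal variable; this particular construction is chosen here for illustration and may differ from the examples discussed in the linked article:

# Illustrative sketch: X ~ N(0, 1), W = +/-1 with equal probability and
# independent of X, and Y = W * X. Then Y is also N(0, 1) and uncorrelated
# with X, yet |Y| = |X| always, so X and Y are not independent.
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x = rng.standard_normal(n)
w = rng.choice([-1.0, 1.0], size=n)
y = w * x

print("sample correlation:", np.corrcoef(x, y)[0, 1])  # close to 0
# Dependence shows up in the tails: conditioning on |X| > 2 changes the
# distribution of |Y| completely, since |Y| = |X|.
print("P(|Y| > 2)           =", np.mean(np.abs(y) > 2))
print("P(|Y| > 2 | |X| > 2) =", np.mean(np.abs(y[np.abs(x) > 2]) > 2))  # 1.0

The sample correlation is close to zero, while the two conditional probabilities differ sharply, so X and Y cannot be independent even though each is standard normal.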