Uncorrelatedness (probability theory)
In probability theory and statistics, two real-valued random variables, $X$, $Y$, are said to be uncorrelated if their covariance, $\operatorname{cov}[X,Y] = \operatorname{E}[XY] - \operatorname{E}[X]\operatorname{E}[Y]$, is zero. If two variables are uncorrelated, there is no linear relationship between them.
Uncorrelated random variables have a Pearson correlation coefficient of zero, when it exists; in the trivial case where either variable has zero variance (is a constant), the correlation coefficient is undefined.
In general, uncorrelatedness is not the same as orthogonality, except in the special case where at least one of the two random variables has an expected value of 0. In this case, the covariance is the expectation of the product, and $X$ and $Y$ are uncorrelated if and only if $\operatorname{E}[XY] = 0$.
If $X$ and $Y$ are independent, with finite second moments, then they are uncorrelated. However, not all uncorrelated variables are independent.
Definition
Definition for two real random variables
Two random variables $X, Y$ are called uncorrelated if their covariance $\operatorname{cov}[X,Y] = \operatorname{E}[XY] - \operatorname{E}[X]\operatorname{E}[Y]$ is zero. Formally:

$$X, Y \text{ uncorrelated} \quad \iff \quad \operatorname{E}[XY] = \operatorname{E}[X] \cdot \operatorname{E}[Y]$$
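As a quick numerical illustration (not part of the original definition), the following Python sketch estimates $\operatorname{E}[XY] - \operatorname{E}[X]\operatorname{E}[Y]$ from samples; the particular variables used ($Y = X^2$ for a uniform $X$, and a noisy linear function of $X$) are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical variables for illustration:
# Y = X**2 is uncorrelated with X (see Example 2 below);
# W = 2*X + noise is strongly correlated with X.
x = rng.uniform(-1.0, 1.0, size=n)
y = x**2
w = 2.0 * x + rng.normal(0.0, 0.1, size=n)

def sample_cov(a, b):
    """Sample estimate of cov[A, B] = E[AB] - E[A]E[B]."""
    return np.mean(a * b) - np.mean(a) * np.mean(b)

print(sample_cov(x, y))  # ~ 0    -> uncorrelated
print(sample_cov(x, w))  # ~ 2/3  -> correlated (2 * Var[X], with Var[X] = 1/3)
```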
Definition for two complex random variables
Two complex random variables $Z, W$ are called uncorrelated if their covariance $\operatorname{K}_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])\overline{(W - \operatorname{E}[W])}]$ and their pseudo-covariance $\operatorname{J}_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])]$ are both zero, i.e.

$$Z, W \text{ uncorrelated} \quad \iff \quad \operatorname{E}[Z\overline{W}] = \operatorname{E}[Z] \cdot \operatorname{E}[\overline{W}] \text{ and } \operatorname{E}[ZW] = \operatorname{E}[Z] \cdot \operatorname{E}[W]$$
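A small Python sketch (illustrative only; the pair $Z$, $W = \overline{Z}$ is an assumption, not from the article) showing why both conditions are needed: the ordinary covariance of a circularly symmetric $Z$ with its own conjugate vanishes, but the pseudo-covariance does not, so the pair is not uncorrelated in the complex sense.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Hypothetical pair for illustration: Z circularly symmetric complex Gaussian,
# W = conj(Z).  The ordinary covariance E[Z conj(W)] - E[Z]E[conj(W)] = E[Z^2] ~ 0,
# but the pseudo-covariance E[ZW] - E[Z]E[W] = E[|Z|^2] ~ 1.
z = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
w = np.conj(z)

cov    = np.mean(z * np.conj(w)) - np.mean(z) * np.mean(np.conj(w))
pseudo = np.mean(z * w) - np.mean(z) * np.mean(w)
print(cov)     # ~ 0
print(pseudo)  # ~ 1  -> Z and W are NOT uncorrelated in the complex sense
```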
Definition for more than two random variables
A set of two or more random variables $X_1, \ldots, X_n$ is called uncorrelated if each pair of them is uncorrelated. This is equivalent to the requirement that the off-diagonal elements of the autocovariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{X}}$ of the random vector $\mathbf{X} = (X_1, \ldots, X_n)^{\mathrm{T}}$ are all zero. The autocovariance matrix is defined as:

$$\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{cov}[\mathbf{X}, \mathbf{X}] = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{X} - \operatorname{E}[\mathbf{X}])^{\mathrm{T}}] = \operatorname{E}[\mathbf{X}\mathbf{X}^{\mathrm{T}}] - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\mathrm{T}}$$
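A minimal numpy sketch (an illustration, not from the article) of the off-diagonal criterion, using two independent components and a third that depends on both:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# X1, X2: independent standard normals (pairwise uncorrelated);
# X3 = X1 + X2 is correlated with both.
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
x3 = x1 + x2

X = np.vstack([x1, x2, x3])   # shape (3, n): one row per component
K = np.cov(X)                 # sample autocovariance matrix K_XX
print(np.round(K, 3))
# K[0, 1] ~ 0 (X1, X2 uncorrelated); K[0, 2] ~ 1 and K[1, 2] ~ 1 (correlated).
```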
Examples of dependence without correlation
Example 1
- Let $X$ be a random variable that takes the value 0 with probability 1/2, and takes the value 1 with probability 1/2.
- Let $Y$ be a random variable, independent of $X$, that takes the value −1 with probability 1/2, and takes the value 1 with probability 1/2.
- Let $U$ be a random variable constructed as $U = XY$.
The claim is that $U$ and $X$ have zero covariance (and thus are uncorrelated), but are not independent.
Proof:
Taking into account that

$$\operatorname{E}[U] = \operatorname{E}[XY] = \operatorname{E}[X]\operatorname{E}[Y] = \operatorname{E}[X] \cdot 0 = 0,$$

where the second equality holds because $X$ and $Y$ are independent, one gets

$$\begin{aligned}\operatorname{cov}[U,X] &= \operatorname{E}[(U - \operatorname{E}[U])(X - \operatorname{E}[X])] = \operatorname{E}[U(X - \tfrac{1}{2})]\\&= \operatorname{E}[X^{2}Y - \tfrac{1}{2}XY] = \operatorname{E}[(X^{2} - \tfrac{1}{2}X)Y] = \operatorname{E}[(X^{2} - \tfrac{1}{2}X)]\operatorname{E}[Y] = 0\end{aligned}$$
Therefore, $U$ and $X$ are uncorrelated.
Independence of $U$ and $X$ means that for all $a$ and $b$, $\Pr(U = a \text{ and } X = b) = \Pr(U = a)\Pr(X = b)$. This is not true, in particular, for $a = 1$ and $b = 0$:

$$\Pr(U = 1 \text{ and } X = 0) = \Pr(XY = 1 \text{ and } X = 0) = 0,$$

$$\Pr(U = 1) \cdot \Pr(X = 0) = \tfrac{1}{4} \cdot \tfrac{1}{2} = \tfrac{1}{8}.$$

Thus $\Pr(U = 1 \text{ and } X = 0) \neq \Pr(U = 1)\Pr(X = 0)$, so $U$ and $X$ are not independent.
Q.E.D.
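The proof can also be checked mechanically by enumerating the four equally likely outcomes; this short Python sketch (added for illustration) reproduces both the zero covariance and the failure of independence:

```python
from itertools import product

# The four equally likely outcomes (x, y) of (X, Y), each with probability 1/4.
outcomes = [(x, y, 0.25) for x, y in product([0, 1], [-1, 1])]

E_X = sum(p * x for x, y, p in outcomes)        # = 1/2
E_U = sum(p * x * y for x, y, p in outcomes)    # = 0, since U = XY

cov_UX = sum(p * (x * y - E_U) * (x - E_X) for x, y, p in outcomes)
print(cov_UX)                                   # 0.0 -> U and X uncorrelated

# Failure of independence: P(U = 1 and X = 0) = 0, but P(U = 1) P(X = 0) = 1/8.
p_joint = sum(p for x, y, p in outcomes if x * y == 1 and x == 0)
p_prod = (sum(p for x, y, p in outcomes if x * y == 1)
          * sum(p for x, y, p in outcomes if x == 0))
print(p_joint, p_prod)                          # 0.0 0.125
```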
Example 2
If $X$ is a continuous random variable uniformly distributed on $[-1, 1]$ and $Y = X^{2}$, then $X$ and $Y$ are uncorrelated even though $X$ determines $Y$ and a particular value of $Y$ can be produced by only one or two values of $X$:

$$f_{X}(t) = \tfrac{1}{2}\, I_{[-1,1]}(t); \qquad f_{Y}(t) = \tfrac{1}{2\sqrt{t}}\, I_{(0,1]}(t)$$
On the other hand, the joint density $f_{X,Y}$ is 0 on the triangle defined by $0 < X < Y < 1$ although $f_{X} \times f_{Y}$ is not null on this domain. Therefore $f_{X,Y}(X,Y) \neq f_{X}(X) \cdot f_{Y}(Y)$ and the variables are not independent.

$$\operatorname{E}[X] = \frac{1 - 1}{4} = 0; \qquad \operatorname{E}[Y] = \frac{1^{3} - (-1)^{3}}{3 \times 2} = \frac{1}{3}$$

$$\operatorname{cov}[X,Y] = \operatorname{E}\left[(X - \operatorname{E}[X])(Y - \operatorname{E}[Y])\right] = \operatorname{E}\left[X^{3} - \frac{X}{3}\right] = \frac{1^{4} - (-1)^{4}}{4 \times 2} = 0$$
Therefore the variables are uncorrelated.
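A Monte Carlo check of this example (added for illustration; the conditioning event $|X| > 1/2$ is just one convenient way to expose the dependence):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

x = rng.uniform(-1.0, 1.0, size=n)   # X ~ Uniform[-1, 1]
y = x**2                             # Y = X^2, a deterministic function of X

print(np.mean(x), np.mean(y))                     # ~ 0 and ~ 1/3, matching E[X], E[Y]
print(np.mean(x * y) - np.mean(x) * np.mean(y))   # ~ 0 -> uncorrelated

# Dependence shows up under conditioning: E[Y | |X| > 1/2] = 7/12, well above E[Y] = 1/3.
print(np.mean(y[np.abs(x) > 0.5]))                # ~ 0.583
```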
There are cases in which uncorrelatedness does imply independence. One of these cases is the one in which both random variables are two-valued (so each can be linearly transformed to have a Bernoulli distribution). Further, two jointly normally distributed random variables are independent if they are uncorrelated, although this does not hold for variables whose marginal distributions are normal and uncorrelated but whose joint distribution is not jointly normal (see Normally distributed and uncorrelated does not imply independent).
Generalizations
Two random vectors $\mathbf{X} = (X_{1}, \ldots, X_{m})^{\mathrm{T}}$ and $\mathbf{Y} = (Y_{1}, \ldots, Y_{n})^{\mathrm{T}}$ are called uncorrelated if

$$\operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}.$$

They are uncorrelated if and only if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is zero.
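A numpy sketch estimating $\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}] - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}$ from samples (the particular vectors used are illustrative assumptions, not from the article):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# Hypothetical vectors for illustration: X has two independent standard-normal
# components; Y = (X1**2, fresh noise).  Every pairwise covariance vanishes,
# so K_XY ~ 0, even though Y's first component is a function of X's first.
X = rng.standard_normal((2, n))
Y = np.vstack([X[0]**2, rng.standard_normal(n)])

# Sample estimate of K_XY = E[X Y^T] - E[X] E[Y]^T.
K_XY = (X @ Y.T) / n - np.outer(X.mean(axis=1), Y.mean(axis=1))
print(np.round(K_XY, 3))   # ~ zero 2x2 matrix -> uncorrelated random vectors
```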
Two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if their cross-covariance matrix and their pseudo-cross-covariance matrix are both zero, i.e. if

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{J}_{\mathbf{Z}\mathbf{W}} = 0$$

where

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}]){(\mathbf{W} - \operatorname{E}[\mathbf{W}])}^{\mathrm{H}}]$$

and

$$\operatorname{J}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}]){(\mathbf{W} - \operatorname{E}[\mathbf{W}])}^{\mathrm{T}}].$$
Two stochastic processes $\{X_{t}\}$ and $\{Y_{t}\}$ are called uncorrelated if their cross-covariance $\operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_{1}, t_{2}) = \operatorname{E}\left[(X_{t_{1}} - \mu_{X}(t_{1}))(Y_{t_{2}} - \mu_{Y}(t_{2}))\right]$ is zero for all times. Formally:

$$\{X_{t}\}, \{Y_{t}\} \text{ uncorrelated} \quad :\iff \quad \forall t_{1}, t_{2} \colon \operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_{1}, t_{2}) = 0.$$
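A simulation sketch (an illustrative construction, not from the article) estimating the cross-covariance on a grid of time pairs for two processes that are uncorrelated but not independent:

```python
import numpy as np

rng = np.random.default_rng(5)
trials, T = 200_000, 5

# Hypothetical processes for illustration: X_t is white noise and Y_t = X_t**2 - 1.
# K_XY(t1, t2) = E[X_{t1} Y_{t2}] - E[X_{t1}] E[Y_{t2}] = 0 for every time pair,
# so the processes are uncorrelated, although Y is a function of X.
X = rng.standard_normal((trials, T))
Y = X**2 - 1.0

# Sample estimate of the T x T matrix K[t1, t2] over all time pairs.
K = (X.T @ Y) / trials - np.outer(X.mean(axis=0), Y.mean(axis=0))
print(np.round(K, 2))   # ~ zero matrix
```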