Covariance
Definition
The covariance between r.v.s $X$ and $Y$ is
$$\operatorname{Cov}(X, Y) = E\big[(X - E[X])(Y - E[Y])\big].$$
Multiplying this out and using linearity of expectation, we have the equivalent expression
$$\operatorname{Cov}(X, Y) = E[XY] - E[X]\,E[Y].$$
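Spelled out, the expansion uses only linearity and the fact that $E[X]$ and $E[Y]$ are constants:
$$\begin{aligned}
E\big[(X - E[X])(Y - E[Y])\big]
  &= E\big[XY - X\,E[Y] - Y\,E[X] + E[X]E[Y]\big] \\
  &= E[XY] - E[X]E[Y] - E[X]E[Y] + E[X]E[Y] \\
  &= E[XY] - E[X]E[Y].
\end{aligned}$$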
Think about the definition intuitively: if $X$ and $Y$ tend to move in the same direction, then $X - E[X]$ and $Y - E[Y]$ will tend to have the same sign, so their product will be positive on average, giving a positive covariance. If $X$ and $Y$ tend to move in opposite directions, then the covariance will be negative.
If $X$ and $Y$ are independent, then their covariance is zero. We say that r.v.s with zero covariance are uncorrelated.
Warning
While independence implies zero covariance, the converse is false: $X$ and $Y$ being uncorrelated does not mean they are independent.
For example, let $X \sim \mathcal{N}(0, 1)$, and let $Y = X^2$. Then
$$\operatorname{Cov}(X, Y) = E[X^3] - E[X]\,E[X^2] = 0,$$
because the odd moments of the standard Normal distribution are equal to $0$ by symmetry. Thus $X$ and $Y$ are uncorrelated, even though $Y$ is a deterministic function of $X$.
Covariance is a measure of linear association, so r.v.s can be dependent in nonlinear ways and still have zero covariance.
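As a quick numerical sanity check of this example, here is a minimal sketch in Python (assuming NumPy is available; the seed and sample size are arbitrary):

```python
import numpy as np

# Simulate X ~ N(0, 1) and the dependent but uncorrelated Y = X^2.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x ** 2

# np.cov returns the 2x2 sample covariance matrix of (x, y);
# the off-diagonal entry estimates Cov(X, Y) and should be near 0.
cov_xy = np.cov(x, y)[0, 1]
print(f"sample Cov(X, X^2) = {cov_xy:.4f}")
```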
If $X$ and $Y$ are square integrable, the following four statements are equivalent:
- $X$ and $Y$ are uncorrelated
- $\operatorname{Cov}(X, Y) = 0$
- $E[XY] = E[X]\,E[Y]$ ($X$ and $Y$ need not be independent)
- $\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$ ($X$ and $Y$ need not be independent)
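The last equivalence follows from the variance-of-a-sum identity, itself a direct consequence of the expanded form of covariance:
$$\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X, Y),$$
so the variances add exactly when $\operatorname{Cov}(X, Y) = 0$, which by the equivalent expression above is the same as $E[XY] = E[X]E[Y]$.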
Property
If $X_1, \ldots, X_n$ are real-valued random variables and $a_1, \ldots, a_n$ are real-valued constants, we have
$$\operatorname{Var}\!\left(\sum_{i=1}^{n} a_i X_i\right) = \sum_{i=1}^{n} a_i^2 \operatorname{Var}(X_i) + 2\sum_{1 \le i < j \le n} a_i a_j \operatorname{Cov}(X_i, X_j).$$
- Cauchy-Schwarz Inequality: for square-integrable $X$ and $Y$,
  $$\big(E[XY]\big)^2 \le E[X^2]\,E[Y^2], \qquad \text{and consequently} \qquad \operatorname{Cov}(X, Y)^2 \le \operatorname{Var}(X)\,\operatorname{Var}(Y).$$
  This can be proved by applying the general Cauchy-Schwarz inequality $|\langle X, Y \rangle|^2 \le \langle X, X \rangle\,\langle Y, Y \rangle$ on the space of square-integrable r.v.s, where we define $\langle X, Y \rangle = E[XY]$ as the inner product.
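A sketch of that argument: for any real $t$ and square-integrable $X, Y$,
$$0 \le E\big[(X - tY)^2\big] = E[X^2] - 2t\,E[XY] + t^2 E[Y^2],$$
and choosing the minimizing $t = E[XY]/E[Y^2]$ (when $E[Y^2] > 0$) gives $(E[XY])^2 \le E[X^2]\,E[Y^2]$. Applying this to the centered variables $X - E[X]$ and $Y - E[Y]$ yields $\operatorname{Cov}(X, Y)^2 \le \operatorname{Var}(X)\operatorname{Var}(Y)$.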
Correlation
The correlation between r.v.s $X$ and $Y$ is
$$\operatorname{Corr}(X, Y) = \frac{\operatorname{Cov}(X, Y)}{\sqrt{\operatorname{Var}(X)\,\operatorname{Var}(Y)}}.$$
Therefore, by the Cauchy-Schwarz inequality, for any r.v.s $X$ and $Y$,
$$-1 \le \operatorname{Corr}(X, Y) \le 1.$$
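A minimal sketch of the computation in Python (assuming NumPy; the coefficients below are arbitrary, chosen only to create a known positive association):

```python
import numpy as np

# Two positively associated r.v.s: Y = 2X + Z with independent noise Z.
rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
y = 2.0 * x + rng.standard_normal(100_000)

# Correlation = covariance divided by the product of standard deviations.
cov_xy = np.cov(x, y)[0, 1]
corr_xy = cov_xy / (np.std(x, ddof=1) * np.std(y, ddof=1))
print(f"Corr(X, Y) = {corr_xy:.3f}")   # theoretical value 2/sqrt(5) ~ 0.894
assert -1.0 <= corr_xy <= 1.0          # always within [-1, 1]
```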
Covariance Matrix
Definition
For multivariate r.v.s, for example $\mathbf{X} = (X_1, \ldots, X_m)^T$ and $\mathbf{Y} = (Y_1, \ldots, Y_n)^T$, we can define the covariance matrix
$$\operatorname{Cov}(\mathbf{X}, \mathbf{Y}) = E\big[(\mathbf{X} - E[\mathbf{X}])(\mathbf{Y} - E[\mathbf{Y}])^T\big],$$
an $m \times n$ matrix whose $(i, j)$ entry is $\operatorname{Cov}(X_i, Y_j)$.
In particular, if $\mathbf{Y} = \mathbf{X}$, we call $\Sigma = \operatorname{Cov}(\mathbf{X}, \mathbf{X}) = E\big[(\mathbf{X} - E[\mathbf{X}])(\mathbf{X} - E[\mathbf{X}])^T\big]$ the variance matrix of $\mathbf{X}$.
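A small sketch of estimating a variance matrix from samples (assuming NumPy; the dependence structure is made up for illustration):

```python
import numpy as np

# A 2-dimensional random vector X = (X1, X2) with Cov(X1, X2) = 0.5.
rng = np.random.default_rng(2)
n = 200_000
x1 = rng.standard_normal(n)
x2 = 0.5 * x1 + rng.standard_normal(n)

# np.cov expects rows = variables, columns = observations.
X = np.vstack([x1, x2])
Sigma = np.cov(X)
print(Sigma)   # approximately [[1.0, 0.5], [0.5, 1.25]]
```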
Properties
- $\Sigma$ is symmetric and positive semi-definite (it is positive definite unless some nontrivial linear combination of the components of $\mathbf{X}$ is almost surely constant)
- If $A$ is a symmetric matrix, we call $\mathbf{X}^T A \mathbf{X}$ a quadratic form of $\mathbf{X}$. And we have
  $$E[\mathbf{X}^T A \mathbf{X}] = \operatorname{tr}(A \Sigma) + \mu^T A \mu, \qquad \text{where } \mu = E[\mathbf{X}] \text{ and } \Sigma = \operatorname{Var}(\mathbf{X}).$$
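Short sketches of both claims: for any constant vector $a$,
$$a^T \Sigma a = \operatorname{Var}(a^T \mathbf{X}) \ge 0,$$
so $\Sigma$ is positive semi-definite. For the quadratic form, since $\mathbf{X}^T A \mathbf{X}$ is a scalar, $\mathbf{X}^T A \mathbf{X} = \operatorname{tr}(A \mathbf{X} \mathbf{X}^T)$, and by linearity of expectation and of the trace,
$$E[\mathbf{X}^T A \mathbf{X}] = \operatorname{tr}\!\big(A\,E[\mathbf{X}\mathbf{X}^T]\big) = \operatorname{tr}\!\big(A(\Sigma + \mu\mu^T)\big) = \operatorname{tr}(A\Sigma) + \mu^T A \mu.$$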