Convergence of Sequences
Convergence in Distribution
A sequence $(X_n)_{n \ge 1}$ of real-valued random variables with CDFs $(F_n)_{n \ge 1}$ is said to converge in distribution, or converge weakly, or converge in law to a random variable $X$ with CDF $F$ if
$$\lim_{n \to \infty} F_n(x) = F(x)$$
for every number $x \in \mathbb{R}$ at which $F$ is continuous. We denote this as $X_n \xrightarrow{d} X$.
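As a concrete (illustrative) check of this definition: if $M_n$ is the maximum of $n$ i.i.d. $\mathrm{Uniform}(0,1)$ variables, then $n(1 - M_n) \xrightarrow{d} \mathrm{Exp}(1)$, and here the CDFs can be compared exactly, with no simulation:

```python
import math

def cdf_n(x, n):
    # exact CDF of n * (1 - max(U_1, ..., U_n)) at x, for U_i ~ Uniform(0, 1):
    # P(n * (1 - M_n) <= x) = 1 - (1 - x / n)^n, valid for 0 <= x <= n
    return 1.0 - (1.0 - x / n) ** n

limit = 1.0 - math.exp(-1.0)              # limiting Exp(1) CDF at x = 1
vals = [cdf_n(1.0, n) for n in (10, 100, 10000)]
print(vals, limit)                        # values approach 1 - e^(-1) ≈ 0.632
```

Because the CDFs converge at every continuity point of the limit, evaluating at any fixed $x$ (here $x = 1$) shows the convergence directly.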
Convergence in Probability
A sequence of random variables $(X_n)_{n \ge 1}$ converges in probability towards the random variable $X$ if for all $\varepsilon > 0$
$$\lim_{n \to \infty} P(|X_n - X| > \varepsilon) = 0.$$
This can be denoted by $X_n \xrightarrow{P} X$.
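For a small worked example (not from the notes): take $X_n = X + Z_n$ with $Z_n \sim N(0, 1/n)$ independent of $X$. Then $P(|X_n - X| > \varepsilon) = 2\left(1 - \Phi(\varepsilon\sqrt{n})\right)$, which can be evaluated exactly:

```python
import math

def prob_dev(n, eps=0.1):
    # exact P(|X_n - X| > eps) when X_n = X + Z_n with Z_n ~ N(0, 1/n):
    # |X_n - X| = |Z_n|, and sqrt(n) * Z_n ~ N(0, 1)
    z = eps * math.sqrt(n)
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF at z
    return 2.0 * (1.0 - phi)

print([round(prob_dev(n), 4) for n in (1, 100, 10000)])  # decreases towards 0
```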
Almost Everywhere
To say that the sequence $(X_n)_{n \ge 1}$ converges almost surely, or almost everywhere, or with probability 1, or strongly towards $X$ means that
$$P\left(\lim_{n \to \infty} X_n = X\right) = 1.$$
This means that the values of $X_n$ approach the value of $X$, in the sense that the events on which $X_n$ does not converge to $X$ have probability $0$. This can be denoted as $X_n \xrightarrow{a.s.} X$.
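Almost sure convergence is a statement about individual outcomes $\omega$. An illustrative example: $X_n = U^n$ with $U \sim \mathrm{Uniform}(0,1)$ converges to $0$ for every outcome with $U(\omega) < 1$, an event of probability 1. A Monte Carlo sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.random(100_000)  # one realization of U per outcome omega

# X_n(omega) = U(omega)^n -> 0 for every omega with U(omega) < 1,
# so the set of non-convergent outcomes has probability 0
fracs = [float(np.mean(u ** n > 1e-3)) for n in (10, 200, 2000)]
print(fracs)  # fraction of outcomes still far from 0 shrinks with n
```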
Convergence in Mean
Given a real number $r \ge 1$, we say that the sequence $(X_n)_{n \ge 1}$ converges in the $r$-th mean towards the random variable $X$ if
$$\lim_{n \to \infty} E\left[|X_n - X|^r\right] = 0.$$
Convergence in $r$-th mean tells us that the expectation of the $r$-th power of the difference between $X_n$ and $X$ converges to zero. This can be denoted as $X_n \xrightarrow{L^r} X$.
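As an illustrative example, take $X_n \sim \mathrm{Uniform}(0, 1/n)$ converging to $X = 0$: then $E[|X_n - 0|^r] = \frac{1}{(r+1)n^r}$ exactly, and a Monte Carlo estimate agrees:

```python
import numpy as np

def rth_mean_exact(n, r):
    # E[|X_n|^r] for X_n ~ Uniform(0, 1/n): n * integral of x^r over (0, 1/n)
    return 1.0 / ((r + 1) * n ** r)

rng = np.random.default_rng(2)
n, r = 10, 2
mc = float(np.mean((rng.random(100_000) / n) ** r))  # Monte Carlo estimate
print(rth_mean_exact(n, r), mc)                      # both close to 1/300
```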
Relationship
- (Continuous Mapping) For a continuous function $g$ and $X_n \xrightarrow{d} X$, we have $g(X_n) \xrightarrow{d} g(X)$.
- (Slutsky's Theorem) If $c$ is a constant, $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{P} c$, then $X_n + Y_n \xrightarrow{d} X + c$ and $X_n Y_n \xrightarrow{d} cX$.
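A numerical sketch of Slutsky's theorem (the setup is illustrative): $X_n$ is the CLT-standardized mean of uniforms, so $X_n \xrightarrow{d} N(0,1)$, while $Y_n$ is an independent sample mean with $Y_n \xrightarrow{P} \tfrac{1}{2}$; the product should then be close in distribution to $\tfrac{1}{2} N(0,1)$:

```python
import numpy as np

rng = np.random.default_rng(3)
trials, n = 5000, 500

u = rng.random((trials, n))
x_n = (u.mean(axis=1) - 0.5) * np.sqrt(12 * n)  # CLT: -> N(0, 1) in distribution
y_n = rng.random((trials, n)).mean(axis=1)      # -> 1/2 in probability

prod = x_n * y_n  # Slutsky: X_n * Y_n -> (1/2) * N(0, 1) in distribution
print(prod.mean(), prod.std())  # mean near 0, standard deviation near 0.5
```

The factor $\sqrt{12n}$ standardizes the mean of uniforms, since $\mathrm{Var}(U) = 1/12$.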
Law of Large Numbers
Weak Law
Given a collection $X_1, X_2, \dots$ of i.i.d. samples from a random variable with finite mean $\mu$, the sample mean converges in probability to the expected value, that is
$$\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i \xrightarrow{P} \mu \quad \text{as } n \to \infty.$$
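A quick Monte Carlo sketch of the weak law, using $\mathrm{Exp}(1)$ samples (mean 1); all sizes here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
trials, eps = 4000, 0.1

def dev_prob(n):
    # estimate P(|mean of n Exp(1) samples - 1| > eps) over many trials
    means = rng.exponential(1.0, size=(trials, n)).mean(axis=1)
    return float(np.mean(np.abs(means - 1.0) > eps))

probs = [dev_prob(n) for n in (10, 100, 1000)]
print(probs)  # shrinks towards 0 as n grows
```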
Strong Law
Given a collection of
- i.i.d. samples $X_1, X_2, \dots$
- with finite mean $\mu$
- with finite variance,

then the sample mean converges almost everywhere to the expected value, that is
$$\bar{X}_n \xrightarrow{a.s.} \mu \quad \text{as } n \to \infty.$$
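The strong law is a statement about a single path: along one long i.i.d. sequence, the running mean eventually stays near the true mean. A sketch with exponential samples of mean 2 (all constants illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n_max = 200_000
x = rng.exponential(2.0, n_max)                   # one path; true mean is 2
running = np.cumsum(x) / np.arange(1, n_max + 1)  # running sample means

# past a burn-in point, the whole tail of this path stays near 2
tail_max_dev = float(np.abs(running[100_000:] - 2.0).max())
print(tail_max_dev)
```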
Chebyshev Inequality
If the expectation $\mu = E[X]$ and the variance $\sigma^2 = \mathrm{Var}(X)$ of a r.v. $X$ both exist, then for any $\varepsilon > 0$ we have
$$P(|X - \mu| \ge \varepsilon) \le \frac{\sigma^2}{\varepsilon^2}.$$
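A quick Monte Carlo check that the bound holds, using $\mathrm{Exp}(1)$ (mean 1, variance 1); the distribution is just an example:

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.exponential(1.0, 1_000_000)  # Exp(1): mu = 1, sigma^2 = 1

checks = []
for eps in (1.0, 2.0, 3.0):
    actual = float(np.mean(np.abs(x - 1.0) >= eps))  # estimated tail probability
    bound = 1.0 / eps ** 2                           # Chebyshev: sigma^2 / eps^2
    checks.append((eps, actual, bound))
print(checks)  # each estimated tail probability sits below its Chebyshev bound
```

The bound is loose for well-behaved distributions, but it needs only the mean and variance to exist.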
With Chebyshev's inequality applied to the sample mean $\bar{X}_n$, the conclusion of the weak law of large numbers can be rewritten as a quantitative bound:
$$P\left(\left|\bar{X}_n - E[\bar{X}_n]\right| \ge \varepsilon\right) \le \frac{\mathrm{Var}(\bar{X}_n)}{\varepsilon^2}.$$
Therefore, if $X_1, X_2, \dots$ are independent and bounded in variance, i.e. $\mathrm{Var}(X_i) \le C$ for all $i$, then $\mathrm{Var}(\bar{X}_n) \le C/n$ and we have
$$P\left(\left|\bar{X}_n - \frac{1}{n}\sum_{i=1}^{n} E[X_i]\right| \ge \varepsilon\right) \le \frac{C}{n\varepsilon^2} \to 0,$$
where $C$ is the bound of the variances. Now in this case we have proved
$$\bar{X}_n - \frac{1}{n}\sum_{i=1}^{n} E[X_i] \xrightarrow{P} 0,$$
which is known as Chebyshev's Law of Large Numbers.
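Note that Chebyshev's law does not require identical distributions. A sketch with independent $\mathrm{Uniform}(0, a_i)$ variables (the scales $a_i$ are arbitrary illustrative values), whose variances $a_i^2/12$ share the bound $C = 1/12$:

```python
import numpy as np

rng = np.random.default_rng(7)
trials, n, eps = 1000, 2000, 0.05

a = rng.random(n)                    # scales a_i in (0, 1), chosen at random
x = rng.random((trials, n)) * a      # X_i ~ Uniform(0, a_i), independent
avg_mean = float((a / 2).mean())     # (1/n) * sum of E[X_i], with E[X_i] = a_i / 2

actual = float(np.mean(np.abs(x.mean(axis=1) - avg_mean) > eps))
bound = (1.0 / 12) / (n * eps ** 2)  # C / (n * eps^2) with C = 1/12
print(actual, bound)                 # estimated deviation probability <= bound
```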
Khinchin's Law of Large Numbers
If $X_1, X_2, \dots$ are i.i.d. and their expectation $\mu = E[X_1]$ exists, then
$$\bar{X}_n \xrightarrow{P} \mu \quad \text{as } n \to \infty.$$
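Khinchin's law needs only a finite mean, not a finite variance. A sketch with a Pareto distribution of tail index 1.5 (minimum 1, so the mean is 3 while the variance is infinite); convergence is visibly slower than in the finite-variance case, but the deviation probability still shrinks. All sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(8)
trials, eps = 1000, 0.5

def dev_prob(n):
    # numpy's pareto() draws Lomax samples; adding 1 gives a classical
    # Pareto with minimum 1 and tail index 1.5, so E[X] = 1.5 / 0.5 = 3
    x = 1.0 + rng.pareto(1.5, size=(trials, n))
    return float(np.mean(np.abs(x.mean(axis=1) - 3.0) > eps))

probs = [dev_prob(n) for n in (100, 3000)]
print(probs)  # still decreasing despite the infinite variance
```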