Consistency

Definition

Consistency in statistics refers to a fundamental property of an estimator: the estimates converge in probability to the true parameter value as the sample size increases indefinitely.

In other words, if for all $\varepsilon > 0$ we have

$$\lim_{n \to \infty} P\left(\left|\hat{\theta}_n - \theta\right| > \varepsilon\right) = 0,$$

then we call $\hat{\theta}_n$ a consistent estimator of $\theta$. This type of consistency is also called weak consistency.
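
For a standard example: by the weak law of large numbers, the sample mean $\bar{X}_n$ of i.i.d. observations with finite mean $\mu$ satisfies

$$P\left(\left|\bar{X}_n - \mu\right| > \varepsilon\right) \to 0 \quad \text{for every } \varepsilon > 0,$$

so $\bar{X}_n$ is a consistent estimator of $\mu$.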

For strong consistency, we require the estimator to converge almost surely to the true parameter as the sample size approaches infinity.
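
In symbols, strong consistency means almost sure convergence:

$$P\left(\lim_{n \to \infty} \hat{\theta}_n = \theta\right) = 1.$$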

Conditions

Method of moments estimators are usually consistent because they are based on sample moments, which converge to the corresponding population moments (by the law of large numbers) as the sample size increases.
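
For instance (a standard example, added here for illustration): for i.i.d. $X_i \sim \mathrm{Exp}(\lambda)$, matching the first moment gives $\bar{X}_n = 1/\hat{\lambda}$, i.e. $\hat{\lambda} = 1/\bar{X}_n$. Since $\bar{X}_n \xrightarrow{P} 1/\lambda$ by the law of large numbers and $x \mapsto 1/x$ is continuous at $1/\lambda \neq 0$, the continuous mapping theorem yields $\hat{\lambda} \xrightarrow{P} \lambda$.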

MLE estimators are consistent if

  1. The log-likelihood function is differentiable
  2. The likelihood function is identifiable; that is, either
    • different parameter values correspond to different probability distributions and the parameter space is compact, or
    • the Kullback–Leibler divergence $D_{\mathrm{KL}}\left(f_{\theta} \,\|\, f_{\theta'}\right)$ between two distributions with different parameters is strictly positive whenever $\theta \neq \theta'$
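
To make this concrete, here is a minimal simulation sketch (the Exponential model, seed, and sample sizes are illustrative choices, not from the original notes): for $X_i \sim \mathrm{Exp}(\lambda)$, the MLE of the rate is $\hat{\lambda} = 1/\bar{X}_n$, and the estimates should settle near the true value as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
true_rate = 2.0

for n in (10, 100, 1_000, 10_000, 100_000):
    sample = rng.exponential(scale=1.0 / true_rate, size=n)
    mle = 1.0 / sample.mean()  # closed-form MLE of the exponential rate
    print(f"n = {n:>6}:  MLE = {mle:.4f}  (true rate = {true_rate})")
```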

Note

These conditions are weaker than those required for convergence in the first-order ($L^1$) norm: convergence in $L^1$ implies convergence in probability, but not conversely.
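
To see the implication, Markov's inequality gives, for any $\varepsilon > 0$,

$$P\left(\left|\hat{\theta}_n - \theta\right| > \varepsilon\right) \le \frac{E\left|\hat{\theta}_n - \theta\right|}{\varepsilon},$$

so $E\left|\hat{\theta}_n - \theta\right| \to 0$ forces the left-hand side to vanish as well.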

Asymptotic Normality

Definition

Asymptotic Normality refers to the property of an estimator whereby, as the sample size $n$ approaches infinity, the distribution of the estimator converges to a normal distribution. This means that for sufficiently large samples, the distribution of the estimator can be approximated by a normal distribution, regardless of the original distribution of the data from which it was derived.

The asymptotic normality of an estimator can be seen as a generalization of the central limit theorem (CLT), which covers the particular case where the estimator is the sample mean.
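
Concretely, for i.i.d. observations with mean $\mu$ and finite variance $\sigma^2$, the CLT states

$$\sqrt{n}\left(\bar{X}_n - \mu\right) \xrightarrow{d} N\left(0, \sigma^2\right).$$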

An MLE $\hat{\theta}_n$ of the true parameter $\theta_0$ has asymptotic normality if

  1. For every $x$, the partial derivatives $\frac{\partial \log f(x;\theta)}{\partial \theta}$, $\frac{\partial^2 \log f(x;\theta)}{\partial \theta^2}$, and $\frac{\partial^3 \log f(x;\theta)}{\partial \theta^3}$ exist.
  2. For every $\theta$ in a neighborhood of $\theta_0$, $\left|\frac{\partial^3 \log f(x;\theta)}{\partial \theta^3}\right| \le H(x)$, where $H(x)$ is integrable and $E_{\theta_0}[H(X)] < \infty$.
  3. The order of the partial derivative with respect to $\theta$ and the integral of $f(x;\theta)$ over $x$ can be exchanged.

Then we will have

$$\sqrt{n}\left(\hat{\theta}_n - \theta_0\right) \xrightarrow{d} N\left(0, \frac{1}{I(\theta_0)}\right),$$

where $I(\theta_0)$ is the Fisher information. From this formula we can see that the asymptotic variance of $\hat{\theta}_n$ is $\frac{1}{n I(\theta_0)}$ (if $\hat{\theta}_n$ is an MLE), which happens to be the lower bound of the Cramér–Rao inequality.
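
A quick simulation can make this visible; the sketch below is an illustrative addition (the Exponential model, seed, and sample sizes are arbitrary choices, not from the original notes). For $\mathrm{Exp}(\lambda)$, the Fisher information is $I(\lambda) = 1/\lambda^2$, so the standardized MLE $\sqrt{n}\left(\hat{\lambda}_n - \lambda\right)$ should have standard deviation close to $\sqrt{1/I(\lambda)} = \lambda$.

```python
import numpy as np

rng = np.random.default_rng(1)
true_rate, n, reps = 2.0, 5_000, 2_000

# One MLE per replication, each from a fresh sample of size n.
samples = rng.exponential(scale=1.0 / true_rate, size=(reps, n))
mles = 1.0 / samples.mean(axis=1)

# Standardize: sqrt(n) * (mle - lambda) should be roughly N(0, lambda^2).
z = np.sqrt(n) * (mles - true_rate)
print(f"empirical sd of sqrt(n)*(mle - rate): {z.std():.3f}")
print(f"theoretical sd, sqrt(1/I(rate)) = rate: {true_rate:.3f}")
```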

Delta Method

The Delta Method is a statistical technique used to derive the asymptotic distribution of a function of an estimator that is asymptotically normal.

If we have an estimator $\hat{\theta}_n$ that is asymptotically normal, i.e.

$$\sqrt{n}\left(\hat{\theta}_n - \theta\right) \xrightarrow{d} N\left(0, \sigma^2\right),$$

we can utilize a first-order Taylor series expansion of a differentiable function $g$ around the point $\theta$:

$$g(\hat{\theta}_n) \approx g(\theta) + g'(\theta)\left(\hat{\theta}_n - \theta\right).$$

Then

  • The asymptotic mean is $g(\theta)$.
  • The asymptotic variance is $[g'(\theta)]^2 \sigma^2 / n$. Therefore, for large $n$, $g(\hat{\theta}_n)$ is approximately distributed as $N\left(g(\theta), [g'(\theta)]^2 \sigma^2 / n\right)$.

That is,

$$\sqrt{n}\left(g(\hat{\theta}_n) - g(\theta)\right) \xrightarrow{d} N\left(0, [g'(\theta)]^2 \sigma^2\right).$$
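
As a quick worked example (added here for illustration): take $\hat{\theta}_n = \bar{X}_n$ with $\sqrt{n}\left(\bar{X}_n - \mu\right) \xrightarrow{d} N(0, \sigma^2)$ and $g(x) = x^2$, so $g'(\mu) = 2\mu$. The delta method then gives, for $\mu \neq 0$,

$$\sqrt{n}\left(\bar{X}_n^2 - \mu^2\right) \xrightarrow{d} N\left(0, 4\mu^2 \sigma^2\right).$$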