Definition

Among all unbiased estimators of $\theta$, we want to find the best one $\hat{\theta}$, in the sense that it has the lowest variance (that is, it is the most efficient) among all these unbiased estimators. We call such a $\hat{\theta}$ the uniformly minimum variance unbiased estimator (UMVUE) of $\theta$.

To find a UMVUE of $\theta$, we can apply the two methods below.

Cramer-Rao Inequality

Fisher Information Matrix

Let $f(x;\theta)$, $\theta \in \Theta \subseteq \mathbb{R}^{k}$, be the PDF of the population and let $X_1, \dots, X_n \overset{\text{i.i.d.}}{\sim} f(x;\theta)$. For a single observation $X \sim f(x;\theta)$, we define

$$I(\theta) = \mathbb{E}\Big[\nabla_\theta \log f(X;\theta)\,\big(\nabla_\theta \log f(X;\theta)\big)^{\top}\Big]$$

as the Fisher Information Matrix (FIM).

If $k = 1$, the matrix becomes a number:

$$I(\theta) = \mathbb{E}\left[\left(\frac{\partial}{\partial \theta}\log f(X;\theta)\right)^{2}\right].$$
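As an added illustration (not part of the original notes), consider a Bernoulli($p$) population with PMF $f(x;p) = p^{x}(1-p)^{1-x}$, $x \in \{0,1\}$. Then

$$\frac{\partial}{\partial p}\log f(X;p) = \frac{X}{p} - \frac{1-X}{1-p} = \frac{X-p}{p(1-p)},$$

so

$$I(p) = \mathbb{E}\left[\left(\frac{X-p}{p(1-p)}\right)^{2}\right] = \frac{\mathrm{Var}(X)}{p^{2}(1-p)^{2}} = \frac{1}{p(1-p)}.$$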

Assume we can interchange the gradient and the integral for $f(x;\theta)$ without affecting the result, that is,

$$\nabla_\theta \int f(x;\theta)\,\mathrm{d}x = \int \nabla_\theta f(x;\theta)\,\mathrm{d}x$$

and

$$\nabla_\theta^{2} \int f(x;\theta)\,\mathrm{d}x = \int \nabla_\theta^{2} f(x;\theta)\,\mathrm{d}x.$$
Then we will have

$$\mathbb{E}\big[\nabla_\theta \log f(X;\theta)\big] = \int \frac{\nabla_\theta f(x;\theta)}{f(x;\theta)}\,f(x;\theta)\,\mathrm{d}x = \nabla_\theta \int f(x;\theta)\,\mathrm{d}x = \nabla_\theta\,1 = 0,$$

where we call $\nabla_\theta \log f(X;\theta)$ the score function of the population. Therefore, we get

$$I(\theta) = \mathrm{Cov}\big(\nabla_\theta \log f(X;\theta)\big)$$

and

$$I(\theta) = -\mathbb{E}\big[\nabla_\theta^{2} \log f(X;\theta)\big].$$
For $k = 1$, this is

$$I(\theta) = -\mathbb{E}\left[\frac{\partial^{2}}{\partial \theta^{2}}\log f(X;\theta)\right].$$
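As a quick check of the two equivalent forms (an added example, not from the original notes), let the population be $N(\mu, \sigma^{2})$ with $\sigma^{2}$ known and $\theta = \mu$. Then $\log f(x;\mu) = -\tfrac{1}{2}\log(2\pi\sigma^{2}) - \frac{(x-\mu)^{2}}{2\sigma^{2}}$, so

$$\frac{\partial}{\partial \mu}\log f(X;\mu) = \frac{X-\mu}{\sigma^{2}}, \qquad \frac{\partial^{2}}{\partial \mu^{2}}\log f(X;\mu) = -\frac{1}{\sigma^{2}},$$

and both expressions give the same information:

$$I(\mu) = \mathbb{E}\left[\left(\frac{X-\mu}{\sigma^{2}}\right)^{2}\right] = \frac{1}{\sigma^{2}} = -\mathbb{E}\left[-\frac{1}{\sigma^{2}}\right].$$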

C-R Inequality

Let $T(X) = T(X_1, \dots, X_n)$ be an estimator of any vector function of the parameters from the observed sample $X = (X_1, \dots, X_n)$, and denote its expectation vector $\mathbb{E}_\theta[T(X)]$ by $\psi(\theta)$. The C-R Inequality then states that the covariance matrix of $T(X)$ satisfies

$$\mathrm{Cov}_\theta\big(T(X)\big) \succeq \frac{\partial \psi(\theta)}{\partial \theta}\,\big[nI(\theta)\big]^{-1}\left(\frac{\partial \psi(\theta)}{\partial \theta}\right)^{\top},$$

where

  • The matrix inequality $A \succeq B$ is understood to mean that the matrix $A - B$ is positive semidefinite
  • $\frac{\partial \psi(\theta)}{\partial \theta}$ is the Jacobian matrix whose $(i,j)$ element is given by $\frac{\partial \psi_i(\theta)}{\partial \theta_j}$

Since the observations $X_1, \dots, X_n$ are i.i.d., the Fisher information of the whole sample is $n$ times the per-observation information, which is why $nI(\theta)$ appears in the bound.

If $T(X)$ is unbiased for $\theta$ (i.e. $\psi(\theta) = \mathbb{E}_\theta[T(X)] = \theta$), then the inequality reduces to

$$\mathrm{Cov}_\theta\big(T(X)\big) \succeq \big[nI(\theta)\big]^{-1}.$$
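To make the matrix statement concrete, here is an added example (not from the original notes). For a $N(\mu, \sigma^{2})$ population with $\theta = (\mu, \sigma^{2})^{\top}$, the per-observation FIM is

$$I(\theta) = \begin{pmatrix} \dfrac{1}{\sigma^{2}} & 0 \\ 0 & \dfrac{1}{2\sigma^{4}} \end{pmatrix},$$

so any unbiased estimator $T(X)$ of $(\mu, \sigma^{2})^{\top}$ satisfies

$$\mathrm{Cov}_\theta\big(T(X)\big) \succeq \big[nI(\theta)\big]^{-1} = \begin{pmatrix} \dfrac{\sigma^{2}}{n} & 0 \\ 0 & \dfrac{2\sigma^{4}}{n} \end{pmatrix}.$$

For instance, the usual estimator $(\bar{X}, S^{2})$ is unbiased, and $\mathrm{Var}(S^{2}) = \frac{2\sigma^{4}}{n-1} > \frac{2\sigma^{4}}{n}$, so it does not attain the bound in the second coordinate.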

If $k = 1$, the inequality becomes

$$\mathrm{Var}_\theta\big(T(X)\big) \ge \frac{[\psi'(\theta)]^{2}}{nI(\theta)},$$

and $\psi'(\theta) = 1$ in the unbiased case, where $\psi(\theta) = \theta$.
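For instance (continuing the Bernoulli($p$) example added above), if $T(X)$ is an unbiased estimator of $\psi(p) = p^{2}$, then $\psi'(p) = 2p$ and $I(p) = 1/(p(1-p))$, so

$$\mathrm{Var}_p\big(T(X)\big) \ge \frac{(2p)^{2}}{n/(p(1-p))} = \frac{4p^{3}(1-p)}{n}.$$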

Particularly, for an unbiased estimator of a scalar parameter $\theta$, we would have

$$\mathrm{Var}_\theta\big(T(X)\big) \ge \frac{1}{nI(\theta)}.$$

From the C-R inequality, we see that if the variance of an unbiased estimator attains this lower bound, then it is the UMVUE.
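As a closing illustration (an added example, not part of the original notes), suppose the population is $N(\mu, \sigma^{2})$ with $\sigma^{2}$ known. From the example above, $I(\mu) = 1/\sigma^{2}$, so the lower bound for unbiased estimators of $\mu$ is $\sigma^{2}/n$. The sample mean $\bar{X}$ is unbiased with

$$\mathrm{Var}_\mu(\bar{X}) = \frac{\sigma^{2}}{n} = \frac{1}{nI(\mu)},$$

so $\bar{X}$ attains the bound and is therefore the UMVUE of $\mu$.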