Unbiasedness

A point estimator is considered unbiased if its expected value equals the true value of the parameter being estimated. For example, if $\hat{\theta}$ is an estimator for a parameter $\theta$, it is unbiased if

$$E[\hat{\theta}] = \theta.$$
Common examples of unbiased estimators include the sample mean for estimating the population mean, and the sample variance computed with Bessel's correction (dividing by $n-1$ instead of $n$).
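A quick Monte Carlo check of Bessel's correction (a sketch using NumPy; the distribution, sample size, and trial count are arbitrary choices for illustration):

```python
import numpy as np

# Simulate many small samples from N(0, 2^2), whose true variance is 4.
rng = np.random.default_rng(0)
n = 10                   # small n makes the bias of dividing by n visible
trials = 200_000

x = rng.normal(0.0, 2.0, size=(trials, n))
biased = x.var(axis=1, ddof=0)      # divide by n
unbiased = x.var(axis=1, ddof=1)    # divide by n - 1 (Bessel's correction)

# E[biased] ≈ (n-1)/n * 4 = 3.6, while E[unbiased] ≈ 4.0
print(f"mean of biased estimates:   {biased.mean():.3f}")
print(f"mean of unbiased estimates: {unbiased.mean():.3f}")
```

Averaging over many trials approximates the expectation of each estimator, so the dividing-by-$n$ version visibly undershoots the true variance while the corrected one does not.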

To obtain an unbiased estimator, we can first use point-estimation techniques such as the method of moments or maximum likelihood (MLE) to get an estimator $\hat{\theta}$ of the target parameter $\theta$. Then we correct the estimator by applying a coefficient $c$ to it, solving the equation $E[c\,\hat{\theta}] = \theta$ for $c$.
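As a concrete instance of this recipe (a standard textbook computation for the normal variance MLE):

$$E\!\left[\hat{\sigma}^2_{\mathrm{MLE}}\right] = E\!\left[\frac{1}{n}\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2\right] = \frac{n-1}{n}\,\sigma^2,$$

so solving $E\!\left[c\,\hat{\sigma}^2_{\mathrm{MLE}}\right] = \sigma^2$ gives $c = \frac{n}{n-1}$, and applying this coefficient recovers exactly Bessel's correction.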

However, unbiasedness does not always guarantee that an estimator is the best choice. In some cases, biased estimators perform better in terms of mean squared error (MSE) than unbiased ones. This is particularly true when a biased estimator has lower variance, which can lead to more reliable estimates in practice. Mathematically, we have

$$\mathrm{MSE}(\hat{\theta}) = E\!\left[(\hat{\theta} - \theta)^2\right] = \mathrm{Var}(\hat{\theta}) + \mathrm{Bias}(\hat{\theta})^2,$$

since $E[(\hat{\theta} - \theta)^2] = E[(\hat{\theta} - E[\hat{\theta}])^2] + (E[\hat{\theta}] - \theta)^2$ (the cross term vanishes in expectation). From this formula we can see that the MSE of an estimator is the sum of its variance and the square of its bias.
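The decomposition can be verified numerically, and the same simulation shows a biased estimator beating the unbiased one in MSE (a sketch assuming i.i.d. normal data; for the normal variance, dividing the sum of squares by $n+1$ is the classical minimum-MSE divisor):

```python
import numpy as np

rng = np.random.default_rng(2)
sigma2 = 1.0          # true variance of the simulated N(0, 1) data
n = 5
trials = 500_000

x = rng.normal(0.0, 1.0, size=(trials, n))
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)  # sum of squared deviations

results = {}
for divisor, label in [(n - 1, "unbiased"), (n, "mle"), (n + 1, "min-mse")]:
    est = ss / divisor
    bias = est.mean() - sigma2
    var = est.var()
    mse = ((est - sigma2) ** 2).mean()
    results[label] = mse
    # mse should match var + bias^2 up to Monte Carlo noise
    print(f"{label:9s} bias={bias:+.4f}  var={var:.4f}  "
          f"mse={mse:.4f}  var+bias^2={var + bias ** 2:.4f}")
```

Here the unbiased $n-1$ divisor has zero bias but the largest variance, while the biased divisors trade a little bias for enough variance reduction to lower the overall MSE.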

Efficiency

Efficiency measures how well an estimator performs relative to other estimators. An efficient estimator achieves the lowest possible variance among all unbiased estimators of a given parameter. Mathematically, if two unbiased estimators $\hat{\theta}_1$ and $\hat{\theta}_2$ satisfy, for all $\theta$,

$$\mathrm{Var}(\hat{\theta}_1) \le \mathrm{Var}(\hat{\theta}_2),$$

and there exists at least one $\theta$ such that $\mathrm{Var}(\hat{\theta}_1) < \mathrm{Var}(\hat{\theta}_2)$, then we say $\hat{\theta}_1$ is more efficient than $\hat{\theta}_2$.
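A classic illustration of this comparison is the sample mean versus the sample median for normal data: both are unbiased for the center $\mu$, but the mean has smaller variance (a simulation sketch; the parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 25
trials = 200_000

# Samples from N(5, 1); by symmetry, both estimators below are unbiased for mu = 5.
x = rng.normal(5.0, 1.0, size=(trials, n))
means = x.mean(axis=1)
medians = np.median(x, axis=1)

print(f"E[mean]   ≈ {means.mean():.3f},  Var(mean)   ≈ {means.var():.4f}")
print(f"E[median] ≈ {medians.mean():.3f},  Var(median) ≈ {medians.var():.4f}")
```

Theory predicts $\mathrm{Var}(\bar{X}) = \sigma^2/n = 0.04$ here, while the median's variance is roughly $\pi\sigma^2/(2n) \approx 0.063$ for large $n$, so the sample mean is the more efficient of the two under normality.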