Cramér–Rao Bound


In estimation theory and statistics, the Cramér–Rao bound (CRB) or Cramér–Rao lower bound (CRLB), named in honor of Harald Cramér and Calyampudi Radhakrishna Rao who were among the first to derive it, expresses a lower bound on the variance of estimators of a deterministic parameter. The bound is also known as the Cramér–Rao inequality or the information inequality.

In its simplest form, the bound states that the variance of any unbiased estimator is at least as high as the inverse of the Fisher information. An unbiased estimator that achieves this lower bound is said to be (fully) efficient. Such an estimator achieves the lowest possible mean squared error among all unbiased methods and is therefore the minimum variance unbiased (MVU) estimator. In some cases, however, no unbiased estimator achieves the bound; this may occur even when an MVU estimator exists.
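As a concrete illustration (a minimal sketch, not drawn from this article; the function names and Monte Carlo setup are my own), the snippet below checks numerically that the sample mean of n draws from N(mu, sigma^2) attains the bound: the Fisher information about mu carried by n samples is n/sigma^2, so the CRB for any unbiased estimator of mu is sigma^2/n.

```python
import random
import statistics

def crb_normal_mean(n, sigma):
    """Cramér–Rao lower bound for estimating mu from n i.i.d. N(mu, sigma^2)
    samples: the Fisher information is n / sigma^2, so the bound is sigma^2 / n."""
    return sigma**2 / n

def sample_mean_variance(n, mu, sigma, trials=20000, seed=0):
    """Empirical variance of the sample-mean estimator over many repeated trials."""
    rng = random.Random(seed)
    estimates = [
        statistics.fmean(rng.gauss(mu, sigma) for _ in range(n))
        for _ in range(trials)
    ]
    return statistics.pvariance(estimates)

if __name__ == "__main__":
    n, mu, sigma = 10, 3.0, 2.0
    bound = crb_normal_mean(n, sigma)            # 2.0**2 / 10 = 0.4
    empirical = sample_mean_variance(n, mu, sigma)
    # The sample mean is unbiased and efficient, so the empirical variance
    # should hover near the bound rather than fall below it.
    print(f"CRB = {bound:.3f}, empirical variance = {empirical:.3f}")
```

Here the sample mean saturates the bound, which is what "(fully) efficient" means; for most other models the bound is strict and no unbiased estimator reaches it.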

The Cramér–Rao bound can also be used to bound the variance of biased estimators of given bias. In some cases, a biased estimator can have both a variance and a mean squared error below the unbiased Cramér–Rao lower bound; see estimator bias.
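To see how a biased estimator can beat the unbiased bound, consider the classic case of estimating sigma^2 from N(mu, sigma^2) with mu known. A sketch (function names are hypothetical; the chi-squared moments used in the comments are standard facts, not taken from this article) comparing closed-form mean squared errors:

```python
# With Q = sum_i (x_i - mu)^2 and mu known, Q / sigma^2 is chi-squared with
# n degrees of freedom, so E[Q] = n*sigma^2 and Var(Q) = 2*n*sigma^4.

def mse_scaled_estimator(c, n, sigma2):
    """MSE of the estimator Q / c for sigma^2, via bias-variance decomposition."""
    var = 2 * n * sigma2**2 / c**2      # Var(Q / c)
    bias = sigma2 * (n / c - 1)         # E[Q / c] - sigma^2
    return var + bias**2

def crb_normal_variance(n, sigma2):
    """Cramér–Rao lower bound for unbiased estimators of sigma^2: 2*sigma^4 / n."""
    return 2 * sigma2**2 / n

if __name__ == "__main__":
    n, sigma2 = 10, 1.0
    unbiased_mse = mse_scaled_estimator(n, n, sigma2)      # Q/n is unbiased, efficient
    biased_mse = mse_scaled_estimator(n + 2, n, sigma2)    # Q/(n+2) is biased
    bound = crb_normal_variance(n, sigma2)
    # The unbiased estimator's MSE equals the CRB, while the biased
    # estimator's MSE of 2*sigma^4/(n+2) falls strictly below it.
    print(f"unbiased MSE = {unbiased_mse}, biased MSE = {biased_mse}, CRB = {bound}")
```

The divisor n + 2 is the choice of c that minimizes the MSE of Q / c here, trading a small bias for a larger reduction in variance; this is the standard example behind the claim above.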


