In statistics, the observed information, or observed Fisher information, is the negative of the second derivative (the Hessian matrix) of the "log-likelihood" (the logarithm of the likelihood function). It is a sample-based version of the Fisher information.
Suppose we observe random variables $X_1, \ldots, X_n$, independent and identically distributed with density $f(X; \theta)$, where $\theta$ is a (possibly unknown) vector. Then the log-likelihood of the parameters $\theta$ given the data $X_1, \ldots, X_n$ is

$$\ell(\theta \mid X_1, \ldots, X_n) = \sum_{i=1}^{n} \log f(X_i ; \theta).$$
We define the observed information matrix at $\theta^{*}$ as

$$\mathcal{J}(\theta^{*}) = - \left. \frac{\partial^{2}}{\partial\theta\,\partial\theta^{\mathsf{T}}}\,\ell(\theta \mid X_1, \ldots, X_n) \right|_{\theta = \theta^{*}},$$

that is, the negative of the Hessian matrix of the log-likelihood, whose $(i,j)$ entry is $-\partial^{2}\ell/\partial\theta_i\,\partial\theta_j$ evaluated at $\theta = \theta^{*}$.
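For instance (a simple scalar illustration, not tied to any particular application), if the $X_i$ are modelled as Poisson with mean $\theta$, the log-likelihood is

$$\ell(\theta) = -n\theta + \Big(\sum_{i=1}^{n} X_i\Big)\log\theta - \sum_{i=1}^{n}\log(X_i!),$$

so the observed information is the scalar

$$\mathcal{J}(\theta) = -\frac{d^{2}\ell}{d\theta^{2}} = \frac{\sum_{i=1}^{n} X_i}{\theta^{2}}.$$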
In many instances, the observed information is evaluated at the maximum-likelihood estimate.
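Continuing the Poisson illustration, the following minimal Python sketch (with simulated data and illustrative names, not drawn from the article) approximates the observed information at the maximum-likelihood estimate $\hat\theta = \bar X$ by a central finite difference of the negative log-likelihood and compares it with the closed form $n/\bar X$:

```python
import numpy as np

# Hypothetical i.i.d. Poisson sample; the true mean 3.0 is arbitrary.
rng = np.random.default_rng(0)
x = rng.poisson(lam=3.0, size=200)

def neg_log_likelihood(theta):
    # Negative Poisson log-likelihood, dropping the term sum(log(x_i!)),
    # which does not depend on theta.
    return theta * x.size - np.sum(x) * np.log(theta)

theta_hat = x.mean()  # maximum-likelihood estimate of the Poisson mean

# Observed information = second derivative of the negative log-likelihood,
# approximated here by a central finite difference.
h = 1e-4
observed_info = (neg_log_likelihood(theta_hat + h)
                 - 2 * neg_log_likelihood(theta_hat)
                 + neg_log_likelihood(theta_hat - h)) / h**2

print(observed_info, x.size / theta_hat)  # closed form: n / theta_hat
```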
The Fisher information $\mathcal{I}(\theta)$ is the expected value of the observed information given a single observation $X$ distributed according to the hypothetical model with parameter $\theta$:

$$\mathcal{I}(\theta) = \mathrm{E}\big(\mathcal{J}(\theta)\big).$$
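In the illustrative Poisson case above, a single observation contributes $\mathcal{J}(\theta) = X/\theta^{2}$, so

$$\mathcal{I}(\theta) = \mathrm{E}\!\left(\frac{X}{\theta^{2}}\right) = \frac{\theta}{\theta^{2}} = \frac{1}{\theta},$$

and the expected information for the full sample is $n\,\mathcal{I}(\theta) = n/\theta$, the expectation of the observed information $\sum_{i=1}^{n} X_i/\theta^{2}$.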
In a notable article, Bradley Efron and David V. Hinkley argued that the observed information should be used in preference to the expected information when employing normal approximations for the distribution of maximum-likelihood estimates.