Negative log predictive density

From Wikipedia, the free encyclopedia
Revision as of 15:01, 30 September 2022

In statistics, the negative log predictive density (NLPD) is a measure of error between a model's predictions and the associated true values. Importantly, the NLPD assesses the quality of the model's uncertainty quantification. It is used for both regression and classification.

Definition

$\text{NLPD} = -\sum_{i=1}^{n} \log p(t_i|\bm{x_i})$

where $p(y|\bm{x})$ is the model, $\bm{x_i}$ are the inputs (independent variables) and $t_i$ are the observed outputs (dependent variable).
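The definition above can be sketched as a short function (an illustrative sketch; the function name and argument are our own, and the input is taken to be the probabilities the model assigned to the true outcomes):

```python
import math

def nlpd(probabilities):
    """Negative log predictive density: the negative sum of the
    log-probabilities the model assigned to the true outcomes."""
    return -sum(math.log(p) for p in probabilities)
```

A perfectly confident, correct prediction (probability 1) contributes zero; low probability on a true outcome contributes a large positive penalty.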

Example

We have a method that classifies images as dogs or cats. Importantly, it assigns probabilities to the two classes. We show it pictures of two dogs and two cats. It predicts the probability of the first two being dogs as 0.9 and 0.4, and of the last two being cats as 0.8 and 0.3. The NLPD is: $-(\log 0.9 + \log 0.4 + \log 0.8 + \log 0.3) \approx 2.45$.
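The arithmetic in this example can be checked directly (using natural logarithms):

```python
import math

# Probabilities the model assigned to the correct class for each of the four images
probs = [0.9, 0.4, 0.8, 0.3]

# NLPD is the negative sum of the log-probabilities of the true outcomes
nlpd = -sum(math.log(p) for p in probs)
print(round(nlpd, 2))  # 2.45
```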