Negative log predictive density

From Wikipedia, the free encyclopedia

In statistics, the negative log predictive density (NLPD) is a measure of the error between a model's predictions and the associated true values. Importantly, the NLPD assesses the quality of the model's uncertainty quantification. It is used for both regression and classification.

Definition

$\text{NLPD} = -\frac{1}{N} \sum_{i=1}^N \log p(y_i = t_i \mid \mathbf{x}_i)$

where $p(y \mid \mathbf{x})$ is the model's predictive distribution, $\mathbf{x}_i$ are the inputs (independent variables) and $t_i$ are the observed outputs (dependent variable).
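
As an illustration of the definition, the following is a minimal Python sketch (not part of the article; the function name and the NumPy dependency are assumptions) that computes the NLPD from the probability or density the model assigns to each observed output:

```python
import numpy as np

def negative_log_predictive_density(p_true_given_x):
    """Compute the NLPD from the predictive values p(y_i = t_i | x_i).

    p_true_given_x: array of the probabilities (classification) or densities
    (regression) that the model assigns to the observed outputs t_i.
    """
    p = np.asarray(p_true_given_x, dtype=float)
    # Average the log predictive values and negate, per the definition above.
    return -np.mean(np.log(p))
```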

Example

Suppose we have a method that classifies images as dogs or cats. Importantly, it assigns probabilities to the two classes. We show it pictures of two dogs and two cats. It predicts the probabilities of the first two being dogs as 0.6 and 0.4, and of the last two being cats as 0.7 and 0.3. The NLPD is $-\frac{1}{4}(\log 0.6 + \log 0.4 + \log 0.7 + \log 0.3) \approx 0.75$ (using the natural logarithm).
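
As a check, the following short Python sketch (illustrative only, not part of the article; it assumes NumPy) reproduces this value:

```python
import numpy as np

# Probability the model assigns to the *true* class of each image:
# the two dogs are predicted as dogs with 0.6 and 0.4,
# the two cats are predicted as cats with 0.7 and 0.3.
p_true = np.array([0.6, 0.4, 0.7, 0.3])

nlpd = -np.mean(np.log(p_true))
print(round(nlpd, 3))  # 0.747, using the natural logarithm
```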