Table of confusion


In predictive analytics, a table of confusion is a table with two rows and two columns that reports the number of true negatives, false positives, false negatives, and true positives.

                 Predicted Negative   Predicted Positive
Negative Cases   True Negatives       False Positives
Positive Cases   False Negatives      True Positives

Table 1: Table of Confusion.
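The four counts can be tallied directly from paired actual and predicted labels. The Python sketch below is illustrative only; the function name confusion_table and the boolean label encoding are assumptions, not anything defined in this article.

    def confusion_table(actual, predicted):
        """Tally the four cells of a table of confusion.

        `actual` and `predicted` are equal-length sequences of booleans,
        with True marking a positive case. Returns the counts as a dict.
        """
        counts = {"TN": 0, "FP": 0, "FN": 0, "TP": 0}
        for a, p in zip(actual, predicted):
            if a and p:
                counts["TP"] += 1   # positive case predicted positive
            elif a:
                counts["FN"] += 1   # positive case predicted negative
            elif p:
                counts["FP"] += 1   # negative case predicted positive
            else:
                counts["TN"] += 1   # negative case predicted negative
        return counts


    # Tiny illustration: three negative cases (one wrongly flagged) and
    # two positive cases (one missed).
    print(confusion_table(
        actual=[False, False, False, True, True],
        predicted=[False, True, False, True, False],
    ))  # -> {'TN': 2, 'FP': 1, 'FN': 1, 'TP': 1}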

For example, consider a model that predicts, for 10,000 insurance claims, whether each claim is fraudulent. The model correctly identifies 9,700 non-fraudulent claims and 100 fraudulent claims. It also incorrectly flags 150 non-fraudulent claims as fraudulent, and predicts 50 fraudulent claims to be non-fraudulent. The resulting table of confusion is shown below.

                 Predicted Negative   Predicted Positive
Negative Cases   9,700                150
Positive Cases      50                100

Table 2: Example Table of Confusion.
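As a sanity check on the example, the four cells of Table 2 should partition all 10,000 claims exactly once. The short Python sketch below takes those counts as given, verifies the total, and reprints them in the layout of Table 2; the variable names are assumptions for illustration.

    # Counts taken from the worked insurance-claim example (Table 2).
    tn, fp, fn, tp = 9_700, 150, 50, 100

    # Every claim falls into exactly one cell, so the cells sum to 10,000.
    assert tn + fp + fn + tp == 10_000

    negative_cases = tn + fp   # 9,850 claims that are not fraudulent
    positive_cases = fn + tp   #   150 claims that are fraudulent

    # Reprint the counts in the layout of Table 2.
    print(f"{'':17}{'Predicted Negative':21}{'Predicted Positive'}")
    print(f"{'Negative Cases':17}{tn:<21,}{fp:,}")
    print(f"{'Positive Cases':17}{fn:<21,}{tp:,}")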