Sensitivity index

From Wikipedia, the free encyclopedia

The sensitivity index or d' (pronounced 'dee-prime') is a statistic used in signal detection theory. It provides the separation between the means of the signal and the noise distributions, compared against the standard deviation of the signal or noise distribution. For normally distributed signal and noise with means and standard deviations μ_S and σ_S, and μ_N and σ_N, respectively, d' is defined as:

d' = (μ_S − μ_N) / √(½(σ_S² + σ_N²))[1]
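
A minimal sketch of this definition in Python, using hypothetical example values for the signal and noise parameters (not taken from the source):

    import math

    def d_prime(mu_s, sigma_s, mu_n, sigma_n):
        # d' = (mu_S - mu_N) / sqrt((sigma_S^2 + sigma_N^2) / 2)
        return (mu_s - mu_n) / math.sqrt(0.5 * (sigma_s**2 + sigma_n**2))

    # Example: unit-variance signal and noise, signal mean 1.5 above noise mean -> d' = 1.5
    print(d_prime(1.5, 1.0, 0.0, 1.0))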

Note that by convention, d' assumes that the standard deviations for signal and noise are equal. An estimate of d' can also be found from measurements of the hit rate and false-alarm rate. It is calculated as:

d' = Z(hit rate) − Z(false alarm rate),[1]: 7 

where the function Z(p), p ∈ [0,1], is the inverse of the cumulative distribution function of the Gaussian distribution.
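
A minimal sketch of this estimate, assuming SciPy is available (scipy.stats.norm.ppf is the inverse Gaussian CDF, i.e. the function Z above); the hit and false-alarm rates are hypothetical example values:

    from scipy.stats import norm

    def d_prime_from_rates(hit_rate, fa_rate):
        # d' = Z(hit rate) - Z(false alarm rate), with Z the inverse normal CDF
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Example: 84% hits and 16% false alarms give d' of about 2.0
    print(d_prime_from_rates(0.84, 0.16))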

d' can be related to the area under the receiver operating characteristic curve, or AUC, via:

d' = √2 Z(AUC)[1]: 63 
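
A minimal sketch of this relationship, again assuming SciPy (norm.ppf for Z and norm.cdf for the Gaussian CDF), showing the conversion in both directions; the d' value is a hypothetical example:

    from math import sqrt
    from scipy.stats import norm

    def d_prime_to_auc(d):
        # AUC = Phi(d' / sqrt(2)), the inverse of d' = sqrt(2) * Z(AUC)
        return norm.cdf(d / sqrt(2))

    def auc_to_d_prime(auc):
        # d' = sqrt(2) * Z(AUC)
        return sqrt(2) * norm.ppf(auc)

    # Example: d' = 1.0 corresponds to AUC of about 0.76; converting back recovers d'
    auc = d_prime_to_auc(1.0)
    print(auc, auc_to_d_prime(auc))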

d' is a dimensionless statistic. A higher d' indicates that the signal can be more readily detected.

References

  • Wickens, Thomas D. (2001). Elementary Signal Detection Theory. OUP USA. ISBN 0-19-509250-3. (Ch. 2, p. 20.)