The sensitivity index or d′ (pronounced 'dee-prime') is a statistic used in signal detection theory. It measures the separation between the means of the signal and noise distributions in units of their standard deviation. For normally distributed signal and noise with means and standard deviations μS and σS, and μN and σN, respectively, d′ is defined as:

d′ = (μS − μN) / √(½(σS² + σN²))

An estimate of d′ can also be obtained from measured hit and false-alarm rates:

d′ = Z(hit rate) − Z(false alarm rate),

where the function Z(p), p ∈ [0, 1], is the inverse of the cumulative distribution function of the standard normal distribution.
d′ is a dimensionless statistic. A higher d′ indicates that the signal can be more readily detected.
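As a minimal illustration of both formulas above, the following Python sketch computes d′ from distribution parameters and from hit/false-alarm rates. It assumes SciPy is available (Z is the inverse normal CDF, scipy.stats.norm.ppf); the function names are hypothetical, chosen only for this example.

```python
# Minimal sketch of computing d' (sensitivity index); assumes SciPy is installed.
from scipy.stats import norm

def d_prime_from_distributions(mu_s, sigma_s, mu_n, sigma_n):
    """d' from the signal and noise distribution parameters."""
    # Separation of the means, scaled by the RMS of the two standard deviations.
    return (mu_s - mu_n) / ((0.5 * (sigma_s**2 + sigma_n**2)) ** 0.5)

def d_prime_from_rates(hit_rate, false_alarm_rate):
    """d' estimated from observed hit and false-alarm rates.

    Z(p) is the inverse of the standard normal CDF, i.e. norm.ppf.
    """
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

# Example: a hit rate of 0.85 and a false-alarm rate of 0.20
# give d' ≈ 1.88, i.e. a readily detectable signal.
print(d_prime_from_rates(0.85, 0.20))
```

Note that d_prime_from_rates is undefined at rates of exactly 0 or 1, where the inverse CDF diverges; in practice such rates are usually adjusted slightly toward 0.5 before computing d′.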
- Macmillan, N. A.; Creelman, C. D. (2005). Detection Theory: A User's Guide (2nd ed.). Lawrence Erlbaum Associates.
- Wickens, Thomas D. (2001). Elementary Signal Detection Theory. OUP USA. ch. 2, p. 20. ISBN 0-19-509250-3.
- Interactive signal detection theory tutorial including calculation of d′.