d'


The sensitivity index or d' (pronounced 'dee-prime') is a statistic used in signal detection theory. It measures the separation between the means of the signal and noise distributions in units of the root-mean-square standard deviation of the two distributions. For normally distributed signal and noise with means \mu_S and \mu_N and standard deviations \sigma_S and \sigma_N, respectively, d' is defined as:

d' = \frac{\mu_S - \mu_N}{\sqrt{\frac{1}{2}(\sigma_S^2 + \sigma_N^2)}} [1]
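
As an illustrative sketch (not part of the cited sources; the function name and the example values are arbitrary), the definition above can be evaluated directly in Python with NumPy:

    import numpy as np

    def d_prime_from_distributions(mu_s, sigma_s, mu_n, sigma_n):
        """Sensitivity index for Gaussian signal and noise distributions."""
        # Separation of the means, scaled by the root-mean-square
        # of the two standard deviations (the formula above).
        return (mu_s - mu_n) / np.sqrt(0.5 * (sigma_s**2 + sigma_n**2))

    # Example: signal ~ N(1.5, 1.0), noise ~ N(0.0, 1.0) gives d' = 1.5.
    print(d_prime_from_distributions(1.5, 1.0, 0.0, 1.0))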

An estimate of d' can also be found from measurements of the hit rate and false-alarm rate. It is calculated as:

d' = Z(hit rate) - Z(false-alarm rate), [2]

where the function Z(p), p ∈ [0,1], is the inverse of the cumulative distribution function of the standard Gaussian distribution.
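
As a minimal sketch of this estimate (assuming SciPy is available; the function name and the example rates are illustrative, not from the cited sources):

    from scipy.stats import norm

    def d_prime_from_rates(hit_rate, false_alarm_rate):
        """Estimate d' from observed hit and false-alarm rates."""
        # norm.ppf is the inverse of the standard Gaussian CDF, i.e. Z(p).
        return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

    # Example: 84% hits and 16% false alarms give d' of roughly 2,
    # since Z(0.84) ≈ +1 and Z(0.16) ≈ -1.
    print(d_prime_from_rates(0.84, 0.16))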

d' is a dimensionless statistic. A higher d' indicates that the signal can be more readily detected.


References

  1. Gale, Samuel; Perkel, David (2010). "A Basal Ganglia Pathway Drives Selective Auditory Responses in Songbird Dopaminergic Neurons via Disinhibition". The Journal of Neuroscience 30(3): 1027–1037.
  2. Macmillan, N. A.; Creelman, C. D. (2005). Detection Theory: A User's Guide. Lawrence Erlbaum Associates. p. 7. Retrieved from: http://books.google.co.uk/books/about/Detection_Theory.html?id=hDX65v9bReYC
