Constraint (information theory)

Constraint in information theory is the degree of statistical dependence between or among variables.
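
One common way to make this precise (a sketch assuming the total-correlation formulation of multivariate dependence; the symbols C and H below are illustrative and not taken from the cited source) is the amount by which the sum of the variables' individual entropies exceeds their joint entropy:

    C(X_1, \ldots, X_n) = \sum_{i=1}^{n} H(X_i) - H(X_1, \ldots, X_n)

Under that formulation the quantity is zero exactly when the variables are statistically independent, and for two variables it reduces to the mutual information I(X_1; X_2).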

Garner[1] provides a thorough discussion of various forms of constraint (internal constraint, external constraint, total constraint), with applications to pattern recognition and psychology.

See also

References

  1. Garner, W. R. (1962). Uncertainty and Structure as Psychological Concepts. New York: John Wiley & Sons.