Product of experts

Product of experts (PoE) is a machine learning technique. It models a probability distribution by combining the outputs of several simpler distributions. It was proposed by Geoffrey Hinton, along with an algorithm (contrastive divergence) for training the parameters of such a system.

The core idea is to combine several probability distributions ("experts") by multiplying their density functions and renormalizing, so that the combined model behaves like an "and" operation: a configuration receives high probability only if every expert assigns it high probability. This allows each expert to make decisions on the basis of a few dimensions without having to cover the full dimensionality of a problem.
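As a minimal sketch of this combination (using notation not present in this revision: M experts with unnormalized densities f_1, ..., f_M over the same space), the product model is the renormalized product of the expert densities:

    p(x) = \frac{\prod_{j=1}^{M} f_j(x)}{\int \prod_{j=1}^{M} f_j(x') \, dx'}

The normalizing integral in the denominator is generally intractable, which is one reason the cited paper trains such models with an approximate procedure (contrastive divergence) rather than by direct maximum likelihood.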

This is related to (but quite different from) a mixture model, where several probability distributions are combined via an "or" operation, i.e. a weighted sum of their density functions: a configuration is probable if any one component assigns it high probability.
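The contrast can be made concrete with two one-dimensional Gaussian experts; the means, standard deviations, and mixture weights below are arbitrary values chosen purely for illustration. The product concentrates mass where both experts agree, while the mixture spreads mass over the regions covered by either expert:

    import numpy as np

    # Two hypothetical 1-D Gaussian "experts" (parameters chosen only for illustration).
    means = np.array([0.0, 4.0])
    stds = np.array([1.0, 2.0])

    xs = np.linspace(-5.0, 10.0, 1001)

    def gaussian(x, mu, sigma):
        # Normal probability density function.
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

    # Product of experts: multiply the densities pointwise, then renormalize numerically.
    poe_unnormalized = gaussian(xs[:, None], means, stds).prod(axis=1)
    poe = poe_unnormalized / np.trapz(poe_unnormalized, xs)

    # Mixture of the same experts: a weighted sum of the densities (weights sum to 1).
    mixture = 0.5 * gaussian(xs, means[0], stds[0]) + 0.5 * gaussian(xs, means[1], stds[1])

    # The PoE peaks where both experts agree (near the precision-weighted mean, 0.8 here);
    # the mixture keeps mass near both expert means.
    print("PoE mode:", xs[poe.argmax()])

In this toy case the product of the two Gaussians is itself a sharper Gaussian centred near the precision-weighted mean, whereas the mixture remains bimodal.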

  • Hinton, Geoffrey E. (2002). "Training Products of Experts by Minimizing Contrastive Divergence" (PDF). Neural Computation. 14 (8): 1771–1800. CiteSeerX 10.1.1.35.8613. doi:10.1162/089976602760128018. PMID 12180402. S2CID 207596505. Retrieved 2009-10-25.