Mixture of experts

From Wikipedia, the free encyclopedia

Mixture of experts refers to a machine learning technique in which multiple experts (learners) are used to divide the problem space into homogeneous regions.[1] An example from the computer vision domain is combining a neural network model for human detection with another for pose estimation. If the output is conditioned on multiple levels of probabilistic gating functions, the mixture is called a hierarchical mixture of experts.[2]
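
In a common formulation (the notation below is illustrative and not taken verbatim from the cited sources), the model's prediction is a gate-weighted sum of the expert outputs, with a softmax gating function assigning each input a probability over the experts:

    y(x) = \sum_{i=1}^{n} g_i(x) \, f_i(x), \qquad g_i(x) = \frac{\exp(w_i^{\top} x)}{\sum_{j=1}^{n} \exp(w_j^{\top} x)},

where the f_i are the expert models and the w_i are the gating network's parameters.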

A gating network decides which expert to use for each input region. Learning thus consists of two parts: 1) learning the parameters of the individual learners, and 2) learning the parameters of the gating network.
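
As a sketch of this two-part learning problem, the following PyTorch example (not from the article; the expert count, layer sizes, and toy data are assumptions made for illustration) trains a small set of expert networks jointly with a softmax gating network:

    # A minimal mixture-of-experts sketch in PyTorch. Everything here (expert count,
    # layer sizes, the toy piecewise target) is an illustrative assumption, not a
    # specification from the article or its references.
    import torch
    import torch.nn as nn

    class MixtureOfExperts(nn.Module):
        def __init__(self, in_dim, out_dim, n_experts=4, hidden=32):
            super().__init__()
            # Each expert is a small feed-forward learner for part of the input space.
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(in_dim, hidden), nn.Tanh(), nn.Linear(hidden, out_dim))
                for _ in range(n_experts)
            )
            # The gating network maps each input to a probability over the experts.
            self.gate = nn.Linear(in_dim, n_experts)

        def forward(self, x):
            weights = torch.softmax(self.gate(x), dim=-1)                 # (batch, n_experts)
            outputs = torch.stack([e(x) for e in self.experts], dim=-1)  # (batch, out_dim, n_experts)
            # Combine the expert outputs, weighted by the gate's probabilities.
            return torch.einsum("bon,bn->bo", outputs, weights)

    # Joint training: one loss, with gradients flowing into both the experts and the gate.
    model = MixtureOfExperts(in_dim=1, out_dim=1)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    x = torch.linspace(-2.0, 2.0, 256).unsqueeze(1)
    y = torch.where(x > 0, x ** 2, -x)  # a piecewise target that encourages expert specialisation
    for step in range(300):
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        optimizer.step()

Because the gate's softmax weights and the experts are optimised against the same loss, the gate learns to route inputs to whichever experts predict them best, which is how the problem space ends up divided into regions.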

References

  1. ^ Baldacchino, Tara; Cross, Elizabeth J.; Worden, Keith; Rowson, Jennifer (2016). "Variational Bayesian mixture of experts models and sensitivity analysis for nonlinear dynamical systems". Mechanical Systems and Signal Processing. 66–67: 178. Bibcode:2016MSSP...66..178B. doi:10.1016/j.ymssp.2015.05.009.
  2. ^ Hauskrecht, Milos. "Ensamble methods: Mixtures of experts (Presentation)" (PDF).

Further reading

  • Masoudnia, Saeed; Ebrahimpour, Reza (12 May 2012). "Mixture of experts: a literature survey". Artificial Intelligence Review. 42 (2): 275–293. doi:10.1007/s10462-012-9338-y.