
Curse of dimensionality



The curse of dimensionality is a term coined by Richard Bellman for the problem caused by the exponential increase in volume associated with adding extra dimensions to a (mathematical) space.
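The volume blow-up is easy to make concrete (a minimal sketch in Python; the cell side of 0.1 is an arbitrary illustrative choice, not from the source): tiling the unit hypercube [0,1]^d with axis-aligned cells of side 0.1 takes 10^d cells, so holding the per-axis resolution fixed costs exponentially many cells as d grows.

    # Number of cells of side 0.1 needed to tile the unit hypercube [0,1]^d.
    # Each extra dimension multiplies the count by 10, so a fixed per-axis
    # resolution requires exponentially many cells as d grows.
    for d in (1, 2, 3, 10):
        print(d, 10 ** d)  # 1 -> 10, 2 -> 100, 3 -> 1000, 10 -> 10000000000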

Leo Breiman gives as an example the fact that 100 observations cover the one-dimensional unit interval [0,1] on the real line quite well: one could draw a histogram of the results and draw inferences. In the corresponding 10-dimensional unit hypercube, however, 100 observations are isolated points in a vast empty space. Matching the one-dimensional sampling density along each of the 10 axes would require 100¹⁰ = 10²⁰ observations, which is impractical.
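Breiman's observation can be checked with a small simulation (a sketch assuming NumPy and uniform random sampling; the mean nearest-neighbour distance used here as a measure of isolation is an illustrative choice, not from the source): with the same 100 points, the typical distance to the nearest neighbour grows from about 0.005 in one dimension to roughly half the cube's side length in ten.

    import numpy as np

    rng = np.random.default_rng(0)

    def mean_nn_distance(n_points, dim):
        """Mean distance from each of n_points uniform samples in
        the unit hypercube [0,1]^dim to its nearest neighbour."""
        points = rng.uniform(size=(n_points, dim))
        # Pairwise Euclidean distances; self-distances on the diagonal
        # are masked with infinity before taking the row minima.
        diffs = points[:, None, :] - points[None, :, :]
        dists = np.sqrt((diffs ** 2).sum(axis=-1))
        np.fill_diagonal(dists, np.inf)
        return dists.min(axis=1).mean()

    for dim in (1, 10):
        print(dim, round(mean_nn_distance(100, dim), 3))
    # Typical output: dimension 1 gives ~0.005, dimension 10 gives ~0.6;
    # the same 100 points are isolated in the 10-dimensional cube.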

The curse of dimensionality is a significant obstacle in machine learning problems that involve learning from few data samples in a high-dimensional feature space.

See also: quasi-random.

References

Bellman, R.E. (1961). Adaptive Control Processes: A Guided Tour. Princeton University Press, Princeton, NJ.