Isotonic regression

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 99.150.137.63 (talk) at 13:29, 16 August 2011. The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In numerical analysis, isotonic regression (IR) involves finding a weighted least-squares fit $x \in \mathbb{R}^n$ to a vector $a \in \mathbb{R}^n$ with weights vector $w \in \mathbb{R}^n$, subject to a set of non-contradictory monotonicity constraints of the kind $x_i \le x_j$ giving a simple or partial order over the variables. The monotonicity constraints define a directed acyclic graph $G = (N, E)$ over the nodes $N = \{1, 2, \ldots, n\}$ corresponding to the variables $x = (x_1, x_2, \ldots, x_n)$. Thus, the IR problem where a simple order is defined corresponds to the following quadratic program (QP):

$$\min \sum_{i=1}^n w_i (x_i - a_i)^2 \quad \text{subject to} \quad x_i \le x_j \ \text{ for all } (i, j) \in E.$$

In the case when the constraints specify a total order, a simple iterative algorithm for solving this QP is called the pool adjacent violators algorithm (PAVA). Best and Chakravarti (1990) studied the problem as an active set identification problem and proposed a primal algorithm running in O(n), the same complexity as PAVA, which can be seen as a dual algorithm.
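For the totally ordered case, PAVA admits a compact implementation. The following Python sketch (illustrative, not from the original article) scans the data once, maintaining a stack of blocks and merging adjacent blocks whenever their pooled weighted means violate monotonicity:

```python
def pava(a, w):
    """Pool Adjacent Violators Algorithm for isotonic regression.

    Returns x minimizing sum_i w[i] * (x[i] - a[i])**2
    subject to x[0] <= x[1] <= ... <= x[n-1].
    """
    # Each block is [weighted mean, total weight, number of points pooled].
    blocks = []
    for ai, wi in zip(a, w):
        blocks.append([ai, wi, 1])
        # Merge backwards while the previous block's mean exceeds the last one's.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, c1 + c2])
    # Expand each block back to its pooled points.
    x = []
    for m, _, c in blocks:
        x.extend([m] * c)
    return x
```

For example, `pava([1, 3, 2, 4], [1, 1, 1, 1])` pools the violating pair (3, 2) to their mean 2.5, yielding the nondecreasing fit `[1, 2.5, 2.5, 4]`.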

IR has applications in statistical inference; for example, the cost at the minimum of the above goal function gives the "stress" of the fit of an isotonic curve to mean experimental results when an order is expected.

Another application is nonmetric multidimensional scaling (Kruskal, 1964), where a low-dimensional embedding for data points is sought such that the order of distances between points in the embedding matches the order of dissimilarities between points. Isotonic regression is used iteratively to fit ideal distances that preserve the relative dissimilarity order.

Isotonic regression is also sometimes referred to as monotonic regression. Strictly speaking, isotonic is used when the direction of the trend is increasing, while monotonic could imply a trend that is either increasing or decreasing.

Isotonic regression under the $L_p$ norm for $p > 0$ is defined as follows:

$$\min \sum_{i=1}^n w_i |x_i - a_i|^p \quad \text{subject to} \quad x_i \le x_j \ \text{ for all } (i, j) \in E.$$

Simply ordered case

To illustrate the above, let $x_1 \le x_2 \le \ldots \le x_n$, let $f(x_1), f(x_2), \ldots, f(x_n)$ be the observed values, and let $w_1, w_2, \ldots, w_n > 0$ be the weights.

The isotonic estimator, $g^*$, minimizes the weighted least squares-like condition:

$$\min_g \sum_{i=1}^n w_i \left( g(x_i) - f(x_i) \right)^2,$$

where $g$ is the unknown (nondecreasing) function we are estimating, and $f$ is a known function.
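To make the definition concrete, consider a minimal worked example (hypothetical numbers, not from the original article): take $n = 2$, unit weights, and observations $f(x_1) = 2$, $f(x_2) = 1$. The unconstrained fit $g(x_i) = f(x_i)$ violates monotonicity, so both values are pooled to their mean:

```latex
\min_{g(x_1) \le g(x_2)} \left[ \left( g(x_1) - 2 \right)^2 + \left( g(x_2) - 1 \right)^2 \right]
\quad \Rightarrow \quad
g^*(x_1) = g^*(x_2) = \frac{2 + 1}{2} = \frac{3}{2}
```

The minimum cost is $(2 - \tfrac{3}{2})^2 + (1 - \tfrac{3}{2})^2 = \tfrac{1}{2}$, which is the "stress" of the isotonic fit in the sense described above.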

Software has been developed in the R statistical package for computing isotone (monotonic) regression.[1]

References

  1. ^ De Leeuw, Jan (2009). "Isotone Optimization in R: Pool-Adjacent-Violators Algorithm (PAVA) and Active Set Methods". Journal of Statistical Software. 32 (5): 1.


  • Best, M. J.; Chakravarti, N. (1990). "Active set algorithms for isotonic regression; a unifying framework". Mathematical Programming. 47: 425–439. doi:10.1007/BF01580873.
  • Robertson, T.; Wright, F. T.; Dykstra, R. L. (1988). Order Restricted Statistical Inference. New York: Wiley. ISBN 0-471-91787-7.
  • Barlow, R. E.; Bartholomew, D. J.; Bremner, J. M.; Brunk, H. D. (1972). Statistical Inference Under Order Restrictions; The Theory and Application of Isotonic Regression. New York: Wiley. ISBN 0-471-04970-0.