Marr–Hildreth algorithm

In computer vision, the Marr–Hildreth algorithm is a method of detecting edges in digital images, that is, continuous curves along which there are strong and rapid variations in image brightness. The Marr–Hildreth edge detection method is simple: it operates by convolving the image with the Laplacian of the Gaussian function, or, as a fast approximation, with a difference of Gaussians. Zero crossings are then detected in the filtered result to obtain the edges. The Laplacian-of-Gaussian image operator is sometimes also referred to as the Mexican hat wavelet because of its visual shape when turned upside-down. David Marr and Ellen C. Hildreth are two of the inventors of the method.[1]
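
A minimal sketch of this two-step procedure in Python could look as follows. It assumes NumPy and SciPy are available; the value of sigma and the simple sign-change test used to detect zero crossings are illustrative choices, not details prescribed by the method itself.

import numpy as np
from scipy import ndimage

def marr_hildreth_edges(image, sigma=2.0):
    # Step 1: Laplacian-of-Gaussian filtering (equivalently, Gaussian smoothing
    # followed by a Laplacian); sigma controls the scale of the detected edges.
    log = ndimage.gaussian_laplace(image.astype(float), sigma=sigma)

    # Step 2: mark zero crossings, i.e. pixels where the filtered response
    # changes sign between horizontally or vertically adjacent pixels.
    sign = log > 0
    edges = np.zeros(log.shape, dtype=bool)
    edges[:-1, :] |= sign[:-1, :] != sign[1:, :]   # vertical sign changes
    edges[:, :-1] |= sign[:, :-1] != sign[:, 1:]   # horizontal sign changes
    return edges

A difference-of-Gaussians approximation would replace the first step with the difference of two Gaussian-smoothed images (for example, ndimage.gaussian_filter(image, sigma) - ndimage.gaussian_filter(image, 1.6 * sigma)); the 1.6 ratio is a commonly quoted approximation rather than a value stated in this article.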

The Marr–Hildreth operator, however, suffers from two main limitations: it generates responses that do not correspond to edges, so-called "false edges", and its localization error may be severe at curved edges. Today there are much better edge detection methods, such as the Canny edge detector, based on the search for local directional maxima in the gradient magnitude, or the differential approach, based on the search for zero crossings of the differential expression that corresponds to the second-order derivative in the gradient direction (both of these operations are preceded by a Gaussian smoothing step). For more details, see the article on edge detection.

References

  1. Umbaugh, Scott E. (2010). Digital image processing and analysis: human and computer vision applications with CVIPtools (2nd ed.). Boca Raton, FL: CRC Press. ISBN 978-1-4398-0205-2.