Lucky imaging (also called lucky exposures) is one form of speckle imaging used for astrophotography. Speckle imaging techniques use a high-speed camera with exposure times short enough (100 ms or less) so that the changes in the Earth's atmosphere during the exposure are minimal.
With lucky imaging, the optimum exposures least affected by the atmosphere (typically around 10%) are selected and combined into a single image by shifting and adding them, yielding much higher angular resolution than a single, longer exposure that includes all the frames.
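The select-shift-add procedure can be sketched in a few lines. This is a minimal illustration, not any observatory's pipeline: it uses the brightest pixel of each frame both as the sharpness metric and as the alignment reference, which is the simplest common choice.

```python
import numpy as np

def lucky_stack(frames, keep_fraction=0.10):
    """Select the sharpest short exposures and shift-and-add them.

    `frames` is an (N, H, W) array; the peak pixel value serves as a
    simple sharpness metric, since sharper frames concentrate more
    light into the brightest speckle.
    """
    frames = np.asarray(frames, dtype=float)
    # Rank frames by their brightest pixel and keep the best fraction.
    quality = frames.reshape(len(frames), -1).max(axis=1)
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = frames[np.argsort(quality)[::-1][:n_keep]]

    # Shift each selected frame so its brightest pixel lands on the
    # centre of the output grid, then average.
    h, w = best.shape[1:]
    out = np.zeros((h, w))
    for f in best:
        py, px = np.unravel_index(np.argmax(f), f.shape)
        out += np.roll(np.roll(f, h // 2 - py, axis=0), w // 2 - px, axis=1)
    return out / n_keep
```

A real implementation would use sub-pixel interpolation for the shifts and a windowed region around the reference star, but the selection-then-registration logic is the same.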
Images taken with ground-based telescopes are subject to the blurring effect of atmospheric turbulence (seen to the eye as the stars twinkling). Many astronomical imaging programs require higher resolution than is possible without some correction of the images. Lucky imaging is one of several methods used to remove atmospheric blurring. Used at a 1% selection or less, lucky imaging can reach the diffraction limit of even 2.5 m aperture telescopes, a resolution improvement factor of at least five over standard imaging systems.
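The diffraction limit being reached can be checked with the Rayleigh criterion. The numbers below (500 nm light, a 2.5 m aperture) are illustrative values, not taken from a specific instrument:

```python
import math

# Rayleigh criterion: theta = 1.22 * lambda / D (in radians)
wavelength = 500e-9   # metres, mid-visible light
aperture = 2.5        # metres
theta_rad = 1.22 * wavelength / aperture
theta_arcsec = math.degrees(theta_rad) * 3600
print(f"{theta_arcsec:.3f} arcsec")
```

This gives about 0.05 arcsec, roughly a factor of ten to twenty below typical seeing of 0.5 to 1 arcsec, consistent with the improvement factor quoted above.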
|Typical short-exposure image of a binary star, without any speckle processing. The effect of the Earth's atmosphere is to break the image of each star up into speckles.|
Demonstration of the principle
The sequence of images below demonstrates how lucky imaging works. From a series of 50,000 images recorded at almost 40 frames per second, five different long-exposure images have been created. In addition, a single exposure of very low image quality and another of very high image quality are shown at the beginning of the sequence. The astronomical target has the 2MASS identifier J03323578+2843554. North is up and east is to the left.
|Single exposure with low image quality, not selected for lucky imaging.||Single exposure with very high image quality, selected for lucky imaging.|
|This image shows the average of all 50,000 images, which is almost the same as a 21-minute (50,000/40 ≈ 1,250 s) seeing-limited long exposure. It looks like a typical star image, slightly elongated. The full width at half maximum (FWHM) of the seeing disk is around 0.9 arcsec.||This image shows the average of all 50,000 single images, but here with the centre of gravity (centroid) of each image shifted to the same reference position. This is the tip-tilt-corrected, or image-stabilized, long-exposure image. It already shows more detail — two objects — than the seeing-limited image.|
|This image shows the 25,000 (50% selection) best images averaged, after the brightest pixel in each image was moved to the same reference position. In this image, we can almost see three objects.||This image shows the 5,000 (10% selection) best images averaged, after the brightest pixel in each image was moved to the same reference position. The surrounding seeing halo is further reduced, and an Airy ring around the brightest object becomes clearly visible.|
|This image shows the 500 (1% selection) best images averaged, after the brightest pixel in each image was moved to the same reference position. The seeing halo is further reduced. The signal-to-noise ratio of the brightest object is the highest in this image.|
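The two registration schemes used in the panels above — centroid alignment for the tip-tilt-corrected image and brightest-pixel alignment for the 50%/10%/1% selections — differ only in how the shift is computed. A minimal sketch, using integer-pixel shifts for simplicity:

```python
import numpy as np

def centroid_shift(frame):
    """Integer shift moving the image centroid to the array centre
    (the tip-tilt correction of the second panel above)."""
    total = frame.sum()
    yy, xx = np.indices(frame.shape)
    cy = (yy * frame).sum() / total
    cx = (xx * frame).sum() / total
    return (frame.shape[0] // 2 - int(round(cy)),
            frame.shape[1] // 2 - int(round(cx)))

def peak_shift(frame):
    """Integer shift moving the brightest pixel to the array centre
    (used for the percentage-selected stacks)."""
    py, px = np.unravel_index(np.argmax(frame), frame.shape)
    return frame.shape[0] // 2 - py, frame.shape[1] // 2 - px

def stack(frames, get_shift):
    """Average the frames after applying the chosen per-frame shift."""
    out = np.zeros(frames[0].shape)
    for f in frames:
        dy, dx = get_shift(f)
        out += np.roll(np.roll(f, dy, axis=0), dx, axis=1)
    return out / len(frames)
```

Peak alignment locks onto the sharpest speckle and so preserves diffraction-limited structure, while centroid alignment only removes the overall image motion.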
The difference between the seeing-limited image (third image from the top) and the result from the best 1% of images is quite remarkable: a triple star system has been detected. The brightest component, to the west, is a V = 14.9 magnitude M4V star; it is the lucky imaging reference source. The fainter component consists of two stars of spectral classes M4.5 and M5.5. The distance of the system is about 45 parsecs (pc). Airy rings can be seen, which indicates that the diffraction limit of the Calar Alto Observatory's 2.2 m telescope was reached. The signal-to-noise ratio of the point sources increases with stronger selection, while the seeing halo, on the other hand, is more strongly suppressed. The separation between the two brightest objects is around 0.53 arcsec; between the two faintest objects it is less than 0.16 arcsec. At a distance of 45 pc, this corresponds to 7.2 times the distance between the Earth and the Sun, around 1 billion kilometres (10⁹ km).
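The conversion from angular separation to physical separation follows directly from the small-angle definition of the parsec (1 arcsec at 1 pc subtends 1 AU):

```python
# Small-angle relation: separation [AU] = angle [arcsec] * distance [pc]
angle_arcsec = 0.16          # separation of the two faintest components
distance_pc = 45.0           # distance of the system
sep_au = angle_arcsec * distance_pc    # 7.2 AU
sep_km = sep_au * 1.495978707e8        # 1 AU in km -> about 1.08e9 km
```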
Lucky imaging methods were first used in the middle of the 20th century and became popular for imaging planets in the 1950s and 1960s (using cine cameras, often with image intensifiers). It took roughly 30 years for the underlying imaging technologies to mature enough for this counter-intuitive technique to become practical. The first numerical calculation of the probability of obtaining lucky exposures appeared in an article by David L. Fried in 1978.
In early applications of lucky imaging, it was generally assumed that the atmosphere simply smeared out, or blurred, the astronomical images. In those applications, the full width at half maximum (FWHM) of the blurring was estimated and used to select exposures. Later studies took advantage of the fact that the atmosphere does not merely blur astronomical images, but generally produces multiple sharp copies of the source (the point spread function contains speckles). New methods exploited this to produce much higher quality images than had been obtained under the smearing assumption.
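The speckled point spread function can be reproduced with a toy Fourier-optics model: a circular pupil with a random phase screen, whose focal-plane intensity is the squared magnitude of its Fourier transform. The smoothed Gaussian phase field below is a stand-in for real Kolmogorov turbulence, and all parameters are illustrative:

```python
import numpy as np

def speckle_psf(n=128, aperture_frac=0.25, corr_frac=0.05, seed=0):
    """Toy short-exposure PSF of a point source seen through turbulence.

    A circular pupil is multiplied by exp(i * phase), where `phase` is
    white noise low-pass filtered in Fourier space (a crude substitute
    for a Kolmogorov phase screen). The focal-plane intensity |FFT|^2
    then breaks the Airy pattern into multiple sharp speckles instead
    of one smooth blur.
    """
    rng = np.random.default_rng(seed)
    y, x = np.indices((n, n)) - n // 2
    pupil = (np.hypot(y, x) < aperture_frac * n).astype(float)
    noise = rng.normal(size=(n, n))
    k = np.hypot(*np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n)))
    phase = np.fft.ifft2(np.fft.fft2(noise) * np.exp(-(k / corr_frac) ** 2)).real
    phase *= 6.0 / phase.std()   # several radians rms across the pupil
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
    return psf / psf.sum()
```

With the phase screen removed (`phase = 0`) the same code produces the clean Airy pattern, which is a useful sanity check.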
In the early years of the 21st century, it was realised that turbulent intermittency (and the fluctuations in astronomical seeing conditions it produced) could substantially increase the probability of obtaining a "lucky exposure" for given average astronomical seeing conditions.
Lucky imaging and adaptive optics hybrid systems
In 2007, astronomers at Caltech and the University of Cambridge announced the first results from a new hybrid lucky imaging and adaptive optics (AO) system. The new camera achieved the first diffraction-limited resolutions on 5 m-class telescopes in visible light. The research was performed on the 200-inch (5 m) Hale Telescope at Palomar Observatory. With the lucky imaging camera and adaptive optics, the telescope came close to its theoretical angular resolution, achieving up to 0.025 arcseconds for certain types of viewing. Compared with space telescopes such as the 2.4 m Hubble, the system still has some drawbacks, including a narrow field of view for crisp images (typically 10" to 20"), airglow, and electromagnetic frequencies blocked by the atmosphere.
When combined with an AO system, lucky imaging selects the periods when the turbulence the adaptive optics system must correct is reduced. In these periods, lasting a small fraction of a second, the correction given by the AO system is sufficient to give excellent resolution with visible light. The lucky imaging system averages the images taken during the excellent periods to produce a final image with much higher resolution than is possible with a conventional long-exposure AO camera.
This technique can deliver very high resolution images only of relatively small astronomical objects, up to 10 arcseconds in diameter, because it is limited by the precision of the atmospheric turbulence correction. It also requires a relatively bright (about 14th magnitude) star in the field of view on which to guide. Being above the atmosphere, the Hubble Space Telescope is not limited by these concerns and so is capable of much wider-field high-resolution imaging.
Popularity of technique
Both amateur and professional astronomers have begun to use this technique. Modern webcams and camcorders have the ability to capture rapid short exposures with sufficient sensitivity for astrophotography, and these devices are used with a telescope and the shift-and-add method from speckle imaging (also known as image stacking) to achieve previously unattainable resolution. If some of the images are discarded, then this type of video astronomy is called lucky imaging.
Many methods exist for image selection, including the Strehl-selection method first suggested by John E. Baldwin from the Cambridge group and the image contrast selection used in the Selective Image Reconstruction method of Ron Dantowitz.
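The two families of selection criteria can be illustrated with simple per-frame metrics. These are schematic stand-ins — the published methods define their metrics more carefully, and exact definitions vary between implementations:

```python
import numpy as np

def strehl_proxy(frame):
    """Peak intensity normalised by total flux: a simple stand-in for
    Strehl-ratio selection."""
    f = np.asarray(frame, dtype=float)
    return f.max() / f.sum()

def contrast_metric(frame):
    """RMS contrast of the frame, in the spirit of image-contrast
    selection."""
    f = np.asarray(frame, dtype=float)
    return f.std() / f.mean()

def select_best(frames, metric, fraction=0.10):
    """Return the top `fraction` of frames ranked by `metric`."""
    scores = np.array([metric(f) for f in frames])
    n = max(1, int(len(frames) * fraction))
    return [frames[i] for i in np.argsort(scores)[::-1][:n]]
```

Either metric rewards frames in which the starlight is concentrated into a compact core rather than spread across a halo.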
The development and availability of electron-multiplying CCDs (EMCCD, also known as LLLCCD, L3CCD, or low-light-level CCD) has allowed the first high-quality lucky imaging of faint objects.
On October 27, 2014, Google introduced a similar technique called HDR+. HDR+ captures a burst of short exposures, then selectively aligns and averages the sharpest shots using computational photography techniques. Short exposures avoid motion blur and blown-out highlights, while averaging multiple shots reduces noise. HDR+ is processed on hardware accelerators, including Qualcomm Hexagon DSPs and the Pixel Visual Core.
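The noise benefit of burst averaging follows from basic statistics: averaging N frames with independent noise reduces the noise standard deviation by a factor of √N. A minimal numerical check (a static synthetic scene, so the alignment step is omitted; this is not Google's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(42)
scene = np.full((64, 64), 100.0)                          # noiseless truth
burst = scene + rng.normal(0.0, 10.0, size=(16, 64, 64))  # 16 noisy shots

single_noise = (burst[0] - scene).std()    # close to the per-shot sigma, 10
stacked = burst.mean(axis=0)               # average the aligned burst
stacked_noise = (stacked - scene).std()    # close to 10 / sqrt(16) = 2.5
```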
Other approaches that can yield resolving power exceeding the limits of atmospheric seeing include adaptive optics, interferometry, other forms of speckle imaging and space-based telescopes such as NASA's Hubble Space Telescope.
- C. L. Stong 1956 interviewing scientist Robert B. Leighton for Amateur Scientist, "Concerning the Problem of Making Sharper Photographs of the Planets", Scientific American, Vol 194, June 1956, p. 157. Early example of exposure selection with mechanical tip-tilt correction (using cine film and exposure times of 2 seconds or more).
- William A. Baum 1956, "Electronic Photography of Stars", Scientific American, Vol 194, March 1956. Discusses the selection of short exposures at moments when the image through a telescope is sharpest (using image intensifier and short exposures).
- Hippler et al., The AstraLux Sur Lucky Imaging Instrument at the NTT, The ESO Messenger 137 (2009). Bibcode: 2009Msngr.137...14H
- Janson et al., doi:10.1088/0004-637X/754/1/44 The AstraLux Large M-dwarf Multiplicity Survey, The Astrophysical Journal, Volume 754, Issue 1, article id. 44, 26 pp. (2012).
- David L. Fried, Probability of getting a lucky short-exposure image through turbulence, JOSA 68, pp. 1651-1658 (1978)
- Nieto and Thouvenot, Recentring and selection of short-exposure images with photon-counting detectors. I - Reliability tests, A&A 241, pp. 663-672 (1991)
- Law et al., Lucky Imaging: High Angular Resolution Imaging in the Visible from the Ground, A&A 446, pp. 739-745 (2006)
- Robert Nigel Tubbs, Lucky Exposures: Diffraction limited astronomical imaging through the atmosphere, Dissertation (2003), Published by VDM Verlag Dr. Müller, ISBN 3836497697 (2010)
- Batchelor and Townsend, doi:10.1098/rspa.1949.0136 The nature of turbulent motion at large wave-numbers, Proceedings of the Royal Society of London A, 199, pp. 238-255 (1949)
- Baldwin, Warner, and Mackay, doi:10.1051/0004-6361:20079214 The point spread function in Lucky Imaging and variations in seeing on short timescales, A&A 480, pp. 589-597 (2008)
- Robert N. Tubbs, doi:10.1117/12.671170 The effect of temporal fluctuations in r0 on high-resolution observations, SPIE 6272, pp. 93T (2006)
- Richard Tresch Fienberg, Sharpening the 200 Inch, Sky and Telescope (September 14, 2007)
- Baldwin et al., doi:10.1051/0004-6361:20010118 Diffraction-limited 800 nm imaging with the 2.56 m Nordic Optical Telescope, A&A 368, pp. L1–L4 (2001)
- Lucky Imaging at the Institute of Astronomy, University of Cambridge
- Dantowitz, Teare, and Kozubal, doi:10.1086/301328 Ground-based High-Resolution Imaging of Mercury, AJ 119, pp. 2455–2457 (2000)
- "HDR+: Low Light and High Dynamic Range photography in the Google Camera App". Google AI Blog. Retrieved 2019-08-02.
- "Introducing the HDR+ Burst Photography Dataset". Google AI Blog. Retrieved 2019-08-02.
- Amateur lucky imaging
- Lucky imaging with Astralux at the 2.2 m Calar Alto telescope
- Details of the Calar Alto and La Silla lucky imaging instruments
- Details of the LuckyCam instrument at the Nordic Optical Telescope
- BBC News article: 'Clearest' images taken of space
- Lucky imaging using gen 3 intensifier tubes