A streak camera is an instrument for measuring the variation of a light pulse's intensity with time. Streak cameras are used to measure the pulse duration of some ultrafast laser systems and in applications such as time-resolved spectroscopy and LIDAR.
A streak camera operates by transforming the time variations of a light pulse into a spatial profile on a detector, by causing a time-varying deflection of the light across the width of the detector. A light pulse enters the instrument through a narrow slit along one direction and is deflected in the perpendicular direction, so that photons arriving first strike the detector at a different position from photons arriving later.
The resulting image forms a "streak" of light, from which the duration and other temporal properties of the light pulse can be inferred. To record periodic phenomena, a streak camera usually needs to be triggered in synchrony with the event, much like an oscilloscope.
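The time-to-space mapping described above can be illustrated with a short simulation. This is a hypothetical sketch, not an instrument model: it assumes a Gaussian pulse of 2 ps FWHM, a linear 10 ps sweep across a 1000-pixel detector, and made-up parameter names throughout. Histogramming the deflected photon positions reproduces the "streak", and its width recovers the pulse duration.

```python
import numpy as np

# Hypothetical illustration of the time-to-space mapping: a linear sweep
# deflects photons so that arrival time t maps to detector position x.
rng = np.random.default_rng(0)

# Simulated photon arrival times: a Gaussian pulse, 2 ps FWHM, centred at 5 ps.
fwhm_ps = 2.0
sigma_ps = fwhm_ps / 2.355          # FWHM = 2.355 sigma for a Gaussian
arrival_ps = rng.normal(5.0, sigma_ps, size=100_000)

# Linear sweep: a 10 ps window swept across a 1000-pixel detector,
# so each picosecond of arrival time spans 100 pixels.
sweep_window_ps = 10.0
n_pixels = 1000
pixel = np.clip((arrival_ps / sweep_window_ps * n_pixels).astype(int),
                0, n_pixels - 1)

# The streak profile is simply the photon count per detector pixel.
streak = np.bincount(pixel, minlength=n_pixels)

# Recover the pulse duration (FWHM) from the width of the streak.
half_max = streak.max() / 2
above = np.nonzero(streak >= half_max)[0]
recovered_fwhm_ps = (above[-1] - above[0]) / n_pixels * sweep_window_ps
print(f"recovered FWHM ~ {recovered_fwhm_ps:.2f} ps")
```

The recovered width agrees with the input 2 ps pulse to within the Poisson noise of the histogram, which is the essential point: once time is mapped onto position, an ordinary spatial measurement yields the temporal profile.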
Optoelectronic streak cameras work by directing the light onto a photocathode, which, when hit by photons, produces electrons via the photoelectric effect. The electrons are accelerated in a cathode-ray tube and pass through an electric field produced by a pair of plates, which deflects them sideways. By modulating the electric potential between the plates, the field is changed rapidly to give a time-varying deflection, sweeping the electrons across a phosphor screen at the end of the tube. A linear detector, such as a charge-coupled device (CCD) array, measures the streak pattern on the screen, and thus the temporal profile of the light pulse.
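The deflection step can be made concrete with the standard small-angle result for parallel plates. The numbers below (5 kV acceleration, 3 cm plates with a 1 cm gap, 20 cm drift to the screen, a 0–200 V ramp over 100 ps) are illustrative assumptions, not specifications of any real instrument; they show how a linear voltage ramp turns arrival time into screen position.

```python
import numpy as np

# Hypothetical tube geometry (assumptions for illustration only).
ACCEL_VOLTAGE = 5e3   # V, accelerating potential of the tube
PLATE_LENGTH = 0.03   # m, length of the deflection plates
PLATE_GAP = 0.01      # m, spacing between the plates
DRIFT_LENGTH = 0.20   # m, from the plates to the phosphor screen

def screen_deflection(plate_voltage):
    """Transverse displacement (m) on the screen for a given plate voltage.

    Small-angle result y = V_p * L * D / (2 * d * V_acc); the electron
    charge and mass cancel, because both the axial kinetic energy and
    the transverse force scale with the charge.
    """
    return (plate_voltage * PLATE_LENGTH * DRIFT_LENGTH
            / (2 * PLATE_GAP * ACCEL_VOLTAGE))

# A linear ramp from 0 V to 200 V over 100 ps maps arrival time linearly
# onto screen position: the streak sweeps ~12 mm across the screen.
times_ps = np.linspace(0.0, 100.0, 5)
ramp_volts = 200.0 * times_ps / 100.0
positions_mm = 1e3 * screen_deflection(ramp_volts)
for t, y in zip(times_ps, positions_mm):
    print(f"t = {t:5.1f} ps -> y = {y:5.2f} mm")
```

Because the deflection is linear in the plate voltage, a linear sweep gives a uniform time axis on the screen; the sweep speed (here roughly 0.12 mm/ps) sets the instrument's time calibration.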
The time-resolution of the best optoelectronic streak cameras is around 180 femtoseconds. Measurement of pulses shorter than this duration requires other techniques such as optical autocorrelation and frequency-resolved optical gating (FROG).
In December 2011, a team at MIT released images combining a streak camera with repeated laser pulses to simulate a movie with a frame rate of one trillion frames per second. This was surpassed in 2020 by a team at Caltech that achieved 70 trillion frames per second.
- Photo finish, which uses a much slower, two-dimensional version of a camera that maps time onto a spatial dimension
- "Hamamatsu: Interactive Java Tutorials - Streak Camera". Retrieved 2006-10-15.
- Horn, Alexander (2009). Ultra-fast Material Metrology. John Wiley & Sons. p. 7. ISBN 9783527627936.
- Mourou, Gerard A.; Bloom, David M.; Lee, Chi-H. (2013). Picosecond Electronics and Optoelectronics: Proceedings of the Topical Meeting Lake Tahoe, Nevada, March 13–15, 1985. Springer Science & Business Media. p. 58. ISBN 9783642707803.
- "Guide to streak cameras" (PDF). Retrieved 2015-07-07.
- Takahashi, Akira; et al. (16 May 1994). "New femtosecond streak camera with temporal resolution of 180 fs". Proc. SPIE. 2116, Generation, Amplification, and Measurement of Ultrashort Laser Pulses: 275. doi:10.1117/12.175863.
- Chang, Zenghu (2016). Fundamentals of Attosecond Optics. CRC Press. p. 84. ISBN 9781420089387.
- "MIT's trillion frames per second light-tracking camera". BBC News. 2011-12-13. Retrieved 2011-12-14.
- Wang, Peng; Liang, Jinyang; Wang, Lihong V. (29 April 2020). "Single-shot ultrafast imaging attaining 70 trillion frames per second". Nature Communications. 11 (1). doi:10.1038/s41467-020-15745-4.