"It results from intermodulation or crosstalk between chrominance and luminance components of the signal, which are imperfectly multiplexed in the frequency domain." <<<In English? —Preceding unsigned comment added by 184.108.40.206 (talk) 12:55, 4 December 2008 (UTC)
- Translation: The analog TV signal is historically a black-and-white signal. The preferred way to transmit an analog video signal is to send the three signals for red, green and blue separately, as is done in a VGA cable. But for compatibility reasons, and to save bandwidth, the TV engineers decided to keep the old black-and-white norms and to encode the color as dot crawl. For PAL and NTSC, this means: the intensity of the pattern encodes the saturation, and the shape of the pattern encodes the hue. There are different ways to remove the dot crawl from the luminance information after the color has been decoded. The easiest is to simply blur the image, which is how very old color TVs did it. The most advanced is to use a comb filter: it takes the dot crawl from the previous image and adds it to the dot crawl of the current image. As the phase of the pattern shifts between two images, the two patterns cancel out, provided the color has not changed in between. Most TVs, however, work this way: they decode the color information, blur it a little, and re-encode it as a negative pattern. This eliminates most of the pattern, but not at sharp edges where the color changes. -- Sloyment (talk) 07:36, 20 September 2011 (UTC)
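A minimal sketch of the temporal comb filter described above, assuming the chroma subcarrier inverts phase between consecutive frames of a static image (all names and numbers here are illustrative, not from any actual decoder):

```python
def comb_filter(prev_frame, cur_frame):
    """Average two frames sample-by-sample.

    Luma is (ideally) identical in both frames, so it passes through;
    the subcarrier (the "dot crawl" pattern) has opposite sign in the
    two frames and cancels.
    """
    return [(a + b) / 2 for a, b in zip(prev_frame, cur_frame)]

# Toy 1-D scanline: constant luma 0.5 plus an alternating subcarrier.
luma = [0.5] * 8
subcarrier = [0.25 * (-1) ** i for i in range(8)]

frame1 = [l + s for l, s in zip(luma, subcarrier)]  # subcarrier phase 0
frame2 = [l - s for l, s in zip(luma, subcarrier)]  # phase inverted

print(comb_filter(frame1, frame2))  # → [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]
```

The same cancellation idea works line-to-line instead of frame-to-frame (a 2-line comb), which is why the filter fails exactly where the text says it does: wherever the color or brightness actually changes between the two samples being averaged.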
It would be useful to have a discussion of SECAM here, which presumably has its own distinctive colour/luminance artifacts.
Does anyone have a screenshot of the effect?
- In SECAM, color bleeding looks awful (just google for “SECAM fire” or “SECAM flares”). It results in bright red and blue bars. As this has to be prevented, the luminance signal has to be reduced in bandwidth (i.e. blurred) on the sender side. The receiver can then do the same to remove the dot crawl. Without removal, the dot crawl would be all over the picture at constant intensity. -- Sloyment (talk) 07:36, 20 September 2011 (UTC)
"Dot crawl is most visible when the chrominance is transmitted with a high bandwidth, so that its spectrum reaches well into the band of frequencies used by the luminance signal in the composite video signal. This causes high-frequency chrominance detail at color transitions to be interpreted as luminance detail" -- nope, it's the other way around (high-bandwidth luminance is indistinguishable from chrominance in a composite video signal). —Preceding unsigned comment added by 220.127.116.11 (talk) 19:47, 31 May 2008 (UTC)
- ...and so there are two visible crosstalk effects: chroma into luma and luma into chroma. Both are correctly described in the article. Cuddlyable3 (talk) 20:39, 12 February 2011 (UTC)
A still or animated GIF of dot crawl would be helpful for readers.
Dot crawl isn't the reason why finely patterned suits are avoided on TV. The real reason is the frequency-inversion phenomenon (aliasing) that appears when sampling a signal at a rate lower than twice its bandwidth (Nyquist). It typically generates moiré patterns.
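A hedged sketch of that frequency-inversion effect: a fine pattern sampled below its Nyquist rate becomes indistinguishable from a much coarser one, which is the moiré. The frequencies below are illustrative only:

```python
import math

signal_freq = 9.0   # cycles per unit: a fine fabric pattern
sample_rate = 10.0  # samples per unit: well below 2 * signal_freq

# Sample the fine pattern...
samples = [math.cos(2 * math.pi * signal_freq * n / sample_rate)
           for n in range(20)]

# ...and note the samples match a slow 1-cycle-per-unit pattern exactly:
# the 9-cycle pattern "folds" down to |sample_rate - signal_freq| = 1.
alias = [math.cos(2 * math.pi * 1.0 * n / sample_rate) for n in range(20)]

print(all(abs(a - b) < 1e-9 for a, b in zip(samples, alias)))  # → True
```

On a TV camera the "sampling" is done spatially by the scan lines and the pickup grid, but the arithmetic is the same: detail finer than half the sampling pitch reappears as a coarse, shifting moiré.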
The dot crawl effect also happens with SECAM and NTSC. In SECAM, the dot crawl effect gets worse: it undulates both horizontally and vertically, making it look like it goes round in tiny circles counterclockwise. In NTSC, the visuals are very similar to those of PAL, but instead of the dots running down, they run up.
Dot crawl is sometimes referred to as "hanging dots". For instance, the TI TP5154 video decoder provides a comb filter to reduce/remove "hanging dots at color boundaries". Jsarao (talk) 16:56, 13 October 2011 (UTC)