Apple Video

From Wikipedia, the free encyclopedia

Apple Video is a lossy video compression and decompression algorithm (codec) developed by Apple Inc. and first released as part of QuickTime 1.0 in 1991.[1] The codec is also known as QuickTime Video, by its FourCC RPZA, and by the name Road Pizza.[2][3] When used in the AVI container, the FourCC AZPR is also used.[3] The bit-stream format of Apple Video has been reverse-engineered, and decoders have been implemented in the XAnim and libavcodec projects.[4][2]

Technical details

The codec operates on 4×4 blocks of pixels in the RGB colorspace. Each frame is segmented into 4×4 blocks in raster-scan order. Each block is coded in one of four modes: skip, single color, four color, or 16 color.[3] Colors are represented by 16 bits, with a bit depth of 5 bits for each of the three components red, green, and blue, a format known as RGB555.[3] Because Apple Video operates in the image domain without motion compensation, decoding is much faster than for MPEG-style codecs, which use motion compensation and perform coding in a transform domain. As a tradeoff, the compression performance of Apple Video is lower.
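As an illustration, an RGB555 pixel can be unpacked into its three 5-bit components with shifts and masks. This is a minimal sketch; the exact bit layout assumed here (red in the high bits, top bit unused) is illustrative, not taken from the format specification:

```python
def unpack_rgb555(value):
    """Split a 16-bit RGB555 word into 5-bit red, green, blue components.
    Assumed layout: 0RRRRRGGGGGBBBBB (top bit unused in this sketch)."""
    red = (value >> 10) & 0x1F
    green = (value >> 5) & 0x1F
    blue = value & 0x1F
    return red, green, blue

# White: all three components at their 5-bit maximum of 31.
print(unpack_rgb555(0x7FFF))  # → (31, 31, 31)
```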

Skip mode

The skip mode realizes conditional replenishment: if a block is coded in skip mode, the content of the block at the same location in the previous frame is copied to the current frame.[3] Runs of skip blocks are coded with run-length encoding, enabling a high compression ratio in static areas of the picture.[3]

Single color mode

In single color mode, all pixels in a block are decoded to the same color.[3] This can be interpreted as a palette with a single entry.
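In code, single color mode reduces to filling the block with one value (an illustrative sketch):

```python
def decode_single_color_block(color):
    """Single color mode: every pixel of the 4x4 block gets the same
    RGB555 value read from the bit-stream."""
    return [[color] * 4 for _ in range(4)]
```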

Four color mode

In four color mode, each pixel in a block is decoded as one of four colors specified in a palette.[3] To select one of the four entries, two bits per pixel are written to the bit-stream. The same palette is used for a run of between one and 32 blocks.[3] Of the four colors, two are explicitly written to the bit-stream, while the other two are calculated at the decoder by linear interpolation in the RGB colorspace using the following equations:

\mathrm{color1} = \frac{21}{32}\,\mathrm{color0} + \frac{11}{32}\,\mathrm{color3} \approx \frac{2}{3}\,\mathrm{color0} + \frac{1}{3}\,\mathrm{color3}
\mathrm{color2} = \frac{11}{32}\,\mathrm{color0} + \frac{21}{32}\,\mathrm{color3} \approx \frac{1}{3}\,\mathrm{color0} + \frac{2}{3}\,\mathrm{color3}

where color0 and color3 are the two colors written in the bit-stream.[3] The four colors can be interpreted as lying equidistantly spaced on a line segment in the three-dimensional vector space spanned by the red, green, and blue components; the endpoints of this segment are written in the bit-stream. A similar color-interpolation scheme is used in S3 Texture Compression.

Interpreted as vector quantization, a three-dimensional vector with the components red, green, and blue is quantized using a codebook with four entries.
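A sketch of building the four-entry palette from the two transmitted colors, applying the 21/32 and 11/32 weights per RGB component. The truncating integer division used here is an assumption; a real decoder must match the reference arithmetic bit-exactly:

```python
def build_four_color_palette(color0, color3):
    """Derive the two intermediate palette entries by linear interpolation.
    color0 and color3 are (r, g, b) tuples of 5-bit components."""
    def mix(near, far):
        # Weight 21/32 toward `near` and 11/32 toward `far`, per component.
        return tuple((21 * n + 11 * f) // 32 for n, f in zip(near, far))
    return [color0, mix(color0, color3), mix(color3, color0), color3]

# Interpolating between black and white yields four roughly equidistant greys.
print(build_four_color_palette((0, 0, 0), (31, 31, 31)))
# → [(0, 0, 0), (10, 10, 10), (20, 20, 20), (31, 31, 31)]
```

Each pixel's 2-bit index then simply selects one of these four entries.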

16 color mode

In 16 color mode, the color of each pixel in a block is explicitly written to the bit-stream.[3] This mode is lossless and equivalent to storing the raw pixel values without any compression.
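This mode reads one 16-bit value per pixel in raster order. A sketch, where `read_u16` is an assumed callback returning the next 16-bit word from the bit-stream:

```python
def decode_raw_block(read_u16):
    """16 color mode: one RGB555 value per pixel, in raster order,
    taken straight from the bit-stream with no further decoding."""
    return [[read_u16() for _ in range(4)] for _ in range(4)]
```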

References

  1. Guillermo A. Ortiz (Summer 1991). "QuickTime 1.0: "You oughta be in pictures"" (PDF). Apple Computer. Retrieved 14 April 2013.
  2. "FFmpeg Documentation". FFmpeg. Retrieved 4 April 2013.
  3. "Apple RPZA". MultimediaWiki. 11 December 2008. Retrieved 4 April 2013.
  4. Mark Podlipec (10 December 1997). "xanim.2.70.6.4.2 README". XAnim. Retrieved 4 April 2013.