From Wikipedia, the free encyclopedia
For the business-related fill rate, see Service rate.

The term fillrate usually refers to the number of pixels a video card can render and write to video memory per second. Fillrates are given in megapixels per second or, for newer cards, gigapixels per second, and are obtained by multiplying the number of raster operations units (ROPs) by the clock frequency of the graphics processing unit (GPU). However, there is no agreement on how to calculate and report fillrates; other methods multiply the number of texture units, or the number of pixel pipelines, by the clock frequency.[1] The result of any of these multiplications is a theoretical number, and the actual fillrate depends on many other factors. In the past, video card manufacturers such as ATI and NVIDIA used the fillrate as an indicator of performance, but its importance as a measurement has declined as the bottleneck in graphics applications has shifted; today, for example, the number and speed of pixel shader units attract more attention.
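The competing calculation methods described above can be sketched as follows. This is a minimal illustration; the figures used (16 ROPs, 8 texture units, a 500 MHz core clock) are hypothetical and do not describe any real card.

```python
# Theoretical fillrate estimates, following the methods described above.
# All hardware figures below are hypothetical examples.

def pixel_fillrate(rops: int, clock_mhz: float) -> float:
    """Pixel fillrate in megapixels per second: ROPs x core clock (MHz)."""
    return rops * clock_mhz

def texel_fillrate(texture_units: int, clock_mhz: float) -> float:
    """Texel fillrate in megatexels per second: texture units x core clock (MHz)."""
    return texture_units * clock_mhz

# A hypothetical GPU with 16 ROPs and 8 texture units at 500 MHz:
print(pixel_fillrate(16, 500))  # 8000 Mpixels/s, i.e. 8 Gpixels/s
print(texel_fillrate(8, 500))   # 4000 Mtexels/s
```

Note that the two methods give different numbers for the same chip, which is exactly why reported fillrates are not directly comparable across vendors, and why either figure is only a theoretical ceiling.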

Scene complexity can be increased by overdraw, which happens when "an object is drawn to the frame buffer, and then another object (such as a wall) is drawn on top of it, covering it up. The time spent drawing the first object was wasted because it isn't visible." When a sequence of scenes is extremely complex (many pixels have to be drawn for each scene), the frame rate may drop. When designing graphics-intensive applications, one can determine whether the application is fillrate-limited by checking whether the frame rate increases dramatically when the application runs at a lower resolution or in a smaller window.[2]
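The effect of overdraw and resolution on fillrate demand can be estimated with simple arithmetic. The sketch below is illustrative only; the resolution, overdraw factor, and frame rate are assumed example values, not measurements.

```python
# Rough estimate of the fillrate a scene demands, showing why overdraw
# and resolution matter. All input numbers are hypothetical examples.

def required_fillrate_mpixels(width: int, height: int,
                              overdraw: float, fps: float) -> float:
    """Pixels written per second, in megapixels.

    `overdraw` is the average number of times each screen pixel is
    drawn per frame (1.0 means no overdraw at all).
    """
    return width * height * overdraw * fps / 1e6

# 1920x1080 with an average overdraw of 3x at 60 fps:
full = required_fillrate_mpixels(1920, 1080, 3.0, 60)  # ~373 Mpixels/s

# Halving the resolution in each dimension cuts the demand by 4x.
# If the observed frame rate rises sharply at the smaller size,
# the application is likely fillrate-limited.
half = required_fillrate_mpixels(960, 540, 3.0, 60)    # ~93 Mpixels/s

print(full, half)
```

This mirrors the diagnostic described above: a fillrate-limited application sees its frame rate scale with the number of pixels drawn, while a CPU- or geometry-limited one does not.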

References

  1. Don Woligroski (July 31, 2006). "Graphics Beginner's Guide, Part 2: Graphics Technology". Tom's Hardware.
  2. "Fill rate". DmWiki.