
GeForce FX series


The GeForce FX or "GeForce 5" series (codenamed NV30) is a line of graphics processing units from the manufacturer NVIDIA.

Overview

NVIDIA's GeForce FX series is the fifth generation of the GeForce line. With the GeForce 3, NVIDIA introduced programmable shader units into its 3D rendering pipeline, in line with the release of Microsoft's DirectX 8.0, and the GeForce 4 Ti was an enhancement of that design. With real-time 3D graphics technology continually advancing, the release of DirectX 9.0 brought further refinement of programmable pipeline technology with the arrival of Shader Model 2.0. The GeForce FX series is NVIDIA's first generation of Shader Model 2 hardware.

The Dawn demo was released by NVIDIA to showcase the pixel and vertex shader effects of the GeForce FX series.

The series was manufactured on the 130 nm fabrication process. Its vertex and pixel shaders are compliant with Shader Model 2.0/2.0A. Among other features was an improved anisotropic filtering implementation that was not angle-dependent. The FX series utilizes various memory technologies, including DDR, DDR2, GDDR-2 and GDDR-3 memory.
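
The angle-independence mentioned above can be illustrated with a toy model. The following Python sketch is a loose illustration only; the fall-off formula in the angle-dependent case is invented for demonstration, not any vendor's real selection logic:

```python
def max_aniso_angle_dependent(surface_angle_deg, hw_max=8):
    # Hypothetical angle-dependent implementation: full anisotropy only
    # near multiples of 45 degrees, with the supported degree falling off
    # in between (the source of "flower"-shaped test patterns).
    off = surface_angle_deg % 45.0
    dist = min(off, 45.0 - off)   # distance to the nearest preferred angle
    return max(2, round(hw_max * (1.0 - dist / 45.0)))

def max_aniso_angle_independent(surface_angle_deg, hw_max=8):
    # GeForce FX-style behavior: the same maximum degree at every angle.
    return hw_max

for angle in (0, 10, 22.5, 45, 60):
    print(angle, max_aniso_angle_dependent(angle),
          max_aniso_angle_independent(angle))
```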

The GeForce FX also included an improved VPE (Video Processing Engine), which was first deployed in the GeForce 4 MX. Its main upgrade was per-pixel video-deinterlacing — a feature first offered in ATI's Radeon, but seeing little use until the maturation of Microsoft's DirectX Video Acceleration and VMR (video mixing renderer) APIs.
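
Per-pixel deinterlacing decides, for each pixel of the missing field, whether to weave the corresponding line from the previous frame (best for static content) or to interpolate vertically within the current field (best for motion). The sketch below is a minimal, generic motion-adaptive deinterlacer in Python with NumPy, written only to illustrate the idea; it is not the VPE's actual hardware algorithm, and the motion threshold is an arbitrary assumption:

```python
import numpy as np

def deinterlace_per_pixel(prev_frame, cur_frame, missing_start=1,
                          motion_threshold=12):
    # Generic motion-adaptive deinterlacing sketch (illustrative only).
    # Frames are 8-bit grayscale (height x width); rows missing_start,
    # missing_start + 2, ... belong to the missing field.
    out = cur_frame.astype(np.int16)
    h, _ = cur_frame.shape
    for y in range(missing_start, h, 2):
        above = out[max(y - 1, 0)]
        below = out[min(y + 1, h - 1)]
        bob = (above + below) // 2              # spatial interpolation
        weave = prev_frame[y].astype(np.int16)  # same line, previous frame
        # Crude per-pixel motion measure: change on the neighboring line.
        motion = np.abs(out[max(y - 1, 0)]
                        - prev_frame[max(y - 1, 0)].astype(np.int16))
        # Static pixels keep the woven detail; moving pixels are bobbed.
        out[y] = np.where(motion > motion_threshold, bob, weave)
    return out.astype(np.uint8)
```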

Hardware based on the NV30 project did not launch until near the end of 2002, several months after ATI had released its competing DirectX 9 architecture, the Radeon 9700.

Marketing

While it is the fifth major revision in the series of GeForce graphics cards, it was not marketed as a GeForce 5. The "FX" ("effects") in the name was chosen to emphasize the design's major improvements and new features, and to distinguish the FX series as something greater than a mere revision of earlier designs. The name also served to advertise that the GeForce FX was the first GPU to be a combined effort of NVIDIA's own engineers and those acquired from 3dfx Interactive.

The advertising campaign for the GeForce FX featured the Dawn fairy demo, the work of several veterans of the computer-animated film Final Fantasy: The Spirits Within. NVIDIA touted it as "The Dawn of Cinematic Computing", while critics noted that it was the most blatant use yet of sex appeal to sell graphics cards.

The initial version of the GeForce FX (the 5800) was one of the first cards to come equipped with a large dual-slot cooling solution. Called "Flow FX", the cooler was very large in comparison to ATI's small, single-slot cooler on the 9700 series.[1] Its blower fan was also very loud, earning the card the derisive nickname "Dustbuster".[2]

The way it's meant to be played

NVIDIA debuted a new campaign to motivate developers to optimize their titles for NVIDIA hardware at the Game Developers Conference (GDC) in 2002. In exchange for prominently displaying the NVIDIA logo on the outside of the game packaging, NVIDIA offered free access to a state-of-the-art test lab in Eastern Europe, which tested games against 500 different PC configurations for compatibility. Developers also had extensive access to NVIDIA engineers, who helped produce code optimized for NVIDIA products.[3]

Overall performance

GeForce FX 5950 Ultra GPU

GeForce FX is an architecture designed with DirectX 7, 8 and 9 software in mind. Its performance with DirectX 7 and 8 software is excellent compared to its competition, but it is much less competitive in software that primarily uses DirectX 9 features.[4]

Its relatively weak Shader Model 2 performance is caused by several factors. The NV3x GPUs have less overall parallelism and calculation throughput than their competitors. The architecture is also comparatively difficult to drive at high efficiency, owing to architectural weaknesses and a resulting heavy reliance on optimized pixel shader code from the shader compiler.[5] Proper instruction ordering and instruction composition of shader code is critical to performance.
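
The sensitivity to instruction ordering can be sketched with a toy in-order pipeline model. The Python below is purely illustrative; the four-cycle latency and the register-level instruction stream are invented, not a model of the real NV3x pipeline:

```python
# Toy in-order pipeline model showing why instruction ordering matters.
# Latencies and the instruction stream are invented for illustration.

LATENCY = 4  # cycles before an instruction's result can be consumed

def cycles(program):
    ready = {}   # register -> cycle at which its value becomes available
    clock = 0
    for dst, srcs in program:
        # An in-order pipeline stalls until all source operands are ready.
        clock = max([clock] + [ready.get(s, 0) for s in srcs])
        clock += 1                      # issuing takes one cycle
        ready[dst] = clock + LATENCY    # result usable LATENCY cycles later
    return clock

# Fully dependent chain: each instruction consumes the previous result.
serial = [("r1", ["r0"]), ("r2", ["r1"]), ("r3", ["r2"]), ("r4", ["r3"])]

# The same number of operations arranged as two independent chains.
interleaved = [("r1", ["r0"]), ("r3", ["r0"]), ("r2", ["r1"]), ("r4", ["r3"])]

print(cycles(serial))       # stalls on every instruction
print(cycles(interleaved))  # independent work hides part of the latency
```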

Questionable tactics

GeForce FX 5950 Ultra

NVIDIA has historically been known for impressive OpenGL driver performance and quality, and the FX series maintained this reputation. With regard to image quality in both Direct3D and OpenGL, however, the company began aggressively applying optimization techniques not seen before. It started with filtering optimizations, changing how trilinear filtering operated on game textures and thereby reducing its accuracy and visual quality.[6] Anisotropic filtering was also tweaked dramatically to limit its use on as many textures as possible, saving memory bandwidth and fillrate.[6] Tweaks of this kind to texture filtering can often be spotted in games as a shimmering phenomenon on floor textures as the player moves through the environment, often signifying poor transitions between mipmaps. Changing the driver settings to "High Quality" can alleviate this at the cost of performance.[6]
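
The trilinear change in question, often nicknamed "brilinear" by reviewers, is easiest to see in terms of the blend weight between adjacent mipmap levels. The Python sketch below is a simplified illustration with an invented band width, not the actual driver logic:

```python
def trilinear_blend(lod_frac):
    # Full trilinear: the blend weight between adjacent mip levels
    # varies smoothly across the entire transition range.
    return lod_frac

def reduced_trilinear_blend(lod_frac, band=0.3):
    # "Brilinear" (band width invented for illustration): sample only the
    # nearest mip level outside a narrow window around the transition,
    # saving texture bandwidth but producing visible mip boundaries.
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0    # pure bilinear from the finer mip level
    if lod_frac >= hi:
        return 1.0    # pure bilinear from the coarser mip level
    return (lod_frac - lo) / (hi - lo)

for f in (0.0, 0.2, 0.4, 0.5, 0.6, 0.8, 1.0):
    print(f, trilinear_blend(f), round(reduced_trilinear_blend(f), 2))
```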

NVIDIA also replaced pixel shader code in software with GeForce FX-optimized versions of lower accuracy. These "tweaks" were especially noticed in benchmark software from Futuremark. In 3DMark03 it was found that NVIDIA had gone to extremes to limit the complexity of the scenes, through driver shader replacements and aggressive hacks that prevented parts of the scene from rendering at all.[7] Side-by-side analysis of screenshots in games and 3DMark03 showed noticeable differences between what a Radeon 9800/9700 displayed and what the FX series rendered.[7] NVIDIA also publicly attacked the usefulness of these programs, and the techniques used within them, in order to undermine their influence upon consumers. It should be noted, however, that ATI also created a software profile for 3DMark03.[8] Application-specific optimizations are typical practice to fix bugs and enhance performance, but they become controversial when they significantly degrade visual quality in order to gain speed. In response, Futuremark began updating its software and screening driver releases for these optimizations.
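
Such side-by-side screenshot analysis amounts to a per-pixel image comparison. A minimal sketch using the Pillow and NumPy libraries, assuming two same-sized screenshots (the file names and tolerance are placeholders):

```python
import numpy as np
from PIL import Image

# Minimal per-pixel screenshot comparison, of the kind used to expose
# rendering differences between cards. File names are placeholders.
a = np.asarray(Image.open("radeon_9800.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("geforce_fx.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b).max(axis=2)    # strongest channel difference per pixel
changed = (diff > 8).mean() * 100   # % of pixels beyond a small tolerance
print(f"{changed:.1f}% of pixels differ noticeably")

# Save an amplified difference map for visual inspection.
Image.fromarray(np.clip(diff * 4, 0, 255).astype(np.uint8)).save("diff.png")
```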

Hardware refreshes and diversification

GeForce FX 5500-SX

NVIDIA's only initial release, the GeForce FX 5800, was intended as a high-end part; there were no GeForce FX products for the other segments of the market. The GeForce 4 MX continued in its role as the budget video card, and the older GeForce 4 Ti cards filled in the mid-range.

In April 2003, NVIDIA introduced the mid-range GeForce FX 5600 and budget GeForce FX 5200 models to address the other market segments. Each had an "Ultra" variant and a slower, cheaper non-Ultra variant. With conventional single-slot cooling and a mid-range price tag, the 5600 Ultra had respectable performance but failed to measure up to its direct competitor, the Radeon 9600 Pro. The GeForce FX 5600 parts did not even advance performance over the GeForce 4 Ti chips they were designed to replace.[9] Likewise, the entry-level FX 5200 did not perform as well as the DirectX 7.0-generation GeForce 4 MX440, despite possessing a notably superior feature set.[10] The FX 5200 was also outperformed by the older Radeon 9000.[citation needed]

In May 2003, NVIDIA launched a new top-end model, the GeForce FX 5900 Ultra. This chip, based on the heavily revised NV35 GPU, fixed many of the shortcomings of the 5800, which had been discontinued. While the 5800 used fast but hot and expensive GDDR-2 on a 128-bit memory bus, the 5900 moved to slower and cheaper DDR SDRAM on a wider 256-bit memory bus. The 5900 Ultra performed somewhat better than the Radeon 9800 Pro in games not making heavy use of Shader Model 2, and had a quieter cooling system than the 5800.[11]
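
The bandwidth trade-off behind that change is simple arithmetic. The sketch below uses commonly cited reference clocks for the two cards; treat the exact figures as assumptions, since period reviews vary slightly:

```python
def bandwidth_gb_per_s(bus_width_bits, effective_mt_per_s):
    # bandwidth = bus width in bytes * effective transfer rate
    return (bus_width_bits // 8) * effective_mt_per_s * 1e6 / 1e9

# Commonly cited reference clocks (assumed figures for illustration):
# FX 5800 Ultra: 128-bit GDDR-2 at 500 MHz (1000 MT/s effective)
# FX 5900 Ultra: 256-bit DDR at 425 MHz (850 MT/s effective)
print(bandwidth_gb_per_s(128, 1000))  # -> 16.0 GB/s
print(bandwidth_gb_per_s(256, 850))   # -> 27.2 GB/s
```

Despite the slower memory chips, the doubled bus width gives the 5900 roughly 70% more raw bandwidth under these assumed clocks.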

In October 2003, NVIDIA released a more potent mid-range card using technology from the NV35: the GeForce FX 5700, built on the new NV36 core. The FX 5700 was ahead of the Radeon 9600 Pro and XT in games with light use of Shader Model 2.[12] In December 2003, NVIDIA launched the 5900 XT, a graphics card intended for the mid-range segment. It was similar to the 5900, but clocked lower and equipped with slower memory. It managed to defeat the Radeon 9600 XT more soundly, but still fell behind in a few shader-heavy scenarios.[13]

The final GeForce FX model released was the 5950 Ultra, a 5900 Ultra with higher clock speeds. The board was fairly competitive with the Radeon 9800 XT, again as long as pixel shaders were lightly used.[14]

Windows Vista and GeForce FX PCI cards

Windows Vista requires a DirectX 9-compliant 3D accelerator to display the full Windows Aero user interface. During pre-release testing of Vista and upon launch of the operating system, the video card options for owners of computers without AGP or PCIe slots were limited almost exclusively to PCI cards based on the NVIDIA NV34 core, such as the GeForce FX 5200 and 5500 PCI. Since then, both ATI and NVIDIA have launched a number of DirectX 9 PCI cards based on newer architectures.

Discontinued support

NVIDIA has ceased driver support for the GeForce FX series.

Final Drivers Include

  • Windows 9x & Windows Me: 81.98, released on December 21, 2005 (Product Support List Windows 95/98/Me – 81.98).
  • Windows 2000, 32-bit Windows XP & Media Center Edition: 175.19, released on July 9, 2008 (products supported are also listed on the driver page).
Note that the 175.19 driver is known to break Windows Remote Desktop (RDP);[15] the last version before the problem is 174.74. The issue was apparently fixed in 177.83, although this version is not available for GeForce FX graphics cards.[16] Also worth noting, 163.75 is the last known driver that correctly handles the adjustment of video overlay color properties for the GeForce FX series; subsequent WHQL drivers either do not handle the whole range of possible video overlay adjustments (169.21) or have no effect on them (175.xx).
  • Windows Vista RC2: 96.85, released on October 17, 2006 (Product Support List Windows Vista – 96.85).

References

  1. ^ Cite error: The named reference TRGFFX5800U was invoked but never defined (see the help page).
  2. ^ From Voodoo to GeForce: The Awesome History of 3D Graphics
  3. ^ Ferret, Wily (May 4, 2007). "Post-NVIDIA man writes in". The Inquirer. Retrieved 2008-06-14.
  4. ^ Cross, Jason. Benchmarking Half-Life 2: ATI vs. NVIDIA, ExtremeTech, November 29, 2004.
  5. ^ Cite error: The named reference 3dcenternv30 was invoked but never defined (see the help page).
  6. ^ a b c StealthHawk. Forceware texture filtering quality study, NVNews Forum, October 16, 2003.
  7. ^ a b Wasson, Scott (June 5, 2003). "Further NVIDIA optimizations for 3DMark03?". Tech Report. Retrieved 2008-06-14.
  8. ^ Shilov, Anton (May 23, 2003). "Futuremark Caught NVIDIA and ATI Technologies On Cheating In 3DMark03". X-bit labs. Retrieved 2008-06-14.
  9. ^ Gasior, Geoff (May 6, 2003). "Nvidia's GeForce FX 5600 GPU". Tech Report. Retrieved 2008-06-14.
  10. ^ Gasior, Geoff (April 29, 2003). "Nvidia's GeForce FX 5200 GPU". Tech Report. Retrieved 2008-06-14.
  11. ^ Bell, Brandon (June 20, 2003). "eVGA e-GeForce FX 5900 Ultra Review". FiringSquad. Retrieved 2008-06-14.
  12. ^ Gasior, Geoff (October 23, 2003). "NVIDIA's GeForce FX 5700 Ultra GPU". Tech Report. Retrieved 2008-06-14.
  13. ^ Gasior, Geoff (December 15, 2003). "NVIDIA's GeForce FX 5900 XT GPU". Tech Report. Retrieved 2008-06-14.
  14. ^ Hagedoorn, Hilbert (October 23, 2003). "GeForce FX 5700 Ultra & 5950 Ultra Review". Guru3D. Archived from the original on 2007-08-20. Retrieved 2008-06-14.
  15. ^ User forum complaints about v175.19 driver breaking RDP
  16. ^ AnandTech forum post regarding RDP issue