GeForce FX series

From Wikipedia, the free encyclopedia
Revision as of 20:20, 26 August 2010

The GeForce FX or "GeForce 5" series (codenamed NV30) is a line of graphics processing units from the manufacturer NVIDIA.

Overview

NVIDIA's GeForce FX series is the fifth generation of the GeForce line. With GeForce 3, NVIDIA introduced programmable shader functionality into their 3D architecture, in line with the release of Microsoft's DirectX 8.0. The GeForce 4 Ti was an enhancement of the GeForce 3 technology. With real-time 3D graphics technology continually advancing, the release of DirectX 9.0 brought further refinement of programmable pipeline technology with the arrival of Shader Model 2.0. The GeForce FX series is NVIDIA's first generation Direct3D 9-compliant hardware.

[Image: the Dawn demo was released by NVIDIA to showcase the new programmable shader effects of the GeForce FX series]

The series was manufactured on the 130 nm fabrication process.[1] It is Shader Model 2.0/2.0A compliant, allowing for more flexibility in complex shader/fragment programs and much higher arithmetic precision. It supports a number of new memory technologies, including DDR2, GDDR-2, and GDDR-3, and saw NVIDIA's first implementation of a memory data bus wider than 128 bits.[2] The anisotropic filtering implementation has potentially higher quality than previous NVIDIA designs.[1] Anti-aliasing methods have been enhanced, and additional modes are available compared to GeForce 4.[1] Memory bandwidth and fill-rate optimization mechanisms have been improved.[1] Some members of the series offer double fill-rate in z-buffer/stencil-only passes.[2]
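The fill-rate figures behind claims like the z-only doubling are simple arithmetic: pixels per clock multiplied by core clock. A minimal sketch, using illustrative clock and pipeline counts (assumptions for demonstration, not figures from this article):

```python
# Back-of-the-envelope fill-rate arithmetic for a 4-pipeline GPU.
# The clock and pipeline numbers below are illustrative assumptions.

core_clock_mhz = 500      # assumed core clock
color_pipelines = 4       # pixels written per clock in a color pass
z_only_factor = 2         # some FX parts double throughput in
                          # z-buffer/stencil-only passes

color_fill = core_clock_mhz * 1e6 * color_pipelines
z_only_fill = color_fill * z_only_factor

print(f"color fill rate: {color_fill / 1e9:.1f} Gpixel/s")   # 2.0
print(f"z/stencil-only:  {z_only_fill / 1e9:.1f} Gpixel/s")  # 4.0
```

A z/stencil-only pass writes no color, so the doubled figure applies to workloads such as depth pre-passes and stencil shadow volumes.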

The series also brought improvements to NVIDIA's video processing hardware, in the form of the Video Processing Engine (VPE), which was first deployed in the GeForce 4 MX.[3] The primary addition, compared to previous NVIDIA video processors, was per-pixel video-deinterlacing.[3]

The initial version of the GeForce FX (the 5800) was one of the first cards to come equipped with a large dual-slot cooling solution. Called "Flow FX", the cooler was very large in comparison to ATI's small, single-slot cooler on the 9700 series.[4] It was jokingly referred to as the 'Dustbuster', due to a high level of fan noise.[5]

The advertising campaign for the GeForce FX featured the Dawn technology demo, the work of several veterans from the computer-animated film Final Fantasy: The Spirits Within.[6] NVIDIA touted it as "The Dawn of Cinematic Computing", while critics noted that it was the most overt use of sex appeal yet to sell graphics cards.[7]

The way it's meant to be played

At the Game Developers Conference (GDC) in 2002, NVIDIA debuted a new campaign to motivate developers to optimize their titles for NVIDIA hardware. In exchange for prominently displaying the NVIDIA logo on the outside of the game packaging, NVIDIA offered free access to a state-of-the-art test lab in Eastern Europe that tested games against 500 different PC configurations for compatibility. Developers also had extensive access to NVIDIA engineers, who helped produce code optimized for NVIDIA products.[8]

Hardware based on the NV30 project did not launch until near the end of 2002, several months after ATI had released their competing DirectX 9 architecture.[6]

Overall performance

GeForce FX is an architecture designed with DirectX 7, 8, and 9 software in mind. Its DirectX 7 and 8 performance is excellent compared with its competition, but it is much less competitive in software that primarily uses DirectX 9 features.[9]

Its relatively weak performance in Shader Model 2 programs is caused by several factors. The NV3x design has less overall parallelism and calculation throughput than its competitors.[10] Compared with the GeForce 6 and ATI's Radeon R3x0, the architecture is also harder to use efficiently, owing to architectural weaknesses and a resulting heavy reliance on optimized pixel shader code.[10] Proper instruction ordering and composition of shader code is critical for making the most of the available computational resources.[10]
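Why instruction ordering matters can be illustrated with a toy in-order pipeline model: a chain of dependent operations stalls waiting on results, while interleaving independent work hides the latency. Everything below (the 3-cycle latency, single-issue model, and register names) is a hypothetical illustration, not NV3x specifics:

```python
# Toy in-order ALU model: a result becomes usable LATENCY cycles
# after its instruction issues, and an instruction stalls until all
# of its source operands are ready. Purely illustrative numbers.

LATENCY = 3  # assumed cycles before a result can be consumed

def cycles(program):
    """program: list of (dest, (src, ...)) register-name tuples."""
    ready = {}   # register -> cycle its value becomes available
    clock = 0
    for dest, srcs in program:
        # stall until every source operand is ready
        clock = max([clock] + [ready.get(s, 0) for s in srcs])
        ready[dest] = clock + LATENCY
        clock += 1   # one issue slot per cycle
    return clock

# Dependent chain: each op immediately consumes the previous result.
chained = [("r1", ("a",)), ("r2", ("r1",)),
           ("r3", ("r2",)), ("r4", ("r3",))]

# Two independent chains interleaved: the stalls are hidden,
# so MORE work finishes in FEWER cycles.
interleaved = [("r1", ("a",)), ("s1", ("b",)),
               ("r2", ("r1",)), ("s2", ("s1",)),
               ("r3", ("r2",)), ("s3", ("s2",))]

print(cycles(chained))      # 4 ops  -> 10 cycles
print(cycles(interleaved))  # 6 ops  ->  8 cycles
```

On a compiler-friendly architecture the scheduler absorbs most of this; the article's point is that NV3x left more of it to hand-tuned shader code.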

Hardware refreshes and diversification

[Image: FX 5950 (NV35)]
[Image: FX 5500 (NV34)]

NVIDIA's initial release, the GeForce FX 5800, was intended as a high-end part. There were no GeForce FX products for the other segments of the market. The GeForce 4 MX continued in its role as the budget video card and the older GeForce 4 Ti cards filled in the mid-range.

In April 2003, NVIDIA introduced the GeForce FX 5600 and the GeForce FX 5200 to address the other market segments. Each had an "Ultra" variant and a slower, budget-oriented variant, and all used conventional single-slot cooling. The 5600 Ultra had respectable performance overall, but it was slower than the Radeon 9600 Pro and sometimes slower than the GeForce 4 Ti series.[11] The FX 5200 did not perform as well as the DirectX 7.0-generation GeForce 4 MX440 or Radeon 9000 Pro.[12]

In May 2003, NVIDIA launched the GeForce FX 5900 Ultra, a new high-end product to replace the low-volume and disappointing FX 5800. Based upon a revised GPU called NV35, which fixed some of the DirectX 9 shortcomings of the discontinued NV30, this product was more competitive with the Radeon 9700 and 9800.[13] In addition to redesigning parts of the GPU, NVIDIA moved to a 256-bit memory data bus, allowing for significantly higher memory bandwidth than the 5800 even when utilizing more common DDR SDRAM instead of DDR2.[13] The 5900 Ultra performed somewhat better than the Radeon 9800 Pro in games not heavily using shader model 2, and had a quieter cooling system than the 5800.[13]
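The bandwidth advantage of the wider bus is straightforward to quantify: peak bandwidth is bus width in bytes times the effective transfer rate, so a 256-bit bus at a lower clock can still comfortably outrun a 128-bit bus at a higher one. A quick sketch with illustrative clock figures (assumptions, not numbers taken from this article):

```python
# Peak memory bandwidth = bus width (bytes) * effective transfer rate.
# Clock figures below are illustrative assumptions.

def bandwidth_gb_s(bus_bits, mem_clock_mhz, transfers_per_clock=2):
    """Peak bandwidth in GB/s for a DDR-style (2 transfers/clock) bus."""
    bytes_per_transfer = bus_bits // 8
    return bytes_per_transfer * mem_clock_mhz * 1e6 * transfers_per_clock / 1e9

# 128-bit DDR2 at a high clock vs 256-bit DDR at a lower clock:
narrow_fast = bandwidth_gb_s(128, 500)  # 16.0 GB/s
wide_slow = bandwidth_gb_s(256, 425)    # 27.2 GB/s
print(narrow_fast, wide_slow)
```

Doubling the bus width doubles bandwidth at equal clocks, which is why the 5900 could move to cheaper DDR SDRAM and still come out well ahead.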

In October 2003, NVIDIA released the GeForce FX 5700 and GeForce FX 5950. The 5700 was a mid-range card using the NV36 GPU, which incorporated technology from NV35, while the 5950 was a high-end card using the NV35 GPU at higher clock speeds. The 5700 provided strong competition for the Radeon 9600 XT in games that made only light use of Shader Model 2.[14] The 5950 was competitive with the Radeon 9800 XT, again as long as pixel shaders were lightly used.[15]

In December 2003, NVIDIA launched the GeForce FX 5900XT, a graphics card intended for the mid-range segment. It was similar to the 5900 Ultra, but clocked lower and paired with slower memory. It competed more closely with the Radeon 9600 XT, but still fell behind in a few shader-intensive scenarios.[16]

The GeForce FX line moved to PCI Express in early 2004 with a number of models, including the PCX 5300, PCX 5750, PCX 5900 and PCX 5950. These cards were largely the same as their AGP predecessors with similar model numbers. To operate on the PCIe bus, an AGP-to-PCIe "HSI bridge" chip on the video card converted the PCIe signals into AGP signals for the GPU.[17]

Also in 2004, the NV34-based GeForce FX 5200 / 5300 series gained a new member, the FX 5500.[18]

Discontinued support

NVIDIA has ceased driver support for the GeForce FX series.

  • Windows 9x & Windows Me: 81.98 released on December 21, 2005; Download;
Product Support List Windows 95/98/Me – 81.98.
  • Windows 2000, 32-bit Windows XP & Media Center Edition: 175.19 released on July 9, 2008; Download. (Products supported list also on this page)
Note that the 175.19 driver is known to break Windows Remote Desktop (RDP).[19] The last version before the problem is 174.74. The issue was apparently fixed in 177.83, although that version is not available for GeForce 5 graphics cards.[20] Also of note, 163.75 is the last driver that correctly handles adjustment of the video overlay color properties for the GeForce FX series; subsequent WHQL drivers either do not handle the whole range of possible video overlay adjustments (169.21) or have no effect on them (175.xx).
  • Windows Vista & 7: 96.85 released on October 17, 2006; Download;
Product Support List Windows Vista – 96.85.

References

  1. ^ a b c d Lal Shimpi, Anand (2002-11-18). "NVIDIA Introduces GeForce FX (NV30)". Anandtech. Retrieved 2010-08-25.
  2. ^ a b Barkovoi, Aleksei and Vorobiev, Andrey (2003). "NVIDIA GeForce FX 5900 Ultra 256MB Video Card Review". X-bit labs. Retrieved 2010-08-25.
  3. ^ a b "Video Processing Engine". NVIDIA. Retrieved 2010-08-25.
  4. ^ Wasson, Scott (April 7, 2003). "NVIDIA's GeForce FX 5800 Ultra GPU". Tech Report. Retrieved 2008-06-14.
  5. ^ From Voodoo to GeForce: The Awesome History of 3D Graphics
  6. ^ a b "Dawn Demo". NVIDIA. Retrieved 2010-08-25.
  7. ^ "Cinematic Computing For Every User" (PDF). NVIDIA. Retrieved 2010-08-25.
  8. ^ Ferret, Wily (May 4, 2007). "Post-NVIDIA man writes in". The Inquirer. Retrieved 2008-06-14.
  9. ^ Cross, Jason. Benchmarking Half-Life 2: ATI vs. NVIDIA, ExtremeTech, November 29, 2004.
  10. ^ a b c Demirug. CineFX (NV30) Inside, 3DCenter, August 31, 2003.
  11. ^ Gasior, Geoff (May 6, 2003). "Nvidia's GeForce FX 5600 GPU". Tech Report. Retrieved 2008-06-14.
  12. ^ Gasior, Geoff (April 29, 2003). "Nvidia's GeForce FX 5200 GPU". Tech Report. Retrieved 2008-06-14.
  13. ^ a b c Bell, Brandon (June 20, 2003). "eVGA e-GeForce FX 5900 Ultra Review". FiringSquad. Retrieved 2008-06-14.
  14. ^ Gasior, Geoff (October 23, 2003). "NVIDIA's GeForce FX 5700 Ultra GPU". Tech Report. Retrieved 2008-06-14.
  15. ^ Hagedoorn, Hilbert (October 23, 2003). "GeForce FX 5700 Ultra & 5950 Ultra Review". Guru3D. Archived from the original on 2007-08-20. Retrieved 2008-06-14.
  16. ^ Gasior, Geoff (December 15, 2003). "NVIDIA's GeForce FX 5900 XT GPU". Tech Report. Retrieved 2008-06-14.
  17. ^ Timofeeva, Anna (April 8, 2004). "Gigabyte GeForce PCX 5900 Video Card Review". Digital-Daily. Retrieved 2010-08-25.
  18. ^ Hagedoorn, Hilbert (March 9, 2004). "PoV GeForce FX 5500 Review". Digital-Daily. Retrieved 2010-08-25.
  19. ^ User forum complaints about v175.19 driver breaking RDP
  20. ^ AnandTech forum post regarding RDP issue