Nvidia


NVIDIA Corporation
Company type: Public (Nasdaq: NVDA)
Industry: Semiconductors (specialized)
Founded: 1993
Headquarters: 2701 San Tomas Expressway, Santa Clara, California, USA
Key people:
Jen-Hsun Huang, co-founder, president and CEO
Chris Malachowsky, co-founder, NVIDIA Fellow, senior vice president, engineering and operations
Jonah M. Alben, vice president, GPU engineering
Debora Shoquist, senior vice president, operations
Dr. Ranga Jayaraman, CIO
Products: Graphics processing units; motherboard chipsets
Revenue: US$4.1 billion (2007, increase)
Net income: US$797.6 million (2007, increase)
Employees: over 4,985 (as of June 2008)
Website: www.nvidia.com

Nvidia (Nasdaq: NVDA) is a multinational corporation that specializes in graphics-processor technologies for workstations, desktop computers, and mobile devices. Based in Santa Clara, California, the company has become a major supplier of integrated circuits (ICs) used in personal computer motherboard chipsets, graphics processing units (GPUs), and video game consoles.

The company's name combines an initial n (a letter used as a variable in mathematical statements) with the root of video, from the Latin videre, "to see", implying "the best visual experience"[citation needed] or perhaps "immeasurable display".[original research?] The name also suggests "envy" (Spanish envidia; Latin, Italian, and Romanian invidia), and Nvidia's GeForce 8 series used the slogan "Green with envy". In company technical documentation the name appears entirely in upper case ("NVIDIA").

Notable Nvidia product lines include the GeForce series for gaming and the Quadro series for graphics processing on workstations, as well as the nForce series of integrated motherboard chipsets.

Company history

Jen-Hsun Huang (the current CEO), Curtis Priem, and Chris Malachowsky co-founded the company in 1993 with venture-capital funding from Sequoia Capital.[2]

In 2000 Nvidia acquired the intellectual assets of its one-time rival 3dfx, one of the biggest graphics companies of the mid- to late-1990s.

On December 14, 2005, Nvidia acquired ULI Electronics, which at the time supplied third-party Southbridge parts for chipsets to ATI, Nvidia's competitor. In March 2006, Nvidia acquired Hybrid Graphics[3] and on January 5, 2007, it announced that it had completed the acquisition of PortalPlayer, Inc.[4]

In December 2006 Nvidia, along with its main rival in the graphics industry, AMD (which had acquired ATI), received subpoenas from the Justice Department regarding possible antitrust violations in the graphics-card industry.[5]

Forbes magazine named Nvidia its Company of the Year for 2007, citing its accomplishments during that period as well as over the previous five years.[6]

In February 2008 Nvidia acquired Ageia Technologies for an undisclosed sum. "The purchase reflects both companies['] shared goal of creating the most amazing and captivating game experiences", said Jen-Hsun Huang, president and CEO of Nvidia. "By combining the teams that created the world's most pervasive GPU and physics engine brands, we can now bring GeForce-accelerated PhysX to twelve million gamers around the world."[This quote needs a citation] (The press release[citation needed] mentioned neither the acquisition cost nor specific products.)

Products

Nvidia headquarters in Santa Clara
A graphics processing unit on an Nvidia GeForce 6600 GT

Nvidia's product portfolio includes graphics processors, wireless-communications processors, PC platform (motherboard core-logic) chipsets, and digital media player software. Computer users arguably know Nvidia best for its GeForce product line, which not only offers a complete line of discrete graphics chips for add-in-board (AIB) video cards but also provides a core technology in both the Microsoft Xbox game console and nForce motherboards.

In many respects Nvidia resembles its competitor ATI: both companies began with a focus on the PC market and later expanded into chips for non-PC applications. Nvidia does not sell graphics boards into the retail market, focusing instead on the development of GPU chips. As a fabless semiconductor company, Nvidia contracts its chip manufacturing to Taiwan Semiconductor Manufacturing Company, Ltd. (TSMC). Both ATI and Nvidia create "reference designs" (circuit-board schematics) and provide manufacturing samples to their board partners. Manufacturers of Nvidia cards include BFG, EVGA, Foxconn, PNY, and XFX; ASUS, Gigabyte Technology, and MSI manufacture both ATI and Nvidia cards.

In December 2004 Nvidia announced that it would assist Sony with the design of the graphics processor (RSX) in the PlayStation 3 game console. In March 2006 it emerged that Nvidia would deliver the RSX to Sony as an IP core and that Sony alone would be responsible for manufacturing it. Under the agreement, Nvidia would provide ongoing support to port the RSX to Sony's fabs of choice (Sony and Toshiba), as well as die shrinks to 65 nm. This was a departure from Nvidia's arrangement with Microsoft, under which Nvidia managed production and delivery of the Xbox GPU through its usual third-party foundry contracts. (Microsoft instead chose to license a design by ATI and make its own manufacturing arrangements for the Xbox 360's graphics hardware, as did Nintendo for the Wii console, successor to the ATI-based GameCube.)

On February 4, 2008, Nvidia announced plans to acquire physics-software producer Ageia, whose PhysX physics engine is used in hundreds of games shipping or in development for the PlayStation 3, Xbox 360, Wii, and gaming PCs.[7] The transaction was completed on February 13, 2008,[8] and efforts began to integrate PhysX into the GeForce 8800's CUDA system.[9][10]

On June 2, 2008, Nvidia officially announced its new Tegra product line.[11] These "computers on a chip" integrate the CPU (ARM), GPU, northbridge, southbridge, and main memory onto a single chip. Commentators[who?] expect Nvidia to target this product at the smartphone and mobile Internet device sector.

Graphics chipsets

Motherboard chipsets

Documentation and drivers

Nvidia does not publish documentation for its hardware, which prevents programmers from writing appropriate and effective open-source drivers for its products. Instead, Nvidia provides its own binary GeForce graphics drivers for X.Org along with a thin open-source library that interfaces between the Linux, FreeBSD, or Solaris kernels and the proprietary graphics software. Nvidia also supports an obfuscated open-source driver that offers only two-dimensional hardware acceleration and ships with the X.Org distribution. Nvidia's Linux support has promoted its adoption in the entertainment, scientific-visualization, defense, and simulation/training industries, traditionally dominated by SGI, Evans & Sutherland, and other relatively costly vendors.

Nvidia's proprietary drivers continue to generate controversy within the free-software communities. Some Linux and BSD users insist on using only open-source drivers and regard a binary-only driver as wholly inadequate, given that competing manufacturers such as Intel offer support and documentation for open-source developers and that others such as ATI at least release partial documentation.[12] Because of the closed nature of the drivers, Nvidia video cards do not deliver adequate features on several platforms and architectures, such as FreeBSD on x86-64 and the other BSD operating systems on any architecture. There is no support for three-dimensional graphics acceleration in Linux on the PowerPC, nor for Linux on the hypervisor-restricted PlayStation 3 console. While some users accept the Nvidia-supported drivers, many users of open-source software would prefer better out-of-the-box performance[13] if given the choice. However, the performance and functionality of the binary Nvidia drivers surpass those of open-source alternatives[citation needed] that follow VESA standards.

Nvidia drivers cause known issues[citation needed] on computers running Windows Vista. Forums on Nvidia's website contain various threads in which users discuss the drivers' failure-and-recovery errors without finding any solution.

X.Org Foundation and Freedesktop.org have started the Nouveau project, which aims to develop free software drivers for Nvidia graphics cards by reverse-engineering Nvidia's current proprietary drivers for Linux.

Market-share

According to a survey[14] conducted in the third quarter of 2007 by the market-research firm Jon Peddie Research, Nvidia held the top slot in the desktop graphics-device market with a 37.8% share. In the mobile space, however, it remained third with 22.8% of the market. Overall, Nvidia maintained its position as the second-largest supplier of PC graphics shipments (including both integrated and discrete GPUs) with a 33.9% share, its highest in many years, just behind Intel (38%).

According to the Steam hardware survey[15] conducted by the game developer Valve, Nvidia had 64.64% of the PC video-card market as of 1 December 2008, while ATI had 27.12%. These figures may be skewed by Valve's release of trial versions of The Orange Box to Nvidia graphics-card users, which linked to the survey; however, free copies of The Orange Box were also released to ATI card purchasers, notably those who bought the Radeon 2900 XT.

Market history

Before DirectX

An Nvidia RIVA 128 AGP video card

Nvidia released its first graphics card, the NV1, in 1995. Its design used quadratic surfaces and included an integrated playback-only sound card and ports for Sega Saturn gamepads. Because the Saturn also used forward-rendered quadratics, several Saturn games, such as Panzer Dragoon and Virtua Fighter Remix, were ported to the PC with the NV1. However, the NV1 struggled in a marketplace full of competing proprietary standards.

Market interest in the product ended when Microsoft announced the DirectX specifications, based upon polygons. NV1 development subsequently continued internally as the NV2 project, funded by several million dollars of investment from Sega, which hoped that an integrated sound-and-graphics chip would cut the manufacturing cost of its next console. However, Sega eventually realized the flaws of quadratic surfaces, and the NV2 was never fully developed.[citation needed]

A fresh start

After two failed products, Nvidia's CEO Jen-Hsun Huang realized that something had to change for the company to survive. He hired David Kirk, Ph.D., as chief scientist from software developer Crystal Dynamics, a company renowned for the visual quality of its titles. Kirk turned Nvidia around by combining the company's experience in 3D hardware with an intimate understanding of the practical implementation of rendering.

As part of the corporate transformation, Nvidia abandoned proprietary interfaces, sought to fully support DirectX, and dropped multimedia functionality in order to reduce manufacturing costs. It also adopted an internal six-month product cycle, so that the failure of any one product would not threaten the company's survival: a next-generation replacement part would always be available.

However, since the Sega NV2 contract remained secret and Nvidia had laid off employees, many industry observers believed that Nvidia had ceased active research and development. So when Nvidia announced the RIVA 128 in 1997, its specifications were hard to believe: performance superior to the market leader, 3dfx's Voodoo Graphics, and a full hardware triangle-setup engine. The RIVA 128 shipped in volume, and its combination of low cost and high performance made it a popular choice for OEMs.

Ascendancy: RIVA TNT

Having finally developed and shipped a market-leading integrated 2D/3D graphics chipset in volume, Nvidia set itself the goal of doubling the number of pixel pipelines in its chip to realize a substantial performance gain. The TwiN Texel (RIVA TNT) engine that Nvidia subsequently developed could either apply two textures to a single pixel or process two pixels per clock cycle: the former improved visual quality, the latter doubled the maximum fill rate.
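
As a rough illustration (not from the original article), fill rate scales with the core clock and the number of pixels processed per clock:

    fill rate = core clock × pixels per clock
    RIVA TNT at 90 MHz: 90 MHz × 2 pixels/clock = 180 Mpixels/s (two-pixel mode),
    or 90 Mpixels/s with two textures applied per pixel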

New features included a 24-bit Z-buffer with 8-bit stencil support, anisotropic filtering, and per-pixel MIP mapping. In certain respects, such as transistor count, the TNT had begun to rival Intel's Pentium processors in complexity. However, while the TNT offered an impressive range of quality integrated features, it failed to displace the market leader, 3dfx's Voodoo 2, because its actual clock speed ended up at only 90 MHz, about 35% less than expected.

Nvidia responded with a refresh part: a die shrink of the TNT architecture from 350 nm to 250 nm. A stock TNT2 now ran at 125 MHz, and the Ultra at 150 MHz. Though the Voodoo 3 beat Nvidia to market, 3dfx's offering proved disappointing: it was not much faster and lacked features that were becoming standard, such as 32-bit color and textures larger than 256 × 256 pixels.

The RIVA TNT2 marked a major turning point for Nvidia. It had finally delivered a product competitive with the fastest on the market, with a superior feature set and strong 2D functionality, all integrated onto a single die with strong yields that ramped to impressive clock speeds. Nvidia's six-month refresh cycle took the competition by surprise, giving it the initiative in rolling out new products.

Market leadership: GeForce

The autumn of 1999 saw the release of the GeForce 256 (NV10), most notable for bringing on-board hardware transform and lighting. Running at 120 MHz with four pixel pipelines, it implemented advanced video acceleration, motion compensation, and hardware sub-picture alpha blending. The GeForce outperformed existing products, including the ATI Rage 128, 3dfx Voodoo 3, Matrox G400 MAX, and RIVA TNT2, by a wide margin.

Due to the success of its products, Nvidia won the contract to develop the graphics hardware for Microsoft's Xbox game console, which earned it a $200 million advance. However, the project drew on the time of many of Nvidia's best engineers. In the short term this had little effect, and the GeForce 2 GTS shipped in the summer of 2000.

The GTS benefited from Nvidia's extensive manufacturing experience with highly integrated cores, which let the company optimize the core for clock speed. The volume of chips Nvidia produced also enabled bin splitting: picking out the highest-quality cores for its premium range. As a result, the GTS shipped at 200 MHz. Compared with the GeForce 256, its pixel fill rate nearly doubled, and its texel fill rate nearly quadrupled because multi-texturing was added to each pixel pipeline. New features included S3TC compression, FSAA, and improved MPEG-2 motion compensation.
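
These figures follow from the clocks and pipeline counts given above (a back-of-the-envelope check, assuming one texture unit per pipeline on the GeForce 256 and two per pipeline on the GTS):

    GeForce 256:    120 MHz × 4 pipelines = 480 Mpixels/s (and 480 Mtexels/s)
    GeForce 2 GTS:  200 MHz × 4 pipelines = 800 Mpixels/s
    GTS texel rate: 800 Mpixels/s × 2 textures = 1,600 Mtexels/s

The ratios (800/480 ≈ 1.7 and 1,600/480 ≈ 3.3) match the "nearly doubled" and "nearly quadrupled" claims.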

Shortly afterward[when?] Nvidia launched the GeForce 2 MX, intended for the budget and OEM markets. It had two fewer pixel pipelines and ran at 165 MHz, later 250 MHz. Offering strong performance at a mid-range price, the GeForce 2 MX became one of the most successful graphics chipsets. Nvidia also shipped a mobile derivative, the GeForce 2 Go, at the end of 2000.

Nvidia's success proved too much for 3dfx, which never recovered its past market share. The long-delayed Voodoo 5, successor to the Voodoo 3, compared unfavorably with the GeForce 2 in both price and performance and failed to generate the sales needed to keep the company afloat. With 3dfx on the verge of bankruptcy near the end of 2000, Nvidia purchased most of 3dfx's intellectual property (in dispute at the time)[citation needed], along with anti-aliasing expertise and about 100 engineers, but not the company itself, which filed for bankruptcy in 2002.

Nvidia then developed the GeForce 3, which pioneered DirectX 8 vertex and pixel shaders, and refined it with the GeForce 4 Ti line. After the GeForce 2 MX came the GeForce 4 MX. Nvidia announced the GeForce 4 Ti, MX, and Go in January 2002, one of the largest launches in its history. Cleverly, the chips in the Ti and Go series differed only in core and memory clock speeds; the MX series, derived from GeForce 2-level hardware, lacked the pixel and vertex shader functionality.

GeForce MX 4000 64 MB card, produced 2002–2003

Shortcomings of the FX series

At this point Nvidia's market position looked unassailable, and industry observers began to refer to the company as "the Intel of the graphics industry". However, its major remaining rival, ATI Technologies, stayed competitive thanks to its Radeon, which performed roughly on a par with the GeForce 2 GTS. Though ATI's answer to the GeForce 3, the Radeon 8500, came to market later and initially suffered from driver issues, it proved a strong competitor due to its lower price and greater potential. Nvidia countered with the GeForce 4 Ti line, though the Ti 4200's delayed rollout enabled the 8500 to carve out a niche. ATI then opted to work on its next-generation Radeon 9700 rather than on a direct competitor to the GeForce 4 Ti.

During the development of the next-generation GeForce FX chips, many of Nvidia's best engineers were focused on the Xbox contract, developing a motherboard solution including the API used in the SoundStorm platform. Nvidia was also contractually obliged to develop newer, more hack-resistant NV2A chips, which further shortchanged the FX project. The Xbox contract did not allow for falling manufacturing costs as process technology improved, and Microsoft sought to renegotiate its terms, withholding the DirectX 9 specifications as leverage. As a result, relations between Nvidia and Microsoft, previously very good, deteriorated. The parties later settled the dispute through arbitration; the terms were not released to the public. The dispute prompted Nvidia to pass on developing a graphics solution for the succeeding Xbox 360, with ATI taking that contract, while Nvidia chose to work on the Sony PlayStation 3 instead.

Due to the Xbox dispute, Nvidia was not consulted during the drawing up of the DirectX 9 specification, while ATI designed the Radeon 9700 to fit it. The specification limited rendering color support to 24-bit floating point[citation needed], and it emphasized shader performance, the expected main focus of DirectX 9. The shader compiler was built using the Radeon 9700 as the base card.[clarification needed]

In contrast, Nvidia's cards offered 16- and 32-bit floating-point modes, giving either lower visual quality than the competition or slow performance. The 32-bit support made them much more expensive to manufacture, requiring a higher transistor count. Shader performance often remained at half or less of the speed of ATI's competing products. Having made its reputation by designing easy-to-manufacture DirectX-compatible parts, Nvidia had misjudged Microsoft's next standard and paid a heavy price: as more and more games began to rely on DirectX 9 features, the poor shader performance of the GeForce FX series became ever more obvious. With the exception of the FX 5700 series (a late revision), the FX series underperformed equivalent ATI cards.

Nvidia became increasingly desperate to hide the shortcomings of the GeForce FX range. It released a notable "FX only" demo called Dawn, but a hacked wrapper enabled it to run on a Radeon 9700, where it ran faster despite a perceived translation overhead. Nvidia also began to include "optimizations" in its drivers to increase performance. While some users contended that the resulting gains in real-world gaming performance justified them, hardware review sites began running articles showing how Nvidia's drivers auto-detected benchmarks and produced artificially inflated scores that did not reflect real-world performance. Tips from ATI's driver development team often lay behind these articles.[citation needed] As Nvidia's drivers filled with hacks and "optimizations", their legendary stability and compatibility began to suffer. While Nvidia partially closed the performance gap with new instruction-reordering capabilities introduced in later drivers, shader performance remained weak and over-sensitive to hardware-specific code compilation. Nvidia worked with Microsoft to release an updated DirectX compiler that generated GeForce FX-specific optimized code.

Furthermore, GeForce FX devices ran hot, drawing as much as twice the power of equivalent ATI parts. The GeForce FX 5800 Ultra became notorious for its fan noise, acquiring the nicknames "dustbuster" and "leafblower"; Nvidia jokingly acknowledged these criticisms with a video in which its marketing team compared the cards to a Harley-Davidson motorcycle.[16] Although the quieter 5900 replaced the 5800 without fanfare, the FX chips still needed large, expensive fans, placing Nvidia's partners at a manufacturing-cost disadvantage compared with ATI's. As a result of Microsoft's actions and the FX series' weaknesses, Nvidia quite unexpectedly lost its market leadership to ATI.

GeForce 6 series and later

The old Nvidia logo, in use until the release of the GeForce 8 series

With the GeForce 6 series, Nvidia clearly moved beyond the DirectX 9 performance problems that had plagued the previous generation. The GeForce 6 series not only performed competitively in Direct3D shading but also supported DirectX Shader Model 3.0, while ATI's competing X800-series chips supported only the earlier Shader Model 2.0. This proved an insignificant advantage, mainly because games of the period did not employ Shader Model 3.0 extensions, but it demonstrated Nvidia's determination to design and follow through on the newest features and deliver them within a specific timeframe.

What became more apparent during this time was that the products of the two firms, ATI and Nvidia, offered equivalent performance. The two traded blows in specific titles and on specific criteria (resolution, image quality, anisotropic filtering and anti-aliasing), but the differences were becoming more abstract, and the reigning concern became price-to-performance. The mid-range offerings of the two firms demonstrated consumers' appetite for affordable, high-performance graphics cards, and it is now in this price segment that much of the firms' profitability is determined.

The GeForce 6 series was released at an opportune moment: Doom 3 had just been released, and ATI's Radeon 9700 struggled with its OpenGL performance. In 2004 the GeForce 6800 performed excellently, while the GeForce 6600 GT became as important to Nvidia as the GeForce 2 MX had been a few years earlier, letting users play Doom 3 at very high resolutions and graphical settings, something thought highly unlikely at its selling price. The GeForce 6 series also introduced SLI (similar in concept to what 3dfx had used on the Voodoo 2). The combination of SLI and the resulting performance gain returned Nvidia to market leadership.

Badge displayed on products certified by Nvidia to utilize SLI technology

The GeForce 7 series was a heavily beefed-up extension of the reliable 6 series. The industry's introduction of the PCI Express bus standard had allowed Nvidia to offer SLI (Scalable Link Interface), a solution that employs two similar cards to share the rendering workload. While such solutions do not double performance and require more electricity (two cards rather than one), they can make a huge difference at higher resolutions and settings and, more importantly, offer more upgrade flexibility. ATI responded with the X1000 series and its own dual-rendering solution, "CrossFire". Sony chose Nvidia to develop the "RSX" chip used in the PlayStation 3, a modified version of the 7800 GPU.

Nvidia released the 8-series chip towards the end of 2006, making the 8 series the first to support Microsoft's next-generation DirectX 10 specification. The 8-series GPUs also featured a unified shader architecture, which Nvidia leveraged to provide additional functionality for its graphics cards: better support for general-purpose computing on GPUs (GPGPU). A new product line of "compute-only" devices called Nvidia Tesla emerged from the G80 architecture, and Nvidia subsequently became the market leader in this new field by introducing the world's first C programming language API for GPGPU: CUDA.
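
To illustrate the CUDA programming model mentioned above, the sketch below is a minimal, hypothetical example (not taken from Nvidia documentation or this article): a C-style kernel that scales an array of floats, with one GPU thread per element. The kernel name and values are invented for illustration.

    // Minimal CUDA C sketch: scale an array on the GPU, one thread per element.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void scale(float *data, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
        if (i < n)                                      // guard the tail block
            data[i] *= factor;
    }

    int main() {
        const int n = 1024;
        float host[n];
        for (int i = 0; i < n; ++i) host[i] = (float)i;

        float *dev;                                     // device-side buffer
        cudaMalloc((void **)&dev, n * sizeof(float));
        cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

        scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);  // 256-thread blocks

        cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(dev);

        printf("host[10] = %g\n", host[10]);            // expect 20
        return 0;
    }

Compiled with Nvidia's nvcc compiler, each of the 1,024 threads doubles one element in parallel; this same data-parallel pattern underlies GPGPU uses such as the PhysX-on-CUDA work described earlier.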

Nvidia released two models of the high-end 8-series (8800) chip: the 8800 GTS (640 MB and 320 MB) and the 8800 GTX (768 MB). Later, Nvidia released the 8800 Ultra, essentially an 8800 GTX with a different cooler and higher clocks. All three cards derive from the 90 nm G80 core, with 681 million transistors. The GTS model had 96 stream processors and 20 ROPs; the GTX and Ultra had 128 stream processors and 24 ROPs.

In early 2007 Nvidia released the 8800 GTS 320 MB. This card resembles the 8800 GTS 640 but uses ten 32 MB memory chips instead of ten 64 MB chips (10 × 32 MB = 320 MB versus 10 × 64 MB = 640 MB).

In October 2007 Nvidia released the 8800 GT, based on the new 65 nm G92 GPU with 112 stream processors. It carried 512 MB of VRAM on a 256-bit bus and included several fixes and new features that the earlier 8800 cards lacked.

In December 2007 Nvidia released the 8800 GTS G92, in effect a larger 8800 GT with higher clocks and all 128 of the G92's stream processors unlocked. Both the 8800 GTS G92 and the 8800 GT fully support PCI Express 2.0.

In February 2008 Nvidia released the 9600-series chip, which supports Microsoft's DirectX 10 specification, in response to ATI's release of the Radeon HD 3800 series. In March 2008 Nvidia released the GeForce 9800 GX2, which, roughly put, packs two GeForce 8800 GTS G92s into a single card.

In June 2008 Nvidia released its new flagship GPUs, the GTX 280 and GTX 260. The cards used the same basic unified architecture deployed in the previous 8- and 9-series cards, but with substantially more power. Both cards are based on the GT200 GPU, which contains 1.4 billion transistors on a 65 nm process; according to TSMC, it has the largest die area of any chip the foundry had ever fabricated. The GTX 280 has 240 shaders (stream processors) and 1 GB of GDDR3 VRAM on a 512-bit bus; the GTX 260 has 192 shaders and 896 MB of GDDR3 VRAM on a 448-bit bus. The GTX 280 allegedly provides approximately 933 GFLOPS of floating-point throughput.[citation needed]
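
That headline figure is consistent with a simple peak-throughput calculation (an illustrative check, assuming the GTX 280's 1,296 MHz shader clock and three floating-point operations per shader per cycle, a multiply-add plus a multiply, as such peak figures were typically counted):

    240 shaders × 1.296 GHz × 3 FLOPs/cycle ≈ 933 GFLOPS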

In January 2009, Nvidia released a 55 nm die shrink of the GTX 280, called the GTX 285, and a dual-chip card based on two of the 55 nm-shrunk GPUs, called the GTX 295.

Defective mobile video adapters

In July 2008, Nvidia noted increased failure rates in certain mobile video adapters.[17] A writer for The Inquirer alleged that the problems potentially affect all G84 and G86 video adapters, mobile and desktop alike,[18] though Nvidia has denied this.[19][20] In response, Dell and HP released BIOS updates for all affected notebook computers that turn on the cooling fan earlier than before, in an effort to keep the defective video adapter at a lower temperature. Leigh Stark has suggested that this may lead to premature failure of the cooling fan.[21] It is also possible that this fix merely delays component failure until after the warranty expires.

In August 2008, rumors emerged that these issues also affected G92 and G94 mobile video adapters.[22] At the end of August 2008, Nvidia reportedly issued a product-change notification announcing plans to update the bump material of GeForce 8 and 9 series chips "to increase supply and enhance package robustness".[23] In response to the possibility of defects in some Nvidia mobile video adapters, some notebook manufacturers have allegedly turned to ATI to provide graphics options on their new Montevina notebook computers.[24]

On 18 August 2008, according to the direct2dell.com blog, Dell began to offer a 12-month limited warranty "enhancement" specific to this issue on affected notebook computers worldwide.[25]

On 8 September 2008, Nvidia reached a deal with large OEMs, such as Dell and HP, under which they would receive $200 per affected notebook.[26]

On 9 October 2008, Apple Inc. announced on a support page that some MacBook Pro notebook computers had exhibited faulty Nvidia GeForce 8600M GT graphics adapters.[27] Affected computers were manufactured between approximately May 2007 and September 2008. Apple stated that it would repair affected MacBook Pros free of charge within two years of the original purchase date, and it offered refunds to customers who had already paid for repairs related to this issue.

On 9 December 2008, The Inquirer conducted another series of tests to check whether the new MacBook Pro notebook computers used eutectic or high-lead solder.[28] It found that the 9400M chipset used eutectic solder, while the 9600M used high-lead solder, which it associated with the "old process" responsible for the failures.

Video-card manufacturers

Nvidia does not manufacture video cards, only the GPU chips (the official Nvidia website shows reference models). Nvidia specifies the speed and configuration of the chips and the video memory, and usually the design and layout, which it expects third parties to follow. Each OEM assembles the cards under one of the following brand names:

Current partners

Previous partners

See also


References

  1. ^ "Company Profile for NVIDIA Corporation (NVDA)". Retrieved 2008-09-30.
  2. ^ Forbes.com - Magazine Article
  3. ^ The Register Hardware news: NVIDIA acquires Hybrid Graphics
  4. ^ Press Release: Nvidia acquires PortalPlayer, dated January 5, 2007.
  5. ^ "Justice Dept. subpoenas AMD, NVIDIA". New York Times. 2006-12-01.
  6. ^ Brian Caulfield (2008-01-07). "Shoot to Kill". Forbes.com. Retrieved 2007-12-26.
  7. ^ "NVIDIA to Acquire AGEIA". DailyTech.com. 2008-02-04.
  8. ^ NVIDIA Completes Acquisition of AGEIA Technologies: Financial News - Yahoo! Finance
  9. ^ [Phoronix] PhysX For CUDA, Linux Support A Given?
  10. ^ GeForce 8 graphics processors to gain PhysX support - The Tech Report
  11. ^ "NVIDIA Rolls out 'Tegra' Processors". TechTree. http://www.techtree.com/India/News/Nvidia_Rolls_out_Tegra_Processors/551-89833-581.html
  12. ^ "X.org, distributors, and proprietary modules". Linux Weekly News. Eklektix. 2006-08-14. Retrieved 2008-11-03. {{cite web}}: Cite has empty unknown parameter: |coauthors= (help)
  13. ^ LinuxQuestions.org, 20 September 2007.
  14. ^ "NVIDIA Continues to Gain Graphics Market Share, AMD Keeps on Downfall – JPR". X-bit Labs. 2007-10-29. {{cite web}}: Unknown parameter |accessmonthday= ignored (help); Unknown parameter |accessyear= ignored (|access-date= suggested) (help)
  15. ^ Valve - Survey Summary Data
  16. ^ YouTube - Nvidia Hair Dryer
  17. ^ NVIDIA Corporation (2008-07-02). "NVIDIA Provides Second Quarter Fiscal 2009 Business Update". Retrieved 2008-10-05. Certain notebook configurations with GPUs and MCPs manufactured with a certain die/packaging material set are failing in the field at higher than normal rates. To date, abnormal failure rates with systems other than certain notebook systems have not been seen.
  18. ^ Demerjian, Charlie (2008-07-09). "All Nvidia G84 and G86s are bad". The Inquirer. Retrieved 2008-10-05. The short story is that all the G84 and G86 parts are bad. Period. No exceptions. All of them, mobile and desktop, use the exact same ASIC, so expect them to go south in inordinate numbers as well.
  19. ^ Hruska, Joel (2008-07-16). "NVIDIA denies rumors of faulty chips, mass GPU failures". Ars Technica. Retrieved 2008-10-05. This is a serious charge to level at any company, and we contacted NVIDIA for additional information. The company's response first affirms its intent to stand behind its customers and repair any and all notebooks that experience field failures. It then states: 1) The issue is limited to a few notebook chips only; we have not seen and don't expect to see this issue on any NVIDIA-based desktop systems. 2) Only a very small percentage of the notebook chips that have shipped are potentially affected, and the problem depends on a combination of environmental conditions, configuration, and usage model. 3) We continue to work closely with our partners and have taken the necessary steps to ensure that all NVIDIA chips currently in production do not exhibit the problem.
  20. ^ Kingsley-Hughes, Adrian (2008-07-17). "NVIDIA: Nothing to see here, move along". ZDNet. Retrieved 2008-10-05. ...So just how widespread are NVIDIA's GPU failure problem. According to NVIDIA, it's nothing to worry about....
  21. ^ Stark, Leigh (2008-08-18). "NVIDIA DISASTER: thousands of GPUs faulty". APC. ninemsn Pty Ltd. Retrieved 2008-08-18. ... updates that force your computer to cool itself down not only kill your battery life further but also leave you running the risk that now with the extra needed fan cycles, that cooling system built into your laptop might die sooner than expected.
  22. ^ Demerjian, Charlie (2008-08-12). "Nvidia G92s and G94 reportedly failing: Desktop boards this time". The Inquirer. Incisive Media Investments Ltd. Retrieved 2008-08-18. A little digging revealed what this, and more, is all about, and it's far uglier than just the 'notebook' version. It seems that four board partners are seeing G92 and G94 chips going bad in the field at high rates... From the look of it, all G8x variants other than the G80, and all G9x variants are defective
  23. ^ Shilov, Anton (2008-08-29). "Nvidia Updates Bump Material of GeForce 8800, 9800 Chips". X-Bit Labs. Retrieved 2008-09-29. Nvidia Corp. has reportedly issued yet another product change notification (PCN) document, informing its customers that it plans to change bump material on its code-named G92 chips, which power a great amount of GeForce graphics cards. Potentially, this may mean that those graphics processing units are also subject to failures similar to [sic] already confirmed by Nvidia.
  24. ^ O'Brien, Kevin (2008-08-12). "More Defective NVIDIA Graphics Chipsets". NotebookReview.com. TechTarget. Retrieved 2008-08-18. Expect to see more BIOS updates released to increase cooling fan cycles, and more ATI graphics options from notebook manufacturers. We are already seeing a spike in high-end ATI options on almost all new Montevina notebooks, with fewer NVIDIA options day by day.
  25. ^ Menchaca, Lionel (2008-08-18). "NVIDIA GPU Update: Dell to Offer Limited Warranty Enhancement to All Affected Customers Worldwide". Direct2Dell Blog. Retrieved 2008-08-18. ...
  26. ^ Abazovic, Fuad (2008-09-08). "Nvidia gives OEMs $200 per bad mobile GPU". Fudzilla. Fudzilla. Retrieved 2008-11-03. Nvidia made a deal with big OEMs, such as Dell and HP, that they will get $200 per affected notebook and we are hearing that OEMs are quite happy about it. It turns out that this is more than generous and that this covers the cost of a new chip, the repair cost and all the other cost related to this issue.
  27. ^ "MacBook Pro: Distorted video or no video issues". Apple Inc. 2008-10-10. Retrieved 2008-11-03. Apple has determined that some MacBook Pro computers with the NVIDIA GeForce 8600M GT graphics processor may be affected. {{cite web}}: Cite has empty unknown parameter: |coauthors= (help)
  28. ^ "INQUIRER confirms Apple Macbook Pros have Nvidia bad bump material- The Inquirer". The Inquirer. 2008-12-09. Retrieved 2008-12-10. The Inquirer reviews the new MacBook Pro 15" notebook's GPU solder. {{cite web}}: Cite has empty unknown parameter: |coauthors= (help)
  29. ^ http://www.tomshardware.com/news/Nvidia-Foxconn-XFX-Gainward,6455.html
  30. ^ http://www.tomshardware.com/news/Nvidia-Foxconn-XFX-Gainward,6455.html
  31. ^ http://www.tomshardware.com/news/Nvidia-Foxconn-XFX-Gainward,6455.html
