
Nvidia

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Superchad (talk | contribs) at 04:28, 5 April 2008 (GeForce 6 series and later: fixed geforce 8-series chip release date from 2007 to 2006, and fixed the model name from 8500 to 8800). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Nvidia Corporation
Company type: Public (Nasdaq: NVDA)
Industry: Semiconductors (specialized)
Founded: 1993
Headquarters: 2701 San Tomas Expressway, Santa Clara, California, USA, with additional locations in Europe and Asia
Key people:
Jen-Hsun Huang, Co-founder, President and CEO
Chris A. Malachowsky, Co-founder, Nvidia Fellow, Senior Vice President, Engineering and Operations
Jonah M. Alben, Vice President, GPU Engineering
Debora Shoquist, Senior Vice President, Operations
Products: Graphics processing units, motherboard chipsets
Revenue: US$3.77 billion (2007)
Net income: US$704.2 million (2007)
Number of employees: over 4,083 (as of 2007)
Website: www.nvidia.com

Nvidia Corporation (Nasdaq: NVDA) is an American multinational company that specializes in the manufacture of graphics-processor technologies for workstations, desktop computers, and handheld devices. Based in Santa Clara, California, the company has become a major supplier of integrated circuits (ICs) used in personal-computer motherboard chipsets, graphics processing units (GPUs), and game consoles. Notable product lines include the GeForce series for gaming, the Quadro series for graphics processing on professional workstations, and the nForce series of integrated motherboard chipsets.

Company history

The company's name, Nvidia, combines an initial n (a letter often used for mathematical variables) with the root of video (from Latin videre, "to see"), implying "the best visual experience".[citation needed] The company name appears entirely in upper case ("NVIDIA") in company technical documentation, although marketing materials and other collateral show less brand consistency.

The name Nvidia also suggests "envy" (Spanish envidia; Latin, Italian, and Romanian invidia), and the GeForce 8 series uses the slogan "Green with envy".

Jen-Hsun Huang (the present CEO), Curtis Priem, and Chris Malachowsky co-founded the company in 1993. The company received early venture capital funding from Sequoia Capital.[1]

In 2000, Nvidia acquired the intellectual assets of its one-time rival 3dfx, one of the biggest graphics companies of the mid-to-late 1990s.

On December 14, 2005, Nvidia acquired ULI Electronics, which at the time supplied third-party southbridge parts for the chipsets of ATI, Nvidia's competitor. In March 2006, Nvidia acquired Hybrid Graphics,[2] and on January 5, 2007, it announced that it had completed the acquisition of PortalPlayer, Inc.[3]

In December 2006, Nvidia, along with its main rival in the graphics industry, AMD (which had acquired ATI), received subpoenas from the Justice Department regarding possible antitrust violations in the graphics-card industry.[4]

Forbes magazine named Nvidia its Company of the Year for 2007, citing its accomplishments during that period as well as over the previous five years.[5]

In February 2008, Nvidia acquired Ageia Technologies for an undisclosed sum. "The purchase reflects both companies['] shared goal of creating the most amazing and captivating game experiences," said Jen-Hsun Huang, president and CEO of Nvidia. "By combining the teams that created the world's most pervasive GPU and physics engine brands, we can now bring GeForce-accelerated PhysX to hundreds of millions of gamers around the world."[citation needed] The press release mentioned neither the acquisition cost nor specific products.

The same month, Nvidia announced NVISION, an event devoted to visual computing, starting August 25 in San Jose. The event will also feature a Guinness World Records attempt at the world's largest LAN party, as well as the finals of the Electronic Sports World Cup.[6]

Market history

Before DirectX

Graphic processor on Nvidia GeForce 6600GT
Nvidia Riva 128 video card

Nvidia released its first graphics card, the NV1, in 1995. Its design used quadratic surfaces and included an integrated playback-only sound card and ports for Sega Saturn gamepads. Because the Saturn also used forward-rendered quadratics, programmers ported several Saturn games, such as Panzer Dragoon and Virtua Fighter Remix, to play on a PC with the NV1. However, the NV1 struggled in a marketplace full of competing proprietary standards.

Market interest in the product ended when Microsoft announced the DirectX specifications, based upon polygons. Subsequently, NV1 development continued internally as the NV2 project, funded by several million dollars of investment from Sega, which hoped that an integrated sound-and-graphics chip would cut the manufacturing cost of its next console. However, Sega eventually realized the flaws of implementing quadratic surfaces, and the NV2 was never fully developed.[citation needed]

A fresh start

Nvidia's CEO Jen-Hsun Huang realized at this point that, after two failed products, something had to change for the company to survive. He hired David Kirk, Ph.D., as Chief Scientist from software developer Crystal Dynamics, a company renowned for the visual quality of its titles. Kirk turned Nvidia around by combining the company's experience in 3D hardware with an intimate understanding of practical rendering implementations.

As part of the corporate transformation, Nvidia abandoned proprietary interfaces, sought to fully support DirectX, and dropped multimedia functionality in order to reduce manufacturing costs. It also adopted the goal of an internal six-month product cycle: the failure of any one product would no longer threaten the company's survival, since a next-generation replacement part would always be close at hand.

However, since the Sega NV2 contract remained secret, and since Nvidia had laid off employees, many industry observers believed that Nvidia had ceased active research and development. So when Nvidia first announced the RIVA 128 in 1997, the specifications were hard to believe: performance superior to the market-leading 3dfx Voodoo Graphics, and a full hardware triangle-setup engine. The RIVA 128 shipped in volume, and the combination of its low cost and high performance made it a popular choice for OEMs.

Ascendancy: RIVA TNT

Having finally developed and shipped a market-leading integrated graphics chipset in volume, Nvidia set the internal goal of doubling the number of pixel pipelines in its chip in order to realize a substantial performance gain. The TwiN Texel (RIVA TNT) engine it subsequently developed allowed either two textures to be applied to a single pixel or two pixels to be processed per clock cycle; the former improved visual quality, while the latter doubled the maximum fill rate.

New features included a 24-bit Z-buffer with 8-bit stencil support, anisotropic filtering, and per-pixel MIP mapping. In certain respects, such as transistor count, the TNT had begun to rival Intel's Pentium processors in complexity. However, while the TNT offered an astonishing range of quality integrated features, it failed to displace the market leader, 3dfx's Voodoo 2, because its actual clock speed ended up at only 90 MHz, about 35% less than expected.

Nvidia responded with a refresh part: a die shrink of the TNT architecture from 350 nm to 250 nm. A stock TNT2 now ran at 125 MHz, a TNT2 Ultra at 150 MHz. Though the Voodoo 3 beat Nvidia to market, 3dfx's offering proved disappointing: it was not much faster and lacked features that were becoming standard, such as 32-bit color and textures larger than 256 × 256 pixels.

The RIVA TNT2 marked a major turning point for Nvidia. It had finally delivered a product competitive with the fastest on the market, with a superior feature set and strong 2D functionality, all integrated onto a single die with strong yields that ramped to impressive clock speeds. Nvidia's six-month refresh cycle took the competition by surprise, giving the company the initiative in rolling out new products.

Market leadership: GeForce

The autumn of 1999 saw the release of the GeForce 256 (NV10), most notable for bringing on-board transformation and lighting. It ran at 120 MHz; it implemented advanced video acceleration, motion compensation, and hardware sub-picture alpha blending; and it had four pixel pipelines. The GeForce outperformed existing products, such as the ATI Rage 128, 3dfx Voodoo 3, Matrox G400 MAX, and RIVA TNT2, by a wide margin.

Due to the success of its products, Nvidia won the contract to develop the graphics hardware for Microsoft's Xbox game console, which earned it a large $200 million advance. However, the project drew on the time of many of Nvidia's best engineers. In the short term this was of no importance, and the GeForce 2 GTS shipped in the summer of 2000.

The GTS benefited from the fact that Nvidia had by this time acquired extensive manufacturing experience with its highly integrated cores, and as a result it succeeded in optimizing the core for clock speed. The volume of chips Nvidia was producing also enabled it to bin parts, picking out the highest-quality cores for its premium range. As a result, the GTS shipped at 200 MHz. Its pixel fill rate was nearly double that of the GeForce 256, and its texel fill rate nearly quadrupled because multi-texturing was added to each pixel pipeline. New features included S3TC compression, FSAA, and improved MPEG-2 motion compensation.
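The "nearly doubled" and "nearly quadrupled" fill-rate figures follow directly from the pipeline arithmetic. A minimal sketch, assuming the commonly cited specifications (four pixel pipelines on both chips, one texture unit per pipeline at 120 MHz on the GeForce 256, two texture units per pipeline at 200 MHz on the GTS):

```python
# Rough fill-rate arithmetic for the GeForce 256 vs. GeForce 2 GTS.
# The pipeline counts, texture units, and clocks below are commonly
# cited figures, used here only for illustration.

def fill_rates(pipelines, texture_units_per_pipe, clock_mhz):
    """Return (pixel fill rate, texel fill rate) in mega-units per second."""
    pixel = pipelines * clock_mhz
    texel = pipelines * texture_units_per_pipe * clock_mhz
    return pixel, texel

geforce256 = fill_rates(pipelines=4, texture_units_per_pipe=1, clock_mhz=120)
gts = fill_rates(pipelines=4, texture_units_per_pipe=2, clock_mhz=200)

print(geforce256)  # (480, 480)
print(gts)         # (800, 1600)
# Pixel fill rate: 800 / 480 ≈ 1.7x ("nearly doubled").
# Texel fill rate: 1600 / 480 ≈ 3.3x ("nearly quadrupled"),
# thanks to the second texture unit on each pipeline.
```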

Shortly afterwards, Nvidia launched the GeForce 2 MX, intended for the budget and OEM market. It had two fewer pixel pipelines and ran at 165 MHz and, later, 250 MHz. Offering strong performance at a midrange price, the GeForce 2 MX became one of the most successful graphics chipsets of all time. A mobile derivative, the GeForce 2 Go, also shipped at the end of 2000.

Nvidia's success proved too much for 3dfx to recover its past market share. The long-delayed Voodoo 5, the successor to the Voodoo 3, did not compare favorably with the GeForce 2 in either price or performance, and failed to generate the sales needed to keep the company afloat. With 3dfx on the verge of bankruptcy near the end of 2000, Nvidia purchased most of its intellectual property (which was in dispute at the time),[7] along with its anti-aliasing expertise and about 100 engineers, but not the company itself, which filed for bankruptcy in 2002.

Nvidia then developed the GeForce 3, which pioneered DirectX 8 vertex and pixel shaders, and refined it with the GeForce 4 Ti line; the GeForce 2 was succeeded by the GeForce 4 MX. The GeForce 4 Ti, MX, and Go lines were all announced in January 2002, one of the largest launches in Nvidia's history. Cleverly, the chips in the Ti and Go series differed only in chip and memory clock speeds. (The MX series lacked pixel- and vertex-shader functionality; it derived from GeForce 2-level hardware.)

Shortcomings of the FX series

At this point Nvidia's market position looked unassailable, and industry observers began to refer to the company as the Intel of the graphics industry. However, its major remaining rival, ATI Technologies, stayed competitive with the Radeon, which was roughly on par with the GeForce 2 GTS. Though ATI's answer to the GeForce 3, the Radeon 8500, arrived later and was initially plagued by driver issues, it proved a strong competitor thanks to its lower price and greater potential. Nvidia countered with the GeForce 4 Ti line, though the Ti 4200's delayed rollout enabled the 8500 to carve out a niche. ATI then opted to work on its next-generation Radeon 9700 rather than on a direct competitor to the GeForce 4 Ti.

During the development of the next-generation GeForce FX chips, many of Nvidia's best engineers were focused on the Xbox contract, developing a motherboard solution that included the API used in the SoundStorm platform. Nvidia also had a contractual obligation to develop newer, more hack-resistant NV2A chips, and this requirement further shortchanged the FX project. The Xbox contract did not allow for falling manufacturing costs as process technology improved, and Microsoft sought to renegotiate its terms, withholding the DirectX 9 specifications as leverage. As a result, relations between Nvidia and Microsoft, which had previously been very good, deteriorated. The two parties later settled the dispute through arbitration, and the terms were not released to the public. The dispute prompted Nvidia to pass on developing a graphics solution for the succeeding Xbox 360; ATI took on that contract, while Nvidia chose to work on Sony's PlayStation 3 instead.

Due to the Xbox dispute, Nvidia was not consulted during the drawing up of the DirectX 9 specification, while ATI designed the Radeon 9700 to fit it. Rendering color support was limited to 24-bit floating point, and shader performance had been emphasized throughout development, since shaders were to be the main focus of DirectX 9. The shader compiler was also built using the Radeon 9700 as the base card.

In contrast, Nvidia's cards offered 16- and 32-bit floating-point modes, yielding either lower visual quality than the competition or slow performance. The 32-bit support also made the chips much more expensive to manufacture, requiring a higher transistor count. Shader performance often remained at half or less of the speed of ATI's competing products. Having made its reputation by providing easy-to-manufacture DirectX-compatible parts, Nvidia had misjudged Microsoft's next standard and paid a heavy price: as more and more games began to rely on DirectX 9 features, the poor shader performance of the GeForce FX series became ever more obvious. With the exception of the FX 5700 series (a late revision), the FX series lacked performance compared to equivalent ATI cards.
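The quality cost of dropping to 16-bit floating point can be illustrated with a quick round-trip experiment. This sketch uses Python's half-precision ('e') and single-precision ('f') struct formats purely as stand-ins for the shader precisions of the era; the sample value is arbitrary:

```python
import struct

def roundtrip(fmt, value):
    """Store a float at the given precision and read it back."""
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

color = 0.123456789             # an arbitrary shader intermediate value
half = roundtrip('e', color)    # 16-bit float: ~3 decimal digits of significand
single = roundtrip('f', color)  # 32-bit float: ~7 decimal digits

# FP16 loses orders of magnitude more precision than FP32:
print(abs(half - color) > abs(single - color))  # True
```

Accumulated over the many operations of a long shader, that per-step error is what showed up as visible banding and artifacts in FP16 mode, while full FP32 avoided it at the cost of speed.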

Nvidia grew ever more desperate to hide the shortcomings of the GeForce FX range. It released a notable "FX only" demo called Dawn, but the wrapper was hacked to run on a Radeon 9700, where it ran faster despite a perceived translation overhead. Nvidia also began to include "optimizations" in its drivers to increase performance. While some users contended that the increased real-world gaming performance was valid, hardware review sites ran articles showing how Nvidia's drivers auto-detected benchmarks and produced artificially inflated scores that did not relate to real-world performance; often it was tips from ATI's driver development team that lay behind these articles. As Nvidia's drivers filled with hacks and "optimizations," their legendary stability and compatibility also began to suffer. While Nvidia partially closed the gap with instruction-reordering capabilities introduced in later drivers, shader performance remained weak and over-sensitive to hardware-specific code compilation. Nvidia also worked with Microsoft to release an updated DirectX compiler that generated GeForce FX-specific optimized code.

Furthermore, GeForce FX devices ran hot, drawing as much as twice the power of equivalent parts from ATI. The GeForce FX 5800 Ultra became notorious for its fan noise and acquired the nicknames "dustbuster" and "leafblower"; Nvidia jokingly acknowledged these criticisms with a video in which its marketing team compared the card to a Harley-Davidson.[8] Although the card was quietly withdrawn and replaced with the quieter 5900, the FX chips still needed large and expensive coolers, placing Nvidia's board partners at a manufacturing-cost disadvantage compared to ATI's. As a result of Microsoft's actions and the FX series' resulting weaknesses, Nvidia quite unexpectedly lost its market leadership to ATI.

GeForce 6 series and later

The old Nvidia logo, in use until the release of the GeForce 7 series.

With the GeForce 6 series, Nvidia had clearly moved beyond the DirectX 9 performance problems that plagued the previous generation. The GeForce 6 series not only performed competitively where Direct3D shaders were concerned, but also supported DirectX Shader Model 3.0, while ATI's competing X800 series chips supported only the previous 2.0 specification. This proved an insignificant advantage, mainly because games of the period did not employ Shader Model 3.0 extensions; however, it demonstrated Nvidia's willingness to design and deliver the newest features on a definite schedule.

What became more apparent during this time was that the two firms' products offered equivalent performance. ATI and Nvidia traded blows in specific titles and on specific criteria (resolution, image quality, anisotropic filtering and anti-aliasing), but the differences were becoming more abstract, and the reigning concern became price-to-performance. The firms' mid-range offerings demonstrated consumers' appetite for affordable, high-performance graphics cards, and it is now in this price segment that much of their profitability is determined.

The GeForce 6 series was also released in a very interesting period: Doom 3 had just been released, and ATI's Radeon 9700 struggled with its OpenGL performance. In 2004, the GeForce 6800 performed excellently, while the GeForce 6600 GT became as important to Nvidia as the GeForce 2 MX had been a few years earlier: it let users play Doom 3 at very high resolutions and graphical settings, which was thought highly unlikely given its selling price. The GeForce 6 series also introduced SLI (similar to what 3dfx had used on the Voodoo 2); the combination of SLI and the resulting performance gains returned Nvidia to market leadership.

The GeForce 7 series represented a heavily beefed-up extension of the reliable 6 series. The industry's introduction of the PCI Express bus standard allowed Nvidia to release SLI, a solution that employs two similar cards to share the rendering workload. While such solutions do not double performance and require more electricity (two cards instead of one), they can make a huge difference as higher resolutions and settings are enabled and, more importantly, offer more upgrade flexibility. ATI responded with the X1000 series and its own dual-rendering solution, "Crossfire". Sony chose Nvidia to develop the "RSX" chip used in the PlayStation 3, a modified version of the 7800 GPU.

Nvidia released the 8-series chip towards the end of 2006, making the 8800 the first to support Microsoft's next-generation DirectX 10 specification.

Nvidia also released the 9600-series chip, which likewise supports Microsoft's DirectX 10 specification, in February 2008 as a response to ATI's release of the Radeon HD 3800 series.

Current market share

According to a survey[9] conducted by the market-watch firm Jon Peddie Research, in the third quarter of 2007 Nvidia occupied the top slot in the desktop graphics-device market with a 37.9% share. In the mobile space, however, it remained third with 22.8% of the market. Overall, Nvidia maintained its position as the second-largest supplier of PC graphics shipments, counting both integrated and discrete GPUs, with a 33.9% market share, its highest in many years, just behind Intel (38%).

According to the Steam hardware survey[1] conducted by Valve, Nvidia held 62.06% of the PC video-card market as of March 11, 2008, while ATI held 30.81%. These figures could be skewed by Valve's release of trial versions of The Orange Box to Nvidia graphics-card users, which linked to the survey.

Products

Nvidia's product portfolio includes graphics processors, wireless communications processors, PC platform (motherboard core-logic) chipsets, and digital media player software. The Mac/PC user community arguably knows Nvidia best for its GeForce product line, which not only offers a complete line of "discrete" graphics chips found in add-in-board (AIB) video cards, but also provides a core technology in both the Microsoft Xbox game console and nForce motherboards.

In many respects, Nvidia resembles its competitor ATI: both companies began with a focus on the PC market and later expanded into chips for non-PC applications. Nvidia does not sell graphics boards into the retail market, focusing instead on the development of GPU chips; as a fabless semiconductor company, it has its chips manufactured by the Taiwanese foundry TSMC. Both ATI and Nvidia create "reference designs" (board schematics) and provide manufacturing samples to their board partners. BFG, EVGA, PNY, and XFX are some prominent manufacturers of Nvidia cards, while ASUS and MSI manufacture both ATI and Nvidia cards.

December 2004 saw the announcement that Nvidia would assist Sony with the design of the graphics processor (RSX) in the PlayStation 3 game console. As of March 2006, Nvidia was to deliver the RSX to Sony as an IP core, with Sony alone responsible for manufacturing it. Under the agreement, Nvidia will provide ongoing support to port the RSX to Sony's fabs of choice (Sony and Toshiba), as well as die shrinks to 65 nm. This is a departure from Nvidia's business arrangement with Microsoft, in which Nvidia managed production and delivery of the Xbox GPU through its usual third-party foundry contracts. (Microsoft has since chosen to license a design by ATI and make its own manufacturing arrangements for the Xbox 360's graphics hardware, as has Nintendo for the Wii, the successor to the ATI-based GameCube.)

On February 4, 2008, Nvidia announced plans to acquire the physics-software producer Ageia, whose PhysX physics engine can be found in hundreds of games shipping or in development for the PlayStation 3, Xbox 360, Wii, and gaming PCs.[10] The transaction was completed on February 13, 2008,[11] and efforts began to integrate PhysX into the GeForce 8800's CUDA system.[12][13]

Graphics chipsets

Personal-computer platforms and chipsets

Documentation and drivers

Nvidia does not publish documentation for its hardware, meaning that programmers cannot write appropriate and effective open-source drivers for its products. Instead, Nvidia provides its own binary GeForce graphics drivers for X.Org along with a thin open-source library that interfaces between the Linux, FreeBSD, or Solaris kernel and the proprietary graphics software. Nvidia also supports an obfuscated open-source driver that provides only two-dimensional hardware acceleration and ships with the X.Org distribution. Nvidia's Linux support has promoted its adoption in the entertainment, scientific-visualization, defense, and simulation/training industries, traditionally dominated by SGI, Evans & Sutherland, and other relatively costly vendors.

Because of their proprietary nature, Nvidia's drivers continue to generate controversy within the free-software communities. Many Linux and BSD users insist on using only open-source drivers, and regard Nvidia's insistence on providing nothing more than a binary-only driver as wholly inadequate when competing manufacturers such as ATI and Intel offer excellent support and documentation for open-source developers.[14] Because of the closed nature of the drivers, Nvidia video cards do not deliver adequate features on several platforms and architectures, such as FreeBSD on x86-64 and the other BSD operating systems on any architecture. Support for three-dimensional graphics acceleration under Linux on PowerPC is also absent, as is support for Linux on the hypervisor-restricted PlayStation 3 console. While some users accept the Nvidia-supported drivers, many users of open-source software would prefer a better out-of-the-box experience[15] if given the choice.

The X.Org Foundation and Freedesktop.org started the Nouveau project, which aims to develop free-software drivers for Nvidia graphics cards by reverse-engineering Nvidia's current proprietary drivers for Linux.

Video-card manufacturers

Nvidia does not manufacture video cards, only the GPU chips. It does, however, specify the speed and configuration of both the chips and the video memory, which third parties are expected to follow. The cards are assembled by OEMs under one of the following brand names:

See also

References

  1. ^ "Sequoia Capital funds Nvidia".
  2. ^ The Register Hardware news: Nvidia acquires Hybrid Graphics
  3. ^ Press release: Nvidia acquires PortalPlayer, dated January 5, 2007.
  4. ^ "Justice Dept. subpoenas AMD, Nvidia". New York Times. 2006-12-01.
  5. ^ Brian Caulfield (2008-01-07). "Shoot to Kill". Forbes.com. Retrieved 2007-12-26.
  6. ^ NVISION 08
  7. ^ http://www.google.ca/search?hl=en&q=Nvidia+3dfx+lawsuit&meta=
  8. ^ YouTube - Nvidia Hair Dryer
  9. ^ "Nvidia Continues to Gain Graphics Market Share, AMD Keeps on Downfall – JPR". X-bit Labs. 2007-10-29.
  10. ^ "NVIDIA to Acquire AGEIA". DailyTech.com. 2008-02-04.
  11. ^ NVIDIA Completes Acquisition of AGEIA Technologies: Financial News - Yahoo! Finance
  12. ^ [Phoronix] PhysX For CUDA, Linux Support A Given?
  13. ^ GeForce 8 graphics processors to gain PhysX support - The Tech Report
  14. ^ Linux Weekly News, 14 August 2006: X.org, distributors, and proprietary modules
  15. ^ LinuxQuestions.org, 20 September 2007


37°22′14.62″N 121°57′49.46″W