Matrox Parhelia

From Wikipedia, the free encyclopedia

The Matrox Parhelia-512 is a 512-bit graphics processing unit (GPU) launched in 2002 with full support for DirectX 8.1 and incorporating several DirectX 9.0 features. It was best known for its ability to drive three monitors ("Surround Gaming") and its Coral Reef technical demo.

Background

The Parhelia was Matrox's attempt to return to the high-end graphics market after a long absence; it was the company's first significant new design since the G200 and G400 lines had become uncompetitive. Matrox's other post-G400 products, the G450 and G550, were cost-reduced revisions of G400 technology and could not compete with ATI's Radeon or NVIDIA's GeForce lines in 3D performance.

Description

Features

The Parhelia-512 was the first graphics card equipped with a 256-bit memory bus, giving it a substantial memory-bandwidth advantage over other cards of the time. The Parhelia also featured glyph acceleration, performing the anti-aliasing of text in hardware.
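
As a rough illustration of that advantage (assuming the 275 MHz DDR memory clock commonly cited for the retail boards, a figure not given in this article), the peak theoretical bandwidth works out to:

    256 bits ÷ 8 × 275 MHz × 2 (DDR) ≈ 17.6 GB/s

versus roughly 10.4 GB/s for the GeForce 4 Ti 4600's 128-bit bus at 325 MHz DDR.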

Parhelia's other innovative features included quad-vertex shader arrays, hardware displacement mapping, and 16x fragment anti-aliasing, all of which were showcased prominently in Matrox's Coral Reef technical demo.

The "Surround Gaming" support allowed the card to drive three monitors creating a unique level gaming immersion. For example, in a flight simulator or sim racing, the middle monitor could show the windshield while the left and right monitors could display the side views (offering peripheral vision).

The Parhelia also supported 10-bit color, an improvement over its competitors, which Matrox marketed as "Gigacolor". The specifications were: 10-bit-per-channel RGB rendering and output; over one billion simultaneously displayable colors; 10-bit precision for 2D, 3D, DVD and video; a 10-bit ARGB (2:10:10:10) frame buffer mode; and 10-bit RAMDACs with full gamma correction.
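
The "over one billion colors" figure follows directly from the 2:10:10:10 layout: with 10 bits per RGB channel there are 2^30 ≈ 1.07 billion representable colors. The sketch below shows how such a pixel can pack into a 32-bit word; the field ordering is assumed for illustration, as the exact bit layout used by the hardware is not specified here.

    #include <stdint.h>

    /* Pack one ARGB 2:10:10:10 pixel: 2 bits of alpha plus 10 bits per
       color channel (values 0..1023) in a single 32-bit word, giving
       2^30 (about 1.07 billion) distinct RGB colors. */
    static uint32_t pack_argb2101010(uint32_t a, uint32_t r, uint32_t g, uint32_t b)
    {
        return ((a & 0x3u)   << 30) |
               ((r & 0x3FFu) << 20) |
               ((g & 0x3FFu) << 10) |
                (b & 0x3FFu);
    }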

The Parhelia could drive a 30" Apple Cinema Display at its native resolution of 2,560 x 1,600.

The card was released in two memory configurations: 128 MB or 256 MB of DDR RAM.

Performance

For a top-of-the-line and rather expensive card ($399 USD), the Matrox Parhelia's 3D gaming performance was well behind NVIDIA's older and similarly priced GeForce 4 Ti 4600. The Parhelia was only competitive with the older Radeon 8500 and GeForce 3, which typically cost half as much. The Parhelia's potential performance was held back by its comparatively low GPU clock speed, which was initially believed to be a consequence of its large transistor count. However, ATI's Radeon 9700, released later that year, had considerably more transistors (around 108 million versus the Parhelia's 80 million) on the same 150 nm fabrication process, yet ran at a much higher clock speed (325 MHz versus 250 MHz).

The card's fillrate was formidable only when a game layered many textures, because it had just 4 pixel pipelines, each with 4 texture units; this arrangement turned out not to be optimal for most games, as illustrated below. Parhelia was also hampered by a lack of bandwidth-saving technology, whereas ATI had its third-generation HyperZ in the Radeon 9700 and NVIDIA its Lightning Memory Architecture 2 in the GeForce 4. So, while the Parhelia had formidable memory bandwidth, much of it was wasted because the card could not efficiently prevent overdraw or compress z-buffer data, among other inefficiencies. Parhelia was also believed to have a crippled triangle-setup engine that starved the rest of the chip in typical 3D rendering tasks. Some suggested that Matrox did not have engineering talent on a par with NVIDIA and ATI and that, as a result, their GPU was less efficient and less well designed overall.
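
As a rough illustration of why the 4×4 arrangement only paid off with heavy multitexturing (using the 250 MHz core clock quoted above, peak theoretical rates, and a GeForce 4 Ti 4600 assumed at its usual 300 MHz with 4 pipelines of 2 texture units each):

    Parhelia:   250 MHz × 4 pipelines           ≈ 1,000 Mpixels/s
                250 MHz × 4 pipelines × 4 TMUs  ≈ 4,000 Mtexels/s
    Ti 4600:    300 MHz × 4 pipelines           ≈ 1,200 Mpixels/s
                300 MHz × 4 pipelines × 2 TMUs  ≈ 2,400 Mtexels/s

In single-textured rendering the Parhelia's extra texture units sat idle, so its lower clock left it behind; only with three or four texture layers per pixel could it pull ahead on paper.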

Later in the Parhelia's life, when DirectX 9 applications were becoming prevalent, Matrox acknowledged that the vertex shaders were not Shader Model 2.0 capable, and thus not DirectX 9-compliant as initially advertised, presumably because of bugs in the Parhelia core that could not be worked around in the drivers. The point was largely moot, however, because the Parhelia's performance was not adequate to drive most DirectX 9 titles well even without more complex shader code weighing the card down.

Sales

Despite its lackluster performance for the price, Matrox hoped to win over enthusiasts with the Parhelia's unique and high-quality features, such as "Surround Gaming", glyph acceleration, high resolutions, and 16x fragment anti-aliasing. In these respects, some reviewers suggested that the Parhelia could have been a compelling alternative to the comparably priced GeForce 4 Ti 4600 ($399 USD), which was the performance leader but only DirectX 8.1-compliant.

However, within a few months of release the Parhelia was completely overshadowed by ATI's far faster and fully DirectX 9.0-compliant Radeon 9700, which produced higher-quality 3D images while debuting at the same price point ($399 USD). Priced on a par with faster cards, the Parhelia never gained a significant foothold in the market. It remains a niche product today, while NVIDIA and ATI control the majority of the discrete graphics chip market.

Parhelia-LX

After the launch of the Parhelia-512, Matrox released the Parhelia-LX, which supports only a 128-bit memory bus and has only 2 pixel pipelines. The first video cards using it were the Matrox Millennium P650 and Millennium P750.

Future products

Originally, Matrox planned to produce a 'Parhelia 2' successor, codenamed 'Pitou'.[1] However, after the Parhelia-512 failed to compete in the gaming market, the project was never mentioned again.

Parhelia processors were later upgraded to support AGP 8x and PCI Express.

With the introduction of the Millennium P690, the design was die-shrunk to 90 nm and gained support for DDR2 memory.

In 2006, Matrox re-introduced Surround Gaming with the TripleHead2Go, which uses an existing ATI or NVIDIA GPU to render 3D graphics and splits the resulting image across three screens.[2]