Talk:S3 ViRGE


I suggest that the assertion that the ViRGE had better graphics quality is wrong. While the card had better texture smoothing, it also had terrible dithering artefacts -- in any bitmap with 100% transparent areas, there would be little artefacts around the areas that were supposed to be 100% transparent. This is documented in the Diamond review referenced, and it is significant enough that the issues with washed-out colours are overblown in comparison. SJ Zero 23:11, 6 November 2006 (UTC)

It is a tough call. I will say that, IMO, the Voodoo definitely does not look very good much of the time. It was very fast, though, and that excused it for most people (including me) back then. However, if you run it side by side with a Verite or even a ViRGE, you would not deny that the Voodoo was significantly lower quality. It's partly due to bad default gamma settings, and partly due to overly aggressive mipmapping that lowers texture detail too close to the player. Things can get very blurry and washed out, and the gamma problems cause contrasting areas to look terrible. This is still subjective, obviously.
I have a Verite and a Voodoo Graphics running in a PC at home right now. I ran Jedi Knight on it just last night and it is plainly obvious that the Verite looks a lot better. Dropping gamma in the driver helped, but the lower texture quality (mipmap bias) sticks out like a sore thumb. Maybe I'll try to get shots. The problem is the ViRGE can hardly run D3D games, and I'm not sure how to take screenshots of proprietary-API DOS games (camera, I guess). --Swaaye 20:18, 6 November 2006 (UTC)
My issue isn't with the fact that the Voodoo had poor texture smoothing, but with the fact that alpha blending on the ViRGE was so bad that anywhere there was 0% alpha, you'd get a bunch of little dots on the screen. Depending on the game, you could end up with a screen filled with artefacts. I'm arguing that this is a far greater problem than any texture blending issues other cards might have had. I remember playing Terminal Velocity and the screen was just a blur of artefacts. This being the case, I can't see how you could positively compare the ViRGE to any other card of the era -- my Matrox Mystique 220 didn't have alpha blending or texture smoothing at all, and it still looked much better because the image was at least free of artefacts around every object on the screen. SJ Zero 23:11, 6 November 2006 (UTC)
Hmmm. I played Terminal Velocity on a ViRGE about a year ago. I don't remember it looking all that bad, honestly. The only accelerated Terminal Velocity port is the ViRGE one, though, so it's tough to compare side by side. Tomb Raider 1 has patches for Glide, S3D, and Speedy3D (Verite), though, so I could check that out and compare easily. I don't remember any major alpha problems.
I too had a Mystique 220 once and that sucker was ugly as hell. It didn't even do transparency at all (just stippling), and the nearest-neighbor filtering wasn't too snazzy either. Funny thing is it looked OK at times too, because the textures back then were so awful that the pixelated point-sampled textures weren't a big hindrance compared to uber blur. --Swaaye 22:58, 6 November 2006 (UTC)
At any rate, just check out the screenshots in the review -- the ones of Descent. They demonstrate what I'm saying well enough. In my own opinion, the grittiness around each and every sprite object on the screen is enough to tip the scales. I remember playing Doom Legacy (D3D mode, in the TNT era, but I was too poor to get a decent video card at the time), and the enemies, the weapon you were holding, the bulletholes in the walls, virtually every object in the game had this speckled effect around it. Honestly, it's a universal effect in every game that uses 0% alpha blending as a method to mask sprites. SJ Zero 23:11, 6 November 2006 (UTC)
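For what it's worth, the fringe effect described above follows mechanically from filtering a texture that uses 0% alpha as a sprite mask: any sample taken near the cut-out edge blends in colour from the "invisible" texels. A minimal Python sketch of the general mechanism (not a model of the ViRGE's dithering hardware; the tiny texture and values are made up for illustration):

<syntaxhighlight lang="python">
def bilinear(texels, u, v):
    """Bilinearly sample a tiny RGBA texture (a list of rows of (r, g, b, a))."""
    x0, y0 = int(u), int(v)
    x1 = min(x0 + 1, len(texels[0]) - 1)
    y1 = min(y0 + 1, len(texels) - 1)
    fx, fy = u - x0, v - y0

    def lerp(a, b, t):
        return tuple(a[i] + (b[i] - a[i]) * t for i in range(4))

    top = lerp(texels[y0][x0], texels[y0][x1], fx)
    bottom = lerp(texels[y1][x0], texels[y1][x1], fx)
    return lerp(top, bottom, fy)

# 2x1 texture: an opaque white texel next to a masked (0% alpha) black one.
tex = [[(255, 255, 255, 255), (0, 0, 0, 0)]]

# Sampling a quarter of the way toward the masked texel darkens the visible
# pixel even though the masked texel is supposed to contribute nothing:
print(bilinear(tex, 0.25, 0.0))  # (191.25, 191.25, 191.25, 191.25) -- a grey fringe
</syntaxhighlight>

Hardware that then dithers such intermediate values (as the comments above describe the ViRGE doing) turns that grey fringe into visible speckling around the sprite.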

Fair use rationale for Image:S3 logo.gif

Image:S3 logo.gif is being used on this article. I notice the image page specifies that the image is being used under fair use but there is no explanation or rationale as to why its use in this Wikipedia article constitutes fair use. In addition to the boilerplate fair use template, you must also write out on the image description page a specific explanation or rationale for why using this image in each article is consistent with fair use.

Please go to the image description page and edit it to include a fair use rationale. Using one of the templates at Wikipedia:Fair use rationale guideline is an easy way to ensure that your image is in compliance with Wikipedia policy, but remember that you must complete the template. Do not simply insert a blank template on an image page.

If there are other fair use media, consider checking that you have specified the fair use rationale on the other images used on this page. Note that any fair use images lacking such an explanation can be deleted one week after being tagged, as described on criteria for speedy deletion. If you have any questions, please ask them at the Media copyright questions page. Thank you.

BetacommandBot (talk) 04:51, 24 January 2008 (UTC)

1600x1200 with 16.7M colors

1600x1200 with 16.7M colors is unrealistic; it might be a mere 16 colors. — Preceding unsigned comment added by 46.252.26.110 (talk) 14:23, 2 July 2015 (UTC)

Yes, that would be nonsense, as it would require almost 5.5 MiB of video RAM (considering 24 bpp only; with padding to 32 bpp it is over 7 MiB), which is above the maximum 4 MiB supported. And you are also mostly correct about the 16-color depth -- the datasheet lists 4 bpp non-interlaced, but 8 bpp is also supported with interlaced video. --JITR (talk) 02:51, 4 September 2015 (UTC)
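The arithmetic is easy to sanity-check. A quick sketch (the bpp values are the ones discussed above; MiB figures rounded):

<syntaxhighlight lang="python">
# Framebuffer size needed for 1600x1200 at various color depths,
# versus the 4 MiB maximum the ViRGE supports.
W, H = 1600, 1200
MIB = 1024 * 1024

for bpp in (4, 8, 24, 32):
    size_mib = W * H * bpp / 8 / MIB
    verdict = "fits in 4 MiB" if size_mib <= 4 else "exceeds 4 MiB"
    print(f"{bpp:2d} bpp: {size_mib:5.2f} MiB ({verdict})")

# 24 bpp -> ~5.49 MiB and padded 32 bpp -> ~7.32 MiB both exceed the
# 4 MiB limit, so 16.7M colors at 1600x1200 is impossible; 4 bpp
# (~0.92 MiB) and 8 bpp (~1.83 MiB) fit, matching the datasheet modes.
</syntaxhighlight>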
There, fixed. I also had to remove the accompanying 80 Hz refresh rate figure, as that seems to be nonsense too. 1280x1024 is supported at 75 Hz only, so fat chance of having 1600x1200 at 80 Hz. Unfortunately, I couldn't find the correct refresh rate in the datasheet, so this information had to be dropped without replacement. --JITR (talk) 03:10, 4 September 2015 (UTC)
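The refresh-rate side can be checked the same way via the pixel clock. A rough sketch, assuming ~25% blanking overhead (typical CRT timing) and a RAMDAC limit around 135 MHz (an assumed figure for illustration; the real limit depends on the ViRGE variant and its datasheet):

<syntaxhighlight lang="python">
BLANKING = 1.25  # assumed ~25% overhead for horizontal/vertical blanking

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock required for a non-interlaced CRT mode."""
    return width * height * refresh_hz * BLANKING / 1e6

print(pixel_clock_mhz(1280, 1024, 75))  # ~123 MHz -- plausible for the part
print(pixel_clock_mhz(1600, 1200, 80))  # ~192 MHz -- far beyond it
# Interlacing halves the rate (~96 MHz at "80 Hz"), consistent with the
# datasheet listing only interlaced modes at 1600x1200.
</syntaxhighlight>

So whatever the correct figure is, a non-interlaced 1600x1200 @ 80 Hz mode looks implausible, which supports dropping it.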

OpenGL?

The article states that the card didn't support OpenGL, but the specifications section lists driver support for that API. Which is correct? 77.9.67.72 (talk) 01:21, 6 June 2016 (UTC)