Talk:Enhanced Graphics Adapter

The page previously said

The page previously said "Introduced in 1984 by Microsoft [sic]." I'm assuming this was just a mistake, and have changed it to IBM. I don't recall Microsoft having anything to do with display hardware (although they've had more hardware products than many people thing).

If I'm wrong about this, please revert the change and leave a note here. Dpbsmith 15:26, 10 Jan 2004 (UTC)

I figured this was a good topic for Google Groups. Joel talks about his woes with his IBM card in December 1984, in the first message mentioning either "Enhanced Graphics Adapter" or "EGA." He discusses required BIOS dates of October 1982. Further details in this thread discuss an alternate card, the Princeton SR-12, in addition to some workaround information for the memory problem Joel encountered. Based on those early mentions, I'd say IBM did it. -- Ke4roh 15:56, 10 Jan 2004 (UTC)

Currently the article

Currently the article states "EGA also included full 16-colour versions of the CGA 640×200 and 320×200 graphics modes; only the 16 CGA/RGBI colours are available in these modes." However, I'm quite sure the pseudo-64-color mode was also supported at 320x200. Only one game used this to its advantage: Ivan "Ironman" Stewart's Super Off Road, which called the mode "EGA64".

Oooh! Pseudo 64 colour mode? Any idea how this was achieved? The EGA, I believe, has registers that allow very fast switching between four palettes. This could perhaps have been used to switch the palette during display time such that all four palettes were displayed in a single frame, resulting in the EGA's full 64 colour palette being displayed. Perhaps this was the trick? Fast palette switching has often been used to this effect - think 'copper' bars. -- Funkymonkey
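For anyone wanting to experiment, the register-level shape of that trick looks roughly like the following. This is a minimal sketch, assuming a 16-bit DOS compiler in the Turbo C mould (inportb/outportb/kbhit are its names); the port addresses are the standard EGA ones, but timing on real hardware is fussier than simple polling, so treat this as an illustration of the idea rather than a tested implementation.

    /* Raster palette trick ("copper bars"): rewrite one palette register
       during horizontal blanking so a single colour number shows different
       monitor colours on different scanline bands. */
    #include <dos.h>    /* inportb(), outportb() */
    #include <conio.h>  /* kbhit() */

    #define INPUT_STATUS 0x3DA  /* bit 0 = blanking, bit 3 = vertical retrace */
    #define ATTR_CTRL    0x3C0  /* attribute controller index/data port */

    /* Set one of the 16 palette registers to a 6-bit rgbRGB monitor colour. */
    static void set_palette_reg(unsigned char reg, unsigned char value)
    {
        inportb(INPUT_STATUS);           /* reset the index/data flip-flop   */
        outportb(ATTR_CTRL, reg | 0x20); /* bit 5 keeps video output enabled */
        outportb(ATTR_CTRL, value);
    }

    void copper_bars(void)
    {
        int line;
        while (!kbhit()) {
            while (inportb(INPUT_STATUS) & 0x08) ;    /* leave vertical retrace */
            while (!(inportb(INPUT_STATUS) & 0x08)) ; /* wait for next retrace  */
            /* Count scanlines via the blanking bit; recolour every 16 lines. */
            for (line = 0; line < 200; line++) {
                while (!(inportb(INPUT_STATUS) & 0x01)) ; /* wait for blanking */
                if ((line & 15) == 0)
                    set_palette_reg(0, (unsigned char)((line >> 4) & 0x3F));
                while (inportb(INPUT_STATUS) & 0x01) ;    /* wait for display  */
            }
        }
    }

The key point is that the palette registers can be rewritten mid-frame, so one colour number can show different monitor colours on different parts of the screen.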
It'd be nice if someone could root out a screenshot or any other info on this; all I've found with a quick google is an abandonware version for download (which I *may* try out, still having an AT with an old ISA VGA adaptor squirreled away somewhere deep in a cupboard - it may show up what's going on if it's got any command line switches which allow you to force the display mode)... Someone out on the interweb must have a working EGA PC and a copy of the game to figure out what's going on here. The best I can estimate are several possibilities:
1. It uses fancy hardware tricks (as found on enough other computers) to force the hardware to change the palette against spec in 320x200 mode, and maybe also to perform palette switching (though even the VGA screenshots show a game screen remarkably similar to my Atari version, which is a solid 16 colour (from 512) game).
2. It does the LucasArts trick of using 640x200 mode and dithering alternate pixels (ick).
3. Colour flickering? (As seen on the Sega GG/MD.)
4. As the graphics used in the game are quite small and limited-motion, it may have just used 640x350 as-is, with higher-rez trucks and high-res or pixel-doubled backgrounds, and varied the 16 colour palette within spec (or used extra tricks).
5. It set 640x350 and displayed all graphics doubled for an effective 320x175 rez, with 16 custom colours and again maybe tricks.
6. It set a custom mode of the hardware, using 640x350 as a basis but reducing vertical rez to 240 or less and doubling horizontally (I've seen demos that supposedly run on stock EGA hardware and can give up to 640x700 rez in 64 colours, so reducing the resolution instead probably wouldn't have been too difficult - or alternately increasing it to 400 then doubling everything).
So yeah. Complicated. Needs actual game experience to figure out. 82.46.180.56 (talk) 18:10, 30 March 2008 (UTC)
Not sure if this is the same thing or not, but I remember some of the later LucasArts adventure games (Monkey Island 2, Fate of Atlantis) claimed to require VGA, but did actually run on EGA cards. The way it worked was to run in 640x200 mode, with each 320x200 logical pixel using two actual EGA pixels - in effect a rather crude kind of dithering to give the impression of displaying more than 16 colors. It was kind of hideous, though, and pretty slow at times, which is probably why they never documented the feature! 81.86.133.45 (talk) 21:36, 10 February 2008 (UTC)
It wasn't undocumented. Indiana Jones and the Fate of Atlantis allowed you to enter a switch to choose the display mode, and there was an option for EGA. It produced a 320x200 image that was stretched to 640x200. It was hideously ugly, and it was obvious that each pixel was vertically doubled. DOSGuy (talk) 21:46, 10 February 2008 (UTC)
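The colour-mixing half of that trick can be illustrated as below: for each 320x200 source pixel, pick the pair of RGBI colours whose average is nearest the target colour, and write one to each of the two 640x200 pixels. The RGB values and the brute-force search are illustrative assumptions on my part, not LucasArts' actual tables or algorithm.

    #include <limits.h>

    /* Nominal RGB values of the 16 CGA/RGBI colours, on a 0..255 scale. */
    static const int rgbi[16][3] = {
        {  0,  0,  0}, {  0,  0,170}, {  0,170,  0}, {  0,170,170},
        {170,  0,  0}, {170,  0,170}, {170, 85,  0}, {170,170,170},
        { 85, 85, 85}, { 85, 85,255}, { 85,255, 85}, { 85,255,255},
        {255, 85, 85}, {255, 85,255}, {255,255, 85}, {255,255,255}
    };

    /* Choose the pair of RGBI colours whose average best matches (r,g,b);
       the two indices are then written to adjacent 640x200 pixels. */
    void best_pair(int r, int g, int b, int *pa, int *pb)
    {
        long best = LONG_MAX;
        int a, c;
        for (a = 0; a < 16; a++)
            for (c = a; c < 16; c++) {
                long dr = r - (rgbi[a][0] + rgbi[c][0]) / 2;
                long dg = g - (rgbi[a][1] + rgbi[c][1]) / 2;
                long db = b - (rgbi[a][2] + rgbi[c][2]) / 2;
                long d  = dr*dr + dg*dg + db*db;
                if (d < best) { best = d; *pa = a; *pb = c; }
            }
    }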
The EGA card offers freely programmable mapping of the 16 available color numbers to any of the 64 monitor colors, in any mode - even in the low-res graphics modes with CGA resolution. As every color is individually programmable, there are not just 4 palettes, as Funkymonkey suggests. The hardware design of the EGA is such that the image generation circuits output 4-bit codes (plus a blanking bit), and a totally separate circuit (the "Attribute Controller") translates these 4-bit codes into 6-bit codes, so the availability of this translation is not tied to the kind of video mode used. Still, the fact that real-time palette changing was used for effects such as copper bars is true. Of course, if the 200-line graphics modes are used in conjunction with a CGA monitor that does not support the 350-line mode, only 16 colors are available due to monitor limitations. 89.15.60.144 (talk) 17:36, 9 September 2011 (UTC)
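In practice, software rarely touched the write-only Attribute Controller directly: the EGA BIOS wraps the translation described above. A minimal sketch, assuming a Turbo C style int86() - INT 10h with AX=1000h sets one palette register, BL selecting the colour number and BH the 6-bit monitor colour.

    #include <dos.h>

    /* Map one of the 16 colour numbers to any of the 64 monitor colours. */
    void set_palette_entry(unsigned char entry, unsigned char rgbRGB)
    {
        union REGS r;
        r.h.ah = 0x10;   /* EGA palette services                     */
        r.h.al = 0x00;   /* subfunction 0: set one palette register  */
        r.h.bl = entry;  /* colour number 0..15                      */
        r.h.bh = rgbRGB; /* monitor colour 0..63, bits 5..0 = rgbRGB */
        int86(0x10, &r, &r);
    }

    /* Example: make colour 1 orange - full red (r+R) plus 1/3 green (g). */
    /* set_palette_entry(1, 0x34); */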
In fact, the article also states the real reason for the non-availability of 64 colors in the low-res graphics modes. It is not a limitation of the graphics card but of the monitor: the original EGA monitor, the IBM 5154, falls back to a CGA compatibility mode when CGA frequencies are present, as it cannot detect whether the two extra pins used for the 64 EGA colors are driven by the card (as an EGA card would do) or grounded/left floating (as a CGA card would do). 89.15.60.144 (talk) 17:49, 9 September 2011 (UTC)

Should we say that EGA

Should we say that EGA was the first to introduce a loadable "character generator" for text mode, whereby you can modify the text-mode character fonts instead of using only the hardcoded BIOS ones? -- FourBlades 20:11, 12 August 2006 (UTC)
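For reference, the loadable character generator is exposed through the EGA BIOS as INT 10h, AH=11h. A minimal sketch, assuming Turbo C (int86x, FP_SEG/FP_OFF); the 8x14 bitmap for 'A' is made up purely for illustration.

    #include <dos.h>

    /* A hypothetical replacement 8x14 glyph, one byte per scan row. */
    static unsigned char glyph_A[14] = {
        0x00,0x18,0x3C,0x66,0x66,0x7E,0x66,0x66,0x66,0x66,0x00,0x00,0x00,0x00
    };

    void load_user_glyph(void)
    {
        union REGS r;
        struct SREGS s;
        r.h.ah = 0x11;   /* character generator services    */
        r.h.al = 0x00;   /* subfunction 0: load user font   */
        r.h.bh = 14;     /* bytes per character (8x14 cell) */
        r.h.bl = 0;      /* font block 0                    */
        r.x.cx = 1;      /* number of characters to load    */
        r.x.dx = 'A';    /* first character code to replace */
        s.es   = FP_SEG((void far *)glyph_A); /* ES:BP -> font bitmap */
        r.x.bp = FP_OFF((void far *)glyph_A);
        int86x(0x10, &r, &r, &s);
    }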


"EGA can drive an MDA monitor by a special setting of switches on the board; only 640×350 high-res is available in this mode." EGA can also drive a CGA monitor by doing nothing special, just avoiding 640x350 modes. The max you could do was 640x200@16 colors like you could see in Thexder II. Most sierra games used 320x200@16 colors because of that.

MC6845

Some sources claim that the EGA doesn't include an MC6845 controller (this article, and the book "Programmer's Problem Solver for the IBM PC, XT & AT" by Robert Jourdain). Can anyone confirm that? --Anton Khorev 11:14, 24 October 2006 (UTC)

The 640x350 mode uses 28000 bytes of contiguous memory per bit plane (640 × 350 / 8), but the MC6845's address counter is only 14 bits wide, so it can address at most 16384 locations - clearly it can't be used in that mode. Of course this doesn't completely rule out the possibility that EGA cards had an MC6845 for other modes, but that doesn't seem very likely. --Derlay 23:40, 16 June 2007 (UTC)
The EGA card does not use the 6845 video controller, but a chip designed by IBM that is partly compatible (to make page flipping and cursor positioning CGA-compatible at the register level). The timing registers of the EGA card are not compatible with those of the CGA card and, if I remember correctly, are protected by a software-controlled write lock (I might be wrong about that; the write lock may exist only on VGA cards). The most pressing issue with the 6845 is the line counter, which is limited to 7 bits. While this is clearly enough for text modes, in graphics modes the consequence is that tricky ways of using the character row counter for high address bits have to be used to support more than 128 scan lines, which results in a banked video memory layout. In fact, the CGA has two banks and the Hercules card has four banks of 8 kilobytes each in graphics mode, to circumvent the line counter limitation. The EGA chip extended the line counter to 9 bits to avoid the issue; it also extended the address counter from 14 bits to 16 bits (that's the limit Derlay mentions). -- 89.15.60.144 (talk) 17:42, 9 September 2011 (UTC)
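The addressing consequence is easy to show. A sketch of the usual byte-offset formulas (per plane in the EGA case; selecting the pixel within the byte is omitted):

    /* CGA 320x200, 2 bpp: two 8 KB banks, even rows then odd rows. */
    unsigned cga_offset(unsigned x, unsigned y)
    {
        return (y & 1) * 0x2000 + (y >> 1) * 80 + x / 4;
    }

    /* Hercules 720x348, 1 bpp: four 8 KB banks, row y lives in bank y mod 4. */
    unsigned hercules_offset(unsigned x, unsigned y)
    {
        return (y & 3) * 0x2000 + (y >> 2) * 90 + x / 8;
    }

    /* EGA 640x350, per plane: a flat linear layout, no banking needed. */
    unsigned ega_offset(unsigned x, unsigned y)
    {
        return y * 80 + x / 8;   /* 350 rows x 80 bytes = 28000 bytes */
    }

The EGA figure is exactly the 28000 bytes Derlay mentions, beyond a 14-bit address counter's reach, while the banked layouts keep each bank small enough for the 6845 to scan.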

The Palette

Does anyone know the exact reason why custom palettes were available in hi-res (640x350) mode, but not in the 320x200 and 640x200 modes, at least not without some rumored heavy tweaking as mentioned above? I find it interesting (and a bit strange) that IBM waited until the VGA card to introduce a 320x200 16-color mode with user-defined palettes... 80.178.137.247 10:10, 28 May 2007 (UTC)

Maybe they already had a future hardware roadmap that included introducing VGA and XGA once the hardware was mature and economically viable, and considered the upgrade from 320/640x200x16 (fixed) to 320/640x200/240x256 (or 16, variable, plus all of VGA's other official and tweaked modes) more of a selling point than upgrading from 320/640x200x16 (variable, and the full 64 with tricks) and 640x350 to 640x480 would have been... The desire to ditch the ugly EGA low mode for VGA must have been more of a (lucrative) driving force for upgrades than swapping passable Master System-ish graphics for half-decent Amiga-ish ones would have been.
Or possibly they just rushed it, as seemed to be the case with the original CGA (how else do you explain those horrendous fixed 4-colour palettes, when making them user-definable was demonstrably quite easy?), and made it work at the minimum level needed to be a worthwhile upgrade on all fronts, as well as providing the 640x350 resolution... which itself wasn't available in full colour unless you got the memory upgrade (64 kB = monochrome, or maybe 4 colours/4 greys if such a thing was ever offered; only 128 kB gave full colour, and 256 kB smooth (page-flipping) display of such). 82.46.180.56 (talk) 18:00, 30 March 2008 (UTC)
Good points! Perhaps another explanation lies in the fact that "EGA can also drive a CGA monitor by doing nothing special, just avoiding 640x350 modes" (as mentioned elsewhere in this talk page).
That is, maybe they wanted to provide some form of backward compatibility (and a cheaper upgrade option) for CGA owners - for the two resolutions that are available on a CGA monitor, EGA sticks to the RGBI signals that the CGA monitor can handle, so if you replaced only the card, the 320x200 and 640x200 modes would still work on your CGA monitor, but with the enhanced memory allowing all 16 colors to be displayed simultaneously.
Of course, that raises the question of why *separate* low-res modes (utilizing the full "rgbRGB" color range) were not added... I had an EGA for a while as a kid, and I remember being annoyed that many 16-color games looked much better on VGA with their custom palettes, while on EGA they were (seemingly) needlessly "crippled" - your "they just rushed it" theory seems like a good answer to that one. ;)
77.125.144.117 (talk) 18:23, 5 May 2008 (UTC)
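Following up the RGBI-compatibility point above: in the 350-line modes the BIOS programs a default palette that reproduces the 16 CGA colours. A sketch of the conventional 4-bit RGBI index to 6-bit rgbRGB mapping (including the well-known brown fix-up for colour 6):

    /* Default EGA palette: RGBI index -> 6-bit rgbRGB monitor colour. */
    unsigned char rgbi_to_rgbRGB(unsigned char i)
    {
        if (i == 6)   return 0x14;            /* dark yellow shown as brown  */
        if (i & 0x8)  return 0x38 | (i & 7);  /* intensity -> secondary bits */
        return i & 7;                         /* primary R, G, B bits only   */
    }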

So-called "obsolescence of EGA"

Please give a quote from Scott Mueller (BTW, user:Wtshymanski used an incorrect spelling). By what was EGA supplanted, and when exactly? VGA was originally intended as a video adapter for the PS/2, not for PCs in general. Incnis Mrsi (talk) 14:54, 7 September 2011 (UTC)

I don't have the Mu(e)ller book handy now, but the EGA was withdrawn from marketing by IBM fairly quickly, so the IBM EGA was certainly obsolete by 1987. 3rd party EGA boards were available after that, I'm sure, but the leading edge had clearly gone to VGA and fairly soon you couldn't give away an EGA board or monitor. --Wtshymanski (talk) 16:53, 7 September 2011 (UTC)
What does "give away an EGA… monitor" mean? This was a relatively expensive device at the time, and the whole world is not just North America, Europe and Japan, where people were rich enough to spend money on continuous upgrades. Incnis Mrsi (talk) 16:20, 8 September 2011 (UTC)
It's an English figure of speech. It means that even if you offered ownership of an EGA monitor or adapter to someone without exchange of money, no one would take it, because it was worthless. --Wtshymanski (talk) 18:14, 8 September 2011 (UTC)