Talk:Color Graphics Adapter

From Wikipedia, the free encyclopedia
WikiProject Computing / Hardware (Rated B-class, Low-importance)
This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
B-Class article B  This article has been rated as B-Class on the project's quality scale.
 Low  This article has been rated as Low-importance on the project's importance scale.
This article is supported by Computer hardware task force (marked as Mid-importance).
Former featured article Color Graphics Adapter is a former featured article. Please see the links under Article milestones below for its original nomination page (for older articles, check the nomination archive) and why it was removed.
Article milestones
Date Process Result
August 24, 2004 Featured article candidate Promoted
July 22, 2007 Featured article review Demoted
Current status: Former featured article
Version 0.5 (Rated B-class)
WikiProject icon This article has been selected for Version 0.5 and subsequent release versions of Wikipedia.
B-Class article B  This article has been rated as B-Class on the quality scale.
 ???  This article has not yet received a rating on the importance scale.
This article is within the scope of the subsequent release version of Engineering, applied sciences, and technology.

High Colour Modes[edit]

The 8088 MPH demo showed how to produce high-colour modes on CGA composite output, going well beyond the previous colour ranges.

so these techniques probably want incorporating into the tricks and tweaks section

Alan Cox (talk) 09:44, 21 April 2015 (UTC)

I took the liberty of adding this information, since I found it important. I hope I did it in the appropriate fashion. (talk) 02:26, 25 March 2016 (UTC)

160x100 tweaked text mode doesn't fit?[edit]

How does the 160x100 fit into 16K? Each character occupies two bytes of RAM -- one for the character itself, and one for the attribute. Thus, that's 160×100×2 = 32000 bytes. I buy that machines that use the MC6845 with 32K of RAM can display this mode, but an original 16K machine shouldn't be able to display it properly—the bottom half of the screen should show the same as the top half. I took a peek inside the MESS code, and it appears they just put 32K on all CGA variants. (ref: "offs=(offs+2)&0x3fff" in the text renderers) --Mr z 23:21, 20 October 2007 (UTC)

It's 80×100×2 = 16000 bytes, because each character represents two pixels. Calvero2 13:59, 21 October 2007 (UTC)
  • *d'oh* I knew I must be missing something obvious. *chuckle* Thanks. -- 20:31, 24 October 2007 (UTC) (And that was me, just forgot to sign in. --Mr z 20:32, 24 October 2007 (UTC))


Hi, does anyone know what signals are applied through the pins to give you the colour, and what would they be?

"Contact Me" is Trixter[edit]

Just a quick note that in one of the edits, I said "if you have a problem with the CGA color palette, contact me" but then realized later that I wasn't logged in. I am the one who put the final, correct CGA colors into the palette table -- if you want to change them you'd better have a good reason because I took them from the actual MC6845 specs and verified them with a TTL scan converter. Most people who think they know CGA colors seem to forget that when you toggle the brightness bit, ALL colors go bright, not just the ones that are supposed to go bright. Trixter 15:09, 20 Aug 2004 (UTC)

Got the reference in there? That would be the handiest thing - David Gerard 15:32, 20 Aug 2004 (UTC)
Huh? In graphics mode, yes (IIRC), but in text mode? I seem to recall that there was the full palette of 16 colours available in text mode. Also, all the sources suggest that. Plus the screenshots from the ICONDEMO.exe [1] are 16-colour. In composite mode there also are 16 colours. Surely you meant that only with reference to the 320x200 4-colour mode? Ropers 20:21, 20 Aug 2004 (UTC)
All I'm saying is that people tend to "make up" the 16-color CGA text mode palette with things that seem to make sense, like "FF0000" for "bright red" when in fact that simply wasn't the case. When you enable the "I" bit, ALL components get intensified. Black becomes gray, etc. So "bright red" is really "FF5454". The actual number of "FE" instead of "FF" is me being picky based on the specs, but those are the actual numbers. To further clarify, the graphics mode palette colors are identical to the text mode colors that they are based off of. For example, the "red" in the "red/green/yellow" palette corresponds to the CGA text color "red", etc. It is not a "different" red. Now, composite color mode used completely different colors than CGA text/graphics mode, and one of these days I will hook up my old PC to my broadcast equipment and capture the definitive CGA 16-color composite output color chart. But today is not one of those days :-) Trixter 04:42, 21 Aug 2004 (UTC)
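Trixter's point that the intensity bit brightens all three channels at once can be sketched as a small conversion routine (a Python sketch; the 2/3 + 1/3 channel mix and the function name are mine, and the colour-6 handling reflects the 5153 brown behaviour discussed further down this page):

```python
def rgbi_to_rgb(color, brown_fix=True):
    """Convert a 4-bit IRGB text-mode color (0-15) to 24-bit RGB.

    Bit 3 is intensity (I), bits 2-0 are R, G, B.  Each set color bit
    contributes 2/3 (0xAA) of full scale; the I bit adds 1/3 (0x55)
    to *all* three channels, so "light red" is #FF5555, not #FF0000.
    """
    i = (color >> 3) & 1
    r = (color >> 2) & 1
    g = (color >> 1) & 1
    b = color & 1
    rgb = tuple(0xAA * c + 0x55 * i for c in (r, g, b))
    # IBM 5153 quirk (see the "Color 6" thread below): color 6's green
    # component is halved by extra circuitry, producing brown (#AA5500).
    if brown_fix and color == 6:
        rgb = (0xAA, 0x55, 0x00)
    return rgb
```

For example, "light red" (color 12) comes out as #FF5555 rather than #FF0000, and "bright black" (color 8) is the dark gray #555555.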
If the hex values presently used in the table are precisely correct ... would it be worth listing them visibly in the table itself? - David Gerard 21:58, 24 Aug 2004 (UTC)
Never mind, I've just done so - noting they're from the MC6845 spec :-) What's the precise name of the spec? It should go in the 'References' section - David Gerard 22:35, 24 Aug 2004 (UTC)

160x200 composite mode did exist and was possible w/ 16 KB VRAM[edit]

Goplat, I think you're confusing the 160x100 "tweaked text" pseudo-graphics mode with the 160x200 composite mode. They were two totally different things:

The 160x100 tweaked text mode was achieved on standard RGB monitors. The 160x200 mode was ONLY available on the CGA card's composite video output (which you could hook up to your telly.) It had nothing to do with text mode. It was a separate graphics mode in its own right -- only that few folks used it, as most CGA boxen were hooked up to RGB monitors permanently.

As regards your video RAM size objection:

  • 160x200 equals 32,000 (that's how many pixels we have).
  • 16 colours require 4 bits (2x2x2x2=16).
  • 4 times 32,000 equals 128,000. That's how many bits we need to encode 32,000 pixels at 16 colours.
  • 128,000 bits equal 16,000 bytes (as 1 byte=8 bits, so 128,000 gets divided by 8).
  • 1 KB equals 1,024 bytes.
  • Thus, 16 KB equal 16,384 bytes.
  • 16,000 bytes should thus fit into 16 KB with room to spare.

Fair enough?
Ropers 22:16, 20 Aug 2004 (UTC)

You were right, I screwed it up. Sorry about that. Goplat 22:31, 20 Aug 2004 (UTC)
And I (initially) screwed up as well, forgetting about the fact that 1KB = 1024b (that's now corrected, and it doesn't change my point). Thanks for the quick reply :) -- I even got an edit conflict while I was correcting my own screwup ;-) Ropers 22:38, 20 Aug 2004 (UTC)
No corrections here, you've got it right. But what's interesting to note, and what may be adding to the confusion, is that 320x200x4, 640x200x2, 160x200x16, and 160x100x16(tweaked text) ALL took up EXACTLY 16000 bytes... yet there is 16K available on the adapter, which means you actually have 16384 bytes to work with. What to do with the extra 384 bytes? Not too much, but I once came across a BASIC program that used them for temporary storage because the BASIC program itself was so big it filled all of the RAM available in a 64K PCjr ;-) Trixter 04:47, 21 Aug 2004 (UTC)
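The arithmetic in the thread above generalises to all four modes Trixter lists; a quick sanity check (a Python sketch, helper name mine):

```python
def vram_bytes(width, height, colors):
    """Bytes of video RAM needed by a packed-pixel graphics mode."""
    bits_per_pixel = (colors - 1).bit_length()  # 2 colors -> 1 bit, 16 -> 4
    return width * height * bits_per_pixel // 8

# The three graphics modes all come to exactly 16,000 bytes:
assert vram_bytes(320, 200, 4) == 16000
assert vram_bytes(640, 200, 2) == 16000
assert vram_bytes(160, 200, 16) == 16000

# 160x100 tweaked text: 80 columns x 100 rows x 2 bytes (char + attribute)
assert 80 * 100 * 2 == 16000

# Leaving the 384 spare bytes Trixter mentions:
assert 16 * 1024 - 16000 == 384
```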

I don't know what mode it was, but I know for a fact that Bard's Tale III became 16-color when I discovered that I could plug my TV into my computer.

I had been playing it with just the 4 colors that my Compaq 8088 (or was it 8086?) could produce.

When I discovered the 16 color trick, I played around in BASICA, and discovered that 2 pixels side-by-side were being interpreted as just one color on TV, with an aggregate of the color taken.

So I don't know if it was a different mode or not, or if it was the same mode just being interpreted differently by the two displays, or what. =^_^=

It was quite a shock to see that Bards Tale III was suddenly pretty..!


Severe copyediting[edit]

I just went through the article again, trying to bring the writing up to FAC standard. I cut a lot of side issues out to try to keep the actual points clear. Also worked over 160x200 16-color severely - it was confusing before, but I think it gets the point across now (and I think I got the point now) - David Gerard 21:47, 24 Aug 2004 (UTC)

I'd like to touch up the first two paragraphs of the article; the wording is somewhat awkward and needs cleanup. Any objections? I'll wait a few weeks for replies before doing it. Trixter 05:12, 31 January 2006 (UTC)
Before I got to cleaning up the first two paragraphs of the article, someone added a very subjective, non-verified, dubious history to the acceptance of CGA. One of the things they claim is that EGA in 1984 meant that CGA finally took off -- not only does the existence of DIP switches on the EGA card (that let it be used with the CGA monitor) refute this, but MobyGames shows 130+ games that support CGA with publish dates BEFORE 1984, so I have a hard time believing the claim. They also claim that Hercules cards, with their graphics capabilities, drove CGA further into disuse, but MobyGames shows *NO* games that use Hercules graphics before 1985. (Not to mention all of it contradicts my own personal memory of the time period, but I try not to alter articles (any more :-) without proof for discussion.)
So, I officially call "BS" and would like to significantly alter the paragraph to remove the subjective parts (and *still* clean up the original 2). If nobody objects in a few days, that's what I'll do. --Trixter 20:58, 7 February 2006 (UTC)

CGA trivia[edit]

It might be worth mentioning a couple of extra points:

  • Original IBM PC motherboards (8088 processors) had a small trimmer capacitor on the 14.31818 MHz quartz oscillator labelled "Color Adjust", since the IBM CGA card derived its 3.58 MHz NTSC color subcarrier frequency from the motherboard clock, which was distributed on the original XT expansion bus. All that to save a 39-cent crystal on the CGA card!
  • The CGA card had both an RGB connector and a composite video output. The composite video could be connected to a household (NTSC,525-line 60 Hz) TV set, just like a home computer. I did see this used in an industrial context when a PC with a CGA card was hooked up as a text display on a plant-wide closed circuit TV system - an ingenious way of recycling old hardware.
  • A common rookie PC user mistake of the early 1980s was trying to plug a color monitor into a monochrome display adapter, or the reverse - they used the same DE-9 female connector. Legend had it that some monitors were destroyed by this, though I never managed to do it myself.
  • A CGA card and IBM monitor produced really terrible-looking text, even using an RGB monitor. Since the 320*200 color mode was worthless for serious graphics, nearly all "business" PCs used either MDA or a Hercules card, both of which produced text that looked even nicer than that on a Kaypro 10. --Wtshymanski 21:11, 23 May 2005 (UTC)
Trimmer: agreed, probably something to mention (but here or on the 5150 page?). Composite: already mentioned in the article. Hooking up monitors: not entirely sure where that would be appropriate (btw, not possible to break either monitor on either card this way; the frequencies were different, but not exceedingly high, which is what breaks monitors). "Terrible" text: That's quite subjective; I rather like the way text looks on my 5153 :-). I've spent decades looking at 80-col on it and never thought it was terrible. --Trixter 21:05, 7 February 2006 (UTC)

Additional palettes[edit]

In my experience of my first PC I know of additional palettes for the CGA. Now I must disclaim this saying that I had an Amstrad PC1512, which may have been custom-fit to produce unusual features. That said, I saw the following:

  • Graphics mode (640×400) with the CGA 16-colour RGBI palette. I know it had that one because GEMPaint wouldn't work without it (used for drawing colour graphics).
  • 4-colour palette with #0000A8 (blue) instead of #000000 (black) in addition to magenta, cyan and light grey. Some games (like Alley Cat) would flash between this palette and the normal one for alerting the user.
  • 4-colour palette with #A8A800 (yellow) instead of #A85400 (brown) in addition to black, red and green. I saw this at least on one game, Double Dragon, where I remember the game would switch between the normal palette (black, magenta, cyan, light grey) and what I called "Chinese colours".

I repeat, I don't know how much of this is peculiar to the Amstrad PC1512, or could be achieved on any CGA card through hacking. --Shlomital 10:24, 2005 Jun 2 (UTC)

Of these, the 640x200x16 mode is unique to the PC1512, and isn't available on normal CGA. The other two are entirely standard; you get the first one by selecting magenta/cyan/white and then setting the background to blue, and the second by selecting red/green/brown and turning on high intensity --HungryHorace 16:35, 4 Jun 2005 (UTC)
Great! Thanks for educating me on that. I was wondering how much hacking went into the ol' PC1512. --Shlomital 13:15, 2005 Jun 11 (UTC)

I've added a table for the tweaked 3rd palette in the "tweaks" section, but my HTML skills aren't what they used to be and I can't figure out why it is too wide. Can someone fix the table for me? Thanks in advance! Trixter 17:20, 4 January 2006 (UTC)

You added a red-cyan-white palette. What about the red-green-yellow palette mentioned by Shlomital? Another game I know to use this is Legacy of the Ancients. -- 14:57, 1 February 2006 (UTC)
That's the same palette as the red-green-brown, just with the high-intensity bit turned on. It's not a fundamentally different third palette. Trixter 21:11, 2 February 2006 (UTC)

"hex values adapted from MC6845 specification" ??[edit]

This claim seems to be nonsense. The MC6845 produced only the address of the current pixel, it was up to other chips to read that address and actually display the color(s) stored there. If anybody can explain how those hex values were derived, please do so. -- 14:23, 29 July 2005 (UTC)

I agree. The MC6845's data sheet makes no mention of any color values at all --- the color values mentioned are just what you get with the RGBI color model (with the exception of color 6). Let's remove that table if nobody objects. NewRisingSun 21:12, 16 December 2005 (UTC)
Please don't remove the table; the hex values were created by me based on research from how the TTL actually operates. If you'd like to "fix" the rounding then please do so but do not change the basic color mix which is correct. (For example, most people guess at the "brown" and "intensity" colors and always get them wrong.) Trixter 1:12, 23 December 2005 (UTC)
You claim that these values are "adapted from the MC6845 specification", even though the MC6845 specification contains no such information. Now you say it's based on your research on the "TTL", even though the CGA doesn't use a TTL monitor (that one is used by the MDA/HGC card!)... I guess I'll just call the table "Full CGA 16-color palette" and correct the values. NewRisingSun 20:46, 21 December 2005 (UTC)
Actually, both the CGA and MDA use TTL monitors; TTL in this context just specifies which voltage ranges on the pins mean "on" or "off". From the IBM PC tech ref, page 1-158, describing the IBM Color Display: "Horizontal drive: Positive-level, TTL-compatibility, at a frequency of 15.75 kHz." Identical wording is used on page 1-132 to describe the IBM Monochrome Display except that the frequency is 18.432 KHz. HungryHorace 22:45, 21 December 2005 (UTC)
That would make the EGA's Enhanced Color Display (IBM 5154) a TTL monitor as well.... but if you say so. In any case, I'd still like to get a copy of that ominous "MC6845 specification" that supposedly lists the color values.... the 22-page data sheet from Motorola (plus 2 pages revised addendum) certainly has no such information. NewRisingSun 23:42, 21 December 2005 (UTC)
If you want to remove "MC6845 Specification" then go ahead and do so. I wrote that originally as a deterrent to people guessing at the colors, when you can clearly see that the interface is RGBI not just RGB. But please don't alter the values beyond rounding them upward (ie. changing FE into FF) ... like I wrote earlier, most people think that "light red" is "#FF0000" which is blatantly incorrect because they don't take the "I" component into account. The table has the correct color mix. And yes, the 5154 is also a TTL monitor. Trixter 04:56, 22 December 2005 (UTC)

Well, well, this makes it the fourth version of the CGA colour palette values I've seen. Earliest, the numbers were (hex) 00, 54, B0, FE. Then, it was 00, 54, A8, FC. Later, on a previous version of this article, they were 00, 54, A8, FE. And now it's 00, 55, AA, FF, which I first saw as the "Linux console" colour palette for GNOME Terminal. I still don't know which one is correct (I root for 00, 54, A8, FC, if only because screenshots from the DOS window on Windows 9x, as well as from DOSBox, give out those colours), and I don't know even how to find the answer out, but please, if anyone does, bring it here, with evidence that it's the correct one, because all that juggling doesn't make for a reliable source. --Shlomital 17:14, 27 February 2006 (UTC)

The RGBI color model translates the digital TTL bits to analogue voltage factors of 0, 1/3, 2/3, 3/3. On a digital scale from 0 to 255, that translates to 0, 55, AA, FF. Since the VGA's color registers accept only 6 bits, the scale is from 0 to only 63, which would be 0, 15, 2A, 3F. If you simply multiply these by four (or shift by two bits), which is what most programs do, you'll get 0, 54, A8, FC. In other words, the differences are merely rounding errors. NewRisingSun 20:10, 27 February 2006 (UTC)
The original palette (the one mentioned in "adapted from MC6845 specification") was created by me, using the bit shift method, which is why the values were 54, A8, FC (truncation). I have since re-evaluated that position and now stand by the 55, AA, FF (rounding) numbers because they more closely resemble what they are modeling (ie. full TTL white output = full RGB white output). --Trixter 21:55, 27 February 2006 (UTC)
First, thanks for the convincing explanation. So 54 and A8 and FC are the result of algorithmic upsampling from a lower bit-depth, while 55 and AA and FF are the result of intelligent upsampling according to the specification (the 0/3, 1/3, 2/3, 3/3 factors mentioned above). Second, I've now done one even better and prepared a 48-bit PNG image of the palette, with the values 0000, 5555, AAAA, FFFF. It's on my user page. --Shlomital 18:03, 28 February 2006 (UTC)
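In other words, the competing published value sets differ only in whether the 2-bit TTL level is upsampled by truncation (via the VGA's 6-bit DAC) or by exact scaling. The two methods side by side (a Python sketch, function names mine):

```python
def via_vga_shift(level):
    """TTL level (0-3) -> 6-bit VGA DAC value -> 8 bits by shifting left.

    Truncates, which is why full white ends up at 0xFC instead of 0xFF.
    """
    six_bit = level * 0x3F // 3        # 0x00, 0x15, 0x2A, 0x3F
    return six_bit << 2                # 0x00, 0x54, 0xA8, 0xFC

def by_scaling(level):
    """TTL level (0-3) -> 8 bits by exact scaling: 3/3 maps to full 0xFF."""
    return level * 0xFF // 3           # 0x00, 0x55, 0xAA, 0xFF
```

Both are internally consistent; the scaled values simply model the analogue 0, 1/3, 2/3, 3/3 voltage factors without the intermediate 6-bit loss.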

Original name of the card[edit]

The original CGA from IBM was actually called "Color/Graphics Monitor Adapter" (C/GMA) (see this auction for pictures), and the BASICA graphics demos on the DOS 1.x system disks call it that as well. I don't know when exactly the shortened name "CGA" came up (maybe when clones were produced?), but this original name should probably be incorporated in the main article. NewRisingSun 21:12, 16 December 2005 (UTC)

Color 6 on IBM 5153[edit]

It was recently discovered [2] that IBM's original CGA monitor (IBM 5153 Color Display) displays color 6 differently than most compatibles (closer to what you'd expect if you strictly applied the RGBI color model). This should definitely be reflected in the article (as quite a number of games depend on it!); feel free to change my markup to something prettier... :) NewRisingSun 19:50, 17 January 2006 (UTC)

No offense, but this is bullshit. I own a 5153, in fact it is sitting right next to me turned on, and it clearly shows brown. Unless you can come up with something to back up what you wrote, I'm going to revert it. An improperly calibrated monitor can be fiddled with to show yellow if you want it to, but that's not original. I'll give you a week to respond. Trixter 04:26, 18 January 2006 (UTC)
(In fact, I've posted to that very forum challenging the photographer's competence. It's not yellow, it's brown, and the poster in that forum is trying to enact revisionist history.) Trixter 04:41, 18 January 2006 (UTC)
Offense taken, due to your pompous attitude. I'll give *YOU* a week to provide a rebuttal to the photographic evidence I linked to, a rebuttal that is more convincing than "bullshit", "the photographer is incompetent", "poster is engaging in a conspiracy to revise history", "trust me", "it's brown because I say so" and "I got the real colors from the 6845 spec". NewRisingSun 14:40, 18 January 2006 (UTC)
Offense not meant, but I am continually exasperated by people who don't actually own CGA hardware and yet keep trying to tell others what the colors are. My first home machine was a model 5150 with CGA in 1983, and I still have it on the table next to me in 2006 and it still works fine.
Okay, I found proof and posted the link in the forum previously referenced [3]. May I please revert the cga palette table changes? If it will help appease you, I fully admit I BS'd on the MC6845 comments -- but check the link to the info, it's the real deal and proves the color is brown. And yes, I have learned the lesson that shortcuts, however well-intentioned, are inappropriate for wikipedia entries. Trixter 05:50, 19 January 2006 (UTC)
We can both agree that on *most* RGBI monitors, it appears as brown; I have included that in the article. I think we can also agree that it is relevant information that, strictly following the RGBI color model, you'd get dark yellow, and that it requires additional circuitry to turn dark yellow into brown. I have updated the article accordingly. I further maintain that on *some* (possibly pre-1983) monitors, that additional circuitry is missing, so they display dark yellow, and Great Hierophant's photographic evidence confirms this. If you think this is so irrelevant that the dark yellow should not be in the palette table, then put it somewhere else, but put it somewhere, because people should see what the dark yellow looks like.
Also, I definitely need to rewrite the composite section to present more accurate information, using proper NTSC terminology, especially about the artifacting in the 320x200x4 mode. NewRisingSun 16:26, 19 January 2006 (UTC)
I can concede that a proper "dark yellow" should be visible for the curious; I will move it out of the table, but add it to the Bugs and Errata section so it is not lost. Trixter 01:11, 20 January 2006 (UTC)

Fonts and the BIOS[edit]

I think it needs to be made clear that the font used in graphics modes comes from a completely different source than the font used in text modes - hence my use of "separate". The text-mode fonts are in the character ROM on the card itself; this ROM doesn't appear in the PC's address space, so the BIOS contains its own 128-character font that it uses in graphics modes. On an original IBM PC with an original IBM CGA, the two fonts are the same so the effect isn't noticeable; but putting an IBM CGA card in an Amstrad PC2086, which has a completely different BIOS font, soon shows the difference. This is also the reason why changing the font jumper on the card doesn't change the font used in graphics modes. HungryHorace 10:33, 3 February 2006 (UTC)

I always got the first 128 characters from F000:FA6E (BIOS of course), and the second 128 from where interrupt 1F pointed to (which on my machine eventually leads to C000:2D06). The second location is clearly the CGA, so I'm confused... Trixter 00:22, 6 February 2006 (UTC)
C000:2D06 is not the CGA, but your EGA/VGA. The CGA doesn't come with its own BIOS at C000 ;)NewRisingSun 00:28, 6 February 2006 (UTC)
You are correct, I just (5 minutes ago, damn you're quick) ran it on my EGA/VGA. Let me run the same program on my 5150... --Trixter 00:31, 6 February 2006 (UTC)
Well, that was educational. F000:FA6E is indeed the first 128 chars, and Int 1F points to F000:0000 which (on the 5150/CGA) contains garbage. So I stand corrected, and I'll alter the text in the article to reflect HungryHorace's concerns.--Trixter 02:33, 6 February 2006 (UTC)
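So on a 5150 with CGA, a graphics-mode glyph lookup amounts to the following (a Python sketch using linear-address arithmetic, 8 bytes per 8x8 glyph; names are mine):

```python
BIOS_FONT = 0xF000 * 16 + 0xFA6E   # linear address of F000:FA6E

def glyph_address(char_code, int1f_base):
    """Linear address of an 8x8 glyph as used in CGA graphics modes.

    Characters 0-127 come from the fixed BIOS font at F000:FA6E;
    characters 128-255 come from wherever the INT 1Fh vector points
    (int1f_base), which is how custom upper-half fonts are installed.
    """
    if char_code < 128:
        return BIOS_FONT + char_code * 8
    return int1f_base + (char_code - 128) * 8
```

On the stock 5150/CGA described above, the INT 1Fh vector points at F000:0000, so the upper 128 characters render as garbage until a program installs a font and repoints the vector.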


It would be great if some knowledgeable person could add the pinout of the CGA video connector. A drawing of the physical connector can be taken from the EGA article. -- 16:38, 3 July 2006 (UTC)


I made some larger changes to the article, and in order to avoid any needless reverting, I temporarily put them under User:NewRisingSun/cga.

  • Rewrote and expanded the "composite" section. I also plan to write a dedicated "Artifact color" article, discussing the matter from a more general technical point of view; after all, the technique is basically the same from CGA to Apple II to Atari to Coco.
  • Moved the second and third paragraphs to the end of the article, since the CGA's market penetration goes better along with the mentioning of its competitors.

Please review the changes and discuss. The picture markup is probably suboptimal as well. I'm also a little bemused by the constant past tense of the article; while it is appropriate to describe when it "was" sold, the inner workings of the CGA are still the same. ;) --NewRisingSun

Considering I'm still developing for it as a hobby, I agree :-)
I have no suggestions whatsoever -- as always, you did a fine, thorough job and I am especially looking forward to the composite artifacting article. One of these days I'll re-convert 8088 Corruption to take advantage of composite artifacting. --Trixter 21:49, 2 September 2006 (UTC)
Ok. I have copied over the changes into the article. With regards to my claim "making for a total gamut of well over a hundred colors", I need to calculate how many these are exactly. It should be quite a few though, mostly because of the sixteen different possible settings of color 0. NewRisingSun 23:41, 2 September 2006 (UTC)
When writing how many colors are available, see if you can produce two figures: One for the total number of colors available using all palettes, and another number for how many are distinctly visible (and/or addressable?) on a single screen. I'm assuming the latter will be 16, but I'm hoping it's otherwise...--Trixter 21:41, 3 September 2006 (UTC)

Two details[edit]

I changed the rather vague "this could be changed" about the black and white in hi-res monochrome mode to this text:

By default the colors were black and white, but the foreground color (white) could be changed to any other color of the CGA palette. This could be done at runtime without refreshing the screen.

I know for certain that this is possible; there is a game [4] that allowed the player to change the foreground color with a keystroke. What I don't know is whether it was possible to change the background color too (as could be done in lo-res). I don't think so, since I've never seen it done (though the background color would often display incorrectly on a VGA display).

Correct, on CGA only the foreground color can be changed. It's the same port write for all three modes, actually: Writing to color select register changes the foreground color in 640x200; changes the background (ie. color index 0) in 320x200; and changes the overscan/border color in text modes. --Trixter 18:05, 15 March 2007 (UTC)
Thanks for clearing that up. Maybe you could put this into the article? -- 16:19, 16 March 2007 (UTC)
It's already mentioned in the section "Further RGB graphics modes and tweaks". --Trixter 20:58, 16 March 2007 (UTC)
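The single port Trixter mentions is the CGA colour-select register at I/O address 0x3D9. A sketch of its layout as I understand it from the CGA tech ref (the helper name is mine):

```python
def color_select_value(color, palette=0, intensity=0):
    """Build a value for the CGA color-select register (port 0x3D9).

    Bits 0-3: color index -- interpreted as the foreground in 640x200,
              the background (color index 0) in 320x200, and the
              border/overscan color in text modes.
    Bit 4:    intensified graphics palette.
    Bit 5:    palette select (0 = red/green/brown, 1 = cyan/magenta/white).
    """
    return (color & 0x0F) | ((intensity & 1) << 4) | ((palette & 1) << 5)
```

Rewriting the low nibble at runtime is what recolours the 640x200 foreground instantly, as in the keystroke trick described above.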

Another thing that bugs me is the following sentence in the paragraph about text mode:

This mode allowed each character a foreground and a background color, both of which could be freely chosen from the entire CGA palette (see table)—e.g. red on yellow text for one character, white on black for the next and cyan on gray for yet another.

Is this really true? As far as I remember, you can't set the background to a bright color in VGA text mode. If you do, the character will blink instead. Somehow it seems unlikely to me that this is a feature that was introduced later, and I'd expect the CGA to behave the same way. -- 20:17, 12 March 2007 (UTC)

You can turn off the blinking, both on CGA cards and on VGA cards, but you have to write to a different port. Calvero2 20:17, 13 March 2007 (UTC)
Was this a documented or an undocumented feature? -- 16:19, 16 March 2007 (UTC)
Fully documented, as far back as the CGA tech ref :-) --Trixter 20:58, 16 March 2007 (UTC)
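For reference, the documented switch is bit 5 of the mode-control register at port 0x3D8: when set, bit 7 of each attribute byte means blink; when clear, it extends the background to the full 16-colour range. Decoding an attribute byte accordingly (a Python sketch, name mine):

```python
def decode_attribute(attr, blink_enabled=True):
    """Decode a CGA text-mode attribute byte.

    Bits 0-3: foreground color (0-15).
    Bits 4-6: background color (0-7).
    Bit 7:    blink, if blink is enabled via mode-control port 0x3D8
              bit 5; otherwise a bright-background bit, giving
              backgrounds 0-15.
    """
    fg = attr & 0x0F
    if blink_enabled:
        bg = (attr >> 4) & 0x07
        blink = bool(attr & 0x80)
    else:
        bg = (attr >> 4) & 0x0F
        blink = False
    return fg, bg, blink
```

So the same byte 0x9F is "blinking white on blue" with blink enabled, but "white on light blue" with it disabled.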

Technical comparison[edit]

How did the CGA compare on a technical level to other graphics adapters, computers, consoles, etc. sold in its time (1981 - 1987)? Shinobu 18:08, 2 September 2007 (UTC)

Medium/average/mediocre, as far as I can tell; there were many that were worse, and many that were better, and some that were sort-of the same, but with their own odd limitations. The disappointing thing is that it achieved this mediocrity whilst being an expensive add-in card for an already expensive system, when you could sacrifice some expandability and CPU speed in order to have much improved graphic and sound capability for a lower price in, say, a C64 (which actually had a similar amount of available graphics memory, but used it in a much more intelligent and flexible manner). Arguably even a Sinclair Spectrum could produce better results in some situations, despite having far less video RAM and having to resort to the hybrid text/graphics colour attribute mode; though the effective colour resolution was much lower and tricky to work with, it enabled more colours on screen at once without sacrificing the potential feature detail resolution.
It's also a pity that it ended up being sort-of the culprit for some of the crippled features of EGA, for compatibility and ease-of-programming's sake - it would have been a far nicer and more popular standard if it had been able to produce 16-from-64 colours in the two lower-rez modes (instead of the fixed CGA 16) as well as, paradoxically, the highest. —Preceding unsigned comment added by (talk) 17:10, 30 March 2008 (UTC)
Can you describe the differences between the CGA cards of the mid-1980s, and the 8563 graphics chip on the Commodore 128? Are they completely the same? Were there any foundries making RGBI chips with more than two bits of intensity? Like 3 or 4 bits of intensity? (talk) 02:03, 26 January 2009 (UTC)
As far as I can tell, there's no such thing as RGBI, except as an informal term for the 5153 4-bit interface. What would it mean to have more bits of this 4-bit concept? Dicklyon (talk) 02:22, 26 January 2009 (UTC)


I feel that the first two images, Image:Arachne CGA Mode.svg and Image:Cosmos bipinnatus0 CGA.png are very bad choices. They were both artificially created much later than the CGA era (and apparently with non-CGA hardware) and they are not representative of the actual use to which the Color Graphics Adapter was put in its time. One is an image of a browser that didn't exist in the CGA era and the other is a down-conversion from a better image. They're both nice enough pictures alright, but they don't inform, they mislead, because they're not depicting the past, but a latter-day reimagination of the past. The flower picture is also misleading because of the way ImageMagick (part of the Wikipedia server software) generates its thumbnails -- ImageMagick makes that picture thumbnail look much better than the exact picture would or could ever have looked on a CGA monitor. I suggest that these images be replaced by more representative pictures. Moving up the Alley Cat image later on in the article would be an excellent choice. That game was well-known and widespread during the CGA era. 20:26, 11 September 2007 (UTC)

I agree wholeheartedly, and removed the dithered image. As for Arachne, all 640x200 images will probably have to be stretched to 640x400 in order to have the proper aspect ratio for viewing (ie. representing what they looked like on a real CGA monitor), but I agree that there are better choices than Arachne. I could offer up my Adlib Visual Composer screenshot, if anyone wants to use it (ie. I give my permission to use it). It's located here: --Trixter 04:36, 24 September 2007 (UTC)
The problem with the Adlib Visual Composer image is that it's non-free. The Arachne image is not only a free software screenshot, but it's also accurate, as it's in a mode that CGA hardware could handle --wL<speak·check> 17:05, 16 November 2007 (UTC)
Wait, before we make any assumptions in that particular regard - did Trixter mean, by "my Adlib VC screenshot", that they made the program, hold the rights and have waived them for this use, or only that they made the screenshot? (Which, I thought, was allowed? As it's not a copy of this (long "dead") software itself, but merely an illustrative shot of its main screen, which seems to be covered by Fair Use in most other cases (re: any other entry for a particular piece of software or operating system).) (talk) 17:21, 30 March 2008 (UTC)

I have put the image back in.

Surely there's some way of getting a shot of a program that works more intelligently with the available display capability, e.g. reducing the text size (12pt Arial is appropriate in order to be visible on an XGA screen, a smaller one will easily suffice with limited rez). Though I never had a CGA PC, I did plenty of "serious" work (word processing, spreadsheets/databasing, route planning, midi/tracker music composition) on an Atari ST in my youth, using its (maximum for colour screen) "Medium Resolution", which, excepting the additional colour depth (2 bit / 4 colours) that was rarely used for anything significant (and its better composite encoding - no strange colour artifacting), had exactly the same capability - and programs written for it rarely felt cramped or as "chunky" and over-zoomed as that Arachne screenshot does. Most shots of non-DTP Hi-Rez (640x400) programs I saw seemed only to use the extra lines to make the text smoother and double-height rather than fitting in a great deal more detail... I don't see any reason a 640x200 CGA-targeted app should be any different. (talk) 17:18, 30 March 2008 (UTC)
If the image is for illustrative purposes, it shouldn't matter if it is free. However, I'm going to shelve that argument and point out that the Arachne screenshot is now horribly stretched and fuzzy. 640x200 CGA images look "proper" if you simply double the number of lines. The slight aspect ratio difference is barely noticeable, but your 640x480 stretch misrepresents the graphics mode by making it look like 640x200 had more than two colors. Can you please fix the image?
Why do I get the feeling that the people posting inappropriate/mis-representative shots never actually used CGA? --Trixter (talk) 05:22, 17 November 2007 (UTC)

Removed FS dithered image because it is misrepresentative of CGA. CGA had many different palettes, and the dithering in the image further confuses the adapter's abilities.

  • It is not misrepresentative of CGA. The colour values are correct, as is the resolution. I have tried correcting it for aspect ratio, but it is too close to 1:1, and thus corrections tend to lead to Moiré effects.
  • CGA did have different palettes, but that is not an argument for removing this image. Perhaps it can be an argument for adding more images. But this palette was used the most, because otherwise you couldn't have both black and white.
  • The dithering exemplifies that you had to dither to create gradients, since only a limited number of colours were available.

Regarding the comments posted above in this section:

  • "They were both artificially created" - all computer images are.
  • "much later than the CGA era" - good luck finding free (as in freedom) images from that era. Also, I don't think it matters. These images show what CGA is / isn't capable of.
  • "(and apparently with non-CGA hardware)" - feel free to recreate them with CGA hardware, but it's pointless, since the resulting bits will be the same (possibly modulo the palette order of the PNG, but even on CGA hardware this is not guaranteed to be the same as that of the CGA palette).
  • "and they are not representative of the actual use to which the Color Graphics Adapter was put in its time." - I have seen dithered CGA images. And my note about palette choice above also applies here.
  • "One is an image of a browser that didn't exist in the CGA era" - see the remark about free images above
  • "and the other is a down-conversion from a better image." - ditto, plus dithering from an original with more colours or higher resolution was done in the CGA era. And even if it weren't, I don't think that would matter.
  • "They're both nice enough pictures alright," - thanks, I guess.
  • "but they don't inform, they mislead," - no they don't, see comments above.
  • "because they're not depicting the past, but a latter-day reimagination of the past." - again, see comments above. You're repeating the same statements over and over again, but the argument remains weak.
  • "The flower picture is also misleading because of the way ImageMagick (part of the Wikipedia server software) generates its thumbnails -- ImageMagick makes that picture thumbnail look much better than the exact picture would or could ever have looked on a CGA monitor." - not any better than a CGA monitor would have looked from a distance. Thumbnails are just that.
  • "I suggest that these images be replaced by more representative pictures." - good luck, but they will have to be dithered photographic images, because otherwise you are underrepresenting what can be done with CGA.
  • "Moving up the Alley Cat image later on in the article would be an excellent choice. That game was well-known and widespread during the CGA era." - it would be a horrible choice, because copyright on Alley Cat has not expired yet and because the lack of dithering misrepresents the capabilities of CGA hardware.

Shinobu 01:41, 10 November 2007 (UTC)

None of your arguments addresses the fact that a dithered image misrepresents the capabilities of the card because the card itself did not dither. Yes, all of us who used CGA routinely dithered images to make viewing them better, but this is an article that has to represent the card and what it could do, and a 320x200 dithered image is extremely misleading. Anyone not familiar with the card would look at that and think that CGA was capable of more than 4 colors onscreen at a time in 320x200 graphics mode.
Dithering the image only serves to show the image as accurately as possible, not the card's capabilities as accurately as possible. If you can't come up with a less misleading image, I will remove it (again). --Trixter 05:12, 16 November 2007 (UTC)

Why are the images stretched like they are? It looks awful. SharkD (talk) 03:05, 23 March 2009 (UTC)

...And a reason for not using this image that seems to have been missed by everyone: Doesn't the "C" in "CGA" stand for "color"? So why put a screen capture of black and white graphics at the top of the article?--Drvanthorp (talk) 21:31, 18 June 2011 (UTC)

  • SharkD: That is what CGA often looked like. When using 640x200 on a standard 4:3 aspect screen, you end up with very tall pixels. Software that doesn't take this aspect into account (assuming square pixels) comes out stretched like that.
  • Drvanthorp: Yes, CGA features color. But its maximum resolution is 640x200 monochrome. This is the resolution that was typically used for non-game software, because you can render 80 columns of text with it.
Shamino (talk) 18:51, 11 February 2014 (UTC)

OK. Some anonymous user just reverted the Fractint image back to the Arachne image with the non-CGA aspect ratios. If you want to use Arachne, then please use the one stretched to 640x480. CGA does not use a 1:1 pixel-aspect ratio, it is 1:2.4. Any CGA screen shot that isn't scaled to this aspect, by definition, is not presenting what CGA actually looks like. Shamino (talk) 14:14, 9 August 2014 (UTC)

Done, an image with a more correct ratio is there. 4throck (talk) 18:09, 11 August 2014 (UTC)

I was the uploader of the Fractint image and placed it in the article. In my opinion this image accurately reflects both what CGA graphics "felt like" and usefully represents the technical limitations of the standard.
I can't support using the Arachne image - it is utterly anachronistic to show a screen shot of a web browser running in CGA. Surely it is common sense to show an application that actually existed in the CGA era. CGA came out in 1981. Arachne was released in 1996. Hopefully everyone working on this article has enough knowledge of computing history to understand that quite a lot changed in those fifteen years - not least in the world of computer graphics.
Anyway, I think the four-colour graphics modes are more representative of the unique characteristics of CGA. The Fractint image has exactly the right "feel" to evoke the CGA days, and also correctly represents the technical capabilities - big pixels in exactly two choices of four-colour palette. I think a screenshot of a game or similar would also evoke that mood and represent the capabilities well, but the advantage of the Fractint image is that it is unambiguously public domain.
Thparkth (talk) 00:20, 23 August 2014 (UTC)

Moved both the Fractint and Arachne images to the gallery under "Capabilities". I think it works as a better overview of what CGA was capable of at the time and also using newer software. 4throck (talk) 11:19, 23 August 2014 (UTC)


I emailed the author at and asked permission to use the Digger animated graphic seen on their webpage:

It is a perfect example of CGA in action, using a popular game from the period.

Kreline (talk) 06:12, 11 February 2008 (UTC)

Agreed! That would definitely be a better top-of-page image than the Arachne screenshot (an animation on top of the page might be a tad annoying, but the preview could just be a single frame, while the animation could be viewable after clicking it). (talk) 15:02, 16 May 2008 (UTC)

I went ahead and added it. If the animation is found to be annoying, it would be easy enough to edit the image to preserve only the first frame. --Kreline (talk) 04:09, 21 September 2008 (UTC)

ICON: Quest For The Ring[edit]

The story of CGA is not complete without including a mention of this game.,10/section,40/,10/section,41/,67846/

This game took the "tweaked text mode" technique to a new height. It used many other ASCII characters that were available, not just the "left block" and "right block" characters. Using drawing techniques of ASCII art, an effective resolution of 320x200 was created.

The 320 comes from 40 columns of text characters, at 8 pixels per character. 80-column mode was not used, because it was too slow, and because of the well-known "snow" bug of CGA's 80-column mode.

The 200 comes from 100 rows of text characters, using only the top 2 pixels of each character, just as in 160x200 mode.

The graphics have a distinctive look, as the available shapes were limited to what happened to be in the top 2 rows of pixels for each of the 256 text characters in the CGA's ROM. For example, the top of the "2" character produced a distinctive shape that looked like a musical slur, useful for drawing graphics that looked like waves in water.

Since this "320x200" mode was a text mode, it could use all 16 colors. This gave the graphics an appearance that could, at first glance, equal the 16-color 320x200 resolution of the EGA.

Perhaps the best example of this can be seen in the title screen of the ICON game:,695/

Kreline (talk) 06:12, 11 February 2008 (UTC)

You wrote "This game took the 160x200 technique to a new height." You're confusing the 160x100 "tweaked" text mode with the 160x200 composite mode. ICON used the tweaked text mode, but instead of tweaking the 80x25 text mode (like e.g. Moon Bugs did), it tweaked the 40x25 text mode, but achieved a higher apparent resolution than would be possible with ordinary tweaking (w/ the ASCII 221/222 half-block characters) by using ASCII art. (talk) 21:26, 14 June 2008 (UTC)

Thanks for the feedback. I changed the phrase "160x200 technique" to simply read "tweaked text mode", to avoid this confusion. The reader can read the rest of the article if they're interested in learning more about the resolutions of the various tweaked text modes.

FYI, the game did in fact achieve a vertical resolution of 200, by using the top 2 pixels of each of 100 rows of text characters. The major drawback was that not all the pixels were individually addressable. This gave ICON its distinctive appearance, as the programmers were limited to shapes found in the top two rows of each of the text characters in the BIOS of the CGA.

--Kreline (talk) 22:03, 11 August 2008 (UTC)

And, interestingly, it's doing some things I came to this page to ask whether anyone ever attempted - i.e., something akin to 160x100 (or 80x200...) mode but taking advantage of a limited number of other already sort-of-graphic characters in the 437 codepage, i.e. the 25/50/75% shade blocks, the line/window-drawing characters, etc. Thus you might have otherwise slightly chunky, blockmode-like graphics, but with a number of readymade dithers to show a greater number of apparent colours (maybe as many as 128 just with 50%, 256+ with 25/75%?), and a few thinner lines and patterns that can be thrown in here and there. Clever :) — Preceding unsigned comment added by (talk) 13:48, 6 June 2014 (UTC)


This is a forum thread, but maybe this info, particularly the info at post number 4 and 5, could be incorporated into the article? (talk) 08:42, 22 June 2008 (UTC)

Ambiguous/misleading formula for converting to RGB?[edit]

In the RGBI section, is the formula to convert to RGB floating point values supposed to be in C? If so, shouldn't it read something like:

red   = 2.0/3.0*(colorNumber & 4? 1:0) + 1.0/3.0*(colorNumber & 8? 1:0);
green = 2.0/3.0*(colorNumber & 2? 1:0) + 1.0/3.0*(colorNumber & 8? 1:0);
blue  = 2.0/3.0*(colorNumber & 1? 1:0) + 1.0/3.0*(colorNumber & 8? 1:0);

Of course it is understandable the way it is, but I think it's a little bit better if it can be correctly executed. Samuel Lowry (talk) 17:08, 6 November 2008 (UTC)

The picture "KL CGA Unknown.jpg" is not a CGA card[edit]

A. Video memory: The Toshiba 2016 chips are 2kB each and thus there are only 4kB on the board. The other chips are logic only. 4kB are not enough for a Hercules card so this should be an MDA clone (80x25x2 = 4000).

B. Clock: There is a 16MHz crystal on the board. CGA uses the 14.3xx MHz provided by the ISA bus as pixel clock and thus needs none of its own. MDA/Hercules adapters use 16MHz crystals. —Preceding unsigned comment added by (talk) 18:09, 2 July 2009 (UTC)

Well spotted. This card also lacks the composite connector. —Ruud 18:03, 11 August 2009 (UTC)

Blinking needed to be switched off for "tweaked text" 160x100 16 colour graphics mode[edit]

Maybe this information (in the yellow box) should go into the article? (talk) 22:20, 19 June 2011 (UTC)

PAL in Europe? Or just RGB? Or NTSC?[edit]

Did IBM-made CGA cards sold in Europe output PAL? Or NTSC? Or neither? Please add. -- (talk) 11:45, 14 May 2012 (UTC)

There was no such thing as a CGA with PAL output, even in the European market. As far as I know CGA cards sold in PAL countries were identical to those sold elsewhere - you just couldn't take advantage of the composite output, so in effect you were limited to RGB. -- (talk) 13:36, 4 June 2012 (UTC)

x86 is little endian[edit]

Each byte of video memory maps to consecutive on-screen pixels, but within a byte the most significant bits hold the leftmost pixel; bytes at increasing addresses appear left to right, so the pixel layout is fixed by the CGA itself rather than by the x86's little-endian byte order.

Micro Plato tweaked graphics mode[edit]

I remember, back from my college days in 1988, that the Micro Plato software we used (an MS-DOS port of PLATO running courseware from floppy disk) generated a monochrome graphics mode with more than 200 lines on a standard CGA adapter. Some of our monitors could not display this resolution at all. Those that could were either NEC Multisync displays or needed us to twiddle the hold knobs in order to create a stable image. We typically also needed to adjust the size knobs to make the image wider and shorter in order to fit on-screen.

Is anyone familiar with this software and what it was actually doing to generate this unique graphics mode? Are there any CGA wizards who might have a good theory? Shamino (talk) 17:39, 10 February 2014 (UTC)

Direct register manipulation of some kind, to allow use of the normally blank lines that sit between the active area and the actual blanking/Vsync area, much like the overscan modes that other contemporary computers often allowed? Difference being here that the amount of video memory is strictly limited, and there's very little spare (enough for maybe 210 lines vs the usual 200), so whether it's somehow re-loading and re-using parts of it on the fly, or slightly reducing the horizontal rez in order to use more vertical, or even doing very fancy tricks to reduce the colour depth in low-rez and then interlacing (thus getting, say, 320x420i) instead of rendering in progressive mode (with 640x200, or 320x200 in more colours)... who knows. It might even have been a trick text-based one, or switching between text and graphics mode partway...
(the leftover memory would be enough for at least 4 lines of text in 40-column mode, with hi-res mono or low-res colour graphics above it... IE 640x200 mono graphics, plus 4 lines of 40 column colour or 80 column mono text, which is the equivalent of 232 lines total, coming close to the maximum that an NTSC screen could show; reduce vertical graphics height to 192 lines, and you've enough for another 12 lines of text, meaning a total of 288 active lines... more than NTSC can handle, and thus probably requiring it flip into 50Hz PAL mode which would just about handle that. The only problem is that, even though the horizontal line frequency is essentially the same, both your card and monitor would need to support 50Hz vertical scan. 232 would already be noticeably higher rez than 200, as it's an extra 16%... maybe it split the difference, showed 196 graphic lines, and 8 extra lines of text, for an edge-of-the-envelope 260 active lines per field (half-frame)... which doesn't leave much for Vsync even if you abuse some of the variability allowable in the system and take total lines out to, say, 540 instead of 525... So maybe even 198 graphic lines, 6 lines of text, and 246 total, which is about the maximum you can realistically see on an NTSC CRT. Though of course, it could do 192 lines of all-points-addressable graphics, then use the remaining text blocks at less than their normal height. 192 normal lines, 4 full lines of text, and 8 at quarter-height (same as the trick 160x100 mode) makes 240 altogether.)
So yeah there's ways and means... whether the registers actually allowed for that is another matter. (talk) 13:44, 6 June 2014 (UTC)

Misuse in arcade game terminology[edit]

It would seem that when arcade hardware buffs use the term "CGA", what they mean is 15KHz analog RGB, not digital IRGB (which is what CGA really is); needless to say, the two types of signal are not compatible. This usage has become widespread enough on the web to mislead and confuse (just run a web search for "CGA monitor" or "CGA converter" to see what I mean). I wonder if it's just plain ignorance or perhaps there is some sort of reasoning behind this.

At any rate, due to the popularity of this wrong usage of the term, perhaps a short clarification should be appended to the article, to alert readers to the distinction. (talk) 23:52, 28 April 2014 (UTC)

It's probably because yer typical CGA monitor was a multistandard (although not multifrequency) device and had TTL, analogue RGB and composite inputs. Certainly any I've seen which don't use a captive flylead have these separate sockets, although the shape and pin count of the RGB ones does vary. So if they go buy a CGA monitor and don't get a less common digital-only model, they can plug it into an analogue or digital board, so long as it outputs at approx 15.7kHz line frequency. (talk) 13:30, 6 June 2014 (UTC)

Palette sort order[edit]

I reverted the edit that sorted the 16-color palette; all other palettes on the page are unsorted, and viewing the 16-color palette unsorted illustrates the relationship the intensity pin had on color generation. Additionally, most pages that describe 8-bit color modes also show their modes unsorted. Trixter (talk) 17:04, 30 June 2014 (UTC)