Talk:MOS Technology 8568


Merge into 8563 article?

Since almost all the information in this article is the same as MOS Technology 8563, we might be better off merging them and just noting the differences between the chips in the common article. If so, the title should probably be "MOS Technology VDC", which I recently reverted to "8563" before thinking things through and coming up with this proposal. *sigh* --Wernher 04:29, 14 December 2005 (UTC)

Needs merging

I second this... who wants to volunteer? :-) cbmeeks 20:55, 27 November 2017 (UTC)

The image File:Ultra Hi-Res Cube Demo.gif is used in this article under a claim of fair use, but it does not have an adequate explanation for why it meets the requirements for such images when used here. In particular, for each page the image is used on, it must have an explanation linking to that page which explains why it needs to be used on that page. Please check

  • That there is a non-free use rationale on the image's description page for the use in this article.
  • That this article is linked to from the image description page.

This is an automated notice by FairuseBot. For assistance on the image use policy, see Wikipedia:Media copyright questions. --19:43, 8 January 2009 (UTC)

Significant misunderstanding of (and in) source material...?

(Will probably copy this over to the 8563 article as well because they're so similar and likely to get merged)

OK, the claim of 720x700 seemed extremely suspect, as regular 15 kHz displays just don't have enough scanlines for that even in interlace, and there was no mention of multisync or the like. Even though the controller looks as if it could be reprogrammed to give different line frequencies and line counts, your pixels per line would still be limited, as the controller was most likely driven by a single dot clock of around 14 to 16 MHz. With a full 16 MHz you might get away with MDA timings, or even something a bit like EGA, and interlace those in software to actually reach some of the stated resolutions, but that would be your limit; the others mentioned in the article, such as the "Atari ST standard" 640x400, would be out of reach, because that mode requires a 32 MHz dot clock, or pixel rates somewhere in between. And was the chip even made to interlace natively, when that would be no use for the text mode it was mainly employed for? (And if the clock was actually 14 MHz instead, as used by CGA, you would maybe scrape 640 pixels per line on an MDA and certainly not get that many on EGA.)
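
As a rough sanity check, here's the back-of-the-envelope arithmetic in Python (a sketch only: the 15.75 kHz line rate, 60 Hz field rate, 16 MHz dot clock, ~20% blanking overhead, and the ST-like raster totals are my assumptions, not datasheet figures):

  # Back-of-the-envelope raster timing check; all figures are assumptions.
  LINE_RATE_HZ = 15_750        # ~15.75 kHz, standard NTSC-class line rate
  FIELD_RATE_HZ = 60           # 60 Hz field rate
  DOT_CLOCK_HZ = 16_000_000    # assumed single ~16 MHz pixel clock
  BLANKING = 0.20              # assume ~20% of each line/frame lost to blanking

  # Vertical: scanlines per field, visible lines, then the interlaced total.
  lines_per_field = LINE_RATE_HZ / FIELD_RATE_HZ       # 262.5
  visible_lines = lines_per_field * (1 - BLANKING)     # ~210 per field
  print(2 * visible_lines)                             # ~420 interlaced: nowhere near 700

  # Horizontal: dot clocks per line, then visible pixels.
  clocks_per_line = DOT_CLOCK_HZ / LINE_RATE_HZ        # ~1016
  print(clocks_per_line * (1 - BLANKING))              # ~813: so 720 wide is feasible

  # Atari-ST-style 640x400 non-interlaced at ~71 Hz (ST-like raster totals):
  print(896 * 501 * 71.2 / 1e6)                        # ~32 MHz dot clock required

So 720 horizontal pixels are plausible at 16 MHz, but 700 lines are not, and 640x400 non-interlaced wants roughly double the assumed dot clock.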

Additionally, the attached article mentions 65,000 colours and suchlike, with attached images that look decidedly more like the output of an analogue system than of dithered/flickered RGBI, and you're just not doing that with the C128's standard hardware.
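
For scale, the colour arithmetic is plain powers of two (nothing chip-specific; the 4-bit figure is just what a digital RGBI signal implies):

  # Digital RGBI carries 4 bits per pixel vs. ~16 bits for the claimed palette.
  print(2 ** 4)     # 16 colours from RGBI
  print(2 ** 16)    # 65536 -- about what "65,000 colours" implies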

But, most crucially, one of the scanned articles that the linked source relies on shows and describes a HARDWARE graphics booster that provides those expanded capabilities. It's essentially an add-in graphics card: either it's overclocking something, or adding its own RAMDAC plus faster memory and pixel clock circuitry, or a bit of both. I wouldn't be entirely surprised to find it has some relation to the Commodore C900 workstation computer, FWIW, as that machine apparently used a version of the C128's chip, but running at up to 640x400 (and probably non-interlaced in that application, thus presumably at a higher frequency than in the C128, and with at least 32 if not 64 KB of memory, as having to interlace to achieve that relatively "low" resolution would be unlikely to be accepted in a workstation). The underlying hardware may well be good for that level of performance, but locked down to more modest output by the rest of the C128 circuitry, needing the booster to achieve its full potential.
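
On the memory point, the framebuffer arithmetic is straightforward (a sketch assuming a packed 1-bit-per-pixel bitmap; the 16 KB figure is the video RAM fitted to the original flat C128):

  # Packed monochrome framebuffer sizes vs. available video RAM.
  def bitmap_bytes(width, height, bits_per_pixel=1):
      return width * height * bits_per_pixel // 8

  print(bitmap_bytes(640, 200))   # 16000 bytes -- just fits in 16 KB
  print(bitmap_bytes(640, 400))   # 32000 bytes -- needs 32 KB, or 64 KB with headroom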

Anyway, in that case the booster is its own thing and has nothing to do with either the 8568 or the 8563, any more than other add-on graphics enhancements have to do with the native graphics hardware of the machines they might be found in.

Is there any way of properly verifying what the chip could do *by itself*, within the computer as sold, without any add-ins, and maybe only with simple programming tricks that could be employed by the everyday user at their own risk of issuing a killer poke? 92.10.68.40 (talk) 11:53, 3 March 2023 (UTC)