Talk:Motorola 68020

WikiProject Computing (Rated Start-class, High-importance)
This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as Start-Class on the project's quality scale and as High-importance on the project's importance scale.


It may be worth comparing the transition from 16-bit to 32-bit APIs and programming models in the Motorola/Macintosh world versus the Intel/Windows world.

In the Intel/Windows world, the transition entailed the creation of all-new 32-bit versions of the Windows APIs, with a new "flat" memory model in place of the old "segmented" memory model. This transition took most of the 1990s to happen.

In the Motorola/Macintosh world, on the other hand, since the original 68000 was designed essentially as a cut-down 32-bit processor to begin with, the transition to the 32-bit 68020 and later processors was really just a matter of filling in gaps. Addresses were always 32 bits, although since programmers knew the 68000 ignored the top 8 address bits, they had got into the habit of storing other information there; there was therefore a transition as programmers had to rid themselves of this habit and make their software "32-bit clean". But this did not involve the creation of any major new APIs, let alone a new memory model, and 32-bit-clean software could continue to run on older machines. The transition was essentially complete by 1993.

Ldo 10:16, 12 Sep 2003 (UTC)

The problem with talking about "bits" with a processor is: which bit width are you talking about? The data (ALU) register set? The index or pointer registers? The address bus width? The data bus width? The 68000 and 68010 were 32/32/24/16 in that regard. The 68012 was 32/32/31/16, the 68008 was 32/32/20/8, the 68EC020 was 32/32/24/32 and the 68020 was 32/32/32/32. Confusing, eh?
If you're going to talk about the API, then you're mainly talking about passing stuff via the data and address registers, since parameter passing in functions is very register-dependent (unless you want to talk about passing non-native data formats via the stack). I haven't seen too many guides out there that discuss this evolution from the old 16/8 processors to the 32/16 ones. Most of the ones I have seen are very 8086 -> 80286 -> 80386 centric. So, you need to be careful about doing original research. Dinjiin (talk) —Preceding undated comment added 04:22, 25 October 2010 (UTC).

In which year did the 68020 appear?

PS: There was no 16-bit API on the Macs, AFAIK.

-- 17:27, 6 September 2005 (UTC)

I believe it was 1984 - agrees. Mdwh 02:21, 16 January 2006 (UTC)
In what year did development of the 68020 start? I've been unable to find any references that describe the timeline leading up to the 68020's release. Dinjiin (talk)

This page says it was introduced in 1982. So what's correct? —Preceding unsigned comment added by (talk) 15:32, 9 January 2009 (UTC)

Motorola's own museum states that 1984 was the release year for the 68020. Dinjiin (talk)

The 68k series was never 16-bit. The 68000 was a 32-bit processor, as its GPRs were 32 bits wide. Hence any talk of a 16-bit transition is nonsensical. Wayne Hardman 23:36, 27 January 2006 (UTC)

The part about the 68000's problems with virtualization hardly seems relevant; is there really a reason for it? Zorbeltuss (talk) 01:24, 9 December 2009 (UTC)

The problems with virtualization on the 68000 led to instruction set changes on the 68010 and 68020, and VM support was a selling point for those processors, so yes, it is relevant. --Brouhaha (talk) 05:54, 21 August 2010 (UTC)