Talk:18-bit

WikiProject Computing (Rated Start-class, Mid-importance)
This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as Start-Class on the project's quality scale and as Mid-importance on the project's importance scale.

Size of char/byte in these systems (first UNIX)

18-bit is the word size. "Two bytes + 2 kiddies"[1] is probably meant just as a size comparison, but a byte could actually be 18/2 = 9 bits in this scheme, or 18/3 = 6 bits. As the PDP-7 was an 18-bit machine and the first Unix machine, I wonder what they did. I think Unix has always used at least 7-bit ASCII from the C-era UNIX onward, but the first version was pre-C assembly. Yes, C99 requires a byte to be at least 8 bits, but was a 6-bit byte allowed at some point? comp.arch (talk) 12:00, 25 July 2014 (UTC)
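For illustration only (not sourced from the article or the references): a minimal C sketch, assuming a hypothetical 6-bit character code, of how three such codes could be packed into and unpacked from one 18-bit word held in a wider host integer; it also prints CHAR_BIT to show the lower bound standard C guarantees on a modern host.

 #include <stdio.h>
 #include <stdint.h>
 #include <limits.h>
 
 /* Hypothetical sketch: pack three 6-bit character codes into one 18-bit
    word, held here in a 32-bit host integer for portability.  The code
    values used below are arbitrary placeholders, not a real character set. */
 static uint32_t pack18(unsigned c0, unsigned c1, unsigned c2)
 {
     return ((c0 & 077u) << 12) | ((c1 & 077u) << 6) | (c2 & 077u);
 }
 
 static void unpack18(uint32_t word, unsigned out[3])
 {
     out[0] = (word >> 12) & 077u;
     out[1] = (word >>  6) & 077u;
     out[2] =  word        & 077u;
 }
 
 int main(void)
 {
     unsigned c[3];
     unpack18(pack18(001, 026, 077), c);
     printf("unpacked codes (octal): %02o %02o %02o\n", c[0], c[1], c[2]);
     printf("CHAR_BIT on this host: %d\n", CHAR_BIT);  /* standard C requires at least 8 */
     return 0;
 }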

I also wonder what the "standard" way of storing letters of text was in these systems.
I speculate that (assembly-language programs running on) 18-bit machines probably stored characters in exactly the same ways as 36-bit machines. (See the Wikipedia article "36-bit" for a surprisingly long list of ways.)
--DavidCary (talk) 22:53, 18 June 2015 (UTC)
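As a companion to the sketch above (again hypothetical, not sourced from the article): one halved analogue of a common 36-bit convention, four 9-bit characters per word, would be two 9-bit character fields per 18-bit word, which comfortably holds 7- or 8-bit ASCII.

 #include <stdio.h>
 #include <stdint.h>
 
 /* Hypothetical companion sketch: two 9-bit character fields per 18-bit
    word, the halved analogue of packing four 9-bit characters into a
    36-bit word. */
 static uint32_t pack2x9(unsigned hi, unsigned lo)
 {
     return ((hi & 0777u) << 9) | (lo & 0777u);
 }
 
 int main(void)
 {
     uint32_t w = pack2x9('A', 'B');   /* ASCII fits easily in 9 bits */
     printf("upper: %c, lower: %c\n",
            (int)((w >> 9) & 0777u), (int)(w & 0777u));
     return 0;
 }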