WikiProject Mathematics (Rated B-class, Mid-importance)
Sexidecimal is Correct! (not Literally but Historically)
It is true that engineers at IBM first used the bastardized Latin form Sexidecimal to describe a base-16 numbering system. Whether the form is grammatically correct is not, and should not be, the primary question here. What is much more important is that the term is accurate historically -- it was used at a certain place and time. Language is less concerned with accuracy than with consensus. As such, it is always subject to change as soon as enough people agree that it should.
Google returns 374 results on "sexidecimal". Sounds like consensus to me.
Other characters used
NEC, in the NEAC 1103 computer documentation from 1958, uses the term "sexadecimal" and the digit sequence 0123456789DGHJKV. See the brochure at http://archive.computerhistory.org/resources/text/NEC/NEC.1103.1958102646285.pdf.
The article does nothing to explain why the hexadecimal system was invented, or why it sees so much use in computer science. Can someone address this please? What is the point of using a base-16 system?
- Computers use binary. Converting from a binary representation to a hexadecimal representation is much simpler than converting from binary to decimal. It can be done quite quickly in your head by splitting a longer binary number into 4-bit groups and converting each 4-bit group to one hexadecimal digit. For example:
|0010100100010101||16-bit binary number|
|0010 | 1001 | 0001 | 0101||the same number, grouped into 4-bit groups|
|2 | 9 | 1 | 5||each 4-bit group converted to one hexadecimal "digit"|
|0x2915||hex value after conversion|
With time, it becomes simple to convert from a 4-bit binary representation to a 1-hexadecimal "digit" representation. 188.8.131.52
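The grouping trick described above can be sketched in a few lines of Python (the function name and interface here are my own, purely for illustration):

```python
def binary_to_hex(bits: str) -> str:
    """Convert a binary string to hex by mapping each 4-bit group to one hex digit."""
    assert len(bits) % 4 == 0, "pad the binary string to a multiple of 4 bits first"
    # Split into 4-bit groups (nibbles), left to right.
    nibbles = [bits[i:i + 4] for i in range(0, len(bits), 4)]
    # Each nibble (0-15) maps to exactly one hexadecimal digit.
    return "0x" + "".join(format(int(n, 2), "X") for n in nibbles)

print(binary_to_hex("0010100100010101"))  # 0x2915, matching the worked example above
```

Each nibble is converted independently, which is exactly why the mental conversion works: no carries ever cross a 4-bit boundary.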
Hidden comment in article body that should have been posted here
Regarding the following text in the article: "In typeset text, hexadecimal is often indicated by a subscripted suffix such as 5A3₁₆, 5A3SIXTEEN" This comment was appended: "this seems hugely verbose and i can't say i've ever seen it does anyone here a source?" by: Plugwash 23:13, 10 July 2005 (UTC).
Computing additions in lede
In [this edit], User:Nimur added computer-programming-oriented info in the lede. As a programmer myself, I agree with what it says, but is it right to emphasize this use in the lede? Hexadecimal isn't just for computing. Also, the new lede's emphasis on "0x" notation contradicts the "Representation" section just after the lede. Unless someone objects, I'll revert this per WP:BRD (which I'm not quite doing in order -- that's just how I roll). A D Monroe III (talk) 17:31, 20 June 2014 (UTC)
- No worries. I was proactively responding to a Computing reference desk discussion, in which another user was confused: Hexadecimal question, (June 17). If you can edit my changes to clarify the lede, please feel free; or if you strongly feel that the lede was more clear before my changes, please feel free to revert. Nimur (talk) 22:40, 20 June 2014 (UTC)
- Hm. So, if I follow this, the change was in response to the lede being incomprehensible; programming use was introduced as a way of better explaining this. I agree the original lede was not very helpful. So, I won't revert, but still don't want to rely so much on 'C' programming use right from the start. I'll have to think about this. A D Monroe III (talk) 23:10, 24 June 2014 (UTC)