In computers, a serial decimal numeric representation is one in which ten bits are reserved for each decimal digit, with exactly one of those bits turned on to indicate which of the ten possible digit values is intended. ENIAC used this representation.
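As an illustrative sketch (not part of the original article, and using hypothetical helper names), the following Python snippet shows how a single decimal digit might be held in such a ten-bit, one-bit-set form:

    def encode_digit(d):
        """Return a 10-bit pattern with only bit d set."""
        if not 0 <= d <= 9:
            raise ValueError("decimal digit must be 0-9")
        return 1 << d

    def decode_digit(bits):
        """Recover the digit from a ten-bit pattern with exactly one bit set."""
        if bits >= (1 << 10) or bin(bits).count("1") != 1:
            raise ValueError("not a valid ten-bit digit pattern")
        return bits.bit_length() - 1

    # Example: the digit 7 is stored as the pattern 0b0010000000.
    assert decode_digit(encode_digit(7)) == 7

Storing a multi-digit decimal number in this scheme simply repeats the pattern, ten bits per digit.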