Talk:Roman numerals/Archive 5

From Wikipedia, the free encyclopedia

Numerals Caused Slow Development

I've read somewhere that Roman numerals were partly responsible for slowing the development of science and math. This was purely because they are harder to deal with, and it takes even a trained user longer to add, subtract, multiply and divide numbers written in Roman numerals than it does someone using Arabic numerals. This greater barrier to entry, as it were, resulted in less research.

Is this true? -- ansible

Maybe yes, maybe not: they certainly could not interact with "our" computing system because of the length (in characters) of individual numbers, which is variable too. For instance: the number 77 is expressed in Arabic numerals with only two characters, while LXXVII needs six. But worse is that the number 78 needs seven characters (LXXVIII).

Side note. Don't computers use 0s and 1s, anyway? - Rose

But this does not mean that there are operations you cannot perform with them. If you try to calculate the square root of some number, you will get the same result with both systems, though obviously this would require a different use of the space.

On the other hand, every calculation in Roman numerals requires a logical scheme that is different from the Arabic system. I could not say which is best: if you are Latin-minded (and consequently used to declining words, verbs and other objects of similar frequency, making use of a sort of "on-the-fly" development), you will find it as natural as we today find the Arabic one, with the longer time depending only on the writing.

I think it is only that the Arabic system was used by Phoenician merchants in the whole Mediterranean area well before Rome had influence over a similarly extended territory. One fact is that Rome created the widest empire of the ancient world using its numerals, and another fact is that we use the Arabic system; opinions may differ on whether it is better for us, but keep in mind that we were born "within" this mentality.

I do think however, that it would be quite complicated to eventually revert our system now :-)

In Latin class I had learned the exact opposite of what Ansible suggested. From what I understand, the Roman numeral system is supposedly really easy to count on your hands with. Essentially, the Roman numerals were quicker to add and subtract with, whilst Arabic numerals are easier to multiply with. Just some thoughts... --BlackGriffen

I can kind of see how Roman numerals would make it easier to learn how to count. But science needs a lot more than simple counting. It's multiplication and division that seem to be overly difficult in Roman numerals. I remember in 3rd grade, learning how to divide using Arabic numbers was hard enough. What are the rules for doing manual division with Roman numerals? Does anyone even know anymore? -- ansible

As far as I remember, there is no different operational method with Roman numbers; it should be only a matter of graphical rendering, or at least I did not meet this point in my studies. The concept of division should be the same in both methods. The Roman system has a different approach to rendering, requiring one to consider not a linear sequential scale (as with Arabic numerals) but a more complex thought about "notable" numeric entities: 99 in Arabic is only the number after 98; in Roman it is the one before one hundred (the closest relevant entity), and is "IC" (really, I am not aware that this is wrong, being more purely Latin-minded, and I have many books with that form too - just checked). A rendering like LXXXXVIIII would be the first result in our current mentality, but a second thought is required to better describe it in Latin concepts (so LXXXXVIIII is a wrong form, the correct one being only IC - or the other one proposed in the article).
With Arabic numbers you have to learn by heart some fixed relationships: series of adding or multiplying factors (like 2, 4, 6, 8, ... or 3, 6, 9, 12, ... and so on) are recalled from memory when you compute the separate parts of a multiplication. In Roman maths, you constantly evaluate the concrete "weight" of numbers, so you get your result with less use of memory and deeper instant analysis.
With Roman numerals you need to realise "where" your number is located in the proportion of values: is 98 closer to 50 or to 100? Am I talking about something that is of (this) kind of proportion or of (this other) kind? The idea is: main identifiable concept = one hundred; my number differs from it by II on the lower side (so it goes on the left = IIC), while, say, 105 is more than one hundred by 5 (so it goes on the right = CV).
Obviously this is easier to perform with sums, and it is true that the main progress in the main scientific disciplines was achieved by the Arabs (just think of astronomy). I wouldn't, however, agree that it simply slowed progress: the Roman system might have been better structured to complement a mentality which put the humanities before the technical sciences, but today's developed world belongs more to Latin civilisation than to Arabian civilisation.

I do not believe that the difference between operating with Roman or Arabic numerals is just a question of getting used to one or the other, as some of the above posters seem to imply. A positional number system like the Arabic one (with a symbol for zero, which the Romans didn't have, and which marks a significant difference) is much handier for performing all kinds of operations than a system like the Roman. In fact, the Romans did not do any long divisions or suchlike with their symbols; they used an abacus for calculations, and an abacus is basically a positional system. By the way, the Arabic numbers should really be called Indian numbers, since that's where they originally came from, although they were introduced into Europe by the Arabs. In reply to the last post, I think the progress in the Western world owes much more to the Arabic digits than to the Roman humanistic legacy....

I would like to quote Georges Ifrah from "The Universal History of Numbers":

Anyone who reflects on the universal history of written number-systems cannot be but struck by the ingeniousness of this system, since the concept of zero, and the positional value attached to each figure in the representation of a number, give it a huge advantage over all other systems thought up by people through the ages. -Calypso

Oh - so the West was able to 'progress' using Arabic zero while the Arabs were not? Just a query from a specialist in the western humanities. Let's not start this silliness. Culture is considerably more complex than ease of computation. --MichaelTinkler

I certainly agree that culture and 'progress' are complex topics. It seems to me, though, that the tools (physical and mental) that are available to people drastically change their outlook on the world. Mathematics is the basis of science and technology, and arithmetic is the starting point of it all. It seems to me that entirely new opportunities became available to us when we switched numbering systems. However, I'd like to have some references before I write up an encyclopedia article about it. Are there any good studies of history where fundamental practices changed because of better math? Like some example from military history, where someone, because they were able to figure out their logistics better, was able to win some battle. --ansible

Sure it's important, but given that everyone from India to Iceland had the use of the numeral system by some date, we're in Sapir-Whorf fallacy zone to use it as much of an explanation. And what about the Central American zero? There are scholars who insist that the invention of double-entry accounting (Venice, late middle ages) is really what does it for the West. My only goal here is to suppress sweeping, universalistic statements about the zero changing the world. It did so, but very, very, very slowly. --MichaelTinkler

Of course culture is terribly complex, and I certainly did not want to reduce the success of Western culture (or, let's say, the current dominant position of Western culture in economic terms) to the adoption of a certain numerical system. In any case, the main point of my previous post is that the Arabic numerals are intrinsically better for doing mathematics than the Roman numerals are. --Calypso

Take a look here above: most of the words you use have Latin or Greek roots. Maybe this is statistics; still, it's not economy. I agree that the explanation of this concept might be better shown in Arabic numbers, but I still prefer a Latin "idea" to a fair perfect result.
I thought that in the Roman era and the Middle Ages, people used the abacus for calculation and the numerals for writing them down. -- Error 00:34, 5 May 2004 (UTC)

Indeed. The very word "abacus" is a Latin one, though nowadays abaci are chiefly identified with East Asia. -- 07:38, 10 Oct 2004 (UTC)

I would agree with the original post here. Roman numerals do not use a system with an identifiable base. Because there is no standard system for putting particular numerals in place, there is no way to line them up. That's not to say that the Arabic numerals themselves were superior, but the organization certainly was. You can, for example, line up 153 × 802 vertically very easily, with each digit occupying its own place. Even with placeholder zeros you could not do this using Roman numerals. Simply by memorizing a multiplication table from 0-9, one can multiply and divide any number vertically in this way in Arabic numerals, but not in Roman ones. This is why many medieval European mathematicians and astronomers began to use Arabic numerals - because they saw the convenience (so claimed and documented by the men themselves, not just conjectured later by historians). Integer exponentiation is not easy with Arabic numerals either, but requires conversion to multiplication, so one might imagine an even more convenient number system. Exponentiation and its inverse, the logarithm, with non-integer solutions are still extremely difficult in Arabic numerals, but would require infinite iterative processes in any number system (since they result in irrational numbers).

So yes, I think (and have heard in History class once or twice) that the conversion from Roman to Arabic numerals was in many ways a result of convenience. (talk) 01:41, 15 January 2008 (UTC)
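To make the place-value argument above concrete, here is a toy sketch in Haskell (matching the language of the interpreter posted further down this page). The names `digitsWithPlaces` and `longMult` are invented for illustration, not taken from any library. It shows how a product like 153 × 802 reduces entirely to memorized single-digit products, each shifted by its place value, which is exactly the lining-up that Roman numerals do not support:

```haskell
-- Toy illustration: positional notation lets a multiplication decompose
-- into 0-9 digit products from a memorized table, shifted by powers of
-- ten. No such decomposition exists for Roman numerals.

-- Pair each digit of a number with its place value (1, 10, 100, ...).
digitsWithPlaces :: Int -> [(Int, Int)]
digitsWithPlaces n = zip (go n) (iterate (* 10) 1)
  where
    go 0 = []
    go m = m `mod` 10 : go (m `div` 10)   -- least significant digit first

-- Long multiplication: the sum of all single-digit products, each
-- scaled by the place values of the two digits involved.
longMult :: Int -> Int -> Int
longMult a b = sum [ d1 * d2 * p1 * p2
                   | (d1, p1) <- digitsWithPlaces a
                   , (d2, p2) <- digitsWithPlaces b ]
```

Every term in the sum is a single-digit product scaled by a power of ten; nothing analogous is available when the operands are written as CLIII and DCCCII.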

Is MIM ok for 1999?

The article said:

Some rules regarding Roman numerals state that a symbol representing 10^x may not precede any symbol larger than 10^(x+1). For example, one should represent the number "ninety-nine" as XCIX, not IC. However, these rules are not universally applied.

The last sentence is wrong. MIM for 1999 is not kosher, any way you look at it. So I removed the last sentence. Egil 07:31, 3 Feb 2005 (UTC)

That last sentence reflects the fact that - "kosher" or not - some people do it anyway. Since the role of Wikipedia is not to proclaim what's right, but to describe what's done, I've restored a slightly modified version of that statement. Tverbeek 15:20, 3 Feb 2005 (UTC)

I beg to disagree. There is no reason to report usage that is patent nonsense, except in very particular cases. The statement "However, these rules are not universally followed" is just as bad as the previous one. As far as I know, the only usage of 'MIM' and 'IMM' is by people who haven't even bothered finding out what Roman numerals really are. The 'pedia should report correct information. Your sentence gives the impression that there is doubt about what the correct usage is. This is misleading. And if you feel the need to state that "People do not always bother following rules", it should be done in the context of human behaviour in general. -- Egil 17:23, 3 Feb 2005 (UTC)

One point of that section is that the question of "correct" usage isn't as simple as you state. Not only has usage varied somewhat with time and place, the Romans themselves exhibited some inconsistency in their usage, and a degree of personal preference seems to have been involved. Certainly we can and should spell out the usage that's most prevalent, but since no one can find the original RFC or ISO standard for them, the position that there is an indisputable standard for "correct" usage - and that you have it - seems hard to justify. Tverbeek 20:52, 3 Feb 2005 (UTC)

Tverbeek is right: even today, there is no indisputable standard defining these rules. Moreover, the only documents I know of that codify these rigid rules are modern. Are there any Roman documents known that describe the Roman numbering system? If so, do they codify the stringent rules, do they allow IM etc., or don't they mention the problem? Are there any medieval documents known that describe the Roman numbering systems? The same questions apply. -- Adhemar, 12 December 2005

The usage XIIX for 18 is attested in actual usage in medieval times, and I think IC for 99 is also. People who actually wrote and read these numerals could communicate unambiguously with a slightly more flexible version of "The Rules", so who are we to be throwing around epithets such as "patent nonsense"? My guess is that "The Rules" were written by printers round about the time that they standardized spelling. Tverbeek is right. Cbdorsett 07:09, 6 Feb 2005 (UTC)

I came across a photo of an ancient inscription with the numeral XIIX (the tomb of Secundinus on the Via Appia). Does anyone know of an ancient example of the use of IC, IM or XM? --Zundark 14:03, 6 Feb 2005 (UTC)
In Rome, archway number 29 of the Colosseum has the inscription XXVIIII. The MathWorld article on Roman numerals cites (Menninger 1992, p. 281; Cajori 1993, p. 32) that "Romans occasionally wrote IM, IIM, etc." -- Adhemar, 12 December 2005

How about VL for 45? I for one don't think this is a 'decimal' system, so the rule about subtracting exactly one-tenth seems suspect. Aleš Wikiak 21:03, 25 August 2006 (UTC)

I was wondering the exact same thing. I've never seen VL or LD used before. Rather than trying to put it in terms of 10^x and 10^(x+1), why not just explicitly spell out which letters may precede others? It's not like there are that many: I can precede V and X, X can precede L and C, and C can precede D and M. That's it. One short sentence. Alexwagner 13:16, 14 June 2007 (UTC)
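The "one short sentence" version of the rule can be encoded mechanically. Below is a minimal sketch in Haskell (chosen to match the interpreter posted further down this page); `toRoman` and its value table are my own illustrative names, not code from any standard. The table hard-codes exactly the six conventional subtractive pairs (IV, IX, XL, XC, CD, CM), so forms like VL or IM can never appear in its output:

```haskell
-- Illustrative only: the modern convention with exactly six subtractive
-- pairs, i.e. I before V/X, X before L/C, C before D/M. Assumes n >= 0.
toRoman :: Int -> String
toRoman 0 = ""
toRoman n = sym ++ toRoman (n - v)   -- greedily take the largest entry <= n
  where
    (v, sym) = head [ p | p@(val, _) <- table, val <= n ]
    table = [ (1000, "M"), (900, "CM"), (500, "D"), (400, "CD")
            , (100,  "C"), (90,  "XC"), (50,  "L"), (40,  "XL")
            , (10,   "X"), (9,   "IX"), (5,   "V"), (4,   "IV")
            , (1,    "I") ]
```

Under this convention 45 comes out as XLV rather than VL, and 1999 as MCMXCIX rather than MIM; whether that convention is historically binding is, as this thread shows, another question.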

Food for thought: Roman numerals were not inspired by a decimal number system... or does anyone care to provide a reference to a document proving otherwise?

  • As such, original rules could not have implied that every decimal digit is to be spelled out separately. Note that this has nothing to do with common modern use (or confusion).
  • Can anyone provide a reference to a "Roman numerals decimal reform" that would establish the new standard?
  • Note that whatever form a number takes (e.g. IMM or MIM or MCMIC or MCMXCIX or even MDCDIC, MDCDXCIX, ...) it is always unambiguous as to what its value is. The only question is what the preferred way to write it is. Speaking of preference, mine is whichever is easier to read and understand, and IM for 999 beats the alternatives any day (one less than a thousand).
  • If one wants to specify which letters can precede others (to avoid introducing decimal rules where they do not belong), they would probably have to think about how "distant" the letters are. For example, there are no (currently standard) letters between I and V, hence IV is OK. The same happens with XL, for example. Since IX, XC and CM are considered valid, at least the two preceding letters must be allowed; hence VL and LD should also be fine. However, if "in spirit" Roman numerals were simply to show the value in the simplest form, then as 999 is so close to 1000 it should be written relative to it... hence IM, and not CMXCIX or, worse, say, DCDLVLIV... Then remember that there were other letters in use as well...
  • The fact that we today spell out Roman numbers while thinking in decimal shows just that... that we are thinking in decimal... only to end up with ugly representations in the end...
  • The real fun begins now... is "IXC" valid and, if it is, is it (IX)C = 100-(10-1)=91 or I(XC) = (100-10)-1=89... :)

--Aleksandar Šušnjar (talk) 21:40, 16 June 2008 (UTC)

A not mathematically inclined friend of mine said that IVC = 100-4 feels better than IVC = 100-5-1. Here's an interpretation of all strings over {I,V,X,L,C,D,M} that treats the unambiguous cases correctly.

-- This software was written in 2008 and is granted to the public domain.

roman :: String -> Integer
roman xs = rom (map val xs)

val 'I' =    1
val 'V' =    5
val 'X' =   10
val 'L' =   50
val 'C' =  100
val 'D' =  500
val 'M' = 1000

-- The value of a numeral is its leftmost maximal letter, minus the
-- value of everything before it, plus the value of everything after it.
rom [] = 0
rom xs = let (ys,zs) = salm xs in last ys - rom (init ys) + rom zs

salm xs = spaf (== foldr1 max xs) xs  -- split after leftmost maximum
spaf p (x:xs) = if p x then ([x],xs) else let (ys,zs) = spaf p xs in (x:ys,zs)

-- (talk) 09:14, 18 September 2008 (UTC)

Z for 7

It says at Talk:English alphabet that the Romans used Z for 7. Where is the source of this?? Georgia guy 01:33, 12 November 2005 (UTC)

Z was used for the numeral 7 in Hebrew and Greek, and still is today. The numerical values of the letters of the alphabet came along for the ride with the alphabet itself, and Z likely remained associated with '7' long after it was dropped from writing, just as the Greeks retained digamma (F), qoph (Q), and sampi as numerals long after they were gone from the alphabet. Today, now that minuscule sigma has two forms, the Greeks tend to replace digamma with the alternate form (ς) for '6', and we can imagine that something similar happened with the Romans when they had a new letter G to use instead of the obsolete Z for '7'. See Gematria and Greek numerals. I don't know if the Romans ever used this as a daily system for indexing or anything instead of the Roman numerals, but you can probably find something in Ifrah's Universal History of Numbers or Daniels & Bright's World's Writing Systems, or many other refs. kwami 02:32, 12 November 2005 (UTC)
I've heard of not only Z as an alternative form for VII = 7, but also O = XI = 11; F = XL = 40; K = L = 50; S = LXX = 70 (although far more frequently S = 1/2); R = LXXX = 80; N = XC = 90 (long before Bede used N for nulla); Y = CL = 150; T = CLX = 160; H = CC = 200; E = CCL = 250; P (sometimes also G) equal to CD = 400, although CD can be confused with CIƆ = 1000. Were these actually used by the Romans at one time? Does anybody have a source? – Adhemar 18:14, 18 May 2006 (UTC)
I, too, have seen those "medieval Roman numerals." The ones I have seen, consulting a Webster's dictionary, are:
I/J        1
V/U        5
s          7
X         10
O         11
F         40
L         50
S         70
R         80
N         90
C        100
Y        150
K        151 (yes, that's a 151 -- don't ask)
T        160
H        200
E        250
B        300
P/G      400
D/Q      500
M/φ     1000
Z       2000
You're right, I think there should be something on medieval Roman numerals, but I'm not sure if it's worth its own article. (I also have the backward S as 1/2, not a forward S, but that I just pulled off some Web page so I'm not sure of its accuracy.) Again, these are mostly out of the Webster's, so I'm pretty sure they're reliable. (And yes, the CIƆ = φ = 1000 is accurate.) J. Myrle Fuller (talk) 01:37, 30 January 2008 (UTC)


All these other letters were "medieval Roman numerals." Some are listed in dictionaries, others less commonly so, but none were ever used in actual ancient times. Still, they should be listed here, as they indeed are considered "Roman numerals." However, I heard that Z represented 2,000, not 7. No, I do not have a good source; I got this information from an "answers" page (not on Google, but somewhere else; I found it during a Google search), but if you find a good unabridged dictionary and look up each letter individually you will find a nearly infallible source, given the research put into these. By the way, where did you get that "CIƆ" from? I've never seen that before as a representation for 1,000. Is Ɔ = D = 500, then? (talk) 01:50, 15 January 2008 (UTC)


It's beyond me why anyone is still using this archaic system! It tends to be popular among academics with a fetish for ancient things.

Yes, but there's nothing wrong with that, right? Sometimes knowing obscure information can be fun, especially if you get a chance to show it off. It could help one win money on a game show, too.  ;) --Lance E Sloan (talk) 13:52, 3 January 2008 (UTC)

RoMaN nUmERals

Roman numerals is a numbers in a different language eg I is 1 II is 2 III is 3 IX is 4 and X is 5 ect.

Not sure how constructive this comment is. No, IX is not 4 and X is not 5, nor are Roman Numerals a language. I'm not sure I understand this. —Preceding unsigned comment added by (talk) 01:01, 23 January 2008 (UTC)

Unicode chart

Should the chart be replaced with {{Unicode chart Number Forms}}:

[Rendered here: the {{Unicode chart Number Forms}} template, showing the official Unicode Consortium code chart (PDF) for the Number Forms block, as of Unicode version 9.0; grey areas indicate non-assigned code points.]

I don't like how it looks, but that entire group of templates could use some cleanup. —Random832 14:27, 20 September 2007 (UTC)

I see Japanese characters in my web browser, Firefox, when I look at that chart. However, when I view the PDF, I can see that the chart should contain Roman numerals in place of the Japanese characters. This may vary from one browser to another. It would be nice to have an explanation (or a link to one) of how to get the Roman numerals to appear in a browser correctly. --Lance E Sloan (talk) 14:45, 3 January 2008 (UTC)
I also use Firefox but do see the PDF Roman numerals, not Japanese characters. Nevertheless, I still see Chinese/Korean/Japanese characters when they appear in Wikipedia articles. The difference may involve how I initially downloaded Firefox because I knew I wanted to see all characters used throughout the world. — Joe Kress (talk) 05:25, 4 January 2008 (UTC)

other system?

i have read that R should mean 250 and N 900... [this unsigned entry was moved from the middle of an entry above]

You might be thinking of an alphabetical numbering system where A = 1, B = 2, etc., but in that case N would be 40, and R, 80. kwami 01:36, 3 October 2007 (UTC)

ok, now I am taking Latin 1 on Virtual Virginia, and one of the web sites that my professor gave us said that for the Roman numerals 4, 5 and 6 they would sometimes put IIII, IIIII, and IIIIII. Is this true? —Preceding unsigned comment added by (talk) 00:26, 5 January 2008 (UTC)

Finger origin for I, V, and X

I have been told in school that the origin of using I's as units came from counting on one's fingers, where fingers resemble I's. V's resemble the space between the forefinger and the thumb when all five fingers are extended. X's resemble two V's put together. C simply represents "centum," and M "mille." I think the origins of L and D are listed. (Aren't they?) The origins of bars are obvious shorthands using previous symbols.

Is this true, and can anybody find a source for it? (talk) 02:09, 15 January 2008 (UTC)

This folk etymology is covered in the article. kwami (talk) 07:33, 30 January 2008 (UTC)

Bullet Points instead of asterisks in "Modern Usage"

How about a bullet pointed list instead of asterisks in "Modern Usage"? Asterisks look a bit shoddy and are not a suitable list marker. —Preceding unsigned comment added by (talk) 15:09, 3 February 2008 (UTC)

That list already uses bullets. An asterisk on an edit page is automatically displayed as a bullet by the Wikipedia software. — Joe Kress (talk) 09:30, 4 February 2008 (UTC)

This Article May Be All Wrong in Two Essential Points - Origin of Subtractive Numerals and Origin of Etruscan Numerals

This article describes the use of "subtractive" numerals as something going back to classical Roman times, with a taboo against "IV" as an abbreviation of Jupiter. In one paragraph the Jupiter concept is stated as fact, in another as a possibility.

The development of subtractive Roman numerals is usually described as a Medieval innovation -- any claim of being a classical Roman practice needs citations and hopefully a coin or something showing it in use. The "Jupiter" concept for IV may be completely unfounded and also requires a citation -- and should be presented as a theory not fact unless there are actual classical Roman sources describing this taboo rather than modern speculation.

The origin of Etruscan numerals is described as being "tally stick" notches, and not alphabet letters.

But all of the Etruscan "notch" symbols shown are actually letters of the Greek-derived Etruscan alphabet which you will find here: And the Greeks had a system of using Greek letters as numbers (a simple, Roman-like system using a few letters in archaic times, a more complicated system using the full alphabet in classical times). So the entire "notch" concept needs LOTS of references and should be presented as a possible alternative if it can be supported at all.

This entire article looks like some inaccurate impressions and speculations by a math major and not a sound scholarly resource. —Preceding unsigned comment added by (talk) 04:39, 17 February 2008 (UTC)

First of all, no need to shout. I'm deleting your bolding.
About the taboo thing, I have no idea.
The tally stick theory comes from Georges Ifrah, 2000, The Universal History of Numbers. And your objections are original research - because the Unicode range for Old Italic includes a few extra symbols (I, Λ, X, and T) for numerals (1, 5, 10, 50), you decide that that somehow invalidates the article. It does nothing of the kind, and in fact supports the claim you are disputing. Do you have any objection founded on anything other than your imagination? kwami (talk) 07:41, 17 February 2008 (UTC)

Firstly, it's too bad that doesn't identify himself, because his comments are spot on, especially the last paragraph. Secondly, Kwami, you should get a life that doesn't involve computers or the internet. Freddy011 (talk) 21:59, 6 December 2008 (UTC)

Proper Images/Unicode Encoding for Etruscan Characters

See previous comment on likely errors in this article about the nature of the Etruscan numeral system.

The Etruscan numerals are currently shown in imitated form using a combination of Latin alphabet letters and numbers, Greek Lambda, a mathematic "circled plus" symbol, and some garbled character put in markup ["span class="Unicode"]⋔[/span].

They should all be actual Unicode characters which you will find here:

Wikipedia pages are served in Unicode-encoding (UTF-8) so these characters can be typed directly into the page code. But as you likely don't have an Old Italic keyboard layout handy, you can use numbered Unicode entities (ampersand-x-number-semicolon) instead.

But as most readers won't have an Old Italic font installed, each letter should actually be presented as an in-line graphical image (which you'll have to draw; the Unicode chart images are copyrighted), with the Unicode ampersand-x-number-semicolon code as its alt tag.

Sorry that I don't have time to do this myself -- and I don't know what the missing "?" character is supposed to be anyway. —Preceding unsigned comment added by (talk) 04:52, 17 February 2008 (UTC)

  • I think the anonymous editor is right about one thing for sure: the article is pretty light on citations. Kwami provided one - anyone else is welcome to find more. No problem with flagging any blatantly unsupported assertions with {{Cite}}; please go right ahead. I myself have suspected that any taboos against using the name or initials of Jupiter were Christian innovations, but don't have any cites. It doesn't sound logical to me that the pre-Christian Romans would have had a taboo like that, but I'll leave that to those who have studied Roman history and religion. I would also like some more actual support for the origin of the subtractive principle. I have read that sometimes the ancient Romans would pile all the digits together without regard to our modern concepts of ones-tens-hundreds, etc. Subtractive numerals would not work then. When was the one system dropped and the other adopted? Cbdorsett (talk) 22:11, 17 February 2008 (UTC)
The Origins section comes straight out of Ifrah. As for replacing Etruscan numerals with their Unicode equivalents, that might be useful, but there was a lot of graphic variation that Unicode conceals. kwami (talk) 23:12, 17 February 2008 (UTC)


IIII or IV on clock faces

"The four-character form IIII creates a visual symmetry with the VIII on the other side, which IV would not (with the exception of square faced watches and clocks, where the opposite number is X)."

This doesn't seem to make any sense. Square-faced timepieces I've seen have the numbers in the exact same order as for round-faced ones. 10 is diametrically opposite 4, and 8 is opposite 4 with respect to a vertical axis. -- Smjg (talk) 19:28, 20 March 2008 (UTC)

"The four-character form IIII creates a visual symmetry with the VIII on the other side, which IV would not ... "

I read this idea about visual symmetry some years ago, and it didn't make sense either. But I think the mistake is considering the visual symmetry just between the numbers 8 and 4. Your eyes don't behave as a pendulum bouncing back and forth between those two positions of the clock.

A visual appeal (I wouldn't call it symmetry) makes a lot more sense if you divide the whole face of the clock in three parts: The 1st part (the four hours 1 to 4) will contain only Is; the 2nd part (the four hours 5 to 8) will be the only part with Vs in it; and the 3rd part (the four hours 9 to 12) will be the only part with Xs in it. -- Vikfra (talk) 19:28, 13 May 2008 (UTC)

An excerpt from "The Modern Clock", pp. 428-429...

We often hear stories concerning the IIII in place of IV. The story usually told is that Louis XIV of France was inspecting a clock made for him by a celebrated watchmaker of that day and remarked that the IV was an error. It should be IIII. There was no disputing the King, and so the watchmaker took away the dial and had the IIII engraved in place of IV, and thus it has remained in defiance of all tradition.

Mr. A. L. Gordon, of the Seth Thomas Clock Co., has the following to say concerning this story and thus furnishes the only plausible explanation we have ever seen for the continuance of this manifest error in the Roman numeral of the dial:

"That the attempt has been made to use the IV for the fourth hour on clock dials, any one making a study of them may observe. The dials on the Big Ben clock in the tower of the Parliament buildings, London, which may be said to be the most celebrated clock in the world, have the IV mark, and the dial on the Herald building in New York City also has it.

That the IIII mark has come to stay all must admit, and if so there must be a good and sufficient reason. Art writers tell us that pictures must have a balance in the placing and prominence of the several subjects. Most conventional forms are equally balanced about a center line or a central point. Of the latter class the well known trefoil is a common example.

A clock or watch dial with Roman numerals has three points where the numbers are heavier, at the IIII, VIII and XII. Fortunately these heavier numerals come at points equally spaced about the center of the dial and about a center line perpendicular to the dial. Of these heavy numerals the lighter of them comes at the top, and it is especially necessary that the other two, which are placed at the opposite points in relation to the center line, should be balanced as nearly as possible. As the VIII is the heavier and cannot be changed, the balancing figure must be made to correspond as nearly as possible, and if marked as IV, it will not do so nearly as effectively as if the usual IIII is used."

Reverted Tales Of Symphonia to Ultima

The Tales Of Symphonia series does not appear to use Roman numerals; only an unreleased (or retitled?) game in the series was once planned to use "II".


A table of modern (post-Victorian) Roman numerals has been in the article forever (since 2001). 4000 was originally added as MMMM on 3 October 2005, then changed to MV on 23 December 2006 by an editor who stated that subtractive notation should be used. On 19 June 2007 the original editor changed it back to MMMM. Another editor then added "Not MV" on 6 October 2007. "Clarify" was added to "Not MV" on 29 November 2007 by an editor who noted a conflict with the top of the article. On 12 April 2008 a "pattern" table was added that stopped at MMM. On 15 April 2008 symbols up to IX including MMMM were added to it. On 22 April 2008 an editor changed MMMM in the pattern table to IV, claiming that was the pattern. On 14 May 2008 "Not MV" was "clarified" by stating it was 5000-1000, which actually confirmed that MV was correct. On 15 May 2008 I made MV preferred and MMMM optional.

Now both conflict with the pattern table. Editors who prefer MMMM may be ignoring that the table is for "Modern Roman numerals". But I don't know of citations for any of these forms, just editors' opinions. Should they be removed, as well as all entries above MMM in the pattern table? — Joe Kress (talk) 07:06, 15 May 2008 (UTC)

Article contradicts itself

"This problem manifested in such questions as why 1990 was not written as MXM instead of the universal usage MCMXC, or why 1999 was not written simply IMM or MIM as opposed to the universal MCMXCIX.

However, these rules are not universally followed."

If the rules "are not universally followed", how can one believe that MCMXCIX was nevertheless "universal"? 10:40, 13 July 2008 (UTC)

Roman Numerals in common usage today - guitar chord diagrams

Are they? (talk) 14:39, 29 September 2008 (UTC)

Well, not specifically guitar chord diagrams, but roman numerals are in fact used for chords in general most of the time, especially in Jazz literature. Dgtljunglist (talk) 08:40, 30 November 2008 (UTC)

VV for 10?

On the page, it says that uses of VV for 10 have been discovered. I know IIX can be used for 8, but I have never heard of VV being used for 10.

I'd like to know what source this is from. ZtObOr 23:31, 29 October 2008 (UTC)

plural title

Why is this article called Roman numerals and not Roman numeral? — Reinyday, 16:00, 3 December 2008 (UTC)

This is why Wikipedia is ridiculed

This whole article is embarrassingly uninformed and demonstrates the weakness and unworkability of the whole Wikipedia concept better than any other I've seen. I won't even attempt to fix it. I can now see why so many academics regard Wikipedia as a pathetic joke. Freddy011 (talk) 21:45, 6 December 2008 (UTC)

Could you give us a clue as to what the problem is? Or is it only you that thinks it ridiculous? --WiseWoman (talk) 19:41, 12 January 2009 (UTC)

What's the numeral for 999,999,999?

I have no idea. The top numeral I can name is ___

MCXMCXI —Preceding unsigned comment added by (talk) 01:09, 24 December 2008 (UTC)

Why XIX and not IXX?

I don't understand why XIX (1 before 10 [which is nine] after 10) is used to represent 19 rather than IXX (1 before twenty). Explanation? ~BRENT NOT MEMBER. :-) —Preceding unsigned comment added by (talk) 17:02, 16 January 2009 (UTC)

XIX is 10 followed by (10−1) = 9. But IXX is not (20−1); it is (10−1) = 9 followed by 10, because the subtraction acts only on one single following larger numeral. The resulting numerals, which are then added, ought to be in decreasing order: so X IX = 10+9 is okay, but IX X = 9+10 is not. Or that is how I would explain it, at any rate. Alatius (talk) 10:01, 18 January 2009 (UTC)
Could be a historical reason. If Roman numerals did indeed derive from tally marks, as seems likely, then XIX would be an abbreviation of IIIIVIIIIXIIIIVIIIIX: that is, the I just before the second X. It's before the second X because that's where it is on an actual tally stick. kwami (talk) 10:24, 18 January 2009 (UTC)
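The rule described above — a smaller symbol subtracts from exactly one following larger symbol, and the resulting additive terms must be in decreasing order — can be sketched in code. This is just an illustrative validator of the rule as explained in this thread, not a routine from the article:

```python
# Values of the seven basic symbols.
VALUES = {'I': 1, 'V': 5, 'X': 10, 'L': 50, 'C': 100, 'D': 500, 'M': 1000}

def parse_roman(s):
    """Evaluate a numeral as a list of additive terms, where a smaller
    symbol placed before a larger one forms a single subtractive pair.
    Returns (value, valid); valid is False when the terms are not in
    non-increasing order."""
    terms, i = [], 0
    while i < len(s):
        if i + 1 < len(s) and VALUES[s[i]] < VALUES[s[i + 1]]:
            terms.append(VALUES[s[i + 1]] - VALUES[s[i]])  # e.g. IX -> 9
            i += 2
        else:
            terms.append(VALUES[s[i]])
            i += 1
    valid = all(a >= b for a, b in zip(terms, terms[1:]))
    return sum(terms), valid

# XIX = X + IX = 10 + 9: terms decrease, so it is well formed.
# IXX = IX + X = 9 + 10: terms increase, so it is rejected.
```

Both strings happen to sum to 19, but only XIX passes the decreasing-order check, which matches the explanation given above.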

Mistake MMVIX instead of MMIX for 2009?

I saw MMVIX instead of MMIX for 2009 in a copyright notice on a television program produced by a relatively small outfit. A web search suggests the same mistake may have been published by an even larger television broadcaster elsewhere. I can't even see how someone could possibly derive such a value. I hope it doesn't become widespread.--SportWagon (talk) 17:46, 22 February 2009 (UTC)

A web search finds a forum where people were posting successive Roman numerals and went MCMVIII, MCMVIX, MCMX. But I also find [1] where MCMVIX is mistakenly used instead of MCMLIX for 1959. When VIX is used instead of 9, one could argue (poorly) that the V is merely superfluous. However, the confusion with 59 suggests it is more than superfluous: it is ambiguous and could be confusing. But if MMVIX does become a habit with broadcasters and other publishers, notes and citations should be made in this article. The show I saw it on was My Classic Car.--SportWagon (talk) 18:03, 23 February 2009 (UTC)
Web search finds several instances [2] . . . -- SportWagon (talk) 19:46, 24 February 2009 (UTC)

You won't believe this, but the nature of the text editor I usually use means that it hadn't occurred to me that the mistake was purely mechanical, made during editing. Someone who had MMVIX on a web page indicated to me, with thanks, that it occurred when they had updated their page by modifying the previously existing "MMVIII". It now seems obvious to me that that's how the mistake occurs, but it really wasn't obvious to me before!--SportWagon (talk) 16:26, 26 February 2009 (UTC)

Something seriously wrong

Hi there,

I came on here to determine the exact definition of Roman numerals, and the article contradicts itself as far as I can see. The question is: can V precede L?

The page says "10^n may not precede any symbol larger than 10^(n+1)". 5 is (approximately) 10^0.7. 10^(1 + 0.7) = 50.12. Therefore 5 cannot precede any number larger than 50.12. Therefore 5 CAN precede 50, and VL is ok. However, the list at the bottom has XLV for 45.

Well, then there's the fact that you approximated and 10^(n+1) = 10 * 10^n, so 5 cannot precede numbers larger than 50.
However, I believe that Kwamikagami below sums it up pretty nicely - n is an integer, so it only refers to I, X, C, M, etc.
For example, XD will not equal 490, but LD will not equal 450 either. ZtObOr 01:35, 25 February 2009 (UTC)

Furthermore, I can't see anywhere in the article why VX is not allowed! Of course it shouldn't be (since VX = V). Same with LC and DM. This should be in here!

Would someone please take the time to clean this up? There should be a nice coherent explanation of what is allowed in roman numerals and what is not! And I should not have to search through the entire article to find it!

Rob (talk) 16:51, 24 February 2009 (UTC)

Because n is an integer, as it stated in the second line. I reworded it. kwami (talk) 18:32, 24 February 2009 (UTC)
I only learned Roman Numerals intuitively. But it seems that only integer powers of ten can be used to "decrement" a decimal digit. I.e. I,X,C and logically eventually M. They are used only to create 4's and 9's.--SportWagon (talk) 19:39, 24 February 2009 (UTC)

Think of Them as Decimal Digits

Thinking about the confusion made me realize that I keep Roman Numerals straight by thinking of them as decimal digits (with never any zeroes, but all digits are actually "scaled" by the appropriate power of ten). But when I over-think about that, it seems it should be wrong. Were there historical reasons why Romans would have concepts of decimal digits even though they represented numerals with their own system? That is, would it be wrong to include in the article somewhere the notion that Roman Numerals are not completely distinct from decimal digits? That is, as far as I can tell, a correct Roman Numeral can always be split into its non-zero decimal digits. I.e. XLV splits into "XL" for "40" ("4"), and "V" for "5". The incorrect "VL" would not divide that way. Perhaps "XLIX" versus the incorrect "IL" is an even more pointed example. ("Can we find a citation for that?"). MMVIX is still just plain sloppy. ("VL", in contrast, is at least creative). --SportWagon (talk) 20:00, 24 February 2009 (UTC)

The spoken language was decimal. It makes sense that if you say "two hundreds and thirty (and) five", you would write CC + XXX + V. Also, in speech the rule is "five and forty" (as in German) or "forty five" (as in English), so maybe writing "VL" would have been confused with 55. 9, on the other hand, was just "nine", and there was no "one and ten" for 11, so IX and XI do not have that problem. kwami (talk) 21:50, 24 February 2009 (UTC)
Yes, the current article does imply Romans counted in decimal. For people who get confused, it seems it could be suggested that any valid Roman numeral must use zero or one combination from each of the lines of the table at the end of the Symbols section (written from left to right, lower lines of table first). But that notion could be introduced in at least two places. and there are already notes about duplication in the article, so I just offer the thought here for now.--SportWagon (talk) 22:08, 24 February 2009 (UTC)
Zero? There is no zero. kwami (talk) 23:01, 24 February 2009 (UTC)
Either you jest, or I didn't explain correctly. Take the table I indicate. From the bottom, go up to the first line with values small enough for the number in question. Select the appropriate value from that line. (This will be the first "digit" of your number, suitably scaled). Then go up to successive lines. If the particular digit is zero, then you pick zero combinations from that line, otherwise you must pick exactly one--the one corresponding to the digit in question--and append it to the numeral you are creating. You keep doing that until you get to the top of the table. Thus my "zero or one". The table defines the combinations which can occur in valid (modern) Roman Numerals; it is more restrictive than more general rules would imply.--SportWagon (talk) 00:18, 25 February 2009 (UTC)
I think that's suitably explained now. kwami (talk) 08:21, 25 February 2009 (UTC)

Definitely Not Decimal? Not.

A revision comment states that the Roman numeral system is definitely not decimal. Simple inspection would seem to say that the system is decimal (based on powers of 10). The basic symbols all represent either powers of ten, or five times a power of ten. A Roman numeral can be visualized as its decimal digits, with omitted zero placeholders (which are unnecessary because the magnitude is explicitly indicated by the choice of symbols). It's not as if the symbols represent dozens and gross, or other truly non-decimal quantities. True, one might be able to construct restrictive definitions of "decimal system" which would exclude Roman numerals, but that wouldn't seem to qualify as "definitely". Given the obviously decimal nature of the system, I don't see why one would need an explicit source for using the word, especially in one of the deletions.--SportWagon (talk) 18:12, 25 February 2009 (UTC)

Attempting to find web citations found this unfortunate garbage where, in the middle of the page, they seem to encourage very young readers to do things like VL. [3]--SportWagon (talk) 18:36, 25 February 2009 (UTC)
Here's someone who agrees with the "decimal" concept.[4]--SportWagon (talk) 18:36, 25 February 2009 (UTC)
A web search does indicate most people contrast "Roman numerals" with "decimal", however. That is, when they say "decimal" they mean more than what we mean by "decimal".--SportWagon (talk) 18:36, 25 February 2009 (UTC)
Another programmer who agrees with much of what we say. [5].--SportWagon (talk) 18:53, 25 February 2009 (UTC)
A "Math Forum" (not forum in that sense...)[6]. Contains further references.--SportWagon (talk) 18:57, 25 February 2009 (UTC)

I was the one deleting the "decimal" remark: thanks for your comments. I must admit that I have not yet checked the links you give, but if Wikipedia has to be consistent as a whole, this article must agree with Decimal (and with Decimal representation). In those articles, and for what I know in the current use of the word (when talking about numeration systems), a decimal numeration system is before anything else a positional system, that is, one in which each digit gets a meaning depending on where it is in the representation of a number. So in "13" the digit "3" denotes three units, while in "31" it denotes three "tens", and so on. So merely the fact that some of the symbols used in Roman numerals denote powers of ten is not sufficient to qualify it as "decimal". Thanks, Goochelaar (talk) 18:59, 25 February 2009 (UTC)

I'll leave it to kwami to choose some appropriate compromises. It seems like the word we want might have disappeared from the English language as "decimal" has taken on extra implications. Ignoring pedantic interpretations of "decimal", the now omitted paragraph following the table should simplify readers' thinking. (And true, one can just as easily say our attempted use of "decimal" is the pedantic one...)--SportWagon (talk) 19:12, 25 February 2009 (UTC)
Egyptian numerals and Chinese numerals are examples of decimal systems which are certainly not positional because they use different symbols for every power of ten. — Joe Kress (talk) 19:41, 25 February 2009 (UTC)
Declaring that decimal must imply positional seems pedantic because it insists one cannot understand a broader definition of a term in different contexts. However, the way you used "decimal" can also be considered pedantic because it would appear that "the decimal system" and "decimal numbers", even just "decimal" are commonly understood to imply the currently widespread positional system. That is it requires the reader to not make common assumptions. I will keep out of the editing for now.--SportWagon (talk) 20:58, 25 February 2009 (UTC)
"Decimal" means base 10, just as "binary" means base 2. That's all it means. We speak of languages as having decimal, vigesimal, etc. systems, but spoken languages are neither positional nor do they have a zero. Other than the auxiliary base 5 (presumably due to the limits of visual processing of iterated symbols), Roman numerals are analogous to the decimal numbering system of the Latin language. If our decimal article is wrong, then it needs to be corrected. The intro to this article clearly states that Roman numerals are decimal but not positional. Though a check with a dictionary is all that should be needed, I added a ref from Ifrah. As far as compromising with people who do not understand the concepts involved, that would be like stating that whales are fish as a compromise with people who don't know the difference. kwami (talk) 21:15, 25 February 2009 (UTC)
Kwamikagami, thanks for your good work on this article, but I must disagree. First of all, the reference you give uses the words "a decimal system in which the number 5 is an auxiliary base" not about Roman numerals, but about a previous, archaic, conjectural numbering system a hypothetical herdsman might have developed in order to tally his animals; in fact, it follows "(and the numbers 2 and 5 are alternating bases)". Second, we might as well say that it is a "quinary", or base 5, system. Third, even if we find and agree on a source that describes Roman numerals as decimal under a broader definition of "decimal", we should immediately modify Decimal and Decimal representation (we have to be able to link the word "decimal" in this article to one of those articles).
You are right about languages, but we are not covering Latin language here; only this numeration system as used then and now in several countries with several different languages (in fact, nowhere is told anything about the Latin names for composite numbers, that is, different from 1, 5, 10 etc.).
As this is an encyclopedia, we are forced to be careful about the meaning we attribute to the words we use (even when this looks like pedantry). Goochelaar (talk) 23:45, 25 February 2009 (UTC)
It is nothing close to being a base 5 system. 25, 125, 625, etc. do not fall out as simple representations the way C and M, and larger powers of ten, do. The use of base 5 is merely a means of shortening what are conceptually decimal digits.--SportWagon (talk) 23:55, 25 February 2009 (UTC)
Yes, Ifrah was speculating on the origins of the system, but what he ended up with was a decimal system "exactly the same as in the Roman system". As for language, my point was that if decimal means base 10 when describing numeral systems in language, it means base 10 when describing numeral systems in writing. I've seen no reason to believe that the word "decimal" changes definition depending on which medium we're discussing. SportWagon is right. There is no *VVV for 15, or *LLLL for 200, which is what Ifrah meant by 5 being "auxiliary". Roman numerals are base 10, therefore decimal. kwami (talk) 00:24, 26 February 2009 (UTC)
There are two main definitions. The historically primary definition deals with fractions, as decim means a tenth, but since at least 1684 the phrase "decimal fraction" has been used to disambiguate. Roman numerals are not decimal in this sense, since the fractions were duodecimal. But no-one uses Roman fractions anymore; for nearly everyone, we're talking about the non-fractional part. The second definition, per the OED, is "decimal numeration, the numerical system generally prevalent in all ages, of which 10 forms the basis; i.e. in which the units have distinct names up to 10, and the higher numbers are expressed by multiples or powers of 10 with the units added as required." There is ambiguity with decimal referring to decimal point & decimal places, etc., but the intro is clear enough for the reader to follow.

To be precise, Ifrah says that "the successive order of magnitude [used by the hypothetical herdsman] are exactly the same as in Roman system", not the system itself about which he only says that "the graphical forms for the figures ... are closely comparable with those in the archaic Roman and Etruscan systems". More importantly, the paragraph

The number system of the Latin language was decimal. That is, one said "one thousand and two hundreds and thirty [and] four". When writing a Roman number, the thousands, hundreds, tens, and units in the chart above are strung together the way they are spoken: M (one thousand) + CC (two hundreds) + XXX (thirty) + IV (four), for MCCXXXIV. Thus eleven is XI, 32 is XXXII, and 45 is XLV. Note that the subtractive principle is not extended beyond the chart, and *VL is not used for 45, as it does not correspond to the spoken language.

cannot stay as it is. Of course, in Latin one did not say "one thousand etc."; if anything "mille etc." Moreover, we cannot bring Latin language into this, because it would immediately contradict what is being said: "eighteen" is in Latin "duodeviginti", that is, "two-from-twenty", which would suggest such an expression as *IIXX, and similarly for 19, 28, 29 and so on. In order to find a solution acceptable to everybody, I suggest rewriting the former along the lines of

A practical way to write a Roman number is to consider it as if it were written in the modern decimal number system, and string together separately the thousands, hundreds, tens, and units as given in the chart above. So, for instance, 1234 may be thought of as "one thousand and two hundreds and thirty [and] four", obtaining M (one thousand) + CC (two hundreds) + XXX (thirty) + IV (four), for MCCXXXIV. Thus eleven is XI, 32 is XXXII, and 45 is XLV. Note that the subtractive principle is not extended beyond the chart, and *VL is not used for 45.

Would this be acceptable? (The problem would still remain to make this article and those on decimal notation not contradict each other.) Goochelaar (talk) 01:11, 26 February 2009 (UTC)
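The construction proposed above — consider the number digit by digit and string together the thousands, hundreds, tens, and units chunks — takes only a few lines of code. This is my own sketch of that procedure using the standard modern chunks, not code from the article:

```python
# Chunks for each decimal place; list index = the digit in that place.
THOUSANDS = ['', 'M', 'MM', 'MMM']
HUNDREDS  = ['', 'C', 'CC', 'CCC', 'CD', 'D', 'DC', 'DCC', 'DCCC', 'CM']
TENS      = ['', 'X', 'XX', 'XXX', 'XL', 'L', 'LX', 'LXX', 'LXXX', 'XC']
ONES      = ['', 'I', 'II', 'III', 'IV', 'V', 'VI', 'VII', 'VIII', 'IX']

def to_roman(n):
    """Convert 1..3999 by concatenating the chunk for each decimal digit;
    a zero digit simply contributes the empty string."""
    return (THOUSANDS[n // 1000] + HUNDREDS[n // 100 % 10]
            + TENS[n // 10 % 10] + ONES[n % 10])

# 1234 = 1000 + 200 + 30 + 4 -> M + CC + XXX + IV -> MCCXXXIV
```

Because VL is simply not in the tens table, the procedure can never produce it for 45; it yields XL + V = XLV, exactly as the proposed paragraph says.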

Yeah, that's better. I'd forgotten about duodeviginti. I'll go ahead and change it. kwami (talk) 02:17, 26 February 2009 (UTC)

The new paragraph is about what I thought could be asserted. The advice also helps for reading Roman Numerals. Why not say "numeral" rather than "number"? When you begin to move towards insisting that decimal comes from "tenth" rather than "ten", you also start to call some explanations of the term "decimal system" into question. Strange, web searches for "decemal" find first references to "decemal point". Oh well. Would some remaining uses of the term "decimal" in this page be better changed to references to "powers of 10", or possibly "base 10"?--SportWagon (talk) 03:21, 26 February 2009 (UTC)

We could probably come up with different wording, but I don't see the point. Decimālis means 'pertaining to decima', which means 'tenth' or 'tithe'. The earliest usage I can find is for writing fractions rather than xx/xxx. However, that is only one use out of several, and when people speak of decimal numeration, it has nothing to do with fractions, just as it has nothing to do with tithing. —kwami (talk) 04:11, 26 February 2009 (UTC)