Talk:Algorithm

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Splang (talk | contribs) at 12:31, 11 May 2006 (→‎Translation). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.



Basic Definition of the Algorithm

I removed: "Some people restrict the definition of algorithm to procedures that eventually finish, while others also include procedures that run forever without stopping." Because, A) it's opinion. B) it's moronic.

It is not opinion, it is fact. Some people do indeed restrict the term in this way. AxelBoldt 16:55 Jan 4, 2003 (UTC)

Haven't you ever heard of a "non-deterministic algorithm"? The term has been used since the 1950s.

Imagine this algorithm:

  import random

  while True:
      a = random.randint(0, 10)   # a random integer from 0 to 10
      if a < 5:
          break

That's a set of repeatable steps. Yes, the results will be different each time, but it's still an algorithm. This is too simple to have a debate on.

Lots of people will disagree that this is an algorithm. You can't implement it on a Turing machine, for example. AxelBoldt 17:02 Jan 4, 2003 (UTC)

I agree that this is moronic, but I'm afraid AxelBoldt has a point about many people using it this way. This distinction even makes it into a footnote in the not that technical book Darwin's Dangerous Idea (p. 52). --Ryguasu 05:29 Jan 12, 2003 (UTC)

I just want to add that non-deterministic, applied to algorithms, does not mean non-terminating. Non-deterministic processes are ones that, when faced with recurring multiple choices, arrive at the successful set of choices straightaway. That is, they behave as if they make the right choice every time. This overcomes a problem with deterministic processes, where a number of very bad choices made first can lead to expenditure of extreme effort, and for some methods there is no perfect first choice. (The famous P = NP question is about a comparison of certain kinds of deterministic procedures and non-deterministic procedures for the same problem.) --Orcmid 01:45, 12 Mar 2004 (UTC)
Hmm, I also want to add that if the random integer procedure uses some source of input not given at the start (say some source of thermal noise), the procedure doesn't qualify as an algorithm either, in the usual strong sense. That is, algorithms of the kind that are the usual focus of attention are ones where all inputs are known in advance and someone could sit down and produce exactly the same result from the given initial input and the procedure. For a human procedure, you could probably ask the person to throw dice and other things, but there is some question whether this would qualify, even though it can certainly be carried out. This strong sense of algorithm is not meant to rule out other (computational) procedures, and many computational procedures realized with computers are not algorithmic. There's no shame. Donald Knuth gives some additional categories in Chapter 1.1 of The Art of Computer Programming, and that is probably a perfectly usable separation. --Orcmid 01:45, 12 Mar 2004 (UTC)


I believe it is important to mention that some people use the term algorithm for nondeterministic algorithms, some (like me) use it for nonterminating procedures, and some use it for procedures in which the data used is not all determined in advance (such as cooking recipes). I believe it is important to clarify that different contexts of use exist in which the presence or absence of these restrictions makes a crucial difference to understanding. Rp 6 July 2005 18:48 (UTC)


To put an end (hopefully) once and for all to this debate: in The Art of Computer Programming, vol. 1, Donald Knuth gives a list of five properties of an algorithm:
(1) Finiteness. "An algorithm must always terminate after a finite number of steps."
(2) Definiteness. "Each step of the algorithm must be precisely defined..."
(3) Input. "An algorithm has zero or more inputs..."
(4) Output. "An algorithm has one or more outputs..."
(5) Effectiveness. At first Knuth states, somewhat confusingly: An algorithm's "operations must all be sufficiently basic that they can in principle be done exactly and in a finite length of time by someone using a pencil and paper." He later goes on to clarify that the "effectiveness" principle means that the algorithm requires no cleverness on the part of its executor.
I would like to draw attention in particular to the Finiteness axiom. This should bring the issue to a close and inspire a revision of the article. It will not, since this is Wikipedia. However, since my reference comes straight from the horse's mouth (Knuth), it certainly deserves some mention on the page. There are, after all, various schools of thought (including most mathematicians and computer scientists) which believe that an algorithm must terminate. Thanks, Silly rabbit 08:01, 12 November 2005 (UTC)
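As a compact check against all five of Knuth's properties, consider Euclid's algorithm (a minimal Python sketch of my own, not drawn from the article):

```python
def gcd(a, b):
    """Greatest common divisor of two non-negative integers
    via Euclid's algorithm."""
    # Knuth's properties hold: two inputs, one output, each step is an
    # elementary remainder operation (definiteness, effectiveness), and
    # b strictly decreases toward 0, so the loop always terminates
    # (finiteness).
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 21
```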

Tragedy of the commons

Okay, I haven't gotten any responses yet. So I'm guessing that everyone has either dismissed me or has their own idea of an algorithm differing from the standard. Anyway, I don't wish to rely on the Knuth books as a primary reference. Here is what the U.S. National Institute of Standards and Technology (NIST) offers as the definition of an algorithm: [1]. In that definition, however, they make reference to a computable [2] set of steps. To summarize, an algorithm is a set of operations that can be performed by a Turing machine in a finite number of steps. The input is allowed to be random, but contrary to a prior poster's idea of nondeterministic algorithms, these too must terminate. In fact, I will state it categorically and without any hesitation:
  • THE DEFINITION OF AN ALGORITHM REQUIRES THAT IT CAN BE COMPLETED IN A FINITE NUMBER OF STEPS, REGARDLESS OF THE INPUT DATA.

It isn't usually necessary to prove this mathematically in applications, but it really, truly, honestly is the very definition of an algorithm.

Afterthought: all true wikipedists should go home and weep. This article won an award despite incredible inaccuracies. This is the very definition of a tragedy of the commons. Silly rabbit 01:13, 19 November 2005 (UTC)

Introductory Material

The introductory paragraphs contain this statement:

Correctly performing an algorithm will not solve a problem if the algorithm is flawed, or not appropriate to the problem. For example, performing the potato salad algorithm will fail if there are no potatoes present, even if all the motions of preparing the salad are performed as if the potatoes were there.

I would like to introduce some related machinery here. My draft goes on too long so it might belong in something else or a subordinate topic. At the same time, I propose that it is useful to honor this in what is written, even if it is left tacit.

Note: I notice that I have improved the separation of algorithm from procedures, but I have still left some ambiguity between a procedure as carried out, and the description of the procedure in a script, recipe, or other writing down of the steps to follow. That matters because the expression of the procedure is to be finite, though the carrying out of the procedure is a different story: an algorithm is expressed only when the procedure always completes in a finite number of steps for all valid (well-defined and finitely-expressed) inputs. It is important to sharpen these distinctions because computer programs are expressions of procedures, and it is the procedure that may or may not be algorithmic. (As I read that, I wonder if it is worth pursuing, unless one can show that careful use of such hair-splitting language clarifies things enough. It needs to be developed further to see, with honoring of the common usage of these terms as much as possible.) --Orcmid 21:35, 17 Mar 2004 (UTC)

First, I would start by saying

Correctly performing the procedure will not solve a problem if the procedure is flawed or the algorithm implemented using the procedure is not appropriate to the problem.

I don't have a problem with the hypothetical potato-salad algorithm, though I can see adding

For example, if following the procedure given as "the potato salad algorithm" produces macaroni salad, we know that the procedure is defective (as an algorithm for potato salad). And if the procedure is impossible to follow, or doesn't reach a point of conclusion, we would question whether that procedure delivers an algorithm for anything, let alone potato salad.

Next, it is useful to point out, especially for the young reader, that

We often speak of a particular procedure as realizing an algorithm without concern for what it accomplishes under what conditions, although the recognition of the correctness of an algorithm involves establishing that it satisfies a well-defined purpose with specific input conditions. In too many cases, the purpose of the algorithm is omitted from its name, and some notable algorithms are simply named for the people who codified them.
Some well-known algorithms are Euclid's algorithm (for the greatest common divisor of two numbers), the Sieve of Eratosthenes (for finding prime numbers), Newton's method (for solutions of certain equations), Dijkstra's algorithm (for finding least-cost routes in a network), Warshall's algorithm (for problems to which Dijkstra's algorithm is also applicable), and many more. Other algorithms have been given names that reflect the procedure as well as the purpose: Bubble Sort, Quick Sort, Radix Sort and Merge Sort; Binary search and Hash Lookup. [more Wikipedia links here would be very cool]
Some algorithms are so well-known and familiar that we don't recognize them as such. The procedure printed on the back of bank statements for balancing a checkbook is algorithmic. When a discrepancy arises, it is because of an error following the procedure or in our or the bank's records. The importance of the procedure's use in a valid algorithm for check-book balancing is that any discrepancy is a reliable clue that the problem is in the data or our following of the procedure. If we also have doubts about the validity of the procedure and wonder whether there is a "bug," trouble-shooting a discrepancy is now much more problematic. We often rely on algorithms, including ones realized via computer software, that we do not know how to derive or confirm ourselves. Normally, we take as confirmation having experiences consistent with the procedures being correct.
The most well-known algorithms are the ones we learn as children. The procedures for arithmetic -- addition, subtraction, (long) multiplication, (long) division, and so on -- are all algorithmic. These highly practical algorithms are valuable because of their simplicity. We can count on them, and they provide numerous ways for checking each other's work and finding errors of calculation.
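As an illustrative sketch of the point (my own rendering, assuming decimal digit strings as the representation), the paper-and-pencil addition procedure can be written out in Python:

```python
def long_addition(x, y):
    """Digit-by-digit addition with carries, as taught on paper.
    x and y are strings of decimal digits; returns their sum as a string."""
    digits = []
    carry = 0
    i, j = len(x) - 1, len(y) - 1     # start from the rightmost column
    while i >= 0 or j >= 0 or carry:
        total = carry
        if i >= 0:
            total += int(x[i])
            i -= 1
        if j >= 0:
            total += int(y[j])
            j -= 1
        digits.append(str(total % 10))  # write down the units digit
        carry = total // 10             # carry the rest to the next column
    return ''.join(reversed(digits))

print(long_addition('478', '95'))  # 573
```

Every step is elementary and the loop runs at most one pass per column plus a final carry, so the procedure visibly satisfies the finiteness requirement.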
A pocket calculator has comparable, though different, procedures for achieving the same arithmetic results. Because we cannot observe the calculator's procedure, and we use the calculator for operations that we might not know how to duplicate ourselves, it is extremely important to be able to trust that the calculator implements reliable procedures for confirmed algorithms and that it delivers the functions that it is specified to calculate.
We can see that there may be many algorithms that satisfy the requirements of a given problem (the two approaches to Towers of Hanoi would be excellent here). Determining that one has successfully arrived at an algorithm for the desired problem and also determining how good that algorithm is in terms of simplicity and efficiency has led to a body of study known as Analysis of Algorithms.

OK, that's enough for now. I don't know where I would introduce random procedures (for testing primality, for example) as heuristics that can be far more efficient than known algorithms, and how to relate to that. To go deeper, we might also need to broach the notion of what can be done by not requiring an algorithmic approach to a problem, and recognizing that interactive behavior of computers may be algorithmic in the procedures that are applied although the overall interactive process is not itself algorithmic. I think these topics can be acknowledged, but not this close to the front. --Orcmid 23:19, 13 Mar 2004 (UTC) updated by Orcmid 21:20, 17 Mar 2004 (UTC)

Formalized Algorithms

Well, I went and read through some of the early material. There is a "Thus ..." that ends with "Turing-Complete system" that is off the mark in two regards. First, there is nothing to make "Thus" follow here. Second, the reference to "Turing-Complete" is very tricky, appeals to knowledge that we don't want to presume, and might even be misleading for those of us who think we know what that means.

Oh. I see the problem. In this paragraph, computer programs are equated with algorithms, rather than with procedures. It is important to remember that not every procedure is an algorithm. Furthermore, not every computer program is even a procedure in the sense required before the satisfaction of the additional conditions can be considered. The particular conditions that a procedure must satisfy to be the procedure of an algorithm are stated well enough in the first paragraph of the article. I don't think we should mess with that. One might say that an algorithm consists of a certain form of procedure plus some other things. In addition, not every computer program is a procedure of the certain form required by the definition of algorithm! (I will leave that as a puzzle for the reader for now. Hint: The procedure that a universal Turing machine follows is not an algorithm. Well, that's accurate but not such a great hint. Consider ordinary computers and the impact of input-output operations and using new inputs as part of manipulation of the procedure itself. That's a better hint.) --Orcmid 01:30, 15 Mar 2004 (UTC)

There is some work called for here. For starters, it is not the case that every procedure embodied in a Turing machine is an algorithm. That is not an equivalence. It is the case that, for a certain class of problems, there are algorithms for them, and there are Turing-machine representations of those algorithms. But not every Turing machine illustrates, represents, or embodies an algorithm. Maybe that wasn't meant, but it is too easy to read that claim into the text as written. --Orcmid 02:56, 12 Mar 2004 (UTC)


Standardized Pseudocode

I think it would be a great idea if we came to some sort of standard on writing pseudocode. I've used a sort of hybrid procedural style for the algorithms in Linear search and Binary search but I'm wondering if there's a better standard out there. Can we borrow a style from a textbook like Intro to Algorithms? (is this copyrighted or is style a public domain thing like an idea?). Only a fairly small set of control structures is needed -- something to define functions, if-statements, loops, list access, mathematical operators and probably a few more. Comments? Mark Jeays

I do think we should try to do standard pseudocode. I have no idea what that pseudocode should be, however. CLR's style is usually clear, but sometimes I find it confusing (often because I do not parse the A <- B assignment syntax properly). I don't think things like pseudocode style are copyrightable... We should make a pseudocode article that defines whatever we use (in addition to explaining what pseudocode is in general), then link to that from every pseudocode example. In addition, we could include examples in other languages (see bubble sort for example) by putting them in subpages like bubble sort/C++. --BlckKnght

Robertson, L.A. has released a book which attempts to standardise pseudocode by specifying pseudocode words (READ, GET, DOWHILE, etc.): "Simple Program Design - A Step by Step Approach", Nelson Thompson Publishing, 2000. The pseudocode examples are close enough to most high-level languages so as to be easy to translate into any language.

We definitely need code samples for all the algorithms; I think there should be one language (pseudocode or otherwise) on the main page, and implementations in other languages on subpages, as BlckKnght suggested above.

I think some executable language would be far preferable to pseudocode, for the following reasons:

  • it's possible to test executable implementations of algorithms to see if they're buggy
  • executable languages tend to be more rigorous than pseudocode; people writing in pseudocode tend to gloss over relevant details (like whether the range n..m includes n, m, both, or neither --- this is a huge difficulty with, for example, binary search)
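To make the range point concrete: here is a hedged Python sketch of binary search (my own, not taken from any Wikipedia article) that states its interval convention explicitly, which is exactly the detail pseudocode tends to leave ambiguous:

```python
def binary_search(seq, target):
    """Return an index of target in the sorted sequence seq, or -1.

    The invariant uses a half-open interval [lo, hi): lo is included,
    hi is excluded.  Spelling this out prevents the classic off-by-one
    bugs that ambiguous n..m ranges invite.
    """
    lo, hi = 0, len(seq)
    while lo < hi:
        mid = (lo + hi) // 2
        if seq[mid] == target:
            return mid
        elif seq[mid] < target:
            lo = mid + 1   # target can only be to the right of mid
        else:
            hi = mid       # mid is excluded from the new interval
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```

Because the code is executable, the convention can be tested rather than argued about.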

My current favorite languages for this are Scheme, C, and Python.

Python is the most readable of the three; it reads like pseudocode itself. It's also the least standardized of the three, the most rapidly changing, and the one with the most "hidden" stuff going on behind the scenes. (arr.append(item) in Python may allocate new space for the array if the space it's in isn't big enough; that really screws with algorithmic complexity analysis.)

C is the most widely-used of the three, probably the one with the most complex and irregular syntax, the most standardized of the three, the least rapidly changing, and the one with the least "hidden" stuff going on behind the scenes. It's also the most verbose and the least readable for people who aren't familiar with the language or one derived from it. (Although, since it is so widely used, almost anyone who knows how to program is familiar with the language or one derived from it.)

Scheme is intermediate between C and Python, in my opinion, except that it is the least widely used.

For now, I'm going to add implementations in Python to the algorithm pages, and when I have time, I'll add implementations in C and Scheme on subpages.

-- Kragen

Kragen, have a look at the pseudocode page, we thrashed out a "standard pseudocode" for Wikipedia a while ago (though feel free to make suggestions/improvements). The trouble with providing actual implementations of algorithms in real languages is that the trees start to get in the way of the forest. This is less of a problem in Python and Scheme, of course. --Robert Merkel

Hmm, I don't see a standard pseudocode section anywhere. I wonder where it went? I thought it would be here. -- Kragen

I advise you to not pursue any endeavor to create a standard pseudocode. I created one that I pushed vociferously and it was rejected by the community, who claimed that a standard was undesirable and unnecessary. I still think they're all wrong, but nothing I can do about it. See User:Dcoetzee/Wikicode. Deco 02:02, 30 July 2005 (UTC)

Whoa. User_talk:Dcoetzee/Wikicode/Specification has quite a bit of discussion (about 15000 words' worth) that could use some summarization; it appears to include most of the usual programming-language holy wars. I still think it's a good idea to try to make algorithm descriptions more similar to one another, and executable, when that doesn't interfere with their clarity. But maybe Wikipedia isn't the right place to describe algorithms? -- Kragen

Computer-Science Bias

An algorithm is not a rough form of a computer program. This is an example (among many) of the distinct Computer-science bias of the Wikipedia.

Not every computer program is an algorithm either, at least according to some of the definitions of algorithm.

I don't have a reference handy, but I have seen algorithm defined to mean, roughly, a finite set of rules that is supposed to produce an answer in a finite number of steps. Therefore, infinite loops (which can occur in computer programs) cannot occur in algorithms according to this definition.

This is, by the way, one of the motivations for the study of the halting problem. How do you prove that a certain method is in fact an algorithm?
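A concrete taste of that difficulty (my own sketch, not from the discussion above): the Collatz procedure below is a perfectly definite, finite set of rules, yet nobody has proved that the loop terminates for every positive input, so whether it qualifies as an algorithm in the strict sense is an open question:

```python
def collatz_steps(n):
    """Count steps of the Collatz procedure until n reaches 1.

    Each step is well-defined, but no proof is known that this loop
    halts for all positive n -- so calling the procedure an
    'algorithm' in the strict (always-terminating) sense is, as of
    today, unjustified.
    """
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

print(collatz_steps(27))  # 111
```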

Well, it is done one computational procedure at a time. I don't want to be facetious. I agree with the concern here, but we may be mixing too many things together. Maybe the lapse is one of carelessness on the part of computer scientists, though I think it is more of an overgeneralization by practitioners and in the teaching of programming, not so much hard-core computer science. Either way, I think there is a legitimate concern here.
I am concerned with the substitution of computer codes for algorithmic definitions also, though there is nothing that says a computer code can't implement an algorithm. (Be shown to terminate, clearly consist of well-defined operations and rules for their progressive carrying-out, etc.)
But one must be careful about taking a computer code as the expression of an algorithm. For example, there is more assumed in the Ruby illustration of binary search than meets the eye.
So what is to be done here, in this section, and in the expression/illustration of algorithms in Wikipedia? I wouldn't recommend an overhaul, though maybe some tidiness can be introduced. For example, the binary search section currently makes it clear that it is using Ruby as a means for illustrating an algorithm. Considering that it makes the idea of binary search pretty accessible, I would not complain.
Here in the section on algorithm, I would be more concerned. I also think that maybe procedure should be used in a lot of places, including binary search, and not over-use "algorithm" quite so much.
I have given longer expression to my concern in discussions around the notion that programs are rarely algorithms. I also have an in-progress article on the question "Do Programs Teach Algorithms" that, as it happens, uses Binary Search (and Python) as a basis for discussion. --Orcmid 02:09, 12 Mar 2004 (UTC)

There is an important reason to preserve the strict notion of algorithm (and somehow allow the important informal notion of effective procedure to survive as well). It is not just historically valuable, it preserves important content. Many of the historical breakthrough results were produced using the strict notion of algorithm. When there is a relaxed use of the term, now, and those earlier results are repeated, they are implied to cover a wider territory than has ever been demonstrated. That leads to sloppy science and, potentially, outright error. I think we should be more prudent, but still find good ways to speak and write in an informal setting. --Orcmid 02:09, 12 Mar 2004 (UTC)


I think this article could do with some cleaning up to clarify the distinction between the more formal use of the term algorithm (a finite set of well-defined instructions for accomplishing some task which, given an initial state, will terminate in a corresponding recognizable end-state) and the more commonly understood definition (a step-by-step procedure for solving a problem or accomplishing an end, especially by a computer). There seem to be some references to this in the article, but would it not be better to clarify this distinction up front? At least some form of disambiguation. Medical algorithm links to algorithm, yet does not match the formal definition. MickWest 16:58, 21 July 2005 (UTC)


Ease of Understanding

The whole idea of an encyclopedia is to explain basic concepts to people who don't know anything about them, including (or especially) 10- or 12-year-olds. I respectfully submit that anyone who knows what a "greatest common divisor" is (and probably anybody who knows what an "integer" is) already knows what an "algorithm" is. It's okay to put in the stuff that reads like a math textbook, but before that you need introductory material for the ordinary people who are never going to read past that expanded definition in lay terms. What you should really do is write something dynamic that would show a bubble sort of a short alpha list that wouldn't take an explanation at all but would show the items swapping and bubbling up the list. -- isis 22:46 Oct 8, 2002 (UTC)
GCD and HCF are usually covered at junior school level (10-12). Integers are probably called "whole numbers" at that stage, but the concept is graspable. I'm not sure at what age one encounters a term like "algorithm". Later, I think. -- Tarquin 22:50 Oct 8, 2002 (UTC)

There's no question those concepts are taught in math classes at that level, but that's irrelevant to the topic under discussion. My point is that the purpose of an encyclopedia article is to define/introduce its subject to someone who doesn't yet know anything about it. Unless this article explains "algorithm" in terms someone who doesn't already know what one is can understand, it is useless for its intended purpose. Ordinary people (of any age, but I'm talking about only the U.S., and it may be different elsewhere in the world) don't know what GCDs and integers are, and they don't need to know what they are to understand what an algorithm is, so the article should be written (at least at the beginning, and you can put in the technical stuff farther down) in terms they do understand -- that could have been accomplished by using the example of putting a list of numbers in numerical order, for example.

But there's a larger issue I consider pertinent here, and that's that Americans are innumerate, not so much because they're incapable of understanding numerical concepts as that they've acquired a fear of math, and that's the fault of generations of math teachers who didn't understand or couldn't explain the concepts in terms that made them accessible. Those of us who are teaching math now (including teaching it thru the 'pedia) have an obligation to do better by the ones we teach, because math is so much more important a tool than it has ever before been in society. The more non-numerical examples we use to explain the concepts, the less math resistance we have to overcome, and the better (and easier) we get our job done. Therefore "word problems" of practical, everyday matters are better than sets of equations for illustrating mathematical principles, and therefore a list to be alphabetized is better than numbers to be sorted for explaining "algorithm," because it makes the readers comfortable with the concept before they realize it's math and resist it. (I have yet to see a 'pedia article on a mathematical topic that didn't look like it came from a math textbook instead of an encyclopedia.)

I find it interesting that encyclopedias do not seem to reflect your maxim about what encyclopedia articles are for. I think what you propose is laudable, however. How do we deal with the fact that algorithm is listed under Mathematics in the Featured Articles section? --Orcmid 02:41, 12 Mar 2004 (UTC)

So I respectfully insist that the example in this article needs to be of a simple sorting algorithm (and there is none simpler than a bubble sort, so I'm still voting for that) that anyone can understand. I'd like for it to be of alphabetizing items, because that would obviate math anxiety, but if it has to be numerical, make it something like putting checks or invoices in serial-number order. I'd like for it to be dynamic, showing the items swapping in pairs, but I don't know how to program anything that moves, although I've started introducing animated gifs to encourage contributors that do know how to animate illustrations to do so. It is ridiculous not to take advantage of the capabilities of this medium, especially when equations are so deadly dull, and instead of showing the transformations in a mind-numbing list of them, you could have the elements moving in and out of the equations on the screen, and the novelty of that would suck your readers in, so they would pick up the concept before they realized it. -- isis 00:17 Oct 9, 2002 (UTC)

I completely agree with you that the first example should be as simple and intuitive as possible, and sorting names is simpler than computing gcd's. I just didn't like the bubblesort description since it wasn't explicit enough. By the way, I always thought selection sort is simpler: pick the smallest element, put it at the top. Pick the next smallest element, put it at position two, etc. That's how I usually sort things.
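The selection sort just described ("pick the smallest element, put it at the top, then the next smallest...") might be sketched like this in Python (an illustration only; the names are mine):

```python
def selection_sort(items):
    """Sort a list in place by repeatedly selecting the smallest
    remaining element and swapping it into the next position."""
    for i in range(len(items)):
        # Index of the smallest element among items[i:].
        smallest = min(range(i, len(items)), key=items.__getitem__)
        items[i], items[smallest] = items[smallest], items[i]
    return items

print(selection_sort(["cherry", "apple", "banana"]))
```

Sorting names rather than numbers, as suggested above, needs no change at all: Python's comparison works on strings directly.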

Interesting. Bubblesort and selection sort are very closely related. For selection sort, you have to remember where the least element was found and exchange it with the leading element, or otherwise slide things. In Bubblesort, you bring the smallest element to the front by starting at the far end and comparing and exchanging adjacent elements as necessary so the smaller of the two is always moved forward and used in the next comparison. That way, when you complete a sweep, the least member is in the leading slot. Then, on the remaining unsorted n-1 elements, do it again, excluding the already-sorted elements. Rinse, repeat. (Termination condition is that nothing moved, or you can just go until there is only one element left - it will be the largest.)
I think selection sort animates nicely. It is also easier to illustrate as a paper-and-pencil method where the sorted list is produced in a separate column. Bubblesort is a little trickier as a computer animation and is not very suitable as a manual procedure (interesting point). I like the Towers of Hanoi because there are two quite different solutions, the problem is easily understood, it exhibits combinatorial explosion, and also shows that (efficient) algorithmic procedures often do not resemble the problem they are solving in a clear-cut way -- Euclid's algorithm does that too, more directly. This shows the difference between clarity/directness of solution and optimum effort, and raises important questions about trusting in algorithms we do not understand. I recommend that sorting illustrations be in articles on the specific algorithms, though. The current example of finding the largest element in a list works, although it would be nice to establish the initial conditions (at least one member in the list). One can raise the influence of data representation and use of available elementary operations with even this simple one, including transformation to a similar algorithm employing the same principle (finding the least instead, etc.) --Orcmid 16:39, 15 Mar 2004 (UTC)
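The front-bubbling sweep described above (start at the far end, compare adjacent pairs, carry the smaller element forward) might be sketched as follows. This is one of several common bubble sort variants, rendered in Python for illustration:

```python
def bubble_sort(items):
    """Bubble sort, sweeping from the far end toward the front so that
    each pass leaves the smallest remaining element in the leading slot."""
    for i in range(len(items) - 1):
        # After this sweep, items[i] holds the smallest of items[i:].
        for j in range(len(items) - 1, i, -1):
            if items[j] < items[j - 1]:
                items[j], items[j - 1] = items[j - 1], items[j]
    return items

print(bubble_sort([5, 2, 4, 1, 3]))
```

Note the sweeps shrink by one each pass, exactly the "exclude the already-sorted elements" step in the description.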

As to your contention that Wikipedia articles are too technical for an encyclopedia: here is the start of Encyclopedia Britannica's algorithm article, (fair use):

systematic procedure that produces—in a finite number of steps—the answer to a question or the solution of a problem. The name derives from the Latin translation, Algoritmi de numero Indorum, of the 9th-century Muslim mathematician al-Khwarizmi's arithmetic treatise “Al-Khwarizmi Concerning the Hindu Art of Reckoning.”
For questions or problems with only a finite set of cases or values an algorithm always exists (at least in principle); it consists of a table of values of the answers. In general, it is not such a trivial procedure to answer questions or problems that have an infinite number of cases or values to consider, such as “Is the natural number (1, 2, 3, . . .) a prime?” or “What is the greatest common divisor of the natural numbers a and b?” The first of these questions belongs to a class called decidable; an algorithm that produces a yes or no answer is called a decision procedure.

And it gets worse from there; not a single example. AxelBoldt 00:33 Oct 9, 2002 (UTC)

Articles should of course start with good definition/introduction paragraphs that give a person of average intelligence a good idea what the subject is and why it is important but we shouldn't dumb things down to the point where articles are not useful to people who already know the basics. BTW, we should also aim to be better than Britannica in depth, breadth and accessibility by starting off with the basics and building from there. Just my 2 cents. --mav 00:37 Oct 9, 2002 (UTC)

As to selection sorts being easier than bubble sorts, I guess my age is showing -- I learned to sort in assembly language without matrix notation, and bubble sorts were much easier (at least for me) than selection sorts, especially because when there wasn't enough room to hold two lists of the given length (which was usually the case), the job would bounce instead of running, so it wouldn't be done when the assignment was due. I think most of us do use selection sorts in sorting alphabetical or numerical items that are physically separate (like file cards or cancelled checks), so I think that would be a good example to use, but I'd still like to have a dynamic image of the cards rearranging themselves, the way my Windows shows me my files flying from one folder to another.
Did you cite the Encyclopædia Britannica article because you meant to suggest it was worse than the current Wikipedia article? To me, it's not, -- it starts with a definition that is accurate, complete, and concise, something neither the first sentence nor the first whole paragraph of the current Wikipedia article does. Or did you mean to imply that if ours is no (or not much) worse than theirs, it's acceptable? Even if I considered EB the standard for encyclopedias (and I never have -- I've always preferred the Encyclopedia Americana to EB and Funk & Wagnalls to World Book, which were the standard reference books in libraries when I was in grade-school), I'd still like for the 'pedia to be the best it could, not just as good as the supposed competition.
So mav has expressed my position very well: Each article should start out with introductory material that explains to an average reader what the article is about and why it is important, and then segue into some more advanced material that will appeal to someone who already knows something about the topic (and then there may even be some advanced topics that appeal only to specialists). A reader who wants the basics will read only the introductory part (so it needs to be complete for that purpose), but that part should not be so simplified ("dumbed down") that it will offend the specialist who skims over it getting to the meatier part. -- isis 01:51 Oct 9, 2002 (UTC)
I gave the EB text because I consider it far worse than our article: they start out with saying "in a finite number of steps" which is wrong; then they enter the topic of table lookups for finite problems which is completely beside the point, and then they go right into primes and gcd's without a single example algorithm. And I don't find the definition "systematic procedure that produces—in a finite number of steps—the answer to a question or the solution of a problem" any better than "well-defined method or procedure for solving a problem, usually a problem in mathematics or otherwise relating to the manipulation of information". AxelBoldt 02:45 Oct 9, 2002 (UTC)
Whoa! You threw me. Why do you say "in a finite number of steps" is wrong? They are not implying a fixed number of steps, only a finite number. It is common to require that an algorithm (not every computational procedure) terminate. Since the steps are discrete, that kind of means you can count them, and when the algorithm terminates given a particular input, you can read the counter. Any example of sorting, for example, given a finite input (also a requirement in most definitions of algorithm) had better be one that terminates on any valid input or we would throw it out as an algorithm for sorting. --Orcmid 02:19, 12 Mar 2004 (UTC)
Well, as the saying goes, "There's no accounting for tastes." -- isis 03:03 Oct 9, 2002 (UTC)
First, I don't think this is a matter of taste, but of understandability - and I think MickWest raised an excellent point: guaranteed termination and e.g. determinism of an algorithm is sometimes assumed, sometimes not, and it is important to be aware of this when seeing the term used. What is more, the phrasing 'terminates in a finite number of steps' is ambiguous: it can either be interpreted strictly, to mean that termination must happen in a finite number of steps on any input, or liberally, by saying that termination must be within a finite number of steps on any input for which an answer is defined at all. This is an important difference: the difference between primitive recursive and recursive, or if you're a programmer, between for loops and while loops. I think it is important to have at least some mention of this in the text. Rp 02:06, 6 May 2006 (UTC)[reply]
Second, I like sorting as an example, mainly because I once found myself switching from selection sort to merge sort, when I had to sort a large stack of exams. There wasn't a computer in sight! I'm sure anyone can understand the task of alphabetizing or otherwise sorting a very large stack of items, and can also understand the two sorting algorithms. Rp 02:11, 6 May 2006 (UTC)[reply]
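The exam-sorting switch can be made concrete; here is a rough Python sketch of merge sort (my own illustration), where the merge step corresponds to interleaving two already-sorted stacks of papers:

```python
def merge_sort(items):
    # A list of zero or one items is already sorted.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # sort each half separately...
    right = merge_sort(items[mid:])
    merged = []                       # ...then merge the sorted halves
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # append whatever remains
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))  # -> [1, 2, 5, 7, 9]
```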

Choice of Example

We need a different algorithm for the example; this Euclidean GCD one is too unintuitive. I have a master's in math, and I can't figure it out, so I know ordinary readers can't follow it. -- isis 20:54 Oct 27, 2002 (UTC)

How about the trial division prime testing algorithm, or maybe one of the bin-packing algorithms ?

How about the Sieve of Eratosthenes? okay, so it's infinite, but it's a simple procedure to explain, it's clear how & why it works, and you can limit it to, say, numbers up to 100 so it terminates. I remember doing it at age 10 or so. -- Tarquin 21:09 Oct 27, 2002 (UTC)

I have no idea what that is, but it sounds dirty -- let's do it. -- isis 21:17 Oct 27, 2002 (UTC)
Sieve of Eratosthenes --Imran 21:35 Oct 27, 2002 (UTC)
I'll tell you what: You go out to a shopping area, a mall or downtown, and stop 10 people at random and ask them, and if as many as five of them know what a prime is, I'll agree to this one. The problem I'm having is that the purpose of an example is to clarify things for somebody who's looking the topic up because they don't know anything about it, and none of these mathematical concepts (like "divisor" and "prime") is part of most people's vocabulary, not to mention mind-set. I still say this needs to be an algorithm for something ordinary people understand, like alphabetizing or (if you can't live with its not being numbers) counting or sorting. -- isis 21:52 Oct 27, 2002 (UTC)

Sorting is a simpler concept, but is actually a more complex algorithm in terms of the actions involved and what they do. The sieve is easy: you just write the numbers down then cross them out. It's clear that it works, because when you cross out 2,4,6,8, you're removing all the multiples of 2. I think the explanation on Sieve of Eratosthenes could do with a rewrite though. Some 20% of the shopping mall people will be illiterate, but we're not dumbing down our long words here for them. We must make articles accessible to the intelligent lay-person. That means someone who can look up "prime" and "divides" and quickly grasp the concept. Like I said, it's 10-year old maths: I clearly remember doing it. If people are so innumerate that they forget this stuff ... *sighs in despair* -- Tarquin
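The write-down-and-cross-out procedure described above translates almost directly into code; a minimal Python sketch (limited to a fixed range so it terminates, as suggested, and with names of my own choosing):

```python
def sieve(limit):
    # Write down 2..limit, then cross out the multiples of each
    # number that has not itself been crossed out.
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False   # cross out a multiple of n
    return [n for n in range(2, limit + 1) if is_prime[n]]

print(sieve(30))  # -> [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```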

I agree with Tarquin, I certainly recall knowing the sieve technique when I was 10-11, possibly it's just a UK education thing.
I think we should assume knowledge of what a prime is in the same way we assume people know what a noun is.--Imran 22:19 Oct 27, 2002 (UTC)

Guys, if you're just not willing to put it in terms an average reader of the 'pedia can understand without having to look the words up first, then simply take the example out, because it's not going to do what an example is supposed to do. What you remember from 5th grade is totally irrelevant -- you're not the one that needs to look the word up. The problem is that having an example there that's too hard is worse than not having one at all, because it will scare off someone who was interested enough to try to find out what "algorithm" means. And BTW, people in this country (U.S.) don't know what a noun is, either. -- isis 23:03 Oct 27, 2002 (UTC)

ask the same people you ask about "prime" whether they have any interest in reading about the meaning of the word "Algorithm". Some might -- but those with no grasp of maths concepts will go away happy with the intro text: "Generally, an algorithm is a list of instructions for accomplishing some task..." Knowledge builds on knowledge, and people getting this far should know that. Anyway, why don't we try writing several examples: a sort and the sieve, and leave the GCD too. We can then try and decide which is the simplest to the thick *cough* novice reader. -- Tarquin

I recommend implementing the selection sort algorithm, whether the example is the sort algorithm alone or one among several. Since being simple to understand is the issue, that's the right algorithm. Its correctness is easy to see. TCascardo
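A Python sketch of selection sort (my own illustration) shows why its correctness is easy to see: after pass i, the first i+1 positions hold the i+1 smallest items in order.

```python
def selection_sort(items):
    # Repeatedly select the smallest remaining item and swap it to
    # the front of the unsorted portion -- much as one would sort
    # physically separate file cards by hand.
    items = list(items)   # work on a copy
    for i in range(len(items)):
        smallest = i
        for j in range(i + 1, len(items)):
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return items

print(selection_sort([4, 2, 7, 1]))  # -> [1, 2, 4, 7]
```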

Question on Etymology

I don't know how to fix this, but I'd like to point out that the article's etymology of "algorithm" doesn't look entirely coherent. The article seems to provide two different stories:

  1. It descended from successive translations of an Arabic guy's name.
  2. It descended from a word in said Arabic guy's chief work, a word which referred to "the rules of performing arithmetic using Arabic numerals"

There seems to be a missing link here somewhere. --Ryguasu 05:34 Jan 12, 2003 (UTC)

Please read it again more carefully: It says "algorithm" came from his name, and "algebra" came from the name of his book. It may not be expressed clearly enough because all of us who worked on it knew what it meant, so if it doesn't say that to someone who doesn't already know it, please fix it. -- isis 05:46 Jan 12, 2003 (UTC)

Good Examples

I had a thought on the subject of good examples -- how about the Tower of Hanoi algorithm? -- Tarquin 11:52 Jan 24, 2003 (UTC)

Has there been any more consideration of this? I think the problem of demonstrating that the procedure terminates, and it always gets the right result might be a little tricky if you want to keep this informal. The other nice aspect is that there are two different approaches, one that is easy to do with a computer, and the "practical" one that is iterative, rather than recursive, and more economical in a number of ways. The number of moves of disks is the same, of course, unless you slip up, but the housekeeping for one is negligible and the other is a function of the number of disks in the stack (though I would not call it impractical for anything a person would be willing to carry out, or that I would be willing to watch carried out on my computer display). But it is a practical, mostly math-free procedure. Of course, it is sort of fun to see why it works to say that to move the stack of n on post i to post j, you first move the top n-1 to post 6-i-j. --Orcmid 02:33, 12 Mar 2004 (UTC)
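The recursive solution alluded to above ("move the top n-1 to post 6-i-j") fits in a few lines of Python; this sketch (names are my own) records the moves rather than performing them:

```python
def hanoi(n, i, j, moves):
    # Move a stack of n disks from post i to post j; with posts
    # numbered 1, 2, 3 the spare post is 6 - i - j.
    if n == 0:
        return
    spare = 6 - i - j
    hanoi(n - 1, i, spare, moves)   # move the top n-1 out of the way
    moves.append((i, j))            # move the largest disk directly
    hanoi(n - 1, spare, j, moves)   # put the n-1 back on top

moves = []
hanoi(3, 1, 3, moves)
print(len(moves))  # -> 7, i.e. 2**3 - 1 moves for 3 disks
```

The iterative solution produces the same sequence of disk moves, but without a recursion stack whose depth grows with the number of disks.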

Reworked the First Paragraph

Ok, some of the problems with the opening paragraph:

  • The definition of the algorithm was prefaced with generally, when broadly-defined was meant. The definition of the algorithm as presented may be restricted, but it is never excepted unless done so erroneously, so it makes sense to be more precise with the modifier.
  • The term "list" has a more sequential denotation than does "set" as usually code is very non-linear. Take any assembly code for example and you get my point. I realize that an argument can be made that the actual storage of an implementation of code may be a list, but this is not very strong as an argument on two points. First, most code today is non-linear in storage with the advent of shared libraries and subprocedures, and secondly the storage of instructions is really insignificant in comparison with the operation of the code, which is what algorithms are about. Set, therefore, is more precise. I realize for non-computing applications such as a recipe, list is generally the norm, but it seems to me that set is equally understandable to lay persons.
  • I added the hyperlink iteration and the term decision with the links to Boolean logic and inequality because comparison and logic are the cornerstone of even the simplest cooking recipes.
  • I changed 'mnemonics' as an example of an algorithm, because strictly speaking, a mnemonic device is NOT an algorithm. It is the data structure associated with an algorithm, as the process of encoding and decoding the original information IS the algorithm. The phrase ROY G BIV is no more an algorithm than an MP3 file. Both are the byproduct of a codec.
  • I added the 'heuristics' link, because of the close relation.
  • I thought that the bit on policy and mechanism was important for understanding how different algorithms get the same thing done.
  • The rest of it I just cleaned up the syntax for conciseness and clarity.

jtvisona 01:51, 14 Aug 2003 (UTC)

Example algorithm is flawed ...

The algorithm to find the highest number in a list is flawed :

  • It uses the first element of the list twice
  • It fails when the list is empty

--Stephan Leclercq 08:34, 20 Jul 2004 (UTC)

That has now finally been corrected. — 131.230.133.185 18:42, 12 July 2005 (UTC)[reply]

Corrected? Maybe :-) But at the expense of readability. How do you expect a non-programmer to understand this Lisp / Smalltalk / Python / Whatever pseudocode? Why not simply "show List.Max"?

I deal with why I used that style of pseudocode below. As for using "List.Max", that eliminates the example (an algorithm is a list of steps, not a command) entirely, which is pointless. The current example shows the steps of the algorithm in a style that is close to the simple, nonprogrammer's English used to describe it. — 131.230.133.185 23:48, 19 July 2005 (UTC)[reply]

References

Ok, I'm a lazy bastard and I always meant to do it way back in the day when I first put my hands into this article, but Knuth's 3 volume set really should be included in the references.

jtvisona 072004

An automated Wikipedia link suggester has some possible wiki link suggestions for the Algorithm article, and they have been placed on this page for your convenience.
Tip: Some people find it helpful if these suggestions are shown on this talk page, rather than on another page. To do this, just add {{User:LinkBot/suggestions/Algorithm}} to this page. — LinkBot 10:24, 17 Dec 2004 (UTC)

Etymology

I agree that the "not Greek" insertion is better gone, tho

a) not popular belief and b) it's not from the word to gore, either, but we don't have to explicitly say that

are bad reasons for removal. There is little popular knowledge that it came via Arabic, and, irrational as it is, normal human minds are far more confused by the anagram than by a coincidental syllable. There's a stumbling block here, and a better-crafted corrective may well be worthwhile.

For that reason and others, how about some more etymology: what is the usual relationship between -ithm and -ism; it was certainly de-Arabified, and was it Grecified or Latinized? Is there an early Arabic form other than just the personal and place names? Some of these answers may be worth including.
--Jerzy(t) 21:17, 2005 Feb 8 (UTC)

Hi, If everyone agrees that my edit was unnecessary so be it, and remove it if you must. However, I agree with you that the reasons given in a) and b) were at least silly. I think it is popular belief, because of the ending -rithm which is a popular Greek word (logarithm, arithmetic etc etc) and I am also drawing from my experience as a student, where many people thought it had a Greek root. I think it makes more sense for someone who does not know, to assume it's Greek rather than Arabic. As for b) and the fact that someone might associate it with "gore" rather than arithmetic, I think that tells a lot about that person. May I suggest a bit less TV and opening a book. --Vasileios Zografos 21:29, 8 Feb 2005 (UTC)

The example is becoming a contest ...

It seems to me that the example algorithm (how to find the highest number in a list) is becoming a contest on how to perform it in the smallest amount of instructions and in the weirdest syntax possible. It has become totally unreadable to the average person, who does not know about strange constructs from Perl or Lisp. The conventional example, while simplistic, could be read by anyone.

Unless there is strong opposition (please vote here), I will soon revert to a more understandable version, with a conventional "for" loop, the requirement of a non-empty list (so we can talk about preconditions) etc.

Oh, by the way, the example is currently much more flawed than before. It assumes that the > operator returns true when comparing a number with the value "nothing", while most languages would return false (or abort execution) in this case :-)

From the perspective of a programmer in a C-style language, you are correct. All the flaws you mention should be removed, because they are grating. However, programmers of any style do not really need an example; they've seen dozens. Nonprogrammers (usually) do. We should use their perspective to guide the choice of an example. A nonprogrammer learning of algorithms may not want to learn a C-style language (or any other style). Regardless of how much we want to, we should refrain from forcing off-topic details of C-style languages on the reader. That, not a contest to make C-style programmers angry, was why I changed the example.
The algorithm given is (in simple, nonprogrammer English): "look at each item of the list; if it's the highest we've seen so far, remember it; after we've gone through the list, what is the highest we've seen ?" All other details are irrelevant and forced. "do something if a > b" is very close to plain English with a commonly understood math symbol. How some languages handle comparisons to nothing strangely is arcane, off-topic detail; at any rate, pseudocode is intended to get the idea, rather than the exact detail. "List.each_item" conveys the loop in a way that a nonprogrammer can understand at a glance. "for i = 1 to len(List)" requires an explanatory comment and a more convoluted way of thinking in the hope of teaching the reader the C style. How certain languages iterate through lists is irrelevant; that you are looking at each item is the point.
I believe that the syntax in the current example is closest to the simple English description while still being pseudocode and requires no understanding of any programming constructs except assignment. What a C-style programmer, through extensive practice, comes to consider common knowledge that any average person could understand is not necessarily what an average person could understand without coaching (as shown by the necessity for comments). What is nonstrange to a C-style programmer, through extensive familiarity, is not necessarily familiar to nonprogrammers. Most importantly, what is strange to a C-style programmer is not necessarily strange at all to a nonprogrammer who has no idea why the example should be considered strange. — 131.230.133.185 23:43, 19 July 2005 (UTC)[reply]


I strongly agree that we do not need to revert to a pure C syntax. However, do you really think that the non-programmer (who is likely also non-mathematician) will understand the dot notation as in list.each_item { some codeblock } ? Does not look like C, but like Smalltalk to me :-)

If we want to move away from programming, we either have to turn to a purely mathematical definition where the looping nature of the solution, and the need to examine each number only once, would be lost in recursion, or else use plain English with no more than prep school math symbols.

What's C-minded in a phrase like "examine in turn each number N in List, if N > max then max becomes N" ?

--Stephan Leclercq 18:10, 22 July 2005 (UTC)[reply]

I meant that we should reduce unfamiliar concepts, not reduce unfamiliar appearances. There are no new concepts introduced in the current pseudocode, such as counter-variable iteration, needing the initial maximum to be prefilled for whatever reason, etc. For the same reason, I do not agree with introducing intermediate concepts like recursion.
As for unfamiliar appearance, we already have the example repeated in English just above the pseudocode. Any confusion from appearances, such as dot notation, can be rapidly cleared up by referencing the English listing :
  • 1. Look at each item in the list. If it's larger than any you've seen so far, make a note of it.
  • List.each_item { highest = this_item if this_item > highest }
While I don't wish to waste reader brainpower with extra beside-the-point concepts, I believe most nonprogrammers are competent enough to pick up the example.
131.230.109.211 00:22, 23 July 2005 (UTC)[reply]

Moot point now, as someone (not me :-) re-introduced the loop (and a bug when list is empty :-) I'm afraid no one (especially computer guys) will accept an example that does not explicitly contain a loop.

--Stephan Leclercq 09:02, 30 July 2005 (UTC)[reply]

the link "comparison" at:

The concept of an algorithm is often illustrated by the example of a recipe, although many algorithms are much more complex; algorithms often have steps that repeat (iterate) or require decisions (such as logic or comparison)....

takes you to Inequality (math), is this intended?

I removed the reference to the USA as POV. The statement "Certain countries, such as the USA" seems to single out the USA for no reason apparent to the topic Algorithm. I'd be willing to compromise if a number of both pro-patent and anti-patent countries were listed, assuming a noteworthy software industry presence in the country.

I also removed the "provided a physical embodiment is possible" part, because it may give the idea that a physical embodiment isn't possible. If we're defining algorithms in terms of a Turing machine, then every algorithm has a possible physical embodiment. /* Pradeep Arya 18:20, 2 January 2006 (UTC) */[reply]

Why did [User:Mfc] remove "controversially"? It seems to me like it was neutrally reporting that controversy exists. -- Mikeblas 21:53, 2 January 2006 (UTC)[reply]
I agree. I think not mentioning the controversy is POV; slanted toward the pro-patent side in a lie by omission kind of way. However, since I made the last edit to that section, I won't change it myself. If somebody else agrees with me and Mikeblas, feel free to change it back. /* Pradeep Arya 23:18, 2 January 2006 (UTC) */[reply]

Inherently serial problem?

Regarding "Inherently serial problems", the text reads:

One such example is depth-first search of a graph, which happens to be recursive, but can not be parallelized.

Firstly "inherently serial problems" occurs only 5 distinct times in a google search [3]. Secondly there seem to be several parallel implementations of depth-first search [4] (and, if you think about it, it seems reasonable to parallelize - you just need a way of knowing which node would have been reached first, which you can do by encoding the node positions in the tree). I'd suggest a better example would be Newton's method

MickWest 18:07, 8 January 2006 (UTC)[reply]
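Newton's method does illustrate inherent seriality well: each iterate is computed from the previous one, so the iterations cannot run side by side. A minimal Python sketch for square roots (my own illustration, assuming a > 0):

```python
def newton_sqrt(a, iterations=20):
    # Newton iteration for sqrt(a): x_{k+1} = (x_k + a / x_k) / 2.
    # Each step needs the previous step's result, so the loop is
    # inherently serial.
    x = a if a > 1 else 1.0
    for _ in range(iterations):
        x = 0.5 * (x + a / x)
    return x

print(newton_sqrt(2.0))  # -> 1.41421356... (approximately sqrt(2))
```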

I had the same thought about depth-first search when I read the article. It does not seem like a good example. Perhaps another alternative for an example would be the Three-body problem which seems hard to parallelize beyond 3 processors. -R. S. Shaw 23:06, 10 January 2006 (UTC)[reply]
Or perhaps repeated hashing, à la result = MD5(MD5(MD5(MD5(plaintext)))); any good hash algorithm perturbs the result significantly in the presence of more information. It makes it very difficult to begin the next iteration of hashing without having completely finished the previous iterations. —Pradeep Arya 12:27, 11 January 2006 (UTC)[reply]
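Repeated hashing is easy to state precisely; here is a small Python sketch using the standard hashlib module (the function name is my own):

```python
import hashlib

def iterated_md5(plaintext, rounds=4):
    # Each round hashes the output of the previous round, so round
    # k+1 cannot begin until round k has completely finished --
    # there is no independent work to hand to a second processor.
    digest = plaintext.encode()
    for _ in range(rounds):
        digest = hashlib.md5(digest).digest()
    return digest.hex()
```

With rounds=4 this computes exactly MD5(MD5(MD5(MD5(plaintext)))) as written above.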


Mathematician / Astrologer

User Theodore7 is modifying the article to emphasize al-Khwarizmi's work as an astrologer.

It began here: [5]

This was reverted by Lumos3: [6]

Theodore again modified it: [7]

This was reverted by Duncharris: [8]

Theodore again modified it: [9]

This was reverted by Ruud: [10] who left a message at Theodore7's talk page.

I left a follow-up message to discuss. In the meantime, Theodore again modified it: [11]

I'm going to revert the changes now, and move the discussion from Theodore7's talk page to this newly created subsection.

Listen, my edit is sourced. Ok. So please, do not revert unless you have a source that runs counter to him being an astrologer. Saying he was a "mathematician" is redundant. All astrologers - particularly the classical astrologers such as al-Khwarizmi - were mathematicians. Cite a source that says he was not. Thanks.Theo 04:48, 10 January 2006 (UTC)[reply]

Your changes to algorithm have been reverted. While they are correct, the fact that al-Khwarizmi was an astrologer is already mentioned in his biography and not very relevant in the algorithm article. Cheers, —Ruud 12:19, 9 January 2006 (UTC)[reply]

I'll agree with Ruud here. al-Khwarizmi's work as a mathematician is germane to the topic of Algorithm. His work as an astrologer is not. /* Pradeep Arya 14:42, 9 January 2006 (UTC) */[reply]

I suggest you read Boyers' History of Mathematics, and realize that al-Khwarizmi's main work was as an astrologer - which means that he was a mathematician. Suggest you study the facts of the history and not your POV.Theo 14:46, 9 January 2006 (UTC)[reply]

I'm not sure to which POV you are referring, please elucidate. That said, I seem to understand you saying "astrologer equals mathematician". Even if that were true in al-Khwarizmi's time, it certainly isn't true in our time. We need to write Wikipedia for people to understand it today with the way they use language today (not as language was used in al-Khwarizmi's time). /* Pradeep Arya 15:03, 9 January 2006 (UTC) */[reply]

I reverted your most recent changes to Algorithm, please see Talk:Algorithm. /* Pradeep Arya 15:36, 9 January 2006 (UTC) */[reply]

My two cents: al-Khwarizmi may have been a great many things in addition to mathematician and astrologer: writer, swordsman, home chef, whatever. This article should not list them all. His role as mathematician is the relevant one here. —Bunchofgrapes (talk) 04:29, 10 January 2006 (UTC)[reply]

That is silly. The facts say otherwise. What anyone's "current view" is concerning this subject is not relevant. Suggest you study the history of astrology - relating to the Hindus and Arabs, as it is primary to the history section of this article. Again, one's personal views (POV) relating to "astrology" does not allow for removal based on views "for" or "against". Check the history before pushing POV. Suggest you refrain from choosing what is "relevant" and what is not. The material is sourced, and many more can be added as well - primarily, al-Khwarizmi's own books. If you have sources that say he was not an astrologer - please cite them.Theo 04:45, 10 January 2006 (UTC)[reply]
Again, he was indisputably a writer, a human, a mammal -- use some judgement. —Bunchofgrapes (talk) 16:09, 10 January 2006 (UTC)[reply]
Again, this is not the point. The material is factual, and sources cited. I suggest using some judgement here Bunchofgrapes, and not the very generalized he was a "writer" "human" and "mammal" - which, for some reason beyond me, you would presume others are not aware of these moot facts? Or, perhaps, you are just joking. Ok.Theo 17:17, 12 January 2006 (UTC)[reply]

I think the issue is this: You're trying to get al-Khwarizmi's profession right. I'm trying to get al-Khwarizmi's role in algorithms right. So when I say "astrology is irrelevant" you're seeing it as an anti-astrology POV; as though I'm trying to rewrite a historical fact (al-Khwarizmi's profession). That said, from my side, I'm trying to get his role in algorithms correct. I'm not trying to identify his profession, only characterize the work he did with respect to algorithms. Can we agree that "mathematician" is a better characterization of the work he did with respect to algorithms, than "astrologer"? /* Pradeep Arya 07:14, 10 January 2006 (UTC) */[reply]

Pradeep, I do understand what you mean here, but, my addition to the "history" section of this Algorithm Page does not take away from the encyclopedic version, but ADDS to it. I do not see it as anti-astrology at all, or even pro-astrology - but a fact. If you read the materials, you will see it clearly yourself that astrologers invented mathematics, and that Hindu astrologers invented the algebra that al-Khwarizmi translated, and used. I agree partially w/you on "role" but perhaps Pradeep, you might be better served to actually read al-Khwarizmi, and the very clear connections between astrology, algebra, and yes, algorithms, which is directly connected to astrology itself, and the work as mathematicians. All astrologers are mathematicians, and al-Khwarizmi was an astrologer, foremost. Your view in this matter, might be missing this fact, in that you separate Algorithm, and al-Khwarizmi from whatever idea you may, or may not have of what "astrology" is. Now, I am suggesting that you might really want to see the historical facts of the direct connections here, and not base the entire matter on your possible confusion here. Moreover, my entry is in the history section, Pradeep, and is factual, attributed, and cited. That is all. I would love to help you to see that I am not trying to change the entire article, because I have no such intention, but I can add numerous direct sources on this, and I think, with your interest, and knowledge of modern algorithms, and my historical knowledge of the origin of algorithms, to astrological techniques, principles, and history, that a fascinating new version, or expanded version, could be achieved. Think about it positively, and how exciting this could be. There is definitely an "anti-astrology" view that permeates the "scientific community" that runs counter to HISTORICAL FACT. 
I am against this always, because it is POV and intends to "rewrite history" to suit conventional POV that is in error when it comes to the facts - and facts that can be sourced, and cited extensively. Even the smallest mention of the word "astrologer" seems to get the blood-pressure up of those who hold strong POV on what they think "astrology" is, but it does not change the facts. And Wikipedia is an encyclopedia with facts on subjects. Now, I would really enjoy working with you on this Pradeep. I mean it. I liked reading what you wrote so far, and it made me want to read more, then I came up on the history section (al_Khwarizmi) and that's what I edited, added, and sourced. So, don't take it personally, please, because it was not, and never will be that way with me, ok? As a fellow Wikipedian, I am on your side, Pradeep, and not against you. What do you say?Theo 18:00, 12 January 2006 (UTC)[reply]
Okay. Perhaps we're coming at "astrology" from different angles, because I have no knowledge of how algorithms connect to astrology. Please share with me your reasoning and logic that leads you to conclude that astrology and algorithms are connected. Personally, I don't see it; but I am willing to listen and learn. —Pradeep Arya 18:15, 12 January 2006 (UTC)[reply]
There is no connection. Theo, you might want to read [12]. —Ruud 18:19, 12 January 2006 (UTC)[reply]

I beg to differ R.Koot/Ruud. That is one site, from the University of Scotland. Suggest that you see additional sources other than those that fit your POV. Even the site you gave shows al-Khwarizmi was an astrologer. Moreover, again, there was NO distinction between astrologer/astronomer/mathematician in that time, and for many centuries until the 17th century. Suggest you read the references cited below on the major texts on astrology and algorithms, and not just one website that one can read from one's own POV - then stating - in error, as you do, that "there is no connection." There clearly is - and historical facts bear this out, extensively, mind you. (See references cited below.) Lastly, again, al-Khwarizmi's role as an astrologer is proven, and is mentioned in the appropriate section, and is cited. You cannot state that he was not, then, replace it with mathematician based on your POV. No, it is not the subject of the entire article, but, as you know, Wikipedia has links from each subject matter to others, so this claim is not valid. If you must persist, then I will extend my entry on the history section with more sources if need be. There was NO distinction between astrologer, astronomer and mathematician with regard to al-Khwarizmi, and others - and just because there is today in the POV of others, does not change that historical fact. I state the obvious regarding him, and it is not a big deal, but is being made one with reverts to the contrary. If you want, I will add numerous sources, but I like Pradeep's entries for the most part, and the article as a whole.

Now, Pradeep, thanks for saying what you did. Good. That's a start. But the reason you do not personally see it is directly related to your admission that you have no knowledge of how algorithms connect to astrology. Mathematics & astrology are directly connected, and from this connection came algorithms. This is historical fact. There is a direct connection Pradeep. Go to the origins of Vedic astrological mathematics and see for yourself. I am quite versed in Vedic astrology and its mathematics, and if you conduct a simple internet search - (Astrology and the origin of Algorithms) - you will be able to find copious facts. This is one of the major problems with conventional views: contrary to history - to historical fact - they make a distinction, not only between astrology and astronomy, but between astrology and mathematics. There was no such distinction; yet, it is being stated today as a fact. This is not true. Read Hindu Astrology: Myths, Symbols and Realities, by Anthony Philip (New Delhi, Select Books, 1981) and also Sphudjidhvaja, The Yavanajataka of Sphudjidhvaja, (2 Volumes, edited, translated, and commented on by David Pingree, Cambridge MA & Harvard University Press, 1978) among them. These materials are rich with evidence, and give direct proof of my assertions. There are plenty more I can name, but will start with these two, and of course, Boyer's work. I hope this helps to get you started. If you need more references, ask, and I'll be happy to send more your way. Theo 18:58, 12 January 2006 (UTC)[reply]

Maybe you could cite a few relevant paragraphs from those works. Would save me a trip to the library... —Ruud 19:15, 12 January 2006 (UTC)[reply]

What is wrong with a trip to the library? Not to sound rude, because it is not meant that way whatsoever, but, perhaps, you might just benefit from a loan from the library, or better yet, purchase the books mentioned. I can cite others for you, since you might want to have them in your own personal library for future reference, and since it would be good for you to see the direct connection between serious, scientific, applied astrology, and algorithms, astronomy, and yes, even mathematics itself. Theo 19:20, 12 January 2006 (UTC)[reply]

It takes time and energy better spent on other things, and since you seem to have those books at hand... —Ruud 19:25, 12 January 2006 (UTC)[reply]
I agree that the line between astronomy and astrology was much blurrier back then. However, by today's standards parts of al-K's work are clearly mathematics, so there is no reason to call every mathematician who lived before the scientific revolution (insofar as we can call mathematics a science) an astrologer. —Ruud 19:25, 12 January 2006 (UTC)[reply]

There was nothing "blurry" about it at all back then, as you say. Hey, I wrote the short paragraph in the history section of the piece, ok, and not in the "today's standard section." What you say are "today's standards" is POV, no matter how many in a community, or club, would share that view. Now, it is ok to say that they "believe" this, or that they don't "believe" that, yes, but the "scientific revolution" as you call it is actually the Age of Rationalization, and Astrology, despite what these POVs may claim, is practiced today as an applied science by professional astrologers - using mathematics. I use it every single day. You know, if you would look honestly, without the intrusive point of view, and move along the circle to see more points of view - and historical fact - you just might be surprised by what your open eyes will see. Astrologers were ancient mathematicians. That is all they did. And, they used mathematics not only to map the skies, and calculate the motions of planets, and stars, but they mapped the Earth as well. They used the mathematics invented by Hindus, and some by the Chinese too - and mapped the celestial skies and the terrain on the Earth. This is fact. The separation is only in your mind. There IS NO SEPARATION. Algebra was a shorthand used to shorten the long calculations of tracking the movement of the planets - including the Earth - due to the many variables. It was needed because the old arithmetic just took too long, and contained many errors in calculation for creating ephemerides. All this is history man. Astrologers invented the modern mathematics used today - algebra, geometry, trigonometry - and it is ok. That is the major problem here. The POV that denies these historical facts. What is wrong with having this known? More clearly - it was also practiced by people such as al-Khwarizmi, and many, many other astrologers who entered India after the Islamic invasion of that country, and those in Europe later. This is historical fact.
There is nothing "blurry" about it. The copious materials are there. The sources are there. They can be cited. They can be read. Taken out from the library. Bought, and read by anyone. I am not making this stuff up. Go see it for yourself. Theo 19:34, 12 January 2006 (UTC)[reply]

Alright, explain to me where you think my reasoning goes wrong:
  • Al-Khwarizmi wrote Al-Jabr wa-al-Muqabilah, a work of mathematics; therefore, Al-Khwarizmi was a mathematician.
  • Al-Khwarizmi practiced astrology; therefore, Al-Khwarizmi was an astrologer.
  • Al-Khwarizmi was a male.
  • Al-Khwarizmi was or did many more things.
  • We shouldn't list ALL of these things in the algorithm article because they disrupt the flow of the text and are better explained in Al-Khwarizmi's biography.
  • Therefore we should only list the property of Al-Khwarizmi that is the most relevant in the algorithm article.
  • Therefore we should refer to Al-Khwarizmi as a mathematician.
19:50, 12 January 2006 (UTC)

Listen, don't play semantics with me. Games, ok? I took the time to write to you, cited sources, and then you come back with this? Are you a serious person? I know you are only 19 years old Ruud, but, perhaps you may want to learn a little more about the world before getting stuck on your absolute knowledge, ok? The above notations by you are picking gnat s__ out of pepper. Give it a rest, ok? I knew this stuff when I was your age. I am in my mid-40s now, ok kid? You are rationalizing to suit your own POV. Stick to the facts. The sources are there, can be cited, and leave it at that. So, please, don't insult my intelligence here with items like the above "... was male" and "... was or did many more things" - please, give it a break Ruud, and either be a serious person, or go play elsewhere. There is NO disruption to the text whatsoever, and it is entered in the appropriate section. I suggest, Ruud, that you actually be honest, and read the books and sources mentioned. Is that too hard for you to do? Or will you not let the facts interfere with your POV? That way, if you take a little time to actually read the materials, rather than wasting time on your fixed POV, you will actually see that astrologers were the mathematicians. Or, does it bother you that they were Hindus, and not western Europeans? Is that it? That Hindus discovered the value of zero, and algebra, etc.? Does that bug you? The sources are there and can be cited. No need to be simple here about it and reduce things to silliness. Ok? Read the materials, be an honest Wikipedian. Let the facts and the history speak for themselves. Theo 20:00, 12 January 2006 (UTC)[reply]

You just insulted a lot of people. Please, answer my question. —Socrates

That is assuming that you think that I am not one of those people. I suggest you take the time to read the Tao of Physics. Perhaps you might learn more than your age suggests.

And if you had cared to do the maths, you would have known I'm 20. —Ruud 20:32, 12 January 2006 (UTC)[reply]

That is "math" Ruud, and yes, Happy Birthday Capricorn, I just had a birthday too a few days after your own, and am a January Capricorn too. I do apologize to you for my assertion that perhaps you might not like that Hindus discovered zero, by the way; I don't believe you really think that. However, it does not take away from my point. Do your own homework, and do not assume that at the new age of 20 you've learned all there is about the subject at hand, and would prefer to stick to a POV and not read the historical facts. I took the time to actually read the link you sent. Suggest you actually take the time to find the books I referenced. Theo 21:07, 12 January 2006 (UTC)[reply]

When someone (Theodore7, in this case, but whoever) resorts to ad hominem arguments, he or she has lost the debate. So, best ignore that one. It needs a thick skin sometimes, but that's life. Personally, I'll bet that an objective 19-year-old (or 16-year-old) is more likely to be right than a bigoted mid-40-year-old. But that's just my experience as an even older person... quota

Perhaps, but then again - ad hominem - who asked you? Express knowledge on the issue at hand, and not POV that perhaps doesn't sit well with your own sensibilities. One size does NOT fit all Quota. And, as an even older person, you should already know this. I also wouldn't say that I am bigoted, never was, and you don't know me to state this. So, I suggest you add something of value to the issue rather than your POV on what is "objective" and what is not. You would bet - and in this case, you'd lose it too. I do not easily discount age and knowledge for nothing. The arguments this 20-year-old makes for not reading the historical facts of the matter at hand make this point even more valid, since he would just as soon not read them, for the benefit of his POV. Just as Ruud, who just left 19 for 20, knows more than he did when he was 9-10 years old, consider that he'll know more at 30 and then 40. Moreover, Quota, if you are to insert yourself into this discussion - then do so from a position of adding something to the discussion, and spend time reading the Talk Page, rather than from a political correctness that has no place in the discussion. Theo 21:07, 12 January 2006 (UTC)[reply]

Theo, let's strike a deal here:

  • Give me time to read your sources. My marriage reception is on the 14th, and I'm travelling to the United States on the 18th. After that, I'll go to the library and find your sources and read them. Then we can talk about it.
  • In return, I ask that you please not modify the article from mathematician to astrologer; to avoid pointless revert wars regardless of who is correct.

Do we have a deal? —Pradeep Arya 20:15, 12 January 2006 (UTC)[reply]

Congrats by the way on your marriage Pradeep. I have no problem; however, to have a mutual agreement, I would suggest that both terms astrologer/mathematician - as one line - be added in your article section. We will leave my edit, w/ the Boyer attribution w/ paragraph, out of the history section until you've had the time to read the sources. Agreed? Theo 21:07, 12 January 2006 (UTC)[reply]

Well, I can't speak for the consensus, so I can't say "astrologer/mathematician" is an okay edit. The thing is, you've given a source, but nobody can verify that source yet. So, like science articles, it needs peer review before it gets published. Changing the article without the peer review puts the cart before the horse, if you know what I'm getting at.

Please give me the time to read your sources. In the meantime, not making a change to the article avoids stirring up the ill-will of the other editors here. If you are correct, I'll have a much easier time convincing them of that fact if they didn't have to fight a revert war every day for a whole week.

Ultimately, everyone has to be convinced, and as you've already noted, the anti-astrology POV can be quite strong, especially in science-related matters. Trust me, getting your version up for five minutes every day isn't much of a victory when you lose the good faith of the other editors working with you.

Just give me a week, that's all I'm asking for. —Pradeep Arya 21:38, 12 January 2006 (UTC)[reply]

I see your point. Well said. Yes, I will agree to this. Take longer than a week. I am not a jerk. I'm not pushing POV. I just enjoy seeing the facts of the matter included - and not denied - based on ignorant, and sometimes hostile, POV. At least you are willing to go and read the sources. So, take longer than a week. I'm not looking for a "victory" Pradeep, just the facts not being shoved aside to support POVs that are "anti-astrology" - while not even knowing what applied astrology is, and it sure is not "sun-sign newspaper astrology." POVers not wanting to admit the historical facts that are sourced, and verifiable - while pretending to be the "voice of science and reason." What kind of "reasoning" throws out historical facts, and dismisses a science such as astrology as false, when they maintain little knowledge of it, and do not practice it, so are not qualified to peer review it? Science is a verb to me. Not a staid noun. However, I do agree with your assessment Pradeep. I am tough-minded, but am not closed-minded, or a rude cynic, or too ignorant to be able to listen, because I do listen to common sense and good advice. And your request is both. So, I will agree to this as long as you give your word, and will be honest enough that when the historical facts bear out as sourced, and verifiable, you will then be able to help support the factual position, and expand on the piece without the "anti-astrology POV" based on ignorance of the subject, and denial of the true history. That's all I am asking. Ok? Other than this, I will have to agree with you, because you make perfect sense in this request. Have a wonderful reception, and trip, Pradeep. All the best to you and your new wife. Theo 21:53, 12 January 2006 (UTC)[reply]

Translation

Who the hell came up with the following translation? "De numero Indorum" (Concerning the Art of Hindu Reckoning). If the English translation is an accurate translation of the Arabic title, then the Latin is wrong. If, however, the Latin is the correct title, then the English is wrong. (De Numero Indorum means "Concerning the Indians'/Hindus' Number".) Jim62sch 15:05, 15 January 2006 (UTC)[reply]

Yes; see also al-Khwarizmi - more of his books suffer from this problem. Most references contradict each other (EB, TAoCP, MacTutor) and you'll find even more names on the Internet. I think part of the problem may lie in the fact that the English names come from the original Arabic names, while the Latin translations (of the text, not only the title) have different titles. Most of the names of his books seem to be abbreviated as well. I'm currently doing some research on this. —Ruud 15:19, 15 January 2006 (UTC)[reply]

Read Wikipedia Policy and Guideline

About the article: it is not so bad, and I wonder why you want to remove it.
About the external links, I suggest you read the guidelines before deciding to suppress them, or you may end up removing the good ones instead.
The dictionary of algorithms is not marvellous; I spent a lot of time in its pages trying to find what I was searching for, without finding any valid code. Splang 12:31, 11 May 2006 (UTC)[reply]