Talk:Big O notation

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 68.250.141.172 (talk) at 20:06, 23 January 2014 (→Base of log?: new section). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Algorithms and their Big O performance

I'd like to put in some mention of computer algorithms and their Big O performance: selection sort being N^2, merge sort N log N, travelling salesman, and so on, and implications for computing (faster computers don't compensate for big-O differences, etc). Think this should be part of this write-up, or separate but linked?
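As a quick back-of-the-envelope sketch of the "faster computers don't compensate" point above (not from the original discussion; selection sort and merge sort are used as stand-ins for the N^2 and N log N classes):

```python
import math

# Rough operation counts for an N^2 sort (e.g. selection sort) versus an
# N log N sort (e.g. merge sort). Even a machine that is 100x faster
# cannot keep the N^2 algorithm competitive as N grows, because the
# ratio of work keeps growing without bound.
def ratio(n):
    return (n * n) / (n * math.log2(n))

print(ratio(1_000))      # ~100: the N^2 sort does ~100x more work
print(ratio(1_000_000))  # ~50000: the gap keeps widening
```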

I think separate would be better, to increase the article count :-). Then you can have links from the Complexity, Computation and Computer Science pages. Maybe you can call it "Algorithm run times" or something like that. --AxelBoldt
Or something like analysis of algorithms or Algorithmic Efficiency since you may sometimes choose based on other factors as well. --loh
I'd recommend putting it under computational complexity, which earlier I made into a redirect to complexity theory. It should be a page of its own, but I didn't want to write it ;-) --BlckKnght

Removed polylogarithmic

Reinstated. My VERY bad: missed a bracket.

"Tight" bounds?

The article refers to terms "tight" and "tighter", but these terms are never defined! Alas, other pages (e.g. "bin packing") refer to this page as the one giving a formal definition of this term; moreover, "Asymptotically tight bound" redirects here. Yes, the term is intuitively clear, but a math-related page should have clear definitions.

I think a short section giving a formal definition of the term "tight bound" and referring to Theta-notation is needed (e.g. as a subsection of section "8. Related asymptotic notations"), and once such a section is created the redirection from "Asymptotically tight bound" should link directly there.
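For reference, the usual formal statement such a subsection could give (assuming the Knuth-style Θ-convention already used elsewhere in the article) is:

```latex
f(n) = \Theta(g(n)) \iff
\exists\, c_1, c_2 > 0 \;\; \exists\, n_0
\text{ such that } c_1\, g(n) \le f(n) \le c_2\, g(n)
\text{ for all } n \ge n_0,
```

in which case the bound g is called asymptotically tight for f: g is simultaneously an upper bound (up to the constant c_2) and a lower bound (up to the constant c_1).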

Mnemonics

(Below "Family of Bachmann-Landau notation" table)

This "mnemonics" business appears very unclear and clumsy to me. I strongly suspect it is the private production of a well-intentioned Wikipedia contributor. Neither Bachmann, nor Landau, nor Knuth (at least in the references cited) mentioned anything close to what is claimed below the "Bachmann-Landau" notation table. Knuth uses the word "mnemonic" in his paper, but only to refer to the fact that the symbol O is commonly used and has become a reference. I note in passing that the "mnemonic" concerning the Omega symbol refers to the Knuth version of this symbol, and not to the Hardy-Littlewood version (which is the only one Landau knew): so who in the world is supposed to have devised these mnemonic "recipes"? I still think a precise reference is very much needed to justify the conservation of this part in the article. Sapphorain (talk) 21:21, 3 September 2013 (UTC)[reply]

It looks much like OR to me. It would definitely need a proper source attribution, but anyway, I’m unconvinced this bit of trivia needs to be included in the (already quite long) article at all.—Emil J. 12:17, 4 September 2013 (UTC)[reply]
I suppressed the bit of trivia. The bibliographic references inside were suppressed too, but can be found elsewhere in the page. Sapphorain (talk) 22:08, 16 November 2013 (UTC)[reply]

f(x) versus f

Currently in this article, the notation f(x) is used to denote a function. In the articles Function (mathematics), Limit (mathematics) etc., f is used to denote a function (a rule for mapping numbers to numbers) and f(x) is unambiguously used to denote its value (a number). These are two different approaches to the notation of functions: in the first approach (used in this article), the letter f denotes a dependent variable or (physical) quantity, and when talking about the function's behavior, one must always say what variables are regarded as the independent variables or experiment parameters that it depends on; in the second approach (in Function (mathematics) etc.), f is the name of a function, where function is defined as a rule mapping objects to other objects. Perhaps the notation of functions should be unified in Wikipedia? 90.190.113.12 (talk) 12:34, 9 January 2014 (UTC)[reply]

Base of log?

Sorry if I missed it, but I couldn't find specified anywhere what base of log is being used. Does it not matter since it is only comparing orders? Or is it assumed to be base 2 since it's computer science? Thanks for any clarification.
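A quick illustration of why the base is indeed irrelevant inside O(·) (a sketch, not part of the original thread): by the change-of-base identity, log_a(n) = log_b(n) / log_b(a), so any two bases differ only by a constant factor, which Big O absorbs.

```python
import math

# Changing the base of a logarithm only multiplies it by a constant:
# log_a(n) = log_b(n) / log_b(a). Big O ignores constant factors, so
# O(log2 n), O(ln n), and O(log10 n) are all the same class.
ratios = [math.log2(n) / math.log10(n) for n in (10, 1_000, 10**9)]

# Every ratio is the same constant, log2(10) ~ 3.3219, regardless of n.
assert all(abs(r - math.log2(10)) < 1e-9 for r in ratios)
```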