
Talk:Heap (data structure)

From Wikipedia, the free encyclopedia


This article is within the scope of WikiProject Computer science, a collaborative effort to improve the coverage of Computer science related articles on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks. This article has been rated as Start-class on Wikipedia's content assessment scale and as High-importance on the project's importance scale.

Unifying heaps

The articles about heaps keep switching from min to max heaps and back (e.g. in this article the definition is for a max-heap while the complexity is for a min-heap). I think we should consistently use min-heap (or max-heap) and say in ONE place that the other is just the opposite. What do you think? Jirka6 18:38, 25 February 2007 (UTC)[reply]

I agree. Bender2k14 (talk) 00:06, 15 October 2009 (UTC)[reply]

Definition

I was under the impression that heaps must also be complete trees, that is, the last level of the tree must be filled in left-to-right order, but maybe this is not true for all heaps...

That only holds for binary heaps. --Mellum 09:59, 3 Dec 2004 (UTC)
Imagine a scenario where the bottom level is completely full. If a new node were to be added to the heap, how could it maintain the property of being a complete binary tree, since the leftmost node would not be filled at the last level? Sheepdude 19:46, 20 April 2007 (UTC)[reply]
You are talking about a perfect tree, not a complete tree. See Types of binary trees. Bender2k14 (talk) 00:16, 15 October 2009 (UTC)[reply]
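For illustration, here is a minimal Python sketch (not from the article) of the array-backed view: appending always fills the shallowest, leftmost free slot, so when the bottom level is full the next insert simply starts a new level and the tree stays complete.

    heap = [9, 7, 8, 3, 6, 4, 5]   # a full three-level max-heap: levels [9], [7, 8], [3, 6, 4, 5]

    def insert(heap, key):
        heap.append(key)                 # next leftmost free position, so the tree stays complete
        i = len(heap) - 1
        while i > 0 and heap[(i - 1) // 2] < heap[i]:   # sift up to restore the max-heap property
            heap[i], heap[(i - 1) // 2] = heap[(i - 1) // 2], heap[i]
            i = (i - 1) // 2

    insert(heap, 10)   # the full bottom level just means this insert starts a new level
    print(heap)        # [10, 9, 8, 7, 6, 4, 5, 3]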

Let A and B be nodes of a heap, such that B is a child of A. The heap must then satisfy the following condition (heap property): key(A) ≥ key(B)

This is only for the special case of a max-heap, isn't it? --Abdull 17:38, 6 March 2006 (UTC)[reply]

Yes, special in the sense that a heap is either a minheap or a maxheap. Bender2k14 (talk) 00:11, 15 October 2009 (UTC)[reply]
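As an illustration of the quoted definition, a minimal Python sketch of a heap-property check (the helper name is made up for this example); the min-heap version is the same code with the comparison reversed.

    # For every parent at index i, its children sit at 2i+1 and 2i+2 in the array layout.
    def is_heap(a, max_heap=True):
        for i in range(len(a)):
            for child in (2 * i + 1, 2 * i + 2):
                if child < len(a):
                    ok = a[i] >= a[child] if max_heap else a[i] <= a[child]
                    if not ok:
                        return False
        return True

    print(is_heap([9, 7, 8, 3, 6, 4, 5]))                 # True (max-heap)
    print(is_heap([1, 3, 2, 7, 4, 6, 5], max_heap=False)) # True (min-heap)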

complete table of run time complexity

Anybody able to complete the table of the run time complexity? It also would be nice to have more information than worst case or amortised cost. Cyc 14:05, 2 May 2006 (UTC)[reply]

That table is not correct. As the proof on the Binary Heap page shows, the running time for createHeap should be O(n). I will look into the createHeap times for the other heap implementations to see if they are wrong as well. Bender2k14 21:42, 12 October 2007 (UTC)[reply]

Ditto: the running time for createHeap should be O(n). If it were O(1) as claimed, then sorting is O(n) as follows: for i = 1 to n do { createHeap; remove root } — Preceding unsigned comment added by 96.60.253.126 (talk) 13:45, 28 January 2012 (UTC)[reply]

According to the cited source (pp. 159 and 506, 3rd ed.), createHeap on a Binary Heap is O(n), but the amortized worst case is Θ(1).
Sergio91pt (talk) 12:21, 11 March 2012 (UTC)[reply]
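For reference, a minimal Python sketch (assuming an array-backed binary min-heap) of the bottom-up construction that the O(n) bound refers to: sifting down from the last internal node up to the root does O(n) work in total, which is why createHeap is O(n) rather than O(1) or O(n log n).

    def sift_down(a, i, n):
        while True:
            smallest = i
            for child in (2 * i + 1, 2 * i + 2):
                if child < n and a[child] < a[smallest]:
                    smallest = child
            if smallest == i:
                return
            a[i], a[smallest] = a[smallest], a[i]
            i = smallest

    def build_heap(a):
        n = len(a)
        for i in range(n // 2 - 1, -1, -1):   # last internal node down to the root
            sift_down(a, i, n)
        return a

    print(build_heap([5, 3, 8, 1, 9, 2]))   # [1, 3, 2, 5, 9, 8]: the root holds the minimum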

heapifying

You know you guys use the term heapifying in this topic? The topic is a mishmash of stuff in its current revision and I might clean it up like I did with tree (data structure). Hope no one objects. Josh Froelich 16:43, 16 February 2007 (UTC)[reply]


Key needs to be explained in the definition

It should be explained what a "key" is, because it is in the definition with no explanation, which makes the rest of the article useless to anyone who doesn't already know what a heap is, which in turn makes the article useless for 95% of people visiting the page, because if you know what a key is, there's only a small chance you don't know what a heap is. There isn't even a link (although, note that a link to key wouldn't be adequate in this case, because it is probably possible to give a short non-technical description of a "key"). 76.175.72.51 (talk) 17:19, 13 April 2009 (UTC)[reply]

Sublinear Claim

I just removed the claim that heaps can be used to find the max, min, both the max and min, median, or k-th largest element in sublinear time because that is not true. You have to look at each element at least once, so the running time is at least linear. Bender2k14 (talk) 00:20, 15 October 2009 (UTC)[reply]

I'm pretty sure that the sublinear claim refers to finding these elements after the heap has been constructed. Otherwise that claim would be pointless, and should be removed altogether. Paradoctor (talk) 06:45, 10 November 2009 (UTC)[reply]
The current statement "Finding the min, max, both the min and max, median, or even the k-th largest element can be done in linear time using heaps" also seems spurious. Firstly, you can easily find min, max and both min and max in linear time without using any kind of special data structure. And … can you really find the median in linear time in any way in which a heap is instrumental? (The standard linear median-finding algorithm doesn't use a heap.) You can find the k-th largest element in linear time with a heap 'if' you assume that k is a constant. Otherwise, if you don't want k popping up in the running time, using the same linear selection algorithm as for the median would be the solution. No heap involved, though. The most natural claims to have in this bullet point (related to selection) would be that it can find the k-th largest or smallest item in (for example) O(n lg k) or O(n + k lg k) time. Mlhetland (talk) 19:58, 12 July 2010 (UTC)[reply]
I removed the corresponding section according to the above discussion. --mellum (talk) 13:05, 2 August 2010 (UTC)[reply]
It's easy to find min/max in O(1) using heaps, and update them in O(ln(n)). You can't do that with the linear algorithm, it takes O(n) each and every time. After thinking about it for a while, I came up with O(1) for finding the median also using min/max heaps that update in O(ln n). But I added a ref tag.- Wolfkeeper 13:53, 2 August 2010 (UTC)[reply]
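One way to realise the claim above, sketched in Python with the standard heapq module: a max-heap of the lower half plus a min-heap of the upper half gives O(log n) insertion and O(1) access to the median. The class and method names are made up for this example.

    import heapq

    class RunningMedian:
        def __init__(self):
            self.lo = []   # max-heap of the smaller half (keys stored negated)
            self.hi = []   # min-heap of the larger half

        def insert(self, x):
            if self.lo and x > -self.lo[0]:
                heapq.heappush(self.hi, x)
            else:
                heapq.heappush(self.lo, -x)
            # rebalance so the halves differ in size by at most one
            if len(self.lo) > len(self.hi) + 1:
                heapq.heappush(self.hi, -heapq.heappop(self.lo))
            elif len(self.hi) > len(self.lo) + 1:
                heapq.heappush(self.lo, -heapq.heappop(self.hi))

        def median(self):
            if len(self.lo) > len(self.hi):
                return -self.lo[0]
            if len(self.hi) > len(self.lo):
                return self.hi[0]
            return (-self.lo[0] + self.hi[0]) / 2

    m = RunningMedian()
    for x in [5, 2, 8, 1, 9]:
        m.insert(x)
    print(m.median())   # 5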

Brodal queue

Fibonacci queue cites this paper. From the abstract:

I think this should be mentioned here. Paradoctor (talk) 12:01, 7 November 2009 (UTC)[reply]

Weak heaps?

Is there any particular reason why weak heaps are not mentioned? Maybe this variant is not so well known? --188.126.207.212 (talk) 17:46, 23 August 2010 (UTC)[reply]

If you can write up something about weak heaps with proper sources, be bold. Qwertyus (talk) 15:04, 19 January 2011 (UTC)[reply]

Rm pseudo queues

I've removed all reference to "Pseudo Queues". I've never heard of them, and the reference given seems to have nothing whatsoever to do with heaps/priority queues. Qwertyus (talk) 15:09, 19 January 2011 (UTC)[reply]

Pairing Heap Amortized Linear Time Insert Operation

Hi folks,

Why is the insert operation in the pairing heap implementation denoted as amortized linear? Shouldn't it be linear without amortized analysis?

Greetings! —Preceding unsigned comment added by 91.18.86.134 (talk) 16:10, 21 February 2011 (UTC)[reply]

Because that's the way it's given in the citation. Btw., I've removed all the citation needed tags for the pairing heap operations, since they are all described in the reference given. I'm not going to cite it again in every field of the table. Qwertyus (talk) 10:48, 14 July 2011 (UTC)[reply]
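For what it's worth, a rough Python sketch (not taken from the cited source; names are illustrative) of why pairing-heap insertion is cheap: it is just a merge with a one-node heap, which does a constant amount of actual work per call. The amortized bounds quoted in the table come from the potential-function analysis in the reference, not from the per-call cost shown here.

    class PNode:
        def __init__(self, key):
            self.key = key
            self.children = []

    def merge(a, b):
        # min-pairing-heap merge: the larger root becomes a child of the smaller one
        if a is None:
            return b
        if b is None:
            return a
        if b.key < a.key:
            a, b = b, a
        a.children.append(b)
        return a

    def insert(root, key):
        return merge(root, PNode(key))   # constant actual work

    root = None
    for k in [4, 2, 7, 1]:
        root = insert(root, k)
    print(root.key)   # 1, the minimum sits at the root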

Fibonacci heaps

I found this source for the Fibonacci heaps; however, it says that every operation is done in amortized time (mind you, I haven't read the whole thing). I think it is the original paper. Plaga701 (talk) 10:41, 27 July 2011 (UTC)[reply]

Leaf heap

I just came across this term in course slides from MIT OCW, by J.B. Orlin, Network Optimization, spring 2003. I'd like to create a new article to describe this object. However, I cannot find a formal public reference now, since the slides at hand were inherited from a senior. In short, this is a heap with all real elements placed in leaves, and every internal node holds the value of its smaller child. Every operation costs log(n) time. I don't think this structure is practical, but it's reasonable to note it here to make a complete article. Can anybody help proceed along this line? Hupili (talk) 09:16, 4 February 2012 (UTC)[reply]
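To make the description above concrete, here is a minimal Python sketch (class and method names are my own) of such a leaf heap, sometimes described as a tournament tree: the elements sit in the leaves of a complete binary tree, each internal node stores the smaller value of its two children, so the root is the overall minimum, and updating a leaf only re-propagates along one leaf-to-root path.

    class LeafHeap:
        def __init__(self, items):
            self.n = 1
            while self.n < len(items):      # round the leaf count up to a power of two
                self.n *= 2
            INF = float('inf')
            self.tree = [INF] * (2 * self.n)
            self.tree[self.n:self.n + len(items)] = items   # real elements live only in the leaves
            for i in range(self.n - 1, 0, -1):              # each internal node = min of its two children
                self.tree[i] = min(self.tree[2 * i], self.tree[2 * i + 1])

        def find_min(self):
            return self.tree[1]             # read the root

        def update(self, leaf_index, value):   # O(log n): recompute one leaf-to-root path
            i = self.n + leaf_index
            self.tree[i] = value
            i //= 2
            while i >= 1:
                self.tree[i] = min(self.tree[2 * i], self.tree[2 * i + 1])
                i //= 2

    h = LeafHeap([5, 3, 8, 1])
    print(h.find_min())   # 1
    h.update(3, 9)        # overwrite the leaf holding 1
    print(h.find_min())   # 3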


Radix heap

Also from 'J.B. Orlin, Network Optimization, spring 2003': a specialized data structure to improve Dial's algorithm (an implementation of Dijkstra's shortest path algorithm). It uses exponentially growing bucket widths to improve the find-min procedure. Hupili (talk) 02:48, 5 February 2012 (UTC)[reply]
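A minimal Python sketch of the bucketing idea, under the assumption that this is the usual one-level radix heap formulation for monotone keys: a key is filed by the most significant bit in which it differs from the last deleted minimum, so bucket widths grow roughly as powers of two. This shows only the bucket-index computation, not a full radix heap.

    def bucket_index(key, last_min):
        # bucket 0 holds keys equal to the last deleted minimum;
        # otherwise use 1 + position of the highest differing bit
        if key == last_min:
            return 0
        return (key ^ last_min).bit_length()

    last_min = 0
    for key in [1, 2, 3, 7, 8, 100]:
        print(key, "-> bucket", bucket_index(key, last_min))
    # 1 -> 1, 2 -> 2, 3 -> 2, 7 -> 3, 8 -> 4, 100 -> 7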

Origin of Phrase "the Heap"

I deleted this line:

Some early popular languages such as Lisp provided dynamic memory allocation using heap data structures, which gave the memory area its name.

The line is mysterious to me because I don't see how heaps (the data structure) are useful in the context of memory management. What would the heap property be? Early Lisp implementations generally used mark-and-sweep to reclaim memory.

It was added by an anonymous editor (IP 82.180.29.126) on 20 April 2011. It was justified by a footnote to Cormen et al., but here is what Cormen et al. actually say:

We note that the term "heap" was originally coined in the context of heapsort, but it has since come to refer to "garbage-collected storage" such as the languages Lisp and Java provide. Our heap data structure is not garbage-collected storage, and whenever we refer to heaps in this book, we shall mean the structure defined in this chapter.

In The Art of Computer Programming, Vol. 1, 2nd Ed. p. 435, Knuth says the following on the use of "heap" in the context of dynamic memory:

Several authors began about 1975 to call the pool of available memory a "heap". But in the present series of books, we will use the word only in its more traditional sense related to priority queues.

I found an article dated 1985 which uses what it calls a "heap data structure" to manage memory in a Forth implementation. However the data structure described is not the heap which is the topic of this Wikipedia page.

Googling the line shows that it has been repeated multiple times on the web, but none of the references predate the 20 April 2011 Wikipedia edit. — Preceding unsigned comment added by Grubbiv (talkcontribs) 17:42, 19 August 2012 (UTC)[reply]

Floyd's algorithm

There's currently a reference to Floyd's algorithm in section Implementation and operations, but there's no mention of an application to heaps on the algorithm's page. I think the linked one is not the correct "Floyd's algorithm" in this case. If so, the link should be removed.

confusing

I'm a graduate-level EE who has been working for 20 years, and I don't understand. Perhaps the explanation could be less circular. Please keep in mind that this material is used by people who do not already know much about the subject.

Agreed. This entry does not read well, and provides no explanations or definitions. —Preceding unsigned comment added by Chieffancypants (talkcontribs) 03:47, 11 January 2008 (UTC)