Leftist tree

From Wikipedia, the free encyclopedia

In computer science, a leftist tree or leftist heap is a priority queue implemented with a variant of a binary heap. Every node has an s-value, which is the distance from that node to the nearest null (missing) child in its subtree. In contrast to a binary heap, a leftist tree makes no attempt to be balanced and may be very unbalanced. In addition to the heap property, leftist trees are maintained so that the s-value of each node's right child is no greater than that of its left child.

The height-biased leftist tree was invented by Clark Allan Crane.[1] The name comes from the fact that the left subtree is usually taller than the right subtree.

A leftist tree is a mergeable heap. When inserting a new node into a tree, a new one-node tree is created and merged into the existing tree. To delete an item, it is replaced by the merge of its left and right sub-trees. Both these operations take O(log n) time. For insertions, this is slower than binomial heaps which support insertion in O(1) (constant) amortized time, and O(log n) worst-case.

Leftist trees are advantageous because of their ability to merge quickly, compared to binary heaps, which take Θ(n) time to merge. In almost all cases, merging skew heaps has better performance; however, merging leftist heaps has worst-case O(log n) complexity, while merging skew heaps has only amortized O(log n) complexity.

Bias[edit]

The usual leftist tree is a height-biased leftist tree.[1] However, other biases can exist, such as in the weight-biased leftist tree.[2]

S-value[edit]

S-values of a leftist tree

The s-value (or rank) of a node is the distance from that node to the nearest missing leaf. Put another way, the s-value of a null child is implicitly zero. Other nodes have an s-value equal to one more than the minimum of their children's s-values. Thus, in the example at right, all nodes with at least one missing child have an s-value of 1, while node 4 has an s-value of 2, since its right child (8) has an s-value of 1.
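The recursive definition above can be checked directly. The following is a minimal sketch, assuming a hypothetical node type with an element and two child links (the article's actual nodes also cache the s-value in a field rather than recomputing it):

```java
// Hypothetical minimal node type for illustration only.
class Node {
    int element;
    Node left, right;
    Node(int element, Node left, Node right) {
        this.element = element;
        this.left = left;
        this.right = right;
    }
}

public class SValueDemo {
    // s-value: one more than the minimum of the children's s-values,
    // where a null child implicitly has an s-value of 0.
    static int sValue(Node n) {
        if (n == null) return 0;
        return 1 + Math.min(sValue(n.left), sValue(n.right));
    }

    public static void main(String[] args) {
        // A shape like node 4 in the figure: its right child is a leaf.
        Node leaf = new Node(8, null, null);
        Node left = new Node(5, new Node(19, null, null), new Node(27, null, null));
        Node root = new Node(4, left, leaf);
        System.out.println(sValue(leaf)); // 1: both children are null
        System.out.println(sValue(root)); // 2: its right child has s-value 1
    }
}
```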

Merging height biased leftist trees[edit]

Assuming a min-heap, choose the lower-valued node as the root, and merge the higher-valued node into the new root's right subtree.

After merging, compare the s-values (heights) of the resultant children. Swap them, if necessary, so the left child's s-value is at least as large as the right child's. (If one child is missing, it should be the right one.) Then update the s-value of the root to be one more than its right child's.

Java code for merging a min height biased leftist tree[edit]

public Node merge(Node x, Node y) {
  if (x == null)
    return y;
  if (y == null) 
    return x;

  // for a max-heap, the comparison below would instead be
  // x.element.compareTo(y.element) < 0
  if (x.element.compareTo(y.element) > 0) {
    // x.element > y.element
    Node temp = x;
    x = y;
    y = temp;
  }

  x.rightChild = merge(x.rightChild, y);

  if (x.leftChild == null) {
    // left child doesn't exist, so move right child to the left side
    x.leftChild = x.rightChild;
    x.rightChild = null;
    // x.s was, and remains, 1
  } else {
    // left child does exist, so compare s-values
    if (x.leftChild.s < x.rightChild.s) {
      Node temp = x.leftChild;
      x.leftChild = x.rightChild;
      x.rightChild = temp;
    }
    // since we know the right child has the lower s-value, we can just
    // add one to its s-value
    x.s = x.rightChild.s + 1;
  }
  return x;
}
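The merge routine does all the work; insertion and deletion-of-minimum are thin wrappers around it, as described earlier. The following is a self-contained sketch using int elements instead of Comparable for brevity (the class name LeftistHeap is an illustrative choice):

```java
public class LeftistHeap {
    static class Node {
        int element;
        Node leftChild, rightChild;
        int s = 1;                        // a new one-node tree has s-value 1
        Node(int element) { this.element = element; }
    }

    Node root;

    static Node merge(Node x, Node y) {
        if (x == null) return y;
        if (y == null) return x;
        if (x.element > y.element) {      // min-heap: the smaller root wins
            Node t = x; x = y; y = t;
        }
        x.rightChild = merge(x.rightChild, y);
        if (x.leftChild == null) {        // a missing child must be the right one
            x.leftChild = x.rightChild;
            x.rightChild = null;          // x.s was, and remains, 1
        } else {
            if (x.leftChild.s < x.rightChild.s) {
                Node t = x.leftChild; x.leftChild = x.rightChild; x.rightChild = t;
            }
            x.s = x.rightChild.s + 1;     // right child now has the lower s-value
        }
        return x;
    }

    // Insertion: merge a one-node tree into the existing tree.
    void insert(int element) {
        root = merge(root, new Node(element));
    }

    // Delete-min: replace the root by the merge of its two subtrees.
    int deleteMin() {
        int min = root.element;
        root = merge(root.leftChild, root.rightChild);
        return min;
    }

    public static void main(String[] args) {
        LeftistHeap h = new LeftistHeap();
        for (int v : new int[]{4, 8, 1, 3, 5}) h.insert(v);
        while (h.root != null)
            System.out.println(h.deleteMin()); // prints 1, 3, 4, 5, 8
    }
}
```

Repeated deleteMin calls yield the elements in sorted order, since each call removes the current minimum.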

Variants[edit]

Several variations on the basic leftist tree exist, each making only minor changes to the basic algorithm:

  • The choice of the left child as the taller one is arbitrary; a "rightist tree" would work just as well.
  • It is possible to avoid swapping children by instead recording which child is taller (in, for example, the least significant bit of the s-value) and using that in the merge operation.
  • The s-value used to decide which side to merge with could use a metric other than height. For example, weight (number of nodes) could be used.
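The last variant can be sketched by storing the subtree size w in each node and keeping the heavier subtree on the left. Note that this recursive sketch only illustrates the weight invariant; published weight-biased leftist trees[2] perform the merge in a single top-down pass. The class name WbltNode is an illustrative choice:

```java
public class WbltNode {
    int element;
    WbltNode left, right;
    int w = 1;                            // subtree size, counting this node

    WbltNode(int element) { this.element = element; }

    static int weight(WbltNode n) { return n == null ? 0 : n.w; }

    static WbltNode merge(WbltNode x, WbltNode y) {
        if (x == null) return y;
        if (y == null) return x;
        if (x.element > y.element) {      // min-heap ordering
            WbltNode t = x; x = y; y = t;
        }
        x.right = merge(x.right, y);
        if (weight(x.left) < weight(x.right)) {  // heavier subtree goes left
            WbltNode t = x.left; x.left = x.right; x.right = t;
        }
        x.w = 1 + weight(x.left) + weight(x.right);
        return x;
    }

    public static void main(String[] args) {
        WbltNode t = merge(new WbltNode(4), new WbltNode(2));
        t = merge(t, new WbltNode(7));
        System.out.println(t.element);    // 2: the minimum is at the root
        System.out.println(t.w);          // 3: three nodes in the tree
    }
}
```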

Initializing a height biased leftist tree[edit]

Initializing a min HBLT - Part 1

Initializing a height biased leftist tree is primarily done in one of two ways. The first is to merge each node one at a time into a single HBLT. This process is inefficient, taking O(n log n) time. The other approach is to use a queue to store each node and resulting tree. The first two items in the queue are removed, merged, and placed back at the end of the queue. This can initialize an HBLT in O(n) time. The approach is detailed in the three diagrams supplied, which show a min height biased leftist tree being built.
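The queue-based approach can be sketched as follows, reusing the same merge logic as the Java code above; the class name HbltBuilder and the use of int elements are illustrative choices:

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class HbltBuilder {
    static class Node {
        int element;
        Node leftChild, rightChild;
        int s = 1;                        // a one-node tree has s-value 1
        Node(int element) { this.element = element; }
    }

    static Node merge(Node x, Node y) {
        if (x == null) return y;
        if (y == null) return x;
        if (x.element > y.element) {      // min-heap: the smaller root wins
            Node t = x; x = y; y = t;
        }
        x.rightChild = merge(x.rightChild, y);
        if (x.leftChild == null) {        // a missing child must be the right one
            x.leftChild = x.rightChild;
            x.rightChild = null;
        } else {
            if (x.leftChild.s < x.rightChild.s) {
                Node t = x.leftChild; x.leftChild = x.rightChild; x.rightChild = t;
            }
            x.s = x.rightChild.s + 1;
        }
        return x;
    }

    // Build an HBLT in O(n): repeatedly merge the two trees at the front
    // of the queue and push the result to the back, until one tree remains.
    static Node build(int[] elements) {
        Queue<Node> queue = new ArrayDeque<>();
        for (int e : elements)
            queue.add(new Node(e));       // start with n one-node trees
        while (queue.size() > 1)
            queue.add(merge(queue.remove(), queue.remove()));
        return queue.isEmpty() ? null : queue.remove();
    }

    public static void main(String[] args) {
        Node root = build(new int[]{4, 8, 10, 9, 1, 3, 5, 6, 11});
        System.out.println(root.element); // 1: the minimum ends up at the root
    }
}
```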

To initialize a min HBLT, place each element to be added to the tree into a queue. In the example (see Part 1 to the left), the set of numbers [4, 8, 10, 9, 1, 3, 5, 6, 11] is used. Each line of the diagram represents another cycle of the algorithm, depicting the contents of the queue. The first five steps are easy to follow. Notice that the freshly created HBLT is added to the end of the queue. In the fifth step, the first occurrence of an s-value greater than 1 appears. The sixth step shows two trees merged with each other, with predictable results.

Initializing a min HBLT - Part 2

In part 2 a slightly more complex merge happens. The tree with the lower value (tree x) has a right child, so merge must be called again on the subtree rooted by tree x's right child and the other tree. After the merge with the subtree, the resulting tree is put back into tree x. The s-value of the right child (s=2) is now greater than the s-value of the left child (s=1), so they must be swapped. The s-value of the root node 4 is also now 2.

Initializing a min HBLT - Part 3

Part 3 is the most complex. Here, we recursively call merge twice (each time with the subtree of the right child that is not grayed out). This uses the same process described for part 2.

References[edit]

  1. ^ a b Crane, Clark A. (1972), Linear Lists and Priority Queues as Balanced Binary Trees (Ph.D. thesis), Department of Computer Science, Stanford University, ISBN 0-8240-4407-X, STAN-CS-72-259 
  2. ^ Seonghun Cho; Sartaj Sahni (1996), "Weight Biased Leftist Trees and Modified Skip Lists" (PDF), Journal of Experimental Algorithmics, 3, CiteSeerX 10.1.1.13.2962, doi:10.1145/297096.297111
