Weak heap

From Wikipedia, the free encyclopedia

In computer science, a weak heap is a data structure for priority queues, combining features of the binary heap and binomial heap. It can be stored in an array as an implicit binary tree like a binary heap, and has the efficiency guarantees of binomial heaps.

A sorting algorithm using weak heaps, weak-heapsort, uses a number of comparisons that is close to the theoretical lower bound on the number of comparisons required to sort a list, so it is particularly useful when comparison is expensive, such as when comparing strings using the full Unicode collation algorithm.

Description

A weak heap is most easily understood as a heap-ordered multi-way tree stored as a binary tree using the "right-child left-sibling" convention. (This is equivalent to, but reversed from, the usual left-child right-sibling binary tree.)

In the multi-way tree, and assuming a max-heap, each parent's key is greater than or equal to (≥) all the child keys (and thus, by induction, all members of the subtree).

Expressed as a binary tree, this translates to the following invariants:[1]

  • The root node has no left child
  • For every node, the value associated with that node is greater than or equal to the values associated with all nodes in its right subtree.
  • The leaves of the tree have depths that are all within one of each other.

The last condition is a consequence of the fact that an implicit binary tree is a complete binary tree.

The structure of this tree maps very neatly onto the traditional 1-based (Ahnentafel) implicit binary tree arrangement, where node k has a next sibling (left child) numbered 2k and a first child (right child) numbered 2k + 1, by adding an additional root numbered 0. This root has no siblings, only a first child, which is node 1 (2×0 + 1).

This structure is very similar to that of a binomial heap, with a tree of height h being composed of a root plus trees of heights h − 1, h − 2, ..., 1. A perfect (no missing leaves) weak heap with 2^n elements is exactly isomorphic to a binomial heap of the same size,[2] but the two algorithms handle sizes which are not a power of 2 differently: a binomial heap uses multiple perfect trees, while a weak heap uses a single imperfect tree.

Weak heaps require the ability to exchange the left and right children (and associated subtrees) of a node. In an explicit (pointer-based) representation of the tree, this is straightforward. In an implicit (array) representation, it requires one "reverse bit" per internal node to indicate which child is considered the left child. A weak heap is thus not a strictly implicit data structure, since it requires O(n) additional space (1/2 bit per node). However, it is often possible to find space for this extra bit within the node structure, such as by tagging a pointer which is already present.

In the implicit binary tree, node k with reverse bit r_k has parent ⌊k/2⌋, left child 2k + r_k, and right child 2k + 1 − r_k.
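These index relations can be written down directly. The following is a minimal Python sketch (the bit list `reverse`, holding one reverse bit per node, is our own hypothetical name, not from the sources):

```python
# Navigation in the implicit weak-heap layout.
# `reverse` is a hypothetical list with one reverse bit (0 or 1) per node.

def parent(k):
    return k // 2                    # binary parent of node k (k > 0)

def left_child(k, reverse):
    return 2 * k + reverse[k]        # "next sibling" in the multi-way view

def right_child(k, reverse):
    return 2 * k + 1 - reverse[k]    # "first child" in the multi-way view
```

Flipping `reverse[k]` exchanges which array slot plays the role of left and right child, without moving any elements.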

Viewed as a multi-way tree, each node in a weak heap is linked to two others: a "next sibling" and a "first child". In the implicit tree, the links are fixed, so which of the two links is the sibling and which the first child is indicated by the reverse bit.

Operations on weak heaps

Note that every node in a weak heap can be considered the root of a smaller weak heap by ignoring its next sibling. Nodes with no first child are automatically valid weak heaps.

A node of height h has h − 1 children: a first child of height h − 1, a second child of height h − 2, and so on to the last child of height 1. These may be found by following the first child link and then successive next sibling links.

It also has next siblings of heights h − 1, h − 2, etc.

A node's parent in the multi-way tree is called its "distinguished ancestor". To find this in the binary tree, find the node's binary parent. If the node is the right child (first child), the parent is the distinguished ancestor. If the node is the left child (next sibling), its distinguished ancestor is the same as its binary parent's. In the implicit tree, finding the binary parent is easy, but its reverse bit must be consulted to determine which type of child the node is. (Early papers used the term "grandparent" for the distinguished ancestor,[3] a meaning confusingly different from the usual "parent of parent".)

Although the distinguished ancestor may be log2 n levels high in the tree, the average distance is 2. (It is at least 1, and half of the time we recurse, so D = 1 + D/2, meaning that D = 2.) Thus, even a simple iterative algorithm for finding the distinguished ancestor is sufficient.
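The iterative search climbs past next-sibling links until a first-child link is crossed. A Python sketch (using the hypothetical `reverse` bit list from the description above):

```python
def distinguished_ancestor(k, reverse):
    # Climb while node k is a left child (a "next sibling" in the
    # multi-way view); the binary parent of the first right child
    # reached is the distinguished ancestor. Node k is the left child
    # of k // 2 exactly when its low bit equals the parent's reverse bit.
    while k % 2 == reverse[k // 2]:
        k //= 2
    return k // 2
```

With all reverse bits zero, this reduces to stripping trailing zero bits from k and halving once.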

Like binomial heaps, the fundamental operation on weak heaps is merging two heaps of equal height h to make a weak heap of height h + 1. This requires exactly one comparison, between the roots. Whichever root is greater (assuming a max-heap) is the final root. Its first child is the losing root, which retains its children (right subtree). The winning root's children are installed as siblings of the losing root.

This operation can be performed on the implicit tree structure because the heaps being merged are never arbitrary. Rather, the two heaps are formed as part of sifting a node up the multi-way tree:

  • The first is a normal weak heap (whose next sibling link exists, but is ignored).
  • The second is the imaginary heap formed by linking the first root's distinguished ancestor (multi-way parent) to the first root's following siblings.

At the beginning, the heap invariants apply everywhere except possibly between the first root and its distinguished ancestor. All other nodes are less than or equal to their distinguished ancestors.

After comparing the two roots, the merge proceeds in one of two ways:

  1. (The distinguished ancestor is greater than or equal.) Nothing needs to be moved, and the result of the merge is the distinguished ancestor.
  2. (The first root is greater.) The first root's binary children (first child and next sibling) are exchanged (using the reverse bit), and then the first root and its distinguished ancestor are exchanged (by copying).

The second case works because, in the multi-way tree, each node keeps its children with it. The first root is promoted up the tree because it is greater than its distinguished ancestor. Thus, it is safely greater than all of the ancestor's previous children.

The previous ancestor, however, is not a safe parent for the first root's old children, because it is less than the first root and so it's not guaranteed to be greater than or equal to all of its children.

By swapping the binary children, the appropriate subset of the demoted ancestor's old children (which are safely less than or equal to it) are demoted with it. The demoted ancestor's new siblings are the first root's old children, promoted, which are safely less than or equal to the promoted first root.

After this operation, it is uncertain whether the invariant is maintained between the new distinguished ancestor and its distinguished ancestor, so the operation is repeated until the root is reached.
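The two-case merge can be sketched as a single function (a Python sketch; `a` is the element array and `reverse` the parallel bit list, both our own names). It returns whether the invariant already held:

```python
def join(a, reverse, i, j):
    # Merge the weak heap rooted at j with its distinguished ancestor i
    # (max-heap version). Exactly one comparison.
    if a[i] >= a[j]:
        return True                  # case 1: nothing needs to move
    a[i], a[j] = a[j], a[i]          # case 2: exchange the roots by copying
    reverse[j] = 1 - reverse[j]      # ... and exchange j's binary children
    return False
```

Flipping `reverse[j]` keeps the demoted ancestor's safe children with it while promoting the old children of j alongside the new root.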

Weak-heap sort

Weak heaps may be used to sort an array, in essentially the same way as a conventional heapsort.[3] First, a weak heap is built out of all of the elements of the array, and then the root is repeatedly exchanged with the last element, which is sifted down to its proper place.

A weak heap of n elements can be formed in n − 1 merges. This can be done in various orders, but a simple bottom-up implementation works from the end of the array to the beginning, merging each node with its distinguished ancestor. Note that finding the distinguished ancestor is simplified because the reverse bits in all parents of the heaps being merged are unmodified from their initial state ("not reversed"), and so do not need to be consulted.
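A minimal Python sketch of this bottom-up construction (function and variable names are ours):

```python
def weak_heap_construct(a):
    # Build a weak max-heap over `a` in n - 1 merges, working from the
    # end of the array toward the front. Returns the reverse-bit list.
    n = len(a)
    reverse = [0] * n
    for j in range(n - 1, 0, -1):
        # The reverse bits of j's ancestors are still 0 at this point,
        # so the distinguished ancestor is found without reading them:
        # strip trailing zero bits, then halve once.
        i = j
        while i % 2 == 0:
            i //= 2
        i //= 2
        if a[i] < a[j]:               # merge: one comparison per edge
            a[i], a[j] = a[j], a[i]
            reverse[j] = 1 - reverse[j]
    return reverse
```

Each element is merged with its distinguished ancestor exactly once, for n − 1 comparisons in total.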

As with heapsort, if the array to be sorted is larger than the CPU cache, performance is improved if subtrees are merged as soon as two of the same size become available, rather than merging all subtrees on one level before proceeding to the next.[4]

Sifting down in a weak heap can be done in h = ⌈log2 n⌉ comparisons, as opposed to 2 log2 n for a binary heap, or 1.5 log2 n for the "bottom-up heapsort" variant. This is done by "merging up": after swapping the root with the last element of the heap, find the last (height-1) child of the root. Merge this with the root (its distinguished ancestor), resulting in a valid height-2 heap at the global root. Then go to the previous sibling (binary parent) of the last merged node, and merge again. Repeat until the root is reached, when it will be correct for the complete tree.
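Putting construction and merging-up together gives the full sort. A self-contained Python sketch (all names are ours, not from the cited papers):

```python
def weak_heap_sort(a):
    # Weak-heapsort sketch: build a weak max-heap, then repeatedly swap
    # the root with the last element and restore the heap by "merging up"
    # along the root's chain of children.
    n = len(a)
    if n < 2:
        return
    reverse = [0] * n

    def join(i, j):
        # One comparison: restore the invariant between node j and its
        # distinguished ancestor i.
        if a[i] < a[j]:
            a[i], a[j] = a[j], a[i]
            reverse[j] = 1 - reverse[j]

    # Construction: n - 1 bottom-up merges (ancestor bits are still 0).
    for j in range(n - 1, 0, -1):
        i = j
        while i % 2 == 0:
            i //= 2
        join(i // 2, j)

    for size in range(n - 1, 1, -1):
        a[0], a[size] = a[size], a[0]     # extract the current maximum
        x = 1
        while 2 * x + reverse[x] < size:  # descend to the root's last child
            x = 2 * x + reverse[x]
        while x > 0:                      # merge each spine node with the root
            join(0, x)
            x //= 2
    a[0], a[1] = a[1], a[0]               # the final two are in max order
```

The inner descent follows next-sibling (left-child) links, so each node on the spine is the binary parent of the one below it, and the upward pass visits them with `x //= 2`.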

Priority queue operations

In a weak max-heap, the maximum value can be found (in constant time) as the value associated with the root node; similarly, in a weak min-heap, the minimum value can be found at the root.

As with binary heaps, weak heaps can support the typical operations of a priority queue data structure: insert, delete-min, delete, or decrease-key, in logarithmic time per operation.

Sifting up is done using the same process as in binary heaps. The new node is added at the leaf level, then compared with its distinguished ancestor and swapped if necessary (the merge operation). This is repeated until no more swaps are necessary or the root is reached.
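This sift-up can be sketched as follows (a Python sketch; `a` and `reverse` are parallel lists and all names are ours):

```python
def weak_heap_insert(a, reverse, x):
    # Add x at the leaf level, then sift up: merge with successive
    # distinguished ancestors until no swap is needed.
    j = len(a)
    a.append(x)
    reverse.append(0)
    while j > 0:
        k = j                         # locate the distinguished ancestor
        while k % 2 == reverse[k // 2]:
            k //= 2
        i = k // 2
        if a[i] >= a[j]:              # invariant holds; done
            break
        a[i], a[j] = a[j], a[i]       # merge: swap the two roots ...
        reverse[j] = 1 - reverse[j]   # ... and exchange j's binary children
        j = i
```

By the average-distance argument above, each merge step climbs an expected two levels, so an insertion performs O(log n) comparisons in the worst case but far fewer on average.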

Variants of the weak heap structure allow constant amortized time insertions and decrease-keys, matching the time for Fibonacci heaps.[2]

History and applications

Weak heaps were introduced by Dutton (1993), as part of a variant heap sort algorithm that (unlike the standard heap sort using binary heaps) could be used to sort n items using only n log2 n + O(n) comparisons.[3][5] They were later investigated as a more generally applicable priority queue data structure.[6][7]

References

  1. ^ Edelkamp, Stefan (26 May 2011), Pieterse, Vreda; Black, Paul E. (eds.), "weak-heap", Dictionary of Algorithms and Data Structures, retrieved 2015-12-01.
  2. ^ a b Edelkamp, Stefan; Elmasry, Amr; Katajainen, Jyrki (October 2012), "The weak-heap data structure: variants and applications" (PDF), Journal of Discrete Algorithms, 16: 187–205, CiteSeerX 10.1.1.455.1213, doi:10.1016/j.jda.2012.04.010, MR 2960353.
  3. ^ a b c Dutton, Ronald D. (1993), "Weak-heap sort", BIT, 33 (3): 372–381, doi:10.1007/bf01990520, S2CID 5387832.
  4. ^ Bojesen, Jesper; Katajainen, Jyrki; Spork, Maz (2000). "Performance Engineering Case Study: Heap Construction" (PostScript). ACM Journal of Experimental Algorithmics. 5 (15). CiteSeerX 10.1.1.35.3248. doi:10.1145/351827.384257. S2CID 16705375. Alternate PDF source.
  5. ^ Edelkamp, Stefan; Wegener, Ingo (2000), "On the performance of WEAK-HEAPSORT", STACS 2000 (PDF), Lecture Notes in Computer Science, vol. 1770, Springer-Verlag, pp. 254–266, CiteSeerX 10.1.1.21.1863, doi:10.1007/3-540-46541-3_21, ISBN 978-3-540-67141-1.
  6. ^ Bruun, Asger; Edelkamp, Stefan; Katajainen, Jyrki; Rasmussen, Jens (2010). "Policy-Based Benchmarking of Weak Heaps and Their Relatives" (PDF). Proceedings of the 9th International Symposium on Experimental Algorithms (SEA 2010). Lecture Notes in Computer Science. Vol. 6049. Springer-Verlag. pp. 424–435. doi:10.1007/978-3-642-13193-6_36. ISBN 978-3-642-13192-9. Archived from the original (PDF) on 2017-08-11.
  7. ^ Edelkamp, Stefan; Elmasry, Amr; Katajainen, Jyrki (2012), "The Weak-heap Family of Priority Queues in Theory and Praxis" (PDF), Proceedings of the Eighteenth Computing: The Australasian Theory Symposium (CATS 2012), vol. 128, Darlinghurst, Australia: Australian Computer Society, Inc., pp. 103–112, ISBN 978-1-921770-09-8.

Further reading

  • Edelkamp, Stefan; Elmasry, Amr; Katajainen, Jyrki (November 2013). "Weak heaps engineered" (PDF). Journal of Discrete Algorithms. 23: 83–97. doi:10.1016/j.jda.2013.07.002. "We provide a catalogue of algorithms that optimize the standard algorithms in various ways. As the optimization criteria, we consider the worst-case running time, the number of instructions, branch mispredictions, cache misses, element comparisons, and element moves."
  • Edelkamp, Stefan; Elmasry, Amr; Katajainen, Jyrki; Weiß, Armin (July 2013). Weak Heaps and Friends: Recent Developments. Combinatorial Algorithms – 24th International Workshop. Rouen, France. doi:10.1007/978-3-642-45278-9_1.