
Skewed heaps:

In this problem you are required to develop a data structure similar to the leftist heap. In leftist heaps, the Merge operation preserves both the heap ordering and the balance (leftist bias) of the underlying tree. Skewed heaps use the same idea for merging heap-ordered trees: SkewedHeapMerge is performed by merging the rightmost paths of the two trees without maintaining any explicit balance condition. This means that there's no need to store the rank of the nodes in the tree. This is similar to self-adjusting trees, since no balance information is kept or updated.

Good performance of these data structures is guaranteed by a rebalancing step, like the splay in self-adjusting trees, only simpler. At each step of the merge along the rightmost paths of the two heaps, we swap the left and right children of the nodes along this path, except for the last one. The modified procedure for merging two skewed heaps looks as follows:

function SkewedHeapMerge(h1, h2) : heap
    if h1 is empty then return h2
    else if h2 is empty then return h1
    if the root of h2 <= the root of h1 then
        exchange h1 and h2                          (* h1 holds the smallest root *)
    if right(h1) = nil then
        right(h1) := h2                             (* last node, we don't swap *)
    else
        right(h1) := SkewedHeapMerge(right(h1), h2)
        swap the left and right children of h1
    return h1
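
Below is a minimal sketch of the routine above in Python; the Node class and the field names key, left and right are choices made here for illustration, not something fixed by the problem statement.

    class Node:
        def __init__(self, key, left=None, right=None):
            self.key = key          # heap order: key <= every key in both subtrees
            self.left = left
            self.right = right

    def skewed_heap_merge(h1, h2):
        if h1 is None:
            return h2
        if h2 is None:
            return h1
        if h2.key <= h1.key:        # make h1 the heap with the smaller root
            h1, h2 = h2, h1
        if h1.right is None:        # last node on the rightmost path: attach, no swap
            h1.right = h2
        else:
            h1.right = skewed_heap_merge(h1.right, h2)
            h1.left, h1.right = h1.right, h1.left   # swap the children of h1
        return h1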

The above recursive routine can also be done iteratively. In fact, it can be done more efficiently than the leftist heap Merge (by a constant factor), because everything can be done in one pass while moving down the rightmost path. In the case of leftist heaps, we go down the rightmost path and then back up to recompute ranks, which also requires either a recursive algorithm or pointers from nodes to their parents (see the Note below).
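
To make the one-pass claim concrete, here is one possible iterative rendering in Python, using the same Node class as in the sketch above (again only a sketch; the variable names and the exact loop structure are choices made here).

    def skewed_heap_merge_iter(h1, h2):
        if h1 is None:
            return h2
        if h2 is None:
            return h1
        if h2.key <= h1.key:              # keep the smaller root in h1
            h1, h2 = h2, h1
        root = h1
        cur = h1                          # current node on the merge path
        pending = h2                      # subtree still waiting to be merged in
        while cur.right is not None:
            r = cur.right
            # the next node on the path is whichever of r and pending has the smaller root
            if pending.key <= r.key:
                nxt, pending = pending, r
            else:
                nxt = r
            # swap at cur as we pass it: the old left child moves to the right,
            # and the merged subtree (rooted at nxt) will occupy the left slot
            cur.right = cur.left
            cur.left = nxt
            cur = nxt
        cur.right = pending               # last node on the path: attach, no swap
        return root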

Since there is no balance condition, there's no guarantee that these trees will have O(log n) worst-case performance for Merge (and hence for all of the other operations). But they do have good amortized performance.

Here's the intuition for the above: in the leftist-heap Merge we only had to swap children when the right one had a larger rank than the left. In this merge algorithm we always swap children, so we might actually replace a right child of small rank with one of large rank. But the next time we come down this path, we will correct that error, because the right child will be swapped onto the left.

(a) Show that a SkewedHeapMerge of two skewed heaps takes amortized time O(log₂ n), using the accounting method.

(b) Show that Insert and DeleteMin have the same amortized time bounds.
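
For part (b) it may help to recall that Insert and DeleteMin are usually expressed as single merges; the sketch below uses the Node class and skewed_heap_merge from above (the reduction to Merge is standard, the exact code is only an illustration).

    def insert(h, key):
        # a new key is a one-node heap merged into h
        return skewed_heap_merge(h, Node(key))

    def delete_min(h):
        # the minimum is at the root; merging its two subtrees removes it
        if h is None:
            raise IndexError("delete_min from an empty heap")
        return h.key, skewed_heap_merge(h.left, h.right)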

(Hint: Use weights instead of ranks. Define the weight of a node to be the number of nodes in the subtree rooted at that node (the nodes below it, including the node itself). Call a node x heavy if its weight is more than half of the weight of its parent; otherwise it is light. What can we say about light and heavy nodes in any binary tree? Look at any root-to-leaf path in the tree. How many light nodes can you encounter on such a path? Why? The best case is when the light nodes are also right children of their parents. For the accounting method, every time we swap, we must pay $1. Moreover, if we happen to swap a heavy child into a right-child position, then we must deposit a penalty amount.)
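
The hint's definitions can be checked on small examples with two throwaway helpers (assuming the Node class from the earlier sketch; these are only for experimenting with the definitions, not part of the data structure):

    def weight(x):
        # number of nodes in the subtree rooted at x, including x itself
        if x is None:
            return 0
        return 1 + weight(x.left) + weight(x.right)

    def is_heavy(child, parent):
        # a child is heavy if its weight exceeds half of its parent's weight
        return weight(child) > weight(parent) / 2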

Note: Just a note on implementation here. Sometimes we may want to be able to move up a tree as easily as we move down; so every node will also include a pointer to its parent. That means that a node has a pointer to its left child, a pointer to its right child, and a pointer to its parent, increasing the amount of space needed to store trees. To decrease the space requirement, you can do this. Look at three nodes, a parent p and its two children r and l. Now p will have a pointer to l, but not to r; l has a pointer to r and r has a pointer back to p. So, all the left children in a tree have pointers to their right siblings (brothers) and to their own left children. All the right children have pointers to their parents and to their right children. If you draw a picture, this should make more sense. Now every node still has two pointers, and you can visit left nodes, right nodes and parent nodes easily.
