10.1 $ \mathtt{BinaryHeap}$: An Implicit Binary Tree

Our first implementation of a (priority) $ \mathtt{Queue}$ is based on a technique that is over four hundred years old. Eytzinger's method allows us to represent a complete binary tree as an array by laying out the nodes of the tree in breadth-first order (see Section 6.1.2). In this way, the root is stored at position 0, the root's left child is stored at position 1, the root's right child at position 2, the left child of the left child of the root is stored at position 3, and so on. See Figure 10.1.

Figure 10.1: Eytzinger's method represents a complete binary tree as an array.
\includegraphics[scale=0.90909]{figs/eytzinger}

If we apply Eytzinger's method to a sufficiently large tree, some patterns emerge. The left child of the node at index $ \mathtt{i}$ is at index $ \ensuremath{\mathtt{left(i)}}=2\ensuremath{\mathtt{i}}+1$ and the right child of the node at index $ \mathtt{i}$ is at index $ \ensuremath{\mathtt{right(i)}}=2\ensuremath{\mathtt{i}}+2$. The parent of the node at index $ \mathtt{i}$ is at index $ \ensuremath{\mathtt{parent(i)}}=(\ensuremath{\mathtt{i}}-1)/2$.

  int left(int i) {
    return 2*i + 1;
  }
  int right(int i) {
    return 2*i + 2;
  }
  int parent(int i) {
    return (i-1)/2;
  }
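These index computations are mutually consistent: following a child link and then the parent link returns to the starting index. A minimal standalone sketch (the function names match the listing above; everything else is just for checking):

```cpp
#include <cassert>

// Eytzinger index arithmetic for an implicit complete binary tree,
// with the root stored at index 0.
int left(int i)   { return 2*i + 1; }
int right(int i)  { return 2*i + 2; }
int parent(int i) { return (i - 1) / 2; }
```

Note that integer division makes `parent(i)` correct for both children: `parent(2*i+1)` and `parent(2*i+2)` both evaluate to `i`.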

A $ \mathtt{BinaryHeap}$ uses this technique to implicitly represent a complete binary tree in which the elements are heap-ordered: the value stored at any index $ \mathtt{i}$ is not smaller than the value stored at index $ \mathtt{parent(i)}$, with the exception of the root value, $ \ensuremath{\mathtt{i}}=0$. It follows that the smallest value in the priority $ \mathtt{Queue}$ is stored at position 0 (the root).

In a $ \mathtt{BinaryHeap}$, the $ \mathtt{n}$ elements are stored in an array $ \mathtt{a}$:

  array<T> a;
  int n;

Implementing the $ \mathtt{add(x)}$ operation is fairly straightforward. As with all array-based structures, we first check to see if $ \mathtt{a}$ is full (by checking if $ \ensuremath{\mathtt{a.length}}=\ensuremath{\mathtt{n}}$) and, if so, we grow $ \mathtt{a}$. Next, we place $ \mathtt{x}$ at location $ \mathtt{a[n]}$ and increment $ \mathtt{n}$. At this point, all that remains is to ensure that we maintain the heap property. We do this by repeatedly swapping $ \mathtt{x}$ with its parent until $ \mathtt{x}$ is no longer smaller than its parent. See Figure 10.2.

  bool add(T x) {
    if (n + 1 > a.length) resize();
    a[n++] = x;
    bubbleUp(n-1);
    return true;
  }
  void bubbleUp(int i) {
    int p = parent(i);
    while (i > 0 && compare(a[i], a[p]) < 0) {
      a.swap(i,p);
      i = p;
      p = parent(i);
    }
  }
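The same logic can be sketched as a self-contained program. This is not the book's implementation: `MinHeapSketch` is a hypothetical name, `std::vector<int>` stands in for the backing array $ \mathtt{a}$ (so growth is handled automatically and no explicit `resize()` is needed), and `<` replaces the generic `compare`:

```cpp
#include <vector>
#include <utility>

// Sketch of add(x)/bubbleUp on a std::vector-backed min-heap.
struct MinHeapSketch {
  std::vector<int> a;

  int parent(int i) { return (i - 1) / 2; }

  void bubbleUp(int i) {
    int p = parent(i);
    while (i > 0 && a[i] < a[p]) {  // swap upward while smaller than parent
      std::swap(a[i], a[p]);
      i = p;
      p = parent(i);
    }
  }

  bool add(int x) {
    a.push_back(x);               // place x at position n
    bubbleUp((int)a.size() - 1);  // restore the heap property
    return true;
  }
};
```

After any sequence of `add` calls, the smallest value added so far sits at `a[0]`.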

Figure 10.2: Adding the value 6 to a $ \mathtt{BinaryHeap}$.
\includegraphics[height=.25\textheight ]{figs/heap-insert-1}
\includegraphics[height=.25\textheight ]{figs/heap-insert-2}
\includegraphics[height=.25\textheight ]{figs/heap-insert-3}
\includegraphics[height=.25\textheight ]{figs/heap-insert-4}

Implementing the $ \mathtt{remove()}$ operation, which removes the smallest value from the heap, is a little trickier. We know where the smallest value is (at the root), but we need to replace it after we remove it and ensure that we maintain the heap property.

The easiest way to do this is to replace the root with the value $ \mathtt{a[n-1]}$, delete that value, and decrement $ \mathtt{n}$. Unfortunately, the new root element is now probably not the smallest element, so it needs to be moved downwards. We do this by repeatedly comparing this element to its two children. If it is the smallest of the three then we are done. Otherwise, we swap this element with the smallest of its two children and continue.

  T remove() {
    T x = a[0];
    a[0] = a[--n];
    trickleDown(0);
    if (3*n < a.length) resize();
    return x;
  }
  void trickleDown(int i) {
    do {
      int j = -1;
      int r = right(i);
      if (r < n && compare(a[r], a[i]) < 0) {
        int l = left(i);
        if (compare(a[l], a[r]) < 0) {
          j = l;
        } else {
          j = r;
        }
      } else {
        int l = left(i);
        if (l < n && compare(a[l], a[i]) < 0) {
          j = l;
        }
      }
      if (j >= 0)  a.swap(i, j);
      i = j;
    } while (i >= 0);
  }
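Putting the two operations together, here is a runnable sketch of $ \mathtt{remove()}$ and $ \mathtt{trickleDown(i)}$ under the same assumptions as before: `HeapSketch` is a hypothetical name, `std::vector<int>` stands in for the backing array, and a simple `add` is included so the example runs end to end:

```cpp
#include <vector>
#include <utility>

// Sketch of remove()/trickleDown on a std::vector-backed min-heap.
struct HeapSketch {
  std::vector<int> a;

  int left(int i)   { return 2*i + 1; }
  int right(int i)  { return 2*i + 2; }
  int parent(int i) { return (i - 1) / 2; }

  void add(int x) {
    a.push_back(x);
    int i = (int)a.size() - 1;
    while (i > 0 && a[i] < a[parent(i)]) {  // bubble x up to its place
      std::swap(a[i], a[parent(i)]);
      i = parent(i);
    }
  }

  void trickleDown(int i) {
    int n = (int)a.size();
    do {
      int j = -1;                    // index to swap with, -1 if none
      int r = right(i);
      if (r < n && a[r] < a[i]) {    // both children exist, right is smaller than a[i]
        int l = left(i);
        j = (a[l] < a[r]) ? l : r;   // pick the smaller of the two children
      } else {
        int l = left(i);
        if (l < n && a[l] < a[i]) j = l;  // only the left child is smaller
      }
      if (j >= 0) std::swap(a[i], a[j]);
      i = j;                         // continue from the child's position
    } while (i >= 0);
  }

  int remove() {
    int x = a[0];
    a[0] = a.back();  // replace the root with the last element
    a.pop_back();     // decrement n
    trickleDown(0);   // restore the heap property
    return x;
  }
};
```

Repeatedly calling `remove()` yields the elements in non-decreasing order, which is also the idea behind heap-sort.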

Figure 10.3: Removing the minimum value, 4, from a $ \mathtt{BinaryHeap}$.
\includegraphics[height=.25\textheight ]{figs/heap-remove-1}
\includegraphics[height=.25\textheight ]{figs/heap-remove-2}
\includegraphics[height=.25\textheight ]{figs/heap-remove-3}
\includegraphics[height=.25\textheight ]{figs/heap-remove-4}

As with other array-based structures, we will ignore the time spent in calls to $ \mathtt{resize()}$, since these can be accounted for using the amortization argument from Lemma 2.1. The running times of both $ \mathtt{add(x)}$ and $ \mathtt{remove()}$ then depend on the height of the (implicit) binary tree. Luckily, this is a complete binary tree; every level except the last has the maximum possible number of nodes. Therefore, if the height of this tree is $ h$, then it has at least $ 2^h$ nodes. Stated another way,

$\displaystyle \ensuremath{\mathtt{n}} \ge 2^h \enspace . $

Taking logarithms on both sides of this equation gives

$\displaystyle h \le \log \ensuremath{\mathtt{n}} \enspace . $

Therefore, both the $ \mathtt{add(x)}$ and $ \mathtt{remove()}$ operations run in $ O(\log \ensuremath{\mathtt{n}})$ time.
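The bound $ h\le\log \ensuremath{\mathtt{n}}$ can also be checked directly: the height of the implicit tree is the depth of its last node, which we can compute by walking parent links from index $ \ensuremath{\mathtt{n}}-1$ back to the root. The helper name `heapHeight` below is ours, not the book's:

```cpp
#include <cassert>

// Height of the implicit complete binary tree on n >= 1 nodes:
// the depth of the last node, found by walking parent links to the root.
int heapHeight(int n) {
  int h = 0;
  for (int i = n - 1; i > 0; i = (i - 1) / 2) h++;
  return h;
}
```

For example, a heap with 8 elements has height 3, and $ \log_2 8 = 3$; with 1000 elements the height is 9, below $ \log_2 1000 \approx 9.97$.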

10.1.1 Summary

The following theorem summarizes the performance of a $ \mathtt{BinaryHeap}$:

Theorem 10.1   A $ \mathtt{BinaryHeap}$ implements the (priority) $ \mathtt{Queue}$ interface. Ignoring the cost of calls to $ \mathtt{resize()}$, a $ \mathtt{BinaryHeap}$ supports the operations $ \mathtt{add(x)}$ and $ \mathtt{remove()}$ in $ O(\log \ensuremath{\mathtt{n}})$ time per operation.

Furthermore, beginning with an empty $ \mathtt{BinaryHeap}$, any sequence of $ m$ $ \mathtt{add(x)}$ and $ \mathtt{remove()}$ operations results in a total of $ O(m)$ time spent during all calls to $ \mathtt{resize()}$.

opendatastructures.org