2.7 Discussion and Exercises

Most of the data structures described in this chapter are folklore. They can be found in implementations dating back over 30 years. For example, implementations of stacks, queues, and deques that generalize easily to the ArrayStack, ArrayQueue, and ArrayDeque structures described here are discussed by Knuth [39, Section 2.2.2].

Brodnik et al. [10] seem to have been the first to describe the RootishArrayStack and prove a $\sqrt{n}$ lower-bound like that in Section 2.6.2. They also present a different structure that uses a more sophisticated choice of block sizes in order to avoid computing square roots in the $\mathtt{i2b(i)}$ method. With their scheme, the block containing $\mathtt{i}$ is block $\lfloor\log(\mathtt{i}+1)\rfloor$, which is just the index of the leading 1-bit in the binary representation of $\mathtt{i}+1$. Some computer architectures provide an instruction for computing the index of the leading 1-bit in an integer.
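In Java, for example, the index of the leading 1-bit can be obtained from Integer.numberOfLeadingZeros. The following sketch is only an illustration of the idea, not the code of Brodnik et al.:

    // Sketch: under the scheme of Brodnik et al., index i lives in block
    // floor(log2(i+1)).  For a positive int v,
    // floor(log2(v)) = 31 - Integer.numberOfLeadingZeros(v).
    int i2b(int i) {
        return 31 - Integer.numberOfLeadingZeros(i + 1);
    }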

A structure related to the RootishArrayStack is the 2-level tiered-vector of Goodrich and Kloss [30]. This structure supports $\mathtt{get(i)}$ and $\mathtt{set(i,x)}$ in constant time and $\mathtt{add(i,x)}$ and $\mathtt{remove(i)}$ in $O(\sqrt{\mathtt{n}})$ time. These running times are similar to what can be achieved with the more careful implementation of a RootishArrayStack discussed in Exercise 2.9.

Exercise 2.1   The List method $\mathtt{addAll(i,c)}$ inserts all elements of the Collection $\mathtt{c}$ into the list at position $\mathtt{i}$. (The $\mathtt{add(i,x)}$ method is a special case where $\mathtt{c}=\{\mathtt{x}\}$.) Explain why, for the data structures in this chapter, it is not efficient to implement $\mathtt{addAll(i,c)}$ by repeated calls to $\mathtt{add(i,x)}$. Design and implement a more efficient implementation.
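One reason repeated calls are slow: each call to $\mathtt{add(i,x)}$ shifts the roughly $\mathtt{n}-\mathtt{i}$ elements that follow position $\mathtt{i}$, so inserting k elements this way costs about $k(\mathtt{n}-\mathtt{i})+k^2$ shifts, whereas shifting everything only once costs $O(\mathtt{n}-\mathtt{i}+k)$. A minimal sketch for the ArrayStack of this chapter (the fields a and n are assumed, Collection is java.util.Collection, and error checking is omitted):

    // Sketch of addAll(i,c) for the ArrayStack: grow the backing array at most
    // once, open a gap of size k at position i with a single copy, then fill
    // the gap with the elements of c.  Total cost: O(n - i + k) plus the
    // amortized cost of growing, versus O(k(n-i) + k^2) for k calls to add(i,x).
    void addAll(int i, Collection<T> c) {
        int k = c.size();
        if (n + k > a.length) {                    // grow the backing array once
            T[] b = (T[]) new Object[Math.max(2 * (n + k), 1)];
            System.arraycopy(a, 0, b, 0, n);
            a = b;
        }
        System.arraycopy(a, i, a, i + k, n - i);   // open a gap of k cells at i
        int j = i;
        for (T x : c) a[j++] = x;                  // copy c into the gap
        n += k;
    }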

Exercise 2.2   Design and implement a RandomQueue. This is an implementation of the Queue interface in which the $\mathtt{remove()}$ operation removes an element that is chosen uniformly at random among all the elements in the queue. The $\mathtt{add(x)}$ and $\mathtt{remove()}$ operations in a RandomQueue should take constant time.
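A standard approach, given here only as a rough sketch: keep the elements in a backing array, as in the ArrayStack; $\mathtt{add(x)}$ appends at the end, and $\mathtt{remove()}$ picks a uniformly random position, moves the last element into that position, and then removes the last position. Both operations then run in $O(1)$ amortized time.

    import java.util.Random;

    // Sketch of a RandomQueue.  Elements live in a backing array; remove()
    // moves the last element into the slot of the randomly chosen victim,
    // so no shifting is ever needed.
    public class RandomQueue<T> {
        T[] a = (T[]) new Object[1];    // backing array
        int n = 0;                      // number of elements stored
        Random rand = new Random();

        void resize() {                 // same doubling/halving idea as ArrayStack
            T[] b = (T[]) new Object[Math.max(2 * n, 1)];
            System.arraycopy(a, 0, b, 0, n);
            a = b;
        }

        public void add(T x) {
            if (n + 1 > a.length) resize();
            a[n++] = x;                 // append at the end
        }

        public T remove() {
            int i = rand.nextInt(n);    // uniform index in 0..n-1
            T x = a[i];
            a[i] = a[n - 1];            // move the last element into slot i
            a[--n] = null;              // drop the reference to help the GC
            if (a.length >= 3 * n) resize();
            return x;
        }
    }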

Exercise 2.3   Design and implement a Treque (triple-ended queue). This is a List implementation in which $\mathtt{get(i)}$ and $\mathtt{set(i,x)}$ run in constant time and $\mathtt{add(i,x)}$ and $\mathtt{remove(i)}$ run in time

$\displaystyle O(1+\min\{\mathtt{i},\ \mathtt{n}-\mathtt{i},\ \vert\mathtt{n}/2-\mathtt{i}\vert\}) \enspace .
$

With this running time, modifications are fast if they are near either end or near the middle of the list.
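One possible approach, given here only as a rough sketch: keep the first half of the list in one ArrayDeque and the second half in another, and move a single element between them after each modification to keep their sizes balanced. The ArrayDeque class of this chapter is assumed, and construction of the two deques is omitted since it depends on that class's constructor.

    // Sketch of a Treque built from two ArrayDeques: front stores list items
    // 0..front.size()-1 and back stores the rest.  Operating at position i in
    // a deque of size about n/2 costs O(1+min{i, n/2-i}), which gives the
    // bound O(1+min{i, n-i, |n/2-i|}) overall; rebalancing touches only the
    // ends of the two deques, so it costs O(1) amortized.
    public class Treque<T> {
        ArrayDeque<T> front, back;   // initialization omitted in this sketch

        public int size() {
            return front.size() + back.size();
        }

        public T get(int i) {
            return i < front.size() ? front.get(i) : back.get(i - front.size());
        }

        public T set(int i, T x) {
            return i < front.size() ? front.set(i, x)
                                    : back.set(i - front.size(), x);
        }

        // Keep the two sizes within one of each other by moving one element
        // across the boundary between the two deques.
        void balance() {
            if (front.size() > back.size() + 1)
                back.add(0, front.remove(front.size() - 1));
            else if (back.size() > front.size() + 1)
                front.add(front.size(), back.remove(0));
        }

        public void add(int i, T x) {
            if (i < front.size()) front.add(i, x);
            else back.add(i - front.size(), x);
            balance();
        }

        public T remove(int i) {
            T x = i < front.size() ? front.remove(i)
                                   : back.remove(i - front.size());
            balance();
            return x;
        }
    }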

Exercise 2.4   Implement a method $\mathtt{rotate(r)}$ that ``rotates'' a List so that list item $\mathtt{i}$ becomes list item $(\mathtt{i}+\mathtt{r})\bmod \mathtt{n}$. When run on an ArrayDeque or a DualArrayDeque, $\mathtt{rotate(r)}$ should run in $O(1+\min\{\mathtt{r},\mathtt{n}-\mathtt{r}\})$ time.
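One way to meet this bound, sketched below using only List operations: rotating by $\mathtt{r}$ is the same as moving the last element to the front $\mathtt{r}$ times, or moving the first element to the back $\mathtt{n}-\mathtt{r}$ times, whichever is fewer. On an ArrayDeque or DualArrayDeque, each such move costs $O(1)$ amortized time. (java.util.List is used here only for concreteness; the List interface of this chapter offers the same operations.)

    // Sketch of rotate(r): choose the cheaper of the two rotation directions,
    // so the total amortized cost on an ArrayDeque or DualArrayDeque is
    // O(1 + min{r, n-r}).
    static <T> void rotate(java.util.List<T> l, int r) {
        int n = l.size();
        if (n == 0) return;
        r = ((r % n) + n) % n;               // normalize r into 0..n-1
        if (r <= n - r) {
            for (int k = 0; k < r; k++)      // each pass is a rotation by 1
                l.add(0, l.remove(n - 1));
        } else {
            for (int k = 0; k < n - r; k++)  // each pass is a rotation by n-1
                l.add(n - 1, l.remove(0));
        }
    }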

Exercise 2.5   Modify the ArrayDeque implementation so that the shifting done by $\mathtt{add(i,x)}$, $\mathtt{remove(i)}$, and $\mathtt{resize()}$ is done using $\mathtt{System.arraycopy(s,i,d,j,n)}$.

Exercise 2.6   Modify the ArrayDeque implementation so that it does not use the $\mathtt{\%}$ operator (which is expensive on some systems). Instead, it should make use of the fact that, if $\mathtt{a.length}$ is a power of 2, then $\mathtt{k\%a.length} = \mathtt{k\&(a.length-1)}$. (Here, $\mathtt{\&}$ is the bitwise-and operator.)
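For instance, if the backing array's length is always kept a power of 2, the index computation in $\mathtt{get(i)}$ might look like the following sketch (the fields a and j are those of the ArrayDeque in this chapter):

    // Sketch: get(i) without the % operator.  Because a.length is a power
    // of 2, (j + i) % a.length == (j + i) & (a.length - 1).
    T get(int i) {
        return a[(j + i) & (a.length - 1)];
    }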

Exercise 2.7   Design and implement a variant of ArrayDeque that does not do any modular arithmetic at all. Instead, all the data sits in a consecutive block, in order, inside an array. When the data overruns the beginning or the end of this array, a modified $\mathtt{rebuild()}$ operation is performed. The amortized cost of all operations should be the same as in an ArrayDeque.

Hint: Making this work is really all about how a $\mathtt{rebuild()}$ operation is performed. You would like $\mathtt{rebuild()}$ to put the data structure into a state where the data cannot run off either end until at least $\mathtt{n}/2$ operations have been performed.

Test the performance of your implementation against the ArrayDeque. Optimize your implementation (by using $ \mathtt{System.arraycopy(a,i,b,i,n)}$) and see if you can get it to outperform the ArrayDeque implementation.
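One $\mathtt{rebuild()}$ strategy consistent with the hint, given here only as a rough sketch: copy the $\mathtt{n}$ elements into the middle third of a new array of length $3\mathtt{n}$, so that about $\mathtt{n}$ free cells remain on each side. The data then cannot run off either end until at least $\mathtt{n}$ more add operations have occurred at that end, so the $O(\mathtt{n})$ cost of $\mathtt{rebuild()}$ amortizes to $O(1)$ per operation. The fields a (backing array), j (index of the first element), and n (number of elements) are assumed.

    // Sketch of rebuild(): recentre the n elements in a new array of length
    // 3n (or 1, if n = 0), leaving about n free cells on each side.
    void rebuild() {
        T[] b = (T[]) new Object[Math.max(3 * n, 1)];
        System.arraycopy(a, j, b, n, n);   // elements now occupy b[n..2n-1]
        a = b;
        j = n;
    }

The same $\mathtt{rebuild()}$ can also be called to shrink the array when $\mathtt{n}$ becomes much smaller than $\mathtt{a.length}$, so that only $O(\mathtt{n})$ space is ever used.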

Exercise 2.8   Design and implement a version of a RootishArrayStack that has only $O(\sqrt{\mathtt{n}})$ wasted space, but that can perform the $\mathtt{add(i,x)}$ and $\mathtt{remove(i)}$ operations in $O(1+\min\{\mathtt{i},\mathtt{n}-\mathtt{i}\})$ time.

Exercise 2.9   Design and implement a version of a RootishArrayStack that has only $O(\sqrt{\mathtt{n}})$ wasted space, but that can perform the $\mathtt{add(i,x)}$ and $\mathtt{remove(i)}$ operations in $O(1+\min\{\sqrt{\mathtt{n}},\mathtt{n}-\mathtt{i}\})$ time. (For an idea on how to do this, see Section 3.3.)

Exercise 2.10   Design and implement a version of a RootishArrayStack that has only $O(\sqrt{\mathtt{n}})$ wasted space, but that can perform the $\mathtt{add(i,x)}$ and $\mathtt{remove(i)}$ operations in $O(1+\min\{\mathtt{i},\sqrt{\mathtt{n}},\mathtt{n}-\mathtt{i}\})$ time. (See Section 3.3 for ideas on how to achieve this.)
