2.7 Discussion and Exercises

Most of the data structures described in this chapter are folklore. They can be found in implementations dating back over 30 years. For example, implementations of stacks, queues, and deques that generalize easily to the $ \mathtt{ArrayStack}$, $ \mathtt{ArrayQueue}$ and $ \mathtt{ArrayDeque}$ structures described here are discussed by Knuth [39, Section 2.2.2].

Brodnik et al. [10] seem to have been the first to describe the $ \mathtt{RootishArrayStack}$ and prove a $ \sqrt{n}$ lower bound like the one given earlier in this chapter. They also present a different structure that uses a more sophisticated choice of block sizes in order to avoid computing square roots in the $ \mathtt{i2b(i)}$ method. With their scheme, the block containing $ \mathtt{i}$ is block $ \lfloor\log (\ensuremath{\mathtt{i}}+1)\rfloor$, which is just the index of the leading 1-bit in the binary representation of $ \ensuremath{\mathtt{i}}+1$. Some computer architectures provide an instruction for computing the index of the leading 1-bit in an integer.
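This block-numbering scheme can be sketched in a few lines. The method name $ \mathtt{i2b(i)}$ comes from the text; the use of `Integer.numberOfLeadingZeros`, which JVMs typically compile to a single hardware instruction where one exists, is one possible way to find the leading 1-bit:

```java
// Sketch of the block-numbering scheme of Brodnik et al.: the block
// containing index i is floor(log2(i+1)), i.e. the position of the
// leading 1-bit in the binary representation of i+1.
static int i2b(int i) {
    // numberOfLeadingZeros(x) counts zero bits above the leading 1-bit,
    // so 31 minus that count is the index of the leading 1-bit.
    return 31 - Integer.numberOfLeadingZeros(i + 1);
}
```

For example, indices 1 and 2 have $ \ensuremath{\mathtt{i}}+1 \in \{2,3\}$, whose leading 1-bit is bit 1, so both land in block 1.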

A structure related to the $ \mathtt{RootishArrayStack}$ is the $ 2$-level tiered-vector of Goodrich and Kloss [30]. This structure supports $ \mathtt{get(i)}$ and $ \mathtt{set(i,x)}$ in constant time and $ \mathtt{add(i,x)}$ and $ \mathtt{remove(i)}$ in $ O(\sqrt{\ensuremath{\mathtt{n}}})$ time. These running times are similar to what can be achieved with the more careful implementation of a $ \mathtt{RootishArrayStack}$ discussed in Exercise 2.9.

Exercise 2.1   The $ \mathtt{List}$ method $ \mathtt{addAll(i,c)}$ inserts all elements of the $ \mathtt{Collection}$ $ \mathtt{c}$ into the list at position $ \mathtt{i}$. (The $ \mathtt{add(i,x)}$ method is a special case where $ \ensuremath{\mathtt{c}}=\{\ensuremath{\mathtt{x}}\}$.) Explain why, for the data structures in this chapter, implementing $ \mathtt{addAll(i,c)}$ by repeated calls to $ \mathtt{add(i,x)}$ is not efficient. Design and implement a more efficient version.
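The inefficiency comes from repeated shifting: each call to $ \mathtt{add(i,x)}$ shifts the same tail of the array, so inserting $ \mathtt{c}$ one element at a time costs roughly $ \vert\ensuremath{\mathtt{c}}\vert$ shifts of $ \ensuremath{\mathtt{n}}-\ensuremath{\mathtt{i}}$ elements each. One way to avoid this, sketched here for a plain $ \mathtt{ArrayStack}$-style backing array (the method signature is a hypothetical one chosen for illustration, not the book's), is to shift the tail exactly once:

```java
// Sketch: insert all of c at position i into a backing array a that
// currently holds n elements. The tail a[i..n-1] is moved once, by
// c.size() positions, rather than once per inserted element.
static <T> T[] addAll(T[] a, int n, int i, java.util.Collection<? extends T> c) {
    int k = c.size();
    // Grow the array if needed (copyOf also gives us a fresh array to fill).
    T[] b = java.util.Arrays.copyOf(a, Math.max(a.length, n + k));
    System.arraycopy(a, i, b, i + k, n - i);  // one shift for the whole tail
    int j = i;
    for (T x : c) b[j++] = x;                 // drop c into the gap
    return b;
}
```

The total cost is $ O(\vert\ensuremath{\mathtt{c}}\vert + \ensuremath{\mathtt{n}} - \ensuremath{\mathtt{i}})$ rather than $ O(\vert\ensuremath{\mathtt{c}}\vert \cdot (\ensuremath{\mathtt{n}} - \ensuremath{\mathtt{i}}))$.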

Exercise 2.2   Design and implement a $ \mathtt{RandomQueue}$. This is an implementation of the $ \mathtt{Queue}$ interface in which the $ \mathtt{remove()}$ operation removes an element that is chosen uniformly at random among all the elements in the queue. The $ \mathtt{add(x)}$ and $ \mathtt{remove()}$ operations in a $ \mathtt{RandomQueue}$ should take constant time.
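One possible approach (a sketch, not the only solution) exploits the fact that a $ \mathtt{RandomQueue}$ need not preserve insertion order: back the queue with an array, and have $ \mathtt{remove()}$ swap a uniformly random element into the last slot before removing it, so no shifting is ever needed:

```java
// Sketch of a RandomQueue: add(x) appends; remove() picks a uniform
// random index, swaps that element with the last one, and removes the
// last. Both operations run in constant (amortized) time.
class RandomQueue<T> {
    private Object[] a = new Object[1];
    private int n;
    private final java.util.Random rand = new java.util.Random();

    public void add(T x) {
        if (n == a.length) a = java.util.Arrays.copyOf(a, 2 * n);  // amortized O(1)
        a[n++] = x;
    }

    @SuppressWarnings("unchecked")
    public T remove() {
        if (n == 0) throw new java.util.NoSuchElementException();
        int i = rand.nextInt(n);   // uniform index in [0, n)
        T x = (T) a[i];
        a[i] = a[--n];             // move the last element into the hole
        a[n] = null;               // let the GC reclaim the reference
        return x;
    }

    public int size() { return n; }
}
```

A complete solution would also shrink the backing array on removals, as the chapter's other structures do.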

Exercise 2.3   Design and implement a $ \mathtt{Treque}$ (triple-ended queue). This is a $ \mathtt{List}$ implementation in which $ \mathtt{get(i)}$ and $ \mathtt{set(i,x)}$ run in constant time and $ \mathtt{add(i,x)}$ and $ \mathtt{remove(i)}$ run in time

$\displaystyle O(1+\min\{\ensuremath{\mathtt{i}},\; \ensuremath{\mathtt{n}}-\ensuremath{\mathtt{i}},\; \vert\ensuremath{\mathtt{n}}/2-\ensuremath{\mathtt{i}}\vert\}) \enspace .
$

With this running time, modifications are fast if they are near either end or near the middle of the list.

Exercise 2.4   Implement a method $ \mathtt{rotate(r)}$ that ``rotates'' a $ \mathtt{List}$ so that list item $ \mathtt{i}$ becomes list item $ (\ensuremath{\mathtt{i}}+\ensuremath{\mathtt{r}})\bmod \ensuremath{\mathtt{n}}$. When run on an $ \mathtt{ArrayDeque}$, or a $ \mathtt{DualArrayDeque}$, $ \mathtt{rotate(r)}$ should run in $ O(1+\min\{\ensuremath{\mathtt{r}},\ensuremath{\mathtt{n}}-\ensuremath{\mathtt{r}}\})$ time.
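One way to meet this bound (a sketch under the assumption that end operations on the list are constant time, as they are for an $ \mathtt{ArrayDeque}$) is to move whichever is smaller, the last $ \mathtt{r}$ elements to the front or the first $ \ensuremath{\mathtt{n}}-\ensuremath{\mathtt{r}}$ elements to the back:

```java
// Sketch: rotate l so that item i becomes item (i + r) mod n, using
// min(r, n - r) end-to-end moves. On a list with O(1) end operations
// this is O(1 + min(r, n - r)); on an ArrayList it would be O(n) per move.
static <T> void rotate(java.util.List<T> l, int r) {
    int n = l.size();
    if (n == 0) return;
    r = ((r % n) + n) % n;          // normalize r into [0, n)
    if (r <= n - r) {
        for (int k = 0; k < r; k++)         // back to front, r times
            l.add(0, l.remove(n - 1));
    } else {
        for (int k = 0; k < n - r; k++)     // front to back, n - r times
            l.add(n - 1, l.remove(0));
    }
}
```

For example, rotating $ [0,1,2,3,4]$ by $ \ensuremath{\mathtt{r}}=2$ yields $ [3,4,0,1,2]$: item 0 now sits at index 2.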

Exercise 2.5   Modify the $ \mathtt{ArrayDeque}$ implementation so that the shifting done by $ \mathtt{add(i,x)}$, $ \mathtt{remove(i)}$, and $ \mathtt{resize()}$ is done using $ \mathtt{System.arraycopy(s,i,d,j,n)}$.
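The basic idea is that an element-by-element shifting loop over a contiguous range collapses into a single bulk copy; `System.arraycopy` is specified to behave correctly even when the source and destination ranges overlap. The sketch below shows the non-circular case only; the point of the exercise is that in the circular $ \mathtt{ArrayDeque}$ a shifted range may wrap around the end of the array, so it must be split into more than one `arraycopy` call:

```java
// Sketch: shift a[i..n-1] one position to the right, as add(i,x) does
// in a non-circular array. Equivalent to the loop
//     for (int k = n; k > i; k--) a[k] = a[k - 1];
// but done as one bulk copy.
static void shiftRight(Object[] a, int i, int n) {
    System.arraycopy(a, i, a, i + 1, n - i);
}
```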

Exercise 2.6   Modify the $ \mathtt{ArrayDeque}$ implementation so that it does not use the $ \mathtt{\%}$ operator (which is expensive on some systems). Instead, it should make use of the fact that, if $ \mathtt{a.length}$ is a power of 2, then $ \mathtt{k\%a.length}$$ =$ $ \mathtt{k\&(a.length-1)}$. (Here, $ \mathtt{\&}$ is the bitwise-and operator.)
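The identity holds for non-negative $ \mathtt{k}$ because when $ \mathtt{a.length}$ is $ 2^w$, the value $ \mathtt{a.length-1}$ is a mask of the low $ w$ bits, and $ \mathtt{k} \bmod 2^w$ is exactly those low bits:

```java
// Sketch: reduce a non-negative index modulo a power-of-two length
// using a bitwise-and instead of the % operator.
static int fastMod(int k, int powerOfTwoLength) {
    // powerOfTwoLength - 1 is a mask of the low bits, e.g. 8 - 1 = 0b111.
    return k & (powerOfTwoLength - 1);
}
```

Note that, unlike $ \mathtt{\%}$ on a negative $ \mathtt{k}$, the mask always yields a non-negative result, which is usually what index arithmetic wants anyway.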

Exercise 2.7   Design and implement a variant of $ \mathtt{ArrayDeque}$ that does not do any modular arithmetic at all. Instead, all the data sits in a consecutive block, in order, inside an array. When the data overruns the beginning or the end of this array, a modified $ \mathtt{rebuild()}$ operation is performed. The amortized cost of all operations should be the same as in an $ \mathtt{ArrayDeque}$.

Hint: Making this work is really all about how a $ \mathtt{rebuild()}$ operation is performed. You would like $ \mathtt{rebuild()}$ to put the data structure into a state where the data cannot run off either end until at least $ \ensuremath{\mathtt{n}}/2$ operations have been performed.

Test the performance of your implementation against the $ \mathtt{ArrayDeque}$. Optimize your implementation (by using $ \mathtt{System.arraycopy(a,i,b,i,n)}$) and see if you can get it to outperform the $ \mathtt{ArrayDeque}$ implementation.
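The heart of the hint is to recenter the data. The sketch below (the array length $ 3\ensuremath{\mathtt{n}}$ and the parameter names are illustrative assumptions, not the book's solution) copies the $ \mathtt{n}$ elements into the middle third of a fresh array, leaving about $ \mathtt{n}$ free slots on each side, so neither end can be overrun until many more operations have occurred, which is what makes the amortized analysis go through:

```java
// Sketch of the recentering rebuild(): the n elements currently stored
// contiguously at a[first..first+n-1] are copied into the middle of a
// new array of length 3n, occupying b[n..2n-1].
static Object[] rebuild(Object[] a, int first, int n) {
    Object[] b = new Object[Math.max(3 * n, 1)];
    System.arraycopy(a, first, b, n, n);
    return b;  // the caller resets its offset for the first element to n
}
```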

Exercise 2.8   Design and implement a version of a $ \mathtt{RootishArrayStack}$ that has only $ O(\sqrt{\ensuremath{\mathtt{n}}})$ wasted space, but that can perform $ \mathtt{add(i,x)}$ and $ \mathtt{remove(i)}$ operations in $ O(1+\min\{\ensuremath{\mathtt{i}},\ensuremath{\mathtt{n}}-\ensuremath{\mathtt{i}}\})$ time.

Exercise 2.9   Design and implement a version of a $ \mathtt{RootishArrayStack}$ that has only $ O(\sqrt{\ensuremath{\mathtt{n}}})$ wasted space, but that can perform $ \mathtt{add(i,x)}$ and $ \mathtt{remove(i)}$ operations in $ O(1+\min\{\sqrt{\ensuremath{\mathtt{n}}},\ensuremath{\mathtt{n}}-\ensuremath{\mathtt{i}}\})$ time.

Exercise 2.10   Design and implement a version of a $ \mathtt{RootishArrayStack}$ that has only $ O(\sqrt{\ensuremath{\mathtt{n}}})$ wasted space, but that can perform $ \mathtt{add(i,x)}$ and $ \mathtt{remove(i)}$ operations in $ O(1+\min\{\ensuremath{\mathtt{i}},\sqrt {\ensuremath{\mathtt{n}}},\ensuremath{\mathtt{n}}-\ensuremath{\mathtt{i}}\})$ time.

opendatastructures.org