1A. Run mystery(G, 3) on the digraph G... What are the values of (M[1], M[2], ..., M[N]) when the algorithm finishes?
M = 7, M = 6, M = 7. Other values of M are not set.
1B. Suppose G is ANY digraph without cycles. Describe, in words, what mystery(G, s) returns.
mystery(G,s) returns the largest id of any vertex reachable from s.
1C. Over all input graphs with N vertices and M edges, what is the worst-case running time of mystery(G,v)?
O(N+M): as for DFS, the time is proportional to the number of vertices plus the number of edges. (The total time spent in the inner loop is proportional to the sum, over all vertices w, of the out-degree of w, and this sum is exactly the number of edges.)
1D. Carefully explain your reasoning that makes you believe your answer to 1B is correct.
The value returned for a vertex w is the maximum of the id of w and the values returned by the recursive calls on the out-neighbors of w. This gives a recurrence for the value which, by induction on the acyclic structure of G, one can show equals the maximum id of any vertex reachable from w.
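Since the code for mystery is not shown here, the following is a minimal Python sketch consistent with the description above, assuming the digraph is given as a dict mapping each vertex id to a list of its out-neighbors (an assumed representation, not the original):

```python
def mystery(G, s, visited=None):
    """Return the largest id of any vertex reachable from s.

    G is assumed to be {vertex id: list of out-neighbors};
    this is a reconstruction, not the original course code.
    """
    if visited is None:
        visited = set()
    visited.add(s)
    best = s  # plays the role of M[s]: start with the id of s itself
    for w in G.get(s, []):
        if w not in visited:
            best = max(best, mystery(G, w, visited))
    return best
```

As in 1C, each vertex is visited at most once and each edge is scanned at most once, giving the O(N+M) bound.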
2A. Run mystery2(G, 1, 6) on the example graph from problem 1. What are the values S[1], S[2], ..., S[N] when the algorithm finishes?
S[1]=3, S[2]=2, S[3]=1, S[4]=0, S[5]=1, S[6]=1, S[7]=0, S[8]=0
2B. In words, if G is ANY directed acyclic graph, what does mystery2(G, s, t) return?
The number of distinct paths from s to t.
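The underlying recurrence is S[t] = 1 and, for every other vertex w, S[w] = the sum of S over w's out-neighbors. A hedged Python sketch of this computation (the actual mystery2 code is not shown; the adjacency-dict representation is assumed):

```python
def count_paths(G, s, t):
    """Count the distinct directed paths from s to t in a DAG G.

    Reconstruction of what mystery2 is described to compute;
    G is assumed to be {vertex: list of out-neighbors}.
    """
    memo = {}  # memo[w] plays the role of S[w]

    def S(w):
        if w == t:
            return 1          # the empty path from t to itself
        if w not in memo:
            memo[w] = sum(S(x) for x in G.get(w, []))
        return memo[w]

    return S(s)
```

Memoizing S[w] keeps the running time O(N+M); without it, the recursion could take exponential time on graphs with many overlapping paths.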
3A. Show that with this implementation of the Stack class, there is a sequence of N push() and pop() operations that takes total time proportional to N^2.
Perform pushes until the stack has size at least N/4 and the array has just doubled in size (with the last push). This takes at most N/2 pushes. After this, alternate pop() and push() until N operations have been performed.
Each of these pop() and push() operations causes the array to shrink or to grow, respectively, and takes time proportional to N (because the stack has size at least N/4). Since there are at least N/2 of these operations, the total time for all N operations is at least proportional to N^2.
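This blow-up can be observed directly. Below is a hypothetical Python model of such a stack (the course's Array class is not shown; assumed policy: double when full, halve as soon as half empty), with a counter for the elements copied by every resize:

```python
class NaiveStack:
    """Resizing-array stack that doubles when full and halves as soon
    as it is half empty -- the policy assumed in part 3A.
    resize_work counts elements copied during resizes."""

    def __init__(self):
        self.cap, self.n = 1, 0
        self.a = [None]
        self.resize_work = 0

    def _set_size(self, cap):
        self.resize_work += self.n  # copying n elements to the new array
        self.a = self.a[:self.n] + [None] * (cap - self.n)
        self.cap = cap

    def push(self, x):
        if self.n == self.cap:
            self._set_size(2 * self.cap)
        self.a[self.n] = x
        self.n += 1

    def pop(self):
        self.n -= 1
        x = self.a[self.n]
        if self.n <= self.cap // 2:
            self._set_size(max(1, self.cap // 2))
        return x
```

Pushing 513 items (so the array has just doubled to capacity 1024) and then alternating pop() and push() makes every one of those alternating operations copy 512 elements, exactly the quadratic pattern described above.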
3B. Now suppose that the set_size() method of the Array class is modified...
Claim: if the Stack class is implemented with the Array class modified in this way, then any sequence of N push() and pop() operations on such a Stack takes O(N) time.
Prove this claim, or find a counter-example (a sequence of N push() and pop() operations taking more than O(N) time).
The claim is true. Fix any sequence of N push() or pop() operations. Say a push() or pop() operation is "expensive" if it causes the array to either double in size or to shrink (by a factor of 2). All other push() or pop() operations take time O(1), that is, they are constant-time operations.
To see why, note that immediately after any expensive operation that leaves the array with capacity X, the stack holds exactly X/2 elements: growing doubles a full array (X/2 elements move into capacity X), and shrinking halves an array that has become one-quarter full (again X/2 elements move into capacity X, assuming the modification shrinks only at one-quarter full). The next expensive operation cannot occur until the number of elements either grows to the capacity (at least half-the-capacity pushes) or falls to one quarter of it (at least a-quarter-of-the-capacity pops). Hence, after any expensive push() or pop() operation where the size of the array is X, there must have been at least X/4 constant-time push() or pop() operations before the expensive operation (and after any previous expensive operation). Since the time to support the expensive operation is proportional to X, the time spent on the expensive operation is proportional to the time spent on the constant-time operations preceding it (and after the previous expensive operation).
Thus, the total time to support all the expensive operations is proportional to the time to support all the constant-time operations. Since the latter is O(N), so is the former. Thus, the total time for all operations is O(N).
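A hypothetical Python model of the modified policy (assumed here: the array is halved only when it becomes one-quarter full) shows that the same adversarial sequence from 3A now does only linear total resize work:

```python
class LazyStack:
    """Resizing-array stack: double when full, halve only when
    one-quarter full (the modified set_size() policy assumed in 3B).
    resize_work counts elements copied during resizes."""

    def __init__(self):
        self.cap, self.n = 1, 0
        self.a = [None]
        self.resize_work = 0

    def _set_size(self, cap):
        self.resize_work += self.n  # cost of copying n elements
        self.a = self.a[:self.n] + [None] * (cap - self.n)
        self.cap = cap

    def push(self, x):
        if self.n == self.cap:
            self._set_size(2 * self.cap)
        self.a[self.n] = x
        self.n += 1

    def pop(self):
        self.n -= 1
        x = self.a[self.n]
        if self.cap > 1 and self.n <= self.cap // 4:
            self._set_size(self.cap // 2)
        return x
```

After the array doubles to capacity 1024 it holds 512 elements, so alternating pop() and push() never triggers another resize: the only resize work is the O(N) cost of the initial doublings.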
4A. For any positive integers N and M with N > M, show that there is a sequence of N UNION and FIND operations on M elements that takes time at least proportional to N log M.
First, do M/2 unions, where each union joins two sets of the same size, to build a single set of size at least M/4. Then, do N-M/2 finds on the element that is deepest in the tree representing that single set.
The unions build a tree of depth Ω(log M), because each union of two equal-size (and hence equal-depth) trees produces a tree whose depth is one larger than the depth of the two trees being joined. Thus, each of the N-M/2 > N/2 finds takes time at least Ω(log M).
Thus, the total time is at least Ω(N log M).
4B. For any positive integers N and M, prove that any sequence of N UNION and FIND operations on M elements takes time at most O(N log M).
Each union operation takes O(1) time (it links the root of the smaller tree to the root of the larger). Each find operation takes time proportional to the depth of the found node in its tree. By induction, a tree of depth d contains at least 2^d elements, so no tree has depth more than log(M). Thus, the time per operation is O(log M), and the total time is O(N log M).
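Both bounds can be checked against a sketch of weighted quick-union (union by size, no path compression; an assumed implementation, since the course's own code is not shown):

```python
class UnionFind:
    """Union by size, no path compression.  A tree of depth d always
    contains at least 2**d elements, so depth never exceeds log2(M)."""

    def __init__(self, m):
        self.parent = list(range(m))
        self.size = [1] * m

    def find(self, x):
        # cost proportional to the depth of x in its tree
        while self.parent[x] != x:
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra           # smaller tree under larger root
        self.size[ra] += self.size[rb]

    def depth(self, x):
        d = 0
        while self.parent[x] != x:
            x, d = self.parent[x], d + 1
        return d
```

Running the tournament of equal-size unions from 4A on M = 1024 elements yields a tree whose deepest node sits at depth exactly log2(M) = 10, matching both the Ω(log M) lower bound of 4A and the O(log M) upper bound of 4B.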