Show a Proof of correctness and state, and solve the Recurrence using the Master Theorem
Question
Show a proof of correctness and state, and solve the recurrence using the Master Theorem. You are scheduling a parade for some very self-important groups, who have all submitted their float to the parade. Each group has a cutoff time $c_i$, such that if their float comes on at time $t \le c_i$, then all $p_i$ members of the group will come to watch the parade. If the float comes on at time $t > c_i$, then all of them will stay home. All floats take exactly one minute to parade and can start as early as time 0. Your job is to maximize the number of people who show up. Give an efficient algorithm to schedule all floats given $(c_i, p_i)$ for $1 \le i \le n$, where each $c_i \in \mathbb{R}^+$ and each $p_i \in \mathbb{N}$. Please do so using the greedy algorithm.
Explanation / Answer
The Master Theorem:
Recurrences:
We now introduce a general method, called the master method, for solving recurrences where all the subproblems are of the same size. We assume that the input to the master method is a recurrence of the form $T(n) = a \cdot T(n/b) + O(n^d)$. In this recurrence, there are three constants:
• $a$ is the number of subproblems that we create from one problem, and must be an integer greater than or equal to 1.
• $b$ is the factor by which the input size shrinks (it must hold that $b > 1$).
• $d$ is the exponent of $n$ in the time it takes to generate the subproblems and combine their solutions.

There is another constant "hidden" in the big-O notation. We will introduce it in the proof and see that it does not affect the result. In addition, we need to specify the "base case" of the recurrence, that is, the runtime when the input gets small enough. For a sufficiently small $n$ (say, when $n = 1$), the worst-case runtime of the algorithm is constant, namely, $T(n) = O(1)$. We now state the master theorem, which is used to solve the recurrences.
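To make these constants concrete, here is a minimal Python sketch of a plain top-down merge sort (MergeSort also appears in the examples below), with comments marking where $a$, $b$, and $d$ come from. This is only an illustrative sketch, not part of the original explanation.

```python
def merge_sort(xs):
    """Plain top-down merge sort; its recurrence is T(n) = 2*T(n/2) + O(n)."""
    if len(xs) <= 1:                      # base case: T(1) = O(1)
        return xs
    mid = len(xs) // 2
    # a = 2 subproblems, each of size n/b with b = 2
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    # combining the solutions is the merge step: O(n) work, so d = 1
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```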
Master Theorem:
Let $T(n) = a \cdot T(n/b) + O(n^d)$ be a recurrence where $a \ge 1$, $b > 1$. Then,

$$T(n) = \begin{cases} O(n^d \log n) & \text{if } a = b^d \\ O(n^d) & \text{if } a < b^d \\ O(n^{\log_b a}) & \text{if } a > b^d \end{cases}$$

Remark 1. In some cases, the recurrence may involve subproblems of size $\lceil n/b \rceil$, $\lfloor n/b \rfloor$, or $n/b + 1$. The master theorem holds for these cases as well. However, we do not prove that here.

Before we turn to the proof of the master theorem, we show how it can be used to solve the recurrences we saw earlier.

• Mult1: $T(n) = 4T(n/2) + O(n)$. The parameters are $a = 4$, $b = 2$, $d = 1$, so $a > b^d$, hence $T(n) = O(n^{\log_2 4}) = O(n^2)$.
• Karatsuba: $T(n) = 3T(n/2) + O(n)$. The parameters are $a = 3$, $b = 2$, $d = 1$, so $a > b^d$, hence $T(n) = O(n^{\log_2 3}) = O(n^{1.59})$.
• MergeSort: $T(n) = 2T(n/2) + O(n)$. The parameters are $a = 2$, $b = 2$, $d = 1$, so $a = b^d$, hence $T(n) = O(n \log n)$.
• Another example: $T(n) = 2T(n/2) + O(n^2)$. The parameters are $a = 2$, $b = 2$, $d = 2$, so $a < b^d$, hence $T(n) = O(n^2)$.

We see that for integer multiplication, Karatsuba is the clear winner! (These four examples are checked mechanically in a sketch following the proof.)

Proof of the Master Theorem. Let $T(n) = a \cdot T(n/b) + O(n^d)$ be the recurrence we solve using the master theorem. For simplicity, we assume that $T(1) = 1$ and that $n$ is a power of $b$. From the definition of big-O, we know that there is a constant $c > 0$ such that for sufficiently large $n$, $T(n) \le a \cdot T(n/b) + c \cdot n^d$. The proof of the master theorem will use the recursion tree in a similar way to our analysis of the running time of MergeSort.
[Figure: recursion tree for the recurrence. Level 0: one problem of size $n$; level 1: $a$ problems of size $n/b$; level 2: $a^2$ problems of size $n/b^2$; ...; level $\log_b n$: problems of size 1.]

The recursion tree drawn above has $\log_b n + 1$ levels. We analyze the amount of work done at each level, and then sum over all levels in order to get the total running time. Consider level $j$. At level $j$, there are $a^j$ subproblems. Each of these subproblems is of size $n/b^j$, and will take time at most $c(n/b^j)^d$ to solve (this only considers the work done at level $j$ and does not include the time it takes to solve the subsubproblems). We conclude that the total work done at level $j$ is at most

$$a^j \cdot c\left(\frac{n}{b^j}\right)^d = cn^d \left(\frac{a}{b^d}\right)^j.$$

Writing the running time this way shows us where the terms $a$ and $b^d$ come from: $a$ is the branching factor and measures how the number of subproblems grows at each level, and $b^d$ is the shrinkage in the work needed (per subproblem). Summing over all levels, we get that the total running time is at most $cn^d \sum_{j=0}^{\log_b n} (a/b^d)^j$. We now consider each of the three cases.

1. $a = b^d$. In this case, the amount of work done at each level is the same: $cn^d$. Since there are $\log_b n + 1$ levels, the total running time is at most $(\log_b n + 1) \, cn^d = O(n^d \log n)$.

2. $a < b^d$. In this case, $a/b^d < 1$, hence $\sum_{j=0}^{\log_b n} (a/b^d)^j \le \sum_{j=0}^{\infty} (a/b^d)^j = \frac{1}{1 - a/b^d} = \frac{b^d}{b^d - a}$. Hence, the total running time is $cn^d \cdot \frac{b^d}{b^d - a} = O(n^d)$. Intuitively, in this case the shrinkage in the work needed per subproblem is more significant, so the work done in the highest level "dominates" the other factors in the running time.

3. $a > b^d$. In this case, $\sum_{j=0}^{\log_b n} (a/b^d)^j = \frac{(a/b^d)^{\log_b n + 1} - 1}{(a/b^d) - 1}$. Since $a, b, c, d$ are constants, we get that the total work done is $O\left(n^d \cdot (a/b^d)^{\log_b n}\right) = O\left(n^d \cdot \frac{a^{\log_b n}}{(b^d)^{\log_b n}}\right) = O\left(n^d \cdot \frac{n^{\log_b a}}{n^d}\right) = O(n^{\log_b a})$. Intuitively, here the branching factor is more significant, so the total work done at each level increases, and the leaves of the tree "dominate".

We conclude with a more general version of the master theorem.
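The geometric-series argument in the proof can be checked numerically. The sketch below computes the per-level work $cn^d(a/b^d)^j$ for $j = 0, \dots, \log_b n$ (with $c = 1$) and shows which levels dominate in each of the three cases; it is only an illustration of the proof, not part of it.

```python
import math

def level_work(n, a, b, d, c=1.0):
    """Per-level work c*n^d*(a/b^d)^j for j = 0..log_b(n), as in the proof."""
    levels = round(math.log(n, b))
    return [c * n**d * (a / b**d) ** j for j in range(levels + 1)]

n = 2 ** 10
for a, b, d, case in [(2, 2, 1, "a = b^d"), (2, 2, 2, "a < b^d"), (4, 2, 1, "a > b^d")]:
    work = level_work(n, a, b, d)
    # a = b^d: every level equal; a < b^d: root dominates; a > b^d: leaves dominate
    print(case, "-> root level:", work[0], "leaf level:", work[-1], "total:", sum(work))
```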
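As a mechanical check of the four examples given before the proof, here is a small sketch that applies the three cases of the master theorem; the function name `master_bound` is an illustrative choice, not from the original text.

```python
import math

def master_bound(a, b, d):
    """Return the asymptotic bound given by the master theorem for
    T(n) = a*T(n/b) + O(n^d), assuming a >= 1 and b > 1."""
    if math.isclose(a, b ** d):
        return f"O(n^{d} log n)"
    if a < b ** d:
        return f"O(n^{d})"
    return f"O(n^{math.log(a, b):.2f})"   # a > b^d: O(n^(log_b a))

print(master_bound(4, 2, 1))  # Mult1:        O(n^2.00)
print(master_bound(3, 2, 1))  # Karatsuba:    O(n^1.58)
print(master_bound(2, 2, 1))  # MergeSort:    O(n^1 log n)
print(master_bound(2, 2, 2))  # last example: O(n^2)
```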
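Finally, the question itself also asks for a greedy schedule of the floats, which the explanation above does not address. The following is a hedged sketch of one standard approach, the classic greedy for unit-time jobs with deadlines and profits: take groups in decreasing order of $p_i$ and give each float the latest still-free minute slot no later than its cutoff. It is an assumption about the intended solution, not taken from the text above.

```python
def schedule_floats(groups):
    """groups: list of (c_i, p_i). Returns {group index: start minute}.
    Sketch of the classic greedy for unit-time jobs with deadlines/profits:
    highest p_i first, placed in the latest free slot t with t <= c_i.
    """
    n = len(groups)
    order = sorted(range(n), key=lambda i: -groups[i][1])  # by p_i, descending
    free = [True] * n                     # at most n useful slots: minutes 0..n-1
    schedule = {}
    for i in order:
        c_i, _ = groups[i]
        t = min(n - 1, int(c_i))          # latest candidate slot <= c_i
        while t >= 0 and not free[t]:     # scan down for a free slot (O(n^2) total;
            t -= 1                        # a union-find makes this near-linear)
        if t >= 0:
            free[t] = False
            schedule[i] = t
    # Groups left unscheduled here missed every feasible slot; they can be
    # appended to the end of the parade in any order (their members stay home).
    return schedule

print(schedule_floats([(0.5, 10), (1.2, 7), (0.9, 3)]))  # e.g. {0: 0, 1: 1}
```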