CSC 331: Algorithm Analysis
Divide-and-Conquer Algorithms

The divide-and-conquer strategy solves a problem by:
Breaking it into subproblems that are themselves smaller instances of the same type of problem.
Recursively solving these subproblems.
Appropriately combining their answers.

CSC 331: Algorithm Analysis
Recurrence Relations

Divide-and-conquer algorithms often follow a generic pattern: they tackle a problem of size n by recursively solving a subproblems of size n/b and then combining these answers in O(n^d) time, for some a, b, d > 0. Their running time can therefore be captured by the equation T(n) = aT(⌈n/b⌉) + O(n^d).

The Master Theorem: if T(n) = aT(⌈n/b⌉) + O(n^d) for some constants a > 0, b > 1, and d ≥ 0, then
T(n) = O(n^d)           if d > log_b a
T(n) = O(n^d log n)     if d = log_b a
T(n) = O(n^(log_b a))   if d < log_b a
The Master Theorem tells us the running times of most of the divide-and-conquer procedures we are likely to use.

CSC 331: Algorithm Analysis
Binary Search

Given a sorted array of numbers, return the index of the number k, or -1 if it's not there.

We could go through each number and check whether it is k; if so, return its index. What's the running time? Instead, we want to USE the fact that the array is sorted. What is a better way? (Think of the high/low guessing game.)

Input: look for 13 in: 2 3 5 6 7 13 19 100
Look at the middle number, 7. 13 > 7, so recurse on the second half: 13 19 100
Look at the middle number, 19. 13 < 19, so recurse on the first half: 13
Look at the middle number, 13. Found it, with only 3 recursive calls.

Search part of array A for element k and return its index. Assumes A is sorted.

function binarySearch(A[left..right], k)
  // Base case: 0 elements left, k was not found.
  if right < left: return Not found
  mid ← left + ⌊(right - left)/2⌋
  if k = A[mid]: return mid
  if k < A[mid]: return binarySearch(A[left..mid-1], k)
  if k > A[mid]: return binarySearch(A[mid+1..right], k)

What is the recurrence relation? What is the running time?
T(n) = T(⌈n/2⌉) + O(1). a = 1, b = 2, d = 0 ➠ O(log n)

CSC 331: Algorithm Analysis
Mergesort

The problem of sorting a list of numbers lends itself immediately to a divide-and-conquer strategy:
Split the list into two halves.
Recursively sort each half.
Merge the two sorted sublists.

Input: 10 2 5 3 7 13 1 6
(The slide's diagram shows the recursion tree: the list is split down to single elements and merged back up, producing 1 2 3 5 6 7 10 13.)

function mergesort(a[1...n])
  if n > 1:
    return merge(mergesort(a[1...⌊n/2⌋]), mergesort(a[⌊n/2⌋+1...n]))
  else:
    return a

function merge(x[1...k], y[1...m])
  if k = 0: return y[1...m]
  if m = 0: return x[1...k]
  if x[1] ≤ y[1]:
    return x[1] ✪ merge(x[2...k], y[1...m])
  else:
    return y[1] ✪ merge(x[1...k], y[2...m])
where ✪ is concatenation.

This merge procedure does a constant amount of work per recursive call (provided the required array space is allocated in advance), for a total running time of O(k + m). Thus merging is linear, and the overall time taken by mergesort is T(n) = 2T(n/2) + O(n). a = 2, b = 2, d = 1 ➠ O(n log n)
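A minimal Python sketch of the two routines above, for reference. The function names and the iterative merge loop are my own choices (the slides' merge uses recursive concatenation); treat this as an illustration of the technique rather than the course's required code.

def binary_search(a, k, left=0, right=None):
    # Return the index of k in the sorted list a, or -1 if it is absent.
    if right is None:
        right = len(a) - 1
    if right < left:                        # base case: empty range, not found
        return -1
    mid = left + (right - left) // 2        # middle of the current range
    if a[mid] == k:
        return mid
    if k < a[mid]:
        return binary_search(a, k, left, mid - 1)    # recurse on the left half
    return binary_search(a, k, mid + 1, right)       # recurse on the right half

def merge_sort(a):
    # Return a sorted copy of a: split, recursively sort each half, merge.
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

def merge(x, y):
    # Merge two sorted lists in O(len(x) + len(y)) time.
    out, i, j = [], 0, 0
    while i < len(x) and j < len(y):
        if x[i] <= y[j]:
            out.append(x[i]); i += 1
        else:
            out.append(y[j]); j += 1
    out.extend(x[i:])
    out.extend(y[j:])
    return out

if __name__ == "__main__":
    data = [10, 2, 5, 3, 7, 13, 1, 6]
    print(merge_sort(data))                 # [1, 2, 3, 5, 6, 7, 10, 13]
    print(binary_search(sorted(data), 13))  # 7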
CSC 331: Algorithm Analysis
Medians

What is the median of [45, 1, 10, 30, 25]? 25
How do you compute the median of n numbers? Sorting takes O(n log n); can we do better?

Sorting does far more work than we really need: we just want the middle element, and we don't care about the relative ordering of the rest. When looking for a recursive solution, it is paradoxically often easier to work with a more general version of the problem.

Selection
Input: a list of numbers S; an integer k
Output: the kth smallest element of S
For instance, if k = 1, the minimum of S is sought. For the median, k = ⌈|S|/2⌉.

For any number v, split the list S into three categories:
elements smaller than v (SL)
elements equal to v (SV)
elements greater than v (SR)
The three sublists can be computed in linear time, O(n).

For instance, if the array
S: 2 36 5 21 8 13 11 20 5 4 1
is split on v = 13, the three subarrays generated are:
SL: 2 5 8 11 5 4 1
SV: 13
SR: 36 21 20

Now we can instantly narrow down the search to one of these sublists. If we wanted, say, the ninth-smallest element of S, we know it must be the first-smallest element of SR, since |SL| + |SV| = 8: selection(S, 9) = selection(SR, 1). In general,
selection(S, k) = selection(SL, k)                   if k ≤ |SL|
selection(S, k) = v                                  if |SL| < k ≤ |SL| + |SV|
selection(S, k) = selection(SR, k - |SL| - |SV|)     if k > |SL| + |SV|

If we could always pick v so that |SL|, |SR| ≈ |S|/2, then
T(n) = T(n/2) + O(n). a = 1, b = 2, d = 1 ➠ O(n)
But this requires picking v to be the median, which is our ultimate goal!

Instead, we will pick v randomly from S. Naturally, the running time of our algorithm depends on the random choice of v. The worst-case scenario would be if we kept picking v to be the largest (or smallest) element of the array:
n + (n - 1) + (n - 2) + ... + n/2 = Θ(n²)
The best-case scenario would be if we always picked v to be the median element of the array: Θ(n). Where does the average running time lie?

Call v good if it lies within the 25th to 75th percentile of the array. A good v ensures that the sublists SL and SR each have size at most three-fourths that of S. Half of the elements of any list fall between the 25th and 75th percentiles, so a random v is good with probability 1/2.

Lemma: on average, a fair coin needs to be tossed two times before a "heads" is seen. So after two split operations, on average, the array will shrink to at most three-fourths of its size. Letting T(n) be the expected running time on an array of size n, we get
T(n) ≤ T(3n/4) + O(n). a = 1, b = 4/3, d = 1 ➠ O(n)
(A Python sketch of this selection procedure appears after the practice problem below.)

So, the overall ideas:
Often the obvious way to solve a problem is not the most efficient.
Try to find a way to divide the problem into pieces. If you know the answer (recursively) to the pieces, does that help you find the overall answer?
Once you write a recursive function, see if its solution is quicker than your iterative way.

Practice: (Book, problem 2.23) An array A[1..n] has a majority element if one element occurs in more than half of the entries. Design an O(n log n) algorithm to decide whether the array has a majority element and, if so, what that element is. You cannot sort the array or ask whether A[i] > A[j], since comparisons are not supported on the array type (for example, if the array were of GIF files). You can, however, ask whether A[i] = A[j].
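As referenced above, here is a minimal Python sketch of the randomized selection procedure described in this section. The function and variable names are illustrative choices, not from the slides, and k is taken to be 1-indexed.

import random

def selection(s, k):
    # Return the k-th smallest element of s (1-indexed), expected O(n) time.
    v = random.choice(s)                     # pick the split element v at random
    s_left  = [x for x in s if x < v]        # elements smaller than v (SL)
    s_v     = [x for x in s if x == v]       # elements equal to v (SV)
    s_right = [x for x in s if x > v]        # elements greater than v (SR)

    if k <= len(s_left):                     # the answer lies in SL
        return selection(s_left, k)
    if k <= len(s_left) + len(s_v):          # the answer is v itself
        return v
    return selection(s_right, k - len(s_left) - len(s_v))   # the answer lies in SR

if __name__ == "__main__":
    s = [2, 36, 5, 21, 8, 13, 11, 20, 5, 4, 1]
    print(selection(s, 9))                   # 20, the ninth-smallest element
    print(selection(s, (len(s) + 1) // 2))   # 8, the median (6th smallest of 11)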