Design and Analysis of Algorithms
Introduction

Textbook: Introduction to Algorithms, Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest and Clifford Stein

Class Policy
• Grading
  – Homeworks and quizzes (20%)
    • Programming assignments
  – First and second exams (20% each)
    • Closed books, closed notes
  – Final exam (40%)
    • Closed books, closed notes
• Late homework
  – 12% penalty for each day of delay, up to 2 days
Homework Submission
• Submit all homeworks using the e-learning system: http://elearning.just.edu.jo/
Why Study Algorithms?
• Necessary in any computer programming problem
– Improve algorithm efficiency: run faster, process more data, do something that would otherwise be impossible
– Solve problems of very large size
– Technology only improves things by a constant factor
• Compare algorithms
• Algorithms as a field of study
– Learn about a standard set of algorithms
– New discoveries arise
– Numerous application areas
• Learn techniques of algorithm design and analysis
Applications
• Multimedia
– CD player, DVD, MP3, JPG, DivX, HDTV
• Internet
– Packet routing, data retrieval (Google)
• Communication
– Cell-phones, e-commerce
• Computers
– Circuit layout, file systems
• Science
– Human genome
• Transportation
– Airline crew scheduling, ARAMEX and DHL deliveries
Roadmap
• Different problems
– Searching
– Sorting
– Graph problems
• Different design paradigms
– Divide-and-conquer
– Greedy algorithms
– Dynamic programming
Analyzing Algorithms
• Predict the amount of resources required:
– Memory: how much space is needed?
– Computational time: how fast does the algorithm run?
• FACT: running time grows with the size of the input
• Input size (number of elements in the input)
– Size of an array, polynomial degree, # of elements in a matrix, # of bits in the binary representation of the input, vertices and edges in a graph
• Def: Running time = the number of primitive operations (steps) executed before termination
– Arithmetic operations (+, -, *), data movement, control, decision making (if, while), comparison
Algorithm Efficiency vs. Speed
E.g.: Sorting n numbers (sort 10^6 numbers!)
Friend's computer = 10^9 instructions/second
Friend's algorithm = 2n^2 instructions
Your computer = 10^7 instructions/second
Your algorithm = 50 n lg n instructions

Your friend: (2 × (10^6)^2 instructions) / (10^9 instructions/second) = 2000 seconds
You: (50 × 10^6 × lg(10^6) instructions) / (10^7 instructions/second) ≈ 100 seconds

2000/100 = 20 times better!
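As a quick sanity check on the arithmetic above, the two running times can be computed directly. This is only an illustrative snippet, not part of the original slides:

```python
import math

n = 10**6

# Friend: 2n^2 instructions on a 10^9 instructions/second machine
friend_seconds = 2 * n**2 / 10**9              # 2000.0 seconds

# You: 50 n lg n instructions on a 10^7 instructions/second machine
your_seconds = 50 * n * math.log2(n) / 10**7   # about 99.7 seconds

print(friend_seconds, your_seconds, friend_seconds / your_seconds)
# roughly 2000.0, 99.7, 20.1 -> about 20 times faster
```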
Algorithm Analysis: Example
• Alg.: MIN(a[1], …, a[n])
  m ← a[1];
  for i ← 2 to n
    if a[i] < m
      then m ← a[i];
• Running time:
– the number of primitive operations (steps) executed before termination
  T(n) = 1 [first step] + n [for loop] + (n-1) [if condition] + (n-1) [the assignment in then] = 3n - 1
• Order (rate) of growth:
– The leading term of the formula
– Expresses the asymptotic behavior of the algorithm
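The pseudocode translates directly to Python (0-based indexing). The step counter below is an illustrative addition used to check the 3n - 1 worst-case count; it is not part of the algorithm itself:

```python
def min_with_steps(a):
    """Return (minimum of a, number of counted primitive steps).

    The counting follows the slide: 1 (initial assignment) + n (loop tests,
    including the final test that exits the loop) + (n-1) comparisons
    + up to (n-1) assignments inside the if.
    """
    n = len(a)
    m = a[0]
    steps = 1 + n                 # m <- a[1], plus the n loop tests
    for x in a[1:]:
        steps += 1                # if a[i] < m
        if x < m:
            steps += 1            # then m <- a[i]
            m = x
    return m, steps

# Worst case (strictly decreasing input) hits exactly 3n - 1 steps:
print(min_with_steps([5, 4, 3, 2, 1]))   # (1, 14), and 3*5 - 1 = 14
```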
Typical Running Time Functions
• 1 (constant running time)
– Instructions are executed once or a few times
• log N (logarithmic)
– A big problem is solved by cutting the original problem into smaller sizes, by a constant fraction at each step
• N (linear)
– A small amount of processing is done on each input element
• N log N
– A problem is solved by dividing it into smaller problems, solving them independently and combining the solutions
• N^2 (quadratic)
– Typical for algorithms that process all pairs of data items (double nested loops)
• N^3 (cubic)
– Processing of triples of data items (triple nested loops)
• N^k (polynomial)
• 2^N (exponential)
– Few exponential algorithms are appropriate for practical use
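To make these growth rates concrete, a short script (illustrative, not from the slides) can tabulate the functions for a few input sizes:

```python
import math

funcs = [
    ("log N",   lambda n: math.log2(n)),
    ("N",       lambda n: n),
    ("N log N", lambda n: n * math.log2(n)),
    ("N^2",     lambda n: n ** 2),
    ("N^3",     lambda n: n ** 3),
    ("2^N",     lambda n: 2 ** n),
]

for n in (10, 20, 30, 40):
    print(f"n={n}: " + ", ".join(f"{name}={f(n):,.0f}" for name, f in funcs))
# Already at n = 40, 2^N is about 1.1 * 10^12 while N^3 is only 64,000.
```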
Practical Complexity
[Four charts plot f(n) = log(n), n, n log(n), n^2, n^3, and 2^n for n = 1 to 20, with the y-axis capped at 250, 500, 1000, and (under the title "Why Faster Algorithms?") 5000. As the cap grows, only the exponential curve 2^n still leaves the chart, illustrating how quickly it outgrows the polynomial functions.]
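The charts can be approximately reproduced with matplotlib; the original styling is unknown, so this is only a rough reconstruction:

```python
import math
import matplotlib.pyplot as plt

ns = list(range(1, 21))
curves = {
    "f(n) = log(n)":   [math.log2(n) for n in ns],
    "f(n) = n":        [n for n in ns],
    "f(n) = n log(n)": [n * math.log2(n) for n in ns],
    "f(n) = n^2":      [n ** 2 for n in ns],
    "f(n) = n^3":      [n ** 3 for n in ns],
    "f(n) = 2^n":      [2 ** n for n in ns],
}

for ymax in (250, 500, 1000, 5000):     # the four y-axis caps used on the slides
    plt.figure()
    for label, ys in curves.items():
        plt.plot(ns, ys, label=label)
    plt.ylim(0, ymax)
    plt.xticks(ns)
    plt.legend()
    plt.title(f"Practical Complexity (y up to {ymax})")
plt.show()
```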
Asymptotic Notations
• A way to describe the behavior of functions in the limit
– Abstracts away low-order terms and constant factors
– How we indicate the running times of algorithms
– Describes the running time of an algorithm as n grows to ∞
• O notation: asymptotic "less than or equal": f(n) "≤" g(n)
• Ω notation: asymptotic "greater than or equal": f(n) "≥" g(n)
• Θ notation: asymptotic "equality": f(n) "=" g(n)
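For reference, the formal definitions behind these symbols (as given in Cormen et al.) are: f(n) = O(g(n)) if there exist constants c > 0 and n0 > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0; f(n) = Ω(g(n)) if there exist constants c > 0 and n0 > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0; and f(n) = Θ(g(n)) exactly when both hold.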
Asymptotic Notations - Examples
• Θ notation
– n^2/2 – n/2 = Θ(n^2)
– (6n^3 + 1) lg n / (n + 1) = Θ(n^2 lg n)
– n vs. n^2: n ≠ Θ(n^2)
• Ω notation
– n^3 vs. n^2: n^3 = Ω(n^2)
– n vs. log n: n = Ω(log n)
– n vs. n^2: n ≠ Ω(n^2)
• O notation
– 2n^2 vs. n^3: 2n^2 = O(n^3)
– n^2 vs. n^2: n^2 = O(n^2)
– n^3 vs. n lg n: n^3 ≠ O(n lg n)
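One way to see the first Θ example numerically: the ratio of n^2/2 - n/2 to n^2 settles at the constant 1/2 as n grows, which is exactly what Θ(n^2) captures. A small illustrative check (not from the slides):

```python
for n in (10, 100, 1000, 10**6):
    f = n**2 / 2 - n / 2
    print(n, f / n**2)      # the ratio approaches 0.5, so f(n) = Theta(n^2)
```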
Recursive Algorithms
• Binary search: for an ordered array A, determines whether x is in the subarray A[lo…hi]

Alg.: BINARY-SEARCH(A, lo, hi, x)
  if (lo > hi)
    return FALSE
  mid ← ⌊(lo + hi)/2⌋
  if x = A[mid]
    return TRUE
  if (x < A[mid])
    return BINARY-SEARCH(A, lo, mid-1, x)
  if (x > A[mid])
    return BINARY-SEARCH(A, mid+1, hi, x)

[Illustration: a sorted 12-element array (beginning 2, 3, 5, 7, …) with lo, mid, and hi markers]
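A runnable Python version of BINARY-SEARCH (0-based indexing, so the initial call uses lo = 0 and hi = len(a) - 1):

```python
def binary_search(a, lo, hi, x):
    """Return True if x occurs in the sorted array a[lo..hi] (inclusive bounds)."""
    if lo > hi:
        return False
    mid = (lo + hi) // 2              # floor of (lo + hi) / 2
    if x == a[mid]:
        return True
    if x < a[mid]:
        return binary_search(a, lo, mid - 1, x)
    return binary_search(a, mid + 1, hi, x)

a = [2, 3, 5, 7, 9, 10, 11, 12]       # any sorted array works
print(binary_search(a, 0, len(a) - 1, 7))    # True
print(binary_search(a, 0, len(a) - 1, 8))    # False
```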
Recurrences
Def.: Recurrence = an equation or inequality that describes a function in terms of its value on smaller inputs, and one or more base cases
• E.g.: T(n) = T(n-1) + n
• Useful for analyzing recursive algorithms
• Methods for solving recurrences
– Iteration method
– Substitution method
– Recursion tree method
– Master method
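For the example recurrence above, the iteration method (the first method listed) simply unrolls the recurrence; assuming a constant base case T(0) = c:

  T(n) = T(n-1) + n
       = T(n-2) + (n-1) + n
       = …
       = T(0) + 1 + 2 + … + n
       = c + n(n+1)/2 = Θ(n^2)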
Sorting
Iterative methods:
• Insertion sort
• Bubble sort
• Selection sort
Divide and conquer:
• Merge sort
• Quicksort
Non-comparison methods:
• Counting sort
• Radix sort
• Bucket sort
(Illustration: a sorted hand of cards: 2, 3, 4, 5, 6, 7, 8, 9, 10, J, Q, K, A)
Types of Analysis
• Worst case (e.g., cards in reverse order)
– Provides an upper bound on the running time
– An absolute guarantee that the algorithm will not run longer, no matter what the input is
• Best case (e.g., cards already ordered)
– Input is the one for which the algorithm runs the fastest
• Average case (general case)
– Provides a prediction about the running time
– Assumes that the input is random
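A small experiment (illustrative only, using insertion sort, one of the iterative methods listed earlier) that counts key comparisons on already-sorted, reverse-sorted, and random input; the best case grows roughly linearly while the worst case grows quadratically:

```python
import random

def insertion_sort_comparisons(data):
    """Sort a copy of data with insertion sort and return the number of key comparisons."""
    a = list(data)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # compare key with a[j]
            if a[j] <= key:
                break
            a[j + 1] = a[j]           # shift the larger element right
            j -= 1
        a[j + 1] = key
    return comparisons

n = 1000
print("best (already sorted):", insertion_sort_comparisons(range(n)))           # about n
print("worst (reverse order):", insertion_sort_comparisons(range(n, 0, -1)))    # about n^2 / 2
print("average (random):     ", insertion_sort_comparisons(random.sample(range(n), n)))  # about n^2 / 4
```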