15-211 Fundamental Structures of Computer Science
Heaps
Feb 17, 2005
Ananda Guna
Priority Queues
Priority Queue Data Structure
A data structure that can maintain order statistics
 What is the k-th smallest entry in a list?
A priority queue is a container that supports the operations
insert(item, priority)
removeMin()
findMin()
decreaseKey(PQpointer, newPriority)
What are the applications of priority queues?
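Sketched as a Java interface, the operations above might look like the following (the names PriorityQueue and Handle are illustrative, not the course's actual code):

// A sketch of the priority-queue operations listed above.
// PriorityQueue and Handle are hypothetical names, not the course's classes.
public interface PriorityQueue<T> {

    // Opaque handle returned by insert so decreaseKey can locate the entry later
    // (this is what the slide calls a "PQpointer").
    interface Handle { }

    Handle insert(T item, int priority);         // add an item with the given priority
    T removeMin();                                // remove and return the item with the smallest priority
    T findMin();                                  // return, without removing, the item with the smallest priority
    void decreaseKey(Handle h, int newPriority);  // lower the priority of an entry already in the queue
}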
Implementing a PQ
How to implement a PQ?
Unordered list?
What is the insertion complexity? Removal?
Sorted List?
What about a splay tree?
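For intuition: an unordered list gives O(1) insert but O(n) removeMin (a rough sketch below, with a made-up class name); a sorted list reverses that trade-off (O(n) insert, O(1) removal); a splay tree gives O(log n) amortized for both; a binary heap (next) gives O(log n) worst case with small constants.

import java.util.ArrayList;
import java.util.List;

// Hypothetical unordered-list priority queue, shown only for comparison.
public class UnorderedListPQ {
    private final List<Integer> items = new ArrayList<>();

    public void insert(int x) {          // O(1): just append at the end
        items.add(x);
    }

    public int removeMin() {             // O(n): scan for the minimum, then remove it
        int minIndex = 0;
        for (int i = 1; i < items.size(); i++)
            if (items.get(i) < items.get(minIndex))
                minIndex = i;
        return items.remove(minIndex);
    }
}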
Complete Binary Trees
Complete binary trees
[Tree diagram: a complete binary tree containing the keys 13, 21, 24, 65, 26, 16, 31, 32, 19, 68]
Complete binary trees
[Tree diagram: a complete binary tree with its ten nodes numbered 1 through 10 in level order]
Representing complete binary trees
Linked structures? No!
Arrays!
[Tree diagram: the complete binary tree with nodes numbered 1 through 10]
Representing complete binary trees
Arrays (1-based)
Parent at position i
Children at 2i and 2i+1.
[Diagram: the tree with nodes numbered 1-10 alongside its array representation, cells 1 2 3 4 5 6 7 8 9 10]
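The same index rules in code – a tiny sketch assuming the 1-based layout above (slot 0 unused):

// Index arithmetic for a complete binary tree stored in a 1-based array.
public class HeapIndex {
    static int parent(int i)     { return i / 2; }      // e.g. parent(5) == 2
    static int leftChild(int i)  { return 2 * i; }      // e.g. leftChild(2) == 4
    static int rightChild(int i) { return 2 * i + 1; }  // e.g. rightChild(2) == 5
}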
Heaps
Representation invariant
1. Structure property
Complete binary tree
Hence: efficient compact representation
2. Heap order property
Parent keys less than children keys
Hence: rapid insert, findMin, and deleteMin
• O(log(N)) for insert and deleteMin
• O(1) for findMin
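The code fragments on the following slides refer to fields like heap[] and size and to a helper isFull(); a reconstruction of the surrounding class might look like this (field names follow the snippets below, everything else is an assumption, not the course's exact code):

// Reconstructed skeleton of the heap class assumed by the later snippets.
public class BinaryHeap {
    private Comparable[] heap;   // 1-based: heap[1] is the minimum; heap[0] is unused
    private int size;            // number of elements currently stored

    public BinaryHeap(int capacity) {
        heap = new Comparable[capacity + 1];
        size = 0;
    }

    public boolean isFull() { return size == heap.length - 1; }

    // Invariant: heap[1..size] forms a complete binary tree (structure property),
    // and heap[i/2] <= heap[i] for every 2 <= i <= size (heap order property).
}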
An Example of a Heap
The heap order property
Each parent is less than each of its
children.
Hence: Root is less than every other
node.
Proof by induction
[Tree diagram: an example heap containing the keys 13, 21, 24, 65, 26, 16, 31, 32, 19, 68, with 13 at the root]
[Tree diagrams: further example heaps illustrating the heap order property]
Heaps to perform PQ operations
How to code the PQ operations using a heap?
findMin() –
 The code

public boolean isEmpty() {
    return size == 0;
}

public Comparable findMin() {
    if (isEmpty()) return null;
    return heap[1];   // the minimum is always at the root, index 1
}
Does not change the tree
Trivially preserves the invariant
Insert Operation
Insert(x) – put the new element into the next leaf position (maintaining the complete-tree property), then swap it up until it is <= its parent.
 More formally…
insert (Comparable x)
Process
1. Create a “hole” at the next tree cell for x: heap[size+1].
   This preserves the completeness of the tree, assuming it was complete to begin with.
2. Percolate the hole up the tree until the heap order property is satisfied.
   This assures the heap order property is satisfied, assuming it held at the outset.
Percolation up
public void insert(Comparable x) throws Overflow {
    if (isFull()) throw new Overflow();
    int i;
    for (i = ++size;                            // start at the new last leaf
         i > 1 && x.compareTo(heap[i/2]) < 0;   // while x is smaller than the parent
         i /= 2)
        heap[i] = heap[i/2];                    // slide the parent down into the hole
    heap[i] = x;                                // drop x into its final position
}
Percolation up
Bubble the hole up the tree until the heap order property is satisfied.
[Diagram: 14 is inserted as a hole at position i = 11; heap order property (HOP) false. The value is not really there yet – only the hole is tracked.]
[Diagram: the parent 31 slides down into the hole, which moves to i = 5; HOP still false.]
[Diagram: the parent 21 slides down and the hole moves to i = 2; now 14 >= its parent 13, so HOP is true and 14 is placed at position 2. Done.]
Insert(x) operation summary
Insert the new element into the next leaf position and then swap it up until it is <= its parent
Percolate up
Question: When did we add any new ancestor-descendant relationship?
What is the total runtime of insert(x)?
RemoveMin operation
Just return the element on the top
Complexity?
But what about the heap order property?
How do we replace the top element?
Put the rightmost leaf into the root position and percolate down
i.e. swap with the smaller child and recurse until the current node is _______________
Percolation down
Bubble the transplanted leaf value down the tree until the heap order property is satisfied.
[Diagram: the minimum 13 is removed, leaving a hole at the root; the last leaf, 31, is transplanted into the root position.]
[Diagram: 31 is swapped with its smaller child, 14.]
[Diagram: 31 is swapped with its smaller child, 21; 31 is now <= its remaining child, so percolation stops. Done – 14 is at the root.]
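Putting the removeMin steps into code, in the style of the insert method above – a sketch only; the percolateDown helper is an assumption about how the class is organized, not the lecture's exact code:

// Remove and return the minimum: transplant the last leaf to the root, then percolate down.
public Comparable deleteMin() {
    if (isEmpty()) return null;
    Comparable min = heap[1];
    heap[1] = heap[size--];        // move the last leaf into the root position
    percolateDown(1);
    return min;
}

// Push the value at position i down until it is <= both of its children.
private void percolateDown(int i) {
    Comparable tmp = heap[i];
    int child;
    for (; 2 * i <= size; i = child) {
        child = 2 * i;                                       // left child
        if (child != size && heap[child + 1].compareTo(heap[child]) < 0)
            child++;                                         // use the smaller child
        if (heap[child].compareTo(tmp) < 0)
            heap[i] = heap[child];                           // smaller child moves up into the hole
        else
            break;                                           // heap order restored
    }
    heap[i] = tmp;                                           // place the transplanted value
}

On the running example, this removes 13, transplants 31 into the root, and percolateDown slides it past 14 and then 21 – the same steps as in the diagrams above.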
Build Heap Operation
Suppose we have an unordered array of
elements. How do we build a heap?
for (int i = size/2; i > 0; --i)
    percolateDown(i);
Example:
BuildHeap analysis
What is the complexity of this
operation?
How can this be O(n)?
Work area
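One way to see the O(n) bound (the standard argument, filled in here since the slide leaves it as a question): percolateDown from a node of height h costs O(h), and a complete binary tree with n nodes has at most ceil(n / 2^(h+1)) nodes of height h, so the total work is

  sum over h = 0 .. floor(log2 n) of  ceil(n / 2^(h+1)) * O(h)  =  O( n * sum over h >= 0 of h / 2^h )  =  O(n),

since the series h / 2^h sums to 2.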
Sub-quadratic Sorting
Algorithms
Sorting Comparison
 We can categorize sorting algorithms into two major classes:
Fast sorts: O(N log2 N)
versus
Slow sorts: O(N^2)
Slow sorts are easy to code and sufficient when the amount of data is small.

N         N^2               N log2 N
10        100               33
100       10,000            664
1,000     1,000,000         9,966
10,000    100,000,000       132,877
100,000   10,000,000,000    1,660,964
Heap Sort
 Suppose we have an unsorted array of comparable objects that needs to be sorted. Assume that entries are in 1..n.
 First we build a max heap by percolating down – this time we need to use the larger of the two children.
 Then we deleteMax: place the max at the end of the array, reduce the size of the heap by 1, and percolate down to maintain the max-heap property.
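A compact in-place version of the two phases just described – a sketch, using 0-based indexing (children at 2i+1 and 2i+2) rather than the 1-based indexing of the slides:

import java.util.Arrays;

// In-place heapsort: build a max heap, then repeatedly swap the max to the back
// of the array and shrink the heap.
public class HeapSort {

    public static void sort(int[] a) {
        int n = a.length;
        for (int i = n / 2 - 1; i >= 0; i--)      // build the max heap (buildHeap)
            percolateDown(a, i, n);
        for (int end = n - 1; end > 0; end--) {   // deleteMax, placing the max at the back
            int tmp = a[0]; a[0] = a[end]; a[end] = tmp;
            percolateDown(a, 0, end);             // restore the max-heap property on a[0..end-1]
        }
    }

    // Push a[i] down within a[0..size-1], always following the LARGER child (max heap).
    private static void percolateDown(int[] a, int i, int size) {
        int tmp = a[i];
        int child;
        for (; 2 * i + 1 < size; i = child) {
            child = 2 * i + 1;                    // left child
            if (child + 1 < size && a[child + 1] > a[child])
                child++;                          // pick the larger child
            if (a[child] > tmp)
                a[i] = a[child];                  // larger child moves up
            else
                break;
        }
        a[i] = tmp;
    }

    public static void main(String[] args) {
        int[] a = {54, 32, 62, 12, 72, 70};       // the demo input that follows
        sort(a);
        System.out.println(Arrays.toString(a));   // [12, 32, 54, 62, 70, 72]
    }
}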
Heap Sort Demo
Start with 54 32 62 12 72 70
 Here is the max heap
[Tree diagram: the max heap containing 72, 54, 70, 12, 32, 62, with 72 at the root]
Heap Sort Demo ctd..
 Now move max 72 to the back of the array and maintain the heap order property.
[Tree diagrams: the original heap (72, 54, 70, 12, 32, 62) and the heap after removing the max (70, 54, 62, 12, 32), with 72 now at the back of the array]
Heap Sort Demo ctd..
 Now move 70 to the back of the array and maintain the heap order property.
[Tree diagrams: the heap before removing the max 70 (70, 54, 62, 12, 32) and after (62, 54, 32, 12), with 70 and 72 at the back of the array]
Heap Sort Demo ctd..
 Now move 62 to the back of the array (rightmost leaf node) and maintain the heap order property.
[Tree diagrams: the heap before removing max 62 (62, 54, 32, 12) and after (54, 12, 32), with 62, 70, 72 at the back of the array]
Heap Sort Demo ctd..
 Now move 54 to the back of the array (rightmost leaf node) and maintain the heap order property.
[Tree diagrams: the heap before removing max 54 (54, 12, 32) and after (32, 12), with 54, 62, 70, 72 at the back of the array]
Heap Sort Demo ctd..
 Now move 32 to the back of the array (rightmost leaf node) and maintain the heap order property.
[Tree diagrams: the heap before removing max 32 (32, 12) and after (12), with 32, 54, 62, 70, 72 at the back of the array]
Now the array is sorted: 12, 32, 54, 62, 70, 72
Heapsort Analysis
Recall facts about heaps:
buildHeap has O(N) worst-case running time.
removeMax has O(log N) worst-case running time.
Heapsort:
Build heap.               O(N)
DeleteMax until empty.    O(N log N)
Total worst case:         O(N log N)
Sorting in O(N log N)
Heapsort establishes the fact that sorting can be accomplished in O(N log N) worst-case running time.
Recall that another O(N log N) sort is mergesort.
In fact, it can be proved that any sorting algorithm that compares keys requires at least Ω(N log N) comparisons in the worst case.
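The reasoning behind that bound, in one line (the standard decision-tree argument, not given on the slide): a comparison sort must distinguish all N! possible input orderings, and k comparisons can distinguish at most 2^k of them, so

  k >= log2(N!) >= log2((N/2)^(N/2)) = (N/2) log2(N/2) = Ω(N log N).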
Heapsort in practice
The average-case analysis for heapsort is somewhat complex.
In practice, heapsort consistently tends to use nearly N log N comparisons.
So, while the worst case is better than N^2, other algorithms sometimes work better.
Practice vs big-O and big-Ω
We know that some algorithms have inferior big-O and/or big-Ω, but do very well in practice.
And conversely, algorithms like heapsort have good big-O but tend to be “conservative”.
The choice of algorithm thus sometimes depends on the particular application.
Summary
 HW3 will be released soon
 About data compression
 Due 3/17
 Groups of 2 are allowed
 Start early and discuss with your partner as often as you can
Next Week: Compression
Tuesday – Prefix codes and Huffman
Thursday – LZW