Priority Queue / Heap
• Stores (key, data) pairs (like a dictionary)
• But supports a different set of operations:
  • Initialize-Heap: creates a new empty heap
  • Is-Empty: returns true if the heap is empty
  • Insert(key, data): inserts a (key, data) pair, returns a pointer to the entry
  • Get-Min: returns the (key, data) pair with minimum key
  • Delete-Min: deletes the minimum (key, data) pair
  • Decrease-Key(entry, newkey): decreases the key of entry to newkey
  • Merge: merges two heaps into one
Algorithm Theory, WS 2012/13
Fabian Kuhn
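This interface can be sketched on top of Python's heapq (a binary min-heap); Decrease-Key and Merge are omitted because heapq does not support them directly:

```python
import heapq

class PriorityQueue:
    """Minimal (key, data) priority queue on top of a binary min-heap."""

    def __init__(self):                # Initialize-Heap
        self._heap = []

    def is_empty(self):                # Is-Empty
        return not self._heap

    def insert(self, key, data):       # Insert(key, data), returns the entry
        entry = (key, data)
        heapq.heappush(self._heap, entry)
        return entry

    def get_min(self):                 # Get-Min
        return self._heap[0]

    def delete_min(self):              # Delete-Min
        return heapq.heappop(self._heap)

pq = PriorityQueue()
pq.insert(3, "c"); pq.insert(1, "a"); pq.insert(2, "b")
smallest = pq.delete_min()
```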
Implementation of Dijkstra's Algorithm
Store the nodes in a priority queue, using the tentative distances d(v) as keys:
1. Initialize d(s) := 0 and d(v) := ∞ for all v ≠ s
2. All nodes are unmarked
3. Get the unmarked node u which minimizes d(u)
4. Mark node u
5. For all e = (u, v) ∈ E: d(v) := min{d(v), d(u) + w(u, v)}
6. Repeat until all nodes are marked
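The loop above can be sketched in Python. Since heapq has no Decrease-Key, this sketch uses the standard lazy-deletion workaround (re-insert with the smaller distance and skip stale entries) instead of the Decrease-Key operation from the slides:

```python
import heapq

def dijkstra(graph, s):
    """Shortest-path distances from s. graph: {u: [(v, w), ...]}."""
    dist = {v: float("inf") for v in graph}   # step 1: d(v) = inf
    dist[s] = 0                               # step 1: d(s) = 0
    marked = set()                            # step 2: all nodes unmarked
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)              # step 3: unmarked node minimizing d(u)
        if u in marked:
            continue                          # stale entry, skip
        marked.add(u)                         # step 4: mark u
        for v, w in graph[u]:                 # step 5: relax outgoing edges
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

g = {"s": [("a", 2), ("b", 5)], "a": [("b", 1)], "b": []}
d = dijkstra(g, "s")
```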
Analysis
Number of priority queue operations for Dijkstra:
• Initialize-Heap: 1
• Is-Empty: |V|
• Insert: |V|
• Get-Min: |V|
• Delete-Min: |V|
• Decrease-Key: ≤ |E|
• Merge: 0
Priority Queue Implementation
Implementation as min-heap:
⇒ complete binary tree, e.g., stored in an array
• Initialize-Heap: O(1)
• Is-Empty: O(1)
• Insert: O(log n)
• Get-Min: O(1)
• Delete-Min: O(log n)
• Decrease-Key: O(log n)
• Merge (heaps of size m and n, m ≤ n): O(m · log n) by inserting, or Θ(m + n) by rebuilding
Better Implementation
• Can we do better?
• Cost of Dijkstra with the complete binary min-heap implementation:
  O((|V| + |E|) · log |V|)
• Can be improved if we can make decrease-key cheaper…
• Cost of merging two heaps is expensive
• We will get there in two steps:
  Binomial heap ⇒ Fibonacci heap
Definition: Binomial Tree
Binomial tree of order 0: a single node.
Binomial tree of order k: two binomial trees of order k − 1, where the root of one becomes an additional (leftmost) child of the root of the other.
Binomial Trees
[Figure: examples of binomial trees of small order]
Properties
1. Tree B_k has 2^k nodes
2. Height of tree B_k is k
3. Root degree of B_k is k
4. In B_k, there are exactly (k choose d) nodes at depth d
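Properties 1 and 4 can be checked with a small sketch that builds B_k recursively; trees are encoded as nested lists of children (an illustrative representation, not from the lecture):

```python
from math import comb

def binomial_tree(k):
    """Return B_k: B_{k-1} whose root gains another B_{k-1} as a child."""
    if k == 0:
        return []
    t = binomial_tree(k - 1)
    return t + [binomial_tree(k - 1)]   # link: root gains one more child

def depth_counts(tree, depth=0, counts=None):
    """Count the nodes of the tree at each depth."""
    counts = counts if counts is not None else {}
    counts[depth] = counts.get(depth, 0) + 1
    for child in tree:
        depth_counts(child, depth + 1, counts)
    return counts

k = 5
counts = depth_counts(binomial_tree(k))
total = sum(counts.values())            # should be 2^k
```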
Binomial Coefficients
• Binomial coefficient (n choose k): number of k-element subsets of a set of size n
• Property (Pascal's triangle):
  (n choose k) = (n−1 choose k−1) + (n−1 choose k)
Number of Nodes at Depth d in B_k
Claim: In B_k, there are exactly (k choose d) nodes at depth d.
Proof idea: B_k consists of two copies of B_{k−1}, one rooted at depth 0 and one at depth 1, so the counts satisfy Pascal's rule.
Binomial Heap
• Keys are stored in nodes of binomial trees of different order
• For n nodes: there is a binomial tree of order i iff bit i of the base-2 representation of n is 1
• Min-Heap Property:
  key of node v ≤ keys of all nodes in the sub-tree rooted at v
Example
• 11 keys: 2, 5, 8, 9, 12, 14, 17, 18, 20, 22, 25
• Binary representation of n = 11: 1011
⇒ trees B_0, B_1, and B_3 are present

[Figure: a binomial heap with trees B_0, B_1, and B_3 containing the 11 keys]
Child‐Sibling Representation
Structure of a node:
• parent: pointer to the parent node
• key: the stored key
• degree: number of children
• child: pointer to the first child
• sibling: pointer to the next sibling
Link Operation
• Unite two binomial trees of the same order into one tree of the next order:
  B_{k−1} ⊕ B_{k−1} ⇒ B_k
• The root with the larger key becomes a child of the root with the smaller key
• Time: O(1)

[Figure: linking two binomial trees of order 2]
Merge Operation
Merging two binomial heaps:
• For i := 0, 1, …, log n:
  If there are 2 or 3 binomial trees B_i: apply the link operation to merge 2 of them into one binomial tree B_{i+1}
  (like binary addition with carries)
• Time: O(log n)
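A sketch of link and merge, representing each binomial tree as a (key, children) pair and a heap as a dict mapping tree order to tree (an illustrative encoding, not the lecture's pointer structure):

```python
def link(t1, t2):
    """Link two trees of the same order: the root with the larger key
    becomes a child of the root with the smaller key."""
    (k1, c1), (k2, c2) = t1, t2
    if k1 <= k2:
        return (k1, c1 + [t2])
    return (k2, c2 + [t1])

def merge(h1, h2):
    """Merge two binomial heaps given as {order: tree} dicts.
    Works like binary addition: at each order there may be a carry tree."""
    result, carry = {}, None
    top = max(list(h1) + list(h2) + [-1]) + 2
    for i in range(top):
        trees = [t for t in (h1.get(i), h2.get(i), carry) if t is not None]
        carry = None
        if len(trees) % 2 == 1:            # 1 or 3 trees: one B_i stays
            result[i] = trees.pop()
        if len(trees) == 2:                # 2 trees: link them into a B_{i+1}
            carry = link(trees[0], trees[1])
    return result

h1 = {0: (5, [])}
h2 = {0: (2, [])}
m = merge(h1, h2)                          # B_0 + B_0 -> one B_1 rooted at 2
```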
Example
[Figure: merging two example binomial heaps; trees of equal order are linked, with carries as in binary addition]
Operations
Initialize: create an empty list of trees — time O(1)
Get minimum of queue: time O(1) (if we maintain a pointer to the minimum root)
Decrease-Key at node v:
• Set the key of node v to the new key
• Swap v with its parent until the min-heap property is restored
• Time: O(log n)
Insert key x into queue Q:
1. Create a queue Q′ of size 1, containing only x
2. Merge Q and Q′
• Time for insert: O(log n)
Operations
Delete-Min Operation:
1. Find the tree T with minimum root
2. Remove T from the queue Q ⇒ queue Q′
3. The children of T's root form a new queue Q′′
4. Merge queues Q′ and Q′′
• Overall time: O(log n)
Delete‐Min Example
[Figure: delete-min on the example binomial heap; the minimum root is removed and its children are merged back into the queue]
Complexities Binomial Heap
• Initialize-Heap: O(1)
• Is-Empty: O(1)
• Insert: O(log n)
• Get-Min: O(1)
• Delete-Min: O(log n)
• Decrease-Key: O(log n)
• Merge (heaps of size m and n, m ≤ n): O(log n)
Can We Do Better?
• Binomial heap: insert, delete-min, and decrease-key cost O(log n)
• One of the operations insert or delete-min must cost Ω(log n):
  – Heap-Sort: insert n elements into the heap, then take out the minimum n times
  – (Comparison-based) sorting costs at least Ω(n log n)
• But maybe we can improve decrease-key and one of the other two operations?
• The structure of the binomial heap is not flexible:
  – Simplifies the analysis and allows strong worst-case bounds
  – But the operations almost inherently need at least logarithmic time
Fibonacci Heaps
Lazy-merge variant of binomial heaps:
• Do not merge trees as long as possible…
Structure:
A Fibonacci heap H consists of a collection of trees satisfying the min-heap property.
Variables:
• H.min: root of the tree containing the (a) minimum key
• H.rootlist: circular, doubly linked, unordered list containing the roots of all trees
• H.size: number of nodes currently in H
Trees in Fibonacci Heaps
Structure of a single node v:
• v.parent, v.key, v.degree
• v.child: points to a circular, doubly linked and unordered list of the children of v
• v.left, v.right: pointers to siblings (in the doubly linked list)
• v.mark: will be used later…

Advantages of circular, doubly linked lists:
• Deleting an element takes constant time
• Concatenating two lists takes constant time
Example
Figure: Cormen et al., Introduction to Algorithms
Simple (Lazy) Operations
Initialize-Heap:
• H.rootlist := ∅; H.min := null
Merge heaps H and H′:
• concatenate the root lists
• update H.min
Insert element e into H:
• create a new one-node tree containing e ⇒ heap H′
• merge heaps H and H′
Get minimum element of H:
• return H.min
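A toy sketch of these lazy operations; a plain Python list stands in for the circular doubly linked root list, so this is illustrative only (list concatenation here is not O(1)):

```python
class LazyHeap:
    """Sketch of the lazy Fibonacci-heap operations (illustrative names)."""

    def __init__(self):               # Initialize-Heap
        self.rootlist = []
        self.min = None

    def merge(self, other):           # concatenate root lists, update min
        self.rootlist += other.rootlist
        if other.min is not None and (self.min is None or other.min < self.min):
            self.min = other.min

    def insert(self, key):            # one-node tree, then merge
        h = LazyHeap()
        h.rootlist = [key]
        h.min = key
        self.merge(h)

    def get_min(self):
        return self.min

h = LazyHeap()
for k in (7, 3, 9):
    h.insert(k)
```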
Operation Delete‐Min
Delete the node with minimum key from H and return its element:
1. m := H.min;
2. if H.size > 0 then
3.   remove H.min from H.rootlist;
4.   add H.min.child (list) to H.rootlist;
5.   H.Consolidate();
     // Repeatedly merge nodes with equal degree in the root list
     // until the degrees of the nodes in the root list are distinct.
     // Determine the element with minimum key.
6. return m
Rank and Maximum Degree
Ranks of nodes, trees, heap:
Node v:
• rank(v): degree of v (number of children)
Tree T:
• rank(T): rank (degree) of the root node of T
Heap H:
• rank(H): maximum degree of any node in H
Assumption (n: number of nodes in H):
  rank(H) ≤ D(n) for a known function D(n)
Merging Two Trees
Given: heap-ordered trees T, T′ with rank(T) = rank(T′)
• Assume: min-key of T ≤ min-key of T′
Operation link(T, T′):
• Removes tree T′ from the root list and adds T′ to the child list of T
• rank(T) := rank(T) + 1
• (T′).mark := false

[Figure: after link(T, T′), the root of T′ is a child of the root of T]
Consolidation of Root List
Array A[0..D(n)] of pointers, to find roots with the same rank:

Consolidate:
1. for i := 0 to D(n) do A[i] := null;
2. while H.rootlist ≠ null do
3.   T := "delete and return first element of H.rootlist"
4.   while A[rank(T)] ≠ null do
5.     T′ := A[rank(T)];
6.     A[rank(T)] := null;
7.     T := link(T, T′)
8.   A[rank(T)] := T
9. Create new H.rootlist and H.min

Time: O(|H.rootlist| + D(n))
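The consolidate loop can be sketched as follows; trees are represented as (key, children) pairs, and the dictionary a plays the role of the rank-indexed array A (an illustrative encoding, not the lecture's exact data structure):

```python
def link(t1, t2):
    """Make the root with the larger key a child of the other root."""
    if t1[0] <= t2[0]:
        return (t1[0], t1[1] + [t2])
    return (t2[0], t2[1] + [t1])

def consolidate(rootlist):
    """Repeatedly link roots of equal rank until all ranks are distinct.
    Trees are (key, children) pairs; rank = number of children."""
    a = {}                               # a[r]: tree of rank r seen so far
    for t in rootlist:
        while len(t[1]) in a:            # another tree of the same rank exists
            t = link(t, a.pop(len(t[1])))
        a[len(t[1])] = t
    new_rootlist = list(a.values())
    new_min = min(new_rootlist, key=lambda t: t[0], default=None)
    return new_rootlist, new_min

roots = [(5, []), (3, []), (9, []), (1, [])]
new_roots, new_min = consolidate(roots)
```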
Consolidate Example

[Figure: six slides stepping through consolidation of an example root list; in each step, two trees of equal rank are linked, until all roots have distinct ranks]
Operation Decrease‐Key
Decrease-Key(v, x): (decrease the key of node v to the new value x)
1. if x ≥ v.key then return;
2. v.key := x; update H.min;
3. if v ∈ H.rootlist ∨ x ≥ v.parent.key then return;
4. repeat // cascading cuts
5.   p := v.parent;
6.   H.cut(v);
7.   v := p;
8. until v.mark = false ∨ v ∈ H.rootlist;
9. if v ∉ H.rootlist then v.mark := true;
Operation Cut(v)
Operation H.cut(v):
• Cuts v's sub-tree from its parent and adds v to the root list
1. if v ∉ H.rootlist then
2.   // cut the link between v and its parent
3.   rank(v.parent) := rank(v.parent) − 1;
4.   remove v from v.parent.child (list);
5.   v.parent := null;
6.   add v to H.rootlist

[Figure: the sub-tree rooted at v moves from its parent's child list to the root list]
Decrease‐Key Example
• Green nodes are marked

[Figure: Decrease-Key on the example heap; the decreased node is cut and moved to the root list, followed by cascading cuts of marked ancestors]
Fibonacci Heap Marks
History of a node v:
• v is being linked to a node ⇒ v.mark := false
• a child of v is cut ⇒ v.mark := true
• a second child of v is cut ⇒ H.cut(v)
• Hence, the boolean value v.mark indicates whether node v has lost a child since the last time v was made the child of another node.
Cost of Delete‐Min & Decrease‐Key
Delete-Min:
1. Delete the minimum root and add H.min.child to H.rootlist — time: O(1)
2. Consolidate H.rootlist — time: O(length of H.rootlist)
• Step 2 can potentially be linear in n (the size of H)
Decrease-Key (at node v):
1. If new key < parent key, cut the sub-tree of node v — time: O(1)
2. Cascading cuts up the tree, as long as nodes are marked — time: O(number of consecutive marked nodes)
• Step 2 can potentially be linear in n
Exercises: Both operations can take Θ(n) time in the worst case!
Cost of Delete‐Min & Decrease‐Key
• The cost of delete-min and decrease-key can be Θ(n)
  – Seems a large price to pay to get insert and merge in O(1) time…
• Maybe the operations are efficient most of the time?
  – It seems to require a lot of operations to get a long root list and thus an expensive consolidate operation
  – In each decrease-key operation, at most one node gets marked:
    we need a lot of decrease-key operations to get an expensive decrease-key operation
• Can we show that the average cost per operation is small?
• We can ⇒ requires amortized analysis
Amortization
• Consider a sequence o_1, o_2, …, o_n of operations
  (typically performed on some data structure D)
• t_i: execution time of operation o_i
• T := t_1 + t_2 + ⋯ + t_n: total execution time
• The execution time of a single operation might vary within a large range (e.g., t_i ∈ [1, O(n)])
• The worst-case overall execution time might still be small
⇒ the average execution time per operation might be small in the worst case, even if single operations can be expensive
Analysis of Algorithms
• Best case
• Worst case
• Average case
• Amortized worst case
What is the average cost of an operation
in a worst-case sequence of operations?
Example: Binary Counter
Incrementing a binary counter: determine the bit-flip cost:

Operation   Counter Value   Cost
 (start)       00000
    1          00001          1
    2          00010          2
    3          00011          1
    4          00100          3
    5          00101          1
    6          00110          2
    7          00111          1
    8          01000          4
    9          01001          1
   10          01010          2
   11          01011          1
   12          01100          3
   13          01101          1
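The cost column can be reproduced by simulating the counter; the sketch below stores the counter as a bit list with index 0 as the most significant bit:

```python
def increment_cost(counter_bits):
    """Flip cost of one increment: trailing ones flip 1->0, then one 0->1."""
    cost = 0
    i = len(counter_bits) - 1
    while i >= 0 and counter_bits[i] == 1:
        counter_bits[i] = 0              # flip 1 -> 0
        cost += 1
        i -= 1
    if i >= 0:
        counter_bits[i] = 1              # flip 0 -> 1
        cost += 1
    return cost

bits = [0] * 5                           # counter 00000
costs = [increment_cost(bits) for _ in range(13)]
total = sum(costs)                       # total cost is at most 2 * 13
```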
Accounting Method
Observation:
• Each increment flips exactly one 0 into a 1:
  001001111 ⟹ 001010000
Idea:
• Have a bank account (with initial amount 0)
• Paying x to the bank account costs x
• Take "money" from the account to pay for expensive operations
Applied to the binary counter:
• Flip from 0 to 1: pay 1 to the bank account (amortized cost: 2)
• Flip from 1 to 0: take 1 from the bank account (amortized cost: 0)
• Amount on the bank account = number of ones
⇒ We always have enough "money" to pay!
Accounting Method
Op.   Counter   Cost   To Bank   From Bank   Net Cost   Credit
      00000                                               0
 1    00001      1       1          0           2         1
 2    00010      2       1          1           2         1
 3    00011      1       1          0           2         2
 4    00100      3       1          2           2         1
 5    00101      1       1          0           2         2
 6    00110      2       1          1           2         2
 7    00111      1       1          0           2         3
 8    01000      4       1          3           2         1
 9    01001      1       1          0           2         2
10    01010      2       1          1           2         2
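The accounting scheme can be checked by simulation; this sketch (function name is illustrative) tracks the bank account across increments and confirms the credit always equals the number of ones:

```python
def increment_with_account(x, credit):
    """One increment with the bank account: deposit 1 for the single
    0->1 flip, withdraw 1 for every 1->0 flip; net cost is always 2."""
    flips_1_to_0 = bin(x ^ (x + 1)).count("1") - 1   # number of trailing ones
    credit -= flips_1_to_0        # withdraw for each 1 -> 0 flip
    credit += 1                   # deposit for the single 0 -> 1 flip
    return x + 1, credit

x, credit, history = 0, 0, []
for _ in range(50):
    x, credit = increment_with_account(x, credit)
    history.append(credit)
```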
Potential Function Method
• The most generic and elegant way to do amortized analysis!
  – But also more abstract than the others…
• State of the data structure / system: s ∈ S (state space)
Potential function Φ: S → ℝ≥0
• Operation i:
  – t_i: actual cost of operation i
  – s_i: state after execution of operation i (s_0: initial state)
  – Φ_i := Φ(s_i): potential after execution of operation i
  – a_i: amortized cost of operation i:
    a_i := t_i + Φ_i − Φ_{i−1}

Operation i:
  actual cost: t_i = a_i − Φ_i + Φ_{i−1}
Overall cost:
  T := Σ_{i=1}^{n} t_i = Σ_{i=1}^{n} a_i + Φ_0 − Φ_n ≤ Σ_{i=1}^{n} a_i   (since Φ_n ≥ Φ_0 = 0)
Binary Counter: Potential Method
• Potential function:
  Φ(counter) := number of ones in the counter
• Clearly, Φ_0 = 0 and Φ_i ≥ 0 for all i ≥ 0
• Actual cost t_i:
  – 1 flip from 0 to 1
  – t_i − 1 flips from 1 to 0
• Potential difference: Φ_i − Φ_{i−1} = 1 − (t_i − 1) = 2 − t_i
• Amortized cost: a_i = t_i + Φ_i − Φ_{i−1} = 2
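The calculation above can be checked directly; this sketch computes t_i + Φ_i − Φ_{i−1} for many increments, with Φ = number of ones in the counter:

```python
def amortized_cost(x):
    """Amortized cost t_i + Phi_i - Phi_{i-1} of incrementing counter x,
    with potential Phi = number of ones in the counter."""
    t = bin(x ^ (x + 1)).count("1")          # actual number of bit flips
    phi_before = bin(x).count("1")
    phi_after = bin(x + 1).count("1")
    return t + phi_after - phi_before

# the amortized cost is exactly 2 for every increment
costs = [amortized_cost(x) for x in range(200)]
```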
Back to Fibonacci Heaps
• The worst-case cost of a single delete-min or decrease-key operation is Ω(n)
• Can we prove a small worst-case amortized cost for delete-min and decrease-key operations?
Remark:
• Consider a data structure that allows operations O_1, …, O_k
• We say that operation O_p has amortized cost a_p if for every execution the total time is
  T ≤ Σ_{p=1}^{k} n_p · a_p,
  where n_p is the number of operations of type O_p
Amortized Cost of Fibonacci Heaps
• Initialize-heap, is-empty, get-min, insert, and merge have worst-case cost O(1)
• Delete-min has amortized cost O(log n)
• Decrease-key has amortized cost O(1)
• Starting with an empty heap, any sequence of n operations with at most n_d delete-min operations has total cost (time) O(n + n_d · log n)
• Cost for Dijkstra: O(|E| + |V| · log |V|)