Systematic Search Guided by Local
Search with Conflict-based Heuristic in
N-queen problem
Florida Institute of Technology
Department of Computer Science
Hyoung rae Kim
Debasis Mitra, Ph.D.
Contents
1. Introduction
2. Proposed method
3. Implementation design
4. Experiments and analysis
5. Related work
6. Conclusion
7. Future works
8. References
1. Introduction
• The Constraint Satisfaction Problem (CSP) plays a very important role in Artificial Intelligence (AI). CSPs appear in many areas, for instance vision, resource allocation in scheduling, and temporal reasoning [2].
• What is a constraint satisfaction problem?
– A CSP is a problem composed of a finite set of variables, each of which is associated with a finite domain, and a set of constraints.
– The task is to assign a value to each variable satisfying all the constraints.
Resource allocation in scheduling
[Figure: an example of resource allocation in scheduling, from [2]]
N-queens problem
• Place eight queens on an 8 × 8 chessboard satisfying the constraint that no two queens are on the same row, column, or diagonal.
[Figure: the 4 × 4 queens problem]
N-queens problem
• Problem formalization
– The set of variables: Z = {Q1, Q2, …, Q8}
– Domains: DQ1 = DQ2 = … = DQ8 = {1, 2, 3, 4, 5, 6, 7, 8}
– Constraint (1): ∀ i ≠ j: Qi ≠ Qj (no two queens in the same column)
– Constraint (2): ∀ i ≠ j: if Qi = a and Qj = b, then i - j ≠ a - b and i - j ≠ b - a (no two queens on the same diagonal)
• Each variable is interpreted as a row number.
• The domain of each variable is the set of column numbers.
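As an illustration (not part of the original slides), a minimal Python sketch of this formalization is given below; the helper names consistent and is_solution are our own, and queens[i] holds the column assigned to the row variable Qi+1.

# Hypothetical sketch of the slide's formalization (not the authors' code).
# Variables are rows 0..n-1; queens[i] is the column of the queen in row i.

def consistent(queens, i, j):
    """Check constraints (1) and (2) for the pair of rows i and j."""
    a, b = queens[i], queens[j]
    if a == b:                       # constraint (1): same column
        return False
    if abs(i - j) == abs(a - b):     # constraint (2): same diagonal
        return False
    return True

def is_solution(queens):
    """A complete assignment is a solution if every pair of rows is consistent."""
    n = len(queens)
    return all(consistent(queens, i, j) for i in range(n) for j in range(i + 1, n))

# Example: a known solution of the 4-queens problem (columns are 1-based as on the slide).
print(is_solution([2, 4, 1, 3]))    # True
print(is_solution([1, 2, 3, 4]))    # False (diagonal conflicts)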
Problem reduction and search
• There are two approaches to solving a CSP: problem reduction and search.
• Problem reduction
– Pruning off parts of the search space that contain no solution
– Reducing the sizes of the domains of the variables
* Tightening constraints potentially reduces the search space at a later stage of the search
– Pruning off branches in the search space
– It can be performed at any stage of the search.
• Search
– Finding solutions in the search space: all of them, or just one.
• One often has to find a balance between the effort spent on problem reduction and the potential gains.
An example of a search space
Search strategies
• Systematic algorithms
– Start from an empty variable assignment that is extended until a complete assignment satisfying all the constraints in the problem is obtained.
– Look-back enhancements (backward checking, backjumping, etc.)
– Look-ahead enhancements (forward checking, etc.)
• Local search algorithms
– Perform an incomplete exploration of the search space by repairing infeasible complete assignments (min-conflict, GSAT, tabu search).
• Hybrid approaches
– Performing a local search before or after a systematic search.
– Performing a systematic search improved with a local search at some point of the search.
– Performing an overall local search, and using systematic search either to select a candidate neighbor or to prune the search space [1].
The contributions of this work
• Explain the relationship between a local search algorithm (Min-Conflict, MC) and a systematic algorithm (Forward Checking, FC).
• Try to find a faster search algorithm by combining them.
2. Proposed method
• We improve the speed with a hybrid of Forward Checking and Min-Conflict: Forward Checking is run after Min-Conflict.
• We examine the complexity and accuracy while gradually varying the coverage of Min-Conflict.
Forward checking algorithm
[Worked example: forward checking on the 4-queens board; comparisons per step: 12, 12, 4, 4, 5, 1, 2, 0; total comparisons: 40]
Forward checking algorithm
FC1(UNLABELLED, COMPOUND_LABEL, D, C) {
    /* all variables labelled: the compound label is a solution */
    if (UNLABELLED = {}) { return COMPOUND_LABEL; }
    pick one variable x from UNLABELLED;
    repeat {
        pick one value v from Dx; delete v from Dx;
        /* prune the domains of the remaining variables against <x,v> */
        D' = Update1(UNLABELLED - {x}, D, C, <x,v>);
        Result = FC1(UNLABELLED - {x}, COMPOUND_LABEL + {<x,v>}, D', C);
        if (Result != NIL) { return Result; }
    } until (Dx = {});
    return NIL;
}
Update1(W, D, C, Label) {
    D' = D;
    for each variable y in W {
        for each value v in D'y {
            /* remove values that conflict with the newly committed label */
            if (<y,v> is incompatible with Label with respect to the constraints in C)
                D'y = D'y - {v};
        }
    }
    return D';
}
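For concreteness, here is a minimal Python sketch of the same forward-checking scheme specialised to N-queens; the names update and forward_checking are our own simplifications of Update1 and FC1 above, not the authors' code.

def update(domains, row, col):
    """Prune values incompatible with a queen at (row, col); cf. Update1."""
    pruned = {}
    for r, dom in domains.items():
        new_dom = {c for c in dom if c != col and abs(r - row) != abs(c - col)}
        if not new_dom:
            return None               # some future row has no value left: dead end
        pruned[r] = new_dom
    return pruned

def forward_checking(domains, assignment):
    """Return a complete assignment {row: col} or None; cf. FC1."""
    if not domains:
        return assignment             # every row is labelled
    row = min(domains)                # pick the next unlabelled row
    rest = {r: d for r, d in domains.items() if r != row}
    for col in sorted(domains[row]):
        pruned = update(rest, row, col)
        if pruned is None:
            continue                  # forward check failed; try the next value
        result = forward_checking(pruned, {**assignment, row: col})
        if result is not None:
            return result
    return None

print(forward_checking({r: set(range(8)) for r in range(8)}, {}))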
Min-conflict algorithm
[Worked example: the min-conflict algorithm on the 4-queens board, alternating checking and value-ordering steps from the initial status; total comparisons: 71]
Min-conflict algorithm
Informed_Backtrack(Z, D, C) {
    LEFT = {};
    /* start from a random complete assignment */
    for each variable x in Z {
        pick a random value v from Dx;
        add <x,v> to LEFT;
    }
    InfBack(LEFT, {}, D, C);
}
InfBack(LEFT, DONE, D, C) {
    if (LEFT + DONE is compatible with the constraints)
        { return LEFT + DONE; }
    x = any variable such that label <x,v> is in LEFT;
    /* order x's values by the number of conflicts with the labels left to repair */
    Queue = Order_values(x, Dx, LEFT, DONE, C);
    while (Queue != {}) {
        w = first element in Queue; delete w from Queue;
        Result = InfBack(LEFT - {<x,v>}, DONE + {<x,w>}, D, C);
        if (Result != NIL) { return Result; }
    }
    return NIL;
}
Order_values(x, Dx, LEFT, DONE, C) {
    List = {};
    for each v in Dx {
        if (<x,v> is compatible with all the labels in DONE) {
            Count[v] = 0;
            /* count how many labels still to be repaired conflict with <x,v> */
            for each <y,w> in LEFT {
                if NOT satisfies((<x,v>, <y,w>), Cxy)
                    Count[v] = Count[v] + 1;
            }
            List = List + {v};
        }
    }
    Queue = the values in List, ordered in ascending order of Count[v];
    return Queue;
}
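As a rough Python illustration (our own simplification rather than the deck's exact informed-backtrack pseudocode), the sketch below applies the same conflict-counting idea in repair style: it starts from a random complete assignment and keeps moving a conflicting queen to its least-conflicting column.

import random

def conflicts(col, row, assignment):
    """Number of other rows whose queen attacks square (row, col)."""
    return sum(1 for r, c in assignment.items()
               if r != row and (c == col or abs(r - row) == abs(c - col)))

def min_conflicts(n, max_steps=10_000):
    """Repair-style min-conflict search for N-queens."""
    assignment = {r: random.randrange(n) for r in range(n)}
    for _ in range(max_steps):
        conflicted = [r for r in range(n) if conflicts(assignment[r], r, assignment) > 0]
        if not conflicted:
            return assignment          # all constraints satisfied
        row = random.choice(conflicted)
        # value ordering as in Order_values: prefer the least-conflicting column
        assignment[row] = min(range(n), key=lambda c: conflicts(c, row, assignment))
    return None                        # give up after max_steps (MC is not complete)

print(min_conflicts(8))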
Comparison between MC and FC
• Forward Checking (FC)
– Advantage: completeness; it always finds a solution if one exists. It is one of the best systematic algorithms.
– Disadvantage: FC is typically cursed with early mistakes in the search; a wrong variable value can cause a whole sub-tree to be explored with no success.
• Min-Conflict (MC)
– Advantage: it does not suffer from the early-mistake problem. It may be far more efficient than systematic algorithms at finding a first solution.
– Disadvantage: it is not complete. It may fail to find a solution even when one exists, and it cannot prove that no solution exists.
Explanation of hybrid method
• Forward checking after min-conflict.
• K = 0 means pure FC; K = n means pure MC.
[Figure: an 8-queens board split at row K, with the upper portion solved by MC and the lower portion by FC; the value of K is varied]
3. Implementation design
• Input variable: the N-queens problem instance.
• Output variables:
– The counted number of visited labels.
– The counted number of executed constraint checks.
• The MC-FC algorithm runs MC and then runs FC on the results from MC.
• We use the standard MC algorithm [2].
• We use the standard FC algorithm [2].
Hybrid algorithm
[Figure: an example with K = 2, where the first two rows are labelled by MC and the remaining rows by FC]
Hybrid algorithm
SEARCH(n) {
    for k = 0 to n {
        1. MC_FC(k, Success, Count_Label, Count_Constraint);
        2. print(k, Success, Count_Label, Count_Constraint);
    }
}

MC_FC(k, Success, Count_Label, Count_Constraint) {
    repeat until a result is obtained or the maximum number of iterations is reached {
        1. Initialize cZ, cD, cC;
        2. COMPOUND_LABEL = MC(k, cZ, cD, cC, Count_Label, Count_Constraint);
        3. If COMPOUND_LABEL is valid
               Result = FC(k, COMPOUND_LABEL, cZ, cD, cC, Count_Label, Count_Constraint);
        4. If Result is valid
               Success = True;
               return;
    }
}
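A rough Python sketch of this driver follows; it is our own simplification (reusing the conflicts, update, and forward_checking helpers from the earlier sketches), not the authors' implementation: a min-conflict style pass labels the first k rows, and forward checking then tries to complete the remaining rows, re-executing the MC part on failure.

import random

def mc_fc(n, k, max_iterations=1000):
    """Hybrid sketch: MC labels rows 0..k-1, FC completes rows k..n-1."""
    for _ in range(max_iterations):
        # MC part: greedily pick a low-conflict column for each of the first k rows.
        partial = {}
        for row in range(k):
            cols = random.sample(range(n), n)                   # random tie-breaking
            partial[row] = min(cols, key=lambda c: conflicts(c, row, partial))
        # "If COMPOUND_LABEL is valid": the MC labels must be mutually consistent.
        if any(conflicts(c, r, partial) > 0 for r, c in partial.items()):
            continue                                            # re-execute the MC part
        # FC part: prune the remaining rows' domains against every MC label.
        domains = {r: set(range(n)) for r in range(k, n)}
        for row, col in partial.items():
            domains = update(domains, row, col)
            if domains is None:
                break
        if domains is None:
            continue
        result = forward_checking(domains, partial)
        if result is not None:
            return result                                       # complete, consistent assignment
    return None

# k = 0 is pure forward checking; k = n behaves like pure min-conflict with restarts.
print(mc_fc(24, 12))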
4. Experiments and analysis
• We use the 24-queens problem.
• We ran the algorithm 300 times on a Sun Ultra 60.
• The maximum iteration number was 1000 (if the FC part does not find a solution, the MC part is randomly re-executed).
• We recorded every k value from 0 through n with an interval of 2.
• The output parameter 'Label count' is the number of labels that the algorithm visited.
• The other parameter 'Total count' is the number of times a constraint is checked. 'Total count' subsumes the 'Label count'.
• We analyze the 'Label count' and the 'Total count'.
• To compare the quality of the data points we use the standard error of the mean: S.D. of Total count / sqrt(n) [3].
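For instance, the standard error of a data point can be reproduced as follows (a trivial Python sketch; the figure of 300 runs per k value comes from the setup above).

import statistics

def standard_error(samples):
    """Standard error of the mean: S.D. of the samples divided by sqrt(n)."""
    return statistics.stdev(samples) / (len(samples) ** 0.5)

# With 300 runs, a standard deviation of 1,020,620 (the k = 0 row of the label-count
# table below) gives a standard error of about 1,020,620 / sqrt(300) ≈ 58,926.
print(1_020_620 / 300 ** 0.5)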
Compare the label count
k      Average Label    Std. of Label    Std. of Label / Sqrt(n)
0      411,608          1,020,620        58,926
2      436,652          738,118          42,615
4      280,998          336,408          19,423
6      173,603          150,904          8,712
8      104,788          n/a              n/a
10     21,581           20,679           1,194
12     4,753            4,394            254
14     1,300            1,109            64
16     839              836              48
18     1,025            1,025            59
20     1,844            1,800            104
22     2,357            2,749            159
24     2,936            9,096            525
[Plot: label count ('Complexity') versus k, from pure Forward Checking (k = 0) to pure Min-Conflict (k = 24)]
Compare the total count
k      Average Constraint    Std. of Constraints    Std. of Constraints / Sqrt(n)
0      9,032,529             n/a                    n/a
2      8,912,273             21,793,943             1,258,274
4      5,810,238             15,920,543             919,173
6      3,608,147             7,195,248              415,418
8      2,124,796             3,110,599              179,590
10     413,155               404,830                23,373
12     154,604               144,286                8,330
14     155,602               140,061                8,086
16     230,620               230,762                13,323
18     379,014               379,427                21,906
20     619,071               600,727                34,683
22     581,590               649,051                37,473
24     643,543               1,916,064              110,624
[Plot: total constraint-check count ('Complexity') versus k, from pure Forward Checking (k = 0) to pure Min-Conflict (k = 24)]
Explanation of the results
• The reason for the gradual shrinking of the spread (standard deviation):
– The 4-queens problem has two solutions; consider the following cases.
– When K = 1:
• There are two solution marks (called A); it takes 4 steps to know the result.
• There are two non-solution marks (called B); it takes 6 steps to know the result.
• Starting with a solution mark A (50%): 4
• Starting with a non-solution mark B (50%): 10
– When K = 2:
• There are two solution marks (called A); it takes 2 steps to know the result.
• There are four non-solution marks: 2 take 1 step (called B), 2 take 2 steps (called C).
• Starting with a solution mark A (34%): 2
• Starting with a non-solution mark, B -> A (16.5%): 3
• Starting with a non-solution mark, B -> C -> A (16.5%): 5
• Starting with a non-solution mark, C -> A (16.5%): 4
• Starting with a non-solution mark, C -> B -> A (16.5%): 5
– This shows that when K = 2 the S.D. is much smaller.
• Case K = 1, 12 samples = {4, 10, 4, 10, …}: S.D. = 3.1; Case K = 2, 12 samples = {2, 2, 3, 5, 4, 5, …}: S.D. = 1.3
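The two standard deviations quoted above can be checked directly (a small Python sketch; the 12-element sequences follow the step counts and percentages listed on the slide).

import statistics

case_k1 = [4, 10] * 6                           # 50% / 50% over 12 samples
case_k2 = [2] * 4 + [3, 3, 4, 4, 5, 5, 5, 5]    # ~34% and 16.5% each over 12 samples

print(round(statistics.stdev(case_k1), 1))      # 3.1
print(round(statistics.stdev(case_k2), 1))      # 1.3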
• As for the reduction in complexity, we are still trying to find an explanation.
5. Related work
• One study showed that the look-back and look-ahead enhancements of backtracking-based algorithms can also be exploited by local search algorithms and can greatly improve their behavior. The authors propose a generic search technique over CSPs, called decision-repair, which shows great performance [1].
6. Conclusion
• We performed a hybrid search: a local search (MC) is performed before a systematic search (FC).
• The purpose of our research is to understand the relationship between MC and FC and to improve the speed of the search algorithm.
• The algorithm shows the best performance when the K value is in the middle.
• We need a theoretical explanation for these results.
• Even without a theoretical explanation, the hybrid algorithm is better than both pure MC and pure FC.
7. Future works
• Vary N to larger values.
• Apply the method to problems other than N-queens.
• Conduct theoretical studies of the results.
8. References
• [1] N. Jussien, O. Lhomme, Local Search with Constraint Propagation and Conflict-based Heuristics, Artificial Intelligence 139 (2002) 21-45.
• [2] E. Tsang, Foundations of Constraint Satisfaction, University of Essex, Colchester, Essex, UK (1995).
• [3] J. Mandel, The Statistical Analysis of Experimental Data, Dover (1964), p. 63.