Paper Review for ENGG6140
Memetic Algorithms
By: Jin Zeng
Shaun Wang
School of Engineering
University of Guelph
Mar. 18, 2002
Contents
• Introduction
• MA and GA
• Basic MA
• Examples
• Conclusions
Introduction
History of MA
• 'Meme': a word introduced by Richard Dawkins when he described cultural evolution in his bestseller "The Selfish Gene" ('76).
• A meme plays a role in cultural evolution analogous to that of the gene in biological evolution. 'Memetic Algorithms' were first proposed by P. Moscato ('89).
• MAs have been applied widely and successfully in optimization, solving many NP-hard problems.
Introduction
What is a 'Meme'?
• A meme is the basic unit of cultural transmission, in analogy to the gene in genetic transmission.
• A meme is replicated by imitation.
• It can be changed by its owner for adaptation.
• Examples: ideas, clothing fashions, and the NBA.
• A high degree of variation occurs in cultural transmission.
Introduction
Cultural Evolution
• When a meme is passed between individuals, each individual adapts the meme as it sees best.
• Shared characteristics are therefore not inherited through simple recombination of previous solutions.
• Historical information and an external logic are used to speed up the process.
Introduction
What is MA?
• MA mimics the process of cultural evolution.
• It characterizes evolutionary algorithms that can hardly fit the GA metaphor, having little or no relation to biology.
• 'Hybrid GAs' → MAs
• 'Scatter Search' (Glover, '77) → MAs
Introduction
Why MA?
• In general, there are two ways of searching the solution space:
  • Exploration: investigate new and unknown areas of the search space;
  • Exploitation: make use of knowledge found so far to help find better solutions.
• Both are necessary, yet contradictory, in solving an optimization problem.
Introduction
Why MA? (cont.)
• Limitations of earlier algorithms:
  • GA: a parallel search technique.
    • Good at avoiding local optima.
    • Not well suited for finely tuned search.
  • LS: improvement heuristics (a minimal hill-climbing sketch follows this list).
    • Finds local optima quickly.
    • Highly dependent on the starting point.
    • Hard to find a global optimum.
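To make the LS item above concrete, here is a minimal hill-climbing local search sketch in Python; the step size, bounds, and objective f are illustrative assumptions, not part of the original presentation.

def local_search(x, f, step=0.01, lower=0.0, upper=1.0):
    # Simple hill climber: move x uphill on f until no neighbour improves.
    best, best_val = x, f(x)
    improved = True
    while improved:
        improved = False
        for candidate in (best - step, best + step):
            if lower <= candidate <= upper and f(candidate) > best_val:
                best, best_val = candidate, f(candidate)
                improved = True
    return best

Started from different points, this routine can return different local optima, which is exactly the starting-point dependence noted above.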
Introduction
Why MA? (cont.)
• Combination of GA + Local Search → MA
  • GA: for exploration;
  • LS: for exploitation;
  • Result: higher efficiency and better solutions.
Introduction
Combination Methods
• Two kinds of combination (a short code sketch of the two policies follows this list):
  • Baldwin effect based: LS is used to modify the fitness landscape of the problem; the improvement is not inherited by the children.
  • Lamarckian evolution based: the improvement made by LS is inherited by the children. Wrong in biological evolution, but effective in optimization.
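A minimal sketch of the two write-back policies, assuming a maximization objective f and a local_search routine like the one sketched earlier (both names are assumptions):

def evaluate_lamarckian(individual, f, local_search):
    # Lamarckian: the locally improved solution replaces the individual,
    # so the improvement is inherited by its children.
    improved = local_search(individual, f)
    return improved, f(improved)

def evaluate_baldwinian(individual, f, local_search):
    # Baldwinian: the fitness reflects the improvement found by local search,
    # but the individual (genotype) itself is left unchanged.
    improved = local_search(individual, f)
    return individual, f(improved)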
MA and GA
Similarities
• Both MA and GA model an evolutionary process.
• Both MA and GA involve population generation (initialization), recombination (crossover) and mutation, and changes occur in these steps.
• Both MA and GA use a fitness function to evaluate the changes, and both have therefore been applied successfully in optimization.
MA and GA
Difference
                   MA                   GA
Models             Cultural evolution   Biological evolution
Basic unit         Meme                 Gene
Flow process       Information          Biological characteristics
Evolution speed    Fast                 Slow
Copying fidelity   Low                  High
Mutation rate      High                 Low
Basic MA
Flow Chart of the Process
(Figure: flow chart of the basic MA process; see the pseudocode on the next slide.)
Basic MA
Pseudo Code of MA
procedure MemeticAlgorithm();
begin
  Generalization();
  repeat
    Crossover();
    Mutation();
    P := Select(P);
    if P converged then P := MutationAndLS(P);
  until terminate = true;
end;
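The following Python sketch mirrors the pseudocode above for a small one-dimensional maximization problem y = f(x) (the same setting as the demonstration later in the slides). The objective, parameter values and operator details are illustrative assumptions; the procedures Generalization, Crossover and Mutation are spelled out on the next slides and are folded into one listing here.

import random
from math import sin

def f(x):
    # Toy multimodal objective to maximize (an assumption for illustration).
    return sin(5 * x) + 0.5 * x

LOWER, UPPER = 0.0, 3.0

def local_search(x, step=0.01):
    # Full local search: hill-climb until no neighbouring point improves.
    best, best_val = x, f(x)
    improved = True
    while improved:
        improved = False
        for cand in (best - step, best + step):
            if LOWER <= cand <= UPPER and f(cand) > best_val:
                best, best_val, improved = cand, f(cand), True
    return best

def generalization(popsize):
    # Generate random solutions and improve each one by local search.
    return [local_search(random.uniform(LOWER, UPPER)) for _ in range(popsize)]

def crossover(P, n_crossover):
    # Blend two random parents, then apply local search to the child.
    for _ in range(n_crossover):
        a, b = random.sample(P, 2)
        w = random.random()
        P.append(local_search(w * a + (1 - w) * b))

def mutation(P, n_mutations):
    # Jump from a randomly chosen individual to a nearby point, then local search.
    for _ in range(n_mutations):
        x = random.choice(P) + random.gauss(0.0, 0.5)
        P.append(local_search(min(UPPER, max(LOWER, x))))

def select(P, popsize):
    # Truncation selection: keep the fittest popsize individuals.
    return sorted(P, key=f, reverse=True)[:popsize]

def memetic_algorithm(popsize=5, xover_rate=0.4, mut_rate=0.4, generations=50):
    P = generalization(popsize)
    for _ in range(generations):
        crossover(P, int(popsize * xover_rate))
        mutation(P, int(popsize * mut_rate))
        P = select(P, popsize)
        if max(P) - min(P) < 1e-6:
            # P has converged: diversify with mutation followed by local search.
            mutation(P, popsize)
            P = select(P, popsize)
    return max(P, key=f)

if __name__ == "__main__":
    print(memetic_algorithm())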
Basic MA
Generalization
Generalization()
begin
  for j := 1 to popsize do
    i := GenerateSolution();
    i := LocalSearch(i);
    Add individual i to P;
  endfor
end
Basic MA
Crossover
Crossover()
begin
  for i := 1 to #crossover do
    Select two parents ia, ib ∈ P randomly;
    ic := Crossover(ia, ib);
    ic := LocalSearch(ic);
    Add individual ic to P;
  endfor
end
Basic MA
Mutation
Mutation()
begin
  for i := 1 to #mutations do
    Select an individual i ∈ P randomly;
    im := Mutation(i);
    im := LocalSearch(im);
    Add individual im to P;
  endfor
end
Basic MA
Local Search
• Full Local Search and Partial Local Search
• Demo of FLS
(Figure: plot of y against x showing the original solution, the solution after recombination or mutation, and the solution after local search.)
Basic MA
Demonstration of MA
• Example problem: y = f(x);
• Parameters of the MA (see the example call after this list):
  • Population: 5;
  • Xover rate: 0.4 (# of Xover: 5 x 0.4 = 2);
  • Mutation rate: 0.4 (# of Mutation: 5 x 0.4 = 2);
  • Local search: full.
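Using the sketch given after the pseudocode slide, the settings above correspond roughly to the call below (the function and parameter names come from that sketch, not from the original presentation):

# Population 5; crossover rate 0.4 (5 x 0.4 = 2 crossovers per generation);
# mutation rate 0.4 (5 x 0.4 = 2 mutations); full local search after every step.
best_x = memetic_algorithm(popsize=5, xover_rate=0.4, mut_rate=0.4)
print(best_x, f(best_x))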
Basic MA
Demonstration of MA (Continued)
(Figure A. Generalization and LS: randomly generated solutions and the solutions after local search, plotted on y = f(x).)
(Figure B. Crossover and Mutation: solutions after local search, after crossover, and after mutation.)
Basic MA
Demonstration of MA (Continued)
(Figure C. Local search after crossover and mutation: solutions after crossover, after mutation, and after the subsequent local search.)
(Figure D. Population selection.)
Basic MA
Effect of Crossover and Mutation
• Both can be used to explore the search space by "jumping" to new regions from which new local searches start;
• Crossover: searches the region between two or more specified points;
• Mutation: searches new regions randomly, without direction.
Basic MA
Advantages of MA
• Combines the advantages of GA and LS while avoiding the disadvantages of both;
• GA ensures wide exploration of the solution space;
• Through local search, the space of possible solutions is reduced to the subspace of local optima;
• As the scale of the problem increases, the advantages become remarkable.
Basic MA
Disadvantages of MA
• The proportion of computation spent on exploration versus exploitation depends on the particular optimization problem.
• It is hard to determine the best depth of the local search.
MA Examples
Some Implementation Examples of MA
• Quadratic Assignment Problem (QAP)
• Traveling Salesman Problem (TSP)
• Vehicle Routing
• Graph Partitioning
• Scheduling
• The Knapsack Problem
MA Examples
Applying Local Search to MA in QAP
• For any permutation solution being explored, the local search procedure may be executed only once or a few times: partial local search (PLS).
• Alternatively, the local search procedure may be repeated until no further improvement is possible: full local search (FLS). (A sketch of both variants follows this list.)
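A sketch of the two variants, assuming a pairwise-exchange neighbourhood on permutation solutions and some QAP cost function passed in as cost (both assumptions for illustration): PLS stops after a fixed number of improvement passes, while FLS repeats until no exchange improves the solution.

import itertools

def exchange_local_search(perm, cost, max_passes=None):
    # Pairwise-exchange local search on a permutation solution.
    # max_passes=None -> full local search (FLS): repeat until no exchange improves.
    # max_passes=k    -> partial local search (PLS): at most k improvement passes.
    perm = list(perm)
    best_cost = cost(perm)
    passes = 0
    improved = True
    while improved and (max_passes is None or passes < max_passes):
        improved = False
        passes += 1
        for i, j in itertools.combinations(range(len(perm)), 2):
            perm[i], perm[j] = perm[j], perm[i]       # try the exchange
            new_cost = cost(perm)
            if new_cost < best_cost:
                best_cost, improved = new_cost, True  # keep the improving exchange
            else:
                perm[i], perm[j] = perm[j], perm[i]   # undo it
    return perm

For example, exchange_local_search(p, qap_cost) would perform an FLS, while exchange_local_search(p, qap_cost, max_passes=1) would perform a single-pass PLS (qap_cost being whatever QAP cost function is in use).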
MA Examples
Two Derived MAs for QAP
• PGA: starts with an initial population of randomly generated individuals; for each individual, a PLS is performed after crossover and mutation.
• FGA: relies on FLS; full local search is carried out on all individuals at the beginning and at the end of an SGA run.
MA Examples
Steps Involved in the PGA
• The steps of the PGA are the same as those of the basic MA.
• The local search procedure is executed only once or a few times after each crossover and mutation.
MA Examples
Steps Involved in the FGA
• 1. Randomly generate an initial population and perform FLS on each individual.
• 2. While the terminating criterion is not reached, continue with the procedures of the SGA.
• 3. Perform FLS on the best solution and output the final solution.
(A rough Python outline of these steps follows this list.)
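A rough outline of these three steps in Python, reusing exchange_local_search from the sketch above; the one-generation SGA step here is a deliberately minimal stand-in (binary tournament selection plus a swap mutation), and all names are assumptions rather than the procedure used in the reviewed paper.

import random

def sga_generation(population, cost):
    # Minimal stand-in for one SGA generation: binary tournament selection
    # followed by a swap mutation of the selected parent.
    next_pop = []
    for _ in range(len(population)):
        a, b = random.sample(population, 2)
        child = list(min(a, b, key=cost))           # keep the fitter parent
        i, j = random.sample(range(len(child)), 2)
        child[i], child[j] = child[j], child[i]     # swap mutation
        next_pop.append(child)
    return next_pop

def fga(initial_population, cost, generations=100):
    # Step 1: FLS on every individual of the random initial population.
    population = [exchange_local_search(p, cost) for p in initial_population]
    # Step 2: run the plain SGA until the terminating criterion is reached.
    for _ in range(generations):
        population = sga_generation(population, cost)
    # Step 3: FLS on the best solution found, then output it.
    best = min(population, key=cost)
    return exchange_local_search(best, cost)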
MA Examples
Comparison of FGA and PGA
• The effectiveness of FLS depends on the starting solution and the exchange routine.
• PLS can be carried out more frequently; the algorithm is therefore able to spread out the search by exploring many small localized regions, reducing the likelihood of being trapped in a local optimum.
MA Examples
Comparison of FGA and PGA (cont.)
• As the size of the problem scales up, it becomes difficult to carry out FLS freely because of its great computational cost.
• Since PLS is carried out for almost all individuals in addition to the SGA's evolutionary mechanisms, the capability of the SGA to evolve towards fitter individuals is greatly enhanced.
MA Examples
Comparison of FGA and PGA (cont.)
• Because FLS limits the exploratory capability of the SGA, it reduces the chance of the FGA reaching the global optimum.
• The PGA therefore has a greater chance of obtaining the global optimum than the FGA.
MA Examples
Comparison of a Typical Run on Problem Els19 for SGA, PGA and FGA
(Figure: average cost, from about 1.6x10^7 to 3.0x10^7, plotted against generation, from about 10 to 410, for SGA, FGA and PGA, together with the optimum.)
Conclusion
• MA provides a more efficient and more robust approach to optimization problems.
• MA combines global and local search, using an EA for exploration while another local search method performs exploitation.
• MA can solve some typical optimization problems where other meta-heuristics have failed.
Thank you!