Derivative-Free Optimization:
Genetic Algorithms
Dan Simon
Cleveland State University
Outline
1. Biological Genetics
2. Genetic Algorithm: A Short History
3. Genetic Algorithm Example: Robot Design
4. Genetic Algorithm Options
5. Genetic Algorithm Example: Ackley Function
6. Continuous Genetic Algorithm
7. GA Code Walk-Through
Charles Darwin
• Born in England, 1809
• “You care for nothing but
shooting, dogs, and rat-catching;
and you will be a disgrace to
yourself and all your family.”
• Medicine?
• Theology?
• Biology?
Charles Darwin
• H. M. S. Beagle: 1831–1836
• The Origin of Species: 1836–…
• Paper from Alfred Wallace: 1858
• Presentation of both papers by Darwin: 1858
• The Origin of Species: 1859
  – “Only” 500 pages
  – The first printing (1,250 copies) sold out the first day
Darwin’s Theory of Natural Selection
• Survival of the fittest
• Controversial
– Anti-Christian?
– How are traits passed to children?
• Misconceptions
– Traits of parents could be blended in children
– Acquired traits could be passed to children
Gregor Mendel
• Born in 1822 in what is now the Czech Republic
• Poor farming family
• Joined Augustinian monastery
at age 21
• Studied botany (peas)
• Discovered the idea of genes, heredity, and
dominance
• His publication (1865) was ignored at the time
Genetic Algorithms
• Princeton, John von Neumann
• Nils Barricelli (mathematician),
1953, artificial life
• 1954: “Esempi numerici di
processi di evoluzione”
(Numerical examples of evolutionary processes)
Genetic Algorithms
• Alexander Fraser (biologist) – England, Hong
Kong, New Zealand, Scotland, Australia –
1957: “Simulation of genetic systems by
automatic digital computers”
• Hans-Joachim Bremermann – University of
Washington, UC Berkeley – 1958: “The
evolution of intelligence”
Genetic Algorithms
• George Box (statistician) – Imperial Chemical
Industries (England) – 1957: “Evolutionary
operation: A method for increasing industrial
productivity”
“Essentially, all models are wrong, but some are
useful” (1987)
• George Friedman, UCLA – 1956: “Selective
Feedback Computers for Engineering Synthesis
and Nervous System Analogy” (Master’s thesis)
GA for Robot Design
Encoding for motor spec:
000 = 5-volt stepper
001 = 9-volt stepper
010 = 12-volt stepper
011 = 24-volt stepper
100 = 5-volt servo
101 = 9-volt servo
110 = 12-volt servo
111 = 24-volt servo
Encoding for power spec:
000 = 12-volt NiCd battery
001 = 24-volt NiCd battery
010 = 12-volt Li-ion battery
011 = 24-volt Li-ion battery
100 = 12-volt solar panel
101 = 24-volt solar panel
110 = 12-volt fusion reactor
111 = 24-volt fusion reactor
GA for Robot Design
Fitness = Range (hrs) + Power (W) – Weight (kg)
• Evaluated by experiment or simulation
• Note that we are combining incompatible units
Randomly create initial population:
Individual 1: 12-volt stepper motor (010), 24-volt solar panel (101)
Individual 2: 9-volt servo motor (101), 24-volt NiCd battery (001)
Each individual is represented with a chromosome which has two genes
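A minimal Python sketch of this genotype-to-phenotype decoding (the lookup tables simply transcribe the slide's encodings; the function name `decode` and the dictionary layout are my own):

```python
# Lookup tables transcribed from the slide's motor and power encodings
MOTORS = {
    "000": "5-volt stepper",  "001": "9-volt stepper",
    "010": "12-volt stepper", "011": "24-volt stepper",
    "100": "5-volt servo",    "101": "9-volt servo",
    "110": "12-volt servo",   "111": "24-volt servo",
}
POWER = {
    "000": "12-volt NiCd battery",   "001": "24-volt NiCd battery",
    "010": "12-volt Li-ion battery", "011": "24-volt Li-ion battery",
    "100": "12-volt solar panel",    "101": "24-volt solar panel",
    "110": "12-volt fusion reactor", "111": "24-volt fusion reactor",
}

def decode(chromosome: str) -> tuple[str, str]:
    """Split a 6-bit chromosome into its motor gene and power gene,
    then map each genotype to its phenotype."""
    motor_gene, power_gene = chromosome[:3], chromosome[3:]
    return MOTORS[motor_gene], POWER[power_gene]

print(decode("010101"))  # ('12-volt stepper', '24-volt solar panel')
```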
GA for Robot Design
Individual 1 chromosome = 010 101
Individual 1’s motor genotype is 010, and its
motor phenotype is “12-V stepper”
Two parents (crossover point after the second bit):
0 1 | 0 1 0 1
1 0 | 1 0 0 1
Two children:
0 1 | 1 0 0 1
1 0 | 0 1 0 1
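The single-point crossover shown above can be sketched in Python (an illustration, not the course's GA.m code):

```python
def crossover(p1: str, p2: str, point: int) -> tuple[str, str]:
    """Single-point crossover: swap the tails of two bit strings
    after the given crossover point."""
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

# The slide's example: crossover after the second bit
c1, c2 = crossover("010101", "101001", 2)
print(c1, c2)  # 011001 100101
```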
GA for Robot Design
How do we decide which individuals to mate?
Fitness-proportional selection, AKA roulette-wheel selection
Example: four individuals with fitness values 10, 20, 30, and 40
[Pie chart: roulette wheel with slices of size 10, 20, 30, and 40 for individuals 1–4]
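A minimal sketch of roulette-wheel selection in Python (the fitness values are the slide's; the function name and the repeated-draw check are my own):

```python
import random

def roulette_select(fitnesses: list[float]) -> int:
    """Fitness-proportional (roulette-wheel) selection.
    Spins the wheel once and returns the index of the winner."""
    total = sum(fitnesses)
    r = random.uniform(0, total)
    running = 0.0
    for i, f in enumerate(fitnesses):
        running += f
        if r <= running:
            return i
    return len(fitnesses) - 1  # guard against floating-point round-off

# Slide's example: fitnesses 10, 20, 30, 40 should be selected with
# probabilities 10%, 20%, 30%, 40%
random.seed(0)
counts = [0] * 4
for _ in range(10000):
    counts[roulette_select([10, 20, 30, 40])] += 1
print([c / 10000 for c in counts])  # roughly [0.1, 0.2, 0.3, 0.4]
```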
A Simple Genetic Algorithm
Parents ← {randomly generated population}
While not (termination criterion)
    Calculate the fitness of each parent in the population
    Children ← ∅
    While |Children| < |Parents|
        Use fitnesses to select a pair of parents for mating
        Mate parents to create children c1 and c2
        Children ← Children ∪ {c1, c2}
    Loop
    Randomly mutate some of the children
    Parents ← Children
Next generation
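The pseudocode above can be sketched in Python. This is an illustrative toy, not the course's GA.m: the one-max fitness (count of 1 bits) and all parameter values are my assumptions.

```python
import random

def one_max(bits: list[int]) -> int:
    """Toy fitness function: count the 1 bits (stand-in for a real problem)."""
    return bits.count(1)

def simple_ga(n_bits=20, pop_size=20, p_mut=0.02, generations=50, seed=1):
    rng = random.Random(seed)
    # Parents <- randomly generated population
    parents = [[rng.randint(0, 1) for _ in range(n_bits)]
               for _ in range(pop_size)]
    for _ in range(generations):               # termination: generation count
        fits = [one_max(p) for p in parents]
        children = []
        while len(children) < len(parents):
            # fitness-proportional selection of a pair of parents
            p1, p2 = rng.choices(parents, weights=fits, k=2)
            point = rng.randrange(1, n_bits)   # single-point crossover
            children.append(p1[:point] + p2[point:])
            children.append(p2[:point] + p1[point:])
        for child in children:                 # randomly mutate some children
            for i in range(n_bits):
                if rng.random() < p_mut:
                    child[i] ^= 1
        parents = children                     # Parents <- Children
    return max(parents, key=one_max)

best = simple_ga()
print(one_max(best))  # best fitness found (max possible is 20)
```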
GA Termination Criteria
1. Generation count
2. Fitness threshold
3. Fitness improvement threshold
Critical GA Design Parameters
1. Elitism
2. Encoding scheme
3. Fitness function and scaling
4. Population size
5. Selection method (tournament, rank, …)
6. Mutation rate
7. Crossover type
8. Speciation / incest
GA Schematic
[Diagram: the current generation produces the next generation; the best individuals are copied directly via elitism, and the rest of the next generation is produced by selection, crossover, and mutation]
Encoding
Binary: Neighboring phenotypes have dissimilar genotypes, and vice versa:
000, 001, 010, 011, 100, 101, 110, 111
Gray: Neighboring phenotypes have similar genotypes:
000, 001, 011, 010, 110, 111, 101, 100
x = -5 : 0.1 : 2
plot(x, x.^4 + 5*x.^3 + 4*x.^2 - 4*x + 1);
Gray Codes
Bell Labs researcher Frank Gray introduced the
term reflected binary code in his 1947 patent
application.
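Reflected binary (Gray) code is easy to compute; a small Python sketch (function names are my own):

```python
def binary_to_gray(b: int) -> int:
    """Convert a binary-coded integer to its reflected (Gray) code."""
    return b ^ (b >> 1)

def gray_to_binary(g: int) -> int:
    """Invert the Gray coding by cascading XORs of shifted copies."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

# Reproduce the slide's 3-bit Gray sequence for phenotypes 0..7
print([format(binary_to_gray(i), "03b") for i in range(8)])
# ['000', '001', '011', '010', '110', '111', '101', '100']
```

Note that consecutive entries differ in exactly one bit, which is why neighboring phenotypes have similar genotypes.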
Ackley Function
f(x, y) = 20 + e − 20 exp(−0.2 √((x² + y²) / 2)) − exp((cos(2πx) + cos(2πy)) / 2)
x genotype: x_g ∈ [0, 63]; x phenotype: x = −5 + 10 x_g / 63 ∈ [−5, 5]
y genotype: y_g ∈ [0, 63]; y phenotype: y = −5 + 10 y_g / 63 ∈ [−5, 5]
Minimization problem; global minimum = 0 (at x = y = 0)
Can be generalized to any number of dimensions
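A Python sketch of the Ackley function and the genotype-to-phenotype mapping above (function names are my own):

```python
import math

def ackley(x: float, y: float) -> float:
    """Two-dimensional Ackley function as written on the slide."""
    term1 = -20.0 * math.exp(-0.2 * math.sqrt((x * x + y * y) / 2.0))
    term2 = -math.exp((math.cos(2 * math.pi * x)
                       + math.cos(2 * math.pi * y)) / 2.0)
    return 20.0 + math.e + term1 + term2

def decode(g: int) -> float:
    """Map a 6-bit genotype g in [0, 63] to a phenotype in [-5, 5]."""
    return -5.0 + 10.0 * g / 63.0

print(abs(ackley(0.0, 0.0)) < 1e-9)  # True: global minimum at the origin
print(decode(0), decode(63))         # -5.0 5.0
```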
Ackley Function
• 100 Monte Carlo simulations
• Population size = 50
• Mutation rate = 2%
• Crossover probability = 100%
• Single point crossover
• Encoding: binary or Gray
• Elitism: 0 or 2
Ackley Function
[Surface plot of the Ackley function over x, y ∈ [−5, 5]]
Ackley Function
[Plot: minimum cost vs. generation (0–50) for binary coding without elitism, binary coding with elitism, Gray coding without elitism, and Gray coding with elitism; average of 100 Monte Carlo simulations]
Ackley Function
[Scatter plots of the population over x, y ∈ [−5, 5] at the 0th, 5th, 10th, and 15th generations]
Continuous Genetic Algorithms
Parents (crossover point after the first gene):
[1.23, 4.76, 2.19, 7.63]
[9.73, 1.09, 4.87, 8.28]
Children:
[1.23, 1.09, 4.87, 8.28]
[9.73, 4.76, 2.19, 7.63]
Usually, GAs for continuous problems are
implemented as continuous GAs
Continuous Genetic Algorithms
Blended crossover:
Select a random number r ∈ [0, 1]
Genotype operation: c = p1 + r(p2 − p1)
[Diagram: the child lies on the line segment between parent 1 and parent 2]
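A minimal Python sketch of blended crossover as defined above (a single r per crossover, following the slide's formula; the parent vectors are the slide's earlier example):

```python
import random

def blended_crossover(p1: list[float], p2: list[float],
                      rng=random) -> list[float]:
    """Blended crossover: each child gene lies on the segment between
    the corresponding parent genes, c = p1 + r*(p2 - p1)."""
    r = rng.random()  # one r in [0, 1] for the whole crossover
    return [a + r * (b - a) for a, b in zip(p1, p2)]

p1 = [1.23, 4.76, 2.19, 7.63]
p2 = [9.73, 1.09, 4.87, 8.28]
child = blended_crossover(p1, p2)
# every child gene falls between the corresponding parent genes
```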
Continuous Genetic Algorithms
Aggressive mutation: Suppose x = [9.73, 1.09, 4.87, 8.28], so the problem dimension is 4
r ← random number ∈ [0, 1]
If r < pm then
    i ← random integer ∈ [1, 4]
    r ← random number ∈ [0, 1]
    x(i) ← xmin + r(xmax − xmin)
end if
Continuous Genetic Algorithms
Gentle mutation: Suppose x = [9.73, 1.09, 4.87, 8.28], so the problem dimension is 4
r ← random number ∈ [0, 1]
If r < pm then
    r ← Gaussian random number ~ N(0, σ²)
    i ← random integer ∈ [1, 4]
    x(i) ← x(i) + r
end if
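Both mutation operators can be sketched in Python (function names, the `rng` parameter, and the bounds/σ used in the demo are my own; the vector is the slide's example):

```python
import random

def aggressive_mutation(x, p_m, x_min, x_max, rng=random):
    """With probability p_m, replace one randomly chosen gene with a
    uniform random value on [x_min, x_max] (the slide's 'aggressive' form)."""
    x = list(x)
    if rng.random() < p_m:
        i = rng.randrange(len(x))
        x[i] = x_min + rng.random() * (x_max - x_min)
    return x

def gentle_mutation(x, p_m, sigma, rng=random):
    """With probability p_m, perturb one randomly chosen gene with
    N(0, sigma^2) Gaussian noise (the slide's 'gentle' form)."""
    x = list(x)
    if rng.random() < p_m:
        i = rng.randrange(len(x))
        x[i] += rng.gauss(0.0, sigma)
    return x

x = [9.73, 1.09, 4.87, 8.28]
print(aggressive_mutation(x, 1.0, -10.0, 10.0))  # one gene replaced
print(gentle_mutation(x, 1.0, 0.1))              # one gene nudged slightly
```

The aggressive form can jump anywhere in the search space, which aids exploration; the gentle form makes small local adjustments, which aids fine-tuning near a solution.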
Rastrigin Benchmark Function
f(x) = 10p + Σ_{i=1}^{p} (x_i² − 10 cos(2πx_i))
p dimensions
Global minimum f(x) = 0 at x_i = 0 for all i
Lots of local minima
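A Python sketch of the Rastrigin function as written above (the function name is my own):

```python
import math

def rastrigin(x: list[float]) -> float:
    """p-dimensional Rastrigin function:
    f(x) = 10p + sum_i (x_i^2 - 10 cos(2*pi*x_i))."""
    p = len(x)
    return 10.0 * p + sum(xi * xi - 10.0 * math.cos(2 * math.pi * xi)
                          for xi in x)

print(rastrigin([0.0] * 15))  # 0.0 — global minimum in 15 dimensions
print(rastrigin([1.0, 1.0]))  # ≈ 2.0 — near one of the many local minima
```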
Rastrigin Benchmark Function
GA.m
• Population size = 50
• Mutation rate = 1%
• Crossover probability = 100%
• Single point crossover
• Elitism = 2
• 15 dimensions
[Plot: average cost and minimum cost vs. generation (0–50)]