National Conference on Recent Trends in Engineering & Technology, 13-14 May 2011, B.V.M. Engineering College, V.V. Nagar, Gujarat, India
Evolutionary Algorithms:
Symbiosis of its Paradigms
Kanan D. Joshi¹, Prof. D. C. Vakaskar²
¹Process Executive, Tata Consultancy Services, Baroda
²Professor, Department of Applied Mathematics, M. S. University, Baroda
Abstract- The main objective of this paper is to present an overall idea of evolutionary algorithms and their paradigms, inspired by natural selection and survival of the fittest in the biological world. It also highlights the disparities and resemblances among the different facets of evolutionary algorithms.
I. INTRODUCTION
Evolutionary algorithms have been around since the early
sixties. They apply the rules of nature: evolution through
selection of the fittest individuals, the individuals representing
solutions to a mathematical problem. Candidate solutions to
the optimization problem play the role of individuals in a
population, and the cost function determines the environment
within which the solutions "live". Evolution of the population
then takes place through the repeated application of the various
operators. Evolutionary algorithms often perform well
approximating solutions to all types of problems because they
ideally do not make any assumption about the underlying
fitness landscape. This generality is shown by successes in
fields as diverse as engineering, art, biology, economics,
marketing, genetics, operations research, robotics, social
sciences, physics, politics and chemistry.
Evolutionary algorithm performance is representation
independent, in contrast with other numerical techniques,
which might be applicable for only continuous values or other
constrained sets. Evolutionary algorithms offer a framework
such that it is comparably easy to incorporate prior knowledge
about the problem. Incorporating such information focuses the
evolutionary search, yielding a more efficient exploration of
the state space of possible solutions. Evolutionary algorithms
can also be combined with more traditional optimization
techniques. This may be as simple as using a gradient-based minimization after the primary search with an evolutionary algorithm, for example to fine-tune the weights of an evolved neural network, or it may involve the simultaneous application of other algorithms, for example hybridizing with simulated annealing or tabu search to improve the efficiency of the basic evolutionary search.
Evolutionary algorithms also have the benefit that each solution can be evaluated in parallel; only selection (which requires at least pairwise competition) needs some serial processing. Such implicit parallelism is not available in many global optimization algorithms such as simulated annealing and tabu search. Traditional methods of optimization are not robust to dynamic changes in the environment and often require a complete restart in order to provide a solution (e.g., dynamic programming). In contrast, evolutionary algorithms can be used to adapt solutions to changing circumstances. Perhaps the
greatest advantage of evolutionary algorithms comes from the
ability to address problems for which there are no human
experts. Evolutionary algorithms form a subset of evolutionary
computation in that they usually involve only techniques
implementing mechanisms inspired by biological evolution
such as reproduction, mutation, recombination, natural
selection and survival of the fittest. Evolutionary algorithms can be divided into four main areas of research: Genetic Algorithms (GA), Genetic Programming (GP) (together with Learning Classifier Systems), Evolution Strategies (ES), and Evolutionary Programming (EP).
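As a concrete illustration of the general scheme just described, the following minimal Python sketch (not taken from the paper; all function names and parameter values are assumptions chosen for illustration) shows the generic evolutionary loop of evaluation, selection and variation:

```python
import random

def evolutionary_algorithm(fitness, random_individual, mutate, crossover,
                           pop_size=50, generations=100):
    """Generic evolutionary loop: evaluation, selection, variation, replacement."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness)          # lower cost = fitter
        parents = scored[:pop_size // 2]                  # truncation selection
        offspring = []
        while len(offspring) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            offspring.append(mutate(crossover(p1, p2)))   # variation operators
        population = parents + offspring                  # next generation
    return min(population, key=fitness)

# Example usage: minimize the sphere function f(x) = sum(x_i^2)
if __name__ == "__main__":
    n = 5
    best = evolutionary_algorithm(
        fitness=lambda x: sum(v * v for v in x),
        random_individual=lambda: [random.uniform(-5, 5) for _ in range(n)],
        mutate=lambda x: [v + random.gauss(0, 0.1) for v in x],
        crossover=lambda a, b: [random.choice(pair) for pair in zip(a, b)],
    )
    print(best)
```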
II. INTRODUCTION TO PARADIGMS OF EA
Genetic Algorithms (GA) are search algorithms that mimic
the process of natural evolution where each individual is a
candidate solution: individuals are generally raw data in
whatever encoding format has been defined. As such they
represent an intelligent exploitation of a random search within
a defined search space to solve a problem. GAs were first
pioneered by John Holland in the 1960s and have since been extensively studied, experimented with, and applied in many fields.
It is important to note that GAs not only provide an alternative method for solving problems but, in several cases, consistently outperform traditional methods. The
genetic algorithm exploits the higher-payoff or ‘target’ regions
of the solution space, because successive generations of
reproduction and crossover produce increasing numbers of
strings in those regions. For example, the following 10-bit string represents a can of diameter d = 8 cm and height h = 10 cm, the first five bits encoding d and the last five encoding h:
01000 01010
The algorithm favors the fittest strings as parents and so
above-average strings which fall in target regions will have
more offspring in the next generation.
GAs are parallel search procedures, applicable to both continuous and discrete optimization problems. They are stochastic and therefore less likely to get trapped in local optima. Their flexibility facilitates both structure and parameter identification in complex models such as neural networks and fuzzy inference systems. GAs work with a coding of the parameter set, not the parameters themselves. They search from a population of points, not a single point, and they use payoff information, not derivatives or other auxiliary knowledge.
The components of GA are as follows:
• Encoding scheme
• Fitness evaluation
• Selection / Reproduction
• Crossover
• Mutation
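A minimal, hypothetical sketch of how these components fit together for the 10-bit can encoding mentioned above is given below; the fitness function (surface area with a volume penalty), population size and mutation rate are assumptions for illustration, not values from the paper:

```python
import math
import random

BITS = 5  # five bits each for diameter d and height h (cm), as in the 01000 01010 example

def decode(chrom):
    """Split a 10-bit string into diameter and height, e.g. '0100001010' -> (8, 10)."""
    d = int(chrom[:BITS], 2)
    h = int(chrom[BITS:], 2)
    return d, h

def fitness(chrom):
    """Assumed objective: minimize can surface area, penalizing volume below 300 cm^3."""
    d, h = decode(chrom)
    if d == 0 or h == 0:
        return float("inf")
    area = math.pi * d * h + math.pi * d * d / 2.0
    volume = math.pi * d * d * h / 4.0
    penalty = max(0.0, 300.0 - volume)
    return area + 10.0 * penalty

def tournament(pop, k=2):
    """Selection / reproduction: the better of k randomly drawn strings becomes a parent."""
    return min(random.sample(pop, k), key=fitness)

def crossover(a, b):
    """Single-point crossover of two bit strings."""
    point = random.randint(1, 2 * BITS - 1)
    return a[:point] + b[point:]

def mutate(chrom, rate=0.05):
    """Flip each bit with a small probability."""
    return "".join(bit if random.random() > rate else str(1 - int(bit)) for bit in chrom)

population = ["".join(random.choice("01") for _ in range(2 * BITS)) for _ in range(20)]
for _ in range(50):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(len(population))]
best = min(population, key=fitness)
print(decode(best), fitness(best))
```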
Genetic Programming (GP) is considered a special case of
GA, where each individual is a computer program. It is a
machine learning technique used to optimize a population of
computer programs according to a fitness landscape
determined by a program's ability to perform a given
computational task. The Genetic Programming technique
provides a framework for automatically creating a working
computer program from a high-level statement of the problem.
Genetic programming achieves this goal of automatic
programming by genetically breeding a population of
computer programs. It uses the principles of Darwinian natural
selection and biologically inspired operations.
The main difference between genetic programming and
genetic algorithms is the representation of the solution. Earlier
work concerned genetic algorithms (Holland 1975), evolution
strategies (Schwefel 1981), and evolutionary programming
(Fogel 1966). These methods have been applied successfully
to a wide spectrum of problem domains, especially in
optimization. However, it was unclear for a long time whether
the principles of evolution could be applied to computer code,
with all its dependencies and structural brittleness. Negative
results from early experiments seemed to indicate that
evolution of computer code was not possible. Successes were
all in the area of constraint optimization (Michalewicz 1996),
where methods were made available for how to deal with
structural brittleness. These methods found their way into
programming and gave rise to the new field of GP (Koza
1992). Genetic programming creates computer programs in
the LISP or Scheme computer languages as the solution.
Unlike most languages, LISP is usually used as an interpreted
language. This means that, unlike compiled languages, an
interpreter can process and respond directly to programs
written in LISP. The main reason for choosing LISP to
implement GP is due to the advantage of having the programs
and data with the same structure, which could provide easy
means for manipulation and evaluation. Instead of decision
variables, a program represents a procedure to solve a task.
For example, let us consider solving the following differential equation using GP: solve dy/dx = 3x, with y(0) = 2.
The objective in this problem is to find a solution y as a
function of x which will satisfy the given problem definition.
Here, a solution in a GP method is nothing but an arbitrary
function of x. The task of GP is to find a function which is the
correct solution to the above problem through genetic
operations. Genetic programming uses four steps to solve
problems:
1) Generate an initial population of random compositions of
the functions and terminals of the problem (computer
programs).
2) Execute each program in the population and assign it a
fitness value according to how well it solves the problem.
3) Create a new population of computer programs.
a) Copy the best existing programs
b) Create new computer programs by mutation.
c) Create new computer programs by crossover.
4) The best computer program that appeared in any
generation, the best-so-far solution, is designated as the result
of genetic programming.
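The following is a rough, mutation-only Python sketch of these four steps for the differential equation example; the tree representation, the numerical-derivative fitness (penalizing deviation of dy/dx from 3x plus the error at y(0) = 2) and all constants are illustrative assumptions rather than the exact procedure of any particular GP system:

```python
import random

FUNCS = ["+", "-", "*"]

def random_tree(depth=3):
    """Step 1: random composition of functions and terminals (a program as a tree)."""
    if depth == 0 or random.random() < 0.3:
        return "x" if random.random() < 0.5 else round(random.uniform(-3, 3), 2)
    return (random.choice(FUNCS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Execute a program tree at a given value of x."""
    if tree == "x":
        return x
    if not isinstance(tree, tuple):
        return tree                                   # a numeric constant
    op, a, b = tree
    a, b = evaluate(a, x), evaluate(b, x)
    return a + b if op == "+" else a - b if op == "-" else a * b

def fitness(tree):
    """Step 2: low error means dy/dx is close to 3x and y(0) is close to 2."""
    err = abs(evaluate(tree, 0.0) - 2.0)
    h = 1e-4
    for x in [i / 10.0 for i in range(-10, 11)]:
        dydx = (evaluate(tree, x + h) - evaluate(tree, x - h)) / (2 * h)
        err += abs(dydx - 3 * x)
    return err

def mutate(tree):
    """Step 3: create a new program by replacing a randomly chosen subtree."""
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(2)
    op, a, b = tree
    return (op, mutate(a), b) if random.random() < 0.5 else (op, a, mutate(b))

population = [random_tree() for _ in range(200)]
for _ in range(60):
    population.sort(key=fitness)
    elite = population[:20]                           # copy the best existing programs
    population = elite + [mutate(random.choice(elite)) for _ in range(180)]
print(min(population, key=fitness))                   # Step 4: best-so-far program
```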
Evolution Strategies (ES) were developed by Rechenberg at the Technical University of Berlin. ES tend to be used for empirical
experiments that are difficult to model mathematically. The
system to be optimized is actually constructed and ES is used
to find the optimal parameter settings. Evolution strategies
merely concentrate on translating the fundamental
mechanisms of biological evolution for technical optimization
problems. The parameters to be optimized are often
represented by a vector of real numbers (object parameters – op). Another vector of real numbers defines the strategy parameters, which control the mutation of the object parameters. Both object and strategy parameters form the data structure of a single individual. The basic implementation of evolution strategies was the two-membered (1+1)-ES, i.e., one parent generates one offspring, and the better of the two is selected while the other is eliminated.
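A minimal sketch of such a two-membered (1+1)-ES is given below; the sphere objective and the 1/5-success-rule step-size adaptation are illustrative choices, not prescriptions from the paper:

```python
import random

def one_plus_one_es(objective, x0, sigma=1.0, iterations=2000):
    """(1+1)-ES: one parent produces one offspring by Gaussian mutation;
    the better of the two survives. Step size follows a 1/5 success rule."""
    parent, parent_cost = x0[:], objective(x0)
    successes = 0
    for t in range(1, iterations + 1):
        child = [xi + random.gauss(0.0, sigma) for xi in parent]
        child_cost = objective(child)
        if child_cost <= parent_cost:            # offspring replaces parent if better
            parent, parent_cost = child, child_cost
            successes += 1
        if t % 20 == 0:                          # adapt sigma every 20 mutations
            sigma *= 1.22 if successes / 20.0 > 0.2 else 0.82
            successes = 0
    return parent, parent_cost

# Example: minimize the sphere function
best, cost = one_plus_one_es(lambda x: sum(v * v for v in x),
                             x0=[random.uniform(-5, 5) for _ in range(10)])
print(best, cost)
```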
Evolutionary programming (EP) is a wide evolutionary
computing dialect with no fixed structure or representation, in contrast with some of the other dialects, and it is becoming harder to distinguish from evolution strategies. Some of its
original variants are quite similar to genetic programming,
except that the program structure is fixed and its numerical
parameters are allowed to evolve. Its main variation operator
is mutation. Members of the population are viewed as representing distinct species rather than members of the same species; therefore each parent generates one offspring, using (μ + μ) survivor selection. Combinatorial and real-valued function optimization problems in which the optimization surface or fitness landscape is "rugged", possessing many locally optimal solutions, are well suited for evolutionary programming. The
basic evolutionary programming method involves the
following steps:
1. Choose an initial population (possible solutions at random).
The number of solutions in a population is highly relevant to
the speed of optimization, but no definite answers are
available as to how many solutions are appropriate and how
many solutions are just wasteful.
2. New offspring are created by mutation. Each offspring solution is assessed by computing its fitness. Typically, a stochastic tournament is held to determine the N solutions to be retained for the population. It should be noted that the evolutionary programming method typically does not use any crossover as a genetic operator.
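These two steps might look as follows in a small, hypothetical Python sketch with Gaussian mutation, a stochastic q-tournament and (μ + μ) survivor selection; the population size, tournament size and mutation width are assumed values for illustration:

```python
import random

def evolutionary_programming(objective, dim=10, mu=30, generations=200, q=10):
    """EP sketch: every parent produces one offspring by Gaussian mutation (no crossover);
    a stochastic q-tournament then retains mu survivors from parents plus offspring."""
    population = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        offspring = [[x + random.gauss(0.0, 0.5) for x in parent] for parent in population]
        combined = population + offspring                        # (mu + mu) pool
        costs = [objective(ind) for ind in combined]
        wins = []
        for i, cost in enumerate(costs):                         # stochastic tournament:
            opponents = random.sample(                           # q random opponents each
                [j for j in range(len(combined)) if j != i], q)
            wins.append(sum(1 for j in opponents if cost <= costs[j]))
        ranked = sorted(range(len(combined)), key=lambda i: -wins[i])
        population = [combined[i] for i in ranked[:mu]]          # keep the mu with most wins
    return min(population, key=objective)

print(evolutionary_programming(lambda x: sum(v * v for v in x)))
```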
III. APPLICATIONS
 Genetic algorithms can be used to optimize business
processes by considering the demands and conditions of
the surrounding environment, essentially allowing process designers to design a process that is most likely to succeed in the current market environment.
 GAs have been used for everything from multiple-fault
diagnosis to medical-image registration.
 They have shown themselves to be a superior tool for
developing rule-based systems, capable of gleaning
knowledge from data inaccessible to statistical methods.
 Using Genetic Algorithms to both design composite
materials and aerodynamic shapes for race cars and
regular means of transportation can return combinations
of best materials and best engineering to provide faster,
lighter, more fuel efficient and safer vehicles.
 Getting the most out of a range of materials to optimize
the structural and operational design of buildings,
factories, machines, etc. is a rapidly expanding
application of GAs.
 GAs can be programmed to search for a range of optimal
designs and components for each specific use, or to return
results for entirely new types of robots that can perform
multiple tasks and have more general application.
 GAs are being developed that will allow for dynamic and
anticipatory routing of circuits for telecommunications
networks.
 New applications of GAs to the well-known Traveling Salesman Problem (TSP) can be used to plan the most efficient routes and schedules for travel planners, traffic routers and even shipping companies, finding the shortest routes for traveling (a small GA sketch for the TSP is given after this list).
 On the security front, GAs can be used both to create
encryption for sensitive data as well as to break those
codes.
 GAs are used to aid in the understanding of protein
folding, analyzing the effects of substitutions on those
protein functions, and to predict the binding affinities of
various designed proteins developed by the
pharmaceutical industry for treatment of particular
diseases.
 GAs have been and are being developed to make analysis
of gene expression profiles much quicker and easier.
 GAs are indeed being put to work to help merchandisers produce products and to help marketing consultants design advertising and direct solicitation campaigns to sell those products.
 Genetic Programming is applicable in areas like
computer science, science, engineering, art and
entertainment.
 In computer science, the development of algorithms has
been a focus of attention. By being able to manipulate
symbolic structures, genetic programming is one of the
few heuristic search methods for algorithms. Sorting
algorithms, caching algorithms, random number
generators, and algorithms for automatic parallelization of
code (Ryan 2000), to name a few, have been studied.
 Typical applications in science are modeling and pattern recognition. Modeling certain processes in physics and chemistry with the unconventional help of evolutionary creativity supports research and understanding of the systems under study. Pattern recognition is a key ability in molecular biology and other branches of biology, as well as in science in general. Here, GP has delivered first results that are competitive with, if not better than, human-generated results.
 In engineering, GP is used in competition or cooperation
with other heuristic methods such as neural networks or
fuzzy systems. The general goal is again to model
processes such as production plants, or to classify results
of production.
 Control of man-made apparatus is another area where GP has been used successfully, with process control and robot control as the primary applications.
 In art and entertainment, GP is used to evolve realistic
animation scenes and appealing visual graphics. It also
has been used to extract structural information from
musical composition in order to model the process so that
automatic composition of music pieces becomes possible.
 Many of these problems require a huge amount of
computational power on the part of the GP systems.
Parallel evolution has hence been a key engineering
aspect of developments in GP. As a paradigm, genetic
programming is very well suited for a natural way of
parallelization.
 Evolution Strategies, as stochastic search methods, can be applied to system parameter estimation.
 ES can also be applied to image analysis applications. It
can be applied for task scheduling in multiprocessor
systems.
 ES is applicable for omni-directional mobile manipulator
path planning.
 ES has also been applied to airbag release optimization, as presented in a tuning method for airbag release.
 Evolution strategies are applied to the problem of
parameter optimization in fuzzy rule based systems.
 Evolution strategies are also applied to the optimization of actuator parameters in active jet flow control and to the optimization of bifurcating and blooming jets.
 Current research concerning ESs deals with applications like the travelling salesman problem, girder-bridge optimization, neural networks, vector optimization and parameter optimization in general.
 Evolutionary programming has been applied to diverse engineering problems including traffic routing and planning, pharmaceutical design, epidemiology, cancer detection, military planning, control systems, system identification, signal processing, power engineering, learning in games, and function optimization.
 Evolutionary Programming is applicable to any of the
areas for which evolutionary algorithms are applicable.
 Cutting-edge research indicates that evolutionary
programming is set to emerge as the dominant
optimization technique in the fast-changing power
industry.
 Practical applications of evolutionary programming include reactive power planning and dispatch for speedy, cost-effective increases in transmission capacity, plus generator parameter estimation.
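As promised in the TSP item above, here is a small, hypothetical GA sketch for the traveling salesman problem; the random city coordinates, order crossover and swap mutation are assumptions chosen only to illustrate the idea:

```python
import math
import random

cities = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(20)]

def tour_length(tour):
    """Total length of a closed tour visiting every city once."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def order_crossover(p1, p2):
    """Copy a slice from the first parent; fill the rest in the order of the second."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    middle = p1[a:b]
    rest = [c for c in p2 if c not in middle]
    return rest[:a] + middle + rest[a:]

def swap_mutation(tour, rate=0.2):
    """Occasionally swap two cities in the tour."""
    tour = tour[:]
    if random.random() < rate:
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]
    return tour

def ga_tsp(pop_size=100, generations=300):
    population = [random.sample(range(len(cities)), len(cities)) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=tour_length)
        parents = population[:pop_size // 2]             # keep the shorter tours
        children = [swap_mutation(order_crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return min(population, key=tour_length)

best = ga_tsp()
print(tour_length(best), best)
```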
IV. RICHNESS, DRAWBACKS AND COMPARISON OF ALL THE PARADIGMS
Advantages of GAs are that they provide efficient, effective techniques for optimization and machine learning applications and that the concept is easy to understand. GA supports multi-objective optimization and is good for "noisy" environments. A GA always gives an answer, and the answer gets better with time. GAs are inherently parallel and easily distributed, and there are many ways to speed up and improve a GA-based application as knowledge about the problem domain is gained. GAs can quickly scan a vast solution set, and bad proposals do not affect the end solution negatively because they are simply discarded. The inductive nature of the GA means that it does not have to know any rules of the problem; it works by its own internal rules. The main advantage of GA is that models which cannot be developed using other solution methods without some form of approximation can be considered in an unapproximated form.
Demerits of GAs are that alternate solutions can be too slow or overly complicated, and that an exploratory tool is needed to examine new approaches.
Advantages of GP are that no analytical knowledge is needed and accurate results can still be obtained. If fuzzy sets are encoded in the genotype, new, better-suited fuzzy sets can be generated to describe precise, individual membership functions, for instance by means of the intersection and/or union of the existing fuzzy sets. Every component of the resulting GP rule base is relevant in some way to the solution of the problem, so no null operations that would expend computational resources at runtime are encoded, and this approach scales with the problem size. Some other approaches to the cart-centering problem use a GA that encodes N×N matrices of parameters; these solutions work poorly as the problem grows in size (i.e. as N increases). With GP, no restrictions are imposed on the structure of solutions, nor are the complexity or the number of rules of the computed solution bounded.
Drawbacks of GP are that it is a computationally intensive process requiring a large amount of machine time. The estimated machine time increases with the complexity of the problem and with the dimensions and number of samples. As GP is a stochastic process that depends highly on the initial control parameter settings, it does not guarantee an optimal solution in every run; it should therefore be run several times with different settings to ensure that the system has not fallen into a local optimum. While GP combines features of global and local search algorithms, the cost is that it often performs neither of these functions as well as more specialized algorithms: the constant introduction of new genetic material through mutation and crossover (mating) can divert the algorithm from finding the best combination of a few highly effective components. GP may also output several rules that are quite different but perform equally well, suggesting the involvement of multiple and often unrelated genes; the selection of a single rule can then be difficult, particularly when searching for a general solution to a problem.
Advantages of ES are that contemporary derivatives of the evolution strategy often use a population of μ parents and also recombination as an additional operator, called the (μ/ρ +, λ)-ES, which makes them less prone to getting stuck in local optima.
Demerits of ES are that the (environmental) selection in evolution strategies is deterministic and based only on the fitness rankings, not on the actual fitness values. The resulting algorithm is therefore invariant with respect to monotonic transformations of the objective function.
Symbiosis of paradigms of EA:
Evolution Strategies (ESs) and Genetic Algorithms (GAs) have been compared in a formal as well as an experimental way. It is shown that both are identical with respect to their major working scheme, but they nevertheless exhibit significant differences in the details of the selection scheme, the genetic representation and, especially, the self-adaptation of strategy parameters.
The main differences between ES and GA are in the representation of the population and the types of evolution operators. In ES, instead of binary strings, real values are used to represent the parameters to be optimized. Also, contrary to GA, which incorporates both crossover and mutation, the basic ES uses only mutation. Given these differences, it seems that ES are easier to implement and might be faster than GA.
With respect to their performance on multiple problem instances with varying parameters, it has been shown that single problem instances are not sufficient to prove the effectiveness of a given evolution strategy, and that the Genetic Programming approach is less sensitive to varying instances than the Evolution Strategy.
The main difference between genetic programming and
genetic algorithms is the representation of the solution.
Genetic programming creates computer programs in the LISP or Scheme computer languages as the solution. Genetic
algorithms create a string of numbers that represent the
solution. Genetic programming is much more powerful than
genetic algorithms. The output of the genetic algorithm is a
quantity, while the output of the genetic programming is
another computer program. In essence, this is the beginning of
computer programs that program themselves. Photomosaics
are a new form of art in which smaller digital images (known
as tiles) are used to construct larger images. Although both
approaches sometimes use the same computational effort, GP
is capable of generating finer photomosaics in fewer
generations. In conclusion, we found that the GP
representation is richer than the GA representation and offers
additional flexibility for future photomosaics generation.
GP applies the approach of the genetic algorithm to the space
of possible computer programs. Genetic Programming (GP) is
considered a special case of GA, where each individual is a
computer program (not just raw data). GP explores the algorithmic search space and evolves computer programs to
perform a defined task. Although linear GAs are adept at
developing rule-based systems, they cannot develop equations.
A recent addition to the evolutionary domain is genetic
programming, which uses an evolutionary approach to
generate symbolic expressions and perform symbolic
regressions.
Evolutionary programming is a stochastic optimization strategy similar to the genetic algorithm, but it instead places emphasis on the behavioral linkage between parents and their offspring, rather than seeking to emulate specific genetic operators as observed in nature. Evolutionary programming is also similar to evolution strategies, although the two approaches developed independently. EP typically does not use any crossover as a genetic operator. A first difference from GA concerns the representation: there is no constraint on it in EP. The typical GA approach involves
encoding the problem solutions as a string of representative
tokens, the genome. In EP, the representation follows from the
problem. A neural network can be represented in the same
manner as it is implemented, for example, because the
mutation operation does not demand a linear encoding.
13-14 May 2011
A second difference is that GA uses both crossover and mutation, with crossover as the primary search operator, while EP uses only mutation without crossover.
Despite their independent development over 30 years, EP and ES share many similarities. When implemented to solve real-valued function optimization problems, both typically operate on the real values themselves (rather than on any coding of the real values, as is often done in GAs). Multivariate zero-mean Gaussian mutations are applied to each parent in a population, and a selection mechanism is applied to determine which solutions to remove from the population. The similarities extend to the use of self-adaptive methods for determining the appropriate mutations to use: methods in which each parent carries not only a potential solution to the problem at hand, but also information on how it will distribute new trials (offspring). Most of the theoretical results on convergence (both asymptotic and velocity) developed for ES or EP also apply directly to the other.
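One common way to realize such self-adaptation, sketched below under assumed constants, is to let each individual carry its own mutation step sizes (strategy parameters) and to mutate them log-normally before using them to perturb the object parameters; the (μ + λ) replacement and all numerical settings here are illustrative:

```python
import math
import random

def self_adaptive_es(objective, dim=10, mu=15, lam=100, generations=200):
    """Each individual = (object parameters x, strategy parameters sigma).
    Sigma is mutated log-normally, then used for the Gaussian mutation of x."""
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(dim))
    tau0 = 1.0 / math.sqrt(2.0 * dim)
    pop = [([random.uniform(-5, 5) for _ in range(dim)], [1.0] * dim) for _ in range(mu)]
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            x, sigma = random.choice(pop)
            g = random.gauss(0.0, 1.0)                          # one global draw per child
            new_sigma = [s * math.exp(tau0 * g + tau * random.gauss(0.0, 1.0))
                         for s in sigma]
            new_x = [xi + si * random.gauss(0.0, 1.0) for xi, si in zip(x, new_sigma)]
            offspring.append((new_x, new_sigma))
        pool = pop + offspring                                  # (mu + lambda) selection
        pool.sort(key=lambda ind: objective(ind[0]))
        pop = pool[:mu]
    return pop[0]

best_x, best_sigma = self_adaptive_es(lambda x: sum(v * v for v in x))
print(sum(v * v for v in best_x))
```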
Selection: EP typically uses stochastic selection via a
tournament. Each trial solution in the population faces
competition against a preselected number of opponents and
receives a "win" if it is at least as good as its opponent in each
encounter. Selection then eliminates those solutions with the fewest wins. In contrast, ES typically uses deterministic
selection in which the worst solutions are purged from the
population based directly on their function evaluation.
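The contrast between the two selection schemes can be shown in a few lines of Python; the tournament size q and the random cost values are illustrative assumptions:

```python
import random

def ep_tournament_selection(costs, mu, q=10):
    """EP-style stochastic selection: each solution scores a win for every one of
    q randomly chosen opponents it is at least as good as; the mu with most wins survive."""
    wins = []
    for i, c in enumerate(costs):
        opponents = random.sample([j for j in range(len(costs)) if j != i], q)
        wins.append(sum(1 for j in opponents if c <= costs[j]))
    return sorted(range(len(costs)), key=lambda i: -wins[i])[:mu]

def es_deterministic_selection(costs, mu):
    """ES-style selection: simply keep the mu best by function value."""
    return sorted(range(len(costs)), key=lambda i: costs[i])[:mu]

costs = [random.uniform(0, 10) for _ in range(20)]
print(ep_tournament_selection(costs, mu=5))
print(es_deterministic_selection(costs, mu=5))
```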
Recombination: EP is an abstraction of evolution at the level
of reproductive populations and thus no recombination
mechanisms are typically used because recombination does
not occur between species (by definition: see Mayr's
biological species concept). In contrast, ES is an abstraction of evolution at the level of individual behavior. When self-adaptive information is incorporated, it is purely genetic information (as opposed to phenotypic). Thus some forms of recombination are reasonable, and many forms of recombination have been implemented within ES.
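Two recombination forms commonly implemented within ES, discrete and intermediate recombination, are sketched below; applying discrete recombination to the object parameters and intermediate recombination to the strategy parameters is a common convention, but the specifics here are merely illustrative:

```python
import random

def discrete_recombination(p1, p2):
    """Each component is copied from one of the two parents, chosen at random."""
    return [random.choice(pair) for pair in zip(p1, p2)]

def intermediate_recombination(p1, p2):
    """Each component is the average of the two parents' components."""
    return [(a + b) / 2.0 for a, b in zip(p1, p2)]

# Example: recombine object parameters discretely and strategy parameters intermediately
x1, s1 = [1.0, 2.0, 3.0], [0.5, 0.5, 0.5]
x2, s2 = [4.0, 5.0, 6.0], [1.5, 1.5, 1.5]
child_x = discrete_recombination(x1, x2)
child_s = intermediate_recombination(s1, s2)
print(child_x, child_s)
```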
V. CONCLUSION
The no free lunch theorem states that there is no globally best
optimization algorithm, and each algorithm will be efficient
for specific application domains. Genetic programming is part
of the growing set of evolutionary algorithms which apply
search principles analogous to those of natural evolution in a
variety of different problem domains, notably parameter optimization. Evolutionary programming, evolution strategies, and genetic algorithms are three other branches of
the area of evolutionary algorithms which mostly find
applications as optimization techniques. The most important
difference is that a GA works on a population of possible
solutions, while other heuristic methods use a single solution
in their iterations. Another difference is that GAs are
probabilistic (stochastic), not deterministic.
VI. REFERENCES
1. Raymond Chiong and Ooi Koon Beng, "A Comparison between Genetic Algorithms and Evolutionary Programming based on Cutting Stock Problem".
2. Mehrdad Dianati, Insop Song and Mark Treiber, University of Waterloo, Ontario, Canada, "An Introduction to Genetic Algorithms and Evolution Strategies".
3. Ajith Abraham (School of Computer Science and Engineering, Chung-Ang University, Seoul), Nadia Nedjah and Luiza de Macedo Mourelle (Department of Electronics Engineering and Telecommunications, and Department of System Engineering and Computation, State University of Rio de Janeiro, Brazil), "Evolutionary Computation: from Genetic Algorithms to Genetic Programming".
4. D. B. Fogel, "Evolutionary Computation: Toward a New Philosophy of Machine Intelligence", IEEE Press, Piscataway, NJ (1995); Proceedings of the First [EP92], Second [EP93] and Third [EP94] Annual Conferences on Evolutionary Programming.
5. Hassan Farsijani, Faculty of Management, Shahid Beheshti University, Tehran, Iran, "Evolutionary Methods for Design of Dynamic Global World-Class Business for the World Market Society".
6. Al Globus (MRJ Technology Solutions, Inc. at NASA Ames Research Center), John Lawton and Todd Wipke (University of California at Santa Cruz), "Automatic molecular design using evolutionary techniques".
7. J. Koza, "Genetic Programming: On the Programming of Computers by Means of Natural Selection (Complex Adaptive Systems)", MIT Press (1992).
8. Thomas Bäck, Frank Hoffmeister and Hans-Paul Schwefel, University of Dortmund, Germany, "A Survey of Evolution Strategies".