What is an evolutionary algorithm?

A process where, given a population of individuals, environmental pressure causes natural selection (survival of the fittest), and thereby the fitness of the population grows. It is easy to see such a process as optimization. Given an objective function to be maximized, we can randomly create a set of candidate solutions and use the objective function as an abstract fitness measure (the higher the better). Based on this fitness, some of the better candidates are chosen to seed the next generation by applying recombination and/or mutation. Recombination is applied to two selected candidates, the so-called parents, and results in one or two new candidates, the children. Mutation is applied to one candidate and results in one new candidate. Applying recombination and mutation leads to a set of new candidates, the offspring. Based on their fitness, these offspring compete with the old candidates for a place in the next generation. This process can be iterated until a solution is found or a previously set time limit is reached.

Note that many components of such an evolutionary process are stochastic. According to Darwin, the emergence of a new species, adapted to its environment, is a consequence of the interaction between the survival-of-the-fittest mechanism and undirected variation. Variation operators must be stochastic: the choice of which pieces of information are exchanged during recombination, as well as the changes made to a candidate solution during mutation, are random. Selection operators, on the other hand, can be either deterministic or stochastic. In the latter case, fitter individuals have a higher chance of being selected than less fit ones, but typically even the weak individuals have a chance to become a parent or to survive. The general scheme of an evolutionary algorithm is outlined in the descriptions and pseudocode that follow.
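To make the two variation operators concrete, here is a minimal sketch using bit-string candidates; the bit-string representation, the one-point crossover, and the 5% per-bit flip rate are illustrative assumptions, not the only possible choices.

    import random

    def crossover(parent_a, parent_b):
        # Recombination: combine two selected parents into two children
        # by swapping their tails at a randomly chosen cut point.
        point = random.randint(1, len(parent_a) - 1)
        child_a = parent_a[:point] + parent_b[point:]
        child_b = parent_b[:point] + parent_a[point:]
        return child_a, child_b

    def mutate(candidate, rate=0.05):
        # Mutation: apply a small, undirected random change to one candidate,
        # here by flipping each bit with probability `rate`.
        return [1 - bit if random.random() < rate else bit for bit in candidate]

    parent_a = [1, 1, 1, 1, 1, 1, 1, 1]
    parent_b = [0, 0, 0, 0, 0, 0, 0, 0]
    child_a, child_b = crossover(parent_a, parent_b)
    print(mutate(child_a), mutate(child_b))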

A genetic algorithm repeatedly cycles through a loop that defines successive generations of objects (a code sketch follows the list):

  1. Compare each object with the final goal to evaluate fitness.
  2. Discard the objects that least fit the goal.
  3. Use crossover to generate the next generation of objects.
  4. Simply copy some objects--as is--to the next generation.
  5. Add some objects with random mutations into the next generation.
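One way those five steps might look in code, sketched for a toy problem where the "final goal" is a fixed target bit string and fitness counts matching bits; the population size, survivor count, mutation rate, and generation limit are illustrative assumptions, and steps 3 and 5 are folded together by mutating each crossover child.

    import random

    TARGET = [1, 0, 1, 1, 0, 0, 1, 0]      # the final goal the objects are compared with
    POP_SIZE, KEEP, MUTATION_RATE = 20, 10, 0.05

    def fitness(obj):
        # Step 1: compare the object with the final goal.
        return sum(1 for a, b in zip(obj, TARGET) if a == b)

    def crossover(a, b):
        point = random.randint(1, len(a) - 1)
        return a[:point] + b[point:]

    def mutate(obj):
        return [1 - bit if random.random() < MUTATION_RATE else bit for bit in obj]

    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]
    for generation in range(200):
        population.sort(key=fitness, reverse=True)    # Step 1: evaluate every object
        survivors = population[:KEEP]                 # Step 2: discard the least fit
        next_gen = list(survivors)                    # Step 4: copy some objects as is
        while len(next_gen) < POP_SIZE:
            a, b = random.sample(survivors, 2)
            next_gen.append(mutate(crossover(a, b)))  # Steps 3 and 5: crossover, then mutate
        population = next_gen
        if fitness(max(population, key=fitness)) == len(TARGET):
            break

    print("generation", generation, "best", max(population, key=fitness))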

You'll probably notice there is a common structure to almost all explanations of genetic algorithms, no matter where you look.  For example:

Initialize population with random individuals (candidate solutions)
Evaluate (compute fitness of) all individuals
WHILE not stop DO
    Select genitors from parent population
    Create offspring using variation operators on genitors
    Evaluate newborn offspring
    Replace some parents by some offspring
OD

-- Evolutionary Computing, A.E. Eiben and M. Schoenauer
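The last line of that loop, "Replace some parents by some offspring", can be filled in several ways. The sketch below shows two common options, assuming a fitness function and already-evaluated parent and offspring lists; neither variant is mandated by the scheme above.

    def plus_replacement(parents, offspring, fitness, mu):
        # Merge parents and offspring and keep the mu fittest individuals,
        # so good parents can outlive their children.
        pool = parents + offspring
        return sorted(pool, key=fitness, reverse=True)[:mu]

    def generational_replacement(parents, offspring, fitness, mu):
        # Discard the parents entirely; the next generation is drawn
        # from the offspring alone.
        return sorted(offspring, key=fitness, reverse=True)[:mu]

    # Toy usage with a fitness that simply counts ones in a bit string.
    ones = lambda individual: sum(individual)
    parents = [[1, 1, 0], [0, 0, 0]]
    offspring = [[1, 1, 1], [0, 1, 0]]
    print(plus_replacement(parents, offspring, ones, mu=2))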

 

At the beginning of a run of a genetic algorithm, a large population of random chromosomes is created. Then:

  1. Test each chromosome to see how good it is at solving the problem at hand and assign a fitness score accordingly. The fitness score is a measure of how good that chromosome is at solving the problem.
  2. Select two members from the current population. The chance of being selected is proportional to the chromosome's fitness. Roulette wheel selection is a commonly used method.
  3. Dependent on the crossover rate, cross over the bits from each chosen chromosome at a randomly chosen point.
  4. Step through the chosen chromosomes' bits and flip each one dependent on the mutation rate.

Repeat steps 2, 3, and 4 until a new population of N members has been created.

-- Genetic Algorithms in Plain English,
http://www.ai-junkie.com/ga/intro/gat1.html
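A sketch of steps 2 through 4 from that description, for bit-string chromosomes; the particular crossover rate (0.7) and mutation rate (0.001) are illustrative values, not settings prescribed by the article.

    import random

    def roulette_select(population, fitnesses):
        # Step 2: pick a chromosome with probability proportional to its fitness.
        pick = random.uniform(0, sum(fitnesses))
        running = 0.0
        for chromosome, fit in zip(population, fitnesses):
            running += fit
            if running >= pick:
                return chromosome
        return population[-1]

    def crossover(a, b, crossover_rate=0.7):
        # Step 3: dependent on the crossover rate, swap the bits of the two
        # chosen chromosomes beyond a randomly chosen point.
        if random.random() < crossover_rate:
            point = random.randint(1, len(a) - 1)
            return a[:point] + b[point:], b[:point] + a[point:]
        return a[:], b[:]

    def mutate(chromosome, mutation_rate=0.001):
        # Step 4: step through the bits and flip each one dependent on the mutation rate.
        return [1 - bit if random.random() < mutation_rate else bit
                for bit in chromosome]

    # Toy usage: fitness simply counts the ones in each chromosome.
    population = [[random.randint(0, 1) for _ in range(8)] for _ in range(10)]
    fitnesses = [sum(c) for c in population]
    mum = roulette_select(population, fitnesses)
    dad = roulette_select(population, fitnesses)
    child_a, child_b = crossover(mum, dad)
    print(mutate(child_a), mutate(child_b))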

 

A GA operates through a simple cycle of stages:

  1. Creation of a "population" of strings,
  2. Evaluation of each string,
  3. Selection of best strings and
  4. Genetic manipulation to create a new population of strings.

-- Artificial Intelligence and Soft Computing: Behavioral and Cognitive Modeling of the Human Brain, Amit Konar

 


 

The initial population is usually created by some random sampling of the search space, generally performed as uniformly as possible.
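For example, if candidate solutions are real-valued vectors constrained to a box, uniform initialization might look like the sketch below; the bounds and population size are arbitrary assumptions for illustration.

    import random

    def init_population(pop_size, bounds):
        # Sample each coordinate uniformly within its (low, high) bounds,
        # so the initial population covers the search space evenly.
        return [[random.uniform(low, high) for (low, high) in bounds]
                for _ in range(pop_size)]

    # 30 individuals in a 2-D search space spanning [-5, 5] x [0, 10].
    population = init_population(30, [(-5.0, 5.0), (0.0, 10.0)])
    print(population[0])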

In general, there are two driving forces behind a GA: selection and variation. The first represents a push toward quality and reduces the genetic diversity of the population. The second, implemented by the recombination and mutation operators, represents a push toward novelty and increases genetic diversity. For a GA to work properly, an appropriate balance between these two forces has to be maintained.
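One common way to tune that balance is through explicit parameters, for example a tournament size controlling selection pressure and a mutation rate controlling diversity, as in the sketch below; both parameter values and the bit-string representation are illustrative assumptions.

    import random

    def tournament_select(population, fitness, tournament_size):
        # Larger tournament_size means a stronger push toward quality
        # (more selection pressure, less genetic diversity retained).
        contestants = random.sample(population, tournament_size)
        return max(contestants, key=fitness)

    def mutate(individual, mutation_rate):
        # Higher mutation_rate means a stronger push toward novelty
        # (more genetic diversity injected into the population).
        return [1 - bit if random.random() < mutation_rate else bit
                for bit in individual]

    # Toy usage: fitness simply counts the ones in each individual.
    population = [[random.randint(0, 1) for _ in range(8)] for _ in range(10)]
    parent = tournament_select(population, fitness=sum, tournament_size=3)
    print(mutate(parent, mutation_rate=0.1))

Raising the tournament size or lowering the mutation rate tilts the search toward exploitation; moving either parameter the other way favors exploration.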

 
