Selection and Fitness

Fitness-based selection is the force that drives a GA toward quality improvements, so designing the fitness function (or evaluation function) is crucial. Two points are worth stressing about fitness computation. First, it typically dominates the computational cost of evolution in real-world problems, often accounting for the vast majority of total run time. Second, the fitness function is frequently the only problem-specific information available to the algorithm, so any available and usable knowledge about the problem should be built into it.

 

Selecting Only The Best: Elitism

Elitism is the simplest and most intuitive method to try, and it is certainly the most direct way to make sure the fittest survive and reproduce. It can be implemented by taking the best few individuals, say the top 5% of the population, and passing them to the next generation unchanged, mating them for offspring, or both.

One of the biggest problems with elitism is over-specialization: the algorithm often converges too quickly on a solution that is less than optimal. In other words, it tends to get caught in local optima.
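As a minimal sketch of the idea (the function name, the list-of-fitnesses representation, and the assumption that fitness is maximized are my own, not from the source):

```python
def elitist_selection(population, fitnesses, elite_frac=0.05):
    """Return the top elite_frac of the population by fitness (maximization)."""
    n_elite = max(1, int(len(population) * elite_frac))
    ranked = sorted(zip(population, fitnesses), key=lambda pair: pair[1], reverse=True)
    return [ind for ind, fit in ranked[:n_elite]]

pop = list(range(20))        # toy individuals
fits = [i * i for i in pop]  # toy fitness: larger is better
elite = elitist_selection(pop, fits)
```

The returned elite can be copied straight into the next generation, mated with one another, or both.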

 

Steady State Selection

Steady state selection works a little like elitism, except that instead of copying only a few of the fittest individuals into the new generation, it copies all but a few of the least fit from the current population. The remainder are then selected for mutation and crossover in the usual way. Reports on its effectiveness vary: some sources find it rarely successful, others have had better luck, but most agree that it works well only in certain situations.
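The survivor step described above can be sketched as follows (function name and the default of replacing two individuals are my own choices, assuming maximization):

```python
def steady_state_survivors(population, fitnesses, n_replace=2):
    """Copy all but the n_replace least-fit individuals into the new generation."""
    ranked = sorted(zip(population, fitnesses), key=lambda pair: pair[1], reverse=True)
    return [ind for ind, fit in ranked[:len(population) - n_replace]]
```

The `n_replace` slots freed up are then filled by offspring produced through mutation and crossover.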

 

Random Selection

"Random" is the best word to describe this type. Basically, a fixed number of individuals are selected at random.

The worst that can happen is that the best individuals might not be chosen at all.

One of the best ways to use random selection is to allow the best individuals a higher chance of being randomly chosen. For instance, if fitness values range between 0 and 100, then for each randomly drawn individual a random number between 0 and 100 can be compared against its fitness, accepting the individual if the number falls at or below its fitness. An individual with a fitness of 75 then has a 75% chance of being accepted, while one with a fitness of 30 has only a 30% chance. Individuals are drawn and tested this way until enough have been selected, which makes the selection probability proportional to each fitness score.
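A sketch of this accept/reject loop (names and the default fitness range of 0 to 100 are my own; note the loop assumes at least one individual has nonzero fitness, or it never terminates):

```python
import random

def select_by_acceptance(population, fitnesses, n, max_fitness=100, rng=random):
    """Draw individuals at random; accept each with probability fitness/max_fitness."""
    chosen = []
    while len(chosen) < n:
        i = rng.randrange(len(population))
        if rng.uniform(0, max_fitness) <= fitnesses[i]:
            chosen.append(population[i])
    return chosen
```

This scheme is sometimes called stochastic acceptance; it avoids having to sum or sort the fitness values.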

 

Roulette Wheel Selection

The individuals are mapped to contiguous segments of a line, such that each individual's segment is proportional to its fitness. A random number is generated, and the individual whose segment spans the random number is selected. The process is repeated until the desired number of individuals is obtained. This technique is analogous to a roulette wheel in which each slice is proportional in size to an individual's fitness.
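The line of segments can be represented as a cumulative sum, with each spin resolved by a binary search (function name is my own; fitnesses are assumed non-negative with a positive total):

```python
import bisect
import itertools
import random

def roulette_select(population, fitnesses, n, rng=random):
    """Spin the wheel n times; each segment's width is its individual's fitness."""
    cumulative = list(itertools.accumulate(fitnesses))
    total = cumulative[-1]
    return [population[bisect.bisect_left(cumulative, rng.uniform(0, total))]
            for _ in range(n)]
```

An individual with zero fitness gets a zero-width segment and is never selected.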

 

Stochastic Universal Sampling

The individuals are mapped to contiguous segments of a line exactly as in roulette-wheel selection, with each segment proportional to its individual's fitness. Then as many equally spaced pointers are placed over the line as there are individuals to be selected. The position of the first pointer is given by a randomly generated number within the first spacing interval; the remaining pointers follow at equal intervals up to the end of the line. Because only one random number is drawn, the resulting sample has much lower variance than repeated roulette-wheel spins.
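A sketch of the single-spin, many-pointers idea (function name is my own; fitnesses are assumed non-negative with a positive total):

```python
import random

def sus_select(population, fitnesses, n, rng=random):
    """Stochastic universal sampling: one spin, n equally spaced pointers."""
    total = sum(fitnesses)
    step = total / n
    start = rng.uniform(0, step)          # the single random number
    picks, cumulative, i = [], 0.0, 0
    for k in range(n):
        pointer = start + k * step
        # Advance to the segment that spans this pointer.
        while i < len(population) - 1 and cumulative + fitnesses[i] <= pointer:
            cumulative += fitnesses[i]
            i += 1
        picks.append(population[i])
    return picks
```

With equal fitnesses, every individual is selected exactly once, which illustrates the low spread of the method.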

 

Truncation Selection

Similar to the general random method, but only the best individuals are selected as parents. The parameter for truncation selection is the truncation threshold "Trunc", which indicates the proportion of the population to be selected as parents; it typically takes values ranging from 10% to 50% (higher percentages can be used, but they tend to make the population stagnate). Individuals below the truncation threshold do not produce offspring.
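A sketch of the threshold step (function name and the 30% default are my own; parents would then be drawn uniformly from the returned pool):

```python
def truncation_pool(population, fitnesses, trunc=0.3):
    """Return the top `trunc` fraction of the population by fitness.

    Everyone below the cutoff produces no offspring.
    """
    cutoff = max(1, int(len(population) * trunc))
    ranked = sorted(zip(population, fitnesses), key=lambda pair: pair[1], reverse=True)
    return [ind for ind, fit in ranked[:cutoff]]
```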

 

Tournament Selection

This one often allows the algorithm to run a little faster than steady state or roulette wheel selection. A small group of individuals, say 5 or 10, is chosen at random, and the fittest member of the group is selected; the rest are returned to the population. This process is repeated until enough individuals have been selected for the next generation. It adds an element of fairness to the game by allowing non-selected individuals multiple chances at selection.
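One tournament round can be sketched as follows (function name and the default tournament size of 5 are my own; the call is simply repeated until the next generation is full):

```python
import random

def tournament_select(population, fitnesses, k=5, rng=random):
    """Draw k entrants at random without replacement; the fittest entrant wins."""
    entrants = rng.sample(range(len(population)), k)
    winner = max(entrants, key=lambda i: fitnesses[i])
    return population[winner]
```

Smaller tournaments apply weaker selection pressure, since weak individuals win more often when they face fewer rivals.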

 

Copying

The copying of objects from one generation to the next ensures that, no matter what the results of crossover, at least some objects in the next generation will be as fit as their predecessors. Otherwise there is a chance of all objects going extinct before the goal is ever reached. At least a small percentage of the population from each generation should be chosen for copying (Koza et al. suggest about 9%).

 
