Selection (genetic algorithm)

Selection is the stage of a genetic algorithm in which individual genomes are chosen from a population for later breeding (using the crossover operator).

A generic selection procedure may be implemented as follows:

  1. The fitness function is evaluated for each individual, providing fitness values, which are then normalized. Normalization means dividing the fitness value of each individual by the sum of all fitness values, so that the sum of all resulting fitness values equals 1.
  2. Accumulated normalized fitness values are computed: the accumulated fitness value of an individual is the sum of its own fitness value plus the fitness values of all the previous individuals; the accumulated fitness of the last individual should be 1, otherwise something went wrong in the normalization step.
  3. A random number R between 0 and 1 is chosen.
  4. The selected individual is the first one whose accumulated normalized value is greater than or equal to R.
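
A minimal sketch of this procedure in Python (the function name and the use of the standard random module are illustrative, not taken from any particular library; fitness values are assumed to be non-negative):

    import random

    def select_one(population, fitnesses):
        """Select one individual using normalized, accumulated fitness values."""
        total = sum(fitnesses)
        # Step 1: normalize the fitness values so that they sum to 1.
        normalized = [f / total for f in fitnesses]
        # Step 2: compute the accumulated normalized fitness values.
        accumulated = []
        running = 0.0
        for nf in normalized:
            running += nf
            accumulated.append(running)
        # Step 3: draw a random number R between 0 and 1.
        r = random.random()
        # Step 4: return the first individual whose accumulated value is >= R.
        for individual, acc in zip(population, accumulated):
            if acc >= r:
                return individual
        return population[-1]  # guard against floating-point rounding

Repeating this draw until enough parents have been chosen yields the fitness proportionate (roulette-wheel) selection described below.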

For many problems the above algorithm might be computationally demanding. A simpler and faster alternative is so-called stochastic acceptance.
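
A sketch of selection by stochastic acceptance, assuming non-negative fitness values with at least one positive value (the function name is illustrative): an individual is drawn uniformly at random and accepted with probability equal to its fitness divided by the maximum fitness in the population.

    import random

    def select_by_stochastic_acceptance(population, fitnesses):
        """Roulette-wheel selection via stochastic acceptance."""
        f_max = max(fitnesses)
        while True:
            # Draw a candidate uniformly at random.
            i = random.randrange(len(population))
            # Accept the candidate with probability fitness / maximum fitness.
            if random.random() <= fitnesses[i] / f_max:
                return population[i]

This avoids computing the accumulated fitness values, at the cost of a random number of rejection rounds per draw.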

If this procedure is repeated until there are enough selected individuals, this selection method is called fitness proportionate selection or roulette-wheel selection. If instead of a single pointer spun multiple times, there are multiple, equally spaced pointers on a wheel that is spun once, it is called stochastic universal sampling. Repeatedly selecting the best individual of a randomly chosen subset is tournament selection. Taking the best half, third or another proportion of the individuals is truncation selection.
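
A sketch of stochastic universal sampling under these definitions, in which n equally spaced pointers are placed on the wheel after a single spin (the function and variable names are illustrative; fitness values are assumed non-negative with a positive sum):

    import random

    def stochastic_universal_sampling(population, fitnesses, n):
        """Select n individuals using n equally spaced pointers and one spin."""
        total = sum(fitnesses)
        distance = total / n                 # spacing between the pointers
        start = random.uniform(0, distance)  # the single "spin"
        pointers = [start + i * distance for i in range(n)]
        selected = []
        cumulative = 0.0
        index = 0
        for p in pointers:
            # Advance to the first individual whose cumulative fitness covers the pointer.
            while cumulative + fitnesses[index] < p:
                cumulative += fitnesses[index]
                index += 1
            selected.append(population[index])
        return selected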

There are other selection algorithms that do not consider all individuals for selection, but only those with a fitness value that is higher than a given (arbitrary) constant. Other algorithms select from a restricted pool where only a certain percentage of the individuals are allowed, based on fitness value.
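
As an illustration, a sketch of one such pool-restricted scheme, in which only individuals above a fitness threshold are eligible and a parent is then drawn uniformly from that pool (the threshold and the uniform draw are illustrative assumptions, not a standard named operator):

    import random

    def select_from_restricted_pool(population, fitnesses, threshold):
        """Draw a parent uniformly from the individuals whose fitness exceeds a threshold."""
        pool = [ind for ind, f in zip(population, fitnesses) if f > threshold]
        if not pool:           # no individual exceeds the threshold
            pool = population  # fall back to the whole population
        return random.choice(pool)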

Retaining the best individuals in a generation unchanged in the next generation is called elitism or elitist selection. It is a successful (slight) variant of the general process of constructing a new population.

Methods of Selection (Genetic Algorithm)

Roulette Wheel Selection

In roulette wheel selection, the probability of choosing an individual for breeding the next generation is proportional to its fitness: the better the fitness, the higher the chance of that individual being chosen. Choosing individuals can be depicted as spinning a roulette wheel that has as many pockets as there are individuals in the current generation, with sizes proportional to their selection probabilities. The probability of choosing individual i is p_i = f_i / (f_1 + f_2 + … + f_N), where f_i is the fitness of i and N is the size of the current generation (note that in this method one individual can be drawn multiple times). If we are working on a minimization problem, it must first be transformed into a maximization problem, which can easily be done by taking the inverse of the fitness.
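
A minimal sketch of that transformation, assuming strictly positive cost values (the inversion 1/cost is one common choice; negating the cost is another):

    def costs_to_fitnesses(costs):
        """Turn a minimization problem into a maximization one by inverting the cost."""
        # Assumes all costs are strictly positive; a smaller cost yields a larger fitness.
        # Example: costs 2, 4, 8 become fitnesses 0.5, 0.25, 0.125, so the lowest-cost
        # individual receives the largest share of the roulette wheel.
        return [1.0 / c for c in costs]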

Rank Selection

Rank selection also works with negative fitness values and is mostly used when the individuals in the population have very close fitness values (this usually happens towards the end of the run). Under fitness proportionate selection, such close values give each individual an almost equal share of the pie, and hence every individual, no matter how fit relative to the others, has approximately the same probability of being selected as a parent. This leads to a loss of selection pressure towards fitter individuals and makes the GA prone to poor parent selections in such situations. Rank selection avoids this by basing the selection probability on each individual's rank in the fitness ordering rather than on its absolute fitness value.
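
A sketch of linear rank selection under that idea (the linear weighting 1 … N is one simple choice; other schemes use a tunable selection pressure):

    import random

    def rank_selection(population, fitnesses):
        """Select one individual with probability proportional to its fitness rank."""
        # Sort indices from worst to best fitness; negative fitness values are fine.
        order = sorted(range(len(population)), key=lambda i: fitnesses[i])
        # The worst individual gets weight 1, the best gets weight N.
        weights = [0] * len(population)
        for rank, i in enumerate(order, start=1):
            weights[i] = rank
        # Roulette-wheel draw over the rank weights instead of the raw fitness values.
        return random.choices(population, weights=weights, k=1)[0]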

Steady State Selection

This is not a particular method of selecting parents. The main idea of this selection is that a large part of the chromosomes should survive to the next generation.

In every generation, a few (good, high-fitness) chromosomes are selected for creating new offspring. Then some (bad, low-fitness) chromosomes are removed and the new offspring are placed in their place. The rest of the population survives to the new generation.
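
A sketch of one steady-state generation step under these assumptions (the number of replaced individuals and the make_offspring helper are illustrative placeholders; make_offspring is assumed to return n_replace new chromosomes):

    def steady_state_step(population, fitnesses, n_replace, make_offspring):
        """Replace the n_replace worst individuals by offspring bred from the best ones."""
        # Indices sorted from best to worst fitness.
        order = sorted(range(len(population)), key=lambda i: fitnesses[i], reverse=True)
        best = [population[i] for i in order[:n_replace]]
        worst = set(order[-n_replace:])
        offspring = make_offspring(best)  # e.g. crossover and mutation of the best few
        # The rest of the population survives unchanged; the worst are replaced.
        survivors = [ind for i, ind in enumerate(population) if i not in worst]
        return survivors + list(offspring)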

Tournament Selection

Tournament selection is a method of choosing an individual from a set of individuals: a small group of individuals is drawn at random from the population, and the one with the best fitness among them, the winner of the tournament, is selected to perform crossover.
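
A sketch of tournament selection with tournament size k (k is a tunable parameter, commonly a small value such as 2 or 3; larger k increases the selection pressure):

    import random

    def tournament_selection(population, fitnesses, k):
        """Draw k random individuals and return the fittest of them (the tournament winner)."""
        contenders = random.sample(range(len(population)), k)
        winner = max(contenders, key=lambda i: fitnesses[i])
        return population[winner]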

Elitism Selection

Often, to get better results, strategies with partial reproduction are used. One of them is elitism, in which a small portion of the best individuals from the last generation is carried over, without any changes, to the next one.
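
A sketch of elitism under these assumptions (the elite count and the breed_rest helper are illustrative placeholders; breed_rest is assumed to fill the remaining slots by ordinary selection and crossover):

    def next_generation_with_elitism(population, fitnesses, n_elite, breed_rest):
        """Carry the n_elite best individuals over unchanged and breed the remaining slots."""
        order = sorted(range(len(population)), key=lambda i: fitnesses[i], reverse=True)
        elites = [population[i] for i in order[:n_elite]]
        # The rest of the new generation is produced by the usual selection and crossover.
        rest = breed_rest(population, fitnesses, len(population) - n_elite)
        return elites + list(rest)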

Boltzmann Selection

In Boltzmann selection, a continuously varying temperature controls the rate of selection according to a preset schedule. The temperature starts out high, which means that the selection pressure is low. The temperature is gradually lowered, which gradually increases the selection pressure, thereby allowing the GA to narrow in more closely to the best part of the search space while maintaining the appropriate degree of diversity.[1]
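
A sketch of a Boltzmann-weighted draw at a given temperature (the exponential weighting exp(fitness / temperature) is the standard ingredient; the cooling schedule itself is left to the caller and is an assumption of this sketch):

    import math
    import random

    def boltzmann_selection(population, fitnesses, temperature):
        """Select one individual with probability proportional to exp(fitness / temperature)."""
        # High temperature: nearly uniform selection (low selection pressure).
        # Low temperature: strongly favors the fittest (high selection pressure).
        m = max(fitnesses)  # subtract the maximum for numerical stability
        weights = [math.exp((f - m) / temperature) for f in fitnesses]
        return random.choices(population, weights=weights, k=1)[0]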

References

  1. ^ Sivanandam, S. N.; Deepa, S. N. (2013). Principles of Soft Computing. New Delhi: Wiley. ISBN 978-1-118-54680-2. OCLC 891566849.