Genetic operator

From Wikipedia, the free encyclopedia

A genetic operator is an operator used in genetic algorithms to guide the algorithm towards a solution to a given problem. There are three main types of operators (mutation, crossover and selection), which must work in conjunction with one another in order for the algorithm to be successful. Genetic operators are used to create and maintain genetic diversity (mutation operator), combine existing solutions (also known as chromosomes) into new solutions (crossover) and select between solutions (selection).[1] In his book discussing the use of genetic programming for the optimization of complex problems, computer scientist John Koza has also identified an 'inversion' or 'permutation' operator; however, its effectiveness has never been conclusively demonstrated and it is rarely discussed.[2][3]

Mutation (or mutation-like) operators are said to be unary operators, as they only operate on one chromosome at a time. In contrast, crossover operators are said to be binary operators, as they operate on two chromosomes at a time, combining two existing chromosomes into one new chromosome.[4]

Operators

Genetic variation is a necessity for the process of evolution. Genetic operators used in genetic algorithms are analogous to those in the natural world: survival of the fittest, or selection; reproduction (crossover, also called recombination); and mutation.

Selection

Selection operators give preference to better solutions (chromosomes), allowing them to pass on their 'genes' to the next generation of the algorithm. The best solutions are determined using some form of objective function (also known as a 'fitness function' in genetic algorithms) before being passed to the crossover operator. Different methods for choosing the best solutions exist, for example, fitness proportionate selection and tournament selection; different methods may choose different solutions as being 'best'. The selection operator may also simply copy the best solutions from the current generation directly into the next generation without altering them; this is known as elitism or elitist selection.[1][5]
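
As an illustration, the sketch below shows how the two named methods, tournament selection and fitness proportionate (roulette wheel) selection, might be implemented for chromosomes represented as Python lists; the function and parameter names (tournament_select, roulette_select, tournament_size) are illustrative assumptions rather than standard terminology.

```python
import random

def tournament_select(population, fitness, tournament_size=3):
    """Tournament selection: return the fittest of a few randomly
    chosen candidates."""
    contenders = random.sample(population, tournament_size)
    return max(contenders, key=fitness)

def roulette_select(population, fitness):
    """Fitness proportionate (roulette wheel) selection: each chromosome
    is chosen with probability proportional to its fitness (assumed
    non-negative here)."""
    weights = [fitness(chromosome) for chromosome in population]
    return random.choices(population, weights=weights, k=1)[0]
```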

Crossover

Crossover is the process of taking more than one parent solution (chromosome) and producing a child solution from them. By recombining portions of good solutions, the genetic algorithm is more likely to create a better solution.[1] As with selection, there are a number of different methods for combining the parent solutions, including the edge recombination operator (ERO) and the 'cut and splice' and 'uniform crossover' methods. The crossover method is often chosen to closely match the chromosome's representation of the solution; this may become particularly important when variables are grouped together as building blocks, which might be disrupted by a non-respectful crossover operator. Similarly, crossover methods may be particularly suited to certain problems; the ERO is generally considered a good option for solving the travelling salesman problem.[6]
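
For bit-string chromosomes, two of the simpler methods, single-point crossover and the uniform crossover mentioned above, might be sketched as follows; the function names are illustrative assumptions for this example.

```python
import random

def one_point_crossover(parent_a, parent_b):
    """Cut both parents at the same random point and join the head of
    one to the tail of the other."""
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]

def uniform_crossover(parent_a, parent_b):
    """Take each gene from either parent with equal probability."""
    return [a if random.random() < 0.5 else b
            for a, b in zip(parent_a, parent_b)]
```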

Mutation

The mutation operator encourages genetic diversity amongst solutions and attempts to prevent the genetic algorithm from converging to a local minimum by stopping the solutions from becoming too close to one another. In mutating the current pool of solutions, a given solution may change entirely from its previous form; the genetic algorithm can therefore reach an improved solution solely through the mutation operator.[1] Again, different methods of mutation may be used; these range from a simple bit mutation (flipping random bits in a binary string chromosome with some low probability) to more complex mutation methods, which may replace genes in the solution with random values chosen from the uniform distribution or the Gaussian distribution. As with the crossover operator, the mutation method is usually chosen to match the representation of the solution within the chromosome.
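
The simpler mutation methods described above might be sketched as follows, one for binary chromosomes and one for real-coded chromosomes; the mutation_rate and sigma parameters are illustrative assumptions.

```python
import random

def bit_flip_mutation(chromosome, mutation_rate=0.01):
    """Flip each bit of a binary chromosome with a small probability."""
    return [1 - gene if random.random() < mutation_rate else gene
            for gene in chromosome]

def gaussian_mutation(chromosome, mutation_rate=0.1, sigma=0.1):
    """Perturb each real-valued gene with Gaussian noise, again with a
    small per-gene probability."""
    return [gene + random.gauss(0, sigma)
            if random.random() < mutation_rate else gene
            for gene in chromosome]
```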

Combining operators

While each operator acts to improve the solutions produced by the genetic algorithm when working individually, the operators must work in conjunction with each other for the algorithm to be successful in finding a good solution. Using the selection operator on its own will tend to fill the solution population with copies of the best solution from the population. If the selection and crossover operators are used without the mutation operator, the algorithm will tend to converge to a local minimum, that is, a good but sub-optimal solution to the problem. Using the mutation operator on its own leads to a random walk through the search space. Only by using all three operators together can the genetic algorithm become a noise-tolerant hill-climbing algorithm, yielding good solutions to the problem.[1]
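
The following self-contained sketch shows one way the three operators might be combined in a generational loop, using the toy 'one-max' problem (maximise the number of 1 bits in the chromosome) purely for illustration; all names and parameter values are assumptions made for this example.

```python
import random

def run_ga(pop_size=50, length=20, generations=100,
           tournament_size=3, mutation_rate=0.01):
    fitness = sum  # one-max: fitness is simply the number of 1 bits
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]

    for _ in range(generations):
        # Elitism: keep the single best chromosome unchanged.
        new_population = [max(population, key=fitness)]
        while len(new_population) < pop_size:
            # Selection: tournament among a few random candidates.
            mum = max(random.sample(population, tournament_size), key=fitness)
            dad = max(random.sample(population, tournament_size), key=fitness)
            # Crossover: single cut point, join head of one parent to tail of the other.
            point = random.randint(1, length - 1)
            child = mum[:point] + dad[point:]
            # Mutation: flip each bit with a small probability.
            child = [1 - g if random.random() < mutation_rate else g
                     for g in child]
            new_population.append(child)
        population = new_population

    return max(population, key=fitness)

print(run_ga())  # typically converges to, or close to, a string of all 1s
```

Dropping the mutation step from this loop tends to leave the population stuck on a single sub-optimal string, while dropping selection and crossover reduces it to a random walk, mirroring the behaviour described above.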

References

  1. ^ a b c d e "Introduction to Genetic Algorithms". Archived from the original on 11 August 2015. Retrieved 20 August 2015.
  2. ^ Koza, John R. (1996). Genetic Programming: On the Programming of Computers by Means of Natural Selection (6th printing ed.). Cambridge, Mass.: MIT Press. ISBN 0-262-11170-5.
  3. ^ "Genetic programming operators". Retrieved 20 August 2015.
  4. ^ "Genetic operators". Archived from the original on 30 December 2017. Retrieved 20 August 2015.
  5. ^ "Introduction to Genetic Algorithm". Retrieved 20 August 2015.
  6. ^ Schaffer, J. David, ed. (1991). Proceedings of the Third International Conference on Genetic Algorithms, George Mason University, June 4–7, 1989 (2nd ed.). San Mateo, Calif.: Kaufmann. ISBN 1-55860-066-3.