Genetic algorithms are a good way to attack problems about whose structure little is known. Because they make few assumptions about the search space, they can be applied to a very wide range of problems: all you need is a way to evaluate how well a candidate solution meets your aims, and a genetic algorithm can evolve a high-quality solution. Genetic algorithms use the principles of selection and evolution to produce several candidate solutions to a given problem.
When are Genetic Algorithms Useful?
There are at least three situations where genetic algorithms are useful:
The objective function is not smooth (i.e., not differentiable).
There are multiple local optima.
There is a large number of parameters (the meaning of "large" keeps changing).
A typical genetic algorithm requires:
1) A genetic representation (e.g., an array of bits) of the solution domain,
2) A fitness function to evaluate the solution domain. A fitness function is a particular type of objective function that is used to summarize, as a single figure of merit, how close a given design solution is to achieving the set aims.
Once the genetic representation and the fitness function are defined, a GA proceeds to initialize a population of solutions and then to improve it through repetitive application of the mutation, crossover, inversion and selection operators.
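As a minimal illustrative sketch (not part of the original formulation), a candidate solution can be encoded as a fixed-length bit string and scored by a fitness function. Here the fitness is simply the number of 1-bits (the classic "OneMax" toy problem); any problem-specific measure of quality could be substituted.

import random

GENOME_LENGTH = 16  # illustrative choice

def random_genome(length=GENOME_LENGTH):
    # Genetic representation: a fixed-length list of bits.
    return [random.randint(0, 1) for _ in range(length)]

def onemax_fitness(genome):
    # Fitness function: the number of 1-bits in the genome.
    return sum(genome)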
Simple Genetic Algorithm Pseudo-code
function SimpleGeneticAlgorithm()
{
    Initialize population;
    Calculate fitness of each individual;
    While (termination criteria not met)
    {
        Selection;
        Crossover;
        Mutation;
        Calculate fitness of each individual;
    }
}
Genetic Algorithm Steps:
1. Initialization
The population size depends on the nature of the problem, but typically contains several hundred to several thousand candidate solutions. Often, the initial population is generated randomly, covering the entire range of possible solutions (the search space). Occasionally, the solutions may be "seeded" in areas where optimal solutions are likely to be found.
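One way to implement the initialization step, reusing the hypothetical random_genome helper sketched earlier; the population size of 100 is an arbitrary illustrative value.

def initialize_population(size=100):
    # Generate `size` random genomes spread across the search space.
    return [random_genome() for _ in range(size)]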
2. Selection
During each successive generation, a portion of the existing population is selected to breed a new generation. Individual solutions are selected through a fitness-based process, where fitter solutions (as measured by a fitness function) are typically more likely to be selected. The fitness function is defined over the genetic representation and measures the quality of the represented solution. The fitness function is always problem dependent.
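The roulette wheel scheme described in the case study at the end of this section is one such fitness-based process. As an alternative, here is a minimal sketch of tournament selection; the fitness function is passed in explicitly, and the tournament size of 3 is an arbitrary choice.

import random

def tournament_selection(population, fitness_fn, k=3):
    # Pick k individuals at random and return the fittest of them.
    contestants = random.sample(population, k)
    return max(contestants, key=fitness_fn)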
3. Genetic operators
The next step is to generate a second generation population of solutions from those selected through a combination of genetic operators: crossover (also called recombination), and mutation.
a) Crossover (also called Recombination)
Crossover is a genetic operator used to vary the programming of a chromosome or chromosomes from one generation to the next. It is analogous to reproduction and biological crossover, upon which genetic algorithms are based. Crossover is the process of taking two or more parent solutions and producing one or more child solutions from them.
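A minimal sketch of single-point crossover on bit strings, assuming two parents of equal length; other variants such as two-point or uniform crossover follow the same idea.

import random

def single_point_crossover(parent_a, parent_b):
    # Cut both parents at the same random point and swap the tails.
    point = random.randint(1, len(parent_a) - 1)
    child_a = parent_a[:point] + parent_b[point:]
    child_b = parent_b[:point] + parent_a[point:]
    return child_a, child_b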
b) Mutation
Mutation alters one or more gene values in a chromosome from its initial state. A mutated solution may differ substantially from the previous solution, so mutation can lead the GA to better solutions. Mutation helps the algorithm avoid local minima by preventing the population of chromosomes from becoming too similar to one another, which would slow or even halt evolution.
The mutation of bit strings occurs through bit flips at random positions.
Example:
Before mutation: 1 0 1 0 0 1 0
After mutation:  1 0 1 0 1 1 0
Here the fifth bit has been flipped from 0 to 1.
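A minimal sketch of bit-flip mutation matching the example above; the per-bit mutation probability of 0.01 is an arbitrary illustrative value.

import random

def mutate(genome, rate=0.01):
    # Flip each bit independently with probability `rate`.
    return [1 - bit if random.random() < rate else bit for bit in genome]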
4. Termination
This generational process is repeated until a termination condition has been reached. Common terminating conditions (a combined check is sketched after this list) are:
- A solution is found that satisfies minimum criteria
- Fixed number of generations reached
- Allocated budget (computation time/money) reached
- The highest-ranking solution's fitness has reached a plateau, such that successive iterations no longer produce better results
- Manual inspection
- Combinations of the above
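A minimal sketch of a combined termination test covering a generation limit, a satisfactory fitness value, and a fitness plateau; all thresholds are arbitrary illustrative values, and `history` is assumed to hold the best fitness of each past generation.

def should_terminate(generation, best_fitness, history,
                     max_generations=200, target_fitness=16, plateau_length=20):
    # Stop when the generation limit is hit or a good-enough solution is found.
    if generation >= max_generations or best_fitness >= target_fitness:
        return True
    # Plateau: no improvement over the last `plateau_length` generations.
    if len(history) > plateau_length:
        return max(history[-plateau_length:]) <= history[-plateau_length - 1]
    return False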
Case Study: Roulette Wheel Selection
Parents are selected according to their fitness: the better a chromosome is, the greater its chance of being selected. Imagine a roulette wheel on which all the chromosomes in the population are placed. The size of each chromosome's section of the wheel is proportional to the value of its fitness function: the bigger the value, the larger the section.
A marble is thrown onto the roulette wheel, and the chromosome where it stops is selected. Clearly, chromosomes with larger fitness values will be selected more often.
This process can be described by the following algorithm.
[Sum] Calculate the sum S of all chromosome fitnesses in the population.
[Select] Generate a random number r from the interval (0, S).
[Loop] Go through the population, accumulating the fitnesses into a running sum s. When s becomes greater than r, stop and return the chromosome you are at. Note that the [Sum] step is performed only once for each population.
Python code for Roulette wheel selection
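Below is a minimal sketch implementing the three steps above ([Sum], [Select], [Loop]); it assumes fitness values are non-negative and takes the fitness function as an argument.

import random

def roulette_wheel_selection(population, fitness_fn):
    # [Sum] Total fitness S of the whole population
    # (in practice this can be computed once and reused for a whole generation).
    total = sum(fitness_fn(chromosome) for chromosome in population)
    # [Select] Random point r on the wheel, drawn from the interval (0, S).
    r = random.uniform(0, total)
    # [Loop] Walk through the population, accumulating fitness until the running sum passes r.
    running_sum = 0.0
    for chromosome in population:
        running_sum += fitness_fn(chromosome)
        if running_sum > r:
            return chromosome
    return population[-1]  # Fallback for floating-point rounding at the wheel's edge.

Calling roulette_wheel_selection(population, onemax_fitness) twice, for example, yields two parents that can then be handed to the crossover operator.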