
Understanding Genetic Algorithm (GA)

Updated: May 10, 2023

Adisorn Owatsiriwong

Computer programs that "evolve" in ways that resemble natural selection can solve complex problems even their creators do not fully understand.

by John H. Holland

The genetic algorithm (GA) is one of the most widely used meta-heuristics and is considered a sub-class of population-based search algorithms. GA was formulated by J. H. Holland and his colleagues in the 1970s by mimicking the concept of natural evolution. GA is so general that it can be applied to various optimization problems that other methods cannot easily solve, such as discrete and combinatorial problems. It is also suitable for knapsack-like problems, where items must be selected under capacity constraints. Like its meta-heuristic counterparts, GA does not require calculating derivatives of the objective function, which is also useful when the objective function is a black box. Generally, GA comprises four operations:

  1. Selection

  2. Crossover

  3. Mutation

  4. Elitism

In the modern GA approach, a fourth operation, so-called 'elitism', is added by transferring some of the best parents to the next generation. Elitism ensures that the current best solution is carried into the next generation, which reduces the fluctuation of the solution across iterations.
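The elitism step described above can be sketched as follows. This is a minimal Python stand-in for the post's MATLAB code; the helper name apply_elitism and the rule of replacing the worst children with the elite parents are my own assumptions:

```python
def apply_elitism(parents, children, fitness, elite_ratio=0.1):
    """Carry the best parents into the next generation (hypothetical helper).

    parents, children: lists of chromosomes; fitness: a callable scoring one chromosome.
    The population size is preserved: the elite parents displace the worst children."""
    n_elite = max(1, int(elite_ratio * len(parents)))
    # Rank parents by fitness, best first, and keep the top n_elite
    elites = sorted(parents, key=fitness, reverse=True)[:n_elite]
    # Keep only the best children, dropping as many as there are elites
    survivors = sorted(children, key=fitness, reverse=True)[:len(children) - n_elite]
    return elites + survivors
```

With elite_ratio = 0.1 (as in Video 1 below), one chromosome in ten survives unchanged, which is enough to make the best fitness non-decreasing across generations.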

The terminology of GA is shown in the figure below.

Population-based vs. neighborhood-based search algorithms

A population-based search algorithm employs a randomly generated, multi-point population inside the search domain. The existing population is used as a starting point and evolves into the next generation, which usually provides a better solution. Well-known population-based search algorithms include the Genetic Algorithm and most Particle Swarm methods.

A neighborhood search algorithm starts from a single solution and searches the neighboring region for a better one. After several iterations, the solution moves closer to the optimum. Examples of neighborhood search algorithms are Simulated Annealing and Tabu Search.


1. Population (or generation) is the set of candidate solutions at each iteration.

2. A chromosome is one candidate solution.

3. A gene is a single decision variable encoded inside a chromosome.

4. A fitness value is computed for each chromosome; chromosomes with larger fitness have a higher chance of being selected for, or retained in, the next generation.

The first generation of the population is created using the randi(5) function in MATLAB. The fitness value of each chromosome is then computed, followed by its probability of selection.
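Since the MATLAB listing is not reproduced on this page, here is a rough Python equivalent of this initialization step. The function names init_population and selection_probabilities are my own; randi(5) corresponds to drawing integers from 1 to 5:

```python
import random

def init_population(pop_size, n_genes, hi=5):
    # MATLAB's randi(5) draws integers from 1..5; random.randint is also inclusive
    return [[random.randint(1, hi) for _ in range(n_genes)]
            for _ in range(pop_size)]

def selection_probabilities(population, fitness):
    # Selection probability proportional to fitness (assumes positive fitness values)
    f = [fitness(c) for c in population]
    total = sum(f)
    return [fi / total for fi in f]
```

The probabilities sum to 1 by construction, which is what the roulette-wheel sampling in the next step relies on.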

Using the roulette-wheel technique, MATLAB's rand() function is used to sample parents against the cumulative selection probabilities. In this example, the light blue and yellow chromosomes are selected as parents to generate two children. Sampling then continues for the following pairs of parents; the code must take care not to select two identical parents.
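Roulette-wheel selection can be sketched in Python as follows. This is a stand-in for the post's MATLAB code; the helper names and the re-sampling rule for avoiding identical parents are my own assumptions:

```python
import random

def roulette_wheel(population, probs):
    """Sample one chromosome: walk the cumulative probabilities until rand() falls inside."""
    r = random.random()
    cum = 0.0
    for chrom, p in zip(population, probs):
        cum += p
        if r <= cum:
            return chrom
    return population[-1]  # guard against floating-point round-off

def pick_parents(population, probs):
    # Re-sample the second parent until it differs from the first, as the post advises
    p1 = roulette_wheel(population, probs)
    p2 = roulette_wheel(population, probs)
    while p2 is p1:
        p2 = roulette_wheel(population, probs)
    return p1, p2
```

Fitter chromosomes occupy a wider slice of the cumulative range, so they are drawn more often, but every chromosome retains a nonzero chance of selection.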

The selected parents are recombined (crossover). In this example, a single-point crossover at the 3rd position is applied, and the children for the next generation are generated. Note that the population size is unchanged. Without elitism, the first generation is replaced entirely by its children.
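Single-point crossover amounts to swapping the gene tails of the two parents after the cut point. A minimal Python sketch (the function name is my own):

```python
def single_point_crossover(p1, p2, point):
    """Swap the tails of two parent chromosomes after `point` genes (single-point crossover)."""
    c1 = p1[:point] + p2[point:]  # head of p1, tail of p2
    c2 = p2[:point] + p1[point:]  # head of p2, tail of p1
    return c1, c2
```

Each pair of parents thus produces exactly two children, which keeps the population size constant.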

With a defined mutation probability, e.g., pm = 0.5, 50% of the children are mutated.
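A mutation step consistent with this description can be sketched as follows. This is a Python stand-in, and the choice to mutate a single randomly chosen gene per selected child is my own assumption:

```python
import random

def mutate(chromosome, pm=0.5, hi=5):
    """With probability pm, replace one randomly chosen gene with a fresh random value in 1..hi."""
    child = chromosome[:]  # copy so the input chromosome is left untouched
    if random.random() < pm:
        i = random.randrange(len(child))
        child[i] = random.randint(1, hi)
    return child
```

Mutation injects fresh genetic material, which helps the search escape premature convergence on a local optimum.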

After the n-th iteration (generation), the best solution (chromosome) is found with fitness = 20.

The following MATLAB code is written to solve the problem.
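The MATLAB listing itself did not survive on this page. As a stand-in, below is a minimal, self-contained Python sketch of the whole loop. The objective here is a toy assumption of mine (maximize the sum of four integer genes drawn from 1..5, so the best attainable fitness is 20, matching the fitness reported above); all parameter names and defaults are illustrative:

```python
import random

def run_ga(pop_size=10, n_genes=4, hi=5, pm=0.5,
           elite_ratio=0.1, n_gen=50, seed=0):
    """Toy GA maximizing the sum of integer genes in 1..hi (illustrative objective)."""
    rng = random.Random(seed)
    fitness = sum
    # Initial generation: randi-style random integer chromosomes
    pop = [[rng.randint(1, hi) for _ in range(n_genes)]
           for _ in range(pop_size)]
    n_elite = max(1, int(elite_ratio * pop_size))

    def roulette(probs):
        # Roulette-wheel selection over cumulative probabilities
        r, cum = rng.random(), 0.0
        for chrom, p in zip(pop, probs):
            cum += p
            if r <= cum:
                return chrom
        return pop[-1]

    for _ in range(n_gen):
        f = [fitness(c) for c in pop]
        total = sum(f)
        probs = [fi / total for fi in f]
        children = []
        while len(children) < pop_size:
            p1, p2 = roulette(probs), roulette(probs)
            point = rng.randrange(1, n_genes)   # single-point crossover
            for child in (p1[:point] + p2[point:], p2[:point] + p1[point:]):
                if rng.random() < pm:           # mutate one random gene
                    child[rng.randrange(n_genes)] = rng.randint(1, hi)
                children.append(child)
        # Elitism: the best parents displace the worst children
        elites = sorted(pop, key=fitness, reverse=True)[:n_elite]
        pop = elites + sorted(children, key=fitness,
                              reverse=True)[:pop_size - n_elite]
    best = max(pop, key=fitness)
    return best, fitness(best)
```

Setting elite_ratio to 0 would reproduce the "without elitism" behavior shown in Video 2, where the best fitness can fluctuate between generations.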

Video 1: The solution with Elitism ratio = 0.1

Video 2: The solution without Elitism



