Genetic algorithms: Basic concepts and applications for model identification and process optimization

In this book, the idea of applying the biological principle of natural evolution (survival of the fittest) to artificial systems is pursued. The idea was introduced more than three decades ago and has seen impressive growth in applications to biochemical processes in recent years. As a generic embodiment of this principle, GAs [59-67] are considered in this research. GAs are optimization methods that operate on a set of candidate solutions called a "population". Each candidate solution of a problem is represented by a data structure known as an "individual". An individual has two parts: a chromosome and a fitness. The chromosome represents a possible solution of the optimization problem ("chromosome" and "individual" are sometimes used interchangeably in the literature) and is made up of genes. The fitness indicates how well the individual solves the problem.
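Purely as an illustration of these terms, the Python sketch below shows one way an individual can be represented as a chromosome of real-valued genes paired with a fitness value. The class Individual, the helper random_individual, and the choice of real-valued genes are assumptions of this sketch, not part of the text.

```python
import random
from dataclasses import dataclass

@dataclass
class Individual:
    # Chromosome: a possible solution, made up of genes (here, real numbers,
    # one per decision variable of the optimization problem).
    chromosome: list
    # Fitness: how well this candidate solution solves the problem;
    # it is assigned later by the evaluation function.
    fitness: float = float("-inf")

def random_individual(n_genes, lower=-1.0, upper=1.0):
    """Create an individual with a randomly initialized chromosome."""
    return Individual([random.uniform(lower, upper) for _ in range(n_genes)])
```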

Though there are several variants of GAs, the basic elements are common: a chromosomal representation of solutions; an evaluation function that mimics the role of the environment by rating solutions in terms of their current fitness; genetic operators that alter the composition of offspring during reproduction; and values of the algorithmic parameters (population size, probabilities of applying the genetic operators, etc.). A template of a general formulation of a GA is given in Figure 1.5. The algorithm begins with random initialization of the population. The transition from one population to the next takes place via the application of the genetic operators: crossover, mutation and selection. Crossover exchanges the genetic material (genes) of two individuals, creating two offspring. Mutation arbitrarily changes the genetic material of an individual. The fittest individuals are chosen to go to the next population through the process of selection. In the example shown in Figure 1.5, the GA relies on user-specified conditions that determine when crossover and mutation are applied, when a new population is created, and when the whole process terminates.
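The following minimal sketch follows this general loop. It is not the formulation of Figure 1.5: the concrete operators (one-point crossover, Gaussian mutation, tournament selection), the parameter values, and the toy evaluation function are assumptions chosen only to make the template concrete and runnable.

```python
import random

def evaluate(chromosome):
    # Evaluation function playing the role of the environment: an illustrative
    # placeholder that rewards chromosomes close to a known target vector.
    target = [0.5, -1.0, 2.0]
    return -sum((g - t) ** 2 for g, t in zip(chromosome, target))

def crossover(parent_a, parent_b):
    # One-point crossover: exchange genes of two individuals, creating two offspring.
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:], parent_b[:point] + parent_a[point:]

def mutate(chromosome, p_mut=0.1, scale=0.5):
    # Mutation arbitrarily perturbs individual genes with a small probability.
    return [g + random.gauss(0.0, scale) if random.random() < p_mut else g
            for g in chromosome]

def select(population, fitnesses, k=3):
    # Tournament selection: the fittest of k randomly drawn individuals is chosen.
    contenders = random.sample(range(len(population)), k)
    best = max(contenders, key=lambda i: fitnesses[i])
    return population[best]

def run_ga(pop_size=30, n_genes=3, n_generations=100, p_cross=0.8):
    # Random initialization of the population.
    population = [[random.uniform(-5, 5) for _ in range(n_genes)]
                  for _ in range(pop_size)]
    for _ in range(n_generations):
        fitnesses = [evaluate(ind) for ind in population]
        offspring = []
        while len(offspring) < pop_size:
            # Selection pressure: fitter individuals are more likely to reproduce.
            parent_a = select(population, fitnesses)
            parent_b = select(population, fitnesses)
            if random.random() < p_cross:
                child_a, child_b = crossover(parent_a, parent_b)
            else:
                child_a, child_b = parent_a[:], parent_b[:]
            offspring.extend([mutate(child_a), mutate(child_b)])
        # The offspring form the next population (termination here is simply
        # a fixed number of generations).
        population = offspring[:pop_size]
    fitnesses = [evaluate(ind) for ind in population]
    best = max(range(pop_size), key=lambda i: fitnesses[i])
    return population[best], fitnesses[best]
```

Calling run_ga() returns the best chromosome found and its fitness; with a real problem, the placeholder evaluate would be replaced by the model or process objective to be optimized.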

GAs are stochastic global search methods that simultaneously evaluate many points in the parameter space. The selection pressure drives the population towards better solutions, while mutation can prevent the GA from becoming stuck in local optima. Hence, a GA is more likely to converge towards a global solution. GAs mimic evolution, and they often behave like evolution in nature. They arose from the search for robustness: natural systems are robust, both efficient and effective, as they adapt to a wide variety of environments. Generally speaking, GAs are applied to problems in which severe nonlinearities and discontinuities exist, or in which the search spaces are too large to be exhaustively explored. In summary, the general features of GAs are listed below [69]:

• GAs operate on a population of possible solutions (individuals) instead of a single individual. Thus, the search is carried out in parallel, as illustrated in the sketch after this list.
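As a hedged sketch of this population-level parallelism, the snippet below scores every individual independently across worker processes. The function names and the placeholder evaluation are assumptions of this sketch; any per-individual objective could be substituted.

```python
from concurrent.futures import ProcessPoolExecutor
import random

def evaluate(chromosome):
    # Illustrative objective; any score computed per individual works here.
    return -sum(g ** 2 for g in chromosome)

def evaluate_population(population):
    # Each individual is scored independently of the others, so the whole
    # population can be evaluated concurrently.
    with ProcessPoolExecutor() as executor:
        return list(executor.map(evaluate, population))

if __name__ == "__main__":
    population = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(30)]
    print(evaluate_population(population))
```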
