Evolutionary Computation Applied To Computational Biology

Evolutionary computation is, like neural networks, an example par excellence of an information-processing paradigm that was originally developed and exhibited by nature and only later discovered by man, who subsequently transformed the general principle into computational algorithms to be put to work on computers. Nature makes impressive use of the principle of genetic inheritance and evolution. Application of the simple concept of performance-based reproduction of individuals ("survival of the fittest") led to the rise of well-adapted organisms that can endure in potentially adverse environments. Mutually beneficial interdependencies, cooperation, and even apparently altruistic behavior can emerge solely through evolution.

Evolutionary computation comprises four main areas: genetic algorithms (GAs) [44], evolution strategies [45], genetic programming [46], and simulated annealing [47]. GAs and evolution strategies emerged at about the same time in the United States of America and Germany, respectively. Both techniques model the natural evolutionary process, optimizing either a fitness function (evolution strategies) or the efficiency with which well-adapted individuals are generated in successive generations (GAs). Evolution strategies in their original form were essentially stochastic hill-climbing algorithms, used for the optimization of complex, multi-parameter objective functions that in practice cannot be treated analytically. GAs in their original form were not primarily designed for function optimization but to demonstrate the efficiency of genetic crossover in assembling successful candidates over complicated search spaces. Genetic programming takes the idea of solving an optimization problem by evolving potential candidates one step further in that not only the parameters of a problem but also the structure of a solution is subject to evolutionary change. Simulated annealing is mathematically similar to evolution strategies; it was originally derived from a physical model of crystallization. Only two individuals compete for the highest rank according to a fitness function, and the decision whether to accept suboptimal candidates is controlled stochastically.
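The core loop of the last of these methods is particularly compact. The following is a minimal Python sketch of simulated annealing as characterized above: in each step the current candidate competes with a single perturbed candidate, and a worse candidate is accepted with a probability that shrinks as an artificial temperature is lowered. The objective function, cooling schedule, and parameter values are illustrative assumptions, not taken from the cited literature.

```python
import math
import random

def simulated_annealing(objective, x0, step=0.1, t_start=1.0, t_end=1e-3,
                        cooling=0.995, seed=0):
    """Minimize `objective` by simulated annealing (a minimal sketch)."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t_start
    while t > t_end:
        # Perturb the current candidate: only two individuals compete per step.
        y = x + rng.gauss(0.0, step)
        fy = objective(y)
        # Always accept improvements; accept worse candidates stochastically,
        # with a probability that decreases as the "temperature" falls.
        if fy < fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule (an assumed choice)
    return best, fbest

# Example: minimize a simple multimodal function of one variable.
f = lambda x: x * x + 3.0 * math.sin(5.0 * x)
print(simulated_annealing(f, x0=2.0))
```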

All methods presented in this chapter are heuristic, that is, they contain a random component. As a consequence (and in contrast to deterministic methods) it can never be guaranteed that the algorithm will find an optimal solution, or indeed any solution at all. Evolutionary algorithms are therefore preferred for applications where deterministic or analytic methods fail, for example, because the underlying mathematical model is not well defined or the search space is too large for a systematic, complete search (NP-completeness). Another rapidly growing application area for evolutionary algorithms is the simulation of living systems, starting with single cells and proceeding to organisms, societies, or even whole economic systems [48,49]. The goal of artificial life is not primarily to model biological life as accurately as possible but to investigate how our form of life, or other, presumably different forms of life, could have emerged from nonliving components.
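To make the large-search-space point concrete in a computational biology setting, the toy genetic algorithm below evolves DNA-like strings toward a hypothetical target motif. Even for this short sequence the search space already contains 4^18 candidates and grows exponentially with sequence length, yet selection, crossover, and mutation locate the target quickly. The target sequence, fitness function, and parameter settings are purely illustrative assumptions, not an algorithm from the cited literature.

```python
import random

ALPHABET = "ACGT"
TARGET = "ATGCGTACGTTAGCCTAA"  # hypothetical target motif, for illustration only

def fitness(seq):
    # Number of positions matching the target: higher is fitter.
    return sum(a == b for a, b in zip(seq, TARGET))

def random_seq(rng):
    return "".join(rng.choice(ALPHABET) for _ in range(len(TARGET)))

def crossover(a, b, rng):
    cut = rng.randrange(1, len(a))  # one-point crossover
    return a[:cut] + b[cut:]

def mutate(seq, rng, rate=0.02):
    return "".join(rng.choice(ALPHABET) if rng.random() < rate else c for c in seq)

def evolve(pop_size=100, generations=200, seed=0):
    rng = random.Random(seed)
    population = [random_seq(rng) for _ in range(pop_size)]
    for gen in range(generations):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            break
        parents = population[: pop_size // 5]  # truncation selection: fittest 20%
        population = [mutate(crossover(rng.choice(parents), rng.choice(parents), rng), rng)
                      for _ in range(pop_size)]
    best = max(population, key=fitness)
    return best, fitness(best), gen

print(evolve())
```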

Work with evolutionary algorithms bears the potential for a philosophically and epistemologically interesting recursion. In the beginning, evolution emerged spontaneously in nature. Then man discovered the principle of evolution, acquired knowledge of its mathematical properties, and defined GAs for computers. To complete the recursive cycle, computational GAs are now applied to the very objects (DNA and proteins) from which they were derived in the first place.
