Parameter estimation is an optimization problem, and radically new approaches to it have been introduced recently. These methods are based on analogies with the evolution of biological systems: one naive view of the evolutionary process is that it produces organisms optimized to their environment by having maximum biological fitness. Many biologists would disagree with this caricature of evolution, but the analogy has been extremely productive in computer science. The new methods are members of a loose family of algorithms called evolutionary computation. The basic idea, applied to parameter estimation, is that the parameter space is searched by a large set of "organisms," each defined by its position in the space. An organism's fitness is the value of the error function at that point in parameter space. Organisms with low fitness (large error) are discarded. Surviving organisms mate and produce slightly different offspring by combining the positions of the two parents to form a new location in parameter space. This process is repeated until the organisms show no further improvement. These methods are proving to be very effective on complex error surfaces with many hills and valleys. We discuss these methods more fully in Chapter 20, but for now recall Fig. 7.5a. One evolutionary computational modification of that method would be to iterate the brute-force search by defining a smaller rectangle encompassing the best 20% of the points (filled circles) and populating this smaller rectangle at finer resolution with the same number of original points. We repeat this process until we have obtained a sufficiently small and highly resolved rectangle. If the original rectangle is sufficiently large, this method may find not just a local minimum, but also the global one.

MBS-CD contains files SimCalibrate that do this using Nelder-Mead simplex.
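The iterated brute-force search just described can be sketched as follows. This is a minimal illustration, not code from the MBS-CD: the error surface, function names, and the 20% keep fraction applied over six refinement rounds are all illustrative choices.

```python
import itertools

def error(params):
    # Toy error surface with its minimum at (2.0, -4.0); it stands in
    # for the model-vs-data error function evaluated at each grid point.
    x, y = params
    return (x - 2.0) ** 2 + (y + 4.0) ** 2

def refine_grid(lo, hi, n_per_axis=10, keep_frac=0.2, rounds=6):
    """Grid search that repeatedly shrinks the search rectangle to the
    bounding box of the best keep_frac of points, then re-populates the
    smaller rectangle at finer resolution with the same number of points."""
    dims = len(lo)
    for _ in range(rounds):
        # Evenly spaced grid over the current rectangle.
        axes = [
            [lo[d] + i * (hi[d] - lo[d]) / (n_per_axis - 1)
             for i in range(n_per_axis)]
            for d in range(dims)
        ]
        points = sorted(itertools.product(*axes), key=error)
        best = points[: max(2, int(len(points) * keep_frac))]
        # The new, smaller rectangle encloses the best points (the
        # "filled circles" of Fig. 7.5a).
        lo = [min(p[d] for p in best) for d in range(dims)]
        hi = [max(p[d] for p in best) for d in range(dims)]
    return points[0]

best = refine_grid(lo=[-10.0, -10.0], hi=[10.0, 10.0])
print(best, error(best))
```

Because each round keeps only the best region, the rectangle shrinks geometrically; as the text notes, this finds the global minimum only if the original rectangle was large enough to contain it.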
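The general evolutionary-computation idea (a population of "organisms," fitness from the error function, discarding the worst, recombining survivors) can likewise be sketched in a few lines. This is a bare-bones illustration under assumed settings, not one of the algorithms of Chapter 20: the toy error surface, population size, survival fraction, and Gaussian mutation are all placeholder choices.

```python
import random

def error(params):
    # Toy error surface with a single minimum at (3.0, -1.0).
    x, y = params
    return (x - 3.0) ** 2 + (y + 1.0) ** 2

def evolve(pop_size=100, generations=60, keep_frac=0.5, noise=0.1, seed=1):
    rng = random.Random(seed)
    # Each "organism" is a point in parameter space.
    pop = [(rng.uniform(-10, 10), rng.uniform(-10, 10))
           for _ in range(pop_size)]
    for _ in range(generations):
        # Low error = high fitness; discard the low-fitness half.
        pop.sort(key=error)
        survivors = pop[: int(pop_size * keep_frac)]
        offspring = []
        while len(survivors) + len(offspring) < pop_size:
            a, b = rng.sample(survivors, 2)
            # Offspring combine the parents' positions, with a small
            # mutation so the search keeps exploring.
            child = tuple((pa + pb) / 2 + rng.gauss(0, noise)
                          for pa, pb in zip(a, b))
            offspring.append(child)
        pop = survivors + offspring
    return min(pop, key=error)

best = evolve()
print(best, error(best))
```

In practice the loop would terminate when the population stops improving rather than after a fixed number of generations; a fixed count keeps the sketch short.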