An Overview on Stochastic Global Optimization and its Multi-Domain Applications

Venkateswarlu Chimmiri
DOI: 10.4018/IJCCE.2019070101

Abstract

Optimization is of great interest and has widespread applications in engineering and science. It has become a major technological contributor to the growth of industry and is extensively used to solve a wide variety of problems in the design, operation, and analysis of engineering and technological processes. Optimization of large-scale problems poses difficulties concerning dimensionality, differentiability, multimodality, and nonlinearity in objective functions and constraints. To overcome such difficulties, there has been rapidly growing interest in advanced optimization algorithms. Stochastic and evolutionary optimization algorithms are increasingly used to solve challenging optimization problems. These algorithms include the genetic algorithm, simulated annealing, differential evolution, ant colony optimization, tabu search, particle swarm optimization, the artificial bee colony algorithm, and the cuckoo search algorithm. They are typically inspired by phenomena from nature, are robust, require no gradient information, and are suitable even for discrete optimization problems. These methods are extensively used to solve optimization problems involving systems that are highly nonlinear, high dimensional, and noisy, or problems that are not easily solved by classical deterministic optimization methods.

Brief Description of Stochastic Evolutionary Optimization Algorithms

Various stochastic and global optimization methods, namely the genetic algorithm, simulated annealing, differential evolution, ant colony optimization, tabu search, particle swarm optimization, the artificial bee colony algorithm, and the cuckoo search algorithm, are briefly presented below.

Genetic algorithms (GAs) are search-based optimization techniques that adapt the principles of genetics and natural selection. GAs became widely known through the work of John Holland on adaptation in natural and artificial systems (Holland, 1975), and their biological and genetic foundations had been studied earlier by various researchers. In problem solving, a GA begins with a set of random solutions, represented by chromosomes or strings, called the population. Solutions from one population are used to form a new population through the evolutionary operators of selection, crossover, and mutation. The new population is formed from the new solutions (offspring) according to their fitness, and this process is repeated until a satisfactory solution is obtained.
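
As an illustration of these steps, the following is a minimal GA sketch in Python for the one-max problem (maximizing the number of ones in a bit string); the fitness function, tournament selection, and all parameter values are illustrative assumptions rather than details from the article.

import random

def fitness(bits):
    # Example objective: count of ones ("one-max" problem).
    return sum(bits)

def tournament_select(pop, k=3):
    # Selection: pick the fittest of k randomly chosen individuals.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # Single-point crossover producing one offspring.
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(bits, rate=0.01):
    # Mutation: flip each bit independently with a small probability.
    return [1 - b if random.random() < rate else b for b in bits]

def genetic_algorithm(n_bits=32, pop_size=50, generations=100):
    # Start from a population of random bit strings.
    population = [[random.randint(0, 1) for _ in range(n_bits)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Form the next population entirely from selected, recombined,
        # and mutated offspring.
        population = [mutate(crossover(tournament_select(population),
                                       tournament_select(population)))
                      for _ in range(pop_size)]
    return max(population, key=fitness)

best = genetic_algorithm()
print("best fitness:", fitness(best))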

Simulated annealing (SA) is a stochastic search algorithm whose connection to statistical mechanics and combinatorial optimization was introduced by Kirkpatrick et al. (1983). SA mimics the annealing process, in which molten metal is cooled slowly, to reach the minimum function value. By analogy with the metallic cooling process, at each virtual annealing temperature the SA algorithm generates a new candidate solution in the neighbourhood of the current state and alters the state according to a predefined criterion. Acceptance of the new state is based on satisfying the Metropolis criterion, and this procedure is iterated until the solution converges. In the SA algorithm, the objective function is treated as the energy function of a molten metal, and artificial temperatures are introduced and gradually lowered, in a manner analogous to the annealing technique, so as to enable the algorithm to reach the global minimum.
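
The sketch below illustrates this acceptance rule on a simple one-dimensional objective; the objective function, the geometric cooling schedule, and the step size are illustrative assumptions, not specifics from the article.

import math
import random

def objective(x):
    # Multimodal test function with a global minimum near x = -1.3.
    return x * x + 10 * math.sin(x)

def simulated_annealing(x0=5.0, T0=10.0, alpha=0.99, steps=10000):
    x, fx = x0, objective(x0)
    T = T0
    for _ in range(steps):
        # Propose a new state in the neighbourhood of the current one.
        x_new = x + random.gauss(0, 1)
        f_new = objective(x_new)
        # Metropolis criterion: always accept improvements; accept worse
        # moves with probability exp(-delta / T).
        if f_new < fx or random.random() < math.exp(-(f_new - fx) / T):
            x, fx = x_new, f_new
        T *= alpha  # gradual cooling, analogous to annealing a metal
    return x, fx

x_best, f_best = simulated_annealing()
print(f"x = {x_best:.4f}, f(x) = {f_best:.4f}")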

The differential evolution (DE) algorithm, introduced by Storn and Price (1997), is a simple population-based search method for the global optimization of real-valued functions. This heuristic search technique is useful for optimizing nonlinear and non-differentiable continuous-space problems. DE maintains a population of candidate solutions through mutation, crossover, and selection. In this method, the initial population is generated by adding normally distributed random deviations to a preliminary solution. New solution vectors are generated by adding the weighted difference of two population members to a third member. If the resulting vector yields a lower objective function value than a predetermined population member, it replaces the member with which it was compared. DE uses distance and direction information from the current population to guide the search. The method converges quickly, is easy to implement, and has few control parameters.
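
The following minimal Python sketch of the classic DE/rand/1/bin scheme illustrates these mutation, crossover, and selection steps; the sphere objective and the F and CR settings are illustrative assumptions, not values from the article.

import random

def objective(x):
    # Sphere function; global minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def differential_evolution(dim=5, pop_size=30, F=0.8, CR=0.9, gens=200):
    pop = [[random.uniform(-5, 5) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # Pick three distinct members other than the target vector i.
            a, b, c = random.sample(
                [p for j, p in enumerate(pop) if j != i], 3)
            # Mutation: weighted difference of two members added to a third.
            mutant = [a[k] + F * (b[k] - c[k]) for k in range(dim)]
            # Binomial crossover between the target and the mutant.
            j_rand = random.randrange(dim)
            trial = [mutant[k] if (random.random() < CR or k == j_rand)
                     else pop[i][k] for k in range(dim)]
            # Greedy selection: the trial replaces the target only if better.
            if objective(trial) <= objective(pop[i]):
                pop[i] = trial
    return min(pop, key=objective)

best = differential_evolution()
print("best objective:", objective(best))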

The ant colony optimization (ACO) algorithm mimics the foraging behaviour of real ants, which find the shortest route between a food source and their nest (Dorigo et al., 1991). The ants communicate by means of pheromone trails, exchanging information about which path should be followed. This autocatalytic, collective behaviour establishes the shortest route from the nest to the food source and back. In ACO, a number of artificial ants build solutions to an optimization problem and exchange information through a communication scheme modelled on that of real ants. The ACO construction involves three main functions. The first constructs ant solutions by managing a colony of ants that visit adjacent states by moving through neighbouring nodes of the problem's construction graph; they move by applying a stochastic local decision policy that makes use of pheromone trails and heuristic information. The second performs pheromone trail updates, comprising pheromone reinforcement and pheromone evaporation. The third applies bonus updates from a global search; this additional pheromone reinforcement step favours the best solution found.
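
A minimal sketch of this scheme for a small travelling salesman instance is given below; the random instance, the parameter values (trail weight, heuristic weight, evaporation rate, deposit factor), and the simplified bonus update along the best-so-far tour are all illustrative assumptions.

import random

random.seed(0)
n = 10
pts = [(random.random(), random.random()) for _ in range(n)]
dist = [[((pts[i][0] - pts[j][0]) ** 2
          + (pts[i][1] - pts[j][1]) ** 2) ** 0.5 or 1e-10
         for j in range(n)] for i in range(n)]

alpha, beta, rho, Q = 1.0, 2.0, 0.5, 1.0  # trail weight, heuristic weight,
                                          # evaporation rate, deposit factor
tau = [[1.0] * n for _ in range(n)]       # pheromone trails

def build_tour():
    # Function 1: an ant builds a tour with a stochastic local decision
    # policy combining pheromone trails and inverse-distance heuristic.
    tour = [random.randrange(n)]
    while len(tour) < n:
        i = tour[-1]
        choices = [j for j in range(n) if j not in tour]
        weights = [(tau[i][j] ** alpha) * ((1.0 / dist[i][j]) ** beta)
                   for j in choices]
        tour.append(random.choices(choices, weights)[0])
    return tour

def tour_length(tour):
    return sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))

best_tour, best_len = None, float("inf")
for _ in range(100):
    tours = [build_tour() for _ in range(20)]  # colony of 20 ants
    # Function 2: evaporation shrinks all trails, then each ant's tour
    # deposits pheromone in proportion to its quality.
    for i in range(n):
        for j in range(n):
            tau[i][j] *= (1 - rho)
    for tour in tours:
        L = tour_length(tour)
        if L < best_len:
            best_tour, best_len = tour, L
        for k in range(n):
            i, j = tour[k], tour[(k + 1) % n]
            tau[i][j] += Q / L
            tau[j][i] += Q / L
    # Function 3: bonus update reinforcing the best tour found so far.
    for k in range(n):
        i, j = best_tour[k], best_tour[(k + 1) % n]
        tau[i][j] += Q / best_len
        tau[j][i] += Q / best_len

print("best tour length:", round(best_len, 4))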
