Genetic Algorithm Applications to Optimization Modeling

Pi-Sheng Deng
Copyright: © 2009 | Pages: 7
DOI: 10.4018/978-1-59904-849-9.ch111

Abstract

Genetic algorithms (GAs) are stochastic search techniques based on the concepts of natural population genetics that explore a huge solution space to identify optimal or near-optimal solutions (Davis, 1991; Holland, 1992; Reeves & Rowe, 2003), and they are more likely to avoid the local-optima problem than traditional gradient-based hill-climbing optimization techniques when solving complex problems. In essence, GAs are a type of reinforcement learning technique (Grefenstette, 1993) that improves solutions gradually on the basis of previous solutions. GAs are characterized by their ability to combine candidate solutions so as to exploit efficiently a promising area of the solution space while stochastically exploring new search regions with expected improved performance. Successful applications of this technique are frequently reported across various industries and businesses, including function optimization (Ballester & Carter, 2004; Richter & Paxton, 2005), financial risk and portfolio management (Shin & Han, 1999), market trading (Kean, 1995), machine vision and pattern recognition (Vafaie & De Jong, 1998), document retrieval (Gordon, 1988), network topological design (Pierre & Legault, 1998; Arabas & Kozdrowski, 2001), job shop scheduling (Özdamar, 1999), and optimization of an operating system's dynamic memory configuration (Del Rosso, 2006), among others. In this research we introduce the concept and components of GAs, and then apply the GA technique to the modeling of the batch selection problem of flexible manufacturing systems (FMSs). The model developed in this paper serves as the basis for the experiment in Deng (2007).
Chapter Preview

Genetic Algorithms

GAs are simulation techniques proposed by John Holland in the 1960s (Holland, 1992). Basically, GAs solve problems by maintaining and modifying a population of candidate solutions through the application of genetic operators. During this process, beneficial changes to parent solutions are combined into their offspring, gradually developing optimal or near-optimal solutions for the given task.
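The basic GA cycle described above can be summarized in a few lines of code. The following is a minimal sketch only: the bit-string encoding, the toy one-max fitness function, tournament selection, one-point crossover, bit-flip mutation, and all parameter values are illustrative assumptions rather than the chapter's FMS batch-selection model.

```python
import random

# Minimal GA cycle: maintain a population of candidate solutions and repeatedly
# apply genetic operators (selection, crossover, mutation) to produce offspring.
# Encoding, fitness, and parameter values below are illustrative assumptions only.
N_BITS, POP_SIZE, GENERATIONS = 20, 30, 50
MUTATION_RATE = 1.0 / N_BITS

def fitness(solution):
    """Toy objective (one-max): count the 1-bits in the candidate solution."""
    return sum(solution)

def tournament_select(population, k=3):
    """Return the fittest of k randomly sampled individuals."""
    return max(random.sample(population, k), key=fitness)

def crossover(parent_a, parent_b):
    """One-point crossover: combine the beneficial parts of two parents."""
    point = random.randint(1, N_BITS - 1)
    return parent_a[:point] + parent_b[point:]

def mutate(solution):
    """Flip each bit with a small probability to explore new regions."""
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in solution]

population = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(tournament_select(population),
                                   tournament_select(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print("best solution:", best, "fitness:", fitness(best))
```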

Intrinsically, GAs explore multiple potentially promising regions in the solution space at the same time, and switch stochastically from one region to another for performance improvement. According to Holland (1992), regions in the solution space can be defined by syntactic patterns of solutions, and each pattern is called a schema. A schema represents the pattern of common attributes or features of the solutions in the same region. Let Σ be an alphabet of symbols. A string over an alphabet is a finite sequence of symbols from the alphabet. An n-ary schema is defined as a string in (Σ ∪ {#})ⁿ, where # ∉ Σ is used as a wildcard denotation for any symbol in Σ.
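The schema definition above can be made concrete with a small matching routine. This is an illustrative sketch assuming the binary alphabet Σ = {0, 1}; the function name `matches` and the sample strings are hypothetical.

```python
def matches(schema, solution):
    """Return True if the solution string is an instance of the schema."""
    return len(schema) == len(solution) and all(
        s == '#' or s == c for s, c in zip(schema, solution))

# The schema "1#0#" defines the region of all 4-symbol strings that have
# '1' in the first position and '0' in the third position.
print(matches("1#0#", "1101"))  # True
print(matches("1#0#", "1011"))  # False: the third position must be '0'
```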

Conceptually, n-ary schemata can be regarded as defining hypersurfaces of an n-dimensional hypercube that represents the space of all n-attribute solutions. Individual solutions in the same region can be regarded as instances of the representing schema, and an individual solution can belong to multiple schemata at the same time. In fact, an n-attribute solution is a member of 2ⁿ different schemata. Therefore, evaluating a solution has an effect similar to sampling 2ⁿ regions (i.e., schemata) at the same time, and this is the famous implicit parallelism of genetic search. A population of M solutions will contain at least 2ⁿ and at most M • 2ⁿ schemata. Even for modest values of n and M, there will be a large number of schemata available for processing in the population. GAs thus perform an implicit parallel search through the space of possible schemata while performing an explicit parallel search through the space of individual solutions.
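The counting argument behind implicit parallelism can be checked directly: an n-attribute solution instantiates exactly 2ⁿ schemata, one for each choice of positions replaced by the wildcard #. The short sketch below assumes a 4-bit binary string; the helper name `schemata_of` is hypothetical.

```python
from itertools import product

def schemata_of(solution):
    """Enumerate every schema that the given string instantiates, by replacing
    any subset of its positions with the wildcard '#'."""
    options = [(symbol, '#') for symbol in solution]
    return {''.join(choice) for choice in product(*options)}

schemas = schemata_of("1010")   # a 4-attribute solution
print(len(schemas))             # 16 == 2**4
print(sorted(schemas)[:4])      # ['####', '###0', '##1#', '##10']
```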

Key Terms in this Chapter

Batch Selection: Selecting the optimal set of products to produce, with each product requiring a set of resources, under the system capacity constraints (an illustrative fitness sketch for this problem follows the key terms list)

Landscape: A function plot showing the state as the “location” and the objective function value as the “elevation”

Schemata: General patterns of bit strings made up of 1, 0, and #, used as building blocks for the solutions of the GA

Reinforcement Learning: A learning method which interprets feedback from an environment to learn optimal sets of condition/response relationships for problem solving within that environment

Genetic Algorithms: A stochastic search method which applies genetic operators to a population of solutions for progressively generating optimal or near-optimal solutions

Flexible Manufacturing Systems: A manufacturing system which maintains flexibility in the order of operations and in machine assignment so as to react to planned or unplanned changes in the production process

Implicit Parallelism: A property of the GA whereby evaluating a single candidate solution implicitly samples every schema it instantiates, so that many regions of the solution space are processed simultaneously without explicit effort

Fitness Functions: The objective function of the GA for evaluating a population of solutions

Genetic Operators: Selection, crossover, and mutation, for combining and refining solutions in a population
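To tie the batch-selection and fitness-function terms together, the following is a minimal sketch of a penalty-based fitness function for a knapsack-style batch-selection encoding, where each bit indicates whether a product is included in the batch. The product profits, resource usages, capacities, and penalty weight are all hypothetical illustrations; the chapter's actual FMS model may be formulated differently.

```python
# Hypothetical data: 5 products, 2 resources, and their capacities.
PROFIT   = [12, 7, 9, 4, 10]                         # profit per product
USAGE    = [[3, 1], [2, 2], [4, 1], [1, 1], [3, 3]]  # resource use per product
CAPACITY = [8, 5]                                    # capacity of each resource
PENALTY  = 50                                        # weight on capacity violations

def fitness(batch):
    """Total profit of the selected batch minus a penalty for exceeded capacity."""
    profit = sum(p for p, chosen in zip(PROFIT, batch) if chosen)
    overload = 0
    for r, cap in enumerate(CAPACITY):
        used = sum(USAGE[i][r] for i, chosen in enumerate(batch) if chosen)
        overload += max(0, used - cap)
    return profit - PENALTY * overload

print(fitness([1, 0, 1, 0, 1]))  # usage [10, 5]: resource 0 exceeded, heavy penalty
print(fitness([1, 1, 0, 0, 1]))  # usage [8, 6]: resource 1 exceeded by 1
```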
