Chapter 6: SELECTION

6.1 Introduction

Selection is the first genetic operation in the reproductive phase of a genetic algorithm. Its objective is to choose the fitter individuals in the population, commonly known as the mating pool, that will create offspring for the next generation. The mating pool thus selected takes part in the subsequent genetic operations, advancing the population to the next generation and, ideally, closer to the optimal solution. In other words, selection is the process of choosing breeding stock, or parents, from a population. As the generations pass, the members of the population should become fitter and fitter. Individuals from the mating pool are used to generate new offspring, and the resulting offspring form the basis of the next generation, so it is desirable that the mating pool contain good individuals. The selection operator works at the level of chromosomes. Its key idea is to give preference to better individuals, allowing them to pass their genes on to the next generation, while barring the worst-fit individuals from entering it. The goodness of each individual depends on its fitness, which is determined by an objective function. Selection of individuals is therefore fitness dependent and is carried out using different algorithms [Goldberg et al. 1991], such as roulette wheel selection, rank selection and steady state selection. Selection acts as the driving force in a genetic algorithm by directing the genetic search towards promising regions of the search space. The selection operator emulates phenomena and processes in nature: it chooses fitter individuals in analogy to Darwin's theory of evolution, the survival of the fittest [Fogel 1995]. All individuals have a chance of being selected into the mating pool, and an individual may be selected more than once, depending on its fitness.
Selection schemes are characterised by selection pressure, selection variance and loss of diversity, which primarily determine the convergence characteristics of a genetic algorithm. Selection has to be balanced: too strong a selection means suboptimal, highly fit individuals take over the population and reduce its diversity, while too weak a selection results in too slow an evolution [Mitchell 1996].

Broadly, selection schemes can be classified into two categories: proportionate selection and ordinal-based selection. Proportionate selection picks out individuals based on their fitness values relative to the fitness of the other individuals in the population. It is generally more sensitive to selection pressure, so a scaling function is often employed to redistribute the fitness range of the population in order to adapt the selection pressure. Roulette wheel selection, stochastic remainder selection and stochastic universal selection are grouped under this category. Ordinal-based selection selects individuals not by their raw fitness but by their rank within the population. In ordinal selection, the selection pressure is independent of the fitness distribution of the population and based solely on the relative ordering, i.e. the ranking, of the population. Tournament selection, truncation selection and linear ranking selection are included in ordinal selection. Goldberg and Deb grouped selection methods into four categories: proportionate, ranking, tournament and steady state selection, with other methods grouped under these. A detailed classification of selection schemes is shown in Figure 6.1:

Selection
- Proportionate
  - Proportionate roulette wheel
  - Stochastic remainder (with replacement / without replacement)
  - Stochastic universal (with replacement / without replacement)
- Ordinal
  - Ranking: linear rank selection
  - Tournament: binary tournament, larger tournament, Boltzmann tournament
  - Truncation selection
  - Steady state selection: Genitor

Figure 6.1 Classification of Selection Schemes

6.2 Exploration vs. Exploitation

Generally, all optimisation techniques are influenced by two important issues: exploration and exploitation. Exploitation refers to the tendency of the algorithm to steer its search direction by previously obtained information [Thierens et al. 1994]. It makes use of the generated knowledge and propagates the adaptations: during the search, the algorithm uses information obtained in the past to determine smaller regions that are promising for further search. Exploitation operations often incorporate small changes into already tested individuals, leading to new, very similar candidate solutions, or try to merge building blocks of different promising individuals. Exploration, in contrast, is the procedure that allows search operations to find novel and possibly better solution structures. It investigates new and unknown areas of the search space and generates new knowledge, obtaining information by visiting new regions in order to find promising points or sub-regions; unlike exploitation, exploration includes jumps into the unknown. Each technique has its own merits and demerits; the two are contradictory and need to be balanced. In the common view, exploration of the search space is done by the search operators of an evolutionary algorithm, while exploitation is done by selection. The tradeoff between exploitation and exploration is mainly determined by the selective pressure created by the selection operator. Pure random search is good at exploration but does no exploitation, whereas hill climbing is good at exploitation but does no exploration [Beasley 1993a]. Previous research has observed that no single technique is enough to obtain the best optimal solution, especially for large TSPs [Merz et al. 1977, Ray et al. 2007].
Consequently, much research has been carried out on combining two or more algorithms to improve performance and obtain better results. Optimisation algorithms that favour exploitation over exploration have a higher convergence speed but run the risk of missing the optimal solution and getting stuck at a local optimum. Generally, an optimisation algorithm should employ at least one search operation of explorative character and at least one that is able to exploit good solutions further. Holland showed that a genetic algorithm combines both exploration and exploitation at the same time in an optimal way [Holland 1975].

6.3 Parameters for Selection

Selection pressure: the degree to which the better individuals are favoured; it can be defined as the probability of the best individual being selected compared to the average selection probability of all individuals. The higher the selection pressure, the more the better individuals are favoured. This pressure drives the genetic algorithm to improve the population fitness over successive generations, and the convergence rate of the algorithm is largely determined by its magnitude: higher selection pressure gives a higher convergence rate, while low selection pressure leads to a slow convergence rate and a long time to find the optimal solution.

Selection variance: the expected variance of the fitness distribution of the population after applying the selection method to the normalised Gaussian distribution.

Selection intensity: the expected average fitness value of the population after applying a selection method to a population whose fitness is distributed according to the unit normal distribution [Zhang et al. 1995]. It measures the magnitude of the selection pressure provided by a selection scheme.

Loss of diversity: the proportion of individuals of a population that is not selected during the selection phase.

Bias: the absolute difference between an individual's normalised fitness and its expected probability of reproduction.

Spread: the range of possible values for the number of offspring of an individual.

6.4 Description of Selection Schemes

Proportionate Selection: selects individuals from the population according to their fitness. Proportionate selection is carried out in many ways: roulette wheel selection, deterministic selection, stochastic remainder selection and stochastic universal selection.
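The loss-of-diversity and selection-intensity measures defined in section 6.3 can be estimated empirically for any of the schemes that follow. A minimal sketch in Python (the function names are illustrative, not from the text):

```python
def selection_intensity(pre_fitnesses, post_fitnesses):
    """Standardised fitness gain of one selection step:
    (mean after selection - mean before) / std before."""
    n = len(pre_fitnesses)
    mean_pre = sum(pre_fitnesses) / n
    var_pre = sum((f - mean_pre) ** 2 for f in pre_fitnesses) / n
    mean_post = sum(post_fitnesses) / len(post_fitnesses)
    return (mean_post - mean_pre) / var_pre ** 0.5

def loss_of_diversity(population, mating_pool):
    """Proportion of individuals never picked during the selection phase."""
    chosen = set(mating_pool)
    return sum(1 for ind in population if ind not in chosen) / len(population)
```

For example, if selection copies the best individual twice and drops the worst, both measures are positive: some diversity is lost and the mean fitness rises.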

Roulette Wheel Selection

Roulette wheel selection is the simplest and most traditional stochastic selection approach, proposed by Holland. It is categorised under proportionate selection, as it selects individuals with a probability proportional to their fitness. The principle of roulette selection is a linear search through a roulette wheel whose slots are weighted in proportion to the individuals' fitness values. All the chromosomes (individuals) in the population are placed on the roulette wheel according to their fitness value [De Jong 1975, Goldberg 1989, Goldberg et al. 1991]. Each individual is assigned a segment of the wheel whose size is proportional to the individual's fitness: the bigger the fitness value, the larger the segment. The virtual roulette wheel is then spun, and the individual corresponding to the segment on which the wheel stops is selected. The process is repeated until the desired number of individuals has been selected. Individuals with higher fitness have a higher probability of selection, which may bias selection towards high-fitness individuals. The method can also miss the best individuals of a population; there is no guarantee that good individuals will find their way into the next generation. Roulette wheel selection is exploitative in its approach.

Figure 6.2 Fitness Distribution on Roulette Wheel
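The spin-and-accumulate procedure described above can be sketched in Python (a simplified sketch; the function names are illustrative):

```python
import random

def roulette_wheel_select(population, fitnesses, rng=random):
    """One spin: pick an individual with probability proportional to fitness."""
    total = sum(fitnesses)
    r = rng.uniform(0, total)          # where the wheel stops
    cumulative = 0.0
    for individual, f in zip(population, fitnesses):
        cumulative += f                # walk the slots in order
        if r <= cumulative:
            return individual
    return population[-1]              # guard against floating-point round-off

def make_mating_pool(population, fitnesses, pool_size, rng=random):
    """Repeat the spin until the mating pool is filled."""
    return [roulette_wheel_select(population, fitnesses, rng)
            for _ in range(pool_size)]
```

With fitnesses 1, 2 and 7, the third individual should receive roughly 70% of the slots in a large mating pool, illustrating both the proportionality and the sampling noise of the method.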

Roulette wheel selection is easy to implement but noisy; the rate of evolution depends on the variance of the fitnesses in the population. The average fitness of the population in the ith generation under roulette wheel selection is calculated as

FRW_i = (1/N) Σ_{j=1}^{N} F_{i,j} (Eq. ix)

where i varies from 1 to ngen and j varies from 1 to N. Therefore, the probability of selecting the jth string is

PRW_j = F_j / Σ_{j=1}^{N} F_j (Eq. x)

where N is the population size and F_j is the fitness of individual j. The selection intensity is calculated as

I = σ / F̄ (Eq. xi)

where σ is the standard deviation and F̄ the mean of the fitness values of the population before selection.

In roulette wheel selection, if the search for the location of the chosen slot is performed with a linear search from the beginning of the list, each selection requires O(n) steps, because on average half the list is searched. The overall time complexity of roulette wheel selection is therefore O(n^2), since n spins are required to fill the population in a generation. Roulette wheel selection can also be carried out using binary search, at the cost of additional memory locations; in that case the overall time complexity reduces to O(n log n), because binary search requires O(log n) steps per spin and there are n spins.

Algorithm of Roulette Wheel Selection
Set l=1, j=1, i=nogen
While l <= mpool
Begin
  While j <= n
  Begin
    Compute FRW_{i,j}
  End
  Set j=1, S=0
  While j <= n
  Begin
    Compute S = S + FRW_{i,j}
  End
  Generate random number r from interval (0, S)
  Set j=1
  While j <= n
  Begin
    Calculate c_j = c_{j-1} + FRW_{i,j}
    If r <= c_j, select individual j
  End
  l = l + 1
End

Stochastic Remainder Selection

Stochastic remainder selection operates on the idea that the proportionate fitness of each chromosome should be reflected in that chromosome's incidence in the mating pool [Grefenstette et al. 1989]. Each chromosome in the population has a probability of selection based on its relative fitness, calculated as the fitness of the individual divided by the average fitness, say m_i = f_i / f_avg. The integer part of m_i is used to select parents deterministically; roulette selection is then applied to the remaining fractional parts to fill the free places in the mating pool. For example, if m_i of an individual is 2.3, that individual is listed twice as a parent because the integer part is 2. After parents have been assigned according to the integer parts of the scaled values, the remaining parents are chosen stochastically, with the probability of being chosen proportional to the fractional part of the scaled value. This technique gives the greatest probability of selection to the fittest members of the population.

Pselect_i = f_i / f_avg (Eq. xii)

The expected occurrence in the mating pool is calculated as:

e_i = Pselect_i × n (Eq. xiii)

This method has two variations: with replacement and without replacement.

i) Stochastic Remainder Selection with Replacement

If stochastic remainder selection is done with replacement, the remainders are used to size the slots of a roulette wheel selection process. The fractional parts of the

expected occurrence values are used as weights in a roulette wheel selection that fills the remaining population slots in the mating pool. In this case the algorithm takes on the complexity of the roulette wheel, i.e. O(n^2), because O(n) of the individuals are likely to have fractional parts in their m values.

ii) Stochastic Remainder Selection without Replacement

If stochastic remainder selection is done without replacement, the fractional parts of the expected occurrence values are treated as probabilities: the remaining slots are filled on the flip of a biased coin that determines whether the remainder part receives another copy or not. The time complexity of the algorithm without replacement is O(n), because the deterministic assignment requires only a single pass. It is an improved version of roulette wheel selection.

Stochastic Universal Selection

Stochastic universal selection was developed by Baker [Baker 1987]. It is a single-phase selection algorithm with minimum spread and zero bias. It is performed by sizing the slots of a weighted roulette wheel, placing equally spaced markers along the outside of the wheel, and spinning the wheel once; the number of copies an individual receives is then calculated by counting the number of markers that fall in its slot. The algorithm is O(n), because only a single pass through the list is needed after the sum of the fitness values has been calculated, and it exhibits less variance than repeated calls to roulette wheel selection. The individuals are mapped to contiguous segments of a line, such that each individual's segment is equal in size to its fitness, exactly as in roulette wheel selection; then equally spaced pointers are placed over the line, as many as there are individuals to be selected. Stochastic universal selection ensures a selection of offspring that is closer to what is deserved than roulette wheel selection, and it can be used to make any number of selections.
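The single-spin, equally spaced pointer procedure can be sketched as follows (a simplified sketch; the function name is illustrative):

```python
import random

def stochastic_universal_selection(population, fitnesses, n_select, rng=random):
    """One spin with n_select equally spaced pointers (Baker 1987)."""
    total = sum(fitnesses)
    spacing = total / n_select          # distance between pointers
    start = rng.uniform(0, spacing)     # single random offset
    selected = []
    cumulative = fitnesses[0]           # end of the first segment
    i = 0
    for k in range(n_select):
        pointer = start + k * spacing
        while cumulative < pointer:     # advance to the segment under pointer
            i += 1
            cumulative += fitnesses[i]
        selected.append(population[i])
    return selected
```

Because the pointers are equally spaced, an individual with 70% of the total fitness receives either 7 copies out of 10 or one fewer/more, which is the "minimum spread" property mentioned above.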
It is preferred in situations where more than one sample is to be drawn from the distribution.

Algorithm of Stochastic Universal Selection
Set l=1, i=1, sum=fitness[1]
µ = F/n (pointer spacing, where F is the total fitness)
Generate random number r from interval (0, µ)
While l <= mpool

begin
  While sum < r
  begin
    i = i + 1
    sum = sum + fitness[i]
  end
  select[l] = parent[i]
  r = r + µ
  l = l + 1
end

Rank Selection

Rank selection first sorts the population according to fitness value and ranks the individuals: rank N is assigned to the best individual and rank 1 to the worst. Every chromosome is then allocated a selection probability with respect to its rank [Baker 1985], and individuals are selected according to their selection probability. Rank selection is an explorative selection technique: it prevents too quick a convergence and differs from roulette wheel selection in its selection pressure. Rank selection overcomes scaling problems such as stagnation and premature convergence, controlling the selective pressure by a uniform method of scaling across the population, and it behaves in a more robust manner than other methods [Whitley 1989, Back et al. 1991].

Algorithm of Rank Selection
Set l=1, j=1, i=nogen
While l <= mpool
Begin
  While j <= n
  Begin
    Compute rsum_i
  End
  Set j=1
  While j <= n
  Begin
    Compute PRANK_j
  End
  Generate random number r from interval (0, rsum)
  Set j=1
  While j <= n
  Begin
    Calculate c_j = c_{j-1} + PRANK_j
    If r <= c_j, select individual j
  End
  l = l + 1
End
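The rank-then-spin procedure above can be sketched in Python (a simplified sketch; the function name is illustrative):

```python
import random

def rank_select(population, fitnesses, pool_size, rng=random):
    """Rank selection: worst individual gets rank 1, best gets rank N;
    selection probability of an individual is rank / sum_of_ranks."""
    n = len(population)
    order = sorted(range(n), key=lambda i: fitnesses[i])  # ascending fitness
    ranks = [0] * n
    for rank, idx in enumerate(order, start=1):
        ranks[idx] = rank
    rank_sum = n * (n + 1) // 2
    pool = []
    for _ in range(pool_size):
        r = rng.uniform(0, rank_sum)       # spin over the rank "wheel"
        cumulative = 0.0
        for i in range(n):
            cumulative += ranks[i]
            if r <= cumulative:
                pool.append(population[i])
                break
    return pool
```

Note that the selection probabilities depend only on the ordering, not on the raw fitness values, which is why rank selection is insensitive to fitness scaling.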

Linear Rank Selection

Linear rank selection was proposed by Baker. Each individual in the population is ranked in increasing order of fitness, from 1 to N, and the user chooses the expected value Max of the individual with rank N, with Max >= 0. Linear ranking assigns each individual a selection probability proportional to its rank. It can be implemented by specifying a single parameter, the expected number of offspring to be allocated to the best individual during each generation. In rank selection, the sum of ranks is computed and the selection probability of each individual is then computed as follows:

rsum_i = Σ_{j=1}^{N} r_{i,j} (Eq. xiv)

where i varies from 1 to ngen and j varies from 1 to N, and

PRANK_j = r_{i,j} / rsum_i (Eq. xv)

The expected value of each individual x_i in the population at time t is given by

ExpVal(x_i, t) = Min + (Max − Min) (rank(x_i, t) − 1) / (N − 1) (Eq. xvi)

where Min is the expected value of the individual with rank 1. Given the constraints Max >= 0 and Σ_i ExpVal(x_i, t) = N, it is required that 1 <= Max <= 2 and Min = 2 − Max [mit98]. The probability of the individual with rank i being selected is

p_i = (1/N) (η⁻ + (η⁺ − η⁻)(i − 1)/(N − 1)) (Eq. xvii)

where η⁻/N is the probability of the worst individual being selected and η⁺ = 2 − η⁻ that of the best. The selection intensity is calculated as

I(η⁻) = (1 − η⁻) / √π (Eq. xviii)
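Eq. xvii can be checked directly in code; the following sketch (function name illustrative) computes the linear-ranking probabilities and confirms that they sum to 1 when η⁺ = 2 − η⁻:

```python
def linear_rank_probability(rank, n, eta_minus=0.5):
    """Linear ranking selection probability (rank 1 = worst, rank n = best).
    eta_minus / n is the worst individual's selection probability;
    eta_plus = 2 - eta_minus makes the probabilities sum to 1."""
    eta_plus = 2.0 - eta_minus
    return (eta_minus + (eta_plus - eta_minus) * (rank - 1) / (n - 1)) / n
```

For n = 5 and η⁻ = 0.5, the worst individual is selected with probability 0.1 and the best with probability 0.3, a gentle, scaling-independent pressure.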

Rank selection is a two-step process: the list is first sorted and the individuals are then ranked, so the time-complexity calculation treats these as separate steps. Assuming the best sorting technique is adopted, rank selection has time complexity O(n log n).

Non-Linear Ranking

Non-linear ranking uses a non-linear distribution [Pohlheim 1995] and permits higher selective pressures than the linear ranking method [Weblink17]. It assigns selection probabilities that are based on each individual's rank but are not proportional to the rank; for example, the selection probability may be proportional to the square of the rank.

Truncation Selection

In truncation selection the candidate solutions are ordered by fitness, and some proportion p (e.g. p = 1/2, 1/3, etc.) of the fittest individuals is selected and reproduced 1/p times. Truncation selection is less sophisticated than many other selection methods and is not often used in practice. Only the individuals above the threshold T are selected as parents; T indicates the proportion of the population to be selected as parents and typically takes values ranging from 50% down to 10%. Individuals below the threshold do not produce offspring [Weblink17]. The overall time complexity is O(n log n), as sorting is involved in the implementation. Truncation selection is used in Muhlenbein's breeder genetic algorithm [Muhlenbein et al. 1993].

Selection intensity: I(T) = (1/T) (1/√(2π)) e^(−f_c²/2) (Eq. xix)

Selection variance: V(T) = 1 − I(T) (I(T) − f_c) (Eq. xx)

where f_c is the truncation point, i.e. the fitness value at which the normalised distribution is cut off.

Algorithm of Truncation Selection
Input: population P of size N, truncation threshold T ∈ (0, 1]
Truncation(P, T, N)
  Sort(P) according to fitness, with the worst individual at the first position
  For i = 1 to N do
  Begin
    r = random((1 − T)·N, N)
    select = P(r)
  End
Return
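The sort-and-sample procedure above can be sketched in Python (a simplified sketch; the function name is illustrative):

```python
import random

def truncation_select(population, fitnesses, threshold, pool_size, rng=random):
    """Keep the top `threshold` fraction (e.g. 0.5) of the population and
    draw parents uniformly at random from that elite set."""
    n = len(population)
    order = sorted(range(n), key=lambda i: fitnesses[i], reverse=True)
    cutoff = max(1, int(threshold * n))           # size of the elite set
    elite = [population[i] for i in order[:cutoff]]
    return [rng.choice(elite) for _ in range(pool_size)]
```

With T = 0.5 and six individuals, only the three fittest can ever appear in the mating pool; individuals below the threshold produce no offspring.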

6.4.3 Tournament Selection

Tournament selection was stated by Goldberg, Korb and Deb [Goldberg et al. 1989a]. In tournament selection, a number of chromosomes, say t, is randomly selected from the population; t is the tournament size, i.e. the number of randomly selected individuals. Tournament selection selects the best individual of this group (the tournament), the one with the highest fitness among the t individuals, as a parent. The winner of the tournament competition is inserted into the mating pool, and the competition is repeated until the mating pool for generating new offspring is filled [Blickle et al. 1995]. The mating pool composed of tournament winners has a higher average fitness than the population; this fitness difference provides the selection pressure that drives the genetic algorithm to improve the fitness of the succeeding generations. Tournament selection is an efficient method and is more amenable to parallel implementation. Increased selection pressure can be provided simply by increasing the tournament size t, as the winner of a larger tournament will on average have a higher fitness than the winner of a smaller tournament. The method becomes less practical with very large tournaments drawn from a large population, since each competition requires a fresh random sample of the selected group. The time complexity of tournament selection is O(n), because each competition requires the selection of a constant number of individuals from the population and n such competitions are required to fill a generation. No sorting of the population is required.

Selection intensity: I(t) ≈ √(2 (ln t − ln √(4.14 ln t))) (Eq. xxi)

Selection variance: V(t) ≈ 0.918 / ln(1.186 + 1.328 t) (Eq. xxii)

Algorithm of Tournament Selection (P is the population, t is the tournament size)
For cur = 1 to mpoolno
Begin
  Pick t individuals randomly from P
  Select the best of the t individuals according to fitness value and store it in sl
  mpool(cur) = sl
End

Binary Tournament Selection

Binary tournament selection is a variant of tournament selection in which two individuals are chosen at random and the better of the two is selected with a fixed probability p, 0.5 < p <= 1.

Large Tournament Selection

In this case the size of each tournament is very large. This increases the selection pressure, since the individual selected in each tournament competition will be fitter than in smaller tournaments.

Boltzmann Tournament Selection

Boltzmann tournament selection was motivated by the process of simulated annealing [Kirkpatrick et al. 1983]. Simulated annealing is an optimisation method that simulates the slow cooling of molten metal to achieve the minimum function value in a minimisation problem; the Boltzmann probability distribution is used to simulate the cooling phenomenon by controlling the temperature, a continuously varying parameter. Boltzmann tournament selection was developed to carry Boltzmann distributions and cooling schedules over to genetic algorithms. It maintains diversity in a population by sampling in such a way that the samples become, over time, Boltzmann distributed; genetic drift works against this selection by limiting the amount of diversity it can maintain. A population of n individuals is created using Boltzmann tournament selection in the following manner. In each generation, a three-way competition, or tournament, is held for each population slot:

1. The first individual is chosen uniformly at random.
2. The second individual is chosen with an objective function value that differs from the first by a threshold amount x.

3. There are two choices for the third individual:
   a) Strict choice: the third individual is chosen with an objective function value that differs from both the first and the second individual by the threshold amount.
   b) Relaxed choice: the third individual is chosen with an objective function value that differs by the threshold amount from the first individual alone.

This algorithm successfully creates a steady proportion of individuals distributed according to the Boltzmann distribution. If a uniform random sample is drawn from a Boltzmann-distributed population, it will be biased toward better individuals.

Steady State Selection - Genitor

Genitor works individual by individual, choosing an offspring for birth according to linear ranking and choosing the currently worst individual for replacement [Whitley 1989]. In steady state selection, only a few individuals are replaced in each generation: usually a small number of the least fit individuals are replaced by offspring resulting from crossover and mutation of the fittest individuals. In this scheme, the worst λ individuals of the population are replaced.

Elitism

Elitism copies the best chromosome, or the few best chromosomes, to the new population; the rest of the population is created by the normal method [De Jong 1975]. Elitism accumulates the best individuals and prevents them from being lost or destroyed, which significantly improves the genetic algorithm's performance. Elitism is carried out in conjunction with age-based and stochastic fitness-based replacement schemes.

6.5 Other Selection Methods

Sigma Scaling: Sigma scaling is also called "sigma truncation" [Forrest 1985]. It keeps the selection pressure relatively constant over the course of the run, rather than letting it depend on the fitness variance in the population.

Under sigma scaling, an individual's expected value is a function of its fitness, the population mean and the population standard deviation. An example of sigma scaling is:

ExpVal(x_i, t) = 1 + (f(x_i) − f̄(t)) / (2σ(t)) if σ(t) ≠ 0, and 1.0 if σ(t) = 0 (Eq. xxiii)

where ExpVal(x_i, t) is the expected value of individual x_i at time t, f(x_i) is the fitness of x_i, f̄(t) is the mean fitness of the population at time t, and σ(t) is the standard deviation of the population fitnesses at time t. If ExpVal(x_i, t) was less than 0, Tanese arbitrarily reset it to 0.1, so that individuals with very low fitness had some small chance of reproducing [mit98].

Fine-grained Tournament Selection (FGTS): FGTS is an improvement of tournament selection. Plain tournament selection randomly chooses a set of individuals and picks out the best for reproduction [Filipovic 1998]; the number of individuals in the set is usually two, but larger tournament sizes can be used to increase the selection pressure. FGTS is controlled by a real-valued parameter F_tour, the desired average tournament size, instead of the integer parameter N_tour, the tournament size. As in tournament selection, an individual is chosen if it is the best individual in its tournament; unlike tournament selection, however, the tournament size is not unique over the whole population, i.e. tournaments with different numbers of competitors can be held within one step of the selection. The parameter F_tour governs the selection procedure, so the average tournament size in the population should be as close to F_tour as possible.

Stochastic Tournament Selection: Stochastic tournament selection was stated by Wetzel. In this method, selection probabilities are calculated normally and successive pairs of chromosomes are drawn using roulette wheel selection [Wetzel 1983].
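The tournament-style schemes of section 6.4.3 and above all share the same core loop; a minimal sketch of basic tournament selection in Python (function name illustrative):

```python
import random

def tournament_select(population, fitnesses, t, pool_size, rng=random):
    """Fill a mating pool by repeated size-t tournaments: sample t
    individuals at random (with replacement) and keep the fittest."""
    n = len(population)
    pool = []
    for _ in range(pool_size):
        contestants = [rng.randrange(n) for _ in range(t)]
        winner = max(contestants, key=lambda i: fitnesses[i])
        pool.append(population[winner])
    return pool
```

Raising t raises the selection pressure: with t equal to the population size the scheme degenerates into always picking the best individual, while t = 2 gives the mild pressure of binary tournament selection.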
6.6 Annealed Selection: Proposed Research Work

For most optimisation problems it has been observed, and stated in the literature, that there is no single cure for all ills: sometimes exploration techniques outsmart exploitation techniques, and vice versa. Influenced by these observations, a number of

selection operators have been proposed, but they were inclined either towards exploitation or towards exploration. Requirements generally dictate that exploration is better at the beginning of the evolution cycle and exploitation at the end. The focus of this research is to blend selection operators into a new selection operator that obtains a proper fusion of exploration and exploitation. The research takes roulette wheel selection as the exploitative selection and rank selection as the exploratory selection; the proposed blended selection operator is coined Annealed Selection. The name of the proposed operator is based on the phenomenon of simulated annealing. Annealing is a heat treatment of a material with the goal of altering its properties, such as hardness. When annealing metal, the initial temperature must not be too low and the cooling must be done sufficiently slowly, so as to avoid the system getting stuck in a meta-stable, non-crystalline state representing a local minimum of energy [Weise 2007]. Simulated annealing uses a simple cooling schedule that lowers the temperature slowly, with no abrupt changes. Similarly, the proposed annealed selection gradually switches the selection criterion from exploration to exploitation so as to obtain a proper blend of the two techniques. Blending the two contrasting techniques leads a new path towards the optimal solution and is hence expected to be better, in terms of quality, than the existing selection operators. The proposed annealed selection is fitness dependent, like roulette wheel selection. In this method, the fitness of each individual is computed according to both rank and roulette wheel selection; depending on the current generation number of the genetic algorithm, the selection pressure is changed and a new fitness contribution X is computed for each individual. The selection probability of each individual is computed on the basis of X.
The fitness X is computed by combining the rank-selection and roulette-wheel fitnesses in such a way that the fitness of an individual gradually moves from rank selection to roulette wheel selection, thus implementing the gradual move from exploration to exploitation. This gradual movement is achieved by multiplying the fitness under each scheme by a factor that depends on the current generation number. As the generation changes, the fitness contribution changes, and so does the selection probability of each individual. The behaviour of the proposed annealed selection operator can easily be modified as required by changing the pressure.

The proposed annealed selection operator computes the fitness of an individual, depending on the current generation number, as follows:

FRW_i = F_i / Σ_{i=1}^{N} F_i (Eq. xxiv)

Frank_i = rank_i / ranksum (Eq. xxv)

The annealed fitness, denoted FX, is computed as:

FX_i = Frank_i · ra + FRW_i · rb (Eq. xxvi)

where ra is a factor with initial value 1 that decreases by 1/ngen in each generation, rb is a factor with initial value 0 that increases by 1/ngen in each generation, ngen is the total number of generations, i runs from 1 to the number of chromosomes in the population, and F_i is the fitness of the ith individual. The probability of selecting the ith string is

PX_i = FX_i / Σ_{i=1}^{N} FX_i (Eq. xxvii)

Algorithm of Annealed Selection
1. Set l=1, j=1, i=nogen
2. While l <= mpool
   Begin
   a) While j <= n
      Begin
        Compute FX_{i,j}
      End
   b) Set j=1, S=0
   c) While j <= n
      Begin
        Compute S = S + FX_{i,j}
      End
   d) Generate random number r from interval (0, S)
   e) Set j=1
   f) While j <= n
      Begin
        Calculate c_j = c_{j-1} + FX_{i,j}
        If r <= c_j, select individual j
      End
   g) l = l + 1
   End
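The fitness blending of Eq. xxiv to xxvi can be sketched in Python (a simplified sketch; the function name is illustrative, and ra/rb are implemented directly as linear functions of the generation number):

```python
def annealed_fitness(fitnesses, generation, ngen):
    """Blend of rank-based and proportionate fitness (Eq. xxiv to xxvi):
    ra starts at 1 and falls by 1/ngen each generation (exploration fades),
    rb starts at 0 and rises by 1/ngen each generation (exploitation grows)."""
    n = len(fitnesses)
    total = sum(fitnesses)
    frw = [f / total for f in fitnesses]                    # Eq. xxiv
    order = sorted(range(n), key=lambda i: fitnesses[i])    # rank 1 = worst
    ranks = [0] * n
    for rank, idx in enumerate(order, start=1):
        ranks[idx] = rank
    rank_sum = n * (n + 1) / 2
    frank = [ranks[i] / rank_sum for i in range(n)]         # Eq. xxv
    ra = 1.0 - generation / ngen
    rb = generation / ngen
    return [frank[i] * ra + frw[i] * rb for i in range(n)]  # Eq. xxvi
```

In generation 0 the result equals the pure rank fitness (exploration); in the final generation it equals the pure proportionate fitness (exploitation); in between it interpolates linearly.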

6.7 Analysis of Fitness based on Annealed Selection Method

[Table: fitness of chromosomes A, B, C and D in successive generations, with pie charts showing the contribution of each chromosome to the selection probability in each generation.]

The fitness of a chromosome (individual) changes in each generation, and so does its probability of being selected. In the initial generations the selection pressure is low, so the search space is explored; as the generations pass, the selection pressure increases, moving the search from exploration to exploitation. It is evident from the pie charts that the selection pressure moves gradually from exploration to exploitation. Further experimentation found that the proposed annealed selection has features of both exploration and exploitation, i.e. it can be said to be a blend of the two techniques.

6.8 Implementation of Annealed Selection

The performance of the proposed annealed selection was initially tested on the Travelling Salesman Problem for four problem sizes, 10, 20, 50 and 100 cities, using MATLAB code [Kumar et al. 2011]. The code assessed the performance of the genetic algorithm using roulette wheel selection, rank selection and annealed selection with the same initial population. Apart from the selection criterion, all other factors affecting the performance

20 of genetic algorithm were kept constant. The code was run for 100 generations in each case. Average and minimum fitness in each generation was computed over 100 generations and plotted to compare the performance of three approaches. Figure 6.3 depicts the comparison of average tour length and Figure 6.4 depicts the comparison of minimum tour length in three different selection methods. Table 6.1 lists the detailed data for four different problem sizes and various parameters to analyse performance of the three methods. Comparison of average and minimum tour length is presented in Figure 6.5 and Figure 6.6 respectively. Figure 6.3 Comparison of Average Tour Length for Annealed Selection Figure 6.4 Comparison of Minimum Tour Length for Annealed Selection 113
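The gradual rise in selection pressure described above can be illustrated numerically. The snippet below uses hypothetical fitness values and the assumed schedule ra = 1 − (g − 1)/ngen, rb = (g − 1)/ngen to show how the selection probability of the best chromosome grows as generations pass, mirroring the pie charts of Section 6.7.

```python
# Hypothetical fitness values for four chromosomes; rank 4 = best.
fitness = [1.0, 2.0, 3.0, 10.0]
ranks = [1, 2, 3, 4]
ngen = 100

def p_best(gen):
    """Selection probability of the fittest chromosome at generation `gen`."""
    ra = 1.0 - (gen - 1) / ngen          # rank weight, shrinks over time
    rb = (gen - 1) / ngen                # roulette weight, grows over time
    fx = [r / sum(ranks) * ra + f / sum(fitness) * rb
          for f, r in zip(fitness, ranks)]
    return fx[-1] / sum(fx)

for g in (1, 25, 50, 75, 100):
    print(f"generation {g:3d}: P(best) = {p_best(g):.3f}")
```

Under these numbers P(best) rises monotonically from 0.40 (pure rank pressure at generation 1) towards roughly 0.62 (roulette pressure on the skewed fitness values), i.e. from exploration towards exploitation.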

Table 6.1 Comparison of Three Selection Approaches for TSP
[For each problem size (PS, in cities: 10, 20, 50 and 100), the table reports F_best (Min), the average fitness and the convergence number under Rank Selection (RS), Roulette Wheel Selection (RWS) and the Proposed Annealed Selection (PBS).]

Figure 6.5 Comparison of Average Fitness on Different Problem Sizes for Annealed Selection
Figure 6.6 Comparison of Minimum Fitness on Different Problem Sizes for Annealed Selection

Later the code was also tested on the set of five benchmark functions stated in Chapter 2. The average and minimum value of fn(x) in each generation were computed over 50 and 100 generations and plotted to compare the performance of the three approaches. The comparison of the selection techniques is based on their respective performance, estimated as the value of the function. The average and minimum value of each function evaluation were recorded and examined for further analysis [Kumar et al. 2012d]. The following parameters were used in this implementation:

- Population size (N): 3, 5, 10 and 20
- Number of generations (ngen): 50 and 100
- Population selection method: Roulette Wheel Selection (RWS), Rank Selection (RS) and Annealed Selection (AS)
- Crossover operator: simple arithmetic crossover
- Mutation: inversion with mutation probability 5%
- Algorithm ending criterion: execution stops on reaching ngen generations
- Fitness function: objective value of the function

Table 6.2: Average and Minimum Value of fn(x) in F1
[For N = 3, 5, 10 and 20, over 50 and 100 generations, the table reports the minimum and average value of fn(x) under Roulette Wheel Selection (RWS), Rank Selection (RS) and the Proposed Annealed Selection (AS).]
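A compact end-to-end sketch of such an experiment is given below. It assumes F1 is the sphere function (a common first benchmark; the thesis defines its functions in Chapter 2), uses real-valued individuals, the assumed ra/rb schedule for annealed selection, simple arithmetic crossover and 5% inversion mutation; all names and parameter defaults are illustrative.

```python
import random

def sphere(x):
    """Assumed F1: minimise the sum of squares."""
    return sum(v * v for v in x)

def run_ga(n=20, ngen=100, dim=5, pm=0.05, seed=1):
    """Run one GA experiment; return (average, minimum) fn(x) per generation."""
    random.seed(seed)
    pop = [[random.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n)]
    history = []
    for g in range(1, ngen + 1):
        vals = [sphere(ind) for ind in pop]
        history.append((sum(vals) / n, min(vals)))
        # Minimisation: convert objective values to positive fitness.
        fit = [1.0 / (1.0 + v) for v in vals]
        # Rank 1 = worst ... n = best.
        order = sorted(range(n), key=lambda i: fit[i])
        ranks = [0] * n
        for r, i in enumerate(order, start=1):
            ranks[i] = r
        # Annealed blend of rank and roulette weights (assumed schedule).
        ra = 1.0 - (g - 1) / ngen
        rb = (g - 1) / ngen
        fx = [ranks[i] / sum(ranks) * ra + fit[i] / sum(fit) * rb
              for i in range(n)]
        def pick():
            r_, c = random.uniform(0, sum(fx)), 0.0
            for i, w in enumerate(fx):
                c += w
                if r_ <= c:
                    return pop[i]
            return pop[-1]
        nxt = []
        while len(nxt) < n:
            a, b = pick(), pick()
            w = random.random()                      # simple arithmetic crossover
            child = [w * x + (1.0 - w) * y for x, y in zip(a, b)]
            if random.random() < pm:                 # inversion mutation: reverse a slice
                i, j = sorted(random.sample(range(dim), 2))
                child[i:j + 1] = child[i:j + 1][::-1]
            nxt.append(child)
        pop = nxt
    return history
```

Each entry of `history` holds the average and minimum objective value for one generation, the quantities tabulated in Tables 6.2 to 6.6.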

Figure 6.7: Comparison of Average Value of fn(x) in F1
Figure 6.8: Comparison of Minimum Value of fn(x) in F1

Table 6.3: Average and Minimum Value of fn(x) in F2
[For N = 3, 5, 10 and 20, over 50 and 100 generations, the table reports the minimum and average value of fn(x) under Roulette Wheel Selection (RWS), Rank Selection (RS) and the Proposed Annealed Selection (AS).]

Figure 6.9: Comparison of Average Value of fn(x) in F2

Figure 6.10: Comparison of Minimum Value of fn(x) in F2

Table 6.4: Average and Minimum Value of fn(x) in F3
[For N = 3, 5, 10 and 20, over 50 and 100 generations, the table reports the minimum and average value of fn(x) under Roulette Wheel Selection (RWS), Rank Selection (RS) and the Proposed Annealed Selection (AS).]

Figure 6.11: Comparison of Average Value of fn(x) in F3
Figure 6.12: Comparison of Minimum Value of fn(x) in F3

Table 6.5: Average and Minimum Value of fn(x) in F4
[For N = 3, 5, 10 and 20, over 50 and 100 generations, the table reports the minimum and average value of fn(x) under Roulette Wheel Selection (RWS), Rank Selection (RS) and the Proposed Annealed Selection (AS).]

Figure 6.13: Comparison of Average Value of fn(x) in F4

Figure 6.14: Comparison of Minimum Value of fn(x) in F4

Table 6.6: Average and Minimum Value of fn(x) in F5
[For N = 3, 5, 10 and 20, over 50 and 100 generations, the table reports the minimum and average value of fn(x) under Roulette Wheel Selection (RWS), Rank Selection (RS) and the Proposed Annealed Selection (AS).]

Figure 6.15: Comparison of Average Value of fn(x) in F5
Figure 6.16: Comparison of Minimum Value of fn(x) in F5

6.9 Observations and Findings

To study the effect of the existing selection operators and the proposed annealed selection operator on the performance of the genetic algorithm, the implementation was carried out by keeping the initial population, the crossover type and its probability, and the mutation type and rate constant in all cases. Test runs were carried out for 50 as well as 100 generations and for four different population sizes. The results for the five benchmark functions, in terms of average and minimum value, are summarized in Tables 6.2 to 6.6. Figures 6.7 to 6.16 show the performance curves of the three selection operators in terms of the average and minimum value of the fitness function for population size 20 and 100 generations. In most cases the proposed annealed selection outperformed roulette wheel selection and rank selection. It can be clearly seen that in the early generations the proposed annealed selection explores the search space, and as the number of generations increases the selection pressure rises and annealed selection shifts to an exploitation mechanism in selecting individuals. In the early generations the behaviour of annealed selection resembles rank selection, and it gradually transforms towards roulette wheel selection and elitism; this supports the view that annealed selection is a blend of exploration and exploitation. Test runs were conducted on the Travelling Salesman Problem as well as on five different benchmark functions, and the results favour annealed selection.

6.10 Summary

The primary objective of the selection operator is to emphasize better solutions in a population. The identification of good or bad solutions in a population is usually accomplished according to a solution's fitness.
Selection techniques are based either on the fitness of an individual or on the ranking of the individuals in a population. Numerous selection techniques have been identified and used in the past. This chapter gave a systematic overview of existing approaches to selection and then discussed the selection operator proposed in this research work. The proposed annealed selection operator is a step ahead of the existing operators, as it accommodates both the exploration and the exploitation approach.

The proposed annealed selection operator was tested on the TSP as well as on benchmark functions, with promising results. The tests affirmed that in the early generations the proposed annealed selection operator explored the search space, as the selection pressure was low. As the generations pass, the selection pressure increases, transforming the exploratory nature of the selection into an exploitative one. The proposed annealed selection neither wanders the search space indefinitely nor results in premature convergence. The results for the proposed annealed selection operator are quite satisfactory, and the operator is used in conjunction with other genetic operators in the forthcoming chapters to substantiate its relevance.


Feature Selection for Predictive Modelling - a Needle in a Haystack Problem Paper AB07 Feature Selection for Predictive Modelling - a Needle in a Haystack Problem Munshi Imran Hossain, Cytel Statistical Software & Services Pvt. Ltd., Pune, India Sudipta Basu, Cytel Statistical

More information

Implementation of CSP Cross Over in Solving Travelling Salesman Problem Using Genetic Algorithms

Implementation of CSP Cross Over in Solving Travelling Salesman Problem Using Genetic Algorithms Implementation of CSP Cross Over in Solving Travelling Salesman Problem Using Genetic Algorithms Karishma Mendiratta #1, Ankush Goyal *2 #1 M.Tech. Scholar, *2 Assistant Professor, Department of Computer

More information

Comparison of some fine-grained and coarse-grained parallel genetic algorithms with population reinitialization

Comparison of some fine-grained and coarse-grained parallel genetic algorithms with population reinitialization 1 Portál pre odborné publikovanie ISSN 1338-0087 Comparison of some fine-grained and coarse-grained parallel genetic algorithms with population reinitialization Oravec Michal Elektrotechnika, Študentské

More information

Genetic algorithms. History

Genetic algorithms. History Genetic algorithms History Idea of evolutionary computing was introduced in the 1960s by I. Rechenberg in his work "Evolution strategies" (Evolutionsstrategie in original). His idea was then developed

More information

From Genetics to Genetic Algorithms

From Genetics to Genetic Algorithms From Genetics to Genetic Algorithms Solution to Optimisation Problems Using Natural Systems Jitendra R Raol and Abhijit Jalisatgi Genetic algorithms are search procedures inspired by natural selection

More information

The two-stage recombination operator and its application to the multiobjective 0/1 knapsack problem: a comparative study

The two-stage recombination operator and its application to the multiobjective 0/1 knapsack problem: a comparative study The two-stage recombination operator and its application to the multiobjective 0/1 knapsack problem: a comparative study Brahim AGHEZZAF and Mohamed NAIMI Laboratoire d Informatique et d Aide à la Décision

More information

Using Multi-chromosomes to Solve. Hans J. Pierrot and Robert Hinterding. Victoria University of Technology

Using Multi-chromosomes to Solve. Hans J. Pierrot and Robert Hinterding. Victoria University of Technology Using Multi-chromosomes to Solve a Simple Mixed Integer Problem Hans J. Pierrot and Robert Hinterding Department of Computer and Mathematical Sciences Victoria University of Technology PO Box 14428 MCMC

More information

Artificial Evolution. FIT3094 AI, A-Life and Virtual Environments Alan Dorin

Artificial Evolution. FIT3094 AI, A-Life and Virtual Environments Alan Dorin Artificial Evolution FIT3094 AI, A-Life and Virtual Environments Alan Dorin Copyrighted imagery used in the preparation of these lecture notes remains the property of the credited owners and is included

More information

Introduction Evolutionary Algorithm Implementation

Introduction Evolutionary Algorithm Implementation Introduction Traditional optimization methods fail when there are complex, nonlinear relationships between the parameters and the value to be optimized, the goal function has many local extrema, and resources

More information

Genetic Algorithm. Presented by Shi Yong Feb. 1, 2007 Music McGill University

Genetic Algorithm. Presented by Shi Yong Feb. 1, 2007 Music McGill University Genetic Algorithm Presented by Shi Yong Feb. 1, 2007 Music Tech @ McGill University Outline Background: Biological Genetics & GA Two Examples Some Applications Online Demos* (if the time allows) Introduction

More information

An introduction to evolutionary computation

An introduction to evolutionary computation An introduction to evolutionary computation Andrea Roli andrea.roli@unibo.it Dept. of Computer Science and Engineering (DISI) Campus of Cesena Alma Mater Studiorum Università di Bologna Outline 1 Basic

More information

Recessive Trait Cross Over Approach of GAs Population Inheritance for Evolutionary Optimisation

Recessive Trait Cross Over Approach of GAs Population Inheritance for Evolutionary Optimisation Recessive Trait Cross Over Approach of GAs Population Inheritance for Evolutionary Optimisation Amr Madkour, Alamgir Hossain, and Keshav Dahal Modeling Optimization Scheduling And Intelligent Control (MOSAIC)

More information

Validity Constraints and the TSP GeneRepair of Genetic Algorithms

Validity Constraints and the TSP GeneRepair of Genetic Algorithms Validity Constraints and the TSP GeneRepair of Genetic Algorithms George G. Mitchell Department of Computer Science National University of Ireland, Maynooth Ireland georgem@cs.nuim.ie Abstract In this

More information

Genetic Algorithms and Shape Grammars

Genetic Algorithms and Shape Grammars Genetic Algorithms and Shape Grammars Technical report Author Manuela Ruiz Montiel Date October 18, 2011 Version 1.1 1 Contents 1. Introduction... 3 2. Genetic algorithm... 4 3. Genotype... 7 4. Experiments...

More information

Economic Design of Control Chart

Economic Design of Control Chart A Project Report on Economic Design of Control Chart In partial fulfillment of the requirements of Bachelor of Technology (Mechanical Engineering) Submitted By Debabrata Patel (Roll No.10503031) Session:

More information

Genetic Algorithm for Supply Planning Optimization under Uncertain Demand

Genetic Algorithm for Supply Planning Optimization under Uncertain Demand Genetic Algorithm for Supply Planning Optimization under Uncertain Demand Tezuka Masaru and Hiji Masahiro Hitachi Tohoku Software, Ltd. 2-16-10, Honcho, Aoba ward, Sendai City, 980-0014, Japan {tezuka,hiji}@hitachi-to.co.jp

More information

APPLICATION OF COMPUTER FOR ANALYZING WORLD CO2 EMISSION

APPLICATION OF COMPUTER FOR ANALYZING WORLD CO2 EMISSION APPLICATION OF COMPUTER FOR ANALYZING WORLD CO2 EMISSION M. Kavoosi 1, B.shafiee 2 1 Department of Computer Engineering, Izeh Branch, Islamic Azad University, Izeh, Iran 1 E-mail address: Hakavoosi@yahoo.com

More information

A Genetic Algorithm on Inventory Routing Problem

A Genetic Algorithm on Inventory Routing Problem A Genetic Algorithm on Inventory Routing Problem Artvin Çoruh University e-mail: nevin.aydin@gmail.com Volume 3 No 3 (2014) ISSN 2158-8708 (online) DOI 10.5195/emaj.2014.31 http://emaj.pitt.edu Abstract

More information

An Evolutionary Solution to a Multi-objective Scheduling Problem

An Evolutionary Solution to a Multi-objective Scheduling Problem , June 30 - July 2,, London, U.K. An Evolutionary Solution to a Multi-objective Scheduling Problem Sumeyye Samur, Serol Bulkan Abstract Multi-objective problems have been attractive for most researchers

More information

TIMETABLING EXPERIMENTS USING GENETIC ALGORITHMS. Liviu Lalescu, Costin Badica

TIMETABLING EXPERIMENTS USING GENETIC ALGORITHMS. Liviu Lalescu, Costin Badica TIMETABLING EXPERIMENTS USING GENETIC ALGORITHMS Liviu Lalescu, Costin Badica University of Craiova, Faculty of Control, Computers and Electronics Software Engineering Department, str.tehnicii, 5, Craiova,

More information

A Genetic Algorithm for Order Picking in Automated Storage and Retrieval Systems with Multiple Stock Locations

A Genetic Algorithm for Order Picking in Automated Storage and Retrieval Systems with Multiple Stock Locations IEMS Vol. 4, No. 2, pp. 36-44, December 25. A Genetic Algorithm for Order Picing in Automated Storage and Retrieval Systems with Multiple Stoc Locations Yaghoub Khojasteh Ghamari Graduate School of Systems

More information

A Gene Based Adaptive Mutation Strategy for Genetic Algorithms

A Gene Based Adaptive Mutation Strategy for Genetic Algorithms A Gene Based Adaptive Mutation Strategy for Genetic Algorithms Sima Uyar, Sanem Sariel, and Gulsen Eryigit Istanbul Technical University, Electrical and Electronics Faculty Department of Computer Engineering,

More information

Improving Differential Evolution Algorithm with Activation Strategy

Improving Differential Evolution Algorithm with Activation Strategy 2009 International Conference on Machine Learning and Computing IPCSIT vol.3 (2011) (2011) IACSIT Press, Singapore Improving Differential Evolution Algorithm with Activation Strategy Zhan-Rong Hsu 1, Wei-Ping

More information

Evolutionary Algorithms and Simulated Annealing in the Topological Configuration of the Spanning Tree

Evolutionary Algorithms and Simulated Annealing in the Topological Configuration of the Spanning Tree Evolutionary Algorithms and Simulated Annealing in the Topological Configuration of the Spanning Tree A. SADEGHEIH Department of Industrial Engineering University of Yazd, P.O.Box: 89195-741 IRAN, YAZD

More information

[Sharma* et al., 5(6): June, 2016] ISSN: IC Value: 3.00 Impact Factor: 4.116

[Sharma* et al., 5(6): June, 2016] ISSN: IC Value: 3.00 Impact Factor: 4.116 IJESRT INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY AN APPROACH TO GENERATE TEST CASES AUTOMATICALLY USING GENETIC ALGORITHM Deepika Sharma*, Dr. Sanjay Tyagi * Research Scholar,

More information

Ant Colony Optimization

Ant Colony Optimization Ant Colony Optimization Part 4: Algorithms Fall 2009 Instructor: Dr. Masoud Yaghini Ant Colony Optimization: Part 4 Outline The Traveling Salesman Problem ACO Algorithms for TSP Ant System (AS) Elitist

More information

Comp215: Genetic Algorithms - Part 1

Comp215: Genetic Algorithms - Part 1 Comp215: Genetic Algorithms - Part 1 Mack Joyner, Dan S. Wallach (Rice University) Copyright 2016, Mack Joyner, Dan S. Wallach. All rights reserved. Darwin s Theory of Evolution Individual organisms differ

More information

Memetic Algorithm with Hybrid Mutation Operator

Memetic Algorithm with Hybrid Mutation Operator Available Online at www.ijcsmc.com International Journal of Computer Science and Mobile Computing A Monthly Journal of Computer Science and Information Technology IJCSMC, Vol. 3, Issue. 1, January 2014,

More information

The Effects of Supermajority on Multi-Parent Crossover

The Effects of Supermajority on Multi-Parent Crossover The Effects of Supermajority on Multi-Parent Crossover Chuan-Kang Ting, Member, IEEE, and Chun-Cheng Chen Abstract Multi-parent crossover allows more than two parents participating in crossover. The increase

More information
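The fitness-proportionate idea described above — every individual has a chance of entering the mating pool, and a fitter individual may be selected more than once — can be sketched as roulette-wheel selection. This is an illustrative sketch, not the author's implementation; the function name and the assumption that all fitness values are positive are ours:

```python
import random

def roulette_wheel_select(population, fitnesses, k):
    """Fitness-proportionate (roulette-wheel) selection.

    Picks k individuals for the mating pool; each individual's
    probability of being chosen on a spin is its fitness divided
    by the total fitness, so fitter individuals tend to appear
    more than once. Assumes all fitness values are positive.
    """
    total = sum(fitnesses)
    pool = []
    for _ in range(k):
        pick = random.uniform(0, total)  # spin the wheel
        cumulative = 0.0
        for individual, fitness in zip(population, fitnesses):
            cumulative += fitness
            if cumulative >= pick:
                pool.append(individual)
                break
    return pool

# Hypothetical example: four individuals with unequal fitness.
population = ["A", "B", "C", "D"]
fitnesses = [10.0, 5.0, 3.0, 2.0]
mating_pool = roulette_wheel_select(population, fitnesses, 4)
```

Because selection is done with replacement, the fittest individual ("A" here, holding half the total fitness) will usually occupy multiple slots in the mating pool, which is exactly the selection pressure the chapter describes.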