Single Parent Genetic Programming


Wendy Ashlock, Roseheart Biomaths, 41 Lambert Crescent, Guelph, Ontario N1G 2R4, Canada
Dan Ashlock, Mathematics and Statistics, University of Guelph, Guelph, Ontario N1G 2W1, Canada

ABSTRACT

The most controversial part of genetic programming is its highly disruptive and potentially innovative subtree crossover operator. The clearest problem with the crossover operator is its potential to induce defensive metaselection for large parse trees, a process usually termed bloat. Single parent genetic programming is a form of genetic programming in which bloat is reduced by doing subtree crossover with a fixed population of ancestor trees. Analysis of mean tree size growth demonstrates that this fixed and limited set of crossover partners provides implicit, automatic control of tree size in the evolving population, reducing the need for additional, disruptive trimming of large trees. The choice of ancestor trees can also incorporate expert knowledge into the genetic programming system. The system is tested on four problems: plus-one-recall-store (PORS), odd parity, plus-times-half (PTH), and a bioinformatic model fitting problem (NIPs). The effectiveness of the technique varies with the problem and the choice of ancestor set. At the extremes, improvements in time to solution in excess of 4700-fold were observed for the PORS problem, while no significant improvement was observed for the PTH problem.

I. INTRODUCTION

This study presents a new technique for use with genetic programming. It was inspired by a science fiction novel [7] about an all-male society in which children had only a father, and the female contribution came from artificial womb technology and egg cultures donated when the society was founded. In single parent genetic programming, children have only one parent, and crossover is done with a fixed set of unchanging ancestors.
This technique simultaneously limits parse tree growth in the evolving population (since the ancestors don't grow) and provides a means of embedding either expert knowledge or the results of previous evolutionary runs into an evolving population. Selection still causes the parse trees to grow, but the single parent technique slows that growth significantly. Since the ancestor set doesn't change, no information in it is ever lost, which can help keep the algorithm from getting stuck and can speed time to solution. Four test problems are treated in this study: plus-one-recall-store (PORS), odd parity, plus-times-half (PTH), and a bioinformatic model fitting problem (NIPs).

A. Subtree crossover and bloat

Genetic programming [14], [11], [12] is a type of evolutionary computation that uses a variable-sized representation, typically in the form of a parse tree representing a mathematical formula. The binary variation operator most commonly used in genetic programming is subtree crossover. During selection and reproduction, a pair of parent trees is picked by the selection algorithm and copied. A subtree of each parent is selected at random from the set of all its subtrees, and the subtrees rooted at the chosen nodes are exchanged in the copies. The number of possible outcomes is the product of the numbers of nodes in the two trees. The sizes of the new trees range from one node to one less than the total number of nodes in the two starting trees. The upper bound on the size of trees in a population grows exponentially, and in many experiments trees approach that upper bound [18]. In practice, this means that the average size of trees in a population grows sufficiently that the total size or depth of trees in a population must be controlled to keep the computer's memory from being overrun and to ensure that the program terminates in a reasonable amount of time [8]. Any size control measure is, itself, a new source of disruption in reproduction.
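As an illustration of the mechanics described above (this is our own sketch, not the authors' code), subtree crossover on parse trees encoded as nested tuples can be written as follows; the tuple encoding and helper names are our own assumptions:

```python
import random

def size(t):
    """Node count of a tree encoded as (op, *children) tuples; leaves are atoms."""
    return 1 + sum(size(c) for c in t[1:]) if isinstance(t, tuple) else 1

def subtree_at(t, i):
    """Subtree rooted at preorder index i (index 0 is the whole tree)."""
    if i == 0:
        return t
    i -= 1
    for c in t[1:]:
        if i < size(c):
            return subtree_at(c, i)
        i -= size(c)
    raise IndexError("subtree index out of range")

def with_subtree(t, i, new):
    """Copy of t with the subtree at preorder index i replaced by new."""
    if i == 0:
        return new
    i -= 1
    children = []
    for c in t[1:]:
        children.append(with_subtree(c, i, new) if 0 <= i < size(c) else c)
        i -= size(c)
    return (t[0],) + tuple(children)

def subtree_crossover(a, b, rng=random):
    """Standard two-child subtree crossover: pick a random node in each
    parent and exchange the subtrees rooted there."""
    i, j = rng.randrange(size(a)), rng.randrange(size(b))
    sa, sb = subtree_at(a, i), subtree_at(b, j)
    return with_subtree(a, i, sb), with_subtree(b, j, sa)
```

Note that the operator conserves the total node count of the pair, while the size of either individual child can range from one node up to one less than that total, as the text observes.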
Bloat [6] is characterized by the growth of parse trees in a population to near whatever bound is placed on their size. Bloated trees typically contain a large amount of material that does not contribute directly to their fitness. Bloat is a result of the shape of the underlying search space (there are more long solutions than short ones) and of the need to shield against destructive crossover [15]. A tree might, for example, have the functionally equivalent form: good solution + 0 * ineffective material. Subtree crossover within the ineffective material does not change the solution's fitness. Bloat has at least two bad side effects. First, it ensures that computer memory, and the time spent managing it, is wasted to an extent close to the maximum possible within the bounds set by the programmer. Second, it is a means whereby the evolving population manages to reduce the power of subtree crossover as a search operator [16], [6]. The ineffective material causing bloat may retain useful chunks of material that can be restored to function via crossover, and so bloat need not be all bad [17], but on average it hampers search. In addition, if the goal of a given exercise in genetic programming is to understand solutions to a problem, then bloat substantially complicates analysis.

B. Single parents and gentle size control

The presence of bloat indicates selection pressure for larger trees. In order to generate large trees, the subtrees participating in crossover must be large at least some of the time. In single parent genetic programming, every tree placed back in the population is the result of crossover with a tree of unchanging size. Thus, the size increase of the trees is placed under far tighter control than in a standard genetic programming environment. In Section II, we show that the upper bound on tree size changes from exponential to linear in this case. In single parent genetic programming, a standard genetic programming environment is modified as follows. A collection of trees called the ancestor set is chosen. This ancestor set may be selected from past evolutionary runs or may be designed by the user. The genetic programming environment proceeds as usual, except that any subtree crossover is between a member of the population and an ancestor. Selection of the population member proceeds as usual, and the ancestor is chosen at random from the ancestor set. Crossover produces only one tree: the population member with one of its subtrees replaced by a subtree of the ancestor. This results in a variation operator which is not quite crossover and not quite mutation; it uses information from two individuals, but only one is a member of the population. Notice that any material available in the ancestors is available indefinitely. This means that the system is, by construction, unable to lose any operation, terminal, or building block present in the ancestor set.
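The one-child operator just described can be sketched as follows (our own illustration; the tuple encoding and helper names are assumptions, not the authors' code):

```python
import random

def size(t):
    """Node count of a tree encoded as (op, *children) tuples; leaves are atoms."""
    return 1 + sum(size(c) for c in t[1:]) if isinstance(t, tuple) else 1

def subtree_at(t, i):
    """Subtree rooted at preorder index i (index 0 is the whole tree)."""
    if i == 0:
        return t
    i -= 1
    for c in t[1:]:
        if i < size(c):
            return subtree_at(c, i)
        i -= size(c)

def with_subtree(t, i, new):
    """Copy of t with the subtree at preorder index i replaced by new."""
    if i == 0:
        return new
    i -= 1
    children = []
    for c in t[1:]:
        children.append(with_subtree(c, i, new) if 0 <= i < size(c) else c)
        i -= size(c)
    return (t[0],) + tuple(children)

def single_parent_crossover(parent, ancestors, rng=random):
    """Single parent crossover: produce ONE child by replacing a random
    subtree of the population member with a random subtree of a randomly
    chosen ancestor. The ancestors themselves never change."""
    anc = rng.choice(ancestors)
    i = rng.randrange(size(parent))
    j = rng.randrange(size(anc))
    return with_subtree(parent, i, subtree_at(anc, j))
```

Because the inserted material comes from a fixed ancestor of size at most z, each mating event can grow the child by at most z - 1 nodes, which is the source of the linear bound proved in Section II.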
Single parent genetic programming is reminiscent of Angeline and Pollack's module acquisition [1] but is substantially simpler.

II. THE THEORY OF SINGLE PARENT SIZE CONTROL

Suppose we are using a steady-state evolutionary algorithm to do genetic programming. Pairs of parents are selected, offspring are generated, and those offspring are placed back in the population before the next set of parents is selected. Each such selection and replacement is called a mating event.

Theorem 1: In standard genetic programming without any form of tree trimming, the maximum possible size of a tree is an exponential function of the number of mating events. In single parent genetic programming, this bound is linear.

Proof: Suppose we start with a population of trees in which the largest tree has n nodes. This tree can produce another tree the same size or larger in one mating event. As a result, we may assume two trees of size n are possible after one mating event. In the next generation, these two trees can produce a tree of size 2n − 1 via subtree crossover in which the root of one tree is exchanged with a leaf of the other. This means that a sequence f(t) of tree sizes can be generated that obeys the recursion f(t + 2) = 2f(t) − 1. Solving this recursion, we obtain the formula f(t) = A(√2)^t + 1. Thus, the maximum possible size of a tree after t mating events must be at least as large as this exponential function: A(√2)^t + 1 ≤ maximum tree size. The greatest possible increase in the size of a tree comes from taking the largest two trees, of sizes n and m ≤ n, and crossing the root of one with a leaf of the other. Thus the maximum increase in size must be less than a doubling. This yields an upper bound of g(t) = 2^t on maximum tree size: maximum tree size ≤ 2^t. Therefore, the maximum possible tree size in standard genetic programming is an exponential function of the number of mating events t, bounded between the functions f(t) and g(t):

A(√2)^t + 1 ≤ maximum tree size ≤ 2^t.
In single parent genetic programming, trees are crossed over with ancestors. As a result, the maximum size increase comes from crossing the root of an ancestor into a leaf of the single parent. This size increase is one less than the size z of the largest ancestor. Tree size is thus bounded above by h(t) = n + t(z − 1), where n is the size of the largest tree present in the initial population:

maximum tree size ≤ n + t(z − 1).

Therefore, the growth of trees in single parent genetic programming is at most linear.

III. EXPERIMENTS

Single parent genetic programming was compared to standard genetic programming on four problems: the plus-one-recall-store (PORS) problem [3], odd parity [14], plus-times-half (PTH), and a bioinformatic model-fitting problem [9]. To keep things simple and to concentrate on the impact of the single parent technique, standard parse trees were used without any other techniques known to be useful, such as automatically defined functions (ADFs) [14]. Populations which failed to find solutions after a fixed number of mating events were said to time out, and their results were not included in the averages. This means that results that include many timeouts are underestimates of the actual mean time to solution.
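The two bounds in Theorem 1 can be made concrete with a quick computation (our own sketch; function names are ours):

```python
def crossover_growth(n, t):
    """Achievable tree size after t mating events under standard subtree
    crossover, using root-into-leaf crossover every two events:
    f(t + 2) = 2 f(t) - 1 with f(0) = n (exponential growth)."""
    s = n
    for _ in range(t // 2):
        s = 2 * s - 1
    return s

def single_parent_cap(n, z, t):
    """Upper bound under single parent crossover: each mating event adds at
    most z - 1 nodes, where z is the size of the largest ancestor (linear)."""
    return n + t * (z - 1)
```

Starting from trees of 10 nodes with a largest ancestor of 10 nodes, 20 mating events allow trees of up to 9217 nodes under standard crossover but at most 190 under the single parent cap, illustrating the exponential/linear gap.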

TABLE I. Ratios of mean time to solution for the baseline PORS runs and the single parent PORS runs for n = 14, 15, ..., 19. For n ≥ 17 these are underestimates of the ratio because some baseline runs time out at 10,000,000 mating events.

Fig. 1. 95% confidence intervals for time to solution for the baseline PORS problem for n = 14, 15, and 16, together with the results of using single parent genetic programming with the ancestor set A = {(+ (Sto (+ (Sto (+ (Sto (+ 1 1)) Rcl)) Rcl)) Rcl)} for n = 14, 15, ..., 25. Baseline runs are labeled Base n; single parent runs are labeled A4 n.

A. Plus-one-recall-store

To test the ability of single parent genetic programming to embed expert information in the population, we chose a simple test problem for which solutions are known and for which the relationships between the solutions for one value of n and the solutions for other values of n are known. The plus-one-recall-store (PORS) problem is described in detail in [3]. It is a maximum problem with a small operation set and a calculator-style memory. The goal of the test problem, called the PORS efficient node use problem, is to find parse trees with a fixed maximum number of nodes that generate the largest integer result possible. The language has two operations: integer addition + and a store operation Sto that places its argument in an external memory location and returns the value it stores. It has two terminals: the integer 1 and recall Rcl from the external memory. Standard genetic programming experiments were run for n = 14 to 19 nodes. (The hardest baseline case is n = 18; the easiest is n = 16.) Fitness was the value of the parse tree.

The initial population was composed of randomly generated trees with exactly n nodes. A tree that evaluated to the largest possible number was considered successful (these numbers are computed in [3]). Crossover was performed by the usual subtree exchange [13]. If this produced a tree with more than n nodes, then a subtree of the root node iteratively replaced the tree until it had no more than n nodes. This size control operation is called chopping; it was chosen to avoid the problem other size control methods have of limiting the effects of crossover to nodes far from the root [10]. Both the baseline and single parent versions used chopping. Mutation was performed, for each new tree produced, by replacing a subtree picked uniformly at random with a new random subtree of the same size. All experiments used double tournament selection with tournament size 7. Populations timed out at 10,000,000 mating events, and 400 populations were run for each problem case.

Fig. 2. 95% confidence intervals for time to solution for the PORS problem using single parent genetic programming with the ancestor set A = {(+ (Sto (+ (Sto (+ (Sto (+ 1 1)) Rcl)) Rcl)) Rcl)} for n = 14, 15, ..., 19.

In choosing the ancestor set for the single parent version, knowledge about the character of the problem from [3] was used. The solution to the PORS efficient node use problem varies according to the congruence class (mod 3) of the number of nodes permitted. When n is 0 (mod 3), there is a unique solution made up of building blocks that look like (+ (Sto (+ 1 1)) Rcl). For n = 1 (mod 3) or n = 2 (mod 3) there are multiple solutions which use the (+ (Sto (+ 1 1)) Rcl) building blocks and also building blocks that look like (+ Rcl Rcl). This means that the solution to n=12 contains building blocks needed for all n, and solutions to n=11 contain all the building blocks needed for all solutions for all n. At first it was thought that solutions to n=11 would make the best ancestors, since they had all the building blocks. However, it turned out that the solution to n=12 was an excellent ancestor, because its building blocks are the most common in all the solutions, and crossover and mutation could easily construct the other building blocks. The solution to n = 12 is as follows:

(+ (Sto (+ (Sto (+ (Sto (+ 1 1)) Rcl)) Rcl)) Rcl).
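The PORS semantics and the chopping operation can be made concrete with a small evaluator (our own sketch: trees as nested tuples, the external memory as a one-element list assumed to start at 0):

```python
import random

def size(t):
    """Node count; trees are (op, *children) tuples, terminals are 1 or 'Rcl'."""
    return 1 + sum(size(c) for c in t[1:]) if isinstance(t, tuple) else 1

def pors_eval(t, mem):
    """Evaluate a PORS tree left to right; mem is a one-element list holding
    the calculator-style memory."""
    if t == 1:
        return 1
    if t == "Rcl":
        return mem[0]
    if t[0] == "+":
        return pors_eval(t[1], mem) + pors_eval(t[2], mem)
    # t[0] == "Sto": store the argument's value and return it
    mem[0] = pors_eval(t[1], mem)
    return mem[0]

def chop(t, n, rng=random):
    """Chopping, as described above: while the tree has more than n nodes,
    replace it with a randomly chosen subtree of its root."""
    while isinstance(t, tuple) and size(t) > n:
        t = rng.choice(t[1:])
    return t

# The n = 12 solution quoted above: repeated (+ (Sto ...) Rcl) doubling.
SOL12 = ("+", ("Sto", ("+", ("Sto", ("+", ("Sto", ("+", 1, 1)), "Rcl")), "Rcl")), "Rcl")
```

Evaluating SOL12 doubles the stored value three times (2, 4, 8) and returns 16 with exactly 12 nodes, which is why this tree is a natural carrier of the problem's building blocks.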

Fig. 4. Ancestor set for 3-parity consisting of modified solutions for 2-parity:
(AND (OR v0 v1) (NAND v1 v0))
(AND (OR v2 v1) (NAND v1 v2))
(AND (OR v0 v2) (NAND v2 v0))
(NOR (AND v0 v1) (NOR v0 v1))
(NOR (AND v2 v1) (NOR v2 v1))
(NOR (AND v0 v2) (NOR v0 v2))

Fig. 3. Confidence intervals for mean time to solution for the single-parent n=14 PORS experiment using four different ancestor sets. Ancestor set #1 contains 4 nonoptimal PORS parse trees with 11 nodes; ancestor set #2 contains 3 optimal PORS parse trees with 11 nodes; ancestor set #3 contains 3 optimal PORS parse trees with 11 nodes and 6 optimal PORS parse trees with 12 nodes; ancestor set #4 contains 1 optimal PORS parse tree with 12 nodes.

TABLE II. Comparison of time to solution for odd parity for n = 3, 4, 5 with and without using the single parent technique. Note that the chance of finding a solution before timing out is much higher using the single parent technique; the single parent timeout rates were 0.5% (n = 3), 17.5% (n = 4), and 31.0% (n = 5).

Figure 1 shows 95% confidence intervals on the time to solution for the baseline and single parent runs with the above ancestor. Table I compares mean times to solution and shows the number of timeouts. The single parent runs are shown by themselves in Figure 2 so that the compression of vertical scale caused by the baseline runs does not prevent their comparison. The baseline results for n > 16 are not pictured, due to problems of scale and the substantial number of timeouts for those cases; none of the single parent experiments had timeouts. Figure 2 shows that the single parent technique using this ancestor changes which problem cases are harder: n=16 becomes harder than n=18. Clearly, the choice of ancestors is very important. Figure 3 shows the results for four different ancestor sets for n=14. It is possible to choose ancestors which degrade the performance of the algorithm.
For example, for n=14 an ancestor set with four nonoptimal parse trees with 11 nodes has an average time to solution of 70,015 mating events, as compared to 64,967 mating events for the baseline and 666 mating events for the single parent algorithm using the solution to n=12 as the ancestor set.

B. Odd parity

Because PORS trees must use all their nodes to achieve maximum fitness, they have no problem with bloat. The odd parity problem was chosen to test the size control feature of the single parent technique. The odd parity problem is a standard logic function induction problem. It maps a collection of n boolean variables {x0, x1, ..., x(n−1)} onto the truth value of the proposition that an odd number of the variables are true. It is a standard test problem for genetic programming [13]. Experiments were performed for standard and single parent genetic programming on the odd parity problem. The operations used were and, or, nand, and nor, and the terminals were the n boolean input variables. The experiments used double tournament selection with tournament size 7. Mutation was performed, for each new tree produced, by replacing a subtree picked uniformly at random with a new random subtree of the same size. The timeout limit was 1,000,000 mating events. Populations were run for the single parent experiments and for the baseline experiments for n=3 and n=4; for the n=5 baseline, 200 populations were run. The ancestor set for the single parent version for n was created by making n copies of several short solutions generated for n−1 and modifying each copy so that it used a different subset of n−1 of the n variables. The ancestor set for 3-parity appears in Figure 4. In the first experiment, the times to solution were compared for the baseline and single parent versions for n = 3, 4, and 5.
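A parity candidate can be checked against the full truth table with a small evaluator (our own sketch; the tuple encoding and function names are assumptions). The first ancestor in Figure 4, (AND (OR v0 v1) (NAND v1 v0)), is exactly 2-parity (XOR) on v0 and v1:

```python
from itertools import product

OPS = {
    "AND":  lambda x, y: x and y,
    "OR":   lambda x, y: x or y,
    "NAND": lambda x, y: not (x and y),
    "NOR":  lambda x, y: not (x or y),
}

def eval_tree(t, env):
    """Evaluate a boolean parse tree; variables are looked up in env."""
    if isinstance(t, str):
        return env[t]
    return OPS[t[0]](eval_tree(t[1], env), eval_tree(t[2], env))

def computes_odd_parity(t, variables):
    """True iff the tree matches odd parity on every input combination."""
    return all(
        bool(eval_tree(t, dict(zip(variables, bits)))) == (sum(bits) % 2 == 1)
        for bits in product((False, True), repeat=len(variables))
    )
```

This is also the fitness computation in miniature: count (or demand) agreement with the parity function over all 2^n input rows.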
The chop operator for the baseline was set so that the solutions would be of similar length to those produced by the single parent experiments (50 for n = 3; 400 for n = 4; 2000 for n = 5). For the single parent version, when crossover resulted in trees larger than 1000 nodes for n=3 and n=4, or 2000 nodes for n=5, the parents were returned to the population unchanged.

Fig. 5. 95% confidence intervals for time to solution for single parent and baseline versions of the n-parity problem.

TABLE III. Comparison of the average size of solutions found by baseline and single parent versions for n-parity. Chop set to 50 for n=3, 400 for n=4, 2000 for n=5.

n   baseline avg. size (95% CI)   single parent avg. size (95% CI)
3   41 (40.4, 41.6)               42 (39.7, 44.3)
4   320 (312, 328)                238 (221, 255)
5   1696 (1503, 1889)             854 (798, 910)

The times to solution were better for the single parent version, as shown in Table II. However, the difference was not as great as for the PORS problem, probably because the ancestor set did not contain as much information. The chance of finding a solution before timing out was greatly improved, especially for n = 5. For n = 5 the baseline ran very slowly and timed out most of the time, yielding just 8 solutions out of 200 runs. Table III shows the sizes of the solutions found. For the baseline version, changing the chop operator to allow larger parse trees causes the solutions to be larger, and changing it to keep the parse trees smaller causes the solutions to be smaller. For example, for 3-parity, if the chop operator is set to 400 instead of 50, then the best solutions average size 175 (instead of 41); if it is set to 31, then the best solutions average size 28.

Fig. 6. 95% confidence intervals of parse tree sizes for single parent and baseline versions of 3-parity and 4-parity.

Fig. 7. An optimal PTH tree of depth 4; it evaluates to 16. Plus is represented by +, times by *, and one-half by h.

TABLE IV. Comparison of the time to solution found by baseline and single parent versions of PTH for trees of depth n, using the solution as the ancestor.
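The value of the optimal depth-4 PTH tree in Fig. 7 can be checked with a one-line dynamic program (our own sketch). Since both + and * are monotone on nonnegative values, the best full tree of depth k combines two best trees of depth k−1 with whichever operation gives more:

```python
def pth_optimum(depth):
    """Maximum value attainable by a full binary tree of the given depth over
    operations {+, *} with all leaves equal to one-half (the PTH problem)."""
    v = 0.5
    for _ in range(depth):
        v = max(v + v, v * v)  # best operation at this level
    return v
```

Addition wins while the subtree value is below 2 and multiplication wins above it, which is why optimal PTH trees have pluses near the bottom and times on top, as discussed below.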
There is no similar way to tweak the size for the single parent version. In the baseline version, trees quickly grow, and the space of parse trees close to the chop limit is well explored. In the single parent version, the trees grow more slowly, so there is a greater chance of finding solutions of a variety of sizes. For example, for 5-parity, the single parent version found solutions ranging from 201 nodes to 1995 nodes, while the baseline (chopped at 2000) found solutions ranging from 1101 nodes to 1971 nodes.

C. Plus-times-half (PTH)

A problem on which the single parent technique did not seem to help much was the plus-times-half (PTH) problem. This problem uses trees of limited depth and, like PORS, requires that all possible nodes be used to achieve an optimal solution. The object is to find a parse tree of depth n which evaluates to the maximum number possible using the binary arithmetic operations plus and times and the constant one-half as a terminal. Our experiments used double tournament selection with tournament size 7. Mutation was performed, for each new tree produced, by replacing a subtree picked uniformly at random with a new random subtree of the same size. The timeout limit was 1,000,000 mating events, and populations were run for each problem case. Trees were not allowed to grow beyond depth n. If crossover resulted in a parse tree of depth greater than n, then any subtree at depth n was replaced with the terminal one-half. We tried various ancestor sets, none of which improved performance significantly. The ancestor set with which one would most expect to improve performance consists of the solution itself. Remember that the ancestor is not part of the population, so this does not lead to an instant result. For PORS, however, the optimum was always found quickly when given this type of ancestor set. Not so for PTH. As can be seen in Table IV, even feeding the PTH problem its own solution as an ancestor did not significantly improve performance.
The best ancestor sets included parse trees which had depth greater than n. For example, for n=4, using an optimal parse tree of depth 7 as an ancestor resulted in a mean time to solution of 551 (compared to a baseline time of 814); using the same ancestor for n=6 resulted in a mean time to solution of 5541 (compared to a baseline time of 7334). A reason that the single parent technique doesn't work well for this problem could be that the solution depends on the structure of the entire tree (pluses near the bottom, times on top) instead of having various subtrees contributing parts of the solution. (See Figure 7 for an example solution.)

D. Modeling nearly identical paralogs

Paralogs are pairs or sets of genes thought to be copies of a single ancestral gene. Duplication of genes and subsequent variation of the copies is thought to be an important source of new genetic function. In [9], a collection of paralogs differing in a very small number of base positions was documented and biologically validated. Biological validation was required to ensure that the apparent paralogs were real rather than the result of sequencing errors. (These nearly identical paralogs, or NIPs, were discovered because of a coincident pattern of apparent sequencing errors in a large-scale genomic assembly.) The existence of NIPs implies the existence of paralogs with no sequence divergence: totally identical paralogs, or TIPs. In an effort to estimate the number of TIPs, the available NIPs data were modeled in [9] using a simple parameter estimation evolutionary algorithm. The NIPs data are given as the count of NIPs which vary in a given number of positions. The model is thus the number of NIPs as a function of the number of positions in which they vary, and the model's value at zero polymorphisms is its estimate of the number of TIPs. The model used for the NIPs data was of the form N(p) = e^(rp) f(p), where N(p) is the number of NIPs exhibiting p polymorphisms and f(p) is a polynomial.
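For intuition about the simpler two-parameter case N(p) = A e^(rp), the following sketch recovers A and r from synthetic counts by least squares in log space. This is our own illustrative baseline, not the paper's method (the paper estimates parameters with an evolutionary algorithm):

```python
import math

def fit_exponential(ps, counts):
    """Fit N(p) = A * exp(r * p) by linear regression on log N(p).
    Synthetic illustration only; assumes all counts are positive."""
    ys = [math.log(c) for c in counts]
    m = len(ps)
    pbar = sum(ps) / m
    ybar = sum(ys) / m
    r = (sum((p - pbar) * (y - ybar) for p, y in zip(ps, ys))
         / sum((p - pbar) ** 2 for p in ps))
    A = math.exp(ybar - r * pbar)
    return A, r
```

On data generated exactly from the model, the fit recovers the parameters; on real NIPs counts, the evolved parameters would differ because RMS error on the counts, not on their logarithms, is minimized.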
The results of the evolutionary algorithm estimating parameters for this model were used in the ancestor set for the GP evolutionary algorithm. The parse trees used the unary operations Neg (negate), Sin (sine), Cos (cosine), Atn (arctangent), Sqr (square), and Nex (e^x), and the binary operations Add (addition), Sub (subtraction), Mul (multiplication), and Div (division). There was one variable, and ephemeral constants ranged from -100 to 100. Experiments used double tournament selection with tournament size 7. Mutation was performed, for each new tree produced, by replacing a subtree picked uniformly at random with a new random subtree of the same size. 100 populations were run for each problem case. The fitness function was RMS error (to be minimized). The timeout limit was 1,000,000. The ancestor set for the single parent version, shown in Figure 8, consisted of 10 ancestors generated from the best result of an evolutionary algorithm fitting the model N(p) = A e^(rp) using a gene with two parameters, A and r, and the best result of an evolutionary algorithm fitting the model N(p) = (Ap + B) e^(rp) using a gene with three parameters, A, B, and r. The results were rounded to the nearest tenth, and five versions of each solution were used, with r varying by one-tenth centered on the solution.

Fig. 8. Ancestor set for the NIPs experiment:
(Mul 85.3 (Nex (Mul 0.1 X1)))
(Mul 85.3 (Nex (Mul 0.2 X1)))
(Mul 85.3 (Nex (Mul 0.3 X1)))
(Mul 85.3 (Nex (Mul 0.4 X1)))
(Mul 85.3 (Nex (Mul 0.5 X1)))
(Mul (Add 85.3 (Mul 55 X1)) (Nex (Mul 0.2 X1)))
(Mul (Add 85.3 (Mul 55 X1)) (Nex (Mul 0.3 X1)))
(Mul (Add 85.3 (Mul 55 X1)) (Nex (Mul 0.4 X1)))
(Mul (Add 85.3 (Mul 55 X1)) (Nex (Mul 0.5 X1)))
(Mul (Add 85.3 (Mul 55 X1)) (Nex (Mul 0.6 X1)))

Fig. 9. Comparison of fitnesses for single parent and baseline experiments for the NIPs problem with the number of nodes restricted to 40.
The algorithm was run for 2000 generations. Parse tree size was limited to 20 nodes in one experiment and to 40 nodes in another. In the baseline version, parse trees were chopped when they grew too large; in the single parent version, when crossover resulted in a parse tree which was too large, the parent tree was returned to the population unchanged. The single parent technique proved to be effective both in improving fitness and in limiting size. The mean RMS error was lower for the single parent version than for the baseline version; confidence intervals are shown in Figure 9. The size of the trees averaged 38.3 nodes for the baseline, but only 31.2 for the single parent version. Confidence intervals are shown in Figure 10.

Fig. 10. Comparison of tree sizes for single parent and baseline experiments for the NIPs problem with the number of nodes restricted to 40.

Fig. 11. Examples of baseline results for NIPs data fit using a chop operator with a node limit of 20:
(Sub (Div (Add (Mul (Div X1 (Sub X1 (Div ))) X1)) ) (Div X1))
(Sub (Div X1) (Div (Neg (Cos (Div X ))) (Sqr (Sqr (Div (Sub (Div X1 X1) X1) X1)))))
(Mul (Div (Cos (Nex (Mul (Mul (Mul X1 X1) (Mul (Sin (Sqr X1)) (Nex X1))) (Mul X1 X1)))) X1))

Fig. 12. Examples of single parent results for NIPs data fit with parse trees restricted to no more than 20 nodes:
(Mul (Add (Div X1)) (Nex (Mul (Mul 0.4 X1))))
(Sub (Mul (Nex (Mul (Mul X1)))) (Div X1))
(Mul (Nex (Mul (Mul X1) (Mul ))))

TABLE V. Operation use in ancestors and solutions to the NIPs problem using standard and single parent genetic programming.

Op    Baseline   Single Parent   Ancestors
Add   16%        16%             12.5%
Sub   18%        7%              0%
Mul   12%        40%             62.5%
Div   23%        8%              0%
Sqr   6%         2%              0%
Nex   3%         14%             25%
Sin   4%         5%              0%
Cos   6%         4%              0%
Atn   6%         2%              0%
Neg   6%         2%              0%

The character of the solutions found was different for the baseline and single parent experiments. Table V shows the proportion of each operation found in the solutions, as well as the proportions in the ancestors. The single parent solutions favor the operations found in the ancestors (especially Mul and Nex); the baseline solutions use more Div and Sub operations. Examination of the results from the single parent runs shows that the building blocks in the ancestors are preserved. Every solution found by the single parent algorithm used a Nex operation, in contrast to the baseline, in which only 26% of the solutions used a Nex operation. In the single parent runs, solutions of the form (Mul (Nex (Mul X1))) appeared more than once; these have exactly the same form as the ancestors, just with different constants. Solutions similar to each other as well as to the ancestors were found by different single parent runs, for example, solutions of the form (Add (Mul (Nex (Mul X1))) (Div X1)). Such duplicates did not seem to be present in the baseline results.
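Operation proportions like those in Table V can be tallied with a small traversal (our own sketch, using the same tuple encoding assumed in the earlier examples):

```python
from collections import Counter

def operation_counts(t, counts=None):
    """Count internal-node operation labels in a tuple-encoded parse tree;
    leaves (constants and variables) are ignored."""
    if counts is None:
        counts = Counter()
    if isinstance(t, tuple):
        counts[t[0]] += 1
        for c in t[1:]:
            operation_counts(c, counts)
    return counts
```

Summing such counters over all solutions in a run and normalizing gives the percentage columns of Table V.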
The solution with the best fitness (1.079) found by the single parent algorithm limited to 20 nodes (written in more standard notation, with constants rounded and simplified) was ( X cos ( 75 sin X ) ) e 0.3 X. For the single parent algorithm limited to 40 nodes, the solution with the best fitness (0.807) was 3.33 (26 + cos( 100 + sin(86 e X2 sin X)) 2 2X) e 0.3 X. In contrast, the solution with the best fitness (1.700) for the baseline version with the chop operator set to 20 was (arctan(11 X) 1) X X2 X, and the solution with the best fitness (0.958) for the baseline with the chop operator set to 40 was sin( 80 + arctan( 0.3X)) 0.4 X ( 13 cos X 49) sin2(X 2 X) 80 X. Not all the single parent solutions are so similar to the ancestors; for example, one solution looks somewhat different: (81 + X X 10 + X) e 0.3(cos(arctan(X) X) X). The single parent algorithm is doing a more directed search than the baseline: instead of exploring the entire space of possible solutions encoded by its parse trees, it looks only for solutions of the general form f(X) e^g(X). It is important to note that some of the models found by this algorithm with good fitnesses may not actually be good models of the data; they may be overfitted to the particular data set they evolved to model. To find the best models, an additional step needs to be taken: cross-validation with other NIPs data.

E. Discussion

The single parent technique seems to be useful both as a way of limiting bloat at low computational cost and as a way of incorporating expert information into the evolutionary computation. It is also useful as a way of preserving building blocks and of directing the algorithm's search. The information in the ancestors is never lost to the algorithm. This can speed time to solution and reduce the number of timeouts. It can also affect the degree of exploration vs. exploitation: the single parent technique sways the balance toward exploration near the ancestors. As the results for the PTH problem show, the single parent technique is not appropriate for every problem. The choice of ancestor set is crucial. Some ancestor sets degrade performance; others enhance it. Focusing the search near the ancestors is a two-edged sword that can buy rapid results at the cost of breadth of search. The greatest advantage of the technique seems to be in controlling the growth of parse trees. In the baseline experiments, the parse trees tended to quickly grow as large as possible, meaning that most of the search was of large parse trees. The single parent technique caused the parse trees to grow more slowly, allowing a better search over diverse tree sizes. The choice of baseline for comparison (standard genetic programming with chopping used for size control) was somewhat arbitrary. Future experiments could compare single parent genetic programming with other size control measures, such as incorporating size into the fitness evaluation [8], [18] or using only the smaller of the two children produced by crossover [10]. Single parent genetic programming has the advantages of computational simplicity and of incorporating expert knowledge into the system. It would also be interesting to compare the single parent technique to other techniques which improve performance in genetic programming, such as ADFs [14] and population seeding.
Experiments with ADFs would be particularly interesting because the ADFs for one problem case might make good ancestors for another problem case. It would also be interesting to test the single parent technique with other kinds of genetic programming, such as ISAc lists [2], finite state automata, or graph-based evolutionary algorithms [4]. It would be good to be able to categorize which sorts of problems work well with the single parent technique and which don't.

Single parent genetic programming could also be used with competitive problems (such as Prisoner's Dilemma [5]) instead of optimization problems. Would a population of single parent creatures beat a population of standard creatures, or vice versa? Would the solutions be more or less robust?

Another thing which would be interesting to try would be allowing the ancestor set to change. One idea is to promote highly fit creatures to the ancestor set as evolution progresses. Another is to let the ancestor set itself evolve; the fitness of an ancestor would be the average change in fitness of creatures in the main population when crossed over with it. However, this might reduce the size control effect of the technique unless the criterion for promotion included small size.

REFERENCES

[1] Peter J. Angeline and Jordan B. Pollack. Coevolving high-level representations. In Christopher Langton, editor, Artificial Life III, volume 17 of Santa Fe Institute Studies in the Sciences of Complexity, pages 55–71. Addison-Wesley, Reading.
[2] D. Ashlock and M. Joenks. ISAc lists: A different program induction method. In Genetic Programming 1998, pages 18–26.
[3] Daniel Ashlock and James I. Lathrop. A fully characterized test suite for genetic programming. In Proceedings of the Seventh Annual Conference on Evolutionary Programming.
[4] Daniel A. Ashlock. Evolutionary Computation for Modeling and Optimization. Springer-Verlag, Secaucus, NJ.
[5] Robert Axelrod. The Evolution of Cooperation.
Basic Books, New York.
[6] Wolfgang Banzhaf, Peter Nordin, Robert E. Keller, and Frank D. Francone. Genetic Programming: An Introduction. Morgan Kaufmann, San Francisco.
[7] Lois McMaster Bujold. Ethan of Athos. Baen Books, Riverdale, NY.
[8] Edwin D. DeJong and Jordan B. Pollack. Multi-objective methods for tree size control. Genetic Programming and Evolvable Machines, 4.
[9] Yan Fu, Tsui-Jung Wen, Ling Guo, Debbie Chen, Karthik Viswanathan, Mu Zhang, Yefim Ronin, David Mester, Abraham Korol, Daniel A. Ashlock, and Patrick S. Schnable. Genetic structure analysis of maize recombinant inbred lines. In preparation.
[10] Chris Gathercole and Peter Ross. An adverse interaction between crossover and restricted tree depth in genetic programming. In Genetic Programming 1996.
[11] Kenneth Kinnear. Advances in Genetic Programming. The MIT Press, Cambridge, MA.
[12] Kenneth Kinnear and Peter Angeline. Advances in Genetic Programming, Volume 2. The MIT Press, Cambridge, MA.
[13] John R. Koza. Genetic Programming. The MIT Press, Cambridge, MA.
[14] John R. Koza. Genetic Programming II. The MIT Press, Cambridge, MA.
[15] W.B. Langdon, T. Soule, R. Poli, and J.A. Foster. The evolution of size and shape. In Lee Spector, William B. Langdon, Una-May O'Reilly, and Peter J. Angeline, editors, Advances in Genetic Programming, volume 3. The MIT Press, Cambridge, MA.
[16] W.B. Langdon. Quadratic bloat in genetic programming. In Proceedings of the Genetic and Evolutionary Computation Conference 2000.
[17] Peter Nordin, Frank Francone, and Wolfgang Banzhaf. Explicitly defined introns and destructive crossover in genetic programming. In Peter Angeline and Kenneth E. Kinnear Jr., editors, Advances in Genetic Programming, volume 2. The MIT Press, Cambridge, MA.
[18] Terence Soule, James A. Foster, and John Dickinson. Code growth in genetic programming. In Genetic Programming 1996, 1996.


More information

Selection Of Genetically Diverse Recombinant Inbreds With An Ordered Gene Evolutionary Algorithm.

Selection Of Genetically Diverse Recombinant Inbreds With An Ordered Gene Evolutionary Algorithm. Selection Of Genetically Diverse Recombinant Inbreds With An Ordered Gene Evolutionary Algorithm. Dan Ashlock Mathematics and Statistics University of Guelph Guelph, Ontario Canda N1G 2W1 dashlock@uoguelph.ca

More information

Optimal Design of Laminated Composite Plates by Using Advanced Genetic Algorithm

Optimal Design of Laminated Composite Plates by Using Advanced Genetic Algorithm International Refereed Journal of Engineering and Science (IRJES) ISSN (Online) 2319-183X, (Print) 2319-1821 Volume 3, Issue 5(May 2014), PP.77-86 Optimal Design of Laminated Composite Plates by Using

More information

Genetic approach to solve non-fractional knapsack problem S. M Farooq 1, G. Madhavi 2 and S. Kiran 3

Genetic approach to solve non-fractional knapsack problem S. M Farooq 1, G. Madhavi 2 and S. Kiran 3 Genetic approach to solve non-fractional knapsack problem S. M Farooq 1, G. Madhavi 2 and S. Kiran 3 1,2,3 Y. S. R Engineering College, Yogi Vemana University Korrapad Road, Proddatur 516360, India 1 shaikfaroq@gmail.com,

More information

A Genetic Algorithm for Order Picking in Automated Storage and Retrieval Systems with Multiple Stock Locations

A Genetic Algorithm for Order Picking in Automated Storage and Retrieval Systems with Multiple Stock Locations IEMS Vol. 4, No. 2, pp. 36-44, December 25. A Genetic Algorithm for Order Picing in Automated Storage and Retrieval Systems with Multiple Stoc Locations Yaghoub Khojasteh Ghamari Graduate School of Systems

More information

ESQUIVEL S.C., LEIVA H. A., GALLARD, R.H.

ESQUIVEL S.C., LEIVA H. A., GALLARD, R.H. SELF-ADAPTATION OF PARAMETERS FOR MCPC IN GENETIC ALGORITHMS ESQUIVEL S.C., LEIVA H. A., GALLARD, R.H. Proyecto UNSL-338403 1 Departamento de Informática Universidad Nacional de San Luis (UNSL) Ejército

More information

APPLICATION OF COMPUTER FOR ANALYZING WORLD CO2 EMISSION

APPLICATION OF COMPUTER FOR ANALYZING WORLD CO2 EMISSION APPLICATION OF COMPUTER FOR ANALYZING WORLD CO2 EMISSION M. Kavoosi 1, B.shafiee 2 1 Department of Computer Engineering, Izeh Branch, Islamic Azad University, Izeh, Iran 1 E-mail address: Hakavoosi@yahoo.com

More information

Genetic Algorithms. Part 3: The Component of Genetic Algorithms. Spring 2009 Instructor: Dr. Masoud Yaghini

Genetic Algorithms. Part 3: The Component of Genetic Algorithms. Spring 2009 Instructor: Dr. Masoud Yaghini Genetic Algorithms Part 3: The Component of Genetic Algorithms Spring 2009 Instructor: Dr. Masoud Yaghini Outline Genetic Algorithms: Part 3 Representation of Individuals Mutation Recombination Population

More information

Tumor Detection Using Genetic Algorithm

Tumor Detection Using Genetic Algorithm Tumor Detection Using Genetic Algorithm 1 Amanpreet Kaur, 2 Gagan Jindal 1,2 Dept. of CSE, Chandigarh Engineering College, Landran, Mohali, Punjab, India Abstract In the medical field, Image Segmentation

More information

GENETIC DRIFT INTRODUCTION. Objectives

GENETIC DRIFT INTRODUCTION. Objectives 2 GENETIC DRIFT Objectives Set up a spreadsheet model of genetic drift. Determine the likelihood of allele fixation in a population of 0 individuals. Evaluate how initial allele frequencies in a population

More information

Forecasting Euro United States Dollar Exchange Rate with Gene Expression Programming

Forecasting Euro United States Dollar Exchange Rate with Gene Expression Programming Forecasting Euro United States Dollar Exchange Rate with Gene Expression Programming Maria Α. Antoniou 1, Efstratios F. Georgopoulos 1,2, Konstantinos A. Theofilatos 1, and Spiridon D. Likothanassis 1

More information

Evolutionary Algorithms

Evolutionary Algorithms Evolutionary Algorithms Evolutionary Algorithms What is Evolutionary Algorithms (EAs)? Evolutionary algorithms are iterative and stochastic search methods that mimic the natural biological evolution and/or

More information

Dominant and Recessive Genes in Evolutionary Systems Applied to Spatial Reasoning

Dominant and Recessive Genes in Evolutionary Systems Applied to Spatial Reasoning Dominant and Recessive Genes in Evolutionary Systems Applied to Spatial Reasoning Thorsten Schnier and John Gero Key Centre of Design Computing Department of Architectural and Design Science University

More information

Machine Learning: Algorithms and Applications

Machine Learning: Algorithms and Applications Machine Learning: Algorithms and Applications Floriano Zini Free University of Bozen-Bolzano Faculty of Computer Science Academic Year 2011-2012 Lecture 4: 19 th March 2012 Evolutionary computing These

More information

Artificial Life Lecture 14 EASy. Genetic Programming. EASy. GP EASy. GP solution to to problem 1. EASy. Picturing a Lisp program EASy

Artificial Life Lecture 14 EASy. Genetic Programming. EASy. GP EASy. GP solution to to problem 1. EASy. Picturing a Lisp program EASy Artificial Life Lecture 14 14 Genetic Programming This will look at 3 aspects of Evolutionary Algorithms: 1) Genetic Programming GP 2) Classifier Systems 3) Species Adaptation Genetic Algorithms -- SAGA

More information

Algebra II Common Core

Algebra II Common Core Core Algebra II Common Core Algebra II introduces students to advanced functions, with a focus on developing a strong conceptual grasp of the expressions that define them. Students learn through discovery

More information

Using Decision Tree to predict repeat customers

Using Decision Tree to predict repeat customers Using Decision Tree to predict repeat customers Jia En Nicholette Li Jing Rong Lim Abstract We focus on using feature engineering and decision trees to perform classification and feature selection on the

More information