Analysis of NEAT and application in swarm intelligence


Bachelor Informatica, Universiteit van Amsterdam

Analysis of NEAT and application in swarm intelligence

Frank van Beem

June 9, 2017

Supervisor(s): Rein van den Boomgaard


Abstract

In this paper we examine whether NeuroEvolution of Augmenting Topologies (NEAT) can be applied to evolving swarm intelligence. The NEAT algorithm is a genetic algorithm that evolves both the topology and the connection weights of a neural network. Swarm intelligence is the collective intelligent behaviour of decentralized agents. First, NEAT is tested on a number of increasingly complex problems to measure how well it performs. Then a neural network is evolved to serve as a controller for simulated agents in a swarm. This is done both with and without the possibility of communication between the agents, to test whether communication takes place.


Contents

1 Introduction

2 Background
  2.1 Genetic algorithms
    2.1.1 Application
  2.2 Neural networks
  2.3 NEAT algorithm
    2.3.1 Representation
    2.3.2 Speciation
    2.3.3 Growth
  2.4 Swarm intelligence

3 Method
  3.1 NEAT-python implementation
  3.2 Parameters
  3.3 Swarm

4 Experiments
  4.1 NEAT benchmarks
    4.1.1 Fitness function
    4.1.2 XOR gate
    4.1.3 XOR gate with noise
    4.1.4 Iris dataset
    4.1.5 Handwritten digit recognition
    4.1.6 Scaling complexity
  4.2 Evolving swarm intelligence
    4.2.1 Fitness function
    4.2.2 Results

5 Conclusions

A Parameters for NEAT algorithm


CHAPTER 1

Introduction

Genetic algorithms have been around for quite a while. They are inspired by natural evolution. The method was originally developed by John Holland of the University of Michigan as a way to study the process of evolution [2]. His method has slowly been adapted by other researchers into a general machine learning technique [6], rooted in reinforcement learning. Evolution is, after all, a way to search among a collection of candidate solutions for a fitting solution. In machine learning the collection of possible solutions is known as the search space. Because genetic algorithms do not make problem-specific assumptions about the search space, they can be deployed on a wide range of problems.

In 2002 Kenneth O. Stanley and Risto Miikkulainen, at the University of Texas at Austin, wrote a paper on evolving neural networks through augmenting topologies [11]. This paper describes a specific type of genetic algorithm for neuroevolution. The NEAT, or NeuroEvolution of Augmenting Topologies, algorithm can simultaneously evolve the topology of an artificial neural network as well as the weights of its connections. The algorithm has been used in tasks such as classification and game AI. It has also spawned multiple variations, such as HyperNEAT, specifically for large-scale networks with fixed topologies [10], and rtNEAT for real-time learning [9].

This paper examines how well the NEAT algorithm can be applied to learning swarm intelligence. Swarm intelligence is the collective intelligent behaviour of decentralized units. Often relatively simple rules for the units can result in complex emerging patterns. Swarm intelligence is widely studied in insects. For swarm intelligence to occur, the units need to have a way to communicate with each other to reach a common goal. Evolving swarm intelligence would, for instance, be useful in controllers for robots working in swarms.
To start off, we test NEAT on simple tasks to show how well it performs, and to test how well it can handle noisy data. Then we test the algorithm on increasingly complex tasks to probe its limits. Lastly, we test the algorithm on a simulation of a swarm, both with and without the ability to communicate, to determine how well it performs and whether communication takes place. If so, this indicates that the agents are working together.


CHAPTER 2

Background

2.1 Genetic algorithms

A genetic algorithm is a machine learning technique inspired by natural evolution. It is primarily used to optimize function parameters, but it can be applied to a much wider range of search and optimization problems [12]. It is a form of reinforcement learning. A genetic algorithm works by keeping a population of candidate solutions, also called individuals or phenotypes. Each solution is represented by certain properties, called genes. In traditional genetic algorithms these genes are often encoded by a fixed-length string of bits. More complex genes represented by data structures can also be used. Finding a good encoding for the properties of the problem is a key part of using a genetic algorithm. The algorithm iterates through a number of steps to progressively improve the solutions in the population.

Figure 2.1: Steps of a genetic algorithm.

The first step is the initialization of the population. This can be done either with solutions with randomly generated genes, or by using encoded preconstructed partial solutions. Using partial solutions can speed up the process, but this is only possible and desirable for some problems.

The next step after initialization is evaluating the fitness of each solution in the population; that is, how well the solution performs or how well it matches a target solution. The search space of the genetic algorithm consists of all possible combinations of genes. The fitness function works as a heuristic for this search space. As such, it is important to use a fitness function that selects the best performing solutions correctly, and has enough granularity to detect minor differences in fitness.

Once the fitness of each solution in the population is known, a new population is produced, replacing the old population. This reproduction is inspired by natural reproduction: the higher the fitness of a solution, the higher its chance of reproduction.
Reproduction of a high-fitness solution means its well performing genes slowly spread through the population, raising the overall fitness of the population. Note that although solutions with a lower fitness have a lower chance to reproduce, they can often still pass on their genes to a small portion of the new generation. This happens so that less fit genes are not dropped from the population too quickly, which would reduce the diversity of the population too much. Optionally, one or more of the best performing solutions are carried over unchanged from the original population, to guarantee the (genes of the) fittest solutions are preserved. Genetic algorithms also use crossover in this step, inspired by sexual reproduction, in which offspring have a mixture of genes from multiple parents. Crossover does not introduce new genes to the population, but rather finds new combinations of genes. Crossover is not required for a genetic algorithm to work, but generally speeds up the algorithm significantly.

The next step, mutation, mimics biological mutation: offspring can at random get some of their genes altered to different versions, possibly versions not yet found within the population. Too many mutations mean proven fit genes are not well preserved within the population, while too few mean progress is limited by a lack of new genes. By iterating through the selection, reproduction and mutation steps, the average fitness of the population keeps increasing, gradually working toward a better solution to the problem. The algorithm can be stopped when a satisfactory solution is found, when the improvements diminish with each iteration, or when no further improvement occurs at all.

2.1.1 Application

Genetic algorithms are a very general technique that can be applied to a wide range of search and optimization problems. The only real requirements are finding a suitable encoding that allows for mutation and crossover, and being able to define a fitness evaluation with enough granularity. Genetic algorithms do not always improve individual solutions at each step, but occasionally produce worse performing solutions.
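The iteration described above (evaluate, select, cross over, mutate, with elitism) can be sketched on a toy task. This is an illustrative sketch on a hypothetical "count the 1-bits" problem; all names and parameter values are assumptions, not part of this paper's experiments.

```python
import random

# Toy problem: maximize the number of 1-bits in a fixed-length bit string.
GENOME_LEN = 20
POP_SIZE = 30
ELITISM = 2

def fitness(genome):
    return sum(genome)  # number of 1-bits

def crossover(a, b):
    # uniform crossover: each gene comes from a randomly chosen parent
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(genome, rate=0.05):
    # each gene has a small chance of flipping to a different value
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(generations=100, seed=0):
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        new_pop = pop[:ELITISM]  # elites carried over unchanged
        while len(new_pop) < POP_SIZE:
            # fitness-proportionate selection: fitter solutions reproduce more
            a, b = random.choices(pop, weights=[fitness(g) + 1 for g in pop], k=2)
            new_pop.append(mutate(crossover(a, b)))
        pop = new_pop
    return max(pop, key=fitness)

best = evolve()
```

Even this minimal loop exhibits the behaviour discussed in this section: elitism preserves the best genomes, while selection pressure spreads well performing genes through the population.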
At any time they maintain a population of diverse solutions spread over the search space. These qualities make them less prone to converging to local optima than some other learning algorithms. This does not, however, mean they are immune. In general, a more diverse population is less likely to converge to a local optimum [12].

This generality does come at a cost: genetic algorithms tend to scale poorly with more complex problems, where a large number of genes is required to encode a working solution. This often results in a learning time that grows exponentially with the number of genes. However, this can differ between problems, depending primarily on the fitness function [8].

2.2 Neural networks

Artificial neural networks are a widespread computational model often used in machine learning. The technique is inspired by biological neural networks. A single artificial neuron has a number of inputs and produces a single output value. The input values are first multiplied by a weight between -1 and 1, then all these values are summed. An activation function is applied to the summed value, which remaps it to the range 0 to 1. Often the sigmoid function is used for this purpose: S(x) = 1 / (1 + e^-x). This value then serves as the output of the neuron. Usually a bias value is also added: a fixed-value input that allows for an offset in the summation [7].

Figure 2.2: A single neuron, where w are the input weights.

With such artificial neurons a network can be made, where the output of one neuron connects to one of the inputs of another. Such networks are usually arranged in layers of neurons, each getting all values from the previous layer as input. The first layer is known as the input layer and simply outputs the initial values. The last layer is the output layer, which outputs the final values. All layers in between are called hidden layers. Generally, backpropagation is used to train these neural networks to give the desired output [7].

Figure 2.3: A layered artificial neural network with one fully connected hidden layer. Connection weights are not shown.

Irregular or atypical neural networks are networks that cannot be arranged in layers. They are not as frequently used as their layered counterparts. These networks are useful in neuroevolution, since they can be changed in small increments. Recurrent neural networks are networks in which connections and neurons form a cyclic path. This can serve as a form of memory in the system, which is useful in situations in which previous inputs play a significant role.

Figure 2.4: An irregular artificial neural network with three hidden nodes. Connection weights are not shown.

2.3 NEAT algorithm

The NeuroEvolution of Augmenting Topologies (NEAT) algorithm is a genetic algorithm for evolving artificial neural networks (ANNs). It simultaneously evolves the weights of the synapses of the ANN as well as its topology. This allows it to dynamically find the right complexity for a problem.

2.3.1 Representation

The major challenge in evolving the topology of an ANN is finding a suitable genetic encoding, with the ability to meaningfully apply mutation and crossover.

NEAT makes a distinction between node genes and synapse genes. Node genes specify the existence of a node in the ANN, and may hold some auxiliary data, such as whether it is an input, output or hidden node. Synapse or connection genes specify the existence of a synapse between nodes in the ANN. One such gene holds the direction and weight of the connection, as well as the nodes it connects. It also records whether the connection is enabled. The connection specified by a disabled gene does not show up in the final ANN, but can be (re)activated later on in the evolution. There can only be one connection between a given pair of nodes in either direction; recurrent connections are allowed.

Figure 2.5: Example mapping from encoding to network. 1 and 2 are input nodes, and 5 is the only output node. The connection between nodes 4 and 5 is disabled and therefore not expressed in the network.

Mutation

Mutation in the NEAT algorithm can occur in a couple of different ways:
- A connection gene of the offspring can at random be replaced by a new variation of the gene with a new weight. These weights are randomly chosen from a Gaussian distribution.
- A completely new connection gene with a random weight can be added between previously unconnected nodes.
- A new node can be added, with connections to the two nodes of an existing direct connection; the original direct connection is then disabled. This method ensures new nodes are immediately integrated in the network. The input connection of the new node gets a weight of one and the output connection gets the same weight as the disabled direct connection. This is done so as not to immediately disrupt the working, and therefore the fitness, of the network.

Note that through mutation the size of the network can only grow.

Crossover

Crossover is one of the main difficulties in the evolution of topologies, since wildly different topologies within the population should be able to cross over. For crossover, all the nodes present in either of the parents are also present in the offspring. The connection genes from both parents are lined up according to the nodes they connect. For connection genes present in both parents, one of the genes is chosen at random for the offspring. Connection genes present in only one of the parents are always passed on to the offspring.
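The crossover rule above can be sketched with genomes stored as dictionaries of connection genes keyed by the pair of nodes they connect, which makes the alignment step trivial. The (weight, enabled) record and all concrete values here are illustrative assumptions, not NEAT-python's actual data layout.

```python
import random

def neat_crossover(parent_a, parent_b, rng=random.Random(0)):
    # Genes are keyed by (in_node, out_node); values are (weight, enabled).
    child = {}
    for key in set(parent_a) | set(parent_b):
        if key in parent_a and key in parent_b:
            # matching gene: inherit from either parent at random
            child[key] = rng.choice([parent_a[key], parent_b[key]])
        else:
            # gene present in only one parent: always passed on
            child[key] = parent_a.get(key) or parent_b[key]
    return child

a = {(1, 3): (0.5, True), (2, 3): (-0.7, True)}
b = {(1, 3): (0.9, True), (1, 4): (0.2, True), (4, 3): (1.1, False)}
child = neat_crossover(a, b)
```

Keying genes by the node pair they connect is what lets two structurally different genomes be lined up at all, which is the core of NEAT's answer to the crossover problem.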

Figure 2.6: Example of crossover on genomes and the resulting networks. Only connection genes are shown. Different gray shades indicate different versions of a gene. The connection between nodes 4 and 5 is disabled.

2.3.2 Speciation

Using mutation and crossover in this way can produce new topologies quite quickly. However, radically new topologies tend to have a lower fitness when they have just been created, since the weights are still optimized for a different topology. This lower fitness often means they are dropped from the population before they get a chance to be optimized. To solve this problem NEAT uses speciation and explicit fitness sharing. When a new topology is deemed different enough from all other topologies, it is labeled as a new species. The difference between two genomes is expressed as a single scalar according to the function:

δ = c1 · D/N + c2 · W ¹

Here D is the number of disjoint genes, that is, genes present in only one of the two genomes. N is the size of the genome (number of genes), used to normalize the number of disjoint genes. W is the average distance in weights over the non-disjoint connection genes (including disabled ones). c1 and c2 are coefficients used to weight the two parts against each other.

Individual genomes only compete with other individuals within the same species. Fitness values of all individuals are inversely scaled with the number of individuals within their species, resulting in an adjusted fitness value. A larger species results in lower adjusted fitness values for its members. This way a low-fitness topology gets to stay in the population and compete with individuals within the same species, without this species taking up a significant portion of the population. If the new species matches or overtakes the other species in absolute fitness, it will get more offspring and therefore more chance to develop. This method not only protects new topologies, but also protects diversity, which as previously discussed is essential for a genetic algorithm [5].

¹ Adapted from Stanley et al., since disjoint and excess genes are functionally identical.
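A minimal sketch of this compatibility distance, assuming genomes are represented as dictionaries mapping connected node pairs to weights (an illustrative layout, not the library's):

```python
def compatibility(genome_a, genome_b, c1=1.0, c2=0.5):
    # Genomes: dict mapping (in_node, out_node) -> weight, disabled genes included.
    keys_a, keys_b = set(genome_a), set(genome_b)
    matching = keys_a & keys_b
    disjoint = len(keys_a ^ keys_b)            # D: genes in only one genome
    n = max(len(genome_a), len(genome_b), 1)   # N: genome size, for normalization
    if matching:
        # W: average weight distance over matching connection genes
        w_bar = sum(abs(genome_a[k] - genome_b[k]) for k in matching) / len(matching)
    else:
        w_bar = 0.0
    return c1 * disjoint / n + c2 * w_bar      # delta = c1*D/N + c2*W
```

A genome whose distance to every existing species representative exceeds a chosen threshold would then found a new species.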

2.3.3 Growth

The NEAT algorithm is usually initialized with only the input and output nodes, and allowed to grow over time. This means a network can, through the evolution process, grow to the right complexity for the problem. At that point further growth will almost stop, since it no longer results in a higher fitness.

2.4 Swarm intelligence

Swarm intelligence is the collective behaviour of self-organizing entities in a decentralized system. In nature this is found in many insect colonies, where individuals only interact with each other and the local environment. Yet by following simple rules the colony as a whole can perform complex tasks, such as an ant colony finding and retrieving large quantities of food. Having individual agents without central control can be useful in certain applications, since it eliminates single points of failure, making the system more reliable. It is also useful in computation, where it is often advantageous to have multiple simpler machines rather than one powerful one [1]. For swarm intelligence to occur, the agents need to be able to communicate, locally or globally. This can be as simple as knowing each other's position, or as elaborate as a complex system of communication such as language. Ant and bee colonies communicate through the environment by leaving trails of pheromones that other members of the colony can pick up [3].
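Pheromone-style communication through the environment can be sketched as a grid of intensities that agents deposit into and that decays each simulation step. This is a hypothetical sketch; the grid representation and decay factor are assumptions, not part of the simulation described later.

```python
class PheromoneGrid:
    """Minimal stigmergy sketch: deposits accumulate per cell and fade over time."""

    def __init__(self, decay=0.95):
        self.decay = decay
        self.cells = {}  # (x, y) -> pheromone intensity

    def deposit(self, x, y, amount):
        self.cells[(x, y)] = self.cells.get((x, y), 0.0) + amount

    def read(self, x, y):
        return self.cells.get((x, y), 0.0)

    def step(self):
        # trails fade over time; negligible traces are dropped entirely
        self.cells = {k: v * self.decay for k, v in self.cells.items()
                      if v * self.decay > 1e-6}

grid = PheromoneGrid()
grid.deposit(3, 4, 1.0)
grid.step()
```

The key property is that agents never address each other directly: all communication flows through the shared, decaying environment state.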

CHAPTER 3

Method

3.1 NEAT-python implementation

All benchmarks are implemented with the NEAT-python library [4]. There are a couple of differences between this implementation and the method described by Stanley and Miikkulainen. This implementation allows for the removal of nodes and connections through mutation. In contrast to the original method, this enables the genome to decrease in complexity. Though this eliminates the need to start with a minimal-complexity genome, it is still preferable to do so, so that the algorithm does not start with unnecessarily large networks.

NEAT-python uses an additional form of mutation for connection weights. A value is picked at random from a Gaussian distribution with a mean of zero, and added to the connection weight. This allows the connection weights to evolve more gradually, as opposed to waiting for a good random replacement value. There is a relatively high chance of changing a weight by this perturbation, and a significantly lower chance of mutating through the replacement method.

When species within the population have stagnated, that is, not shown any improvement within a certain number of generations, the NEAT-python implementation labels these species as dead ends and removes all their individuals from the population. This assigns more room, and therefore computational power, to the individuals from species that have not yet stagnated. The number of generations before a species is considered stagnant needs to be carefully chosen, since a value chosen too low will remove the diversity the algorithm relies on.

All hidden and output nodes in the network in the NEAT-python implementation use a bias connection.

3.2 Parameters

A series of parameters influence the workings of the algorithm. The population size is the number of individuals in the population. A larger population size allows for more genetic diversity, but increases computation time per generation.
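The two weight-mutation modes described in section 3.1 (a frequent small Gaussian perturbation and a rare full replacement) can be sketched as follows. The probabilities and standard deviations here are illustrative assumptions, not NEAT-python's defaults.

```python
import random

def mutate_weight(weight, rng, perturb_prob=0.8, replace_prob=0.1,
                  sigma=0.5, init_sigma=1.0):
    r = rng.random()
    if r < perturb_prob:
        # common case: add a small Gaussian step to the existing weight
        return weight + rng.gauss(0.0, sigma)
    if r < perturb_prob + replace_prob:
        # rare case: replace the weight with a fresh random value
        return rng.gauss(0.0, init_sigma)
    return weight  # unchanged

rng = random.Random(42)
new_w = mutate_weight(0.3, rng)
```

The perturbation path is what gives weights a gradual, hill-climbing-like evolution, while the replacement path occasionally allows an escape from a poor region of weight space.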
The benchmarks in this paper use a population size of 50 to 100 individuals.

A series of parameters determines the likelihood of each mutation. These values control the speed at which new genes and combinations of genes are introduced. To keep the loss of innovation minimal, these values have been chosen so as not to mutate excessively. Because these values' primary effect is on learning speed and not on the achievable quality of the solution, they are kept constant across all benchmarks.

The maximum stagnation number sets the number of generations a species is allowed to exist without improvement. The benchmarks use a maximum stagnation between 20 and 50 generations.
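For concreteness, settings of this kind map onto a NEAT-python configuration file along the following lines. This is an abridged, hypothetical fragment (a real NEAT-python config requires many more keys, for instance for bias and activation options); values follow the XOR benchmark where the text states them, and are placeholders otherwise.

```ini
[NEAT]
pop_size              = 100
fitness_criterion     = max
fitness_threshold     = 0.95
reset_on_extinction   = False

[DefaultGenome]
num_inputs            = 2
num_hidden            = 0
num_outputs           = 1
initial_connection    = full
feed_forward          = True
compatibility_disjoint_coefficient = 1.0
compatibility_weight_coefficient   = 0.5

[DefaultSpeciesSet]
compatibility_threshold = 3.0

[DefaultStagnation]
species_fitness_func  = max
max_stagnation        = 20

[DefaultReproduction]
elitism               = 2
```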

Elitism determines how many of the best performing individuals are carried over to the next generation unchanged. The benchmarks use two.

Genome compatibility depends on the two constants in the distance function that weight topological difference and connection weight difference against each other, as well as on the threshold used for division into separate species. In the benchmarks, topological difference is weighted twice as much as connection weight difference. The threshold is chosen per task to have roughly 2-4 species at any time, though this is hard to control precisely.

Since the benchmarks used to test NEAT in this paper are all straightforward mappings from static data to outputs, they do not require memory. For this reason, the NEAT algorithm is configured not to make any recurrent connections: connections that would create a directed cycle of nodes and connections in the resulting network. In the benchmarks the networks are initialized as fully connected networks without hidden nodes. The full configuration for each benchmark can be found in appendix A.

3.3 Swarm

To test the algorithm on actual swarm intelligence, a simulation with discrete steps is used. The simulation consists of an area filled with a certain number of individual agents. The agents have to find and retrieve food in the environment. Food is represented as stationary dots, normally distributed around a randomly selected center. The randomness helps ensure that the solution does not overfit to one specific scenario. Each agent can only carry one unit of food at a time, which needs to be delivered to the nest. This nest is stationary, so it can in principle be found when an agent knows its own position. For communication the agents can leave behind pheromone trails, as inspired by ants. These trails fade over time. Each agent has certain input and output capabilities.
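As an aside, the food placement just described (stationary dots, normally distributed around a randomly selected center) can be sketched as follows; the arena size and spread are illustrative assumptions.

```python
import random

def place_food(n_food, arena=100.0, spread=5.0, rng=None):
    # Pick a random cluster center, then scatter food dots around it
    # with a Gaussian distribution, as described in the text.
    rng = rng or random.Random()
    cx, cy = rng.uniform(0, arena), rng.uniform(0, arena)
    return [(rng.gauss(cx, spread), rng.gauss(cy, spread)) for _ in range(n_food)]

food = place_food(30, rng=random.Random(1))
```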
The input values are: its own position; the intensity of the pheromones at its location; whether it is currently carrying food; and the distance to the nearest food from two different points on the agent. The two distances to the nearest food give the agent the information to locate the food through triangulation. The outputs of the agent are its forward movement speed, its steering direction, and the intensity of the pheromones it leaves behind. The inputs are connected to the outputs by the neural network created by the NEAT algorithm. The nodes in this network use the sigmoid activation function and have a bias connection. All agents within a swarm share the same neural network as controller.
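One simulation step for a single agent can be sketched as below. Here `net` stands in for the evolved controller (any callable mapping the input vector to the three outputs), and the `World` stub, field names, and steering convention are all illustrative assumptions rather than the paper's actual simulation code.

```python
import math

class World:
    """Minimal stub environment; the real simulation also tracks food and a nest."""
    def __init__(self):
        self.pheromones = {}
    def pheromone_at(self, x, y):
        return self.pheromones.get((round(x), round(y)), 0.0)
    def deposit_pheromone(self, x, y, amount):
        key = (round(x), round(y))
        self.pheromones[key] = self.pheromones.get(key, 0.0) + amount
    def food_distance(self, agent, sensor):
        return 0.0  # stub: distance from one of two sensor points to nearest food

def agent_step(agent, net, world):
    # Gather the inputs listed in the text and feed them through the controller.
    inputs = [
        agent["x"], agent["y"],                      # own position
        world.pheromone_at(agent["x"], agent["y"]),  # local pheromone intensity
        1.0 if agent["carrying"] else 0.0,           # carrying food?
        world.food_distance(agent, 0),               # two distance sensors allow
        world.food_distance(agent, 1),               #   triangulating the food
    ]
    speed, steer, pheromone = net(inputs)
    agent["heading"] += steer - 0.5                  # sigmoid outputs lie in (0, 1)
    agent["x"] += speed * math.cos(agent["heading"])
    agent["y"] += speed * math.sin(agent["heading"])
    world.deposit_pheromone(agent["x"], agent["y"], pheromone)
```

Because every agent runs the same `net`, any cooperative behaviour has to emerge from the shared controller plus the pheromone channel, not from per-agent specialization.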

CHAPTER 4

Experiments

4.1 NEAT benchmarks

4.1.1 Fitness function

The fitness in these classification benchmarks grows inversely with the error. The fitness starts at one, and every error is squared and subtracted from the fitness. A fitness value of one would mean there is no error. The algorithm terminates when an individual reaches a threshold fitness of 0.95, or when a given maximum generation is reached. This ensures the algorithm terminates within a reasonable time frame.

4.1.2 XOR gate

In this first benchmark NEAT is applied to solve the XOR gate: two binary inputs that produce one output. This is an interesting test because this problem cannot be solved without at least one hidden node that (indirectly) connects to both inputs.

Figure 4.1 shows the distribution of the number of generations before the algorithm terminated. The distribution peaks around 100 to 120 generations and shows a decline after that. At generation 1000 there is a peak of runs that never produced a solution with a fitness above the threshold; they were terminated at the limit of 1000 generations. Figure 4.2 shows the average fitness per generation. The fitness initially grows fast, after which growth slows down drastically, stagnating around a fitness of 0.62. There is a high variance between the fitness of individuals, reflected in the high standard deviation.

Figure 4.3 shows three example solutions found by the algorithm. The first is the least complex solution, with just one hidden node. This topology and variations on it are often-found solutions for this problem. The second genome is of the layered variety. It is also a relatively low complexity solution, but less common. The last genome is significantly more complex. The far right of this last network shows a node that is not an output node and that no other node connects to, which means it has no effect on the output of the network. Such dead nodes can still be connected later on in the evolution.
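The benchmark fitness rule described at the start of this section (start at one, subtract every squared error) can be sketched as follows; `net` is any callable candidate network, and the helper names are illustrative.

```python
def evaluate(net, examples):
    # Fitness starts at one; each example's squared error is subtracted.
    fitness = 1.0
    for inputs, target in examples:
        output = net(inputs)
        fitness -= (output - target) ** 2
    return fitness

XOR_EXAMPLES = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

# A perfect XOR responder reaches the maximum fitness of 1.0,
# comfortably above the 0.95 termination threshold.
perfect = lambda inp: inp[0] ^ inp[1]
```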
If they appear in the final neural network, such dead nodes or sections could be removed. This benchmark shows that NEAT has no problem solving a simple, low-complexity problem, though a small percentage of runs does not produce a solution within 1000 generations, showing the nondeterministic nature of the algorithm.

Figure 4.1: Histogram of the number of generations before a genome was found with a fitness over the threshold. (XOR problem, 1000 runs)

Figure 4.2: Mean fitness and standard deviation per generation. (XOR problem, 1000 runs)

(a) simple network (b) layered network (c) complex network

Figure 4.3: Example genomes from the XOR problem. The blue rectangles are the input nodes and the blue circles the output nodes.

4.1.3 XOR gate with noise

The XOR gate is a relatively simple problem without any noise. To get an indication of how well NEAT can handle noise, a version of the XOR gate with added noise in the inputs is examined. In figure 4.4 the dataset is depicted: four normal distributions centered around the permutations of zero and one, with zero representing logical false and one representing logical true. The data points are labeled true or false according to the normal distribution they originate from. At the edges there is a slight overlap between the distributions.

Figure 4.4: XOR with noise; points from normal distributions, labeled true or false.

Figure 4.5 shows the distribution of the number of generations before the algorithm terminated. The distribution is reminiscent of that of the XOR gate without noise, though with a higher mean number of generations. This is reflected in figure 4.6, which shows an initially slower growth in average fitness. The average fitness again stagnates around 0.62. There is a significantly higher number of runs that did not find a solution with a fitness above the threshold before the generation limit. This probably cannot be entirely attributed to the longer learning time, but may also reflect the random nature of the algorithm. Due to the noise, the classifier can no longer be expected to have perfect accuracy. In a validation test on a separate test set, the solutions have an average accuracy of 94%. This suggests NEAT does function with noisy data, given some extra learning time.

4.1.4 Iris dataset

The iris flower dataset consists of 4 features and one class per example. In total there are three classes. This gives it a more complex initial network than the XOR gate problem.

Figure 4.5: Histogram of the number of generations before a genome was found with a fitness over the threshold. (XOR problem with noise, 200 runs)

Figure 4.6: Mean fitness and standard deviation per generation. (XOR problem with noise, 200 runs)

Figure 4.7: Histogram of the number of generations before a genome was found with a fitness over the threshold. (IRIS problem, 205 runs)

The neural network takes the four features as input values and has three output neurons, one for each class. The fitness function uses 100 of the 150 examples for evaluation, and 50 for testing the resulting network. Figure 4.7 shows that the peak of the distribution of time until termination is rather low: around 50 generations. Very few runs had not found a solution above the threshold after 1000 generations. In a validation test with the test examples, the solutions have an average accuracy of 94%, just as with the XOR problem with noise. Figure 4.8 shows the average fitness stagnating around 0.72, which is higher than with the XOR problems. The low number of generations until termination and the higher fitness might be explained by a minimal level of noise in the data, making it easier for the model to find a solution.

4.1.5 Handwritten digit recognition

Handwritten digit recognition is also a common test for self-learning systems. This dataset provides labeled 8 by 8 gray-scale pictures. That gives it 64 features, and 64 input nodes in the network. There are 10 output nodes, one for each digit. This makes it a significantly more complex problem than the IRIS or XOR problems. Figure 4.9 shows that the average fitness over 5 runs quickly stagnates around 0.875, and never reaches the threshold. The standard deviation indicates little deviation between the runs. The average accuracy in a validation test is 35%. The resulting networks have only added a couple of loosely connected extra nodes. To test whether the lack of extra nodes is the limiting factor, a second test was done where the initial network has a fully connected hidden layer of 18 nodes. As can be seen in figure 4.10, the fitness grows in a very similar way to the first test. The resulting solutions have an average accuracy of 17%. It is possible that this is due to the problem simply having too large an initial network, resulting in a large search space. The initial fully connected network has 650 connections including bias connections, significantly more than the 15 of the IRIS problem.

4.1.6 Scaling complexity

To find the limits of the complexity the NEAT algorithm can handle, a generator for datasets is used. This generator creates clusters in a space; a number of dimensions and a number of clusters are given. Each cluster consists of points distributed according to a Gaussian distribution.

Figure 4.8: Mean fitness and standard deviation per generation over 205 runs. (IRIS problem)

Figure 4.9: Mean fitness and standard deviation per generation over 5 runs. (Handwritten digits problem)

Figure 4.10: Mean fitness and standard deviation per generation over 5 runs. (Handwritten digits problem with extra hidden layer)

The clusters are chosen to have minimal overlap, so they can theoretically be classified with minimal error. The NEAT algorithm has been run to evolve a classifier for the data, over a range of dimensions and numbers of classes. For each dimension-class combination the algorithm has been run 5 times and the results averaged. Table 4.1 and figure 4.11a show the average number of generations before termination. The maximum termination time was 500 generations. The combination of few features with many classes results in the longest average learning times, while many features with few classes makes for the shortest average learning times. This is due to the number of combinations of the input values that need to be formed. In a case with a low number of features and a high number of classes, the clusters can only be identified by their specific small ranges in the few dimensions there are. This requires precise combinations of the input values, meaning a more complex network. In a case with a high number of features and a low number of classes, a small number of input values can indicate the cluster of the point; knowing whether a feature has a high or low value can be enough. This means no complex combinations of the inputs are required, and therefore a less complex network.

These figures also show that a combination of a high number of features and a high number of classes results in a higher number of generations. This is possibly a result of the increase in size of the network. A larger network means more connections need to have a good value for a solution to work, so the search space of possible solutions is larger. Because the population size stays constant, it requires more learning time to find a working solution.

Table 4.2 and figure 4.11b show the performance of the resulting networks after termination. These have been tested on a separate test set. The white areas indicate that anything with a low number of classes performs relatively well. The case with a low number of features and a high number of classes understandably shows worse performance, since these cases rarely reached the threshold value, due to the long learning time. Interestingly, the cases with a high number of features and a high number of classes show similar performance, even though these did reach the threshold before the maximum number of generations. This is possibly caused by the correct output of a classifier being just one neuron with a high value. When there are many classes, this one output value represents only a small part of the potential error. Having all output values low, unrelated to the input, means only this small part counts toward the error, resulting in a relatively high fitness. This could be seen as the fitness function being unfit for this domain. A different fitness function, one that scales with the number of classes, could potentially be the solution.
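The dataset generator described in this section can be sketched as follows. The spacing and spread values are illustrative assumptions; with widely spaced centers and a small spread, overlap between clusters stays low, though this simple sketch does not strictly guarantee it.

```python
import random

def make_clusters(n_features, n_classes, points_per_class=50,
                  spread=0.5, spacing=10.0, seed=0):
    # One Gaussian cluster per class, each around its own random center.
    rng = random.Random(seed)
    data = []
    for label in range(n_classes):
        center = [rng.uniform(0, spacing) for _ in range(n_features)]
        for _ in range(points_per_class):
            point = [rng.gauss(c, spread) for c in center]
            data.append((point, label))
    return data

# e.g. the IRIS-like corner of the grid: 4 features, 3 classes
dataset = make_clusters(n_features=4, n_classes=3)
```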

Table 4.1: Average number of generations before termination (rows: number of features, columns: number of classes).

Table 4.2: Average performance of the resulting classifiers (rows: number of features, columns: number of classes).

(a) Heatmap of the number of generations before termination. A darker color means a longer learning time. (b) Heatmap of the performance of the resulting classifiers. A darker color means a worse performance.

Figure 4.11: Heatmaps of Tables 4.1 and 4.2.

4.2 Evolving swarm intelligence

Fitness function

In order to train the swarm to find the food in the environment, a suitable fitness function is required. Since the eventual goal of the swarm is to find and retrieve the food, fitness is increased for each unit of food picked up and each unit of food brought to the nest. However, this alone is not sufficient: such a function cannot distinguish between swarms that do not find any food, which the initial population often will not. Since a moving swarm is more likely to find food, fitness is also increased with the movement at each step of the simulation. This increase is, however, very small compared to the fitness gained for finding food, so that a swarm that merely moves a lot can never have a higher fitness than a swarm that finds food. To prevent the agents of a swarm from simply running into the edges of the simulation to gain fitness, fitness from movement is only assigned when no edge is being touched. The step from random movement to actively finding food is still a rather large one, and slight increases in performance are not reflected in the fitness. To address this, fitness for movement is only awarded when the agent moves towards its current target: either food or the nest.

Due to the random placement of the food, the fitness function is not deterministic. This increases the likelihood that good genes are lost because they randomly receive a lower fitness. One solution is to run the simulation several times and average the fitness, at the cost of extra computation time. This is not done here, however, since the algorithm can spend that computation time on running more generations, in which other good genes can appear; these effects roughly cancel each other out.

Results

For the simulation a swarm of 10 agents was used, going through 700 simulation steps. Figure 4.12 shows the average fitness of the population over 400 generations.
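The reward scheme described above can be sketched as a per-step fitness update. This is an illustrative reconstruction, not the thesis's actual implementation; the function and parameter names, and the relative reward sizes, are assumptions.

```python
def fitness_increment(food_collected, food_delivered,
                      touching_edge, moved_toward_target, step_distance,
                      movement_bonus=0.001):
    """One simulation step's fitness contribution for a single agent.

    Large rewards for picking up and delivering food, plus a very small
    movement bonus paid only when the agent is not touching an edge and
    is moving toward its current target (food or nest).  The bonus is
    deliberately tiny so movement alone can never outscore finding food.
    """
    fitness = 0.0
    fitness += 1.0 * food_collected      # food units picked up this step
    fitness += 1.0 * food_delivered      # food units brought to the nest
    if moved_toward_target and not touching_edge:
        fitness += movement_bonus * step_distance
    return fitness
```

Summing this increment over all agents and all simulation steps gives the genome's fitness for one evaluation.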
The fitness initially grows rapidly and then mostly stagnates. Visual inspection of the resulting behaviour shows that the agents in the best solution keep moving in constant circles, not showing any reaction to the presence of food. The initial jump in fitness comes from learning to move continuously; after that, the swarm never learns to find the food. This probably has to do with the fitness function not being granular enough to reflect the difference between moving in circles and moving toward the food.

To determine whether the communication through pheromones plays any role in this test, it is repeated without communication. To keep the initial network at the same complexity the structure is not changed, but the input for the pheromone strength is kept constant, rendering it nonfunctional. Figure 4.13 shows the average fitness of the population for this test. It shows a very similar pattern to the test with communication, though with a slightly slower fitness increase. Visual inspection of the resulting behaviour again shows agents quickly moving in circles. The slightly slower fitness increase in the test without communication can most likely be attributed to the random nature of NEAT. We can therefore conclude that communication does not occur in this test. It is possible that it may start to play a role if agents do manage to find the food.
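The ablation described above, keeping the network structure identical but clamping the pheromone input to a constant, can be sketched as a wrapper around the network's activation function. The names here are illustrative; only the idea (same input count, one input made uninformative) is taken from the text.

```python
def make_sensor_filter(activate, pheromone_index, constant=0.0):
    """Wrap a network's activate(inputs) function so that the pheromone
    input is replaced by a constant.  The network keeps the same number
    of inputs (and hence the same initial complexity), but can no
    longer react to pheromone strength."""
    def filtered_activate(inputs):
        inputs = list(inputs)
        inputs[pheromone_index] = constant
        return activate(inputs)
    return filtered_activate

# Example with a stand-in "network" that just sums its inputs:
net = make_sensor_filter(sum, pheromone_index=2)
print(net([1.0, 2.0, 5.0]))   # the 5.0 pheromone reading is ignored -> 3.0
```

With NEAT-Python, the same wrapper would be applied around the `activate` method of the evolved network before each simulation step.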

Figure 4.12: Fitness and standard deviation per generation. (Swarm intelligence with communication)

Figure 4.13: Fitness and standard deviation per generation. (Swarm intelligence without communication)

CHAPTER 5

Conclusions

The XOR and Iris problems show that the NEAT algorithm can find well-performing solutions to simple classification tasks. Even with the introduction of noise, NEAT can produce well-performing results. Handwritten digit recognition, which has significantly increased complexity, does not perform nearly as well, even after long periods of training or when initialized with an extra layer. This suggests NEAT does not scale well with complexity. The classification tests with scaling complexity reveal that this may be due to NEAT performing badly with a high number of classes, though the specific fitness function used might also be a factor in this result. These tests also show that NEAT might be less efficient when the network needs to combine many inputs to produce a correct output. Such is the case in handwritten digit recognition, or in classification with many classes in a low-dimensional space, where no single feature or small set of features can indicate the output alone. This might be a major hurdle in the evolution of swarm intelligence, since precise combinations of the inputs might be required for communication to work. The tests on swarm intelligence show that communication does not occur in the early stages of evolution, though it is still possible that it evolves when and if the agents become individually more successful at the task.

Even though reinforcement learning is a good fit to the problem, the experiments suggest that NEAT does not perform well for the evolution of swarm intelligence. An alternative to NEAT could be to evolve the network topology and the connection weights separately: a genetic algorithm evolves the topology, and for classification tasks backpropagation trains the connection weights. For swarm intelligence backpropagation cannot be used, since it requires labelled training examples. Instead, HyperNEAT [10] could be used.


Bibliography

[1] Simon Garnier, Jacques Gautrais, and Guy Theraulaz. The biological principles of swarm intelligence. In: Swarm Intelligence 1.1 (2007).
[2] John H. Holland. Adaptation in Natural and Artificial Systems. MIT Press.
[3] Duncan E. Jackson and Francis L.W. Ratnieks. Communication in ants. In: Current Biology (2006).
[4] CodeReclaimers LLC. NEAT-Python Documentation.
[5] Brad L. Miller and Michael J. Shaw. Genetic algorithms with dynamic niche sharing for multimodal function optimization. In: Proceedings of IEEE International Conference on Evolutionary Computation (1996).
[6] Melanie Mitchell. An Introduction to Genetic Algorithms. MIT Press.
[7] Warren S. Sarle. Neural networks and statistical models. In: Proceedings of the Nineteenth Annual SAS Users Group International Conference (1994).
[8] Kumara Sastry, David E. Goldberg, and Graham Kendall. Genetic Algorithms. Springer.
[9] Kenneth O. Stanley, Bobby D. Bryant, and Risto Miikkulainen. Real-time neuroevolution in the NERO video game. In: IEEE Transactions on Evolutionary Computation (2005).
[10] Kenneth O. Stanley, David B. D'Ambrosio, and Jason Gauci. A hypercube-based encoding for evolving large-scale neural networks. In: Artificial Life 15.2 (2009).
[11] Kenneth O. Stanley and Risto Miikkulainen. Evolving neural networks through augmenting topologies. In: Evolutionary Computation 10.2 (2002).
[12] Darrell Whitley. A genetic algorithm tutorial. In: Statistics and Computing 4.2 (1994).


APPENDIX A

Parameters for NEAT algorithm

These are the parameters of the NEAT-Python implementation used for the experiments in this paper. A detailed description of each parameter can be found in the NEAT-Python documentation [4].

[NEAT]
fitness_criterion = max
fitness_threshold = 0.95
pop_size = {50 for XOR problems, 100 for all other problems}
reset_on_extinction = False

[DefaultGenome]
# node activation options
activation_default = sigmoid
activation_mutate_rate = 0.0
activation_options = sigmoid

# node aggregation options
aggregation_default = sum
aggregation_mutate_rate = 0.0
aggregation_options = sum

# genome compatibility options
compatibility_disjoint_coefficient = 1.0
compatibility_weight_coefficient = 0.5

# connection add/remove rates
conn_add_prob = 0.5
conn_delete_prob = 0.5

# connection enable options
enabled_default = True
enabled_mutate_rate = 0.01

feed_forward = True
initial_connection = full

# node add/remove rates
node_add_prob = 0.2
node_delete_prob =

# network parameters
num_hidden = {18 for handwritten digits, 0 for all other problems}
num_inputs = {Changing per problem}
num_outputs = {Changing per problem}

# node response options
response_init_mean = 1.0
response_init_stdev = 0.0
response_max_value = 30.0
response_min_value =
response_mutate_power = 0.0
response_mutate_rate = 0.0
response_replace_rate = 0.0

# connection weight options
weight_init_mean = 0.0
weight_init_stdev = 1.0
weight_max_value = 30
weight_min_value = -30
weight_mutate_power = 0.5
weight_mutate_rate = 0.8
weight_replace_rate = 0.1

# node bias options
bias_init_mean = 0.0
bias_init_stdev = 1.0
bias_max_value = 30.0
bias_min_value =
bias_mutate_power = 0.5
bias_mutate_rate = 0.7
bias_replace_rate = 0.1

[DefaultSpeciesSet]
compatibility_threshold = {2.0 for handwritten digits, 3.0 for all other problems}

[DefaultStagnation]
species_fitness_func = max
max_stagnation = {50 for handwritten digits, 20 for all other problems}
species_elitism = 2

[DefaultReproduction]
elitism = 2
survival_threshold =
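The parameter file above uses an INI-style layout (NEAT-Python itself loads it through its `neat.Config` class), so a quick way to inspect such a file is Python's standard `configparser`. A small self-contained sketch, using an abbreviated copy of the `[NEAT]` section above:

```python
import configparser

# Abbreviated copy of the [NEAT] section from the appendix.
CONFIG_TEXT = """
[NEAT]
fitness_criterion = max
fitness_threshold = 0.95
pop_size = 100
reset_on_extinction = False
"""

parser = configparser.ConfigParser()
parser.read_string(CONFIG_TEXT)

# Typed accessors convert the raw strings to the expected types.
neat_section = parser["NEAT"]
print(neat_section.getfloat("fitness_threshold"))      # 0.95
print(neat_section.getint("pop_size"))                 # 100
print(neat_section.getboolean("reset_on_extinction"))  # False
```

For a file on disk, `parser.read("config")` replaces `read_string`; the per-problem values in curly braces above would of course have to be written out as concrete numbers first.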


More information

Genetic Programming for Symbolic Regression

Genetic Programming for Symbolic Regression Genetic Programming for Symbolic Regression Chi Zhang Department of Electrical Engineering and Computer Science, University of Tennessee, Knoxville, TN 37996, USA Email: czhang24@utk.edu Abstract Genetic

More information

EFFECT OF CROSS OVER OPERATOR IN GENETIC ALGORITHMS ON ANTICIPATORY SCHEDULING

EFFECT OF CROSS OVER OPERATOR IN GENETIC ALGORITHMS ON ANTICIPATORY SCHEDULING 24th International Symposium on on Automation & Robotics in in Construction (ISARC 2007) Construction Automation Group, I.I.T. Madras EFFECT OF CROSS OVER OPERATOR IN GENETIC ALGORITHMS ON ANTICIPATORY

More information

Permutation Free Encoding Technique for Evolving Neural Networks

Permutation Free Encoding Technique for Evolving Neural Networks Permutation Free Encoding Technique for Evolving Neural Networks Anupam Das, Md. Shohrab Hossain, Saeed Muhammad Abdullah, and Rashed Ul Islam Department of Computer Science and Engineering, Bangladesh

More information

Using Multi-chromosomes to Solve. Hans J. Pierrot and Robert Hinterding. Victoria University of Technology

Using Multi-chromosomes to Solve. Hans J. Pierrot and Robert Hinterding. Victoria University of Technology Using Multi-chromosomes to Solve a Simple Mixed Integer Problem Hans J. Pierrot and Robert Hinterding Department of Computer and Mathematical Sciences Victoria University of Technology PO Box 14428 MCMC

More information

Dominant and Recessive Genes in Evolutionary Systems Applied to Spatial Reasoning

Dominant and Recessive Genes in Evolutionary Systems Applied to Spatial Reasoning Dominant and Recessive Genes in Evolutionary Systems Applied to Spatial Reasoning Thorsten Schnier and John Gero Key Centre of Design Computing Department of Architectural and Design Science University

More information

A Genetic Algorithm for Order Picking in Automated Storage and Retrieval Systems with Multiple Stock Locations

A Genetic Algorithm for Order Picking in Automated Storage and Retrieval Systems with Multiple Stock Locations IEMS Vol. 4, No. 2, pp. 36-44, December 25. A Genetic Algorithm for Order Picing in Automated Storage and Retrieval Systems with Multiple Stoc Locations Yaghoub Khojasteh Ghamari Graduate School of Systems

More information

Genetic Algorithm and Neural Network

Genetic Algorithm and Neural Network Proceedings of the 7th WSEAS International Conference on Applied Informatics and Communications, Athens, Greece, August 24-26, 2007 345 Genetic Algorithm and Neural Network JIRI STASTNY*, VLADISLAV SKORPIL**

More information

Available online at International Journal of Current Research Vol. 9, Issue, 07, pp , July, 2017

Available online at   International Journal of Current Research Vol. 9, Issue, 07, pp , July, 2017 z Available online at http://www.journalcra.com International Journal of Current Research Vol. 9, Issue, 07, pp.53529-53533, July, 2017 INTERNATIONAL JOURNAL OF CURRENT RESEARCH ISSN: 0975-833X RESEARCH

More information

SEISMIC ATTRIBUTES SELECTION AND POROSITY PREDICTION USING MODIFIED ARTIFICIAL IMMUNE NETWORK ALGORITHM

SEISMIC ATTRIBUTES SELECTION AND POROSITY PREDICTION USING MODIFIED ARTIFICIAL IMMUNE NETWORK ALGORITHM Journal of Engineering Science and Technology Vol. 13, No. 3 (2018) 755-765 School of Engineering, Taylor s University SEISMIC ATTRIBUTES SELECTION AND POROSITY PREDICTION USING MODIFIED ARTIFICIAL IMMUNE

More information

The Metaphor. Individuals living in that environment Individual s degree of adaptation to its surrounding environment

The Metaphor. Individuals living in that environment Individual s degree of adaptation to its surrounding environment Genetic Algorithms Sesi 14 Optimization Techniques Mathematical Programming Network Analysis Branch & Bound Simulated Annealing Tabu Search Classes of Search Techniques Calculus Base Techniqes Fibonacci

More information

Evolving Control for Micro Aerial Vehicles (MAVs)

Evolving Control for Micro Aerial Vehicles (MAVs) Evolving Control for Micro Aerial Vehicles (MAVs) M. Rhodes, G. Tener, and A. S. Wu Abstract This paper further explores the use of a genetic algorithm for the purposes of evolving the control systems

More information

Ant Colony Optimisation

Ant Colony Optimisation Ant Colony Optimisation Alexander Mathews, Angeline Honggowarsito & Perry Brown 1 Image Source: http://baynature.org/articles/the-ants-go-marching-one-by-one/ Contents Introduction to Ant Colony Optimisation

More information

Feature Selection for Predictive Modelling - a Needle in a Haystack Problem

Feature Selection for Predictive Modelling - a Needle in a Haystack Problem Paper AB07 Feature Selection for Predictive Modelling - a Needle in a Haystack Problem Munshi Imran Hossain, Cytel Statistical Software & Services Pvt. Ltd., Pune, India Sudipta Basu, Cytel Statistical

More information

Minimizing Makespan for Machine Scheduling and Worker Assignment Problem in Identical Parallel Machine Models Using GA

Minimizing Makespan for Machine Scheduling and Worker Assignment Problem in Identical Parallel Machine Models Using GA , June 30 - July 2, 2010, London, U.K. Minimizing Makespan for Machine Scheduling and Worker Assignment Problem in Identical Parallel Machine Models Using GA Imran Ali Chaudhry, Sultan Mahmood and Riaz

More information

Genetic Algorithms and Genetic Programming Lecture 2. Syllabus Reminder. Admin Reminder

Genetic Algorithms and Genetic Programming Lecture 2. Syllabus Reminder. Admin Reminder Genetic Algorithms and Genetic Programming Lecture 2 Admin Reminder Lecturer: Gillian Hayes, IPAB, School of Informatics Email: gmh@inf.ed.ac.uk Office: Informatics Forum 1.22, ext. 513440 Course Activities:

More information

TIMETABLING EXPERIMENTS USING GENETIC ALGORITHMS. Liviu Lalescu, Costin Badica

TIMETABLING EXPERIMENTS USING GENETIC ALGORITHMS. Liviu Lalescu, Costin Badica TIMETABLING EXPERIMENTS USING GENETIC ALGORITHMS Liviu Lalescu, Costin Badica University of Craiova, Faculty of Control, Computers and Electronics Software Engineering Department, str.tehnicii, 5, Craiova,

More information

Ant Colony Optimization

Ant Colony Optimization Ant Colony Optimization Part 4: Algorithms Fall 2009 Instructor: Dr. Masoud Yaghini Ant Colony Optimization: Part 4 Outline The Traveling Salesman Problem ACO Algorithms for TSP Ant System (AS) Elitist

More information

An Evolutionary Algorithm Based On The Aphid Life Cycle

An Evolutionary Algorithm Based On The Aphid Life Cycle International Journal of Computer Information Systems and Industrial Management Applications. ISSN 2150-7988 Volume 8 (2016) pp. 155 162 c MIR Labs, www.mirlabs.net/ijcisim/index.html An Evolutionary Algorithm

More information

Immune Programming. Payman Samadi. Supervisor: Dr. Majid Ahmadi. March Department of Electrical & Computer Engineering University of Windsor

Immune Programming. Payman Samadi. Supervisor: Dr. Majid Ahmadi. March Department of Electrical & Computer Engineering University of Windsor Immune Programming Payman Samadi Supervisor: Dr. Majid Ahmadi March 2006 Department of Electrical & Computer Engineering University of Windsor OUTLINE Introduction Biological Immune System Artificial Immune

More information

NEUROEVOLUTION AND AN APPLICATION OF AN AGENT BASED MODEL FOR FINANCIAL MARKET

NEUROEVOLUTION AND AN APPLICATION OF AN AGENT BASED MODEL FOR FINANCIAL MARKET City University of New York (CUNY) CUNY Academic Works Master's Theses City College of New York 2014 NEUROEVOLUTION AND AN APPLICATION OF AN AGENT BASED MODEL FOR FINANCIAL MARKET Anil Yaman CUNY City

More information

An Evolutionary Approach involving Training of ANFIS with the help of Genetic Algorithm for PID Controller Tuning

An Evolutionary Approach involving Training of ANFIS with the help of Genetic Algorithm for PID Controller Tuning An Evolutionary Approach involving Training of ANFIS with the help of Genetic Algorithm for PID... An Evolutionary Approach involving Training of ANFIS with the help of Genetic Algorithm for PID Controller

More information

Genetic Algorithms using Populations based on Multisets

Genetic Algorithms using Populations based on Multisets Genetic Algorithms using Populations based on Multisets António Manso 1, Luís Correia 1 1 LabMAg - Laboratório de Modelação de Agentes Faculdade de Ciências da Universidade de Lisboa Edifício C6, Piso

More information

Introduction To Genetic Algorithms

Introduction To Genetic Algorithms Introduction To Genetic Algorithms Cse634 DATA MINING Professor Anita Wasilewska Computer Science Department Stony Brook University 1 Overview Introduction To Genetic Algorithms (GA) GA Operators and Parameters

More information

PDGA: the Primal-Dual Genetic Algorithm

PDGA: the Primal-Dual Genetic Algorithm P: the Primal-Dual Genetic Algorithm Shengxiang Yang Department of Computer Science University of Leicester University Road, Leicester LE1 7RH, UK Email: syang@mcsleacuk Abstract Genetic algorithms (GAs)

More information

A Genetic Algorithm on Inventory Routing Problem

A Genetic Algorithm on Inventory Routing Problem A Genetic Algorithm on Inventory Routing Problem Artvin Çoruh University e-mail: nevin.aydin@gmail.com Volume 3 No 3 (2014) ISSN 2158-8708 (online) DOI 10.5195/emaj.2014.31 http://emaj.pitt.edu Abstract

More information

Metaheuristics and Cognitive Models for Autonomous Robot Navigation

Metaheuristics and Cognitive Models for Autonomous Robot Navigation Metaheuristics and Cognitive Models for Autonomous Robot Navigation Raj Korpan Department of Computer Science The Graduate Center, CUNY Second Exam Presentation April 25, 2017 1 / 31 Autonomous robot navigation

More information

ESQUIVEL S.C., LEIVA H. A., GALLARD, R.H.

ESQUIVEL S.C., LEIVA H. A., GALLARD, R.H. SELF-ADAPTATION OF PARAMETERS FOR MCPC IN GENETIC ALGORITHMS ESQUIVEL S.C., LEIVA H. A., GALLARD, R.H. Proyecto UNSL-338403 1 Departamento de Informática Universidad Nacional de San Luis (UNSL) Ejército

More information

SimBa: A novel similarity-based crossover for neuro-evolution

SimBa: A novel similarity-based crossover for neuro-evolution SimBa: A novel similarity-based crossover for neuro-evolution Mauro Dragoni, Antonia Azzini, Andrea G. B. Tettamanzi To cite this version: Mauro Dragoni, Antonia Azzini, Andrea G. B. Tettamanzi. SimBa:

More information

Software Next Release Planning Approach through Exact Optimization

Software Next Release Planning Approach through Exact Optimization Software Next Release Planning Approach through Optimization Fabrício G. Freitas, Daniel P. Coutinho, Jerffeson T. Souza Optimization in Software Engineering Group (GOES) Natural and Intelligent Computation

More information

Genetic algorithms and code optimization. A quiet revolution

Genetic algorithms and code optimization. A quiet revolution Genetic algorithms and code optimization Devika Subramanian Rice University Work supported by DARPA and the USAF Research Labs A quiet revolution (May 1997) Deep Blue vs Kasparaov first match won against

More information

The Impact of Population Size on Knowledge Acquisition in Genetic Algorithms Paradigm: Finding Solutions in the Game of Sudoku

The Impact of Population Size on Knowledge Acquisition in Genetic Algorithms Paradigm: Finding Solutions in the Game of Sudoku The Impact of Population Size on Knowledge Acquisition in Genetic Algorithms Paradigm: Finding Solutions in the Game of Sudoku Nordin Abu Bakar, Muhammad Fadhil Mahadzir Faculty of Computer & Mathematical

More information

Intro. ANN & Fuzzy Systems. Lecture 36 GENETIC ALGORITHM (1)

Intro. ANN & Fuzzy Systems. Lecture 36 GENETIC ALGORITHM (1) Lecture 36 GENETIC ALGORITHM (1) Outline What is a Genetic Algorithm? An Example Components of a Genetic Algorithm Representation of gene Selection Criteria Reproduction Rules Cross-over Mutation Potential

More information

CHAPTER 3 RESEARCH METHODOLOGY

CHAPTER 3 RESEARCH METHODOLOGY 72 CHAPTER 3 RESEARCH METHODOLOGY Inventory management is considered to be an important field in Supply chain management. Once the efficient and effective management of inventory is carried out throughout

More information