CHAPTER 4 LINEAR ANTENNA ARRAY SYNTHESIS USING GENETIC ALGORITHM


4.1 INTRODUCTION

Improving antenna characteristics such as directivity, input impedance, bandwidth, sidelobe level, null depth and size has always been the goal of antenna researchers. It is important for antenna researchers to have up-to-date tools to design antennas that meet given specifications effectively. Optimization techniques are used either to synthesize an antenna from given radiation characteristics or simply to improve existing antenna designs. Search routines that rely on numerical methods to provide the radiation properties of an antenna generally consume a considerable amount of time. Therefore, a great deal of work has been devoted to developing optimization routines that rapidly and accurately search out an optimum solution. An optimization scheme based on genetic algorithms (GA) (Teruel and Rajo-Iglesias 2006, Stephen 2008, Goldberg and Holland 1998) has been used to solve a number of electromagnetic problems. Genetic algorithms are adaptive heuristic search algorithms: programming techniques that mimic biological evolution as a problem-solving strategy, based on the evolutionary ideas of natural selection and genetics. Genetic algorithms are suited to environments in which there is

a very large set of candidate solutions and in which the search space is uneven, with many hills and valleys. Genetic algorithms are useful and efficient when

- the search space is large, complex and poorly understood,
- domain knowledge is scarce or expert knowledge is difficult to encode to narrow the search space,
- no mathematical analysis is available, and
- traditional search methods fail.

The genetic algorithm is a robust, stochastic search method that models its processes on the principles of natural selection and evolution. Genetic algorithms are therefore very useful for finding optimum antenna designs that maximize or minimize certain radiation properties. The GA arrives at optimal solutions by successively creating populations that improve over many generations. Selecting, mating and mutating the previous population creates each new generation. This process continues until the population converges to a single optimal solution. The selection process is based on the rating of each member relative to the rest of the population, obtained by fitness testing of each individual. The fitness value can be the gain, axial ratio, input impedance, size, sidelobe level or any combination thereof, and is supplied by some type of numerical code that can provide the radiation characteristics of the antenna. In this thesis, a MATLAB code for sidelobe level (SLL) reduction and null depth is used as the fitness function from which the fitness value is obtained. This chapter discusses the deployment of the genetic algorithm optimization method for the synthesis of antenna array radiation patterns in adaptive beamforming. The synthesis problem discussed is to find the weights

of the uniform linear antenna array elements that are optimum in providing a radiation pattern with maximum reduction in the sidelobe level (Basak et al 2008).

4.2 GENETIC ALGORITHM AS AN OPTIMIZATION TOOL

Holland performed much of the foundational work on genetic algorithms during the 1960s and 1970s. His goal of understanding the processes of natural adaptation and designing biologically inspired artificial systems led to the formulation of the simple genetic algorithm. Since its conception, genetic algorithms have enjoyed worldwide use by researchers and scientists in many different areas. Although computer scientists can take much of the credit for the development of the GA, fields such as business, science and engineering have put it to good use. Researchers, who have historically sought better and cheaper solutions, find particular interest in the GA. In electromagnetics, almost all problems are non-differentiable, and some type of numerical solution is required to calculate the important characteristics of the problem. When optimization is the goal, these problems lend themselves very well to genetic algorithms. Antenna design is an area of electromagnetics that has benefited from the use of GAs. Haupt (1994, 1995, 2004, 2008) compiled a great deal of work on antenna design using genetic algorithms. Array thinning and array synthesis are two areas where genetic algorithms have proven their usefulness; in both cases, the GA has provided results that exceed previous attempts to improve array designs. Genetic algorithms are typically implemented as computer experiments in which an optimization problem is specified. For this problem, members of a space of candidate solutions, called individuals, are represented

using abstract representations called chromosomes. The GA is an iterative process that evolves a working set of individuals, called a population, toward an objective function, or fitness function. Traditionally, solutions are represented using fixed-length strings, especially binary strings, but alternative encodings have been developed. The key terms related to the genetic algorithm are:

- Individual: any possible solution
- Population: group of all individuals
- Search space: all possible solutions to the problem
- Chromosome: blueprint for an individual
- Trait: possible aspect of an individual
- Allele: possible setting of a trait
- Locus: the position of a gene on a chromosome
- Genome: collection of all chromosomes for an individual

The evolutionary process of a GA is a highly simplified and stylized simulation of biological evolution. It starts from a population of individuals randomly generated according to some probability distribution, usually uniform, and updates this population in steps called generations. In each generation, multiple individuals are randomly selected from the current population based on their fitness, bred using crossover, and modified through mutation to form a new population. The genetic algorithm operators are defined below:

- Crossover: exchange of genetic material (substrings) denoting rules, structural components or features of a machine learning, search, or optimization problem

- Selection: the application of the fitness criterion to choose which individuals from a population will go on to reproduce
- Reproduction: the propagation of individuals from one generation to the next
- Mutation: the modification of chromosomes of single individuals

Current GA theory consists of two main approaches: Markov chain analysis and schema theory. Markov chain analysis is primarily concerned with characterizing the stochastic dynamics of a GA system, i.e., the behavior of the random sampling mechanism of a GA over time. The most severe limitation of this approach is that while crossover is easy to implement, its dynamics are difficult to describe mathematically. Markov chain analysis of simple GAs has therefore been more successful at capturing the behavior of evolutionary algorithms with selection and mutation only, such as evolutionary programming and evolution strategies. A schema is a generalized description or a conceptual system for understanding knowledge: how knowledge is represented and how it is used. According to this theory, schemata represent knowledge about concepts: objects and the relationships they have with other objects, situations, events, sequences of events, actions, and sequences of actions.

4.2.1 Flow Chart for Simple Genetic Algorithm

The flow of the simple genetic algorithm can be explained through the flowchart representation shown in Figure 4.1.

The stages of the cycle are: initial population (first generation), or the best-fit individuals together with the new population in the remaining generations; fitness evaluation on the cost function for each generation; selection of the best-fit individuals and rejection of the least-fit members; breeding using the genetic operators crossover and mutation; and generation of new offspring.

Figure 4.1 GA reproduction cycle

The cycle is repeated until a termination condition has been reached, such as
1. A solution that satisfies the minimum criteria.
2. Reaching the specified number of generations.
3. Reaching the specified computation time.
4. Arriving at a target fitness value.
5. Manual inspection.
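The cycle of Figure 4.1 can be summarized by the following MATLAB sketch of a simple binary-coded GA. All names and numerical settings (popSize, nBits, the placeholder cost function, and so on) are illustrative assumptions for this example and are not taken from the thesis code.

```matlab
% Minimal sketch of the GA reproduction cycle of Figure 4.1 (illustrative only).
popSize = 20; nBits = 16; maxGen = 50; pCross = 0.8; pMut = 0.01;
evaluateCost = @(pop) sum(pop, 2);        % placeholder cost: minimize number of ones

pop = randi([0 1], popSize, nBits);       % initial random population (first generation)
for gen = 1:maxGen
    cost = evaluateCost(pop);             % fitness on cost function evaluations
    [cost, idx] = sort(cost); pop = pop(idx, :);
    pop = pop(1:popSize/2, :);            % keep the best fit, reject the least fit half
    nKeep = popSize/2;                    % breed: single point crossover of random pairs
    offspring = zeros(nKeep, nBits);
    for k = 1:2:nKeep
        p = pop(randi(nKeep), :); q = pop(randi(nKeep), :);
        if rand < pCross
            cut = randi(nBits - 1);
            offspring(k, :)   = [p(1:cut) q(cut+1:end)];
            offspring(k+1, :) = [q(1:cut) p(cut+1:end)];
        else
            offspring(k, :) = p; offspring(k+1, :) = q;
        end
    end
    mask = rand(size(offspring)) < pMut;  % mutation: flip random bits
    offspring(mask) = 1 - offspring(mask);
    pop = [pop; offspring];               % new generation = survivors + offspring
    if cost(1) <= 0, break; end           % termination condition (criterion satisfied)
end
```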

Traditionally, solutions are represented using fixed-length strings, especially binary strings, but alternative encodings have also been developed. Genetic algorithms provide a directed random search in complex landscapes. In nature, all living organisms have certain physical characteristics or traits, known collectively as the phenome. The phenome is encoded into a set of genetic data structures, termed the genome. A particular set of genetic information is a genotype, and likewise a particular set of physical characteristics, or traits, is a phenotype. There may, or may not, be a direct one-to-one mapping of genotypes to phenotypes. These physical characteristics determine how well suited a particular organism is to its environment. The suitability of a given organism to its environment is usually measured as its fitness. Computationally, it is usual to evaluate the fitness of an organism directly, without considering any kind of phenome.

4.2.2 Encoding

In GAs, encoding the solutions of the problem into chromosomes is a key issue. John Holland used a single binary bit string. The problem associated with encoding is that some individuals may correspond to infeasible or illegal solutions, which can become severe for constrained and combinatorial optimization problems. An infeasible solution is one whose chromosome lies outside the feasible region of the given problem; penalty methods can be used to handle infeasible chromosomes. Illegal solutions have chromosomes that do not represent any solution; repair techniques are usually adopted to convert an illegal chromosome into a legal one. Many different methods for encoding the genetic information are in common use today: tree encoding, real-valued arrays, permutations, Gray encoding and so on.

In general, encoding methods can be classified as follows:
i) Binary encoding
ii) Real-number encoding
iii) Integer or literal permutation encoding

4.2.2.1 Binary encoding

Binary encoding (i.e., bit strings) is the most common encoding used, and much of the existing GA theory is based on the assumption of binary encoding. However, binary code does not preserve the locality of points in the phenotype space.

4.2.2.2 Real-number encoding

Real-number encoding performs better than binary encoding for function optimization and constrained optimization problems. In real-number encoding, the structure of the genotype space is identical to that of the phenotype space; therefore, it is easy to form effective genetic operators by borrowing useful techniques from conventional methods.

4.2.2.3 Permutation encoding

Permutation encoding is best used for combinatorial optimization problems because it is useful in searching for the best permutation or combination of items subject to constraints. The encoding schemes can be better understood while dealing with genetic operators such as crossover and mutation; a small illustration of binary and real-number encoding is sketched below.
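As an illustration of the difference between binary and real-number encoding, the following MATLAB sketch encodes one amplitude weight in [0, 1] both ways. The 10-bit gene length, the example value and the variable names are assumptions made for this example, not values from the thesis.

```matlab
% Binary vs. real-number encoding of one amplitude weight in [0, 1] (illustrative sketch).
nBits = 10;                                  % assumed gene length
w     = 0.731;                               % example phenotype value (amplitude weight)

% Real-number encoding: genotype and phenotype are identical.
geneReal = w;

% Binary encoding: quantize to one of 2^nBits levels and store as a bit string.
level   = round(w * (2^nBits - 1));          % integer level in 0 .. 2^nBits - 1
geneBin = dec2bin(level, nBits) - '0';       % row vector of 0/1 bits

% Decoding the binary gene back to a phenotype value (quantization error appears here).
wDecoded = bin2dec(char(geneBin + '0')) / (2^nBits - 1);
fprintf('original %.4f, decoded %.4f\n', w, wDecoded);
```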

4.3 GENETIC ALGORITHM OPERATORS

There are two basic genetic algorithm operators, crossover and mutation. By creating new variants of the chromosomes, these operators perform a blind search that works both to explore, i.e. to investigate new and unknown areas of the search space, and to exploit, i.e. to make use of knowledge of solutions previously found in the search space to help find better solutions. It is recognized that the mutation operator plays as important a role as crossover.

4.3.1 Crossover or Recombination

Crossover plays an important role in the design and implementation of robust evolutionary systems. In most GAs, individuals are represented by fixed-length strings, and crossover operates on pairs of individuals (parents) to produce new strings (offspring) by exchanging segments of the parent strings. Traditionally, the number of crossover points (which determines how many segments are exchanged) has been fixed at a very low constant value of 1 or 2.

4.3.1.1 Single point crossover

A commonly used method for crossover is single point crossover, shown in Figure 4.2. In this method, a single crossover position (called the cut point) is chosen at random and the parts of the two parents after the crossover position are exchanged to form two offspring.

Figure 4.2 Single point crossover

4.3.1.2 Multi point crossover

Multi-point crossover is a generalization of single point crossover that introduces a higher number of cut points. In this case several positions are chosen at random and the segments between them are exchanged, as shown in Figure 4.3.

Figure 4.3 Multi point crossover

4.3.1.3 Uniform crossover

Uniform crossover does not use cut points; instead, a global parameter indicates the likelihood that each variable is exchanged between the two parents, as shown in Figure 4.4.

Figure 4.4 Uniform crossover
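The three crossover variants can be compared with the following MATLAB sketch operating on two binary parents. The parent strings, the 10-bit length and the swap probability are assumptions made purely for illustration.

```matlab
% Illustrative comparison of single point, multi point and uniform crossover
% on two binary parents (values and names are assumptions, not thesis code).
p1 = [1 1 1 1 1 1 1 1 1 1];
p2 = [0 0 0 0 0 0 0 0 0 0];
n  = numel(p1);

% Single point crossover: swap everything after one random cut point.
cut = randi(n - 1);
c1  = [p1(1:cut) p2(cut+1:end)];
c2  = [p2(1:cut) p1(cut+1:end)];

% Multi point (two point) crossover: swap the segment between two cut points.
cuts = sort(randperm(n - 1, 2));
d1 = p1; d2 = p2;
d1(cuts(1)+1:cuts(2)) = p2(cuts(1)+1:cuts(2));
d2(cuts(1)+1:cuts(2)) = p1(cuts(1)+1:cuts(2));

% Uniform crossover: each gene is swapped independently with probability pSwap.
pSwap = 0.5;
mask  = rand(1, n) < pSwap;
u1 = p1; u2 = p2;
u1(mask) = p2(mask);
u2(mask) = p1(mask);
disp([c1; c2; d1; d2; u1; u2]);
```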

4.3.2 Mutation

Mutation is a common operator used to help preserve diversity in the population by finding new points in the search space to evaluate. When a chromosome is chosen for mutation, a random change is made to the values of some locations in the chromosome. A commonly used method is single point mutation, although special mutation types are used for various problem kinds and encoding methods.

4.3.2.1 Single point mutation

A single gene (or chromosome, or even individual) is randomly selected to be mutated and its value is changed depending on the encoding type used, as shown in Figure 4.5.

Figure 4.5 Single point mutation

4.3.2.2 Multi point mutation

Several genes (or chromosomes, or even individuals) are randomly selected to be mutated and their values are changed depending on the encoding type used, as shown in Figure 4.6.

Figure 4.6 Multi point mutation
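A minimal MATLAB sketch of single point and multi point mutation for a binary chromosome is given below; the example chromosome, the mutation rate and the variable names are illustrative assumptions.

```matlab
% Illustrative single point and multi point mutation of a binary chromosome
% (the chromosome, the rate and all names are assumptions for the example).
chrom = [1 0 1 1 0 0 1 0 1 1];
n     = numel(chrom);

% Single point mutation: flip one randomly chosen bit.
pos         = randi(n);
single      = chrom;
single(pos) = 1 - single(pos);

% Multi point mutation: flip each bit independently with probability pMut.
pMut  = 0.1;
mask  = rand(1, n) < pMut;
multi = chrom;
multi(mask) = 1 - multi(mask);

disp([chrom; single; multi]);
```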

4.3.3 Selection

Selection is the process of determining the number of times a particular individual is chosen for reproduction and, thus, the number of offspring that it will produce. Selection provides the driving force in genetic algorithms: with too much force, the genetic search terminates prematurely, while with too little force, evolutionary progress is slower than necessary. Typically, a lower selection pressure is indicated at the start of the genetic search, in favor of a wide exploration of the search space, while a higher selection pressure is recommended at the end to narrow the search space. In this way, selection directs the genetic search toward promising regions of the search space and improves the performance of genetic algorithms. Many selection methods have been proposed, examined and compared. The most common types are
1) Roulette wheel selection
2) Rank selection
3) Tournament selection
4) Steady state selection
5) Elitism

4.3.3.1 Roulette wheel selection

Roulette wheel selection is the most common selection method used in genetic algorithms for selecting potentially useful individuals (solutions) for crossover and mutation. In roulette wheel selection, shown in Figure 4.7, as in all selection methods, possible solutions are assigned a fitness by the fitness function. This fitness level is used to associate a probability of selection with each individual. While candidate solutions with a higher fitness are less likely to be eliminated, there is still a chance that they may be. With roulette wheel selection there is a chance some weaker solutions may

survive the selection process; this is an advantage because, although a solution may be weak, it may include some component that proves useful following the recombination process. The analogy can be envisaged by imagining a roulette wheel in which each candidate solution represents a pocket; the size of each pocket is proportional to the probability of selection of that solution. Selecting N individuals from the population is equivalent to playing N games on the roulette wheel, as each candidate is drawn independently.

Figure 4.7 Roulette wheel selection

4.3.3.2 Rank selection

In rank selection, as shown in Figure 4.8, the individuals in the population are sorted from best to worst according to their fitness values. Each individual in the population is assigned a numerical rank based on fitness, and selection is based on this ranking rather than on differences in fitness. The advantage of this method is that it can prevent very fit individuals from gaining dominance early at the expense of less fit ones, which would reduce the population's genetic diversity and might hinder attempts to find an acceptable solution. The disadvantage is that it requires sorting the entire population by rank, which is a potentially time-consuming procedure; a sketch of both schemes follows.
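The following MATLAB sketch illustrates fitness-proportionate (roulette wheel) selection and rank-based selection for a minimization problem such as sidelobe level reduction. The cost values, the weighting against the worst cost in the pool and all variable names are assumptions chosen only for the example.

```matlab
% Illustrative roulette wheel and rank selection for a minimization problem
% (example costs and names are assumptions, not values from the thesis).
cost = [-26.9 -19.8 -18.2 -17.6 -17.5];      % lower (more negative) is better
nPop = numel(cost);

% Roulette wheel: selection probability proportional to how much each member
% improves on the worst cost in the pool.
weight = max(cost) - cost;                   % all weights >= 0, best has the largest
pRoul  = weight / sum(weight);

% Rank selection: probability depends only on rank, not on cost differences.
[~, order] = sort(cost);                     % best first
rankWeight(order) = nPop:-1:1;               % best gets weight nPop, worst gets 1
pRank = rankWeight / sum(rankWeight);

% Draw one parent index by spinning the wheel (inverse CDF sampling).
pick = find(cumsum(pRoul) >= rand, 1, 'first');
fprintf('roulette probs: %s\nrank probs: %s\nselected parent: %d\n', ...
        mat2str(pRoul, 3), mat2str(pRank, 3), pick);
```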

Figure 4.8 Rank selection effect: (a) before ranking (b) after ranking

4.3.3.3 Tournament selection

This method randomly chooses a set of individuals and picks out the best individual for reproduction. The number of individuals in the set is called the tournament size. A common tournament size is 2, which is called a binary tournament. By adjusting the tournament size, the selection pressure can be made arbitrarily large or small. For example, using a large tournament size has the effect of increasing the selection pressure, since below-average individuals are less likely to win a tournament while above-average individuals are more likely to win it.

4.3.3.4 Steady state selection

Steady state selection eliminates the worst individuals in each generation. It works as follows: the offspring of the individuals selected

from each generation go back into the pre-existing population, replacing some of the less fit members of the previous generation.

4.3.3.5 Elitism

Elitism is an addition to many selection methods that forces the genetic algorithm to retain some number of the best individuals at each generation. It improves the selection process and saves the best individuals. With elitist selection, the quality of the best solution in each generation increases monotonically over time; without it, the best individuals can be lost due to stochastic errors.

4.4 GENETIC ALGORITHM PARAMETERS

One of the more challenging aspects of using genetic algorithms is choosing the configuration parameter settings, and GA theory provides little guidance for their proper selection. The population size, the mutation rate, and the type of recombination have the largest effect on search performance. They control the run of a GA and influence its population and reproduction behavior. In traditional GAs the parameters have fixed values. Some guidelines for selecting these parameter settings are given in the following subsections.

4.4.1 Population Size

The population size is one of the most important parameters and plays a significant role in the performance of genetic algorithms. It dictates the number of individuals in the population. Larger population sizes increase the amount of variation present in the initial population, at the expense of requiring more fitness evaluations. The best population size is found to be both application dependent and related to the

individual size (the number of chromosomes within). A good population of individuals contains a diverse selection of potential building blocks, resulting in better exploration. If the population loses diversity, it is said to have converged prematurely and little exploration is being done. For larger individuals and challenging optimization problems, larger population sizes are needed to maintain diversity (higher diversity can also be achieved through higher mutation rates and uniform crossover) and hence better exploration. Many researchers suggest population sizes between 25 and 100 individuals, while others suggest that the population must be very much larger.

4.4.2 Crossover Rate

The crossover rate determines the probability that crossover will occur. Crossover generates new individuals in the population by combining parts of existing individuals. The crossover rate is usually high and application dependent; many researchers suggest a crossover rate between 0.6 and 0.95.

4.4.3 Mutation Rate

The mutation rate determines the probability that a mutation will occur. Mutation is employed to introduce new information into the population (uncover new chromosomes) and also prevents the population from becoming saturated with similar chromosomes, i.e., it helps avoid premature convergence. Large mutation rates increase the probability that good schemata will be destroyed, but also increase population diversity. The best mutation rate is application dependent. For most applications the mutation rate is between 0.001 and 0.01, while for automated circuit design problems it is usually between 0.3 and 0.8.

Genetic algorithms work alternately on two types of spaces: the coding space and the solution space, or in other words, the genotype space and the phenotype space. The genetic operators (crossover and mutation) work on the genotype space, while evolution and selection work on the phenotype space. Selection is the link between the chromosomes and the performance of the decoded solutions. The mapping from genotype space to phenotype space has a considerable influence on the performance of genetic algorithms.

4.5 GENETIC ALGORITHM AS A TOOL FOR SYNTHESIS OF UNIFORM LINEAR ANTENNA ARRAY

In this thesis, two different approaches have been adopted to get a deeper insight into the application of the GA to the synthesis of a linear antenna array: (1) using a simple GA, and (2) using the GA solver in the MATLAB Optimization Toolbox. In general, larger population sizes tend to converge to an optimum solution in fewer generations, because the solution space is covered sufficiently by a larger population. This luxury may not be affordable if the fitness function is too time consuming, and such a thorough initial search may also be redundant. A trade-off therefore occurs between population size and the time needed to converge to a solution: a small population converges quickly but may not find the global optimum, while a large population converges slowly but finds the global optimum with greater confidence. Judicious choices of gene and population size are important and depend on the particular problem. The gene length (number of alleles or bits) is set according to how accurate an answer is needed; a larger gene length allows more quantization levels and thus affects the accuracy of the solution.

It is generally accepted that the population size should be 2 to 3 times the total chromosome length (total number of bits). For instance, a chromosome with 4 genes (parameters), each containing 10 alleles (bits), should have a population of 80 to 120. Larger populations require more simulations per generation, but in general require fewer generations to find a solution, simply because the initial search covers so much of the space. This may not be necessary for simple problems. For more complex problems, however, the majority of the initial population may be unacceptable and the next generation is filled with the very few members that survived the selection process; in fact, sometimes the initial population contains no acceptable solutions at all. This is remedied by allowing a larger initial population and thus a more thorough initial search. An elitist strategy is important in order to maintain a monotonically increasing species fitness. The probabilities of crossover and mutation should be set to 0.6 < pcross < 0.9 and 0.001 < pmut < 0.1. The genetic algorithm itself does not converge to the exact solution: it is very good at hunting down an approximate solution but does not achieve the perfect solution every time. This fact becomes obvious when multiple runs of the GA turn up many different answers which all lie very close to one another. Therefore, the use of another search routine after the initial GA search may be necessary. To take advantage of the GA that is already in use, one can simply create a new initial population that varies by about 5% from the previous GA result and rerun the GA; this forces the GA to significantly reduce its search space. There is no current theory proving that the genetic algorithm is capable of achieving the correct solution 100% of the time, because the GA depends on the correct setup of the problem. For instance, if the GA converges quickly to a wrong answer, the population size is too small. These user issues are difficult to quantify into an exact formula for the setup of the GA. Considering all these criteria, the parameters have been formulated and selected; the population-sizing rule is illustrated in the short sketch below.
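The population-sizing and quantization rules of thumb quoted above can be written as a small MATLAB sketch; the specific numbers simply reproduce the 4-gene, 10-bit example from the text and are not additional design choices.

```matlab
% Rule-of-thumb GA sizing for a binary-coded chromosome (illustrative sketch).
nGenes      = 4;                       % number of parameters to optimize
bitsPerGene = 10;                      % gene length: sets the quantization
chromLen    = nGenes * bitsPerGene;    % total number of bits = 40

% Quantization: each gene can take 2^bitsPerGene discrete levels.
nLevels    = 2^bitsPerGene;            % 1024 levels
resolution = 1 / (nLevels - 1);        % step size for a parameter scaled to [0, 1]

% Population size of roughly 2 to 3 times the chromosome length.
popRange = [2 3] * chromLen;           % 80 to 120 individuals
fprintf('chromosome length %d bits, %d levels (resolution %.4g), population %d-%d\n', ...
        chromLen, nLevels, resolution, popRange(1), popRange(2));
```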

4.5.1 Problem Formulation

Consider an antenna array consisting of N elements. It is assumed that the antenna elements are symmetric about the center of the linear array. The far-field array factor of this array with an even number of isotropic elements (2N) can be expressed as

AF(\theta) = 2 \sum_{n=1}^{N} a_n \cos\!\left(\frac{2\pi}{\lambda}\, d_n \sin\theta\right)    (4.1)

where a_n is the amplitude of the n-th element, \theta is the angle from broadside and d_n is the distance between the position of the n-th element and the array center. The objective is to find an appropriate set of element amplitudes a_n that achieves interference suppression with maximum sidelobe level reduction. To find a set of values which produces the desired array pattern, the algorithm is used to minimize the cost function given in Equation (4.2)

cf = \sum_{\theta=-90^\circ}^{90^\circ} W(\theta)\left[F_o(\theta) - F_d(\theta)\right]    (4.2)

where F_o(\theta) is the pattern obtained using our algorithm and F_d(\theta) is the desired pattern, here taken to be the Chebyshev pattern with an SLL of -13 dB, and W(\theta) is the weight vector used to control the sidelobe level in the cost function. The value of the weight vector is selected based on experience and knowledge. The two different approaches used in our investigation are detailed below, after a short sketch of how the cost function can be evaluated.
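A minimal MATLAB sketch of Equations (4.1) and (4.2) is given below. It assumes half-wavelength element spacing, a flat -13 dB sidelobe envelope standing in for the Chebyshev desired pattern, and a weighting that excludes the main-beam region; these assumptions and all variable names are illustrative and not taken from the thesis code.

```matlab
% Sketch of the array factor (4.1) and cost function (4.2) for a symmetric
% 2N-element linear array. Spacing, weights and names are illustrative assumptions.
N      = 8;                                   % elements on one side (2N = 16 total)
lambda = 1;                                   % wavelength (normalized)
d      = ((1:N) - 0.5) * lambda/2;            % element positions from the array center
theta  = (-90:1:90) * pi/180;                 % observation angles (rad)

% Array factor of Equation (4.1) for a given amplitude vector a (1 x N).
arrayFactor = @(a) 2 * (a * cos((2*pi/lambda) * d.' * sin(theta)));

a    = ones(1, N);                            % trial amplitude weights
AF   = arrayFactor(a);
AFdB = 20*log10(abs(AF) / max(abs(AF)));      % normalized pattern in dB

% Cost function of Equation (4.2): weighted difference from a desired pattern.
Fd = -13 * ones(size(theta));                 % desired sidelobe envelope (dB), per the text
W  = double(abs(theta) > 10*pi/180);          % weight only the sidelobe region (assumed)
cf = sum(W .* (AFdB - Fd));
fprintf('cost = %.2f\n', cf);
```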

4.5.2 ULA Synthesis using Simple GA

A continuous GA with a population size of 10 and a mutation rate of 0.35 is run for a total of 500 generations using MATLAB, and the best result is recorded at every iteration. The cost function is the minimum sidelobe level of the antenna pattern. Figure 4.9 shows that the antenna array with N = 8 elements has been normalized to a gain of 0 dB along the broadside angle, with a maximum relative sidelobe level of -15 dB.

Figure 4.9 Optimized radiation pattern with reduced sidelobe level of -15 dB for N = 8 elements

The convergence of the algorithm toward maximum reduction in the relative sidelobe level with N = 8 elements is depicted in Figure 4.10.

Figure 4.10 Convergence of sidelobe level with respect to evolving generations for N = 8 elements

It starts from -13 dB, which is the optimized value of the Chebyshev pattern for the RSLL; after 8 iterations it reaches -18.8 dB, and after 43 generations it converges to a maximum reduction of -21 dB. Figure 4.11 shows the optimized radiation pattern with a relative sidelobe level of -15 dB for N = 16, and Figure 4.12 shows its convergence curve. The convergence curve shows that it converges to -19.3 dB after 54 generations. Changing the number of elements causes the continuous GA to arrive at different optimum weights. Among N = 8, 16, 20 and 24, N = 20 performed well and was thus selected as the optimized element number. The corresponding array patterns for N = 8, 16, 20 and 24 are shown in Figure 4.13. Here, the radiation pattern for N = 20 has the best directivity, with a minimum relative sidelobe level of -14.67 dB below the main beam.

Figure 4.11 Optimized radiation pattern with reduced sidelobe level of -15 dB for N = 16 elements

Figure 4.12 Convergence of sidelobe level with respect to evolving generations for N = 16 elements

Figure 4.13 The optimized radiation pattern with reduced sidelobe level for N = 8, 16, 20 and 24

Figure 4.14 shows the convergence of the sidelobe level for N = 20, and Figure 4.15 shows the optimized radiation pattern with an RSLL of -18.7 dB for N = 20 elements.

Figure 4.14 Convergence of sidelobe level with respect to evolving generations for N = 20 elements

Figure 4.15 The optimized radiation pattern with reduced sidelobe level for N = 20 elements

Figure 4.16 shows the convergence curve for N = 24 elements, and Figure 4.17 shows the optimized radiation pattern with an RSLL of -14.97 dB for N = 24 elements.

Figure 4.16 Convergence of sidelobe level with respect to evolving generations for N = 24 elements

Figure 4.17 The optimized radiation pattern with reduced sidelobe level for N = 24 elements

The obtained costs are ranked from best to worst. The most common suitability criterion is to discard the bottom half and keep the top half of the list; in our program, however, the selection criterion is to discard any chromosome whose relative sidelobe level is higher than -15 dB. Table 4.1 shows the cost

function values for the members of the population that have an SLL below -15 dB. Out of 10 population members only 5 are selected. This limitation speeds up the convergence of the algorithm. After this natural selection the chromosomes mate to produce offspring. Mating takes place by pairing the surviving chromosomes; once paired, the offspring consist of genetic material from both parents.

Table 4.1 Population and Respective Cost Function Values

Index   Chromosome (weight vector)                                      Relative sidelobe level (dB)
1       0.8933 0.959 0.7956 0.7167 0.588 0.3725 0.2648 0.2232           -26.9682
2       0.8933 0.7659 0.5982 0.5391 0.9296 0.7216 0.7412 0.259          -19.861
4       0.4635 0.7659 0.6338 0.5391 0.9296 0.2594 0.7412 0.259          -18.247
7       0.8933 0.7659 0.8712 0.432 0.9296 0.2594 0.7412 0.259           -17.5515
5       0.8374 0.427 0.7824 0.5322 0.5425 0.8238 0.9641 0.351           -17.5173

Figure 4.18 shows the amplitude excitation for the optimized antenna array, as given in Table 4.2.

Figure 4.18 Amplitude distribution for optimized antenna array with N = 20 elements

Table 4.2 Amplitude Excitation Values for N = 20 Elements Corresponding to Figure 4.18

W_n                   W1     W2     W3     W4    W5    W6     W7     W8     W9    W10    W11    W12    W13    W14    W15    W16    W17   W18   W19   W20
Amplitude excitation  0.928  0.9645 0.7259 0.691 0.726 0.9491 0.7789 0.3478 0.597 0.6319 0.5358 0.5625 0.4696 0.4828 0.2411 0.5464 0.6   0.16  0.443 0.4334

4.5.2.1 Roulette wheel selection

In this thesis, the following parameters are defined: maxgen = 500, maxfun = 1000 and mincost = -50 dB. The population is generated randomly and then sorted based on its cost, the minimum sidelobe level. Roulette wheel selection is used for choosing mates for reproduction. Each weight vector is assigned a probability of selection on the basis of either its rank in the sorted

population or its cost. Rank-order selection is the easiest implementation of roulette wheel selection. Figure 4.19 shows the roulette wheel selection probabilities for five parents in the mating pool. A chromosome with a low sidelobe level has a higher percentage chance of being selected than chromosomes with higher sidelobe levels; in this case the first, or best, weight vector has a 43% chance of being selected.

Figure 4.19 Roulette wheel probabilities for five parents in the mating pool (43%, 28%, 17%, 9% and 3%)

As more generations are added, the percentage chance of a weight vector being selected changes. Figure 4.20 shows the roulette wheel selection for seven parents in the mating pool.

Figure 4.20 Roulette wheel probabilities for seven parents in the mating pool (33%, 25%, 18%, 12%, 7%, 4% and 1%)

The best weight vector has a 33% chance of being selected. The roulette wheel probabilities need to be computed only once as long as the number of parents in the mating pool remains constant from generation to generation. The genetic algorithm converged well for the variants of the options mentioned above, with some trade-offs having the main impact on convergence speed.

4.5.3 ULA Synthesis using GA Solver in the Optimization Toolbox

The Genetic Algorithm (GA) solver of the MATLAB Optimization Toolbox has been used in this thesis to find the amplitude excitations that achieve a minimum sidelobe level of -50 dB. Half the number of elements is used as the number of variables, with Lower Bound (LB) = 0 and Upper Bound (UB) = 1. The other parameter settings are as follows: Population size = 20; Selection function = Roulette; Reproduction (Elite count) = 1; Mutation function = Adaptive feasible; Crossover function = Single point. A sketch of an equivalent solver call is given below.

A. Case 1

Number of variables = 8; Number of array elements = 16. The experiment has been conducted 25 times and the best results are presented here. Figure 4.21 shows four different plots, viz. 1) best fitness, 2) best individual, 3) score diversity and 4) array pattern.
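Under the settings listed above, a call to the MATLAB Global Optimization Toolbox ga solver might look like the following sketch. The cost function handle sllCost is a hypothetical stand-in for the thesis code; it is assumed to return the relative sidelobe level (in dB) of the symmetric array excited with the candidate weights.

```matlab
% Sketch of the GA solver setup described above (sllCost is a hypothetical
% cost function returning the relative sidelobe level in dB for 8 weights).
nVars = 8;                         % half of the 16-element symmetric array
lb = zeros(1, nVars);              % Lower Bound = 0
ub = ones(1, nVars);               % Upper Bound = 1

opts = optimoptions('ga', ...
    'PopulationSize',   20, ...
    'SelectionFcn',     @selectionroulette, ...
    'EliteCount',       1, ...
    'MutationFcn',      @mutationadaptfeasible, ...
    'CrossoverFcn',     @crossoversinglepoint, ...
    'PlotFcn',          {@gaplotbestf, @gaplotbestindiv, @gaplotscorediversity});

[w, bestSLL] = ga(@sllCost, nVars, [], [], [], [], lb, ub, [], opts);
fprintf('best sidelobe level: %.2f dB\n', bestSLL);
```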

Figure 4.21 Showing (a) best fitness (b) best individual (c) score histogram and (d) radiation pattern for N = 16 elements (best: -48.9263 dB, mean: -48.8641 dB)

A best sidelobe level of -48.9263 dB is obtained, with a mean value of -48.8641 dB. The number of variables is selected as 8, as the antenna array consists of an even number of elements and is symmetric about the center. The score histogram shows that, among the 20 population members, 12 individuals give the best score, below -48 dB. It converges to -48 dB only after 75 generations. Figure 4.22 shows that the sidelobe level is reduced to -36.7213 dB with a mean value of -36.651 dB. The score histogram shows 13 individuals with a score below -36.6 dB. The amplitude excitations of the best individuals are obtained as w1 = 0.9853; w2 = 0.9242; w3 = 0.8215; w4 = 0.6698; w5 = 0.5218; w6 = 0.3527; w7 = 0.2316; w8 = 0.146.

Figure 4.22 Showing (a) best fitness (b) best individual (c) score histogram and (d) radiation pattern for N = 16 elements (best: -36.7213 dB, mean: -36.651 dB)

The same result is tabulated in Table 4.3 for 16 elements. The sidelobe levels are almost constant for 6 sidelobes, and the last one is wider and lower than the remaining ones. The convergence takes place in 8 generations.

Table 4.3 Amplitude Excitations of a 16 Element Array

W1    W2    W3    W4    W5    W6    W7    W8    W9    W10   W11   W12   W13   W14   W15   W16
0.14  0.23  0.35  0.52  0.67  0.92  0.82  0.98  0.98  0.92  0.82  0.67  0.52  0.35  0.25  0.14

B. Case 2

Number of variables = 10; Number of array elements = 20. The experiment is repeated for 10 variables. Figure 4.23 shows that the sidelobe level is reduced to -31.1473 dB, whereas the mean is -30.861 dB. All the individuals lie within the range of -30.5 dB to -31.5 dB. The main beam width is narrower but the sidelobes are wider.

Figure 4.23 Showing (a) best fitness (b) best individual (c) score histogram and (d) radiation pattern for N = 20 elements (best: -31.1473 dB, mean: -30.861 dB)

A similar case is run and the result is shown in Figure 4.24.

Figure 4.24 Showing (a) best fitness (b) best individual (c) score histogram and (d) radiation pattern for N = 20 elements (best: -32.1697 dB, mean: -31.9476 dB)

C. Case 3

The experiments are conducted with 22, 42 and 62 elements over 25 runs; their performance is tabulated in Table 4.5 and compared with the results given in Haupt (2004). Table 4.4 compares the performance of the antenna array for different numbers of elements.

Table 4.4 Performance Comparison of Antenna Array with Different Number of Elements

N                        8       16      20      24
RSLL (dB)               -16.1   -15     -18.7   -14.97
Final convergence (dB)  -21.3   -19.3   -19.4   -18.7
No. of generations       43      54      31      42

Table 4.5 shows the performance characteristics of five algorithms for an average of 25 runs with random seed values of the amplitude weights.

Table 4.5 Comparison of Optimized Sidelobes for Three Different Array Sizes using Other Algorithms (Haupt 2004) and the Genetic Algorithm

                                        22 Elements               42 Elements               62 Elements
Algorithm                               Median SLL (dB) / Calls   Median SLL (dB) / Calls   Median SLL (dB) / Calls
BFGS                                    -30.3 / 170               -25.3 / 280               -26.6 / 316
DFP                                     -27.9 / 160               -25.2 / 211               -26.6 / 315
Nelder-Mead                             -18.7 / 956               -17.3 / 2575              -17.2 / 3551
Steepest descent                        -24.6 / 150               -21.6 / 290               -21.8 / 313
Our proposed work (Genetic Algorithm)   -22.3 / 83                -20.3 / 94                -20.9 / 86

The genetic algorithm performs well compared to Nelder-Mead but worse than the remaining algorithms. However, its number of function calls is lower than that of all the other algorithms, so it is cost effective in terms of computational time. The genetic algorithm shows its best result, a median sidelobe level of -32.4 dB with a median of 70 function calls, when the array size is 16 elements. Among the three cases, the antenna array with N = 16 elements performed very well, with a narrow main beam width, reduced sidelobe level and the minimum number of function calls, which means less computation time and less complexity. From our study it is realized that the genetic algorithm has many variables to control and trade-offs to consider, as detailed below.

- Number of chromosomes and initial random population: a larger number of chromosomes provides better sampling of the solution space, but at the cost of slower convergence. In generating the random list, the type of probability distribution and the weighting of the parameters have a significant impact on the convergence time.
- Selection: the roulette selection method is employed to decide which chromosomes to discard.
- Crossover: in crossing the chromosomes for mating, the chromosomes may be paired from top to bottom, randomly, or best to worst.
- Mutation: a mutation rate is selected to mutate a particular chromosome. Mutation does not permit the algorithm to get stuck in a local minimum.

- Stopping criteria: set in this program as maxgen = 500, maxfun = 1000 and mincost = -50 dB.

In our investigation the genetic algorithm converged well for the variants of the options mentioned above, with some trade-offs having the main impact on convergence speed.

4.6 SUMMARY

In this investigation, two different approaches, namely (a) a simple GA and (b) the Genetic Algorithm solver in the MATLAB Optimization Toolbox, are used to obtain maximum reduction in the sidelobe level relative to the main beam on both sides of it. The specialty of the genetic algorithm is that it can optimize a large number of discrete parameters. The genetic algorithm is an intelligent algorithm that searches for the optimum element weights of the array antenna. This investigation demonstrated different ways to apply the genetic algorithm, such as varying the number of elements, to optimize the array pattern. Adaptive feasible mutation with single point crossover and roulette selection showed a performance improvement, reducing the sidelobe level below -30 dB in most of the cases with 8 variables, and with fewer function calls than the other methods shown in Table 4.5. The best result of -48.9 dB is obtained for 16 elements, proving that this method is efficient while much of the computation time and complexity is reduced. Unlike the simple GA (SGA), the genetic algorithm solver from the MATLAB Optimization Toolbox is used with adaptive feasible mutation, which enables search in a broader space along randomly generated directions to produce new generations. This improves the performance greatly, achieving the maximum reduction in sidelobe level with a minimum number of function calls. Experiments proved the effectiveness of this method.