Heuristic Optimization Population Control & Objectives


1 Heuristic Optimization: Population Control & Objectives. José M. Peña (jmpena@fi.upm.es), Universidad Politécnica de Madrid

2 Population & Objective. Population management: population size, micro-populations. Objective calculation: co-evolutionary algorithms, multiobjective algorithms

3 Population Size. Several population aspects are important for GA performance: population size, population dynamics (adding and removing individuals), and the end condition (based on population characteristics)

4 Population Size. Population size has a large impact: if the size is too small, the GA may converge too quickly; if it is too large, computational resources may be wasted. The population size (and distribution) must ensure a good provision of building blocks

5 Exploration and Exploitation. Adaptation in natural systems is framed by a tension between exploration and exploitation. Exploitation: looking for the best solution close to a good one. Exploration: looking for individuals of even higher fitness far away from the current best ones. (Diagram: mutation gives high exploration and low exploitation; crossover gives low exploration and high exploitation.) The balance is highly dependent on the fitness function and the coding

6 Evolutionary Pressure. A GA can be considered a pressure system: strong selective pressure promotes premature convergence of the GA search, while weak selective pressure can make the search ineffective. Pressure depends on the fitness evaluation, the selection policy, and the reproduction chances

7 Ending Condition. Evolutionary algorithms can finish when: a given number of generations has been run; the best population fitness has not improved for several generations (fitness convergence); or the population characteristics are almost the same (population convergence)

8 Micropopulations. There are problems in which the population size is restricted, due to memory (or time) restrictions or the computational cost of the fitness function. Theoretical work has shown that population sizes between n and 2n (where n is the binary string length) are appropriate. GAs with population sizes below 25 are called micro-population GAs

9 Micropopulations. Small populations run a high risk of under-covering the solution space and of premature convergence. Minimum size: at least one instance of every allele (value) at each locus (position). Higher-order alphabets require bigger populations (e.g., a quaternary alphabet needs a larger population than a binary one to cover every allele)
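The allele-coverage requirement above can be quantified. The sketch below (an illustration, not from the slides; the function name is my own) computes the probability that a uniformly random population covers every allele at every locus, using inclusion-exclusion per locus and assuming loci are independent:

```python
from math import comb

def p_full_coverage(n_loci, q, pop_size):
    """Probability that a random population of `pop_size` strings over a
    q-ary alphabet contains every allele at each of the n_loci positions.
    Per-locus coverage via inclusion-exclusion; loci assumed independent."""
    p_locus = sum((-1) ** k * comb(q, k) * ((q - k) / q) ** pop_size
                  for k in range(q + 1))
    return p_locus ** n_loci
```

For 10 loci, a binary alphabet is almost surely covered by 25 individuals, while a quaternary alphabet with only 5 individuals almost surely is not, matching the slide's point that higher-order alphabets need bigger populations.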

10 Micropopulations. Alternative: systematic selection of the initial population, using distance-based population generation, coverage conditions, or error-detection codes. (Figure: an iterative process for generating the initial population based on distances; candidates too close to existing members are rejected as not valid individuals.)

11 Co-Evolution. Co-evolutionary algorithms (CEAs) evolve individuals to search for the best strategy according to a fitness function that requires comparison with other individuals. They are based on Game Theory scenarios: the fitness of an individual depends on the relationship between that individual and other individuals. In fact, individuals are not evaluated in isolation; their relationships are. Another perspective is to consider the fitness landscape as a dynamic function

12 Co-Evolution. Do these algorithms optimize? Fitness is subjective. There are two types of co-evolutionary algorithms. Competitive CEA: individuals are rewarded at the expense of those with which they interact; continued minor adaptations in some individuals force competitive adaptations in others. Cooperative CEA: individuals are rewarded when they work well together; parallel adaptive evolutionary forces help keep the algorithm driving along a (possibly infinite) gradient

13 Competitive CEAs. General model: a predator-prey relationship, applied to the optimization of a system (a population of candidate solutions) vs. different test cases (a population of problems). Predator population fitness: the number of individuals it can actually capture. Prey population fitness: the number of individuals from which it escapes

14 Cooperative CEAs. Different models exist. Problem decomposition: each subpopulation contains individuals representing a component of a larger solution, and the evolution of these subpopulations occurs almost independently, interacting only to obtain fitness. Subpopulation fitness: components are combined into full solutions, and the result contributes to the fitness of each component
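The problem-decomposition scheme can be sketched as follows. This is a minimal illustration under my own assumptions (a toy separable objective, truncation selection, Gaussian mutation, and collaboration with the partner subpopulation's current best component); it is not the slides' algorithm:

```python
import random

def objective(x, y):
    """Hypothetical shared objective to minimise: f(x, y) = x^2 + y^2."""
    return x * x + y * y

def evolve_cooperative(gens=60, pop=20, seed=1):
    """Two subpopulations, one per solution component. A component's
    fitness is the combined solution's objective, obtained by pairing it
    with the other subpopulation's current best component."""
    rng = random.Random(seed)
    xs = [rng.uniform(-5, 5) for _ in range(pop)]
    ys = [rng.uniform(-5, 5) for _ in range(pop)]
    for _ in range(gens):
        best_x = min(xs, key=lambda x: objective(x, ys[0]))
        best_y = min(ys, key=lambda y: objective(best_x, y))
        # Truncation selection within each subpopulation, then refill
        # each half with Gaussian-mutated copies of the survivors.
        xs = sorted(xs, key=lambda x: objective(x, best_y))[: pop // 2]
        ys = sorted(ys, key=lambda y: objective(best_x, y))[: pop // 2]
        xs += [x + rng.gauss(0, 0.3) for x in xs]
        ys += [y + rng.gauss(0, 0.3) for y in ys]
    return xs[0], ys[0]
```

Note that neither subpopulation ever sees the full solution vector; the only coupling is through the shared objective evaluation, which is the defining trait of this cooperative model.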

15 Pathologies of Co-Evolution. Loss of gradient: one population comes to severely dominate the others, creating an impossible situation in which the other participants do not have enough information from which to learn (e.g., a small child, new to the game of chess, attempting to learn to play by playing a grand master at her best)

16 Pathologies of Co-Evolution. Cyclic behavior: intransitivities in the reward system can allow one population to adapt slightly to gain an advantage over the others; the others then follow suit, only for the original population to change again, eventually returning to the original strategy. Mediocre stability: limiting behaviors are obtained, but at particularly suboptimal points in the space, from some external perspective

17 Pathologies of Co-Evolution. Focusing problems: brittle solutions arise because the co-evolutionary search has driven players to over-specialize on their opponents' weaknesses. Red Queen effect: populations seem to be changing, but the internal subjective measure shows no progress is occurring ("It takes all the running you can do, to keep in the same place." Through the Looking-Glass, L. Carroll)

18 Co-Evolution Scenario: Iterated Prisoner's Dilemma. A two-player game in which each player has two choices: cooperate or defect. T > R and P > S (defection always pays more), R > P (mutual cooperation beats mutual defection), R > (S+T)/2 (alternating doesn't pay). Multiple iterations with an undefined end. Payoff matrix (payoff to #1 / payoff to #2): both cooperate: R=4 / R=4; #1 cooperates, #2 defects: S=0 / T=5; #1 defects, #2 cooperates: T=5 / S=0; both defect: P=1 / P=1
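The payoff structure above is easy to encode and check. The sketch below uses the slide's payoff values; the two example strategies (tit-for-tat and always-defect) are my own additions for illustration:

```python
# Payoffs from the slide: T=5 (temptation), R=4 (reward),
# P=1 (punishment), S=0 (sucker's payoff).
T, R, P, S = 5, 4, 1, 0
assert T > R and P > S        # defection always pays more in one round
assert R > P                  # mutual cooperation beats mutual defection
assert R > (S + T) / 2        # alternating C/D does not pay

def payoff(a, b):
    """Return (payoff_a, payoff_b) for one round; 'C' cooperate, 'D' defect."""
    table = {('C', 'C'): (R, R), ('C', 'D'): (S, T),
             ('D', 'C'): (T, S), ('D', 'D'): (P, P)}
    return table[(a, b)]

def iterated_score(strat_a, strat_b, rounds=10):
    """Play the iterated game; each strategy sees the opponent's history."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        pa, pb = payoff(a, b)
        hist_a.append(a); hist_b.append(b)
        score_a += pa; score_b += pb
    return score_a, score_b

tit_for_tat = lambda opp: opp[-1] if opp else 'C'
always_defect = lambda opp: 'D'
```

Over 10 rounds, two tit-for-tat players score 40 each (mutual cooperation every round), while against always-defect, tit-for-tat loses only the first round, which is why the iterated game rewards strategies that the one-shot game punishes.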

19 Multiobjective Optimization. Evolutionary multiobjective optimization algorithms (EMOOs) evolve individuals to search for the best solutions according to several fitness functions: multi-criterion optimization, which yields multiple solutions and can also include constraints

20 Multiobjective Optimization. Pareto optimality: x is said to be Pareto optimal if there does not exist another feasible solution x' that is at least as good as x in all objectives and strictly better in at least one; such an x' is said to dominate x. Pareto front: the maximal set of non-dominated feasible solutions
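The dominance relation and the Pareto front translate directly into code. A minimal sketch, assuming minimization of every objective and solutions represented as tuples of objective values:

```python
def dominates(a, b):
    """True if a Pareto-dominates b (minimisation): a is no worse in every
    objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Maximal set of non-dominated points."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

This brute-force front extraction is O(n^2) in the population size, which is acceptable for the population sizes typical of the algorithms on the following slides.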

21 EMOO. Evolutionary multiobjective optimization techniques. A priori: preferences are expressed before executing, reducing the problem to a single objective and a unique solution. A posteriori: preferences are applied after executing, over multiple solutions. A posteriori methods may be not based on the Pareto-optimality concept, or based on it (non-elitist or elitist)

22 EMOO. A posteriori, not based on the Pareto-optimality concept: the VEGA algorithm (Vector Evaluated Genetic Algorithm). With k objectives and population size N: form subpopulations of size N/k; in each, calculate one objective's fitness and select the t best individuals (creating a new subpopulation); shuffle all subpopulations together; apply GA operators to create a new population of size N. Speciation problem: each individual is selected according to one objective only
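The VEGA selection step described above can be sketched as follows (selection only; crossover and mutation are omitted, and the function name is my own):

```python
import random

def vega_selection(population, objectives, rng=random.Random(0)):
    """VEGA parent selection: with k objectives and |population| = N,
    each of the k subpopulations of size N/k is filled with the
    individuals best on a single objective (minimisation), and the
    selected parents are then shuffled together before GA operators."""
    n, k = len(population), len(objectives)
    sub = n // k
    parents = []
    for f in objectives:
        parents += sorted(population, key=f)[:sub]   # one objective only
    rng.shuffle(parents)   # mix subpopulations before crossover/mutation
    return parents
```

The speciation problem is visible in the code: each parent slot is filled by ranking on exactly one objective, so individuals that are good compromises across objectives but best at none tend to be lost.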

23 EMOO. A posteriori, non-elitist, based on the Pareto-optimality concept: the MOGA algorithm (MultiObjective Genetic Algorithm). rank(x) = 1 + p(x), where p(x) is the number of individuals that dominate x. Sort by minimal rank; create a dummy fitness (linear or non-linear) and calculate (interpolate) it from each individual's rank; select the t best individuals (with niches); apply GA operators and create the new population. Others: NSGA, NPGA, ...
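The MOGA ranking rule is one line once dominance is defined. A minimal sketch, assuming minimization and tuple-valued objective vectors:

```python
def dominates(a, b):
    """a Pareto-dominates b (minimisation)."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def moga_ranks(points):
    """MOGA rank: rank(x) = 1 + number of individuals dominating x,
    so every non-dominated individual gets rank 1."""
    return [1 + sum(dominates(q, p) for q in points) for p in points]
```

Unlike the front-by-front sorting used later by NSGA-II, MOGA's rank counts all dominators at once, so several individuals in different "layers" can share the same rank.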

24 EMOO. A posteriori, elitist, based on the Pareto-optimality concept: the NSGA-II algorithm (Non-dominated Sorting Genetic Algorithm II). From population P (size N), create a new population P' (size N) using GA operators; merge both populations into a new population R (size 2N); sort R by non-domination rank; select the best individuals (by tournament and niches) to create the next population. Others: DPGA, PESA, PAES, MOMGA, ...
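The non-dominated sorting step that NSGA-II applies to the merged 2N population can be sketched as follows (a naive O(n^3) version for clarity; NSGA-II's tournament and crowding-distance niching are omitted):

```python
def dominates(a, b):
    """a Pareto-dominates b (minimisation)."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def non_dominated_fronts(points):
    """Peel the population into successive non-dominated fronts:
    front 1 is the Pareto front of the whole set, front 2 the Pareto
    front of the remainder, and so on."""
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts
```

Elitism follows from filling the next population front by front: since parents and offspring compete in the same merged set, a non-dominated parent can only be displaced by something that dominates or ties it.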