Analysis of Representations

We briefly summarize the most important properties of the different tree representations and operators from the previous chapters. We focus on the results concerning redundancy, bias, scaling, and locality for Prüfer numbers, characteristic vectors (CV), the LNB encoding, NetKeys, NetDir, and edge-sets. For the use of Prüfer numbers, NetKeys, CVs, and NetDir, no additional representation-specific parameters are necessary. In contrast, for LNBs and edge-sets, additional representation-specific...

Genetic Drift

For t < t_drift we assume no genetic drift, and for t > t_drift all the remaining bits are randomly fixed. Therefore, the probability of GA failure can be calculated by combining these two regimes. By using (3.19) and (3.20), the average percentage of incorrect alleles α_drift follows. For large N, no genetic drift occurs and we get the same failure probability as for the non-drift case (3.16). For small N, only the most salient bits, whose number grows with the population size N, are solved correctly with probability 1 − α and the...
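The assumption behind this model, that alleles which are still unconverged at t_drift become fixed at random, can be illustrated with a small simulation. The following Python sketch is illustrative only (the resampling scheme and the chosen population sizes are assumptions, not part of the model above); it estimates how many generations pure genetic drift, i.e. random resampling without any selection pressure, needs to fix a single binary allele, and shows that this time grows roughly linearly with the population size N:

    import random

    def drift_fixation_time(N, trials=200):
        """Average number of generations until a single neutral binary
        allele is fixed at 0 or 1 by pure genetic drift (no selection)."""
        total = 0
        for _ in range(trials):
            pop = [random.randint(0, 1) for _ in range(N)]
            generations = 0
            while 0 < sum(pop) < N:
                # next generation: resample N alleles uniformly at random
                pop = [random.choice(pop) for _ in range(N)]
                generations += 1
            total += generations
        return total / trials

    for N in (10, 20, 40):
        print(N, drift_fixation_time(N))

Larger populations postpone random fixation, which is consistent with the observation above that no drift-induced failures occur for large N.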

Binary String Representations

After we have defined the optimization problems, we present possible binary representations f_g for integers. The representation f_g assigns binary genotypes x_g to integer phenotypes x_p. Instead of using binary strings with cardinality χ = 2 for the genotypes, higher χ-ary alphabets could also be used. Then, a χ-ary alphabet is used for the string of length l instead of a binary alphabet. Therefore, instead of encoding 2^l different individuals with a binary alphabet, we are able to encode χ^l...
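As a simple illustration of such a genotype-phenotype mapping f_g, the following Python sketch decodes a binary string and, more generally, a χ-ary string of length l into an integer phenotype (the function names are illustrative and not taken from the text):

    def binary_to_int(x_g):
        """Decode a binary genotype (most significant bit first) into an
        integer phenotype x_p."""
        x_p = 0
        for bit in x_g:
            x_p = 2 * x_p + bit
        return x_p

    def chi_ary_to_int(x_g, chi):
        """Decode a chi-ary genotype of length l; a string of length l can
        encode chi**l different integers."""
        x_p = 0
        for digit in x_g:
            x_p = chi * x_p + digit
        return x_p

    print(binary_to_int([1, 0, 1, 1]))       # 11
    print(chi_ary_to_int([2, 0, 1], chi=3))  # 19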

Experiments and Empirical Results

Here we present empirical results when using the binary trivial voting mapping for the one-max problem and the concatenated deceptive trap problem. The first test example for our empirical investigation is the one-max problem. This problem is very easy to solve for GEAs as the fitness of an individual is simply the number of ones in the binary phenotype. To ensure that recombination results in a proper mixing of the BBs, we use uniform crossover for all experiments with the one-max problem....
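For reference, the following sketch shows the idea of such a redundant mapping together with the one-max fitness. It assumes the majority-vote interpretation, in which every phenotypic bit is determined by k genotypic bits; the value of k and the handling of ties for even k are assumptions of this sketch:

    def trivial_voting_map(x_g, k):
        """Each phenotypic bit is the majority vote of k consecutive
        genotypic bits (a redundant genotype-phenotype mapping)."""
        assert len(x_g) % k == 0
        return [1 if sum(x_g[i:i + k]) > k / 2 else 0
                for i in range(0, len(x_g), k)]

    def one_max(x_p):
        """One-max fitness: the number of ones in the binary phenotype."""
        return sum(x_p)

    x_g = [1, 0, 1,  0, 0, 1,  1, 1, 1]            # k = 3, three phenotypic bits
    print(one_max(trivial_voting_map(x_g, k=3)))   # phenotype [1, 0, 1], fitness 2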

Scalable Test Problems for Graphs

To examine the performance of optimization algorithms for the topological design of trees, standard test problems should be used. Motivated by the previous subsection, we define a fully easy and a fully difficult scalable tree problem. The one-max tree problem is based on the integer one-max problem (compare Sect. 5.1) (Ackley 1987). An optimal solution T_opt is chosen either randomly or by hand. The structure of this optimal solution T_opt can be determined: it can be a star, a list, or a random...
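A minimal sketch of the resulting fitness calculation, assuming that the fitness of a candidate tree is the number of edges it has in common with T_opt (so that T_opt itself reaches the maximal fitness n - 1):

    def one_max_tree_fitness(tree_edges, opt_edges):
        """Count the edges a candidate tree shares with the optimal tree
        T_opt. Edges are undirected, so (u, v) and (v, u) are identical.
        A tree on n nodes has n - 1 edges, hence the maximum is n - 1."""
        norm = lambda e: tuple(sorted(e))
        opt = {norm(e) for e in opt_edges}
        return sum(1 for e in tree_edges if norm(e) in opt)

    # T_opt is a star centered on node 0; the candidate is the list 0-1-2-3.
    opt_edges  = [(0, 1), (0, 2), (0, 3)]
    cand_edges = [(0, 1), (1, 2), (2, 3)]
    print(one_max_tree_fitness(cand_edges, opt_edges))  # 1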

Constructing Trees from Random Keys

After we have presented RKs as the basis for the NetKey encoding, we still have to define a construction algorithm that creates a valid tree from an RK sequence. Both elements, the RKs and the construction algorithm, are necessary for the new NetKey encoding. To get a synonymously and uniformly redundant encoding, we require the construction algorithm to preserve the synonymity of the RK encoding and not to favor some phenotypes but to work uniformly. We have seen that we are able to give...
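Such a construction algorithm can be sketched in the spirit of Kruskal's algorithm: the keys induce an order on all n(n-1)/2 possible edges, and edges are inserted in order of decreasing key value whenever they do not close a cycle, until the tree contains n - 1 edges. The following Python sketch is illustrative only (0-based node labels, the tie-breaking, and all names are assumptions):

    from itertools import combinations

    def netkey_to_tree(keys, n):
        """Construct a spanning tree on n nodes from a key vector with one
        key per possible edge of the complete graph (n*(n-1)/2 values)."""
        edges = list(combinations(range(n), 2))
        assert len(keys) == len(edges)
        # consider edges in order of decreasing key value
        order = sorted(range(len(edges)), key=lambda i: keys[i], reverse=True)

        parent = list(range(n))              # union-find to detect cycles
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v

        tree = []
        for i in order:
            u, v = edges[i]
            ru, rv = find(u), find(v)
            if ru != rv:                     # edge does not close a cycle
                parent[ru] = rv
                tree.append((u, v))
                if len(tree) == n - 1:
                    break
        return tree

    print(netkey_to_tree([0.55, 0.73, 0.09, 0.81, 0.25, 0.62], n=4))

In this sketch only the relative order of the key values matters for the constructed tree, so all key vectors that induce the same edge order construct the same phenotype.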

Redundant and Unbiased Encoding

We illustrate that the CV encoding with repair mechanism is a redundant encoding. Furthermore, we show that when using the repair mechanism from the previous subsection, the encoding is unbiased, that is, uniformly redundant. The genotype-phenotype mapping f_g constructs a valid tree x_p ∈ Φ_p with the help of the repair mechanism from every possible feasible or infeasible genotype x_g ∈ Φ_g. This means that n^(n-2) phenotypes are represented by 2^(n(n-1)/2) genotypes. Therefore, the CV encoding is a redundant...
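The counting argument can be checked directly. The following short sketch compares the number of CV genotypes (one bit per possible edge of the complete graph) with the number of spanning trees given by Cayley's formula for small n:

    def cv_redundancy(n):
        """Number of CV genotypes, number of spanning trees of K_n, and the
        average number of genotypes per phenotype."""
        genotypes  = 2 ** (n * (n - 1) // 2)
        phenotypes = n ** (n - 2)
        return genotypes, phenotypes, genotypes / phenotypes

    for n in (4, 5, 6):
        print(n, cv_redundancy(n))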

Experimental Results

For our experiments we concatenate four instances of a deceptive trap problem for trees of size 3 and 4. The size of a sub-problem denotes its number of nodes. Therefore, the problems for the empirical investigation have either 4 x 3 = 12 nodes (size 3) or 4 x 4 = 16 nodes (size 4). The minimum fitness of an individual is 0 and the maximum fitness is either 4 x 2 = 8 (four instances of the 3-node trap) or 4 x 3 = 12 (four instances of the 4-node trap). In our experiments we use a simple genetic algorithm...
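The exact trap function is defined earlier in the text; the following sketch only illustrates one way to obtain the fitness values stated above, assuming that each sub-problem counts the number c of edges a candidate sub-tree shares with its optimal sub-tree and rewards it deceptively (the optimum of a sub-tree with k = n - 1 edges scores k, every other solution scores k - 1 - c):

    def tree_trap(cand_edges, opt_edges):
        """Deceptive trap on one sub-tree: the optimum scores k = n - 1,
        all other solutions score k - 1 - c, where c is the number of
        edges shared with the optimal sub-tree."""
        norm = lambda e: tuple(sorted(e))
        opt = {norm(e) for e in opt_edges}
        c = sum(1 for e in cand_edges if norm(e) in opt)
        k = len(opt)
        return k if c == k else k - 1 - c

    def concatenated_trap(sub_candidates, sub_optima):
        """Sum the trap fitness over the m concatenated sub-trees."""
        return sum(tree_trap(c, o) for c, o in zip(sub_candidates, sub_optima))

Under these assumptions, four size-3 sub-trees (k = 2) give a maximum of 4 x 2 = 8, four size-4 sub-trees (k = 3) give 4 x 3 = 12, and the minimum is 0, matching the values above.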

Empirical Results

In this subsection, we present an empirical investigation into how the problem complexity changes for the one-max problem and the deceptive trap problem if low-locality representations are used. We experimentally show that for high-locality representations the fully easy one-max problem remains fully easy. In contrast, most of the low-locality representations make the one-max problem more difficult to solve for GAs. The situation is reversed for the fully difficult deceptive trap, where...
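How locality is measured is defined in the corresponding section; as a rough illustration only of why locality matters, the following sketch records the phenotypic (integer) distances caused by single-bit mutations under the standard binary encoding of integers, where a small genotypic change can produce a large phenotypic jump:

    def locality_profile(l):
        """Phenotypic distance of every single-bit neighbor of every
        genotype for the standard binary encoding of integers of length l."""
        distances = []
        for value in range(2 ** l):
            for bit in range(l):
                neighbor = value ^ (1 << bit)            # flip one genotypic bit
                distances.append(abs(value - neighbor))  # phenotypic distance
        return distances

    d = locality_profile(4)
    print(sum(d) / len(d), max(d))   # average and worst-case phenotypic jump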

Performance for the Integer One-Max and Deceptive Integer One-Max Problem

We compare the performance of GAs using binary, Gray, and unary encodings for the integer one-max and deceptive trap problems as defined in Sect. 5.1. We performed 250 runs for each problem instance, and each run was stopped after the population had fully converged (all individuals in the population are the same). For the integer one-max problem we used uniform crossover, and for the integer deceptive trap we used two-point crossover. As selection method we used tournament selection without...
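For completeness, the three genotype-phenotype mappings compared here decode a bitstring of length l as follows (standard definitions; the unary phenotype is simply the number of ones):

    def binary_decode(bits):
        """Standard binary encoding: weights 2**i, most significant bit first."""
        value = 0
        for b in bits:
            value = 2 * value + b
        return value

    def gray_decode(bits):
        """Gray encoding: neighboring integers differ in exactly one bit."""
        value, prev = 0, 0
        for b in bits:
            prev ^= b                # cumulative XOR recovers the binary bit
            value = 2 * value + prev
        return value

    def unary_decode(bits):
        """Unary encoding: the phenotype is the number of ones."""
        return sum(bits)

    bits = [1, 0, 1]
    print(binary_decode(bits), gray_decode(bits), unary_decode(bits))  # 5 6 2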

Number of Neighbors

We have seen that Prüfer numbers only have high locality if they encode stars. Therefore, we focus on two issues: first, we want to reveal why Prüfer numbers have high locality when encoding stars; and second, we want to know how large the areas of high locality are. Finding answers to these questions helps us to more accurately predict the behavior of GEAs for different tree optimization problems. The investigation will show that the number of neighbors has a major impact on the answers to...
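The following Python sketch decodes a Prüfer number into its tree and then measures how many edges change when a single digit of the genotype of a star is mutated; the 0-based node labels, the remove-the-smallest-leaf convention, and the edge-based distance are assumptions of this sketch:

    import heapq

    def prufer_to_tree(seq, n):
        """Decode a Pruefer number (length n - 2, entries in 0..n-1) into
        the edge list of the corresponding tree on n nodes."""
        degree = [1] * n
        for v in seq:
            degree[v] += 1
        leaves = [v for v in range(n) if degree[v] == 1]
        heapq.heapify(leaves)
        edges = []
        for v in seq:
            leaf = heapq.heappop(leaves)     # smallest remaining leaf
            edges.append((leaf, v))
            degree[v] -= 1
            if degree[v] == 1:
                heapq.heappush(leaves, v)
        edges.append((heapq.heappop(leaves), heapq.heappop(leaves)))
        return edges

    def edge_distance(t1, t2):
        """Number of edges in which two trees differ."""
        norm = lambda t: {tuple(sorted(e)) for e in t}
        return len(norm(t1) ^ norm(t2)) // 2

    star   = [0, 0, 0]     # Pruefer number of the star centered on node 0 (n = 5)
    mutant = [0, 0, 1]     # one genotypic digit changed
    print(edge_distance(prufer_to_tree(star, 5), prufer_to_tree(mutant, 5)))  # 1

In this example the single changed digit alters only one edge of the star, which is in line with the high locality of Prüfer numbers in the neighborhood of stars.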