PARAMETER ADAPTATION FOR DIFFERENTIAL EVOLUTION (DE) IN SOLVING VERY LARGE-SCALE GLOBAL OPTIMIZATION
Authors: Seah Cheah Chin and Jason Teo
This research investigated the effects of parameter adaptation in Differential Evolution (DE), covering fixed, adaptive, and self-adaptive algorithms. As with other evolutionary algorithms, most DE variants require the user to manually hand-tune the control parameters, namely the crossover rate CR, the scaling factor F, and the population size NP, using preliminary test runs prior to conducting the actual evolutionary optimization process. There is little knowledge of how to find reasonably good control-parameter values for DE on large-scale global optimization problems. Hence, the main objective of this research is to design, implement, and test different types of DE in order to find reasonable parameter configurations for the three types of DE on large-scale global optimization problems. To achieve this objective, this project investigates parameter adaptation for DE on large-scale optimization problems by combining the results obtained from a suite of benchmark functions with different parameter adaptation methods, providing guidance for future research on selecting reasonable DE parameter configurations. The methodology comprises designing and implementing the fixed, adaptive, and self-adaptive DE algorithms in MATLAB; integrating the DE algorithms with the large-scale benchmark function suite from the IEEE Congress on Evolutionary Computation (CEC); and analysing and testing the performance of the three types of DE algorithms on large-scale global optimization problems with a maximum of 3.00E+05 fitness evaluations. Experiments were conducted in two batches: the first batch compared standard (fixed) DE, adaptive jDE, and self-adaptive DE, while the second batch compared adaptive jDE, adaptive aDE, and the hybrid aDEjDE.
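The core mechanism compared here is how the control parameters F and CR are set. As a minimal illustrative sketch (the thesis implemented its algorithms in MATLAB; Python is used here only for illustration), the following shows DE/rand/1/bin with jDE-style self-adaptation, in which each individual carries its own F and CR that are occasionally re-sampled and survive only if the resulting trial vector wins the greedy selection. The sphere function, population size, and bounds below are illustrative assumptions, not the thesis's benchmark settings.

```python
import numpy as np

def jde(fitness, dim=10, np_size=50, max_evals=20_000, seed=0):
    """DE/rand/1/bin with jDE-style self-adaptation of F and CR."""
    rng = np.random.default_rng(seed)
    tau1, tau2 = 0.1, 0.1                         # adaptation probabilities (jDE defaults)
    pop = rng.uniform(-100.0, 100.0, (np_size, dim))
    F = np.full(np_size, 0.5)                     # per-individual scaling factors
    CR = np.full(np_size, 0.9)                    # per-individual crossover rates
    fit = np.apply_along_axis(fitness, 1, pop)
    evals = np_size
    while evals < max_evals:
        for i in range(np_size):
            # jDE rule: with small probability, re-sample this individual's F and CR
            Fi = 0.1 + 0.9 * rng.random() if rng.random() < tau1 else F[i]
            CRi = rng.random() if rng.random() < tau2 else CR[i]
            # DE/rand/1 mutation using three distinct partners (all different from i)
            a, b, c = rng.choice([j for j in range(np_size) if j != i], 3, replace=False)
            mutant = pop[a] + Fi * (pop[b] - pop[c])
            # binomial crossover with one guaranteed mutant dimension
            cross = rng.random(dim) < CRi
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f_trial = fitness(trial)
            evals += 1
            # greedy selection; the winning trial keeps the parameters that produced it
            if f_trial <= fit[i]:
                pop[i], fit[i], F[i], CR[i] = trial, f_trial, Fi, CRi
            if evals >= max_evals:
                break
    best = int(fit.argmin())
    return pop[best], float(fit[best])

# Usage on a simple sphere function (illustrative, not a CEC benchmark):
sphere = lambda x: float(np.sum(x * x))
best_x, best_f = jde(sphere, dim=10, max_evals=20_000)
```

A fixed DE corresponds to keeping F and CR constant for all individuals, while the adaptive and hybrid variants compared in this work differ mainly in when and how these per-individual parameters are re-sampled.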
The results show that in the first batch, adaptive jDE performed best, while in the second batch the hybrid aDEjDE algorithm outperformed the others. The major finding of this project is that, using only 10% of the 3.00E+06 fitness evaluations of the initial setting, the optimization could achieve 88.41%, 90.73%, 94.53%, 91.26%, and 89.04% optimality for the fixed DE, adaptive jDE, self-adaptive DE, adaptive aDE, and hybrid aDEjDE algorithms respectively, realizing significant run-time reductions.