CMA-ES


Acronym: CMA-ES
Definition: Covariance Matrix Adaptation Evolution Strategy (computing)
References in periodicals archive
Figure 9 shows the error values across generations obtained by running CMA-ES for each case.
Table 11 shows the computing time required to generate the training data for each case with the CMA-ES algorithm.
In this case, the local search algorithm CMA-ES is at a clear disadvantage, since it is more likely to become trapped in a local optimum, while the algorithms SaDE, GL-25, SHADE, and DEMR perform comparatively better.
And as a local optimizer, CMA-ES also appears more prone to premature convergence, resulting in worse optimization accuracy.
It can be seen that MDE performs significantly better than CLPSO, GL-25, CMA-ES, LBBO, SFLSDE, and L-SHADE on 15, 16, 17, 7, 8, and 8 test functions, respectively.
According to Wilcoxon's test at α = 0.05 and α = 0.1, there are significant differences in four cases (MDE versus CLPSO, MDE versus GL-25, MDE versus CMA-ES, and MDE versus SFLSDE), which means that in those cases MDE is significantly better than CLPSO, GL-25, CMA-ES, and SFLSDE.
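For readers who want to reproduce this kind of pairwise comparison, the sketch below runs a Wilcoxon signed-rank test on paired per-function errors with SciPy; the numbers are invented placeholders, not data from the cited paper.

```python
from scipy.stats import wilcoxon

# Hypothetical final error values per test function for two algorithms
# (made-up numbers for illustration only).
mde_errors   = [1.2e-8, 3.4e-5, 0.021, 1.7, 2.0e-9, 4.1e-3]
clpso_errors = [5.6e-6, 2.9e-4, 0.080, 2.3, 1.0e-6, 9.7e-3]

# Paired signed-rank test on the per-function differences.
stat, p = wilcoxon(mde_errors, clpso_errors)
print(f"W = {stat:.1f}, p = {p:.3f}")  # significant at alpha = 0.05 if p < 0.05
```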
All the control parameters for the EA and SI algorithms are set to the defaults given in their original publications: the initialization conditions of CMA-ES are the same as in [32], and the number of offspring candidate solutions generated per time step is λ = 4μ; the limit parameter of ABC is set to SN × D, where D is the dimension of the problem and SN is the number of employed bees.
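The parameter relations quoted above translate directly into code; the following is a minimal sketch in which the concrete values of D, mu, and SN are placeholders of our own, not settings taken from [32] or the original ABC paper.

```python
# Illustrative parameter setup following the snippet's relations.
D = 30         # problem dimension (placeholder)
mu = 5         # number of CMA-ES parents (placeholder)
lam = 4 * mu   # offspring generated per time step: lambda = 4 * mu

SN = 25              # number of employed bees (placeholder)
limit = SN * D       # ABC abandonment limit: limit = SN * D
```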
According to Section 4.2, we use the following optimal parameter setting for HABC: CR = 1, N = 10, and K ⊂ S, in comparison with the CCEA, CPSO, CMA-ES, ABC, PSO, and EGA algorithms.
GA is the classical stochastic search technique mimicking the process of natural selection; the principle of CMA-ES is to use the information from successful search steps to adapt the covariance matrix of the mutation distribution within an iterative procedure.
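To make that principle concrete, below is a heavily simplified (mu/mu, lambda) evolution strategy in NumPy that adapts the covariance matrix from the outer products of the best-ranked search steps (a rank-mu update); it is a sketch under our own parameter choices, omitting the evolution paths and step-size control of full CMA-ES, not a reference implementation.

```python
import numpy as np

def sphere(x):
    return np.sum(x ** 2)

def simple_cma_es(f, x0, sigma=0.5, lam=16, iters=200, seed=0):
    # Simplified (mu/mu, lambda)-ES with a rank-mu covariance update only.
    rng = np.random.default_rng(seed)
    n = len(x0)
    mu = lam // 2
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                        # positive, decreasing recombination weights
    c_mu = 0.3                          # covariance learning rate (illustrative choice)
    mean, C = np.array(x0, dtype=float), np.eye(n)
    for _ in range(iters):
        A = np.linalg.cholesky(C)
        z = rng.standard_normal((lam, n))
        X = mean + sigma * z @ A.T      # sample lambda offspring ~ N(mean, sigma^2 C)
        order = np.argsort([f(x) for x in X])
        steps = (X[order[:mu]] - mean) / sigma   # the mu most successful search steps
        mean = mean + sigma * (w @ steps)        # weighted recombination of the best mu
        # Adapt C toward the outer products of successful steps (rank-mu update).
        C = (1 - c_mu) * C + c_mu * sum(wi * np.outer(s, s) for wi, s in zip(w, steps))
    return mean

best = simple_cma_es(sphere, x0=np.ones(5))
```

The key line is the covariance update: directions in which good steps were recently taken receive more probability mass in the next sampling distribution, which is exactly the "use successful search steps to adjust the covariance matrix" idea described above.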
From the statistical results in Table 1, we can see that CMA-ES and CPDD achieve the optimal solution in every run on the unimodal problems f1-f3 in 10 dimensions.
In this experiment, the proposed MCPSO-PSH algorithm is compared with the nonrevisiting genetic algorithm (NrGA), the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), the canonical particle swarm optimization (PSO) algorithm, and the differential evolution (DE) algorithm.
According to Section 4.2, we use the following optimal parameter setting for HABC: CR = 1, N = 10, and K ⊂ S, in comparison with the CABC, CPSO, CMA-ES, ABC, PSO, and GA algorithms.