Hindawi Publishing Corporation Mathematical Problems in Engineering Volume 2014, Article ID 936374, 21 pages http://dx.doi.org/10.1155/2014/936374

Research Article

A Free Search Krill Herd Algorithm for Functions Optimization

Liangliang Li, Yongquan Zhou, and Jian Xie

College of Information Science and Engineering, Guangxi University for Nationalities, Nanning 530006, China Correspondence should be addressed to Yongquan Zhou; yongquanzhou@126.com Received 5 April 2014; Revised 13 May 2014; Accepted 21 May 2014; Published 19 June 2014 Academic Editor: Yang Xu

Copyright © 2014 Liangliang Li et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

To simulate the freedom and uncertain individual behavior of a krill herd, this paper introduces the opposition based learning (OBL) strategy and a free search operator into the krill herd optimization algorithm (KH) and proposes a novel opposition-based free search krill herd optimization algorithm (FSKH). In FSKH, each krill individual can search according to its own perception and scope of activities. The free search strategy strongly encourages individuals to escape from being trapped in local optimal solutions, so the diversity and exploration ability of the krill population are improved, and FSKH achieves a better balance between local search and global search. The experimental results on fourteen benchmark functions indicate that the proposed algorithm is effective and feasible in both low-dimensional and high-dimensional cases and that the convergence speed and precision of FSKH are higher. Compared to the PSO, DE, KH, HS, FS, and BA algorithms, the proposed algorithm shows better optimization performance and robustness.

1. Introduction

As many optimization problems cannot be solved by traditional mathematical programming methods, metaheuristic algorithms have been widely used to obtain global optimum solutions, and the aim of developing modern metaheuristic algorithms is to increase the accessibility of the global optimum. Inspired by nature, many successful algorithms have been proposed, for example, the Genetic Algorithm (GA) [1], Particle Swarm Optimization (PSO) [2, 3], Ant Colony Optimization (ACO) [4], Differential Evolution (DE) [5], Harmony Search (HS) [6], Artificial Bee Colony Optimization (ABC) [7], the Firefly Algorithm (FA) [8], the Artificial Fish Swarm Algorithm (AFSA) [9], Cuckoo Search (CS) [10, 11], the Monkey Algorithm (MA) [12], the Bat Algorithm (BA) [13], Charged System Search (CSS) [14], and the Flower Pollination Algorithm (FPA) [15]. Nature-inspired algorithms can effectively solve problems which traditional methods cannot solve and have shown excellent performance in many respects, so their application scope has been greatly expanded. In recent years, the metaheuristic algorithms mentioned above have been applied to solve practical problems. For example, Xu et al. (2010) solved the UCAV path planning problem with a chaotic artificial bee colony approach [16]. Hasançebi et al. (2013) applied the bat algorithm to structural

optimization problems [17]. Askarzadeh (2013) developed a discrete harmony search algorithm for size optimization of wind-photovoltaic hybrid energy system [18]. Basu and Chowdhury (2013) used cuckoo search algorithm in economic dispatch [19].

Based on the simulation of the herding behavior of krill individuals, Gandomi and Alavi proposed the krill herd algorithm (KH) in 2012 [20]. The KH algorithm is a novel biologically inspired algorithm for solving optimization problems. In KH, the time-dependent position of the krill individuals is formulated by three main factors: (1) motion induced by the presence of other individuals; (2) foraging motion; (3) physical diffusion. Only the time interval ($C_t$) needs to be fine-tuned in the KH algorithm, which is a remarkable advantage in comparison with other nature-inspired algorithms; therefore, it can be efficient for many optimization and engineering problems. To improve the krill herd algorithm, Wang and Guo (2013) proposed a hybrid krill herd algorithm with differential evolution for global numerical optimization [21]. The introduced HDE operator lets the krill perform local search within the defined region, and the optimization performance of DEKH was better than that of KH. Then, in order to accelerate the convergence speed, thus making the approach more feasible for a wider range of real-world engineering applications while keeping

the desirable characteristics of the original KH, an effective Lévy-flight KH (LKH) method was proposed by Wang et al. in 2013 [22]. Wang et al. also proposed a new improved metaheuristic simulated annealing-based krill herd (SKH) method for global optimization tasks [23]. The KH algorithm has been applied to several application problems: in 2014, a discrete krill herd algorithm was proposed for graph-based network route optimization by Sur [24], and the KH algorithm was further validated against various engineering optimization problems by Gandomi et al. [25]. Inspired by animal behavior, free search (FS) [26] was first proposed by Penev and Littlefair. In FS, each animal has original peculiarities called sense and mobility, and each animal can operate either with small precise steps for local search or with large steps for global exploration. Moreover, the individual decides how to search personally.

In order to overcome the limited performance of standard KH on complex problems, a novel free search krill herd algorithm is proposed in this paper. The free search strategy has been introduced into the standard KH to avoid all krill individuals getting trapped into the local optima. The proposed algorithm can greatly enrich the diversity of krill population and improve the calculation accuracy, which leads to a good optimization performance. What is more, the new method can enhance the quality of solutions without losing the robustness.

The proposed FSKH algorithm is different from standard KH in two aspects. Firstly, in FSKH, the population of individuals is initialized using opposition based learning (OBL) strategy [27]. By using OBL strategy, the proposed algorithm can make a more uniform distribution of the krill populations. What is more, we can obtain fitter starting candidate solutions even when there is no knowledge about the solutions.

And secondly, the krill can perform free and uncertain actions using the free search strategy. In the standard KH, a krill is influenced by its "neighbors" and the optimal krill, and the sensing distance of each krill is fixed. But in nature, even for the same krill, its sensitivity and range of activities will also change in different environments and different periods. The proposed algorithm can simulate this free, uncertain individual behavior of the krill. The free search strategy allows a nonzero probability of access to any location of the search space and highly encourages the individuals to escape from being trapped in local optimal solutions.

The remainder of this paper is organized as follows. In Section 2, the standard krill herd algorithm and the free search strategy are described, respectively. In Section 3, the concept of the opposition based learning (OBL) strategy is briefly explained, and the proposed algorithm (FSKH) is described in detail. The simulation experiments of the proposed algorithm are presented in Section 4, compared to the PSO, DE, KH, HS, FS, and BA algorithms. Finally, some remarks and conclusions are provided in Section 5.

2. Preliminary

2.1. Krill Herd Algorithm. Krill herd (KH) is a novel metaheuristic swarm intelligence optimization method for solving optimization problems, which is based on the simulation of the herding behavior of krill individuals. The time-dependent position of an individual krill in a two-dimensional surface is determined by the following three main actions:

(1) movement induced by other krill individuals;

(2) foraging activity;

(3) physical diffusion.

The KH algorithm uses the following Lagrangian model:

$$\frac{dX_i}{dt} = N_i + F_i + D_i,$$

where $N_i$ is the motion induced by other krill individuals, $F_i$ is the foraging motion, and $D_i$ is the physical diffusion of the $i$th krill individual.

2.1.1. Motion Induced by Other Krill Individuals. For a krill individual, the motion induced by other krill individuals can be determined as follows:

$$N_i^{new} = N^{max}\alpha_i + \omega_n N_i^{old},$$
$$\alpha_i = \alpha_i^{local} + \alpha_i^{target},$$

where $N^{max}$ is the maximum induced speed, $\omega_n$ is the inertia weight of the motion induced in $[0,1]$, $N_i^{old}$ is the last motion induced, $\alpha_i^{local}$ is the local effect provided by the neighbors, and $\alpha_i^{target}$ is the target direction effect provided by the best krill individual. The effect of the neighbors can be defined as

$$\alpha_i^{local} = \sum_{j=1}^{NN} \hat{K}_{ij}\,\hat{X}_{ij},$$
$$\hat{X}_{ij} = \frac{X_j - X_i}{\left\|X_j - X_i\right\| + \varepsilon},$$
$$\hat{K}_{ij} = \frac{K_i - K_j}{K^{worst} - K^{best}},$$

where $K_i$ is the fitness value of the $i$th krill individual, $K^{best}$ and $K^{worst}$ are the best and the worst fitness values of the krill individuals so far, $K_j$ is the fitness of the $j$th ($j = 1, 2, \ldots, NN$) neighbor, $NN$ is the number of the neighbors, and $X$ represents the related positions.

The sensing distance for each krill individual is determined as follows:

$$d_{s,i} = \frac{1}{5N}\sum_{j=1}^{N}\left\|X_i - X_j\right\|,$$

where N is the number of the krill individuals.
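For illustration, a minimal Python sketch of this neighbor-selection step is given below. It is not taken from the original paper; the array layout and function names are assumptions. It computes the sensing distance defined above for one krill and returns the indices of the individuals lying within that distance.

```python
import numpy as np

def sensing_distance(X, i):
    """d_{s,i} = (1 / (5 N)) * sum_j ||X_i - X_j|| for the ith krill.

    X is an (N, dim) array holding the positions of all krill individuals.
    """
    N = X.shape[0]
    return np.linalg.norm(X - X[i], axis=1).sum() / (5.0 * N)

def neighbors(X, i):
    """Indices of the krill that lie within the sensing distance of krill i."""
    dist = np.linalg.norm(X - X[i], axis=1)
    return np.where((dist < sensing_distance(X, i)) & (np.arange(X.shape[0]) != i))[0]
```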

The effect of the individual krill with the best fitness on the $i$th individual krill is taken into account using

$$\alpha_i^{target} = C^{best}\,\hat{K}_{i,best}\,\hat{X}_{i,best},$$

where the value of $C^{best}$ is defined as

$$C^{best} = 2\left(rand + \frac{I}{I_{max}}\right),$$

where $rand$ is a random value in $[0,1]$, $I$ is the current iteration number, and $I_{max}$ is the maximum number of iterations.

2.1.2. Foraging Motion. The foraging motion is formulated in terms of two main effective parameters: the first is the food location and the second is the previous experience about the food location. This motion can be expressed for the $i$th krill individual as follows:

$$F_i = V_f\beta_i + \omega_f F_i^{old},$$
$$\beta_i = \beta_i^{food} + \beta_i^{best},$$

where $V_f$ is the foraging speed, $\omega_f$ is the inertia weight of the foraging motion between 0 and 1, and $F_i^{old}$ is the last foraging motion. $\beta_i^{food}$ is the attraction of the food and $\beta_i^{best}$ is the effect of the best fitness of the $i$th krill so far. In our paper, we set $V_f = 0.02$.

In KH, the virtual center of food concentration is approximately calculated according to the fitness distribution of the krill individuals, which is inspired by the "center of mass." The center of food for each iteration is formulated as follows:

$$X^{food} = \frac{\sum_{i=1}^{N} \frac{1}{K_i} X_i}{\sum_{i=1}^{N} \frac{1}{K_i}}.$$

Therefore, the food attraction for the $i$th krill individual can be determined as follows:

$$\beta_i^{food} = C^{food}\,\hat{K}_{i,food}\,\hat{X}_{i,food},$$

where $C^{food}$ is the food coefficient defined as follows:

$$C^{food} = 2\left(1 - \frac{I}{I_{max}}\right).$$

The effect of the best fitness of the ith krill individual is also handled using the following equation:

$$\beta_i^{best} = \hat{K}_{i,ibest}\,\hat{X}_{i,ibest},$$

where $\hat{K}_{i,ibest}$ and $\hat{X}_{i,ibest}$ are taken with respect to the best previously visited position of the $i$th krill individual.

2.1.3. Physical Diffusion. The random diffusion of the krill individuals can be considered to be a random process in essence. This motion can be described in terms of a maximum diffusion speed and a random directional vector. It can be indicated as follows:

$$D_i = D^{max}\delta,$$

where $D^{max}$ is the maximum diffusion speed and $\delta$ is the random directional vector whose arrays are random values in $[-1, 1]$. The better the position is, the less random the motion is. The effects of the motion induced by other krill individuals and the foraging motion gradually decrease with increasing time (iterations). Thus, a decreasing term is added to the physical diffusion. This term linearly decreases the random speed with time and performs on the basis of a geometrical annealing schedule:

$$D_i = D^{max}\left(1 - \frac{I}{I_{max}}\right)\delta.$$

2.1.4. Crossover. The crossover is controlled by a crossover probability Cr, and the mth component of X; (i.e., xi m) is defined as follows:

$$x_{i,m} = \begin{cases} x_{r,m}, & rand_{i,m} < Cr, \\ x_{i,m}, & \text{else,} \end{cases}$$

$$Cr = 0.2\,\hat{K}_{i,best},$$

where $r \in \{1, 2, \ldots, i-1, i+1, \ldots, K\}$.

2.1.5. Mutation. The mutation is controlled by a mutation probability Mu, and the adaptive mutation scheme is determined as follows:

$$x_{i,m} = \begin{cases} x_{gbest,m} + \mu\,(x_{p,m} - x_{q,m}), & rand_{i,m} < Mu, \\ x_{i,m}, & \text{else,} \end{cases} \quad (15)$$

$$Mu = 0.05/\hat{K}_{i,best},$$

where $p, q \in \{1, 2, \ldots, i-1, i+1, \ldots, K\}$ and $\mu \in [0, 1]$.
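The crossover and mutation operators above can be sketched in Python as follows. This is an illustrative reading only, not the authors' implementation: `K_hat_ibest` stands for the normalized fitness value $\hat{K}_{i,best}$, `x_gbest` for the best krill found so far, and the random-generator handling is an assumption.

```python
import numpy as np

def crossover(X, i, K_hat_ibest, rng=np.random.default_rng()):
    """Component-wise crossover of krill i with a randomly chosen krill r."""
    N, dim = X.shape
    Cr = 0.2 * K_hat_ibest                       # crossover probability
    r = rng.choice([k for k in range(N) if k != i])
    child = X[i].copy()
    mask = rng.random(dim) < Cr                  # components inherited from krill r
    child[mask] = X[r, mask]
    return child

def mutation(X, i, x_gbest, K_hat_ibest, rng=np.random.default_rng()):
    """Adaptive mutation toward the global best with a differential term."""
    N, dim = X.shape
    Mu = 0.05 / (K_hat_ibest + 1e-30)            # mutation probability (guarded against 0)
    p, q = rng.choice([k for k in range(N) if k != i], size=2, replace=False)
    mu = rng.random()                            # scaling factor in [0, 1]
    child = X[i].copy()
    mask = rng.random(dim) < Mu
    child[mask] = x_gbest[mask] + mu * (X[p, mask] - X[q, mask])
    return child
```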

2.1.6. Main Procedure of the Krill Herd Algorithm. In general, the defined motions frequently change the position of a krill individual toward the best fitness. The foraging motion and the motion induced by other krill individuals contain two global and two local strategies; these work in parallel, which makes KH a powerful algorithm. Using the different effective parameters of the motion during the time, the position vector of a krill individual during the interval $t$ to $t + \Delta t$ is given by the following equation:

$$X_i(t + \Delta t) = X_i(t) + \Delta t\,\frac{dX_i}{dt}.$$

It should be noted that $\Delta t$ is one of the most important constants and should be carefully set according to the optimization problem, because this parameter works as a scale factor of the speed vector. $\Delta t$ can be simply obtained from the following formula:

$$\Delta t = C_t\sum_{j=1}^{NV}\left(UB_j - LB_j\right),$$

where $NV$ is the total number of variables and $LB_j$, $UB_j$ are the lower and upper bounds of the $j$th variable ($j = 1, 2, \ldots, NV$), respectively. $C_t$ is a constant number in $[0, 2]$ (Algorithm 1).
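As a small worked illustration of the position update and the $\Delta t$ formula above, the following Python sketch (names and the clipping to the variable bounds are assumptions, not part of the original formulation) computes $\Delta t$ from $C_t$ and the bounds and applies one update step once the three motion components are available.

```python
import numpy as np

def time_interval(Ct, lower, upper):
    """Delta t = Ct * sum_j (UB_j - LB_j)."""
    return Ct * np.sum(np.asarray(upper) - np.asarray(lower))

def position_update(X, N_induced, F_forage, D_diff, Ct, lower, upper):
    """X_i(t + dt) = X_i(t) + dt * (N_i + F_i + D_i), clipped to the variable bounds."""
    dt = time_interval(Ct, lower, upper)
    return np.clip(X + dt * (N_induced + F_forage + D_diff), lower, upper)
```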

2.2. Free Search Operator. Free search (FS) [26] models the behavior of animals and operates on a set of solutions called population. In the algorithm each animal has original peculiarities called sense and mobility. The sense is an ability of the animal for orientation within the search space. The animal uses its sense for selection of location for the next step. Different animals can have different sensibilities. It also varies during the optimization process, and one animal can have different sensibilities before different walks. The animal can select, for start of the exploration walk, any location


Figure 1: Flowchart of the free search krill herd algorithm (initialization; opposition-based population initialization; selection of the NP fittest individuals as the initial population; motion induced by other individuals; foraging motion; physical diffusion; generation of sensibility; T exploration walks; distribution of the pheromone; output of the best solutions).

Figure 2: Convergence curves for F1 (D = 30).

Figure 3: Convergence curves for F2 (D = 30).

Figure 4: Convergence curves for F3 (D = 30).

marked with pheromone, which fits its sense. During the exploration walk the animals step within the neighbor space. The neighbor space also varies for the different animals. Therefore, the probability for access to any location of the search space is nonzero.

During the exploration, each krill achieves some favor (an objective function solution) and distributes a pheromone in amount proportional to the amount of the found favor (the quality of the solution). The pheromone is fully replaced with a new one after each walk.

Particularly, the animals in the algorithm are mobile. Each animal can operate either with small precise steps for local search or with large steps for global exploration. And each animal decides how to search (with small or with large steps) by itself. Explicit restrictions do not exist. The previous experience can be taken into account, but it is not compulsory.

2.2.1. The Structure of the FS Operator. The structure of the algorithm consists of three major events: initialization, exploration, and termination.

(1) Initialization. In this paper, the initialization strategy is

$$x_{0ji} = x_{i,\min} + (x_{i,\max} - x_{i,\min})\cdot random_{ji}(0, 1),$$

where $random(0,1)$ is a random value in $[0,1]$ and $x_{i,\min}$, $x_{i,\max}$ are the space borders.

(2) Exploration. The exploration walk generates the coordinates of a new location as

$$x_{tji} = x_{0ji} - \Delta x_{tji} + 2\,\Delta x_{tji}\cdot random_{tji}(0, 1). \quad (18)$$

The modification strategy is

$$\Delta x_{tji} = R_{ji}\,(x_{i,\max} - x_{i,\min})\cdot random_{tji}(0, 1), \quad (19)$$

where $T$ is the step limit per walk and $t = 1, \ldots, T$ is the current step.
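A minimal Python sketch of one exploration walk according to equations (18) and (19) follows. It is illustrative only; the `favor` callback, the clipping to the borders, and the default step limit T = 5 (the FSKH setting reported in Section 4.3) are assumptions about how the operator would be wired into a concrete implementation.

```python
import numpy as np

def exploration_walk(x0, R, lower, upper, favor, T=5, rng=np.random.default_rng()):
    """One exploration walk of T steps around the start location x0.

    R is the individual search radius and `favor` is the quantity to be
    maximized (for a minimization problem it can simply return -f(x)).
    """
    best_x, best_favor = x0.copy(), favor(x0)
    for _ in range(T):
        dx = R * (upper - lower) * rng.random(x0.shape)   # modification step, eq. (19)
        x = x0 - dx + 2.0 * dx * rng.random(x0.shape)     # new location, eq. (18)
        x = np.clip(x, lower, upper)
        ft = favor(x)
        if ft > best_favor:                               # keep only the best favor f_j
            best_x, best_favor = x, ft
    return best_x, best_favor
```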

Figure 5: Convergence curves for F4 (D = 30).

The individual behavior during the walk is modeled and described as $f_{tj} = f(x_{tji})$ and $f_j = \max_t(f_{tj})$, where $f_j$ is the only location marked with pheromone by an animal after the walk.

The pheromone generation distributes to each marked location an amount proportional to the found favor, so that the best achievement carries the maximal pheromone. The sensibility generation is

$$S_j = S_{\min} + \Delta S_j,$$
$$\Delta S_j = (S_{\max} - S_{\min})\cdot random_j(0, 1),$$

where $S_{\min}$ and $S_{\max}$ are the minimal and maximal possible values of the sensibility, $P_{\min}$ and $P_{\max}$ are the minimal and maximal possible values of the pheromone trails, and $P_{\max} = S_{\max}$, $P_{\min} = S_{\min}$.

Selection and decision making of a start location for an exploration walk is

$$x_{0ji} = \begin{cases} x_{ji}, & P_k < S_j, \\ x_{ki}, & P_k \ge S_j, \end{cases}$$

where $j = 1, \ldots, NK$, $k = 1, \ldots, NK$, and $k$ is the number of the marked location.
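Under the reading of the selection rule given above (the exact branch assignment is a reconstruction), a sketch of the sensibility-versus-pheromone decision could look as follows; all names, and the random choice of the candidate marked location, are assumptions.

```python
import numpy as np

def select_start(x_own, X_marked, P, S_min, S_max, rng=np.random.default_rng()):
    """Pick a start location for the next exploration walk (sketch only).

    A candidate marked location k is adopted when its pheromone P[k] is at
    least the individual's sensibility S_j; otherwise the krill keeps its
    own previous location x_own.
    """
    S_j = S_min + (S_max - S_min) * rng.random()   # sensibility generation
    k = rng.integers(len(P))                       # candidate marked location
    return X_marked[k] if P[k] >= S_j else x_own
```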

(3) Termination. In this paper, the criterion for termination is iter > iterMax, where iterMax is the maximum number of iterations.

The steps imitate the ability for motion and action; they can be large or small and can vary. In the search process, the neighbor space is a tool for tuning rough and precise search. So, the search radius $R_j$ is a parameter related to the search space of individual $j$, and its value decides the optimization quality during the search process.

Figure 6: Convergence curves for F5 (D = 30).

Figure 7: Convergence curves for F6 (D = 30).

There are two methods to set the value of search radius. The first one is that Rj is constant. If the value is higher, the individual search space is wider, search time is longer, and the convergence precision is lower. If the value is lower, the individual search space is smaller and the convergence precision is higher.

The second method is to change the neighbor space $R_j$ adaptively: during the search process, $R_j$ decreases gradually. The rule is as follows:

$$R_j(t) = \rho\,R_j(t-1),$$

where $t$ is the exploration generation and $0 < \rho < 1$ is the radius contraction coefficient, which is an important parameter. In this paper, we adopt the first approach.
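Both radius-setting methods can be expressed in a few lines of Python. This is an illustration only: the equal three-way split of the population mirrors the tiered radii 1, 0.5, and 0.1 used in the experiments (Section 4.3), and the contraction coefficient value is an arbitrary example.

```python
import numpy as np

def tiered_radii(NP, levels=(1.0, 0.5, 0.1)):
    """Constant radii: split the population into equal groups, one radius per group
    (one plausible reading of the tiered values 1, 0.5, 0.1 given in Section 4.3)."""
    group = int(np.ceil(NP / len(levels)))
    return np.array([levels[min(j // group, len(levels) - 1)] for j in range(NP)])

def contracted_radius(R_prev, rho=0.9):
    """Adaptive radius R_j(t) = rho * R_j(t - 1), 0 < rho < 1 (rho = 0.9 is an example)."""
    return rho * R_prev
```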

Figure 8: Convergence curves for F7 (D = 30).

Figure 9: Convergence curves for F8 (D = 30).

2.2.2. Procedure of FS Operator. The detailed process of FS operator is described as in Algorithm 2.

3. Free Search Krill Herd Algorithm

In the KH algorithm, a krill is influenced by its "neighbors" and the optimal krill, and the sensing distance of each krill is fixed. But in nature, the action of each krill is free and uncertain. In order to simulate this free, uncertain individual behavior of the krill, this paper introduces the free search strategy into

Figure 10: Convergence curves for F9 (D = 30).

Figure 12: Convergence curves for F11 (D = 2).

Figure 11: Convergence curves for F10 (D = 30).

Figure 13: Convergence curves for F12 (D = 2).

the krill herd algorithm and proposes a novel free search krill herd algorithm (FSKH).

3.1. Opposition-Based Population Initialization. Population initialization has an important impact on the optimization results and global convergence; this paper introduces the initialization method of opposition based learning (OBL) strategy [27] to generate initial krill populations (Algorithm 3).

By utilizing OBL we can obtain fitter starting candidate solutions even when there is no knowledge about the solutions. This initialization method can make a more uniform

distribution of the krill populations. Therefore, it helps the method to get better optimization results. And by utilizing the free search strategy, each krill individual in FSKH can decide how to search by itself (Algorithm 4). The strategy allows a nonzero probability of reaching any location of the search space and highly encourages the individuals to escape from being trapped in local optimal solutions. During the search process, each krill takes exploration walks according to a different search radius.

In general, three main actions in standard KH algorithm can guide the krill individuals to search the promising

Table 1: Benchmark functions (name, definition, scope, and theoretical minimum f_min).

F1, Sphere: $f(x) = \sum_{i=1}^{n} x_i^2$; scope $[-5.12, 5.12]$; $f_{\min} = 0$.
F2, Step: $f(x) = \sum_{i=1}^{n} (\lfloor x_i + 0.5 \rfloor)^2$; scope $[-100, 100]$; $f_{\min} = 0$.
F3, Rosenbrock: $f(x) = \sum_{i=1}^{n-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$; scope $[-2.048, 2.048]$; $f_{\min} = 0$.
F4, Quartic: $f(x) = \sum_{i=1}^{n} i\,x_i^4 + \mathrm{random}[0, 1)$; scope $[-1.28, 1.28]$; $f_{\min} = 0$.
F5, Rastrigin: $f(x) = \sum_{i=1}^{n} [x_i^2 - 10\cos(2\pi x_i) + 10]$; scope $[-5.12, 5.12]$; $f_{\min} = 0$.
F6, Ackley: $f(x) = -20\exp\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\left(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e$; scope $[-32.768, 32.768]$; $f_{\min} = 0$.
F7, Quadric: $f(x) = \sum_{i=1}^{n}\left(\sum_{j=1}^{i} x_j\right)^2$; scope $[-100, 100]$; $f_{\min} = 0$.
F8, Griewank: $f(x) = \frac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n}\cos\left(\frac{x_i}{\sqrt{i}}\right) + 1$; scope $[-600, 600]$; $f_{\min} = 0$.
F9, Alpine: $f(x) = \sum_{i=1}^{n} |x_i\sin(x_i) + 0.1\,x_i|$; scope $[-10, 10]$; $f_{\min} = 0$.
F10, Zakharov: $f(x) = \sum_{i=1}^{n} x_i^2 + \left(\sum_{i=1}^{n} 0.5\,i\,x_i\right)^2 + \left(\sum_{i=1}^{n} 0.5\,i\,x_i\right)^4$; scope $[-5, 10]$; $f_{\min} = 0$.
F11, Schaffer's F6: $f(x_1, x_2) = \frac{\sin^2\sqrt{x_1^2 + x_2^2} - 0.5}{[1 + 0.001(x_1^2 + x_2^2)]^2} - 0.5$; scope $[-100, 100]$; $f_{\min} = -1$.
F12, Drop wave: $f(x_1, x_2) = -\frac{1 + \cos\left(12\sqrt{x_1^2 + x_2^2}\right)}{\frac{1}{2}(x_1^2 + x_2^2) + 2}$; scope $[-5.12, 5.12]$; $f_{\min} = -1$.
F13, Six-hump camel back: $f(x_1, x_2) = \left(4 - 2.1 x_1^2 + \frac{x_1^4}{3}\right) x_1^2 + x_1 x_2 + (-4 + 4 x_2^2)\,x_2^2$; scope $[-3, 3]$; $f_{\min} = -1.0316$.
F14: $f(x) = -(-1)^n \left(\prod_{i=1}^{n}\cos^2(x_i)\right)\exp\left(-\sum_{i=1}^{n}(x_i - \pi)^2\right)$; scope $[-2\pi, 2\pi]$; $f_{\min} = -1$.

solution space. But it is easy for the standard KH algorithm to be trapped into local optima, and its performance in high-dimensional cases is unsatisfactory. In the FSKH algorithm, the individual can search the promising area with small or large steps, so the krill individuals can move step by step through the multidimensional search space. In nature, the activity range of krill individuals differs. $R_{ji}$ can adjust the activity range of the individual, and there are no explicit restrictions.

Using the free search strategy, the krill individual can search any region of the search space. Each krill individual can search according to its perception and scope of activities and can search not only around the global optimum but also around local optima. When using larger steps, it performs global search, which strengthens the weak global search ability of KH. Therefore, the proposed algorithm has better population diversity and convergence speed and can enhance the global searching ability of the algorithm. To achieve a better balance between local search and global search, the FSKH algorithm includes both the "exploration" process of FS and the "exploitation" process of KH. When the sensitivity increases, the krill individual approaches the whole population's current best value (i.e., local search), while when the sensitivity decreases, the krill individual can search around other neighborhoods (i.e., global search) (Figure 1).

4. Simulation Experiments

4.1. Simulation Platform. All the algorithms compared in this section are implemented in Matlab R2012a (7.14). The experiments are performed on a PC with a 3.01 GHz AMD Athlon(tm) II X4 640 processor, 3 GB of RAM, and the Windows XP operating system. In the tests, the population size is NP = 50. The experiment results are obtained over 50 trials.

4.2. Benchmark Functions. In order to verify the effectiveness of the proposed algorithm, we select 14 standard benchmark functions [28-30] to detect the searching capability of the proposed algorithm. The proposed algorithm in this paper (i.e., FSKH) is compared with PSO, DE, KH, HS, FS, and BA.

4.3. Parameter Setting. Generally, the choice of parameters requires some experimenting. In this paper, after a lot of experimental comparison, the parameters of the algorithm are set as follows.

In KH and FSKH, the maximum induced speed $N^{max} = 0.01$, the foraging speed $V_f = 0.02$, and the maximum

Figure 14: Convergence curves for F13 (D = 2).

Figure 15: Convergence curves for F14 (D = 2).

diffusion speed $D^{max} = 0.005$. In FSKH and FS, the search radius is

$$R_j = \begin{cases} 1, & j = 1, \ldots, \dfrac{NP}{3}, \\[4pt] 0.5, & j = \dfrac{NP}{3} + 1, \ldots, \dfrac{2NP}{3}, \\[4pt] 0.1, & j = \dfrac{2NP}{3} + 1, \ldots, NP. \end{cases}$$

In FSKH, the step limit per walk is T = 5; in FS, the step limit is T = 50.

Figure 16: ANOVA tests of the global minimum for F1 (D = 30).

Figure 17: ANOVA tests of the global minimum for F2 (D = 30).

Figure 18: ANOVA tests of the global minimum for F3 (D = 30).

Figure 19: ANOVA tests of the global minimum for F4 (D = 30).

Figure 22: ANOVA tests of the global minimum for F7 (D = 30).

Figure 20: ANOVA tests of the global minimum for F5 (D = 30).

Figure 23: ANOVA tests of the global minimum for F8 (D = 30).

Figure 21: ANOVA tests of the global minimum for F6 (D = 30).

Figure 24: ANOVA tests of the global minimum for F9 (D = 30).

Initialization. Set the generation counter iter = 1; initialize the population P of NP krill individuals randomly, each krill corresponding to a potential solution to the given problem; set the foraging speed $V_f$, the maximum diffusion speed $D^{max}$, and the maximum induced speed $N^{max}$.
Fitness evaluation. Evaluate each krill individual according to its position.
While the termination criterion is not satisfied (iter < iterMax) do
  Sort the krill population from best to worst.
  for i = 1 : NP (all krill) do
    Perform the following motion calculations:
      Motion induced by the presence of other individuals
      Foraging motion
      Physical diffusion
    Implement the genetic operators.
    Update the krill individual position in the search space.
    Evaluate each krill individual according to its new position.
  end for
  Sort the krill population from best to worst and find the current best.
  iter = iter + 1.
End While
Post-processing the results and visualization.

Algorithm 1: Krill herd algorithm.

Figure 25: ANOVA tests of the global minimum for F10 (D = 30).

Figure 26: ANOVA tests of the global minimum for F11 (D = 2).

In BA, the parameters are generally set as follows: the pulse frequency range is $Q_i \in [0, 2]$, the maximum loudness is $A_0 = 0.5$, the maximum pulse emission rate is $r_0 = 0.5$, the attenuation coefficient of loudness is $\alpha = 0.95$, and the increasing coefficient of pulse emission is $\gamma = 0.05$. In PSO, we use a linearly decreasing inertia weight with $\omega_{\max} = 0.9$ and $\omega_{\min} = 0.4$, and the learning factors are $C_1 = C_2 = 1.4962$. In HS, the harmony memory consideration rate is HMCR = 0.9, the minimum pitch adjusting rate is PARmin = 0.4, the maximum pitch adjusting rate is PARmax = 0.9, the minimum bandwidth is bwmin = 0.0001, and the maximum bandwidth is bwmax = 1. In DE, the crossover constant is Pcr = 0.5.

4.4. Comparison of Experiment Results. The best, mean, worst, and Std. represent the optimal fitness value, the mean fitness value, the worst fitness value, and the standard deviation, respectively. Bold and italicized results mean that FSKH is better, while results marked with * mean that another algorithm is better.

For the low-dimensional case, the maximum number of iterations of each algorithm is iterMax = 500. As seen from Tables 2 and 3, FSKH provides better results than the other algorithms except for F3, F13, and F14. What is more, FSKH can find the theoretical optimum solutions for nine benchmark functions (F1, F2, F5, F7 ~ F12) and has very strong robustness. For the other algorithms, only PSO can find the theoretical optimum solution for three functions

Table 2: Simulation results for F1 ~ F10 in low dimension.

Benchmark Result Method

function PSO DE KH II FSKH HS FS BA

Best 1.8151b - 02 1.2075b - 01 1.1148b - 04 0 1.2072b + 01 4.5486b - 02 9.4991b - 04

Mean 4.5954b - 01 2.4943£ - 01 1.6110b-03 6.1254E - 316 3.4612b + 01 5.9487b - 02 1.4434b - 03

f (d = 30) Worst 1.6052b + 00 3.7078b - 01 6.4270b - 03 1.1599E - 314 5.1108b + 00 7.0424b - 02 1.7796b - 03

Std. 3.2409b - 01 6.1704b - 02 1.6590b - 03 0 1.7649b + 01 6.7912b - 03 1.8339b - 04

Rank 6 5 3 1 7 4 2

Best 7.9000b + 01 2.1000b + 01 0 0 4.0710b + 03 1.6000b + 01 3.2450b + 04

Mean 2.4706b + 02 8.8740b + 01 2.8200b + 00 0 7.8597b + 03 2.3780b + 01 5.9598b + 04

F2 (d = 30) Worst 6.2900b + 02 2.7900b + 02 8.4000b + 01 0 1.8688b + 03 3.0000b + 01 7.2507b + 04

Std. 1.2618b + 02 5.3506b + 01 1.1845b + 01 0 7.1600b + 03 2.9017b + 00 6.9512b + 03

Rank 5 4 3 1 7 2 6

Best 2.8953b + 01 3.0199b + 01 2.6378b + 01 2.8703b + 01 3.1922b + 02 2.2637.e + 01 1 .0961E + 01

Mean 6.2541b + 01 3.6796b + 01 2.7980b + 01 2.8912b + 01 5.8759b + 02 2.6906E + 01 3.0216b + 01

f3 (d = 30) Worst 4.5307.e + 02 4.9489£ + 01 2.9430b + 01 2.8991E + 01 1.0549b + 03 2.9389b + 01 8.3654b + 01

Std. 6.0248b + 01 3.7560b + 00 8.1537b - 01 6.1287E - 02 1.5737b + 02 1.4196b + 00 1.3356b + 01

Rank 5 6 3 4 7 2 1

Best 3.0162b - 02 8.6779b - 03 1.1579b - 02 2.2393E - 05 2.7531b - 01 6.3723b - 01 8.6847b - 02

Mean 8.0366b - 02 2.5195b - 02 4.4238b - 02 7.6115E - 04 7.3426b - 01 9.7518b - 01 1.6032b - 01

(d = 30) Worst 1.7465£ - 01 4.5602b - 02 9.5740b - 02 2.1624E - 03 1.1753b + 00 1.2684b + 00 2.4878b - 01

Std. 3.3563b - 02 7.4389b - 03 2.0059b - 02 5.0720E - 04 2.1038b - 01 1.3336b-01 3.4310b - 02

Rank 4 2 3 1 7 6 5

Best 4.3907b + 01 1.2231b + 02 7.2421b + 00 0 8.8691b + 01 1.3828b + 02 1.4257b + 02

Mean 8.8471.e + 01 1.6497b + 02 2.1243b + 01 0 1.3383b + 02 1.9791b + 02 2.0262b + 02

f5 (d = 30) Worst 1.4702b + 02 1.8931b + 02 4.0592b + 01 0 1.7953b + 02 2.4035b + 02 2.8786b + 02

Std. 2.3543b + 01 1.4949£ + 01 6.7613b + 00 0 1.5616b + 01 2.2151b + 01 2.8986b + 01

Rank 6 3 2 1 4 5 7

Best 3.4508b + 00 3.1954b + 00 8.0527.e - 02 8.8818E - 16 1.3509b + 01 1.9172b + 01 1.8667b + 01

Mean 5.8518b + 00 4.2090b + 00 3.1470b + 00 8.8818E - 16 1.4935b + 01 1.9908b + 01 1.9148b + 01

h (d = 30) Worst 8.4440b + 00 5.2716b + 00 6.8219b + 00 8.8818E - 16 1.6955b + 01 2.0247b + 01 1.9503b + 01

Std. 1.2143b + 00 5.3016b - 01 1.1229b + 00 0 8.6116b - 01 2.2907b - 01 1.9351b - 01

Rank 7 4 6 1 5 3 2

Best 3.5384b + 03 5.6171b + 01 7.0436b + 03 0 1.3848b + 05 1.4414b + 04 1.8780b + 04

Mean 2.1959b + 04 1.4336b + 02 3.2982b + 04 4.3250E - 180 4.6076b + 05 2.3871.e + 04 1.7660b + 05

F7 (d = 30) Worst 1.7426b + 05 4.3311b + 02 1.7281b + 05 2.1625E - 178 1.3967b + 06 3.6638b + 04 7.5561b + 05

Std. 2.7411b + 04 7.5588b + 01 2.7450b + 04 0 2.4220b + 05 5.4373b + 03 1.6793b + 05

Rank 4 2 5 1 7 3 6

Best 1.0973b + 00 1.1225b + 00 2.0611b - 02 0 5.0342b + 01 1.1446b + 00 3.7618b + 02

Mean 2.0579b + 00 1.6510b + 00 1.2319b - 01 0 7.3791b + 01 1.2110b + 00 5.3973b + 00

30) Worst 3.9297.e + 00 2.5709b + 00 5.1307b - 01 0 1.2387b + 02 1.2616b + 00 6.61453177 + 02

Std. 5.5487b - 01 3.5593b - 01 8.0788b - 02 0 1.4831b + 01 2.4008b - 02 6.4097b + 01

Rank 5 4 3 1 6 2 7

Best 0 1.9429b - 16 1.0908b - 09 0 5.3806b - 05 2.8243b - 08 1.2522b - 09

Mean 9.1221b - 41 1.7080b- 13 6.3405b - 08 3.7231E - 168 3.6484b - 02 1.4745b - 06 1.2981b - 06

k, (d = 30) Worst 1.9689b - 39 1.1459b- 12 2.5349b - 07 8.9590E - 167 3.9778b - 01 1.3508b - 05 2.5807.e - 05

Std. 3.4654b - 40 2.5917b- 13 6.0990b - 08 0 6.7720b - 02 2.4162b - 06 3.7830b - 06

Rank 2 3 4 1 7 5 6

Best 4.7818b - 01 2.7822b - 01 4.0704b - 02 0 6.2320b + 02 1.6179b - 01 3.2702b - 03

Mean 3.5156b + 00 9.6176b - 01 1.5103b + 00 8.6510E - 318 8.5918b + 02 2.7912b - 01 5.2075b - 03

= 30) Worst 3.1501b + 01 3.1197b + 00 9.2295b + 00 3.3739E - 316 1.2293b + 03 3.8473b - 01 7.4336b - 03

Std. 4.3846b + 00 5.1776b - 01 1.9643b + 00 0 1.0523b + 03 4.0208b - 02 1.0260b - 03

Rank 6 4 5 1 7 3 2

Note: In this paper, both KH and KH II represent the KH with crossover operator.

Table 3: Simulation results for F11 ~ F14 in low dimension.

Benchmark function | Result | Method: PSO, DE, KH II, FSKH, HS, FS, BA

Best 1 -9.999984B - 01 -9.999997E - 01 1 -9.880465B -01 -9.982745B - 01 -9.627759B -01

Mean -9.961136B-01 -9.996264B - 01 -9.999894B - 01 1 -8.445035B -01 -9.921680B - 01 -7.407210£ -01

Fn(D = 2) Worst -9.902840B-01 -9.941573B - 01 -9.999449B - 01 1 -5.830062E -01 -9.902681B - 01 -5.482243B -01

Std. 4.808128B-03 8.981117B-04 1.209463B - 05 0 9.752777£ - -02 2.869464B - 03 1.250904B - -01

Rank 2 4 3 1 6 5 7

Best 1 1 1 1 -9.756697E -01 -9.999931B - 01 -9.999997E -01

Mean -9.961747£ - 01 -9.999996E - 01 -9.821486B - 01 1 -8.443158£ -01 -9.983692B - 01 -7.368404B -01

Fu (D = 2) Worst -9.362453B - 01 -9.999938B - 01 -9.362453B - 01 1 -5.85603 6E -01 -9.948806B - 01 -2.888120B -01

Std. 1.529461B - 02 1.113912B-06 2.891639B - 02 0 9.704192E - -02 1.421345B - 03 1.667622B - -01

Rank 3 2 4 1 7 6 5

Best 1.031628E + 00 1.031628E + 00 1.031628E + 00 1.031628E + 00 -1.031587B + 00 1.031628E + 00 1.031628E + 00

Mean 1.031628E + 00 1.031628E + 00 1.031628E + 00 -1.031510B + 00 -9.719758B -01 -1.031628B + 00 -9.989818B -01

Fu(D = 2) Worst 1.031628E + 00 1.031628E + 00 1.031628E + 00 -1.031071B + 00 -6.378180B -01 -1.031627B + 00 -2.154637B -01

Std. 6.728967E 16 8.535156B- 16 2.176352B - 09 1.420660B - 04 7.340586B - -02 1.353951B - 07 1.615586B- -01

Rank 1 2 3 5 7 4 6

Best 1 1 1 1 -9.938758E -01 -9.999999E - 01 -1

Mean 1 1 -9.999999E - 01 -9.998070B - 01 -5.496989E -01 -9.997912B - 01 -6.000308E -01

Fu (D = 2) Worst 1 1 -9.999999E - 01 -9.995538B - 01 -1.236216E -05 -9.984373B - 01 -8.743099E - 12

Std. 0 0 2.026218B - 09 1.291856B - 03 4.099494E - -01 2.968278E - 04 4.948335E - -01

Rank 1 1 3 4 7 6 5

Table 4: Simulation results for F1 ~ F10 in high dimension.

Benchmark Result Method

function PSO DE KH II FSKH HS FS BA

Best 1.7539f + 02 1.1046f + 02 2.2325f + 02 0 5.5839f + 03 1.9398f + 03 4.6377f + 00

Mean 4.2055f + 02 1.2990f + 02 2.3419f + 02 4.6082E - 316 5.8593f + 03 2.1829f + 03 5.1666f + 00

fj (d = 1000) Worst 8.7083f + 02 1.4698f + 02 2.4475f + 02 4.4691E - 315 6.0746f + 03 2.2895f + 03 5.7013f + 00

Std. 1.5335f + 02 8.4319f + 00 8.0663f + 00 0 9.3024f + 01 7.1875f + 01 2.4083f - 01

Rank 4 3 5 1 7 6 2

Best 1.7420f + 03 5.8630f + 03 3.8140f + 03 0 2.0879f + 05 3.6160f + 03 5.1312f + 05

Mean 5.6810f + 03 7.6012f + 03 6.4365f + 03 0 2.4108f + 05 4.2361f + 03 5.5451f + 05

F2 (d = 200) Worst 1.2432f + 04 9.4600f + 03 9.4830f + 03 0 1.3406f + 04 4.8140f + 03 6.0191f + 05

Std. 2.4260f + 03 7.9567f + 02 1.2702f + 03 0 2.5040f + 05 2.8971.e + 02 1.8030f + 04

Rank 2 5 4 1 6 3 7

Best 2.2607f + 02 1.8958f + 02 1.0216f + 02 9.8070E + 01 4.8892f + 03 1.1008f + 02 9.9091f + 01

Mean 1.2251f + 03 2.3327f + 02 1.1403f + 02 9.8701E + 01 6.6060f + 03 1.1743f + 02 1.0747f + 02

f3 (d = 100) Worst 4.3896f + 03 2.8370f + 02 1.3766f + 02 9.8942E + 01 8.4203f + 03 1.2626f + 02 1.6127f + 02

Std. 1.1701f + 03 2.0992f + 01 9.4142f + 00 2.1724E - 01 9.0378f + 02 2.8202f + 00 2.0822f + 01

Rank 6 5 3 1 7 4 2

Best 2.9336f + 00 2.1517.e - 01 1.3101f + 00 6.3543E - 06 2.6746f + 02 7.7482f + 01 6.9964f + 00

Mean 2.7344f + 01 3.0597f - 01 1.5941f + 00 4.4048E - 04 2.9365f + 02 8.4210f + 01 8.4533f + 00

f4 (d = 1000) Worst 6.4757f + 01 4.0618f - 01 2.0862f + 00 9.8040E - 04 3.1329f + 02 8.9597f + 01 9.7956f + 00

Std. 1.3793f + 01 4.7306f - 02 2.5818f - 01 3.0041E - 04 9.2166f + 00 3.2374f + 00 5.4824f - 01

Rank 4 2 3 1 7 6 5

Best 2.7456f + 03 4.0397f + 03 2.0311f + 03 0 6.1828f + 03 6.3281f + 03 3.8981f + 03

Mean 3.9835f + 03 4.2471f + 03 2.3252f + 03 0 6.5147f + 03 6.5574f + 03 4.2476f + 03

f5 (d = 500) Worst 4.6952f + 03 4.4312f + 03 2.5811f + 03 0 6.7919f + 03 6.7360f + 03 4.5534f + 03

Std. 4.2126f + 02 8.3027f + 01 1.2453f + 02 0 1.2874f + 02 9.1246f + 01 1.3686f + 02

Rank 3 5 2 1 6 7 4

Best 6.3274f + 00 8.0299f + 00 7.0415f + 00 8.8818E - 16 1.9882f + 01 2.0898f + 01 1.9386f + 01

Mean 8.2684f + 00 8.7293f + 00 8.6721f + 00 8.8818E - 16 2.0099f + 01 2.1003f + 01 1.9495f + 01

f6 (d = 300) Worst 1.0209f + 01 9.4457f + 00 1.0111f + 01 8.8818E - 16 2.0345f + 01 2.1056f + 01 1.9589f + 01

Std. 8.7857f - 01 3.0761f - 01 6.4227f - 01 0 7.7479f - 02 3.3659f - 02 3.5615f - 02

Rank 2 4 3 1 6 7 5

Best 2.1725f + 05 2.7428f + 03 4.3171f + 05 3.8339E - 319 9.4228f + 06 7.1039f + 05 7.9254f + 05

Mean 1.5219f + 06 5.7789f + 03 2.8233f + 06 2.2442E - 280 3.0414f + 07 1.3329f + 06 4.5801f + 06

Fy (d = 100) Worst 6.8641f + 06 1.2878f + 04 7.0375f + 06 1.1221E - 278 7.1585f + 07 1.8894f + 06 1.6408f + 07

Std. 1.3202f + 06 2.5486f + 03 1.6109f + 06 0 1.3820f + 07 2.6259f + 05 3.7611f + 06

Rank 3 2 4 1 7 5 6

Best 4.7748f + 01 1.6294f + 02 2.1827.e + 02 0 7.7044f + 03 1.0245f + 03 1.1878f + 04

Mean 1.5434f + 02 2.0567f + 02 2.7742f + 02 0 8.3385f + 03 1.2015f + 03 1.3091f + 04

f8 (D = 500) Worst 3.1407f + 02 2.4796f + 02 3.3449f + 02 0 8.9522f + 03 1.4065f + 03 1.3825f + 04

Std. 5.9349f + 01 1.7027f + 01 2.6215f + 01 0 2.7756f + 02 7.9418f + 01 3.3184f + 02

Rank 2 3 4 1 6 5 7

Best 0 0 3.4191f- 10 0 8.9839f - 05 7.9967f - 09 2.4396f - 10

Mean 7.9602f - 78 9.7855f - 19 5.6362f - 05 6.1471E - 167 1.0189f - 02 4.9600f - 07 3.5023f - 07

f> (d = 200) Worst 2.6973f - 76 1.3878f- 17 1.7294f - 03 2.3564E - 165 1.1317f - 01 2.5864f - 06 6.8099f - 06

Std. 3.8349f - 77 2.9411f- 18 2.5904f - 04 0 1.9049f - 02 5.7052f - 07 1.1021f - 06

Rank 2 3 5 1 7 6 4

Best 1.2447f + 02 1.2043f + 02 5.5973f + 02 0 9.0756f + 03 2.9821f + 02 7.3081f + 03

Mean 3.9115f + 02 1.5201f + 02 8.4831f + 02 1.3901E - 317 1.0803f + 05 3.6677f + 02 8.3861f + 03

fjo (d = 300) Worst 8.1348f + 02 1.9321f + 02 1.1894f + 03 6.4647E - 316 1.9276f + 06 4.7007f + 02 9.4975f + 03

Std. 1.8186f + 02 1.6427f + 01 1.6003f + 02 0 3.6921f + 05 3.4810f + 01 4.8766f + 02

Rank 3 2 5 1 7 4 6

Initialize $x_{i,\min}$, $x_{i,\max}$, NK, NP, G, $x_{0ji}$, T, $R_{ji}$;
Take initial walks $f_{tj}(x_{0ji} + \Delta x_t)$;
Generate the initial pheromone $P_k$;
Distribute the initial pheromone $P_k$;
Learn the initial achievements;
While (t < T)
  Generate sensibility $S_j$;
  Select start locations for a walk $x_{0ji} = x_k(S_j, P_k)$;
  Take exploration walks $f_{tj}(x_{0ji} + \Delta x_t)$;
  Generate a pheromone $P_k$;
  Distribute the pheromone $P_k$;
  Learn the achievements;
End While

Algorithm 2: Free search operator.

Figure 29: ANOVA tests of the global minimum for F14 (D = 2).

Figure 27: ANOVA tests of the global minimum for F12 (D = 2).

Figure 28: ANOVA tests of the global minimum for F13 (D = 2).

Figure 30: Convergence curves for F1 (D = 1000).

(F9, F11, F12), DE can find the theoretical optimum solution for one function (F14), and KH II can find the theoretical optimum solution for one function (F2). The number of functions for which FSKH finds the optimal solution is larger than that of the other six algorithms. For F4 and F6, FSKH has a higher precision of optimization: its accuracy is at least 2 and 14 orders of magnitude higher than that of the other algorithms, respectively. For F3, F13, and F14, we can find that there is at least one algorithm that performs better than FSKH. But in general, even for these three functions, the performance of FSKH is highly competitive with the other algorithms. For all functions, the standard deviations of FSKH are very small, which indicates that FSKH is very robust and efficient.

Figure 31: Convergence curves for F2 (D = 200).

Figure 32: Convergence curves for F3 (D = 100).

For benchmark functions in Table 1, Figures 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, and 15 are the convergence curves, Figures 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, and 29 are the ANOVA tests of the global minimum. Figures 2, 3, 4, 5, 6, 7, 8, 9, and 11 show that the convergence speed of FSKH is quicker than other algorithms. FSKH is the best algorithm for most functions. Moreover, as seen from the ANOVA tests of the global minimum, we can find that FSKH is the most robust method. For the fourteen functions, other algorithms compared (i.e., PSO, DE, KH, HS, FS, and BA)

Figure 33: Convergence curves for F4 (D = 1000).

Figure 34: Convergence curves for F5 (D = 500).

can only be robust for a few functions, but cannot be robust for all functions. Therefore, FSKH is an effective and feasible method for optimization problems in low-dimensional case.

In order to validate the optimization ability of the algorithms in the high-dimensional case, this paper sets F3 and F7 to 100 dimensions, F2 and F9 to 200 dimensions, F6 and F10 to 300 dimensions, F5 and F8 to 500 dimensions, and F1 and F4 to 1000 dimensions. The maximum number of iterations of each algorithm is adjusted to iterMax = 1000, and the other parameters are the same.

The comparison results for high-dimensional case are shown in Table 4. As seen from the results, the optimization

For i = 1 to NP do

For j = 1 to NK do

Xi,j = Xmm,j + (Xmax,j - Xmin,j) * rand;

End For End For

/* opposition-based population initialization */ For i = 1 to NP do

For j = 1 to NK do

X'i,j = Xmin,j + (Xmax,j - Xi,j);

End For End For

Select the NP fittest individuals from {p0 u op0} as initial population.

Algorithm 3: Opposition-based population initialization.
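A minimal Python sketch of this opposition-based initialization follows (illustrative only; the objective `f` is assumed to be minimized and all identifiers are assumptions).

```python
import numpy as np

def obl_initialize(f, NP, lower, upper, rng=np.random.default_rng()):
    """Random population, its opposite population, and selection of the NP fittest."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    P0 = lower + (upper - lower) * rng.random((NP, len(lower)))   # random krill
    OP0 = lower + upper - P0                                      # opposite krill
    union = np.vstack([P0, OP0])
    fitness = np.apply_along_axis(f, 1, union)
    best = np.argsort(fitness)[:NP]                               # NP fittest (minimization)
    return union[best]
```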

Initialization. Set the generation counter iter = 1; set the foraging speed $V_f$, the maximum diffusion speed $D^{max}$, and the maximum induced speed $N^{max}$.
Generate a uniformly distributed random population p0.
Opposition-based population initialization:
For i = 1 to NP do
  For j = 1 to NK do
    x'_{i,j} = x_{min,j} + (x_{max,j} - x_{i,j});
  End For
End For
Select the NP fittest individuals from {p0 ∪ op0} as the initial population.
Fitness evaluation. Evaluate each krill individual according to its position.
While the termination criterion is not satisfied (iter < iterMax) do
  Sort the krill population from best to worst.
  For i = 1 : NP (all krill) do
    Perform the following motion calculations:
      Motion induced by other individuals
      Foraging motion
    Free search operation:
      While (t < T)
        Take exploration walks
        Modification strategy
      End While
    Update the krill individual position in the search space.
    Evaluate each krill individual according to its new position.
  End For
  Sort the krill population from best to worst and find the current best.
  iter = iter + 1.
End While
Post-processing the results and visualization.

Algorithm 4: Free search krill herd algorithm.
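To show how the pieces fit together, the following high-level Python skeleton mirrors Algorithm 4. It is a sketch, not the authors' implementation: `obl_initialize`, `tiered_radii`, and `exploration_walk` refer to the illustrative helpers given earlier, while `induced_motion` and `foraging_motion` are placeholders for the motion calculations of Section 2.1, and the way the walk result replaces the krill is an assumption.

```python
import numpy as np

def fskh(f, NP, lower, upper, iter_max, Ct=0.5, T=5, rng=np.random.default_rng()):
    """High-level FSKH skeleton: OBL initialization, KH motions, FS exploration walks."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    X = obl_initialize(f, NP, lower, upper, rng)            # Algorithm 3
    fit = np.apply_along_axis(f, 1, X)
    R = tiered_radii(NP)                                    # individual search radii
    N_old = np.zeros_like(X)                                # last induced motions
    F_old = np.zeros_like(X)                                # last foraging motions
    for it in range(iter_max):
        order = np.argsort(fit)                             # sort from best to worst
        X, fit, N_old, F_old = X[order], fit[order], N_old[order], F_old[order]
        for i in range(NP):
            N_old[i] = induced_motion(X, fit, i, N_old[i], it, iter_max)   # placeholder
            F_old[i] = foraging_motion(X, fit, i, F_old[i], it, iter_max)  # placeholder
            dt = Ct * np.sum(upper - lower)                 # time interval Delta t
            moved = np.clip(X[i] + dt * (N_old[i] + F_old[i]), lower, upper)
            # free search operation: T exploration-walk steps around the moved krill
            X[i], favor = exploration_walk(moved, R[i], lower, upper,
                                           lambda x: -f(x), T, rng)
            fit[i] = -favor
    best = np.argmin(fit)
    return X[best], fit[best]
```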

performance of FSKH is the best. Although the dimension of the functions is very high, the proposed algorithm (FSKH) can still find optimum solutions for six benchmark functions (F1, F2, F5, F8 ~ F10). Some algorithms show a good performance in the low-dimensional case but cannot get good results for most benchmark functions in the high-dimensional case. The optimization ability of FSKH does not show a significant decline as the dimension increases.

Figures 30, 31, 32, 33, 34, 35, 36, 37, 38, and 39 are the convergence curves for the high-dimensional case, and Figures 40, 41, 42, 43, 44, 45, 46, 47, 48, and 49 are the ANOVA tests of the global minimum for the high-dimensional case. As seen from the ANOVA tests of the global minimum, we can find that FSKH is still the most robust method, and its superiority is even more obvious. Therefore, in the high-dimensional case, FSKH is also an effective and feasible method for optimization problems. For a metaheuristic algorithm, it is important to tune its related parameters. The proposed algorithm is not a very complex metaheuristic algorithm: only the time interval ($C_t$) and the search radius ($R_{ji}$) need to be fine-tuned in FSKH.

Figure 35: Convergence curves for F6 (D = 300).

Figure 37: Convergence curves for F8 (D = 500).

Figure 36: Convergence curves for F7 (D = 100).

Figure 38: Convergence curves for F9 (D = 200).

It is a remarkable advantage of the proposed algorithm compared with other algorithms. But in order to get high accuracy solutions, the time complexity of the proposed algorithm is a little high. How to reduce the time complexity of the proposed algorithm by some strategies is our main work in the future.

5. Conclusions

In order to overcome the shortcomings of the krill herd algorithm (e.g., poor population diversity, low precision of optimization, and poor optimization performance in the high-dimensional case), this paper introduces the free search strategy into the krill herd algorithm and proposes a novel free search krill herd algorithm (FSKH). The main improvement is that each krill individual can search freely, so the diversity of the krill population is enriched. The proposed algorithm (FSKH) achieves a better balance between global search and local search. Experiment simulations and comparisons with other algorithms show that the optimization precision, convergence speed, and robustness of FSKH are all better

Figure 39: Convergence curves for F10 (D = 300).

Figure 40: ANOVA tests of the global minimum for F1 (D = 1000).

than other algorithms for most benchmark functions. So FSKH is a more feasible and effective way for optimization problems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

Figure 41: ANOVA tests of the global minimum for F2 (D = 200).

Figure 42: ANOVA tests of the global minimum for F3 (D = 100).

Figure 43: ANOVA tests of the global minimum for F4 (D = 1000).

Figure 44: ANOVA tests of the global minimum for F5 (D = 500).

Figure 45: ANOVA tests of the global minimum for F6 (D = 300).

Figure 46: ANOVA tests of the global minimum for F7 (D = 100).

Figure 47: ANOVA tests of the global minimum for F8 (D = 500).

Figure 48: ANOVA tests of the global minimum for F9 (D = 200).

Figure 49: ANOVA tests of the global minimum for F10 (D = 300).

This work is supported by the National Science Foundation of China under Grant no. 61165015, the Key Project of Guangxi Science Foundation under Grant no. 2012GXNSFDA053028, the Key Project of Guangxi High School Science Foundation under Grant no. 20121ZD008, and the Open Research Fund Program of the Key Lab of Intelligent Perception and Image Understanding of the Ministry of Education of China under Grant no. IPIU01201100.

References

[1] M. Srinivas and L. M. Patnaik, "Adaptive probabilities of crossover and mutation in genetic algorithms," IEEE Transactions on Systems, Man and Cybernetics, vol. 24, no. 4, pp. 656-667, 1994.

[2] J. Kennedy, R. Eberhart, and Y. Shi, Swarm Intelligence, Morgan Kaufman, San Francisco, Calif, USA, 2001.

[3] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, December 1995.

[4] M. Dorigo, V. Maniezzo, and A. Colorni, "Ant system: optimization by a colony of cooperating agents," IEEE Transactions on Systems, Man, and Cybernetics B: Cybernetics, vol. 26, no. 1, pp. 29-41, 1996.

[5] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4-31, 2011.

[6] Z. W. Geem, J. H. Kim, and G. V Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60-68, 2001.

[7] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459-471, 2007.

[8] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, O. Watanabe and T. Zeugmann, Eds., vol. 5792 of Lecture Notes in Computer Science, pp. 169-178, Springer, 2009.

[9] X.-L. Li, Z.-J. Shao, and J.-X. Qian, "Optimizing method based on autonomous animats: fish-swarm Algorithm," Xitong Gongcheng Lilun yu Shijian/System Engineering Theory and Practice, vol. 22, no. 11, p. 32, 2002.

[10] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with Computers, vol. 29, no. 1, pp. 17-35, 2013.

[11] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NABIC '09), pp. 210-214, December 2009.

[12] R. Q. Zhao and W. S. Tang, "Monkey algorithm for global numerical optimization," Journal of Uncertain Systems, vol. 2, no. 3, pp. 164-175, 2008.

[13] X. S. Yang, "A new meta-heuristic bat-inspired algorithm," in Proceedings of the International Workshop on Nature Inspired Cooperative Strategies for Optimization (NICSO '10), pp. 65-74, 2010.

[14] A. Kaveh and S. Talatahari, "A novel heuristic optimization method: charged system search," Acta Mechanica, vol. 213, pp. 267-289, 2010.

[15] X.-S. Yang, "Flower pollination algorithm for global optimization," Lecture Notes in Computer Science, vol. 7445, pp. 240-249, 2012.

[16] C. Xu, H. Duan, and F. Liu, "Chaotic artificial bee colony approach to Uninhabited Combat Air Vehicle (UCAV) path planning," Aerospace Science and Technology, vol. 14, no. 8, pp. 535-541, 2010.

[17] O. Hasançebi, T. Teke, and O. Pekcan, "A bat-inspired algorithm for structural optimization," Computers & Structures, vol. 128, pp. 77-90, 2013.

[19] M. Basu and A. Chowdhury, "Cuckoo search algorithm for economic dispatch," Energy, vol. 60, pp. 99-108, 2013.

[20] A. H. Gandomi and A. H. Alavi, "Krill herd: a new bio-inspired optimization algorithm," Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831-4845, 2012.

[21] G. Wang and L. Guo, "A novel hybrid bat algorithm with harmony search for global numerical optimization," Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.

[22] G. Wang, L. Guo, A. H. Gandomi et al., "Levy-flight krill herd algorithm," Mathematical Problems in Engineering, vol. 2013, Article ID 682073, 14 pages, 2013.

[23] G.-G. Wang, L. Guo, A. H. Gandomi, A. H. Alavi, and H. Duan, "Simulated annealing-based krill herd algorithm for global Optimization," Abstract and Applied Analysis, vol. 2013, Article ID 213853,11 pages, 2013.

[24] C. Sur, "Discrete krill herd algorithm—a bio-inspired meta-heuristics for graph based network route optimization," in Distributed Computing and Internet Technology, R. Natarajan, Ed., vol. 8337 of Lecture Notes in Computer Science, pp. 152-163, Springer, 2014.

[25] A. H. Gandomi, A. H. Alavi, and S. Talatahari, "Structural optimization using krill herd algorithm," in Swarm Intelligence and Bio-Inspired Computation: Theory and Applications, X.-S. Yang, Z. Cui, R. Xiao et al., Eds., pp. 335-349, Elsevier, 2013.

[26] K. Penev and G. Littlefair, "Free search—a comparative analysis," Information Sciences, vol. 172, no. 1-2, pp. 173-193, 2005.

[27] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64-79, 2008.

[28] K. Tang, X. Yao, P. N. Suganthan et al., Benchmark Functions For the CEC'2008 Special Session and Competition on Large Scale Global Optimization, University of Science and Technology of China, Hefei, China, 2007.

[29] A. H. Gandomi, X.-S. Yang, S. Talatahari, and S. Deb, "Coupled eagle strategy and differential evolution for unconstrained and constrained global optimization," Computers and Mathematics with Applications, vol. 63, no. 1, pp. 191-200, 2012.

[30] A. H. Gandomi, "Benchmark problems in structural optimization," in Computational Optimization, Methods and Algorithms, S. Koziel and X. S. Yang, Eds., Study in Computational Intelligence, SCI 356, pp. 259-281, Springer, 2011.
