
A Review and Comparative Study of Firefly Algorithm and its Modified Versions

By Waqar A. Khan, Nawaf N. Hamadneh, Surafel L. Tilahun and Jean M. T. Ngnotchouye

Submitted: October 30th 2015; Reviewed: February 11th 2016; Published: September 21st 2016

DOI: 10.5772/62472


Abstract

Firefly algorithm is one of the well-known swarm-based algorithms, which gained popularity within a short time and has a wide range of applications. It is easy to understand and implement. Existing studies show that it is prone to premature convergence and suggest relaxing the use of constant parameters. To boost the performance of the algorithm, different modifications have been proposed by several researchers. In this chapter, we review the modifications made to the standard firefly algorithm, based on parameter modification, modified search strategies and changes of the solution space to make the search easier, using different probability distributions. The modifications cover continuous as well as non-continuous problems. Different studies, including the hybridization of firefly algorithm with other algorithms, extensions of firefly algorithm for multiobjective as well as multilevel optimization problems, for dynamic problems and for constraint handling, and convergence studies, will also be briefly reviewed. A simulation-based comparison is also provided to analyse the performance of the standard as well as the modified versions of the algorithm.

Keywords

  • Optimization
  • Metaheuristic Algorithms
  • Parametric Modification
  • Mutation
  • Binary Problems
  • Simulation

1. Introduction

An optimization problem refers to the maximization or minimization of an objective function by setting suitable values for the variables from a set of feasible values. These problems appear not only in complex scientific studies but also in our day-to-day activities. For instance, when a person wants to go from one place to another and has multiple possible routes, a decision needs to be made on which route to take. The decision can have the objective of minimizing travel time, fuel consumption and so on. Problems of this kind with a small number of alternatives can easily be solved by inspecting the outcome of each alternative. In real problems, however, it is not always the case that there is a finite and small number of alternatives. Hence, different solution methods have been proposed based on the behaviour of the problem.

Since the introduction of evolutionary algorithms, many studies have been conducted on heuristic algorithms, and introducing new algorithms has been one of the leading research areas [1]. Currently, there are more than 40 metaheuristic algorithms [2]. Most of these new algorithms are introduced by mimicking a scenario from nature. For instance, genetic algorithm is inspired by Darwin's theory of survival of the fittest [3]; particle swarm optimization is another metaheuristic algorithm, mimicking how a swarm moves by following each other [4]; firefly algorithm is inspired by how fireflies signal each other with flashing light to attract mates or to warn off predators [5]; and prey predator algorithm is another new algorithm, inspired by the behaviour of a predator and its prey [6]. These algorithms use different degrees of exploration and exploitation based on their different search mechanisms.

Firefly algorithm is among those metaheuristic algorithms which have found different applications. Its uncomplicated and easy steps, together with its effectiveness, attract researchers from different disciplines to it. Different studies have been performed to modify the standard firefly algorithm, to boost its performance and to make it suitable for the problem at hand. In this chapter, a comprehensive study of firefly algorithm and its modified versions is presented. A brief discussion of extended firefly algorithms, with other relevant studies, is also provided. In the next section, a discussion of optimization problems and their solution methods is given, followed in Section 3 by a review of studies on firefly algorithm, which includes a discussion of the standard firefly algorithm, its modified versions and other relevant studies. In Section 4, a comparative study based on simulation results is presented, followed by a summary of the chapter in Section 5.


2. Optimization problems

Decision-making problems are found well beyond our daily activities. They are very common in engineering, management and many other disciplines. Different researchers have used the concept of optimization in different applications, including engineering applications, transportation planning, management applications, economics, computational intelligence, decision science, agriculture, tourism, sport science and even political science [7–18].

When these problems are formulated mathematically, they are called mathematical optimization problems. Such a problem has a set of feasible actions, also called the feasible region, and a measure of performance of these actions, called the objective. A standard single objective minimization problem can be given as in Eq. (1).

min_x { f(x) | x ∈ S ⊆ ℝ^n }    (1)

where f: ℝ^n → ℝ is called the objective function, S is the feasible region and the vector x is the decision variable. A vector x̄ is said to be an optimal solution for the minimization problem given in Eq. (1) if and only if x̄ ∈ S and f(x̄) ≤ f(x), ∀x ∈ S. A local solution x′ is a member of S with f(x′) ≤ f(x) for all x in a neighbourhood of x′.

In a broad sense, optimization solution methods can be categorized as exact and approximate solution methods. Exact solution methods use an exhaustive search for the exact solution in the solution space. They use mathematical and statistical arguments to get an exact solution, mainly through calculus-based and iterative procedures. Perhaps Fermat was the first to use a calculus-based argument to solve optimization problems [19]; iterative methods were first proposed and used by Newton and Gauss [20]. Since then, several exact solution methods have been proposed and used for different problems. Branch and bound, the simplex method and the gradient descent method are good examples of exact solution methods. However, complex problems modelled from complex real-world situations become challenging for these deterministic solution methods. This led to the search for new, 'out of the box' ways of solving such problems, which in turn gave rise to the birth of metaheuristic algorithms.

Metaheuristic algorithms are approximate solution methods for optimization problems which use randomness together with an 'educated guess' in their search mechanism and try to improve the quality of the solutions at hand through the iterations, starting from a randomly generated set of feasible solutions, by exploring and exploiting the solution space. Even though these algorithms do not guarantee optimality, they have been shown to give reasonable and acceptable solutions. Furthermore, they have the advantage of not being much affected by the behaviour of the problem, which makes them useful in many applications. Having a variety of algorithms gives the option of choosing one suitable for the problem at hand, according to its behaviour.

3. Studies on firefly algorithm

3.1. Introduction

Nature has been an inspiration for the introduction of many metaheuristic algorithms. It has managed to find solutions to problems without being told how, but through experience. Natural selection and survival of the fittest were the main motivations behind the early metaheuristic algorithms. Different animals communicate with each other through different modes of communication. Fireflies use their flashing property to communicate. There are around 2000 firefly species, each with its own distinct flash pattern. They usually produce a short flash with a certain pattern. The light is produced by a biochemical process called bioluminescence. The flashing communication is used to attract mates and also to warn predators. Based on the pattern of the light, a suitable mate communicates back by either mimicking the same pattern or responding with a specific pattern. It also needs to be noted that the light intensity decreases with distance; hence, a flashing light emanating from a firefly gets a response from the fireflies around it that are within the visual range of the flash.

In addition to creating the beautiful view of a summer sky, fireflies have motivated and been the centre of many scientific studies [5, 21, 22]. In the sense of optimization, if we consider the fireflies as solutions on the landscape of the solution space, then the attraction and movement of fireflies can inspire an optimization algorithm in which solutions follow better (brighter) solutions. Firefly algorithm is motivated and inspired by exactly these properties.

3.1.1. The standard firefly algorithm

Firefly algorithm is a swarm-based metaheuristic algorithm introduced by Yang [5]. The algorithm mimics how fireflies interact using their flashing lights. It assumes that all fireflies are unisex, meaning any firefly can be attracted by any other firefly, and that the attractiveness of a firefly is directly proportional to its brightness, which depends on the objective function. A firefly will be attracted to a brighter firefly. Furthermore, the brightness decreases with distance according to the inverse square law, as given in Eq. (2).

I ∝ 1/r²    (2)

If the light passes through a medium with a light absorption coefficient γ, then the light intensity at a distance r from the source can be given as in Eq. (3).

I = I0 e^(−γr²)    (3)

where I0 is the light intensity at the source. Similarly, the brightness, β, can be given as in Eq. (4).

β = β0 e^(−γr²)    (4)

A generalized brightness function for ω ≥ 1 is given in Eq. (5) [5]. In fact, any monotonically decreasing function can be used.

β = β0 e^(−γr^ω)    (5)

In the algorithm, randomly generated feasible solutions, called fireflies, are assigned a light intensity based on their performance in the objective function. This intensity is used to compute the brightness of the firefly, which is directly proportional to its light intensity. For minimization problems, a solution with the smallest functional value is assigned the highest light intensity. Once the intensity or brightness of the solutions is assigned, each firefly follows the fireflies with better light intensity. The brightest firefly performs a local search by randomly moving in its neighbourhood. Hence, for two fireflies, if firefly j is brighter than firefly i, then firefly i moves towards firefly j using the updating formula given in Eq. (6).

xi := xi + β0 e^(−γ rij²)(xj − xi) + α(ε() − 0.5)    (6)

where β0 is the attractiveness of xj at r = 0 (in [5] the author recommends β0 = 1 for implementation), γ is an algorithm parameter which determines the degree to which the updating process depends on the distance between the two fireflies, α is an algorithm parameter for the step length of the random movement and ε() is a random vector from a uniform distribution with values between 0 and 1. For the brightest firefly, xb, the second expression in Eq. (6) is omitted, as given in Eq. (7).

xb := xb + α(ε() − 0.5)    (7)

Table 1.

The standard firefly algorithm.

These updates of the locations of the fireflies continue with the iterations until a termination criterion is met. The termination criterion can be a maximum number of iterations, a tolerance from the optimum value if it is known, or no improvement being achieved in consecutive iterations. The algorithm is summarized in Table 1.
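To make the steps in Table 1 concrete, the following is a minimal Python sketch of the standard firefly algorithm as described above. The box constraints, the clipping of updated positions and the parameter values are illustrative assumptions rather than part of the original formulation.

    import numpy as np

    def firefly_algorithm(f, lb, ub, n=25, max_iter=100,
                          alpha=0.5, beta0=1.0, gamma=1.0, seed=0):
        # Minimize f over the box [lb, ub]; a smaller f means a brighter firefly.
        rng = np.random.default_rng(seed)
        d = len(lb)
        x = rng.uniform(lb, ub, (n, d))           # random initial fireflies
        I = np.array([f(s) for s in x])           # light intensities
        for _ in range(max_iter):
            for i in range(n):
                for j in range(n):
                    if I[j] < I[i]:               # firefly j is brighter than i
                        r2 = np.sum((x[i] - x[j]) ** 2)
                        beta = beta0 * np.exp(-gamma * r2)            # Eq. (4)
                        x[i] = x[i] + beta * (x[j] - x[i]) \
                                    + alpha * (rng.random(d) - 0.5)   # Eq. (6)
                        x[i] = np.clip(x[i], lb, ub)
                        I[i] = f(x[i])
            b = np.argmin(I)                      # the brightest firefly
            x[b] = np.clip(x[b] + alpha * (rng.random(d) - 0.5), lb, ub)  # Eq. (7)
            I[b] = f(x[b])
        b = np.argmin(I)
        return x[b], I[b]

    # Example: the Alpine function f2 of Table 2 in two dimensions.
    f2 = lambda s: np.sum(np.abs(s * np.sin(s) + 0.1 * s))
    best_x, best_f = firefly_algorithm(f2, np.array([-10.0, -10.0]),
                                       np.array([10.0, 10.0]))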

3.2. Modified versions of firefly algorithm with critical analysis

Firefly algorithm is an efficient and easy-to-implement algorithm, and it is also suitable for parallel implementation. However, research shows that it is slow in convergence and easily gets trapped in local optima for multimodal problems. In addition, the updates depend solely on current performance, and no memory of previous best solutions and performances is kept, which may lead to losing better solutions. Furthermore, since the parameters are fixed, the search behaviour remains the same under any conditions in all iterations. Hence, modifying the standard firefly algorithm to boost its performance has been one of the active research issues. In addition, the standard firefly algorithm is designed for continuous optimization problems; hence, in order to use it for non-continuous problems, it needs to be modified and adjusted.

3.2.1. Modification for problems with continuous variables

Basically, there are three classes of modification. Class 1 modifications are modifications of the parameters; this is the first category, in which the parameters of the algorithm are modified while the same updating mechanisms or formulas are used. Class 2 contains new updating mechanisms; it includes modifications which change part or all of the updating formulas, add a mutation operator and the like. The last category, Class 3, includes modifications of the search space (perhaps, with the same updating mechanism, it may be easier to switch to another 'easy-to-search' space) and changes in the probability distribution used when generating random numbers. The categories are not necessarily disjoint, as some modifications may fall into multiple classes.

3.2.1.1. Class 1 (parametric modification)

In the standard firefly algorithm, the parameters in Eq. (6) are user-defined constants. Like any other metaheuristic algorithm, the performance of firefly algorithm depends on these parameter values. They control the degree of exploration and exploitation.

Some of the modifications of firefly algorithm are done by making these parameters variable and adaptive. In recent research on the modification of firefly algorithm, the parameters α, γ and also r have been modified. The modification of α affects the random movement of the firefly, whereas modifying either γ or r affects the degree of attraction between fireflies. Adjusting the brightness at the origin, β0, has also been done in some studies.

a. Modifying the random movement parameter

To deal with parameter identification of infinite impulse response (IIR) and nonlinear systems, firefly algorithm is modified in [23]. The modification of the random movement is based on initial and final step lengths α0 and α∞, using α := α∞ + (α0 − α∞)e^(−Itr). In addition, a fourth term, given by αε(xi − xb), where xb is the brightest of all the fireflies, is added to the updating process so that firefly algorithm resembles and behaves like particle swarm optimization. In order to implement this modification, the initial and final randomization parameters, α0 and α∞, need to be supplied by the user. The randomized parameter is set to decrease exponentially, and within a few iterations it practically vanishes; for example, if α0 = 1 and α∞ = 0.2, starting from 0.089 in the first iteration it will decrease to 0.0001 in the seventh iteration. Furthermore, the additional term in the updating formula takes the solution xi away from xb with a step length of αε, which contradicts the algorithm's concept of following the current best solution. Assuming that the step length for the new term as well as for the randomness is the same, there is only one additional parameter, either α0 or α∞, in place of the single parameter α.

Another firefly algorithm with adaptive α is presented in [24]. The modification is given by α(Itr) := α(Itr − 1)·(1/(2·ItrMax))^(1/ItrMax). In addition, two new solutions are generated based on three solutions chosen randomly from the population; the best among the two new solutions and xi will replace xi and pass to the next iteration. This approach has also been used to solve the optimal capacitor placement problem [25]. A similar modification of α, with additional mutation and crossover operators, is given in [26].

In [27], the randomized parameter is modified based on the number of iterations, using α = 0.4/(1 + e^(0.005(Itr − ItrMax))). The simulation results on 16 benchmark problems show that the modification increases the performance of the standard firefly algorithm significantly.

In extending firefly algorithm for multiobjective problems, an adaptive α is proposed and used in [28]. Here α is made adaptive based on the iteration number and is given by α := α0(0.9)^Itr. Hence, the step length decreases faster than linearly.
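For illustration, the adaptive schedules for α quoted so far can be generated in a few lines of Python; the initial and final values below are arbitrary choices, and the formulas follow the reconstructions given above.

    import numpy as np

    max_gen = 100
    t = np.arange(1, max_gen + 1)
    a0, a_inf = 1.0, 0.2                      # assumed initial and final values

    alpha_23 = a_inf + (a0 - a_inf) * np.exp(-t)                    # [23]
    alpha_24 = a0 * ((1 / (2 * max_gen)) ** (1 / max_gen)) ** t     # [24-26]
    alpha_27 = 0.4 / (1 + np.exp(0.005 * (t - max_gen)))            # [27]
    alpha_28 = a0 * 0.9 ** t                                        # [28]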

Self-adaptive step firefly algorithm is another modification, done on the third term of the updating process by Yu et al. [29]. The step length α is updated based on historical information and the current situation, using αi(Itr + 1) := 1 − 1/sqrt((fb(Itr) − fi(Itr))² + (hi(Itr))² + 1), where hi(Itr) = 1/sqrt((fi,best(Itr − 1) − fi,best(Itr − 2))² + 1), fb(Itr) = f(xb) and fi(Itr) = f(xi) after iteration Itr, and fi,best(Itr − 1) and fi,best(Itr − 2) are the best performance of solution xi until iterations Itr − 1 and Itr − 2, respectively. Sixteen two-dimensional benchmark problems are used, showing that the proposed approach produces better results with smaller standard deviation. It is a promising idea to control the randomized parameter based on the solution's previous and current performance; the update works in such a way that whenever a solution approaches the brightest firefly its step length decreases. However, since the performance of the solutions needs to be saved, the memory complexity should be studied.

Another study of modification of the random movement parameter based on the historic performance of the solution is presented in [30]. Based on the best position of the solution until the current iteration, xi,best, and the global best solution until the current iteration, xgbest, the step length is updated by αi(Itr + 1) = αi(Itr) − (αi(Itr) − αmin)·e^(−Itr·|xgbest − xi,best|/MaxGen).

b. Modifying the attraction parameters

The attraction of one firefly by another depends on the light intensity at the source of the brighter firefly as well as on the distance between them and the light absorption coefficient of the medium.

A small change in the distance between two fireflies results in a quick decrease of the attraction term. To deal with this problem, Lin et al. [31] introduced a virtual distance which puts r between 0 and 1, defined by r′ = (r − rmin)/(rmax − rmin), where rmin = 0 and rmax = sqrt(Σ_{i=1}^d (xmax(i) − xmin(i))²), for xmax(i) and xmin(i) being the maximum and minimum values of the ith dimension over all fireflies, respectively. Furthermore, β is set as β = β0 γ^(1 − r′). In later iterations, the swarm tends to converge around an optimal solution, meaning that the distance r decreases and so does rmax. However, in most cases the decreasing rate of r is faster than that of rmax, resulting in a slight increase in β. In order to overcome the possibility of the attraction term dominating the random movement term, the authors proposed a new updating equation for later iterations, xi := xi + β(xj − xi)αε. Indeed, this new updating formula omits the random movement of the firefly; the firefly will only move towards a brighter firefly, with a step length of βαε.

Tilahun and Ong [32] suggested that, rather than setting β0 = 1, it should be a function of the light intensity, given by β0 = e^(I0,j − I0,i) for a firefly i attracted to j, where I0,j and I0,i are the intensities of fireflies j and i at r = 0. In addition, moving the brightest firefly randomly may decrease its brightness; hence, a direction which improves the brightness is chosen from m random directions. If no such direction is among these m directions, the brightest firefly stays in its current position. The complexity of the algorithm may increase with respect to the new parameter m, and therefore it should be taken into consideration.

Due to the non-repetition and ergodicity of chaos, it can carry out overall searches at higher speeds. Hence, Gandomi et al. [33] proposed a modification of β and γ using chaos functions. This approach has attracted many researchers, and it has been used in different problem domains. The approach is successfully applied using Chebyshev chaos mapping for MRI brain tissue segmentation in [34], for heart disease prediction using Gaussian mapping [35], for reliability-redundancy optimization [36] and for solving definite integral problems [37]. In [38], chaotic mapping is used for β or γ. In addition, α is made to decrease based on the intensity of the solutions, using α = αmax − (αmax − αmin)(Imax − Imean)/(Imax − Imin), where Imax, Imean and Imin are the maximum, the average and the minimum intensities of the solutions.
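As a sketch of how the chaotic variants work, the following loop drives γ with a logistic map; using the map value directly as γ is an assumption for illustration, and other maps (Chebyshev, Gaussian, Tinkerbell) can be substituted.

    import numpy as np

    beta0, r = 1.0, 0.8        # attraction at r = 0 and a sample distance
    mu, c = 4.0, 0.7           # logistic-map parameter (mu = 4 is fully chaotic)
    for itr in range(50):
        c = mu * c * (1.0 - c)                  # logistic map on (0, 1)
        gamma = c                               # chaotic absorption coefficient
        beta = beta0 * np.exp(-gamma * r ** 2)  # attraction varies chaotically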

Another modification in this category is done in [39], where β is modified using β = (βmax − βmin)e^(−γr²) + βmin, with βmin and βmax being user-supplied values. A similar modification is also done in [40].

c. Modifying both the random and attraction movement parameters

To overcome the challenge arising with an increase in the problem dimension and the size of the feasible region, Yan et al. [41] proposed a modification of the standard firefly algorithm. The modification is done on the generalized distance term given in Eq. (5), in which r^ω is replaced by rK/(n·Range), where K is a constant parameter, n is the dimension of the problem and Range is the maximum range of the dimensions. The parameter α also reduces linearly with iteration, from a starting value α0 to a final value αend. In addition, a firefly is attracted to another firefly if it is brighter and if it is winking. The winking is based on a probability pw = 0.5 + 0.1·count_i, where count_i is the value of the winking state counter of firefly i. The larger the counter, the greater the probability of shifting the state. The maximum counter is five, after which it is reset to 0.

In order to solve the economic dispatch problem, firefly algorithm is modified in [42]. To increase the exploration property, the authors replaced the Cartesian distance by the minimum variation distance. In addition, they used a mutation operator on α, but no explanation of how the mutation works is given.

To deal with premature convergence, firefly algorithm has also been modified based on the light intensity [43]. The light intensity difference is defined by ξ = ΔIij(t)/(max{I} − min{I}) for an iteration t. Based on ξ, a modification is done on γ, β and α as follows: γ = γ0/rmax², where rmax = max{d(xi, xj) | ∀i, j}; β0 = ξ if ξ > η1 and β0 = η1 otherwise, where η1 is a new parameter; and α = α0(0.02·rmax), where α0 = ξ if ξ > η2 and α0 = η2 otherwise, for another new algorithm parameter η2. The modification implies that, of two fireflies, the brighter one will have smaller attraction and randomness step lengths than the dimmer one.

For the optimal sizing and siting of a voltage-controlled distributed generator in a distribution network, firefly algorithm is modified and used in [44]. The problem is to minimize the power loss by selecting optimal locations for distributed generations and the power produced. In the modification, β0 = 1, whereas γ and α are modified based on the problem properties (location and maximum power per location in each iteration). This modification is thus tailored to the problem characteristics: the effectiveness and quality of a solution of a metaheuristic algorithm depend on the proper tuning of the algorithm parameters as well as on the behaviour of the landscape of the objective function.

Another modification of the standard firefly algorithm to be listed in this category is done in [45]. The randomized parameter α has been made adaptive using α = αmax − Itr(αmax − αmin)/MaxGen. Furthermore, the distance between two fireflies is measured not by their locations in the landscape of the feasible region but by their brightness or functional values, using |f(xb) − f(xi)|. Hence, two fireflies with similar performance in the objective function are considered to be near each other.

For path planning of an autonomous underwater vehicle, the parameters γ and α of the standard firefly algorithm are modified by γ = γb + (Itr/MaxGen)(γe − γb) for γe > γb and α = αb + (Itr/MaxGen)(αe − αb) for αe < αb [46]. Furthermore, the updating formula is defined as xi := xi + β(xj − xi) + αrε. As the iteration counter increases, α decreases and γ increases linearly, implying that both the random movement and the attraction decrease as a function of the iteration. Note that, in this updating formula, the random movement is multiplied by the distance.

A similar approach, in which the parameters α, β and γ are encoded in the solution, is proposed in [47]. Unlike in [44], the update is done using ψ := ψ + σψ·N(0, 1), where σψ := σψ·e^(τ′N(0,1) + τN(0,1)) for learning parameters τ, τ′ and ψ ∈ {α, β, γ}. Another modification that can be listed in this category is done in [48]. The parameter γ is modified using γ = γmax − (γmax − γmin)(Itr/MaxGen)², for 2 ≤ γmax ≤ 4 and 0.5 ≤ γmin ≤ 1. In addition, for a new parameter λ, α = αmax − (αmax − αmin)((Itr − 1)/(G0 − 1))^λ, where G0 is the iteration number at which α = αmin. This results in α decreasing quicker than a linear function if λ is in the range (0, 1), linearly if λ = 1 and slower than a linear function if λ > 1. Furthermore, in order to overcome the trapping of the solutions in a local optimal solution, a Gauss distribution is applied to move the brightest solution xb, i.e. xb := xb + xb·N(μ, σ). This is applied if the variance of the solutions over a predetermined number M of preceding iterations is less than a given precision parameter η. The authors also suggested that chaos, particularly cubic mapping, can be used to improve the distribution of the initial solutions.

In [49], γ and α are computed using γ = 0.03|G1| and α = 0.03|G2|, where G1 and G2 are generated from a Gaussian (normal) distribution with mean 0 and variance 1. Supported by two case studies on multivariable proportional-integral-derivative (PID) controller tuning, a similar study was also done in [50]. They used Tinkerbell mapping to tune γ, using γ = |G|·x̄·Itr/MaxGen, where x̄ has normalized values generated by the Tinkerbell map in the range [0, 1]. In addition, α is modified to decrease linearly, using α = (αfinal − αinitial)·Itr/MaxGen + αinitial.

3.2.1.2. Class 2 modifications (new updating mechanisms)

The updating mechanism in the standard firefly algorithm is guided by Eqs. (6) and (7). In Class 1 modifications, the same updating equations are used, but with adaptive parameter values. Class 2 modifications change the updating equations themselves, including modifications of the updating process of the best (brightest) and the worst (dimmest) solutions, changes to parts of the updating equations and modifications that add a mutation operator.

a. Modifying the movement of the brightest or dimmer firefly

In a high-dimensional problem, the exploration is weak, which results in premature convergence. To deal with this, two modifications are proposed in [51] for the standard firefly algorithm. First, for the initial random N solutions, their opposites are generated, and the best N solutions are chosen from the N solutions and their opposites, where the opposite of a number x is given by xmin + xmax − x. Second, the brightest solution xb is updated as follows:

    y = xb
    for i = 1:D (for all dimensions)
      for j = 1:N (for all the solutions)
        y(i) = xj(i)
        if f(y) is better than f(xb), then xb = y
      end for
    end for

Similar to the previous modification, here also the best solution will either improve or remain unchanged in each iteration.
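The two components of [51], the opposition-based initialization and the dimension-wise update of the brightest firefly, can be sketched in Python as follows; the objective function and the bounds are placeholders.

    import numpy as np

    rng = np.random.default_rng(1)
    D, N = 5, 20
    x_min, x_max = -10.0, 10.0
    f = lambda s: np.sum(s ** 2)               # placeholder objective

    # Opposition-based initialization: keep the best N solutions out of a random
    # population and its opposites x_min + x_max - x.
    pop = rng.uniform(x_min, x_max, (N, D))
    union = np.vstack([pop, x_min + x_max - pop])
    pop = union[np.argsort([f(s) for s in union])[:N]]

    # Dimension-wise update of the brightest firefly, as in the pseudocode above:
    # try each coordinate of every solution and keep only improving replacements.
    xb = min(pop, key=f).copy()
    for i in range(D):
        for j in range(N):
            y = xb.copy()
            y[i] = pop[j][i]
            if f(y) < f(xb):
                xb = y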

Opposition-based learning is also used in [52] to update the dimmest solution xw, the solution with the worst performance, using xw := xb if ε < p, and xw := xmin + xmax − xw otherwise, for an algorithm parameter p.

Indeed, it relocates the worst solution to a new position that may give the algorithm a good explorative behaviour.

b. Mutation incorporation

Jumper firefly algorithm is a modified firefly algorithm in which a memory of the performance of each of the solutions is kept [53]. A criterion called the hazard condition is defined, and solutions are tested based on their previous performance. If they are in a hazardous condition, they are randomly replaced by a new solution. Hence, based on the hazard condition, a mutation can be done by replacing a solution that is weak based on its previous performance with a new solution.

Another modification in this category is done by Kazemzadeh-Parsi [54], where in each iteration k random 'newborn' solutions are generated to replace the weak solutions. In addition to this mutation, the k1 highest-ranked solutions from the previous iteration replace the same number of weak solutions. The other modification is that, rather than having consecutive movements of a firefly towards brighter fireflies, a single combined direction, the average (1/l)Σ_{j=1}^{l} xj of the directions towards the brighter fireflies xj, is computed and used. A similar approach is used in [55]. In the first case, where newborn solutions replace weak solutions, the number of such solutions should not be large; otherwise the algorithm behaves like an exhaustive search. In the second case, whenever some solutions are replaced by solutions from the previous iteration, the carried-over solutions will more or less perform the same search behaviour as in the previous iteration.

Another modification of firefly algorithm, introducing new solutions through mutation and crossover, is given in [26, 56]. In addition to the adaptive parameter α, the authors introduced two mutation operators and five crossover operators based on the mutated solutions. The first mutation operator works by combining three randomly selected solutions xq1, xq2 and xq3, different from xi, from the solution set, x_mute1 = xq1 + ε(xq2 − xq3); the second is based on the solution from the first mutation operator and the best and worst solutions for an iteration t, x_mute2 = x_mute1 + εt(xb − xw). Based on these two solutions, five solutions are generated, and the best one among the mutated as well as the five new solutions replaces xi. A similar modification of α and two mutation types are also proposed in [24]. In [57], the parameter α is made adaptive using chaotic mapping and mutation operators.
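A small sketch of the two mutation operators just described, assuming the population is stored as an N × D array; the objective function is a placeholder.

    import numpy as np

    rng = np.random.default_rng(2)
    N, D = 20, 5
    pop = rng.uniform(-10, 10, (N, D))        # assumed firefly population
    f = lambda s: np.sum(s ** 2)              # placeholder objective
    fit = np.array([f(s) for s in pop])
    xb, xw = pop[np.argmin(fit)], pop[np.argmax(fit)]   # best and worst fireflies
    eps_t = rng.random()                      # iteration-dependent random weight

    # First operator: combine three randomly chosen distinct solutions.
    q1, q2, q3 = pop[rng.choice(N, 3, replace=False)]
    x_mute1 = q1 + rng.random() * (q2 - q3)
    # Second operator: perturb towards the best and away from the worst solution.
    x_mute2 = x_mute1 + eps_t * (xb - xw)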

c. New updating strategy

This is another category of Class 2 modification, in which the updating formulas given by Eqs. (6) and (7) are modified or changed. The first modification to mention in this category is the one proposed in [57]. For a firefly i attracted by another firefly j, the search is updated to take place in the vicinity of xj, as given by xi := xj + β(xj − xi) + αε. Furthermore, after the update, only improving solutions are accepted. Since the update is done based on the location of xj, the region between the two solutions will not be explored, and the search will be trapped in a local optimum provided that xj is a local solution and the step lengths are small. Through the iterations, the solutions will be forced into a neighbourhood of the best solution, and the diversity of the solutions will be low.

A similar modification, searching in the vicinity of the brighter firefly, is given in [58]. The authors proposed two updating formulas, 'with division' and 'without division', as they name them. The updating formula without division is given by xi := xj + αε. Once the fireflies are sorted according to their brightness, increasing with their index, the updating formula with division is defined by xi := xj + αjε, which decreases α as the brightness increases and gives a good intensification or exploitation property. In addition, the parameters α and γ are made adaptive, which puts this modification in both Class 1 and Class 2 of our classification. A similar discussion holds for the related work done in [57]. In addition, unlike in [59], where there is an attraction term in the direction from the brighter xj towards xi, meaning that the brighter firefly is moved in a non-promising direction to replace xi, the modification in [58] moves the solution not in a non-promising direction but randomly, and is better in this sense.

For a data clustering problem, the standard firefly algorithm is modified in [60, 61]. The authors proposed a new updating formula to increase the influence of the brightest firefly, given by xi := xi + β(xj − xi) + β0 e^(−γ r_{i,gbest}²)(xgbest − xi) + αε. This means that a firefly is attracted not only to brighter fireflies but also to the best solution found so far, xgbest. Suppose there are l fireflies brighter than xi; at the end of the iteration, the accumulated attraction term will be Σ_{j=1}^{l} β(xj − xi) + l·β0 e^(−γ r_{i,gbest}²)(xgbest − xi). Hence, repeatedly moving a firefly towards the best solution increases the attraction step length, which, depending on the feasible region, usually may not be acceptable. Furthermore, the best solution found so far can be a local optimal solution, and the update may then force the solutions to follow that local solution rather than exploring other regions in each loop of the iteration. In [61], four ten-dimensional and four twenty-dimensional benchmark problems, along with the Iris data set, are used for clustering. A similar modification is done in [62]; in addition to updating α in a decreasing manner, the updating formula for a solution xi, based on the brighter firefly xb and the best solution gbest from memory, with a new algorithm parameter λ, is modified as xi := xi + β0 e^(−γ rij²)(xj − xi) + β0 e^(−γ r_{i,best}²)(xb − xi) + λε(xi − gbest) + αε.

Fuzzy firefly algorithm is another modification of the standard firefly algorithm [63]. Even though the authors start with the incorrect claim that 'in the standard firefly algorithm, only one firefly in each iteration can affect others and attract its neighbours', they try to increase the exploration property of the algorithm by adding an additional term in which the top k fireflies attract according to xi := xi + [β0 e^(−γ e_ij²)(xj − xi) + Σ_{h=1}^{k} A(h)·β0 e^(−γ e_ih²)(xh − xi)]αε, where A(h) = f(xb)/(l(f(xh) − f(xb))), with l being a new algorithm parameter. The effect of the best k fireflies is doubled, and since this updating mechanism is applied for each brighter firefly xj, their effect is more than doubled. Furthermore, multiplying the bracketed expression by the random term affects the step lengths and removes the random movement. Hence, it forces the fireflies to follow the best k solutions, and exploring other regions is not possible with this update.

Another modification with a new updating formula is proposed in [64], given by xi := xi + β(xj − xi) + αε(xmax − xmin) if ε > 0.5(MaxGen − Itr)/MaxGen, and xi := (1 − η)xi + ηxb otherwise, where β0 is computed based on the location of xi normalized by the locations of the fireflies in the search space, γ is computed in direct relation to β0 with two additional parameters, and η is a value based on the difference between the locations of the fireflies.

Diversity-guided firefly algorithm is one of the recent modified versions [65]. The modification is done to keep the solutions as diverse as possible with respect to a given threshold. The updating mechanism of the standard firefly algorithm is used until the diversity of the solutions falls below the given threshold. The diversity is measured by (1/(N·L))Σ_{i=1}^{N} ||xi − x̄||, where L is the longest diagonal of the feasible region and x̄ is the average position of all fireflies. If the diversity is less than the predefined threshold value, then the updating formula becomes xi := xi + β(xj − xi) + αε(xi − x̄). The proposed modification is effective in diversifying the solutions, as it replaces the random movement by a move of the solutions away from the average position of the fireflies.
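A sketch of the diversity measure and the resulting switch of the random term; the threshold value and the exact switching logic are assumptions for illustration.

    import numpy as np

    def diversity(pop, L):
        # Mean distance of the fireflies from their centroid, normalized by the
        # swarm size N and the longest diagonal L of the feasible region.
        x_bar = pop.mean(axis=0)
        return np.sum(np.linalg.norm(pop - x_bar, axis=1)) / (len(pop) * L)

    def random_term(pop, i, alpha, eps, threshold, L):
        # Below the threshold, the random move pushes x_i away from the centroid,
        # as in the modified updating formula above; otherwise the standard
        # random move of Eq. (6) is used.
        if diversity(pop, L) < threshold:
            return alpha * eps * (pop[i] - pop.mean(axis=0))
        return alpha * (eps - 0.5)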

In [66, 67], a mutated firefly algorithm is proposed, in which the brighter firefly donates some of its features based on a new algorithm parameter called the probability of mutation, pm. The features and the amount copied from the brighter firefly are not specified; however, based on the context, it seems that some components of the vector xi are replaced by the corresponding components of the brighter firefly xj. In [66], this mutation operator replaces the updating formula given in Eq. (6), whereas in [67], the mutation is done after the update has taken place.

In [68], a firefly located at xi first checks the directions towards the other, brighter fireflies and looks for one that improves its performance; if such a firefly exists, xi moves towards it so that its brightness increases. Checking the direction towards each of the solutions may increase the complexity. Furthermore, in order to escape a local solution, a solution should sometimes be allowed to decrease its performance; hence, this modification is highly affected by local optimum solutions, especially in misleading problems.

Another modification in this category is introduced to deal with the economic dispatch optimization of thermal units [69]. A memory is used to record the best solution found so far. Based on cultured differential evolution, the updating formula is modified as xi := xi + aβ(gbest − xi) + bα(ε() − 0.5), where a = (f(xi) − f(gbest))/(fmax − fmin) and b = xmax − xmin, for xmax and xmin being the maximum and minimum components of the vector x, respectively.

Another modification in this category is presented in [70]. The updating formula becomes xi := wxi + β(xj − xi) + αε() for a weighting parameter w given by w = wmax − (wmax − wmin)·Itr/MaxGen.

3.2.1.3. Class 3 modifications (change in search space or probability density functions)

This class of modifications works at a more abstract level and includes two types of modification. The first is changing the solution space to an easier search space, and the second concerns the type of probability distribution used to generate a random vector direction for the random movement.

a. Change in search space

In the modified version presented in [71], each component of a solution is represented by a quaternion, xi(k) = (y1(i), y2(i), y3(i), y4(i)) for all components k of xi, and the updating is done over the quaternion space. In order to compute the brightness, the Euclidian space is recovered by mapping the quaternion space to the search space through the norm function, xi(k) = ||(y1(i), y2(i), y3(i), y4(i))||. Even though the search space increases fourfold, it is interesting to zoom into each component and perform the search for the optimal solution there. However, since a norm is used to convert the quaternion space to the search space, a mechanism to deal with negative values should be studied, and more mathematical support should be provided, along with a complexity study.

b. Change in probability distribution function

Perhaps the first work which tries to adapt the random movement in the updating process is by Farahani et al. [72, 73]. They start with the incorrect claim that 'in standard Firefly algorithm, firefly movement step length is a fixed value, so all the fireflies move with a fixed length in all iterations', ignoring the random variable ε that makes the step length vary between 0 and α. They updated the step length to decrease with iteration and introduced a new update applied to each solution, xi := xi + αε(1 − p), where p is a random vector from a Gaussian distribution. This increases the randomness property of the algorithm, as each solution moves randomly once more after the usual updating equation. The same modification is also employed in [74].

By enhancing the random movement of firefly algorithm, Levy firefly algorithm is introduced in [75]. This is the first modification of firefly algorithm in which the Levy distribution guides the random movement, generating a random direction as well as the step length. The updating formula is modified as xi := xi + β(xj − xi) + α·sign(ε − 0.5)⊗Levy, where ⊗ indicates a component-wise multiplication between the random vector from the Levy distribution and the sign vector. Similarly, in [76], the Levy distribution is used to guide the random term of the updating formula. In addition, the parameter α is made to decrease with iteration, using α = αmax/Itr². Furthermore, what the authors call an information exchange between top fireflies is performed: two solutions are randomly chosen from the top fireflies, and a new solution is generated on the line joining the two fireflies, near the brighter one. A similar approach using the Levy distribution, with the step length generated using a chaotic random number, has applications in image enhancement [77]. The same update using the Levy distribution and the same formula for α are used in [78]. In addition to these updates, a communication between top fireflies is used in [79]. The Levy distribution, along with other probability distributions, is suggested for the randomized parameter and used in [80, 81].
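Levy-distributed steps are commonly drawn with Mantegna's algorithm; the sketch below is one standard way to generate such steps and is not taken from the cited papers themselves.

    import numpy as np
    from math import gamma as G, pi, sin

    def levy_step(d, lam=1.5, rng=np.random.default_rng()):
        # Mantegna's algorithm for symmetric Levy-stable steps of index lam.
        sigma = (G(1 + lam) * sin(pi * lam / 2)
                 / (G((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
        u = rng.normal(0.0, sigma, d)
        v = rng.normal(0.0, 1.0, d)
        return u / np.abs(v) ** (1 / lam)

    # Random part of the Levy firefly update quoted above, for step length alpha:
    # x_i += beta * (x_j - x_i) + alpha * np.sign(eps - 0.5) * levy_step(d)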

3.2.2. Modifications for problems with non-continuous variables

Even though firefly algorithm was introduced for continuous problems, due to its effectiveness it has been modified for non-continuous problems as well. In this section, we will look at three classes of modification. The first covers modifications made to solve binary problems. The second is for integer-valued problems, which include problems whose variables can take discrete values. The last one concerns mixed problems, in which some of the variables are continuous and the rest are non-continuous.

3.2.2.1. Modifications for binary problems

To deal with the set covering problem, a binary firefly algorithm is proposed in [82]. There is no modification in the updating process except converting the solution components to either one or zero. Three ways of conversion are proposed in [82], and the conversion works dimension-wise. After a solution xi is updated, for each component p of xi, three rules based on a transfer function T, which maps the new value of xi(p) into the interval [0, 1], are given, along with eight transfer functions. The first rule of conversion sets xi(p) = 1 if ε < T(xi(p)) and xi(p) = 0 otherwise. The second sets xi(p) = (xi(t)(p))⁻¹ if ε < T(xi(t+1)(p)) and xi(p) = xi(t)(p) otherwise, where (xi(t)(p))⁻¹ is the complement of xi(t)(p), i.e. if xi(t)(p) = 1 then (xi(t)(p))⁻¹ = 0, otherwise (xi(t)(p))⁻¹ = 1, and xi(t)(p) and xi(t+1)(p) are the pth components of xi from the previous iteration and after the update, respectively. The last rule sets xi(p) = x*(p) if ε < T(xi(t+1)(p)) and xi(p) = 0 otherwise, where x* is the best solution from memory.

Another modification for binary problems, which also works dimension-wise, is presented in [83]. The updating formula of the standard firefly algorithm is used. After the update, each component is mapped into the interval [0, 1] using the tanh(xi(p)) function for the pth component of solution xi; if the result is greater than a user-defined threshold value, then xi(p) = 1, otherwise it is set to 0. Similar works using a sigmoid function are given in [84, 85]. In [86–88], the tangent hyperbolic sigmoid function is used for discretizing the solution and also in the updating process, using xi := xi + β0 e^(−γr²)(xj − xi) + αε if ε < |tanh(λr)|, and xi unchanged otherwise, where λ is a parameter close to one.
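The two discretization rules described above can be sketched as follows; the sigmoid is only one of the eight transfer functions T of [82], and taking the absolute value before tanh in the second rule is an assumption to keep the squashed value in [0, 1].

    import numpy as np

    def binarize_transfer(x_real, rng):
        # First rule of [82]: component p becomes 1 when a uniform random
        # number falls below T(x(p)); here T is a sigmoid transfer function.
        T = 1.0 / (1.0 + np.exp(-x_real))
        return (rng.random(x_real.shape) < T).astype(int)

    def binarize_threshold(x_real, threshold=0.5):
        # Rule of [83]: squash each component with tanh and compare against a
        # user-defined threshold.
        return (np.tanh(np.abs(x_real)) > threshold).astype(int)

    bits = binarize_transfer(np.array([-1.2, 0.3, 2.5]), np.random.default_rng(0))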

Another discrete firefly algorithm, designed to deal with a job scheduling problem, is proposed in [89]. Each firefly xi in the problem has two classes of index, xi(p, q), where p represents the jobs and q their priority in that particular firefly. In order to change real values to binary after the update, the sigmoid function 1/(1 + e^(−xi(p,q))) is used. Based on the values of the sigmoid function for each job p, the priority q with the highest probability is assigned the value 1, and the remaining priorities for that particular job are set to 0.

In [90], firefly algorithm has been modified for a dynamic knapsack problem. The conversion of the solutions is done based on the properties of the problem, using priority-based encoding. In addition to making the algorithm suit the problem, some modifications are done to increase its effectiveness. One of the modifications is that a firefly i moves towards firefly j if j is brighter and ε < ranki·mod(Itr − 1, MaxGen)/MaxGen, where ranki is the rank of firefly i in the solution population. If the condition is not met, i.e. if ε ≥ ranki·mod(Itr − 1, MaxGen)/MaxGen, no updating mechanism is mentioned in the paper. A similar modification is used in [91]. In addition to the discretization done in [90], the authors in [91] proposed an adaptive step length given by α := [1 − φ·e^(mod(Itr/(Itr+1), 1))]α for a scaling parameter φ. Furthermore, after the updates, two additional moves are introduced. The first is a random flight by 10% of the top fireflies with probability 0.45; the move is accepted only if it is improving. The second is a local search around xb: after 10% of the iterations, xb performs a local search, and the update is accepted if it is improving. The additional local searches help to improve the quality of the solution. Furthermore, the authors also mention that chaotic mapping can be used to generate random numbers.

Another modification in this category is presented in [92]. In addition to the discretization, the authors made α and γ adaptive, using α = αmax − Itr(αmax − αmin)/MaxGen and γ = γmax·e^((Itr/MaxGen)·ln(γmin/γmax)). Furthermore, the random movement is replaced by αL(xb)|xi − xb| for a random number L(xb) from a Levy flight centred at xb. Three discretization methods, the sigmoid, the elf function and the rounding function, are used to change values into the range [0, 1], along with three updating processes. The first is done in the continuous space, and a sigmoid function is used to change the results to binary; the second is done in the discrete space, where the discretized results are used directly; and the third, instead of using the updating formula, uses a probabilistic method based on the sigmoid function.

3.2.2.2. Modifications for integer optimization problems

In [93], firefly algorithm has been modified to deal with software modularization as a graph-partitioning problem. Initially, random integer-encoded solutions are generated. The Hamming distance, the number of differing entries between two solutions with the same index, is used to measure the distance between two solutions. The update is done by switching a number of entries of a firefly to the entries from a brighter firefly.

Another modification in this category is done in [94]. The modification is based on the concept of random keys, proposed in [95]. The method uses a mapping of a random number space, [0, 1]^D, to the problem space.

In [96, 97], the standard firefly algorithm is modified for loading pattern enhancement. Random solutions are generated using random permutations, and the distance between fireflies, d(xi, xj), is measured using the Hamming distance. The updating process is separated and made sequential: first the β step, a move due to the attraction, and then the α step, a move due to the random movement. In the β step, entries with the same value and index in both fireflies, xi and xj, are preserved, and then an entry is copied from xj if ε < β, where β = 1/(1 + γd(xi, xj)²); otherwise the xi entry is used. The α step is done using xi := Int(xi + αε), with a swapping mechanism to preserve feasibility. A similar modification of sequentially applying the β step and the α step is also used in [98], with additional modifications of β and α using β = e^(−(max{Pi} − Pij)²/max{Pi}) for Pij = ε + 1 − |ranki − rankj| and α = D − D·Itr/MaxGen. It is a good idea to adapt and increase the step length with the dimension of the problem. For instance, when D = 12, the step length α will start from 11 and decrease to zero in the last iterations. However, if the feasible region lies in [0, 4], then for more than 60% of the iterations α will be at least 4; hence, the moves in the first 60-plus per cent of the iterations are not acceptable, because they force the solution out of the feasible region. Hence, the modification needs to consider the size of the feasible region. Another modification similar to [97] is given in [99], with an additional mechanism to keep the best solution and use it in the updating process: based on ρ = 0.5 + 0.5·Itr/MaxGen, if ε > ρ, the brighter firefly xj is replaced by the global best from memory.
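The β step for permutation-coded fireflies described above can be sketched as follows; the repair of duplicate entries needed to restore a valid permutation (the swapping mechanism mentioned above) is omitted.

    import numpy as np

    def beta_step(x_i, x_j, gamma, rng):
        # Entries agreeing in both fireflies are preserved; each disagreeing
        # entry is copied from the brighter firefly x_j with probability beta.
        d = np.sum(x_i != x_j)                  # Hamming distance
        beta = 1.0 / (1.0 + gamma * d ** 2)
        out = x_i.copy()
        for k in range(len(x_i)):
            if x_i[k] != x_j[k] and rng.random() < beta:
                out[k] = x_j[k]
        return out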

For the travelling salesman problem, firefly algorithm has been modified in [100]. Initial solutions are generated using permutations of D, and each solution is represented as a string of chromosomes of these numbers. The distance between two solutions is computed using r = 10A/D, where A is the number of differing arcs. The movement is done by randomly selecting the length of the movement between 2 and r and then using inversion mutation towards a better solution; if there is no better solution, a random move is done. Each firefly produces m solutions, and the best N solutions are chosen to pass to the next generation.

Another modification in this category is proposed in [101]. The decision variables xi represent an assembly sequence. In the update, the random movement is omitted, and the attraction move is done in the discrete space. The attraction direction is computed for each dimension k using sji(k) = xj(k) if xj(k) ≠ xi(k) and sji(k) = 0 otherwise, and the update is done by xi := xi + Sji, where Sji = sji if α|ε − 0.5| < β0 e^(−γr²) and Sji = 0 otherwise. In addition, a visual range, dv, which restricts a firefly to being influenced only by fireflies within its visual range, is introduced. The visual range is computed by dv = 3Itr(dvmax − dvmin)/(2(MaxGen − 1)) + dvmin if Itr < (2/3)MaxGen, and dv = dvmax otherwise, for parameters dvmax and dvmin. This means that a firefly is not attracted to just any brighter firefly, but only to a brighter firefly within its visible range.

Firefly algorithm has been discretized for a supply selection problem in [102]. The sum of the absolute differences between the entries is used to measure the distance, r = Σ_{k=1}^{D} |xj(k) − xi(k)|. In addition, the movement is modified based on the properties of the problem, using rounding up for the step length. In most cases, a modification specific to a problem is effective for that particular problem or class of problems but hard to generalize to other domains; it would be interesting to generalize the approach and test it in other problem domains as well.

3.2.2.3. Modifications for mixed optimization problems

Perhaps the first modification of the standard firefly algorithm in this category is presented in [103]. The updating of the solutions is conducted using the updating mechanism of the standard firefly algorithm, and a constraint handling mechanism based on a penalty function is used to deal with the discrete variables. In addition, the authors proposed two ways to generate a diverse set of random initial solutions. An adaptive random step length is also proposed, using a similar updating approach, in [104]. The same approach is improved in [105] by adding a scaling parameter for the random movement based on the difference between the maximum and minimum values of each variable. Portfolio optimization can be expressed as a mean-variance problem, which belongs to the group of quadratic mixed-integer programming problems. In [106, 107], firefly algorithm has been extended with the use of a rounding function and a constraint handling approach; Deb's method [108] is also used for constraint handling. In addition, α is modified using α := α[1 − (1 − (10⁻⁴/0.9)^(1/MaxGen))].

3.3. Discussion

Like any metaheuristic algorithm, firefly algorithm is sensitive to its parameter values, and it has been noticed that changing the parameters based on the search state is effective. Hence, modification of the parameters is a straightforward way to improve the performance of firefly algorithm. As the search proceeds, in order to achieve convergence with good precision, the random movement must decrease. Hence, the randomness step length, α, is modified to be adaptive, with its value decreasing over the iterations [23–28, 41, 45, 46, 48]. Figure 1 shows the graph of these modifications.

Figure 1.

With initial and final values of 2.5 and 0.4: α1 [23], α2 [24–26], α3 [27], α4 [28], α5 [41], α6 [45, 46, 50], α7 [48] with λ = 0.4 and α8 [48] with λ = 2.1.

The decreasing scenario for α starting from the first iteration may not always be a good idea. Perhaps it is better to keep α constant for a number of iterations and to start the decreasing scenario based on the performance of the solutions, especially when the solutions approach an optimum point; this can be one of the possible ideas for future work. Some modifications make the parameter α adaptive based on the performance of the solution [29, 38, 43]. Some of the modifications also involve a random term, so that α neither decreases nor increases monotonically [47, 49, 59]. In addition, other approaches, such as encoding the parameters in the solution [109] and modifying the parameters based on the problem [44], have also been proposed.

The attraction term has also been modified in different ways. An adaptive light absorption coefficient of the medium, changing with iteration, is given in some studies. These modifications use an increasing function [46], a decreasing function [43, 67] or a function of γ that is neither increasing nor decreasing [33–38, 49, 50]. The modification is neither increasing nor decreasing especially when a chaotic distribution [33–38] or a normal distribution [34, 49, 50] is used to compute the update. Increasing γ implies a decrease of the attraction step length, and its decrease implies an increase in the step length. The attraction step length β has also been modified directly; a chaotic mapping is used to modify β in some of the studies [33–38]. It should be noted that using a chaotic map for γ, or updating γ in any way, effectively updates β as well. For instance, Figure 2 shows that when γ is updated using a logistic map, the resulting β is also chaotic.

Hence, γ and β should not both be updated at the same time. In addition to updates through γ, β has also been updated based on minimum and maximum values [39, 40], on the location of the solution [31] and on the light intensity of the solutions [32]. In addition, different approaches are used to measure the distance between the fireflies [31, 41, 45]. Care should be taken here, as a very big step length may take the solution far away from the brighter solution and possibly out of the feasible region. The properties of the random step should also be properly tuned in agreement with the attraction; otherwise, the random movement may dominate the attraction step length.

Figure 2.

The effect of a chaotic map update of γ on β [34].

The movement of the best solution should be tuned properly. If its brightness is allowed to decrease, then the best performance found may get lost. Hence, the approaches used to preserve the best solution are indeed effective [32, 52].

Mutation is another good approach to diversify the solutions, which in turn increases the exploration behaviour of the algorithm [24, 53–58]. However, generating many solutions may hinder the search, as it will take long to run. In addition, accepting weaker solutions should also be incorporated for deceptive problems; a solution sometimes needs to get worse in order to escape local solutions.

Modifying the updating equation is another interesting modification featured in some studies [56, 58–61, 63–67, 69]. Some of these studies suggest that the update should be done in the vicinity of the brighter firefly [58, 60]; this is not always a good idea, as the region between the two solutions will not be explored. Some of the studies increase the attraction towards brighter fireflies [61, 64]; increasing the step length of the attraction may let it dominate the random movement or even take the solution out of the feasible region. A memory is utilized to save the best solution found, and an additional attraction term towards that global solution is added in [63, 69]; this is a good idea, as the best solution will not be lost through the iterations. To increase the diversity of the solutions, an effective modification is proposed in [110]. Using this kind of modification, the diversity of the solutions is preserved, and the exploration behaviour of the algorithm is improved.

Basically, two updating strategies are proposed for the non-continuous case. The first uses the same updating formula and changes the results to discrete values afterwards [82, 94, 104]; the second modifies the updating formula to work directly on the discrete space [97–101]. The first approach is susceptible to being trapped in a local solution and missing the optimal solution: the optimal solution of the continuous version of a discrete problem may not be an optimal solution of the discrete problem, so the algorithm will tend to converge to the optimum of the continuous version. Hence, the second approach has an advantage in such cases.

4. Simulation results

The comparison is performed between the standard firefly algorithm and the non-parametric modified versions, i.e. Class 2 and Class 3 modifications. The modified versions selected for simulation are based on two criteria: first, the modification should be clearly described, and second, it should introduce only a small number of new parameters. In some of the modifications, a number of new algorithm parameters are introduced, and tuning these parameters by itself needs another study, so they are not included in the simulation. The modified versions used for the simulation are FFA1 [32], FFA2 [52], FFA3 [53], FFA4 [26, 57], FFA5 [24, 59], FFA6 [58], FFA7 [60], FFA8 [61, 62], FFA9 [69], FFA10 [63] (where xi − gbest is replaced by gbest − xi), FFA11 [110], FFA12 [72–74], FFA13 [75–79] and FFA14 [70].

4.1. Benchmark problems and simulation setup

Five benchmark problems are selected from different categories, as presented in Table 2. The simulations are performed on an Intel® Core™ i3-3110M CPU @ 2.40 GHz under a 64-bit operating system, using MATLAB 7.10.0 (R2010a). The algorithm parameters are set as given in Table 2 for dimensions 2 and 5.

4.2. Simulation results and discussion

The simulation results, as presented in Table 3, show that some of the algorithms are very expensive in terms of computational time but give a good result, whereas others have small running times. For instance, on the second problem with dimension 2, FFA4 outperforms all the others on average, with an average CPU time of 8.8, whereas FFA1 and FFA2 give a good approximation with a much smaller average CPU time. In general, it can be seen that FFA4 is very effective, but not in terms of computational time; FFA1 and FFA2 give good approximate results with a smaller CPU time than FFA4. However, when the dimension increases, FFA2 outperforms FFA1, perhaps due to the fixed random direction m used for all the simulations.

| Problem | Ref. | Properties of the problem | Set-up, D = 2 | Set-up, D = 5 |
|---|---|---|---|---|
| $f_1(\mathbf{x}) = -e^{-\sum_{i=1}^{D} x_i^2/2} - e^{-\sum_{i=1}^{D} (x_i-15)^2/6}\, \prod_{i=1}^{D} \cos^2 x_i$, $-20 \le x_i \le 20$ | [111] | Multimodal, continuous, differentiable, non-separable | $N = 50$, $\alpha = 4$, $\gamma = 2$ | $N = 200$, $\alpha = 5$, $\gamma = 2$ |
| $f_2(\mathbf{x}) = \sum_{i=1}^{D} \lvert x_i \sin x_i + 0.1 x_i \rvert$, $-10 \le x_i \le 10$ | [112] | Multimodal, continuous, non-differentiable, separable, non-scalable | $N = 25$, $\alpha = 3$, $\gamma = 2$ | $N = 100$, $\alpha = 4$, $\gamma = 2$ |
| $f_3(\mathbf{x}) = \sum_{j=1}^{D} \left[ \sum_{i=1}^{5} p(i)\, x_j^{5-i} \right]$ with $p = [0.03779, 0.84056, 1, 4.4, 27.134]$, $-1 \le x_i \le 12$ | [113] | Multimodal, discontinuous, non-differentiable, separable | $N = 20$, $\alpha = 1.5$, $\gamma = 2$ | $N = 100$, $\alpha = 3$, $\gamma = 2$ |
| $f_4(\mathbf{x}) = -200\, e^{-0.02 \sqrt{\sum_{i=1}^{D} x_i^2}}$, $-32 \le x_i \le 32$ | [112] | Unimodal, continuous, differentiable, non-separable, non-scalable | $N = 60$, $\alpha = 6$, $\gamma = 2$ | $N = 250$, $\alpha = 7$, $\gamma = 2$ |
| $f_5(\mathbf{x}) = \sum_{i=1}^{D} \varepsilon_i \lvert x_i \rvert^{i}$, $-5 \le x_i \le 5$ | [112] | Unimodal, continuous, non-differentiable, separable, scalable, stochastic | $N = 20$, $\alpha = 1.5$, $\gamma = 2$ | $N = 70$, $\alpha = 2.5$, $\gamma = 2$ |

Table 2.

Selected benchmark problems and simulation set-up.
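
To make the benchmark definitions concrete, here is a minimal Python implementation of two of the Table 2 problems, $f_2$ and $f_4$, as minimization problems. The vectorized NumPy form and the function names are ours; the formulas follow the table above.

```python
import numpy as np

def f2(x):
    """Alpine-type benchmark: multimodal, non-differentiable, separable."""
    x = np.asarray(x, dtype=float)
    return np.sum(np.abs(x * np.sin(x) + 0.1 * x))

def f4(x):
    """Ackley-type benchmark: unimodal, with minimum -200 at the origin."""
    x = np.asarray(x, dtype=float)
    return -200.0 * np.exp(-0.02 * np.sqrt(np.sum(x**2)))

print(f2(np.zeros(2)), f4(np.zeros(2)))  # 0.0 -200.0
```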

Entries are the mean μ with the standard deviation σ in parentheses, for the best objective value f(x*) and the CPU time.

| Algorithm | Metric | f1, D=2 | f1, D=5 | f2, D=2 | f2, D=5 | f3, D=2 | f3, D=5 | f4, D=2 | f4, D=5 | f5, D=2 | f5, D=5 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| FFA | f(x*) | −0.0195 (0.1002) | 0.00 (0.00) | 0.6696 (0.4483) | 0.0504 (0.0135) | −6.6745 (1.2874) | −6.5854 (3.6570) | −193.75 (3.4097) | −166.46 (7.3072) | 0.0158 (0.0148) | 0.0477 (0.0373) |
| FFA | CPU | 1.4 (0.3) | 1.4 (0.0) | 0.3 (0.1) | 0.1 (0.0) | 0.2 (0.1) | 2.7 (0.1) | 1.8 (0.4) | 8.4 (2.5) | 0.2 (0.0) | 0.3 (0.1) |
| FFA1 | f(x*) | −0.7185 (0.4220) | 0.00 (0.00) | 0.0039 (0.0085) | 0.0109 (0.0052) | −7.6507 (0.00) | −17.135 (1.8541) | −199.61 (0.1757) | −195.44 (1.1286) | 0.0005 (0.0006) | 0.0002 (0.0003) |
| FFA1 | CPU | 1.5 (0.3) | 1.5 (0.1) | 0.4 (0.1) | 0.1 (0.0) | 0.3 (0.1) | 2.8 (0.2) | 1.8 (0.4) | 8.8 (2.4) | 0.3 (0.0) | 0.3 (0.0) |
| FFA2 | f(x*) | −0.7974 (0.4028) | 0.00 (0.00) | 0.0003 (0.0003) | 0.0000 (0.0000) | −7.6507 (0.00) | −19.126 (0.0) | −199.98 (0.0136) | −199.98 (0.0076) | 0.0047 (0.0092) | 0.0048 (0.0102) |
| FFA2 | CPU | 2.6 (0.4) | 3.5 (0.3) | 1.1 (0.1) | 0.1 (0.0) | 0.4 (0.1) | 4.5 (0.4) | 2.2 (0.5) | 11.3 (2.9) | 0.4 (0.1) | 0.9 (0.2) |
| FFA3 | f(x*) | −0.0128 (0.0903) | 0.00 (0.00) | 0.4402 (0.3768) | 0.0397 (0.0158) | −7.2621 (0.7406) | −8.6751 (2.9248) | −195.20 (2.6004) | −177.04 (6.5591) | 0.0085 (0.0085) | 0.0135 (0.0125) |
| FFA3 | CPU | 1.3 (0.2) | 1.4 (0.1) | 0.3 (0.1) | 0.1 (0.0) | 0.2 (0.1) | 2.6 (0.3) | 1.6 (0.3) | 8.1 (2.2) | 0.2 (0.0) | 0.2 (0.1) |
| FFA4 | f(x*) | −1.00 (0.00) | −1.00 (0.00) | 0.00 (0.00) | 0.00 (0.00) | −7.6507 (0.00) | −17.977 (1.9499) | −200 (0.0) | −200 (0.0) | 0.00 (0.00) | 0.00 (0.00) |
| FFA4 | CPU | 44.2 (6.7) | 60.4 (3.8) | 8.8 (0.6) | 1.6 (0.2) | 4.2 (0.3) | 126.5 (47) | 52.56 (10.35) | 574.8 (96.4) | 4 (0.4) | 5 (0.8) |
| FFA5 | f(x*) | −0.0372 (0.1711) | 0.00 (0.00) | 0.0078 (0.0169) | 0.0113 (0.0038) | −7.6507 (0.00) | −17.610 (1.2629) | −199.58 (0.2164) | −194.51 (1.3993) | 0.0013 (0.0013) | 0.0009 (0.0007) |
| FFA5 | CPU | 37.6 (5.8) | 51.0 (2.4) | 7.8 (0.8) | 1.4 (0.1) | 3.6 (0.3) | 102.5 (43.1) | 45.6 (9.3) | 493.6 (80.1) | 3.4 (0.3) | 4.2 (0.8) |
| FFA6 | f(x*) | 0.0000 (0.00) | 0.00 (0.00) | 0.7534 (0.5307) | 0.0277 (0.0091) | −7.6507 (0.00) | −13.323 (0.8509) | −198.43 (0.7722) | −193.95 (1.4193) | 0.0051 (0.0045) | 0.0014 (0.0008) |
| FFA6 | CPU | 1.4 (0.2) | 1.5 (0.1) | 0.3 (0.1) | 0.1 (0.0) | 0.3 (0.1) | 2.7 (0.3) | 1.6 (0.3) | 8.6 (2.5) | 0.2 (0.1) | 0.3 (0.1) |
| FFA7 | f(x*) | 0.0000 (0.00) | 0.00 (0.00) | 0.0664 (0.0330) | 0.0007 (0.0002) | −7.6507 (0.00) | −12.286 (1.2151) | −199.98 (0.0136) | −196.05 (1.8438) | 0.0001 (0.0002) | 0.0000 (0.0001) |
| FFA7 | CPU | 0.7 (0.1) | 0.8 (0.1) | 0.2 (0.1) | 0.1 (0.0) | 0.2 (0.1) | 1.7 (0.2) | 0.9 (0.2) | 4.0 (1.3) | 0.1 (0.0) | 0.2 (0.0) |
| FFA8 | f(x*) | 0.0000 (0.00) | 0.00 (0.00) | 0.7113 (0.5634) | 0.0499 (0.0149) | −7.2207 (0.8647) | −6.2323 (3.2674) | −193.81 (3.7645) | −167.58 (6.8116) | 0.0084 (0.0085) | 0.0425 (0.0372) |
| FFA8 | CPU | 3.6 (0.6) | 6.8 (0.1) | 1.3 (0.1) | 0.1 (0.0) | 0.9 (0.1) | 7.6 (0.7) | 4.9 (0.8) | 29.4 (8.5) | 0.8 (0.1) | 1 (0.1) |
| FFA9 | f(x*) | −0.0303 (0.15) | 0.00 (0.00) | 1.1369 (0.6702) | 0.1115 (0.0251) | −5.4108 (1.7004) | −1.0048 (13.039) | −188.39 (5.3655) | −104.37 (12.818) | 0.0486 (0.0399) | 0.0951 (0.0915) |
| FFA9 | CPU | 0.3 (0.1) | 0.2 (0.0) | 0.2 (0.1) | 0.0 (0.0) | 0.2 (0.1) | 0.5 (0.1) | 0.4 (0.1) | 0.7 (0.1) | 0.1 (0.0) | 0.1 (0.0) |
| FFA10 | f(x*) | 0.0000 (0.00) | 0.00 (0.00) | 1.0466 (0.6202) | 0.0403 (0.0203) | −6.7824 (1.0078) | −13.245 (1.6688) | −175.52 (12.883) | −172.68 (10.459) | 0.0064 (0.0055) | 0.0014 (0.0012) |
| FFA10 | CPU | 3.5 (0.6) | 11.1 (0.1) | 1.2 (0.1) | 0.1 (0.0) | 0.9 (0.1) | 9.3 (0.9) | 4.7 (0.9) | 64.2 (18.1) | 1.0 (0.1) | 1.4 (0.2) |
| FFA11 | f(x*) | 0.0000 (0.00) | 0.00 (0.00) | 0.6357 (0.4193) | 0.0514 (0.0129) | −6.7098 (1.0934) | −6.1737 (3.3915) | −194.49 (3.3451) | −165.94 (7.3183) | 0.0208 (0.0186) | 0.0471 (0.0324) |
| FFA11 | CPU | 1.3 (0.3) | 1.4 (0.1) | 0.3 (0.1) | 0.1 (0.0) | 0.3 (0.1) | 2.7 (0.3) | 1.7 (0.3) | 8.3 (2.3) | 0.2 (0.1) | 0.3 (0.1) |
| FFA12 | f(x*) | 0.0000 (0.00) | 0.00 (0.00) | 0.6611 (0.5103) | 0.0493 (0.0141) | −6.5986 (1.3430) | −6.8659 (4.0703) | −193.96 (3.4841) | −169.34 (8.7051) | 0.0161 (0.0160) | 0.0595 (0.0452) |
| FFA12 | CPU | 1.5 (0.2) | 1.5 (0.1) | 0.4 (0.1) | 0.1 (0.0) | 0.3 (0.1) | 2.9 (0.3) | 1.8 (0.4) | 8.4 (2.4) | 0.3 (0.0) | 0.3 (0.1) |
| FFA13 | f(x*) | 0.0000 (0.00) | 0.00 (0.00) | 2.1137 (1.4840) | 0.0977 (0.0315) | −3.3360 (3.2207) | −1.1917 (12.914) | −189.53 (4.8277) | −148.59 (12.7616) | 0.1356 (0.1374) | 1.4724 (2.4610) |
| FFA13 | CPU | 5.4 (0.9) | 16.1 (0.2) | 2.4 (0.2) | 0.3 (0.1) | 1.7 (0.1) | 28 (2.5) | 7.7 (1.4) | 122 (30.2) | 1.7 (0.2) | 1.8 (0.2) |
| FFA14 | f(x*) | −0.6313 (0.29) | 0.2399 (0.4) | 0.0960 (0.0829) | 0.0107 (0.0037) | −2.3289 (7.0785) | −0.4888 (11.437) | −198.32 (0.8034) | −193.92 (1.4485) | 0.0046 (0.0047) | 0.0010 (0.0006) |
| FFA14 | CPU | 1.5 (0.3) | 1.5 (0.1) | 0.4 (0.1) | 0.1 (0.0) | 0.3 (0.1) | 2.7 (0.3) | 1.8 (0.4) | 8.5 (2.5) | 0.2 (0.0) | 0.3 (0.1) |

Table 3.

Simulation results.

5. Conclusion

In this chapter, a detailed review of modified versions of the firefly algorithm is presented. The modifications aim to boost its performance for both continuous and non-continuous problems. Three classes of modifications are discussed for continuous problems. The first is parameter-level modification, which improves the performance of the algorithm through the tuning or adaptation of its parameters. The second class works at the level of the updating mechanism, in which new updating equations or mechanisms are introduced. The last class works at a more abstract level, in which a change of the solution space and of the probability distribution of the randomness term are discussed. The strengths and weaknesses of these approaches are also presented. Simulation results show that the mutation-incorporated firefly algorithm gives better results at a larger computational cost, whereas the versions of the firefly algorithm with opposition-based learning and with an elitist movement of the brighter firefly give approximate solutions within a smaller computational time. Hence, with a proper implementation, the mutation operator and the elitist move of the brighter firefly, possibly combined with the opposition-based approach, may perform better.

© 2016 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution 3.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
