Open access peer-reviewed chapter

A Brief Survey on Intelligent Swarm-Based Algorithms for Solving Optimization Problems

Written By

Siew Mooi Lim and Kuan Yew Leong

Submitted: 24 October 2017 Reviewed: 03 April 2018 Published: 18 July 2018

DOI: 10.5772/intechopen.76979

From the Edited Volume

Nature-inspired Methods for Stochastic, Robust and Dynamic Optimization

Edited by Javier Del Ser and Eneko Osaba


Abstract

This chapter presents an overview of optimization techniques, followed by a brief survey of several swarm-based nature-inspired algorithms introduced in the last decade. These techniques were inspired by the natural processes of plants, the foraging behaviors of insects and the social behaviors of animals. These swarm intelligence methods have been tested on various standard benchmark problems and are capable of solving a wide range of optimization issues, including stochastic, robust and dynamic problems.

Keywords

  • optimization
  • artificial intelligence
  • swarm intelligence
  • nature-inspired and bio-inspired computation

1. Introduction

Optimization is a form of mathematical procedure for determining the optimal allocation of scarce resources. In recent years, the optimization area has received enormous attention, primarily due to rapidly emerging science and technology in computing, communication, engineering, the environment and society. Several types of optimization problems exist. Two important classes of objects in most optimization problems are limited resources and activities. Resources include land size, plant capacity and sales force, whereas production activities include, for example, producing stainless steel, low-carbon steel or high-carbon steel; how we solve a problem depends on the circumstances, the aim being to determine the best activity levels achievable with the available resources. All optimization problems have an objective function, constraints and choice variables, and the trade-offs chosen determine the benefit delivered to the application or audience. For instance, a faster algorithm may be traded against higher memory consumption, and vice versa, so as to bring the greatest benefit to the audience [1].

Three categories of optimization techniques, namely stochastic optimization (SO), robust optimization (RO) and dynamic optimization (DO), are presented in the following subsections, with conclusions given on the advantages and practical applications of each technique. The main motivation behind this study of nature-inspired computation is to identify the connections among interaction, social conduct and emergence. Such work is needed in the current scientific community so that computing can be used to model living phenomena, to investigate them and to enhance our lives. This study contributes to drawing the inspiration for computerized solutions from a wide range of natural processes.

1.1. Stochastic optimization

Stochastic optimization (SO) involves randomness in the minimization or maximization of a function and lends itself to real-life phenomena that involve uncertainty and imprecision. The randomness may be present as noise in measurements, as Monte Carlo randomness in the search procedure, or both. Some common SO techniques are direct search methods, stochastic approximation, stochastic programming, simulated annealing and genetic algorithms. These techniques can cope with inherent system noise and with highly nonlinear, high-dimensional models.

In other words, these models are derived, solved analytically or numerically, and analyzed to extract information to be presented to decision makers [2]. SO is important in analyzing, designing and operating modern systems. Specific applications of SO include short- and long-term investment decisions in business, missile and aircraft design in aerospace engineering, new drug design and network traffic control. The challenge in real-life applications is that an accurate probabilistic description of the randomness is hard to estimate; when such information is available, stochastic programming can be applied as a powerful modeling tool. SO has the advantage of solving problems in polynomial time, and theoretically it guarantees the quality of the solutions generated. Practically, SO is limited by its heavy dependency on the availability of historical data and by complex modeling [3, 4].
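To make this concrete, the following is a minimal sketch of simulated annealing, one of the SO techniques listed above, applied to a noisy one-dimensional objective. The objective function, cooling schedule and parameter values are illustrative assumptions, not taken from the surveyed works.

import math
import random

def noisy_objective(x):
    # Hypothetical objective: true minimum near x = 2, observed with Gaussian noise.
    return (x - 2.0) ** 2 + random.gauss(0.0, 0.1)

def simulated_annealing(f, x0, temp=1.0, cooling=0.95, steps=500):
    # Minimal simulated annealing: accept worse moves with probability exp(-delta/temp).
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for _ in range(steps):
        candidate = x + random.gauss(0.0, 0.5)      # random neighbour of the current point
        fc = f(candidate)
        delta = fc - fx
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x, fx = candidate, fc                   # accept the move
            if fx < best_f:
                best_x, best_f = x, fx
        temp *= cooling                             # cool down gradually
    return best_x, best_f

print(simulated_annealing(noisy_objective, x0=10.0))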

1.2. Robust optimization

Robust optimization (RO) is a rather new approach that deals with data uncertainty. RO has two motivating ideas. First, the uncertainty model is deterministic and set-based, which in many applications is the most appropriate notion of parameter uncertainty. Second is computational tractability: for a given optimization problem, multiple robust versions exist depending on the structure of the uncertainty set, so maintaining tractability is important. Classification schemes for RO include local vs. global and probabilistic vs. non-probabilistic. Depending on the nature of the problem, this technique is also known as the min-max or worst-case approach. It provides a solution that is guaranteed to be good for most possible realizations of the uncertainty in the data. It is also useful when some of the parameters come from an estimation process and therefore contain estimation errors.
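To make the worst-case idea concrete, the following is a minimal, hypothetical sketch of a min-max decision: each candidate decision is scored by its worst cost over a small, set-based collection of demand scenarios, and the decision with the best worst case is chosen. The numbers and cost model are invented for illustration.

# Hypothetical example: choose a production level x that minimizes cost
# when demand d is only known to lie in a finite uncertainty set.
demand_scenarios = [80, 100, 120]          # assumed uncertainty set (illustrative)
candidate_decisions = range(60, 141, 10)   # candidate production levels

def cost(x, d):
    holding = 2.0 * max(x - d, 0)          # cost of over-production
    shortage = 5.0 * max(d - x, 0)         # cost of unmet demand
    return holding + shortage

# Robust (min-max) choice: minimize the worst-case cost over all scenarios.
robust_x = min(candidate_decisions,
               key=lambda x: max(cost(x, d) for d in demand_scenarios))
print(robust_x, max(cost(robust_x, d) for d in demand_scenarios))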

One important concept in defining and interpreting robustness and the resulting models is constraint robustness (model robustness) [5]. The application of RO in engineering is known as robust design optimization or reliability-based design optimization, where the solutions remain feasible for all possible values of the uncertain inputs. The RO methodology is applicable to any generic optimization problem in which the numerical data can be separated from the structure of the problem. A challenge of RO is that it gives the same weight to all of the uncertain parameters. The advantages of the RO formulation are cost savings and increased stability, in both a qualitative and a quantitative sense of robustness. In practice, RO does not significantly increase the complexity of the considered optimization problems in most cases [6, 7].

1.3. Dynamic optimization

Dynamic optimization (DO), also known as dynamic programming, is a process of finding the optimal control profile of one or more control parameters of a system. It is used to find the possible solutions for a given problem. There are several approaches to DO, such as those based on the calculus of variations, those dealing with discrete-time optimization and those extending static optimization. Basically, implementing DO involves a system controller, a performance criterion and an algorithm to execute the control. Two key attributes of DO are optimal substructure and overlapping sub-problems [8]. The four major steps in developing a DO algorithm are:

  1. Characterize the structure of an optimal solution.

  2. Recursively define the value of an optimal solution.

  3. Compute the value of an optimal solution in a bottom-up fashion.

  4. Construct an optimal solution from computed information.

The advantage of this paradigm is that it performs the optimization recursively by dividing the problem into a collection of simpler sub-problems, each of which is solved only once using either a top-down or a bottom-up approach. To facilitate lookup, a technique called memoization is applied, in which the solutions of sub-problems are indexed by their input parameter values, thereby saving computation time at the expense of a modest expenditure of storage space, as illustrated in the sketch below. In practice, the concept of DO is universal and flexible and can be applied to a wide range of endeavors [9].
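As a minimal illustration of memoization (the example problem is ours, not from the chapter), the classic top-down computation of Fibonacci numbers caches each sub-problem result by its input value so that every sub-problem is solved only once:

from functools import lru_cache

@lru_cache(maxsize=None)            # memoization: cache results keyed by the argument
def fib(n):
    if n < 2:                       # base cases of the recursion
        return n
    return fib(n - 1) + fib(n - 2)  # overlapping sub-problems, each solved only once

print(fib(50))                      # runs in linear time instead of exponential time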


2. Algorithms

Artificial intelligence (AI) is viewed as a discipline within computer science concerned with developing and examining systems that act intelligently. Bio-inspired computation, metaheuristics and computational intelligence are common families of algorithms drawn from numerous parts of AI. Bio-inspired computation uses computing power to model living phenomena. Computational intelligence, which emphasizes strategy and outcome, can be broadly divided into five dominant fields: swarm intelligence, evolutionary computation, artificial neural networks, artificial immune systems and fuzzy systems. This chapter focuses on a few swarm intelligence-based algorithms that are inspired by natural processes.


3. Swarm intelligence

Swarm intelligence (SI) is regarded as an adaptive strategy in which collective intelligence emerges from the behavior of individuals without a centralized control structure dictating how each individual should behave. The rules of SI are simple, self-organizing and co-evolving, and SI is widely applied in domains such as optimization, search methods, DNA computing research and heating system planning. SI paradigms include bird flocking, cuckoo search, animal herding and fish schooling. The two dominant subfields of SI are ant colony optimization, inspired by the pheromone-trail behavior of ants, and particle swarm optimization, inspired by flocking and swarming behavior [10].
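Because particle swarm optimization (PSO) is used repeatedly as a comparison baseline in the sections below, a minimal sketch of its standard velocity and position update is given here. The benchmark objective and parameter values are common defaults assumed for illustration, not prescribed by the chapter.

import random

def sphere(x):
    return sum(v * v for v in x)    # simple benchmark objective

def pso(f, dim=2, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))     # social pull
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

print(pso(sphere))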

However, providing a complete review of all swarm-based algorithms is hardly possible. The next subsections present the inspiration, working, metaphor and heuristic of eight popularly known swarm-based methods that have been introduced and implemented in the last decade. The main challenges of the field and its future trends are also discussed.

3.1. Bat algorithm

The bat algorithm (BA) [11] is valued for its simplicity and flexibility, and it is found to be very efficient in handling nonlinear and multi-objective problems. Bats have a special high-level capability of bio-sonar (echolocation), which they use to find prey, detect obstacles and roosting crevices, and discriminate between different types of insects.

The efficiency of BA depends on the features below:

  1. Automatic zooming: an automatic switch from explorative moves to locally intensive exploitation.

  2. Frequency tuning: the frequency of the emitted echolocation pulses is varied (see the sketch after this list).
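The sketch below summarizes the frequency-tuning and loudness/pulse-rate updates of BA as described in [11]; the test objective and parameter values are assumptions made for illustration, not taken from the chapter.

import math
import random

def sphere(x):
    return sum(v * v for v in x)

def bat_algorithm(f, dim=2, n=20, iters=200, fmin=0.0, fmax=2.0,
                  alpha=0.9, gamma=0.9):
    x = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    v = [[0.0] * dim for _ in range(n)]
    loud = [1.0] * n                    # loudness A_i
    rate = [0.5] * n                    # pulse emission rate r_i
    best = min(x, key=f)[:]
    for t in range(1, iters + 1):
        for i in range(n):
            freq = fmin + (fmax - fmin) * random.random()      # frequency tuning
            cand = x[i][:]
            for d in range(dim):
                v[i][d] += (x[i][d] - best[d]) * freq          # velocity update toward the best bat
                cand[d] = x[i][d] + v[i][d]
            if random.random() > rate[i]:                      # local random walk around the best solution
                cand = [b + 0.01 * random.gauss(0, 1) * sum(loud) / n for b in best]
            if random.random() < loud[i] and f(cand) < f(x[i]):
                x[i] = cand
                loud[i] *= alpha                               # decrease loudness as the bat closes in
                rate[i] = 0.5 * (1 - math.exp(-gamma * t))     # increase pulse emission rate
            if f(x[i]) < f(best):
                best = x[i][:]
    return best, f(best)

print(bat_algorithm(sphere))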

Microbats are the best-known examples among the bat species, and the echolocation behavior of microbats is used to model BA. The literature reports a diverse range of BA applications, such as optimizing the loading pattern of a nuclear core in engineering, the nonlinear economic dispatch problem, the design of a power system stabilizer, size optimization of skeletal structures consisting of trusses and frames, and multilevel image thresholding in image processing. In the context of inverse problems and parameter estimation, bat algorithms have been used for numerical optimization, for optimizing brushless DC wheel motors, and for enhancing topological shapes in microelectronic applications [12, 13, 14, 15].

There have been some successful applications of BA in SO. In their work on stochastic resonance for MR image enhancement, the authors of [16] proposed a neuron model that uses the multi-objective optimization property of BA to tune the parameters. In their work, BA is used to maximize two image performance indices, the contrast enhancement factor and the mean opinion score. Their results show that the method improves gray-white matter differentiation, which is useful for diagnosing MR images. In another work [17], BA is adapted with two additional operations, (1) an iterative local search and (2) a stochastic inertial weight, to improve its accuracy, speed and convergence stability. It is claimed that BA tends to fall into local optima and produces unstable optimization results because of its low global exploration ability. The authors overcome these weaknesses: their iterative local search perturbs the local optimum and performs additional local search so that BA is better able to escape local optima, and their stochastic inertial weight modifies the velocity updating equation to enhance the diversity and flexibility of the bat population. They validated their results on 10 classic benchmark functions, the CEC 2005 benchmark suite and two real-world problems, and concluded that performance was improved.

Robust tuning of a power system stabilizer has been demonstrated using BA [18]. In this RO application, the stability of the power system is highly critical. The paper proposed BA to optimize the gain and the pole-zero parameters of the stabilizer and showed that the BA approach is superior to a PSO-based method. The optimization was performed with an objective function based on eigenvalue shifting to guarantee the stability of the nonlinear plant over a wide range of operating conditions.

A dynamic perceptive BA [19] has been used to optimize a particle filter for multiple-target tracking. This is an example of DO: the authors proposed a multiple-maneuvering-target tracking algorithm and combined it with BA to optimize the particle filter typically used in a modern radar tracking system. Their combined algorithm regards the particles as bats and simulates the preying behavior of bats by dynamically adjusting the frequency, loudness and pulse-rate components of the tracking system. This dynamic control of the particle filter, together with joint probabilistic data association, improves target-tracking accuracy even in complex environments.

In other relevant applications, BA has been reported in data mining techniques for classification and clustering. BA has been applied to clustering microarray data and to minimizing the makespan and mean flow time in hybrid flow shop scheduling problems [20]. In image processing, BA has been used for full-body human pose estimation, where it outperformed particle swarm optimization, the particle filter and the annealed particle filter. Bat-based models have also shown their effectiveness in image matching compared with evolutionary and genetic algorithms [15]. In fuzzy logic and other applications, BA has been applied to finding the optimal capacitor placement for loss reduction in distribution systems, and BA combined with fuzzy systems has been used for energy modeling and the study of energy changes in a gas turbine [15].

3.2. Firefly algorithm

There are over 2000 known firefly species. The short, rhythmic flashing light of fireflies is an astounding sight in the skies of tropical and temperate regions. This natural capability of fireflies inspired the firefly algorithm (FA) [21]. Bioluminescence is the process that generates the flashing light, which fireflies use, among other purposes, as warning signals. In FA, the objective function of an optimization problem is encoded as light intensity. There are some similarities between FA and the bacterial foraging algorithm: their attraction mechanisms are based on the objective function and fitness, and on distance, respectively. FA can solve both discrete and continuous optimization problems. Kwiecien and Filipowicz [22] applied FA to cost optimization of queueing systems. A study carried out by Gandomi et al. [23] showed that FA is better than other nonlinear optimization techniques in designing a stepped cantilever beam. FA has also been reported in the recent literature as an efficient computational procedure for simultaneously generating multiple different alternatives to an optimal solution [24].
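The attraction-based movement rule at the heart of FA [21] can be sketched compactly as follows; the benchmark objective and the parameter values (base attractiveness, light absorption coefficient, randomization strength) are assumed for illustration.

import math
import random

def sphere(x):
    return sum(v * v for v in x)

def firefly(f, dim=2, n=20, iters=200, beta0=1.0, gamma=1.0, alpha=0.2):
    x = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    light = [f(p) for p in x]                 # lower objective value = brighter firefly
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:       # firefly i is attracted toward brighter firefly j
                    r2 = sum((x[i][d] - x[j][d]) ** 2 for d in range(dim))
                    beta = beta0 * math.exp(-gamma * r2)   # attractiveness decays with distance
                    for d in range(dim):
                        x[i][d] += beta * (x[j][d] - x[i][d]) \
                                   + alpha * (random.random() - 0.5)
                    light[i] = f(x[i])
    best = min(range(n), key=lambda i: light[i])
    return x[best], light[best]

print(firefly(sphere))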

FA has found widespread application since its introduction, attributed mainly to its simplicity of implementation compared with some traditional approaches. In content-based image retrieval, feature extraction has been performed using Euclidean distance estimation between pixels; however, such an approach needs more precision, and this motivated [25] to employ FA to optimize the image features. Their work is closely related to SO, as the potential image features are found stochastically by FA. They benchmarked their FA image optimization results against PSO and GA and examined the differences between the models in terms of precision and image recall. [26] also applied FA to an SO problem in linear-phase finite impulse response (FIR) filter design. Differential evolution (DE) is known as one of the best performers for such problems; however, the authors showed through simulations of FIR filter design that FA is better than other relevant algorithms (including PSO and GA). The improvement was recorded not only in the convergence speed but also in the performance of the designed filter.

Other variants of FA have also found applications in several disciplines. [27] proposed a hybrid PSO-FA to solve a combinatorial optimization problem in floor planning. [28] introduced a hybrid FA and DE method to estimate the parameters of a nonlinear biological model. In the optimization design of sewer pipes, [29] proposed a novel method combining support vector regression and FA to predict the minimum velocity required to avoid sediment settling in pipe channels.

3.3. Lion optimization algorithm

Lion optimization (LO) [30] is a population-based algorithm inspired by the lion's social system and collaboration characteristics, described by the term 'pride'. The uniqueness of the lion's social behavior makes it one of the strongest mammals in the world. LO is modeled on two distinctive lion behaviors: territorial defense and territorial takeover. Based on these two behaviors, the solutions of LO are generated in three steps: (1) each cub solution is classified as an original or a derived solution; (2) territorial defense evaluates and compares the existing and new solutions; (3) if the existing solution is better than the new solution, territorial takeover keeps the existing solution and improves it further. LO can explore a huge search space and solve continuous, single-variable and multi-variable optimization problems. The LO algorithm has been validated using De Jong's Type 1 function, and its performance has been compared against evolutionary programming; the results showed that LO performed better than evolutionary programming.

LO is still at an early stage of application. Both [31, 32] experimented with its optimization capability and benchmarked it on function optimization problems, and both studies concluded that LO performs well on function optimization. [33] took LO further to perform data clustering by utilizing its optimization capability; such a data clustering approach is very much an SO problem. In their work, LO is modified with fractional theory to search for the cluster centroids of the data instead of using a typical distance measurement.

3.4. Chicken swarm optimization algorithm

Chickens are sociable birds that live in groups. The chicken swarm optimization (CSO) algorithm [34] was inspired by the social behavior of chickens. Every chicken follows its own motion laws, and the fitness values of the chickens identify them as roosters, hens or chicks, which in turn places them into groups: the fittest, the intermediate and the weakest chickens are modeled as roosters, hens and chicks, respectively. The hierarchical order plays a vital part in the chicken group, and this characteristic is used to model the CSO algorithm. CSO has been applied to the efficient design of a speed reducer; in that research, a gearbox was designed for the most efficient speed. Research on CSO has been promising, and it has been used to improve the performance of the greedy algorithm [35].

A deadlock-free migration scheme for virtual machine consolidation has been optimized using CSO [36]. In such service consolidation, the two separate but related issues of virtual machine placement and migration form a challenging optimization task. The authors proposed a consolidation scheme utilizing CSO that turns the virtual machine consolidation problem into a vector packing optimization based on deadlock-free migration. The optimization also helps to minimize energy consumption. The proposed method achieved a higher convergence rate than several other deadlock-free migration algorithms.

CSO has also been applied to a job-shop scheduling problem in the work of [37]. An improved version of CSO has been applied to maximum power point tracking control of a photovoltaic system [38], and CSO has also found application in the disaggregation of non-invasive domestic appliance measurements [39].

3.5. Social spider algorithm

Social spiders are organisms that live in groups, yet they are solitary and display aggressive behavior toward members of their own species. Their foraging behaviors and their cooperation in performing daily tasks are used to model the social spider algorithm (SSA) [40]. In SSA, two different evolutionary operators are created based on the gender of male and female spiders, dividing their tasks of predation, web design and mating. This algorithm can solve a wide range of continuous optimization problems, including minimization of the molecular potential energy function [41, 42]. SSA has been validated on standard benchmark problems to study its performance; an analysis comparing SSA against particle swarm optimization and the artificial bee colony algorithm showed that SSA outperformed the other techniques.

SSA has seen wide application in recent research. An SSA has been proposed to solve a non-convex economic load dispatch problem [43]. Economic load dispatch (ELD) is one of the essential components of power system control and operation. Most modern power systems introduce new models of the power units that are non-convex, non-differentiable and sometimes non-continuous; such an RO problem is therefore difficult to solve by conventional mathematical techniques. In this paper, the authors modified SSA to suit the characteristics of ELD, and their simulation results show that such ELD problems can be solved by SSA effectively and efficiently.

Another example of employing SSA for a DO problem is proposed in [42] to solve transmission expansion planning in electrical power systems. The authors tested SSA on the transmission expansion planning problem of three benchmark systems with 6, 46 and 87 buses, achieving strong performance as well as a reduction in the total investment cost.

In a multi-objective optimization problem of QoS-aware web services, [44] applied SSA to perform optimized selection among numerous functionalities in web services involving the delivery of complex tasks. They studied current GA and PSO approaches and found that the time performance of such approaches is still a great concern. The proposed SSA outperformed PSO in terms of both execution time and fitness.

3.6. Spider monkey optimization algorithm

Spider monkey optimization (SM) [45] is a population-based optimization algorithm inspired by the intelligent ways in which spider monkeys search for the most suitable food sources. The quality of a food source corresponds to the fitness of a solution. The major characteristics and strategies of the SM algorithm are similar to those of the artificial bee colony algorithm. Its exploration and exploitation functions allow the SM algorithm to cover a huge search space and generate good feasible solutions. The SM algorithm is simple and fast and is used to solve numerical optimization problems.

An example of an SM application in continuous numerical optimization can be found in the work of [46]. The authors modified SM to enable it to solve constrained optimization problems. Their proposed SM was tested on the well-defined constrained optimization problems of the CEC 2006 and CEC 2010 benchmark sets, and the algorithm achieved promising results when compared to PSO, artificial bee colony and DE methods.

Although SM may be best suited to numerical problems, several researchers have applied it to a wider scope of optimization problems. Both [47, 48], in separate studies, proposed SM for antenna optimization. [47] focuses on the thinning of concentric circular antenna arrays. Since such a thinning problem is a binary optimization, the original SM might not be suitable; hence, the authors proposed a binary SM that handles the logical operators of the thinning problem. Their results proved the competence and superiority of binary SM compared with existing metaheuristic algorithms. On the other hand, [48] applied SM to synthesize the array factor of a linear antenna array and to optimally design an E-shaped patch antenna. They found that their SM, compared with the traditional method for such optimization problems, can reach optimum solutions with a smaller number of iterations.

3.7. African buffalo optimization algorithm

The African buffalo optimization (ABO) algorithm [49] was inspired by the practice of African buffalos searching for pastures across the vast African forests and savannahs. Three specific characteristics of these animals are used to model the ABO algorithm. First, they have a high memory capacity, which enables them to track their routes over thousands of kilometers across the African continent. Second, they communicate among themselves using two specific vocalizations, 'maa' and 'waa', to support each other in surviving. Lastly, African buffalos practice a 'democratic' system in making decisions.

The information from every buffalo's past and current location is used to tackle the issue of premature convergence. The leading buffalo's search space and the experience of all other buffalos complement the exploration and exploitation strategies in ABO. The algorithm relies on only a few learning parameters; it is therefore simple and easy to implement and offers quick convergence. The efficiency and powerful features of this algorithm make it capable of solving knapsack problems. ABO has been validated on traveling salesman benchmark problems to study its cost effectiveness. A comparative study of CPU time for ABO against the genetic algorithm, honey bee mating optimization, ant colony optimization, simulated annealing and adaptive simulated annealing with greedy search showed that ABO outperformed all the other techniques, obtaining solutions at an incredibly fast rate.

[50] used ABO to solve the well-studied traveling salesman problem (TSP). They ran ABO on 33 benchmark symmetric TSP instances and observed excellent exploration and exploitation of the search space through regular communication, cooperation and good memory of previous individual and collective exploits. They concluded that ABO is as competitive as other algorithms that excel at the TSP.

In another DO example, ABO has been used to optimize the parameter tuning of a proportional-integral-derivative (PID) controller [51]. The PID controller in their study automatically regulates voltage. Existing metaheuristic tuning methods have proven quite successful, but the authors wanted to improve the gain overshoot and steady-state errors of the system. They obtained encouraging results with ABO when it was compared to several other optimization algorithms.

3.8. Flower pollination algorithm

The goal of plants, like that of any other living organism, is to produce offspring for the next generation. Pollination is the process of transferring pollen grains from the male anther of a flower to the female stigma. The two types of pollination are self-pollination and cross-pollination. When pollen grains are produced, pollinators spread them among the flowers in either local or global pollination flows. This process of passing pollen grains from the stamens to the ovule-bearing organs is used to model the flower pollination algorithm (FPA) [52].

The current version of FPA assumes that each flower produces only one pollen gamete. The time complexity of FPA is low, so it is flexible and easy to implement. A study carried out in structural engineering evaluated the cost optimization of a tubular column under compressive load using FPA against the cuckoo search algorithm, fuzzy rules and other engineering optimization techniques; the results showed that FPA was the most efficient method and that its convergence was very effective. The algorithm has been applied to solving continuous, single-objective and multi-objective optimization problems, and it can be extended further to areas such as image compression and graph coloring. FPA has also converged successfully in a shape matching problem based on a relatively new approach called the atomic potential matching model [53].
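A minimal sketch of the two pollination rules of FPA [52] is given below; the Lévy-flight step (drawn here with Mantegna's method), the switch probability and the test objective are assumed values for illustration rather than details taken from the chapter.

import math
import random

def sphere(x):
    return sum(v * v for v in x)

def levy_step(lam=1.5):
    # Mantegna's method for drawing a Levy-distributed step length.
    sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2) /
             (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / lam)

def flower_pollination(f, dim=2, n=20, iters=200, p=0.8):
    x = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    best = min(x, key=f)[:]
    for _ in range(iters):
        for i in range(n):
            if random.random() < p:            # global pollination: Levy flight toward the best flower
                cand = [x[i][d] + levy_step() * (best[d] - x[i][d]) for d in range(dim)]
            else:                              # local pollination between two randomly chosen flowers
                j, k = random.sample(range(n), 2)
                eps = random.random()
                cand = [x[i][d] + eps * (x[j][d] - x[k][d]) for d in range(dim)]
            if f(cand) < f(x[i]):
                x[i] = cand
                if f(cand) < f(best):
                    best = cand[:]
    return best, f(best)

print(flower_pollination(sphere))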

FPA has been applied to a visual tracking problem in the work of [54]. In their model, visual tracking is considered to be a process of optimal reproduction of flowering plants; this is a typical example of DO. FPA is presented with a switch probability that changes dynamically with the generation number. To compare the tracking ability of the FPA tracker, the tracking accuracies of the particle filter, mean shift and PSO were studied as well; the comparative results show that their method outperforms the other three trackers. Other applications of FPA can be found in the works of [55, 56, 57]. [55] proposed an FPA to optimize an SO problem of static economic dispatch incorporating a wind farm. [56] worked on the sizing optimization of truss structures, another example of SO. [57] applied FPA to a DO problem of photovoltaic parameter extraction; through their simulations, FPA is recommended as the fastest and most accurate technique for the optimal parameter extraction process, after benchmarking against several other methods.


4. Conclusion

All creatures survive thanks to their own unique behaviors and features, characteristics that lie concealed in the natural world. This chapter has reviewed eight nature-inspired algorithms, mainly from the field of swarm intelligence. These techniques are becoming powerful tools in numerical optimization and have shown remarkable robustness, high accuracy and an immense capacity for solving different types of optimization problems.

In dealing with real-world optimization problems, we are now presented with ever more choices of algorithm, thanks to inspiration from nature as well as persistent research contributions by many researchers. However, with more choices it can be even harder to decide which algorithm is the better candidate for the problem at hand. According to the No-Free-Lunch theorem, there is no single best algorithm for every problem. The complexity, characteristics and diversity of optimization problems mean that it is very unlikely that a single method can handle all types of optimization problems. This is very much the current state of research in optimization, despite the abundance of nature-inspired algorithms being added to the solution pool.

Moving ahead, the research community is certainly not looking for a single most powerful algorithm to solve all types of problems. Instead, we may witness even more successful variants being developed from their current counterparts, a trend observed after the introduction of every new algorithm. It is also possible that more and more new optimization algorithms will be inspired, given that vast unknown and untested phenomena in nature still lie beyond our exploration.

References

  1. Crowley M. Using equilibrium policy gradients for spatiotemporal planning in forest ecosystem management. IEEE Transactions on Computers. 2014;63(1):142-154
  2. Weisstein EW. Stochastic optimization. From MathWorld—A Wolfram Web Resource
  3. Mesbah A. Stochastic model predictive control: An overview and perspectives for future research. IEEE Control Systems. 2016;36(6):30-44
  4. Schmidhuber J. Deep learning in neural networks: An overview. Neural Networks. 2015;61:85-117
  5. Dunning I, Huchette J, Lubin M. JuMP: A modelling language for mathematical optimization. SIAM Review. 2017;59(2):295-320
  6. Ahmadi-Javid A, Jalali Z, Klassen KJ. Outpatient appointment systems in healthcare: A review of optimization studies. European Journal of Operational Research. 2017;258(1):3-34
  7. Bertsimas D, Gupta V, Kallus N. Data-driven robust optimization. Mathematical Programming. 2018;167(2):235-292
  8. Shin YC, Xu CY. Intelligent Systems: Modelling, Optimization and Control. CRC Press, Taylor & Francis Group; 2017
  9. Priya Esther B, Sathish Kumar K. A survey on residential demand side management architecture, approaches, optimization models and methods. Renewable and Sustainable Energy Reviews. 2016;59:342-351
  10. Brownlee J. Clever Algorithms: Nature-Inspired Programming Recipes. 2012. ISBN: 978-1-4467-8506-5
  11. Yang X-S. A new metaheuristic bat-inspired algorithm. In: Gonzalez JR et al., editors. Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), SCI. Vol. 284. 2010. pp. 65-74
  12. Tsai PW et al. Bat algorithm inspired algorithm for solving numerical optimization problems. Applied Mechanics and Materials. 2011;148-149:134-137
  13. Titus J. Careful Designers Get the Most from Brushless DC Motors. Tillgänglig: ECN Electronic Communication Network; 2012
  14. Yang X-S. Bat algorithm: Literature review and applications. International Journal of Bio-Inspired Computing. 2013;5(3):141-149
  15. Marichelvam MK, Prabaharam T. A bat algorithm for realistic hybrid flow shop scheduling problems to minimize makespan and mean flow time. ICTACT Journal on Soft Computing. 2012;3(1):428-433
  16. Singh M, Verma A, Neeraj S. Bat optimization-based neuron model of stochastic resonance for the enhancement of MR images. Biocybernetics and Biomedical Engineering. 2017;37(1):24-134
  17. Gan C, Cao WH, Wu M, Chen X. A new bat algorithm based on iterative local search and stochastic inertia weight. Expert Systems with Applications. 2018;104:202-212
  18. Sambariya DK, Prasad R. Robust tuning of power system stabilizer for small signal stability enhancement using metaheuristic bat algorithm. International Journal of Electrical Power and Energy Systems. 2014;61:229-238
  19. Chen ZM, Bo YM, Tian MC, Wu PL, Ling XD. Dynamic perceptive bat algorithm used to optimize particle filter for tracking multiple targets. Journal of Aerospace Engineering. 2018;31(3)
  20. Yang X-S. A new metaheuristic bat-inspired algorithm. In: Gonzalez JR et al., editors. Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), SCI. Vol. 284. 2010. pp. 65-74
  21. Bacanin N, Tuba M. Firefly algorithm for cardinality constrained mean-variance portfolio optimization problem with entropy diversity constraint. The Scientific World Journal. 2014. Article ID 721521, http://dx.doi.org/10.1155/2014/721521
  22. Kwiecien J, Filipowicz B. Firefly algorithm in optimization of queueing systems. Bulletin of the Polish Academy of Sciences: Technical Sciences. 2012;60(2):363-368
  23. Gandomi AH, Yang X-S, Alavi AH. Mixed variable structural optimization using firefly algorithm. Computers & Structures. 2011;89(23–24):2325-2336
  24. Yeomans JS. An efficient computational procedure for simultaneously generating alternatives to an optimal solution using the firefly algorithm. In: Yang XS, editor. Nature-Inspired Algorithms and Applied Optimization. Studies in Computational Intelligence, Vol. 744. Springer; 2018
  25. Kanimozhi T, Latha K. Stochastic firefly for image optimization. International Conference on Communication and Signal Processing. 2013. pp. 592-596
  26. Suman S, Kar R, Mandal D, Ghoshal S. A novel firefly algorithm for optimal linear phase FIR filter design. International Journal of Swarm Intelligence Research (IJSIR). 2013;4(2):29-48
  27. Sivaranjani P, Senthil Kumar A. Hybrid particle swarm optimization-firefly algorithm (HPSOFF) for combinatorial optimization of non-slicing VLSI floor planning. Journal of Intelligent and Fuzzy Systems. 2017;32(1):661-669
  28. Afnizanfaizal A, Safaai D, Sohail A, Arjunan SNV. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters. PLoS ONE. 2013;8(3):56310
  29. Ebtehaj I, Bonakdari H. A support vector regression-firefly algorithm-based model for limiting velocity prediction in sewer pipes. Water Science and Technology: A Journal of the International Association on Water Pollution Research. 2016;73(9):2244-2250
  30. Rajakumar BR. The lion's algorithm: A new nature-inspired search algorithm. Procedia Technology. 2012;6:126-135
  31. Yazdani M, Fariborz J. Lion optimization algorithm (LOA): A nature-inspired metaheuristic algorithm. Journal of Computational Design and Engineering. 2016;3(1):24-36
  32. Wang B, Jin XP, Cheng B. Lion pride optimizer: An optimization algorithm inspired by lion pride behaviour. Science China Information Sciences. 2012;55(10):2369-2389
  33. Chander S, Vijaya P, Dhyani P. Fractional lion algorithm-an optimization algorithm for data clustering. Journal of Computer Science. 2016;2(7):323-340
  34. Meng X, Liu Y, Gao XZ, Zhang H. A new bio-inspired algorithm: Chicken swarm optimization. In: Advances in Swarm Intelligence. ICSI, Lecture Notes in Computer Science. Vol. 8794. Springer; 2014. pp. 86-94
  35. Mohamed TM. Enhancing the performance of the greedy algorithm using chicken swarm optimization: An application to exam scheduling problem. Egyptian Computer Science Journal. 2018;42(1):1-17
  36. Tiana F, Zhanga R, Lewandowskic J, Chaoc KM, Li LZ, Dong B. Deadlock-free migration for virtual machine consolidation using chicken swarm optimization algorithm. Journal of Intelligent & Fuzzy Systems. 2017;32(2):1389-1400
  37. Xu S, Wu D, Kong F, Ji Z. Solving flexible job-shop scheduling problem by improved chicken swarm optimization algorithm. Journal of System Simulation. 2017;29(7):1497-1505
  38. Benoit C, Sebastian S, Wu ZQ, Yu DQ, Kang XH. Application of improved chicken swarm optimization for MPPT in photovoltaic system. Optimal Control Applications and Methods. 2018;39(2), 1029(14)
  39. Xu Y, Li W, Li D, You X. Disaggregation for non-invasive domestic appliances based on the improved chicken swarm optimization algorithm. Power System Protection and Control. 2016;44(13):27-32
  40. Yu JJQ, Li VOK. A social spider algorithm for global optimization. Applied Soft Computing. 2015;30:614-627
  41. Tawhid MA, Ali AF. A hybrid social spider optimization and genetic algorithm for minimizing molecular potential energy function. Soft Computing. 2017;21(21):6499-6514
  42. El-bages MS, Elsayed WT. Social spider algorithm for solving the transmission expansion planning problem. Electric Power Systems Research. 2017;143:235-243
  43. Yu JJQ, Li VOK. A social spider algorithm for solving the non-convex economic load dispatch problem. Neurocomputing. 2016;171:955-965
  44. Mousa A, Bentahar J. An efficient QoS-aware web services selection using social spider algorithm. Procedia Computer Science. 2016;94:176-182
  45. Bansal JC, Sharma H, Jadon SS, Clerc M. Spider monkey optimization algorithm for numerical optimization. Memetic Computing. 2014;6(1):31-47
  46. Gupta K, Deep K, Bansal J. Spider monkey optimization algorithm for constrained optimization problems. Soft Computing. 2017;21(23):6933-6962
  47. Singh U, Salgotra R, Rattan M. A novel binary spider monkey optimization algorithm for thinning of concentric circular antenna arrays. IETE Journal of Research. 2016;62(6):736-744
  48. Al-Azza A, Al-Jodah A, Ammar F, Harackiewicz J. Spider monkey optimization: A novel technique for antenna optimization. IEEE Antennas and Wireless Propagation Letters. 2016;15:1016-1019
  49. Odili JB, Kahar MNM, Anwar S. African buffalo optimization: A swarm-intelligence technique. Procedia Computer Science. 2015;76:443-448
  50. Odili J, Beneoluchi M, Kahar MN. Solving the traveling Salesman's problem using the African buffalo optimization. Computational Intelligence and Neuroscience. 2016;2016(3)
  51. Odili J, Noraziah. Parameters-tuning of PID controller for automatic voltage regulators using the African buffalo optimization. PLoS One. 2017;12(4):175901
  52. Yang X-S. Flower pollination algorithm for global optimization. In: Unconventional Computation and Natural Computation, Lecture Notes in Computer Science. Vol. 7445. 2012. pp. 240-249
  53. Zhou Y, Zhang S, Luo Q, et al. Neural Computing and Applications. 2018;29:21. DOI: 10.1007/s00521-016-2524-0
  54. Gao ML, Jin S, Jun J. Visual tracking using improved flower pollination algorithm. Optik - International Journal for Light and Electron Optics. 2018;156:522-529
  55. Velamuri, Suresh S, Sreejith P, Ponnambalam. Static economic dispatch incorporating wind farm using flower pollination algorithm. Perspectives in Science. 2016;8:260-262
  56. Gebrail B, Melih NS, Yang XS. Sizing optimization of truss structures using flower pollination algorithm. Applied Soft Computing. 2015;37:322-331
  57. Alam DF, Yousri DA, Eteiba MB. Flower pollination algorithm based solar PV parameter estimation. Energy Conversion and Management. 2015;101:410-422
