
Introductory Chapter: Swarm Intelligence and Particle Swarm Optimization

Written By

Pakize Erdogmus

Published: May 30th, 2018

DOI: 10.5772/intechopen.74076


1. Introduction

To survive, all creatures must forage, and foraging within a species is cooperative. Each agent in a swarm communicates with the others so that the swarm finds food by the shortest path in the shortest time. This capability of living beings has inspired humans to devise solutions to optimization problems. The collective foraging behavior of living beings is called swarm intelligence.

Most animals live in social groups in order to find food easily and to protect themselves from enemies. Each individual lives in its own habitat. When looking for food, individuals use their own experience, called cognitive movement, as well as the experience of their leaders, called social movement.

Optimization is finding the best solution to a given problem under a set of constraints. All disciplines use optimization to find the best solutions to their problems, and it is a first and foremost objective for engineers too. Especially in future engineering applications, optimization will be an indispensable part of the product.

Optimization is everywhere: in the production of a new device, in a new artificial intelligence technique, in a big data application, or in a deep learning network, it is the most important part of the application. Optimization is required to design a device of optimal size that uses minimum energy, or to train a network so that the error between the desired and actual outputs is minimized.

Because of the difficulties of classical optimization algorithms, scientists started looking for easier ways to solve their problems in the late 1960s. The development of computers eased these efforts, and entirely new solution techniques were studied. These techniques, which use heuristic information, are derivative-free, easy to implement, and shorten solution times. The first product of these studies was the genetic algorithm (GA), developed by Holland [1], which applies the evolutionary idea to the solution of optimization problems. Instead of evolving a single solution, the algorithm evolves a group of solutions called a population, and each solution is called an individual. This also makes it possible to run such algorithms on multiple processors. After GA, simulated annealing [2] is generally accepted as the second such algorithm; it is inspired by the annealing process of physical materials. At high temperatures, particles move randomly and explore the solution space; as the temperature decreases, they try to form a perfect crystalline structure using only local movements.


2. Particle swarm optimization

Particle swarm optimization (PSO) is accepted as the second population-based algorithm inspired by animals. James Kennedy (a social psychologist) and Russell C. Eberhart simulated the foraging behavior of flocking birds and schooling fish, applied this simulation to the solution of an optimization problem, and published the idea at a conference in 1995 [3] for the optimization of continuous nonlinear functions. The algorithm has two main concepts for each particle: velocity and position. Each particle has a position and an initial velocity in the solution space, and as the algorithm progresses, the particles converge toward the coordinates of the best solution. PSO is quite simple to implement, requires little memory, and has no operators; due to this simplicity, it is also a fast algorithm. Different versions of PSO, some using additional operators, have been developed since the first version was published.

In the first version of PSO, velocity was calculated with a basic formula combining the current velocity, the personal best, and the global best, each multiplied by a stochastic variable. A particle thus updates its previous velocity using not only its own previous best position but also the global best, with the total influence distributed between the two by the stochastic variables.
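The first-version update described above can be sketched as a single Python function (an illustrative sketch, not code from [3]; c1 = c2 = 2 are the classic acceleration constants):

```python
import random

def velocity_update(v, x, pbest, gbest, c1=2.0, c2=2.0, rng=None):
    """One velocity update of the first PSO version (sketch):

        v_d <- v_d + c1*r1*(pbest_d - x_d) + c2*r2*(gbest_d - x_d)

    r1, r2 are fresh uniform random numbers per dimension; with
    c1 = c2 = 2, each stochastic term has mean weight 1.
    """
    rng = rng or random.Random()
    return [vd
            + c1 * rng.random() * (pb - xd)
            + c2 * rng.random() * (gb - xd)
            for vd, xd, pb, gb in zip(v, x, pbest, gbest)]
```

Note that when a particle already sits at both its personal best and the global best, both attraction terms vanish and the velocity is unchanged.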

In later versions, an inertia weight was introduced by Shi and Eberhart in 1998 [4] to control the velocity. The inertia weight balances the local and global search abilities of the algorithm by specifying how much of the previous velocity contributes to the current velocity. Researchers have made various contributions to the inertia-weight concept: linearly, exponentially, or randomly decreasing as well as adaptive inertia weights have been introduced [5]. In a subsequent version of PSO, a new parameter called the constriction factor (K) was introduced by Clerc and Kennedy [6, 7] in studies on the stability and convergence of PSO; Clerc showed that the use of a constriction factor ensures convergence of the algorithm. A comparison between the inertia weight and the constriction factor was published by Shi and Eberhart [8].
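A minimal inertia-weight PSO on the sphere benchmark, together with the Clerc-Kennedy constriction factor, might look like the following (an illustrative sketch with common default parameter values, not code from the cited papers):

```python
import math
import random

def sphere(x):
    """Benchmark objective f(x) = sum(x_i^2); global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def pso(f, dim=2, n_particles=20, iters=200,
        w_max=0.9, w_min=0.4, c1=1.5, c2=1.5, seed=0):
    """PSO with a linearly decreasing inertia weight (illustrative sketch)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters    # linearly decreasing inertia
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

def constriction_factor(c1=2.05, c2=2.05):
    """Clerc-Kennedy constriction factor K; requires phi = c1 + c2 > 4.
    For the usual c1 = c2 = 2.05, K is approximately 0.7298."""
    phi = c1 + c2
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))
```

In the constriction variant, the whole velocity update (previous velocity plus both attraction terms) is multiplied by K instead of scaling only the previous velocity by w.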

Problems from nearly every engineering discipline and science have been solved with PSO. Among the most studied are problems from electrical engineering, computer science, industrial engineering, biomedical engineering, mechanical engineering, and robotics. In electrical engineering, the power distribution problem [9] and the much-studied economic dispatch problem [10, 11] have been solved with PSO. In computer science, face localization [12], edge detection [13], image segmentation [14], image denoising [15], and image filtering [16] problems have been solved with PSO. In industrial engineering, examination timetabling [17], the traveling salesman problem [18], and job-shop scheduling [19] have been solved with PSO. In robotics, PSO has been applied to a coconut-tree-plucking robot [20] and to path planning [21]. Most of these studies demonstrate the success of PSO in terms of both performance and speed.

After the main PSO algorithm had been studied and extended with additional parameters, researchers designed and developed hybrid algorithms. The most prominent are hybrid PSO with the genetic algorithm (GA) [22], particle swarm optimization with chaos [23, 24, 25], and quantum chaotic PSO [26]. In the next section, some recent hybrid PSO algorithms are presented.


3. Hybrid PSO algorithms using swarm intelligence

Hybrid algorithms are quite successful because they combine the strengths of both constituent algorithms, and since PSO is a very fast algorithm, nearly every newly developed algorithm has been combined with it. Some recent swarm-intelligence algorithms are the crow search algorithm (CSA) [27], the ant lion optimizer (ALO) [28], the whale optimization algorithm (WOA) [29], the grey wolf optimizer (GWO) [30], monarch butterfly optimization (MBO) [31], moth flame optimization [32], selfish herd optimization (SHO) [33], and salp swarm optimization (SSO) [34]. Because these algorithms are quite new, according to the Web of Science records there is as yet no hybrid study combining them with PSO.

The firefly algorithm (FFA) [35], the bacterial foraging optimization algorithm (BFOA) [36], ant colony optimization (ACO) [37], the artificial bee colony (ABC) algorithm [38], and cuckoo search (CS) [39] are swarm-intelligence algorithms developed in the last decade, and hybrid versions of each of them with PSO exist.

3.1. Firefly algorithm

FFA was developed by mimicking the flashing behavior of fireflies, and it resembles most swarm-intelligence algorithms. The fireflies are initially placed at random positions in the solution space, and their fitness is evaluated according to light intensity. The next position of each firefly is calculated from its current position, a random component, and attractiveness. A hybrid algorithm combining PSO with the firefly algorithm [40] proposes a technique for the detection of bundle branch block (BBB), an abnormal cardiac beat, using the hybrid firefly and particle swarm optimization (FFPSO) technique in combination with a Levenberg-Marquardt neural network (LMNN) classifier.
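The firefly position update described above can be sketched as follows (an illustrative sketch; beta0, gamma, and alpha are typical parameter names, not taken from [40]):

```python
import math
import random

def firefly_move(x_i, x_j, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """Move firefly i toward a brighter firefly j (sketch).

    Attractiveness decays with squared distance:
        beta = beta0 * exp(-gamma * r^2)
    alpha scales a small random-walk component.
    """
    rng = rng or random.Random(0)
    r2 = sum((a - b) ** 2 for a, b in zip(x_i, x_j))
    beta = beta0 * math.exp(-gamma * r2)
    return [a + beta * (b - a) + alpha * (rng.random() - 0.5)
            for a, b in zip(x_i, x_j)]
```

With alpha set to zero the move is deterministic: the firefly steps a fraction beta of the way toward the brighter one, and that fraction shrinks as the distance grows.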

3.2. Bacterial foraging optimization algorithm

BFOA is inspired by the social foraging behavior of Escherichia coli, and it is an efficient algorithm for solving real-world optimization problems. The chemotaxis process simulates the movement of an E. coli cell, which swims and tumbles via its flagella. Like particles, E. coli cells move through the solution space and update their location with a formula that depends on the previous location, a random component, and the chemotactic step size. A key difference from PSO is that not only the global best but also the global worst is evaluated in the algorithm: the least healthy cells eventually die, while each of the healthier bacteria splits into two bacteria placed at the same location, keeping the swarm size constant. A hybrid algorithm using PSO and BFOA optimized a PI controller for multiobjective load frequency control [41]. The authors who developed the hybrid of PSO with the firefly algorithm also developed a hybrid of PSO with BFOA for the detection of BBB, using the same classifier [42].
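The tumble-and-swim chemotaxis step can be sketched as follows (an illustrative sketch of a single bacterium's move, not code from [36] or [41]; the step size and swim limit are hypothetical defaults):

```python
import math
import random

def sphere(x):
    """Benchmark objective f(x) = sum(x_i^2)."""
    return sum(xi * xi for xi in x)

def chemotaxis_step(x, f, step_size=0.1, n_swim=4, rng=None):
    """One tumble-and-swim chemotactic step for a single bacterium (sketch).

    Tumble: pick a random unit direction.  Swim: keep stepping in that
    direction while the objective keeps improving, up to n_swim steps.
    """
    rng = rng or random.Random(0)
    d = [rng.uniform(-1.0, 1.0) for _ in x]            # tumble direction
    norm = math.sqrt(sum(di * di for di in d)) or 1.0
    d = [di / norm for di in d]                        # normalize to unit length
    best, best_val = x[:], f(x)
    for _ in range(n_swim):
        cand = [xi + step_size * di for xi, di in zip(best, d)]
        val = f(cand)
        if val < best_val:                             # swim while improving
            best, best_val = cand, val
        else:
            break
    return best, best_val
```

Because the bacterium only keeps a step when it improves the objective, a chemotaxis step never makes its fitness worse.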

3.3. Ant colony optimization

ACO is inspired by the pheromone trails of ants; the first version, developed by Dorigo, is called the ant system [37]. Although most ant species are blind, they can find the shortest path from their nest to a food source using swarm intelligence. Ants are placed at random positions in the solution space and move according to the pheromone trail and a random component. In the first iterations the ants move stochastically, but pheromone gradually accumulates on the most used path, and since ants prefer paths with more pheromone, the trail effectively says "follow me." To avoid getting trapped in local convergence, pheromone evaporates over time. ACO is generally used for combinatorial optimization problems such as the traveling salesman problem (TSP) and some network problems. A hybrid of PSO with ACO [44] solves a routing problem.
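The two core ACO mechanics, probabilistic edge choice and pheromone evaporation, can be sketched as follows (an illustrative sketch; the matrix names tau and eta are conventional, not taken from the cited studies):

```python
import random

def choose_next_city(current, unvisited, tau, eta, alpha=1.0, beta=2.0,
                     rng=None):
    """Roulette-wheel edge choice (sketch): the probability of moving to
    city j is proportional to tau[i][j]**alpha * eta[i][j]**beta, where
    tau is the pheromone matrix and eta the heuristic desirability
    (for TSP, typically 1 / distance)."""
    rng = rng or random.Random(0)
    weights = [tau[current][j] ** alpha * eta[current][j] ** beta
               for j in unvisited]
    r = rng.random() * sum(weights)
    acc = 0.0
    for j, w in zip(unvisited, weights):
        acc += w
        if acc >= r:
            return j
    return unvisited[-1]

def evaporate(tau, rho=0.5):
    """Pheromone evaporation: tau <- (1 - rho) * tau on every edge."""
    return [[(1.0 - rho) * t for t in row] for row in tau]
```

After evaporation, ants that completed good tours deposit fresh pheromone on their edges, which is how the most used path gradually dominates.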

3.4. Artificial bee colony

The ABC algorithm was developed by Karaboga in 2005. It is also a swarm-intelligence algorithm, based on the foraging behavior of honey bee swarms. The artificial bee colony in the ABC algorithm consists of three groups: employed bees, onlookers, and scouts. Employed bees search for food sources and share this information to recruit onlooker bees. Onlooker bees select the better food sources reported by the employed bees and search further around the selected source. If a food source is not improved within some number of iterations, its employed bee becomes a scout bee and searches randomly for new food sources. In [45], a hybrid algorithm combining PSO and ABC is developed; since PSO converges quickly and ABC converges slowly, the hybrid exploits the strengths of each algorithm.
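The neighbor search performed by employed and onlooker bees can be sketched as follows (an illustrative sketch, not code from [45]):

```python
import random

def abc_neighbor(x_i, x_k, rng=None):
    """Neighbor-solution generation in ABC (sketch): perturb one randomly
    chosen dimension j of solution x_i toward or away from a randomly
    chosen partner solution x_k:

        v_j = x_j + phi * (x_j - x_k_j),   phi drawn uniformly from [-1, 1]

    All other dimensions are copied unchanged.
    """
    rng = rng or random.Random()
    v = list(x_i)
    j = rng.randrange(len(x_i))
    phi = rng.uniform(-1.0, 1.0)
    v[j] = x_i[j] + phi * (x_i[j] - x_k[j])
    return v
```

The candidate v replaces x_i only if it has better fitness; a counter of failed replacements is what eventually turns an employed bee into a scout.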

3.5. Cuckoo search

CS is another swarm-intelligence algorithm, inspired by cuckoos. It is based on an interesting breeding behavior, the brood parasitism of certain cuckoo species, combined with the Lévy-flight behavior of some birds. CS is successful at finding the optima of multimodal functions. A hybrid algorithm has been proposed for the optimal design of multiband stop filters [46].
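The Lévy-flight step that drives CS exploration is commonly generated with Mantegna's algorithm, which can be sketched as follows (an illustrative sketch, not code from [39] or [46]):

```python
import math
import random

def levy_step(beta=1.5, rng=None):
    """One Levy-flight step length via Mantegna's algorithm (sketch).

    The heavy-tailed step distribution mixes many small local moves
    with occasional long jumps, which helps escape local optima.
    """
    rng = rng or random.Random(0)
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)    # scale of the numerator's Gaussian
    u = rng.gauss(0.0, sigma)
    v = rng.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)
```

In CS, a new nest position is then proposed as the current position plus a step-size factor times this Lévy step.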


4. Conclusion

After GA, PSO is the foremost and most successful algorithm, especially for the solution of continuous optimization problems. Subsequent swarm-intelligence algorithms share nearly the same ideas: all are population based, with particles distributed across the solution space. The particles have initial positions and velocities, and they converge to the optimal solution using their swarm intelligence.

Although optimization algorithms using swarm intelligence have improved, there is no single algorithm that succeeds on all types of optimization problems. Efforts to simulate animal behavior and swarm intelligence will therefore continue, as will the development of hybrid algorithms, until the best combinations of algorithms are found.

References

1. Holland JH. Adaptation in Natural and Artificial Systems. 2nd ed. Cambridge, MA: MIT Press; 1992 (first edition 1975)
2. Kirkpatrick S, Gelatt C, Vecchi M. Optimization by simulated annealing. Science. 1983;220(4598):671-680. Retrieved from http://www.jstor.org/stable/1690046
3. Kennedy J, Eberhart R. Particle swarm optimization. In: Proceedings of ICNN'95 - International Conference on Neural Networks. 1995. pp. 1942-1948
4. Shi Y, Eberhart R. A modified particle swarm optimizer. In: IEEE International Conference on Evolutionary Computation Proceedings. 1998. pp. 69-73
5. Bansal JC, Singh PK, Saraswat M, Verma A, Jadon SS, Abraham A. Inertia weight strategies in particle swarm optimization. In: Proceedings of the 2011 Third World Congress on Nature and Biologically Inspired Computing (NaBIC). 2011. pp. 633-640
6. Clerc M. The swarm and the queen: Towards a deterministic and adaptive particle swarm optimization. In: Proceedings of the 1999 ICEC. Washington, DC. 1999. pp. 1951-1957
7. Clerc M, Kennedy J. The particle swarm - explosion, stability, and convergence in a multidimensional complex space. IEEE Transactions on Evolutionary Computation. 2002;6(1):58-73. DOI: 10.1109/4235.985692
8. Eberhart RC, Shi Y. Comparing inertia weights and constriction factors in particle swarm optimization. In: Proceedings of the 2000 Congress on Evolutionary Computation (CEC00). La Jolla, CA. 2000;1:84-88. DOI: 10.1109/CEC.2000.870279
9. Gao S, Wang H, Wang C, Gu S, Xu H, Ma H. Reactive power optimization of low voltage distribution network based on improved particle swarm optimization. In: Proceedings of the 2017 20th International Conference on Electrical Machines and Systems (ICEMS). Sydney, NSW. 2017. pp. 1-5
10. Abbas G, Gu J, Farooq U, Asad MU, El-Hawary M. Solution of an economic dispatch problem through particle swarm optimization: A detailed survey - Part I. IEEE Access. 2017;5:15105-15141
11. Abbas G, Gu J, Farooq U, Raza A, Asad MU, El-Hawary ME. Solution of an economic dispatch problem through particle swarm optimization: A detailed survey - Part II. IEEE Access. 2017;5:24426-24445
12. Jois S, Ramesh R, Kulkarni AC. Face localization using skin colour and maximal entropy based particle swarm optimization for facial recognition. In: Proceedings of the 2017 4th IEEE Uttar Pradesh Section International Conference on Electrical, Computer and Electronics (UPCON). Mathura, India. 2017. pp. 156-161
13. Chaudhary R, Patel A, Kumar S, Tomar S. Edge detection using particle swarm optimization technique. In: Proceedings of the 2017 International Conference on Computing, Communication and Automation (ICCCA). Greater Noida, India. 2017. pp. 363-367
14. Mozaffari MH, Lee WS. Convergent heterogeneous particle swarm optimisation algorithm for multilevel image thresholding segmentation. IET Image Processing. 2017;11(8):605-619
15. Karami A, Tafakori L. Image denoising using generalised Cauchy filter. IET Image Processing. 2017;11(9):767-776
16. Zhou X. Color image filter based on predator-prey particle swarm optimization. In: Proceedings of the 2009 International Conference on Artificial Intelligence and Computational Intelligence. Shanghai. 2009. pp. 480-484
17. Marie-Sainte SL. A new hybrid particle swarm optimization algorithm for real-world university examination timetabling problem. In: Proceedings of the 2017 Computing Conference. London, United Kingdom. 2017. pp. 157-163
18. Chang JC. Modified particle swarm optimization for solving traveling salesman problem based on a Hadoop MapReduce framework. In: Proceedings of the 2016 International Conference on Applied System Innovation (ICASI). Okinawa. 2016. pp. 1-4
19. Wenbin G, Yuxin L, Yi W. Energy-efficient job shop scheduling problem using an improved particle swarm algorithm. In: Proceedings of the 2017 IEEE 3rd Information Technology and Mechatronics Engineering Conference (ITOEC). Chongqing. 2017. pp. 830-834
20. Junaedy A, Sulistijono IA, Hanafi N. Particle swarm optimization for coconut detection in a coconut tree plucking robot. In: Proceedings of the 2017 International Electronics Symposium on Knowledge Creation and Intelligent Computing (IES-KCIC). Surabaya, Indonesia. 2017. pp. 182-187
21. Roberge V, Tarbouchi M, Labonte G. Comparison of parallel genetic algorithm and particle swarm optimization for real-time UAV path planning. IEEE Transactions on Industrial Informatics. 2013;9(1):132-141
22. Juang C-F. A hybrid of genetic algorithm and particle swarm optimization for recurrent network design. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics). 2004;34(2):997-1006
23. Liu B, Wang L, Jin Y, Tang F, Huang D. Improved particle swarm optimization combined with chaos. Chaos, Solitons & Fractals. 2005;25:1261-1271
24. Alatas B, Akin E, Ozer AB. Chaos embedded particle swarm optimization algorithms. Chaos, Solitons & Fractals. 2009;40(4):1715-1734
25. Rong H. An adaptive chaos embedded particle swarm optimization algorithm. In: Proceedings of the 2010 International Conference on Computer, Mechatronics, Control and Electronic Engineering. Changchun. 2010. pp. 314-317
26. Turgut OE, Turgut MS, Coban MT. Chaotic quantum behaved particle swarm optimization algorithm for solving nonlinear system of equations. Computers and Mathematics with Applications. 2014
27. Askarzadeh A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Computers & Structures. 2016;169:1-12. ISSN 0045-7949. https://doi.org/10.1016/j.compstruc.2016.03.001
28. Mirjalili S. The ant lion optimizer. Advances in Engineering Software. 2015;83:80-98. ISSN 0965-9978. https://doi.org/10.1016/j.advengsoft.2015.01.010
29. Mirjalili S, Lewis A. The whale optimization algorithm. Advances in Engineering Software. 2016;95:51-67. ISSN 0965-9978. https://doi.org/10.1016/j.advengsoft.2016.01.008
30. Mirjalili S, Mirjalili SM, Lewis A. Grey wolf optimizer. Advances in Engineering Software. 2014;69:46-61. ISSN 0965-9978. https://doi.org/10.1016/j.advengsoft.2013.12.007
31. Wang GG, Deb S, Cui Z. Monarch butterfly optimization. Neural Computing & Applications. 2015. https://doi.org/10.1007/s00521-015-1923-7
32. Mirjalili S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowledge-Based Systems. 2015;89:228-249. ISSN 0950-7051. https://doi.org/10.1016/j.knosys.2015.07.006
33. Fausto F, Cuevas E, Valdivia A, González A. A global optimization algorithm inspired in the behavior of selfish herds. Biosystems. 2017;160:39-55. ISSN 0303-2647. https://doi.org/10.1016/j.biosystems.2017.07.010
34. Mirjalili S, Amir H, Mirjalili SZ, Saremi S, Faris H, Mirjalili SM. Salp swarm algorithm: A bio-inspired optimizer for engineering design problems. Advances in Engineering Software. 2017;114:163-191. ISSN 0965-9978. https://doi.org/10.1016/j.advengsoft.2017.07.002
35. Yang XS. Firefly algorithms for multimodal optimization. In: Watanabe O, Zeugmann T, editors. Stochastic Algorithms: Foundations and Applications (SAGA). Lecture Notes in Computer Science. Vol. 5792. Berlin, Heidelberg: Springer; 2009
36. Passino KM. Biomimicry of bacterial foraging for distributed optimization and control. IEEE Control Systems. 2002;22(3):52-67. DOI: 10.1109/MCS.2002.1004010
37. Dorigo M, Maniezzo V, Colorni A. Ant system: Optimization by a colony of cooperating agents. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics). 1996;26(1):29-41. DOI: 10.1109/3477.484436
38. Karaboga D, Basturk B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. Journal of Global Optimization. 2007;39:459-471
39. Yang X-S, Deb S. Cuckoo search via Levy flights. In: World Congress on Nature & Biologically Inspired Computing (NaBIC 2009). 2009. pp. 210-214
40. Padmavathi K, Sri Rama Krishna K. Hybrid firefly and particle swarm optimization algorithm for the detection of bundle branch block. International Journal of the Cardiovascular Academy. 2016;2(1):44-48. ISSN 2405-8181. https://doi.org/10.1016/j.ijcac.2015.12.001
41. Dhillon SS, Lather JS, Marwaha S. Multi objective load frequency control using hybrid bacterial foraging and particle swarm optimized PI controller. International Journal of Electrical Power & Energy Systems. 2016;79:196-209. ISSN 0142-0615. https://doi.org/10.1016/j.ijepes.2016.01.012
42. Kora P, Kalva SR. Hybrid bacterial foraging and particle swarm optimization for detecting bundle branch block. SpringerPlus. 2015;4(1):481
43. Vitorino LN, Ribeiro SF, Bastos-Filho CJA. A mechanism based on artificial bee colony to generate diversity in particle swarm optimization. Neurocomputing. 2015;148:39-45. ISSN 0925-2312. https://doi.org/10.1016/j.neucom.2013.03.076
44. Cheng C-Y, Chen Y-Y, Chen T-L, Yoo JJ-W. Using a hybrid approach based on the particle swarm optimization and ant colony optimization to solve a joint order batching and picker routing problem. International Journal of Production Economics. 2015;170(Part C):805-814. ISSN 0925-5273. https://doi.org/10.1016/j.ijpe.2015.03.021
45. Li Z, Wang W, Yan Y, Zheng L. PS-ABC: A hybrid algorithm based on particle swarm and artificial bee colony for high-dimensional optimization problems. Expert Systems with Applications. 2015;42(22):8881-8895. ISSN 0957-4174. https://doi.org/10.1016/j.eswa.2015.07.043
46. Dash J, Dam B, Swain R. Optimal design of linear phase multi-band stop filters using improved cuckoo search particle swarm optimization. Applied Soft Computing. 2017;52:435-445. ISSN 1568-4946. https://doi.org/10.1016/j.asoc.2016.10.024
