Abstract
It has recently been shown that particle swarm optimization (PSO) is a modern global optimization method, and it has been used in many real-world engineering problems to estimate model parameters. PSO has also emerged as a powerful alternative to conventional geophysical modeling techniques, which suffer from dependence on the initial model, linearization problems, and entrapment at local minima. A neglected area in the use of PSO is the joint modeling of geophysical data sets having different sensitivities, although this kind of modeling with multiobjective optimization techniques has become an important way to increase the uniqueness of the model parameters. However, applying subjective and unpredictable weightings to the objective functions may produce a misleading solution in multiobjective optimization. Multiobjective PSO (MOPSO) with a Pareto approach yields a set of solutions, including a jointly optimal solution, without any weighting requirement. This chapter begins with an overview of PSO and Pareto-based MOPSO, presenting their mathematical formulations, algorithms, and the alternative approaches used in these methods. The chapter goes on to present a series of synthetic models of seismological data, one kind of geophysical data, obtained using Pareto-based multiobjective PSO. Based on the well-matched results, we believe that multiobjective PSO is an innovative approach to the joint modeling of such data.
Keywords
- particle swarm optimization
- Pareto-based optimization
- multiobjective
- geophysical modeling
1. Introduction
As a conventional approach, least squares and linear programming optimization methods have been used for modeling geophysical data in a general form, without requiring any special case. However, due to some disadvantages of these methods, such as computational time and linearization problems, it has become inevitable for researchers to turn to new approaches. Unlike traditional optimization methods, the optimization of nonlinear models has been improved in two ways: derivative-based methods and non-derivative search methods. Unfortunately, one of the major disadvantages of derivative-based methods is that solutions potentially become trapped at a local minimum because of their dependence on the initial model. On the other hand, non-derivative search methods partly provide a global solution; however, they significantly increase computing time.
In recent years, two approaches, artificial intelligence and meta-heuristic optimization algorithms, have been effectively put forward in geophysical modeling studies involving complex and nonlinear models. Meta-heuristic optimization algorithms, called modern global optimization methods, are based on the systematic characteristics of biological, molecular, neurobiological, and animal swarms [1]. PSO, as one of the modern global optimization algorithms, has grown in popularity through its rapid convergence compared with various other optimization algorithms [2, 3], especially when real model parameters are used [4].
PSO for multiobjective optimization has also been used in many studies to solve real-world engineering problems with conflicting solutions between objective functions [5]. Despite this interest, very few researchers have studied MOPSO for the joint modeling of geophysical data such as electromagnetic and gravity data [6, 7]. In fact, the simultaneous optimization of multiple objective functions is also favored to increase the uniqueness of model parameters in the joint modeling of geophysical data, which are generally sensitive to different physical phenomena. Multiple objective functions can be transformed into a single objective by combining them with a weighted-sum approach. However, it is very difficult to find reasonable and optimal weighting coefficients [5]. In engineering problems, the subjective and unpredictable weightings applied to objective functions are the primary cause of misleading solutions, because the different sensitivities and unpredictable noise of different data sets lead to uncertainty in the weighting. The Pareto optimality approach is a good way to obtain a set of possible solutions, including an optimal solution, in objective function space, overcoming the need for weighting and combining.
The purpose of this chapter is to review the literature on Pareto-based MOPSO. The chapter first gives a brief overview of the methods and approaches used in PSO and Pareto-based MOPSO and looks at how the mathematical formulations and general algorithms of these optimization techniques work. In order to show the superiority of Pareto-based MOPSO over weighted-sum approaches, the chapter proceeds with the joint modeling of two synthetic seismological data sets using Pareto-based MOPSO and analyzes the obtained results. The results demonstrate that Pareto-based MOPSO is a useful approach to the joint modeling of seismological data, as explained in detail in our previous paper [8], of which TÜBİTAK is the publisher. The findings validate the usefulness of MOPSO as a technique for optimizing objective functions simultaneously without weighting requirements. Finally, the conclusion section gives a brief summary of MOPSO and a critique of the findings in modeling.
2. Particle swarm optimization
The PSO method, inspired by the social behavior of bird or fish flocks reaching a target by the shortest route, was originally introduced by [9]. When the trajectories of swarms were observed, it was noticed that members of flocks suddenly change their movements by scattering and regrouping. A striking feature of this behavior was the effort of members to reduce their distance from both the flock and the surrounding members. It was found that knowledge within a flock is continuously shared by all members. The PSO method was developed by defining each member of a flock as a particle. According to PSO, particles carrying information on the decision variables, or model parameters, take a position in an objective function space. Each particle is in communication and learning with the other particles, as schematically illustrated in Figure 1a. If a minimization problem is considered, each particle changes its position with a velocity vector and converges to the global minimum, as shown in Figure 1b.
The basic algorithm of PSO is outlined in Figure 2. According to this scheme, particles whose velocities are initially set to zero are initialized by random selection between the minimum and maximum values of the decision variables. After each particle is evaluated with an objective function, the particle with the best fitness value is assigned as the global best solution. Particles move to their next positions with a velocity vector:

v_i(t+1) = w·v_i(t) + c1·r1·[p_i(t) − x_i(t)] + c2·r2·[g(t) − x_i(t)] (1)

x_i(t+1) = x_i(t) + v_i(t+1) (2)

where subscript i denotes the i-th particle and t the iteration number; x_i and v_i are the position and velocity vectors of particle i; p_i is the personal best position found by particle i and g is the global best position of the swarm; w is the inertia weight; c1 and c2 are the cognitive and social learning coefficients; and r1 and r2 are uniform random numbers in [0, 1].
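The update scheme in Eqs. (1) and (2) can be sketched in a few lines of Python. This is a minimal, illustrative global-best PSO run on a toy sphere function; the function name `pso`, the swarm size, and the parameter defaults (`w`, `c1`, `c2`) are our own assumptions, not values prescribed in this chapter:

```python
import random

def pso(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over the box `bounds` with a basic global-best PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    # random initial positions, zero initial velocities (as in Figure 2)
    x = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]
    pbest_f = [f(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Eq. (1): inertia + cognitive + social terms
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                # Eq. (2): position update, clipped to the search space
                x[i][d] = min(max(x[i][d] + v[i][d], bounds[d][0]), bounds[d][1])
            fx = f(x[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = x[i][:], fx
    return gbest, gbest_f

# usage: 2-D sphere function with its minimum at the origin
best, best_f = pso(lambda p: sum(t * t for t in p), [(-5, 5)] * 2)
```

With a fixed seed the run is reproducible; in practice the swarm size, iteration count, and coefficients would be tuned to the problem, as discussed in Section 2.1.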
2.1 Selection of the PSO parameters
The velocity vector in Eq. (1) is controlled by the following factors: velocity limitation, learning coefficients, and inertia weight. These factors contribute significantly to preventing explosion in a swarm and to ensuring convergence.
A linearly decreasing inertia weight is commonly used [14]:

w(t) = w_max − (w_max − w_min)·t/t_max (3)

Eq. (3) reveals that the inertia weight decreases from w_max to w_min over the iterations, so the swarm gradually shifts from global exploration toward local exploitation.
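As a rough sketch of two of these control factors, a linearly decreasing inertia weight and component-wise velocity clamping might be coded as follows (the helper names and the default values 0.9 and 0.4 are illustrative assumptions, not values prescribed by the chapter):

```python
def linear_inertia(t, t_max, w_max=0.9, w_min=0.4):
    # inertia weight decreasing linearly from w_max to w_min over t_max iterations
    return w_max - (w_max - w_min) * t / t_max

def clamp_velocity(v, v_max):
    # limit each velocity component to the interval [-v_max, v_max]
    return [max(-v_max, min(v_max, vd)) for vd in v]
```

The clamp is applied after the velocity update of Eq. (1), before the particle moves; it bounds the step size and thus helps prevent swarm explosion.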
2.2 Swarm topologies
The network of particles in a swarm is provided by neighborhood topologies that regulate the sharing of information between particles. Small-scale topologies are preferred for solving complex problems, whereas large-scale topologies are selected for simpler problems [19]. Empty, local best, fully connected, star network, and tree network are the neighborhood topologies generally used in PSO [20].
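A neighborhood topology amounts to a rule mapping each particle to the set of particles it learns from. As a minimal sketch (the function names are our own), a local-best ring topology and a fully connected global-best topology could look like:

```python
def ring_neighbors(i, n, k=1):
    # local-best (ring) topology: the 2k nearest particles on a ring of n particles
    return [(i + off) % n for off in range(-k, k + 1) if off != 0]

def fully_connected_neighbors(i, n):
    # fully connected topology: every other particle informs particle i
    return [j for j in range(n) if j != i]
```

In the local-best case each particle tracks the best position found within its small neighborhood instead of the single global best, which slows convergence but reduces the risk of premature stagnation on complex problems.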
3. Pareto multiobjective optimization
In multiobjective optimization problems (MOOP), defined as the optimization of more than one objective function, a set of "trade-off" solutions that conflict with each other is obtained rather than a single solution. MOOP is generally defined as finding the decision variables x = (x1, x2, …, xn) that minimize the objective vector

F(x) = [f1(x), f2(x), …, fM(x)]

in the feasible decision space, including all objective functions [25]. In MOOP, all objective functions in F(x) are compared through Pareto dominance: a solution x(1) dominates a solution x(2) if f_k(x(1)) ≤ f_k(x(2)) for every objective k and f_k(x(1)) < f_k(x(2)) for at least one k. Solutions that are not dominated by any other solution are called non-dominated, and their image in objective function space forms the Pareto front.
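The Pareto-dominance comparison can be expressed directly in code. The following sketch (illustrative, not taken from the chapter) tests whether one objective vector dominates another and filters a set of candidate solutions down to its non-dominated subset:

```python
def dominates(a, b):
    # a dominates b: a is no worse in every objective and strictly better in at least one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # keep only the objective vectors not dominated by any other vector
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]
```

For example, among the objective vectors (1, 5), (2, 2), (5, 1), and (4, 4), the last is dominated by (2, 2) and the first three form the non-dominated set.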
The particles of a Pareto front generally spread in one of two ways: concave or L-shaped curves. A concave-shaped spread indicates that one objective function cannot be improved without deteriorating another objective function (or functions). On the other hand, an L-shaped spread indicates a completely optimal solution, meaning that all objective functions can be optimized simultaneously [26]. Another remarkable feature of the distribution of a Pareto front is its deviation from symmetry with respect to the utopia point [0, 0]. Such a deviation indicates that one objective function has many local minima relative to the others (at least one) and/or that the modeling has not been properly accomplished with the defined parameter search space and the methods used [27].
4. Pareto-based multiobjective particle swarm optimization
Several considerations should be taken into account when using MOPSO: increasing the diversity, maintaining the non-dominated particles, and selecting a leader [20]. A general MOPSO algorithm, modified from the PSO algorithm according to these considerations, is outlined in Figure 5.
The kernel density estimator is based on the niching method. A niche with a given radius defines a neighborhood around each particle in objective function space; particles crowded inside the same niche are penalized, so the swarm is encouraged to spread along the Pareto front rather than collapse onto a few regions.
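Diversity maintenance along the front is often implemented with a density measure over the archive. As one common concrete choice, the following sketch computes the crowding distance of NSGA-II [40], which many MOPSO variants use as a density estimator when selecting leaders [42]; it is an illustration of the idea, not the chapter's exact estimator:

```python
def crowding_distance(front):
    # crowding distance of each objective vector in `front`; boundary points get infinity
    n = len(front)
    if n <= 2:
        return [float("inf")] * n
    m = len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        f_min, f_max = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if f_max == f_min:
            continue
        for j in range(1, n - 1):
            # normalized gap between the two neighbors along objective k
            dist[order[j]] += (front[order[j + 1]][k]
                               - front[order[j - 1]][k]) / (f_max - f_min)
    return dist
```

Leaders are then drawn preferentially from sparse regions (large crowding distance), which pushes the archive toward an even spread along the Pareto front.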
5. Selecting the methods and parameters for Pareto-based multiobjective particle swarm optimization
We jointly modeled two synthetic seismological data sets obtained from the responses of five-layered models by using MOPSO. In one synthetic model, the shear-wave velocities increase smoothly with depth (SM-1), while the other has a noticeable velocity contrast at the third layer interface (SM-2). In the modeling stage, we simultaneously optimized two objective functions related to the Rayleigh wave dispersion (RWD) and horizontal-to-vertical spectral ratio (HVSR) methods, which have different sensitivities to physical phenomena. The estimated parameters were the shear-wave velocity of each layer and the depths obtained by the cumulative sum of the layer thicknesses. The parameter search space and further technical details are given in [8].
Minimization was carried out between the observed data, obtained from the responses of the synthetic models (d_obs), and the calculated data, obtained from the responses of the estimated models (d_calc), using a root-mean-square (RMS) misfit for each objective function:

f = sqrt( (1/N) Σ (d_obs,j − d_calc,j)² )

where N is the number of data points and j is the data index.
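A misfit of this form can be computed in a few lines; this is shown only as an assumed illustration, since the exact norm used for each objective function is given in [8]:

```python
import math

def rms_misfit(d_obs, d_calc):
    # root-mean-square difference between observed and calculated responses
    assert len(d_obs) == len(d_calc)
    return math.sqrt(sum((o - c) ** 2 for o, c in zip(d_obs, d_calc)) / len(d_obs))
```

In the joint modeling, one such misfit is evaluated for the RWD curve and one for the HVSR curve, and the two values form the objective vector compared by Pareto dominance.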
6. Results and discussions
The results for the synthetic models (SM-1 and SM-2) are shown in Figure 9 and Figure 10, respectively. As can be seen in Figure 9a and Figure 10a, MOPSO successfully recovered the real models with the defined methods and parameters. These tests, showing a close match between the synthetic data and the model responses in Figure 9b, Figure 9c, Figure 10b and Figure 10c, further strengthened our confidence in MOPSO for modeling such seismological data. The most conspicuous observation to emerge from the results of both models was the distribution of the dominated particles and the Pareto front. The dominated particles in Figure 9d show a clear and balanced distribution for SM-1; however, as seen in Figure 10d, the dominated particles for SM-2 tend toward one of the objective function axes.
7. Conclusion
In the first step, this chapter gave an overview of the methods and parameters generally used in PSO and Pareto-based MOPSO. The parameters and methods used in the literature are reliable but do not have an obvious superiority over each other. Although MOPSO has been widely applied to many real-world engineering problems, few attempts have been made to model geophysical data with it. Until our previous study, this methodology had not been applied to modeling seismological data. The set of solutions demonstrated in this chapter supports the idea that MOPSO provides a powerful methodology for the joint modeling of data having different sensitivities. The present findings have important implications for solving the weighting problem encountered in the joint modeling approach. A clear benefit of MOPSO in avoiding weighted-sum approaches could be clearly identified in this analysis. A further important implication is that the divergence of particles from an objective function axis is related not only to properly defined parametrization and accomplished modeling, but also to the non-uniqueness of solutions. Our investigations into this point are still ongoing and seem likely to confirm our hypothesis. The evidence from this study implies that MOPSO will be very attractive for the joint modeling of geophysical data in the future. However, further work needs to be performed to confirm whether MOPSO is beneficial to the joint modeling of different types of geophysical data.
Acknowledgments
I am grateful to Prof. Dr. Abdullah Karaman and Assoc. Prof. Dr. Ekrem Zor for their help and for valuable suggestions and discussions. Support was given by Istanbul Technical University and the Council of Higher Education (YÖK), which funded the work in its initial stages.
References
- 1.
Rao SS. Engineering Optimization: Theory and Practice. Fourth Edition. Hoboken, NJ, USA: John Wiley & Sons, Inc; 2009. 813 p. DOI: 10.1002/9780470549124 - 2.
Van Den Bergh F, Engelbrecht AP. A study of particle swarm optimization particle trajectories. Information Sciences. 2006; 176: 937–971. DOI: 10.1016/j.ins.2005.02.003 - 3.
Büyük E, Zor E, Karaman A. Rayleigh wave dispersion curve inversion by using particle swarm optimization and genetic algorithm. In: 19th European Geophysical Union EGU General Assembly; 2017; Vienna, Austria. p. 6911 - 4.
Engelbrecht AP. Computational Intelligence: An Introduction. Second Edition . John Wiley and Sons, Inc; 2007. 628 p. DOI: 10.1002/9780470512517 - 5.
Blum C, Li X. Swarm Intelligence in Optimization. In: Blum C, Merkle D, editors. Swarm Intelligence Introduction and Applications. Berlin Heidelberg: Springer-Verlag; 2008. p. 43–85. DOI: 10.1007/978-3-540-74089-6_2 - 6.
Pace F, Godio A, Santilano A, et al. Joint optimization of geophysical data using multi-objective swarm intelligence. Geophysical Journal International. 2019; 218: 1502–1521. DOI: 10.1093/gji/ggz243 - 7.
Danek T, Leśniak A, Miernik K, et al. Pareto Joint Inversion of 2D magnetometric and gravity data-synthetic study. E3S Web of Conference. 2019; 133. DOI: 10.1051/e3sconf/201913301009 - 8.
Büyük E, Zor E, Karaman A. Joint modeling of Rayleigh wave dispersion and H/V spectral ratio using Pareto-based multiobjective particle swarm optimization. Turkish Journal of Earth Sciences. 2020; 29: 684–695. DOI: 10.3906/yer-2001-15 - 9.
Kennedy J, Eberhart R. Particle swarm optimization. Neural Networks. 1995; 4: 1942–1948. DOI: 10.1109/ICNN.1995.488968 - 10.
Eberhart RC, Shi Y. Particle swarm optimization: Developments, applications and resources. In: Proceedings of the IEEE Conference on Evolutionary Computation, ICEC; 2001; 1: 81–86. DOI: 10.1109/cec.2001.934374 - 11.
Fan H, Shi Y. Study on Vmax of particle swarm optimization. In: Proceedings of the Workshop on Particle Swarm Optimization. Indianapolis: Purdue School of Engineering and Technology, IUPUI. 2001 - 12.
Ratnaweera A, Halgamuge SK, Watson HC. Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients. IEEE Transactions on Evolutionary Computation. 2004; 8: 240–255. DOI: 10.1109/TEVC.2004.826071 - 13.
Ozcan E, Mohan CK. Particle swarm optimization: Surfing the waves. In: Proceedings of the 1999 Congress on Evolutionary Computation, CEC; 1999 ; pp. 1939–1944. DOI: 10.1109/CEC.1999.785510 - 14.
Shi Y, Eberhart R. Modified particle swarm optimizer. In: Proceedings of the IEEE Conference on Evolutionary Computation, ICEC ; 1998; pp. 69–73. DOI: 10.1109/icec.1998.699146 - 15.
Kennedy J. The behavior of particles. In: Porto VW, Saravanan N, Waagen D, editors. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). Berlin: Springer Berlin Heidelberg; 1998. p. 581–589. DOI: 10.1007/bfb0040809 - 16.
Venter G, Sobieszczanski-Sobieski J. Particle Swarm Optimization. AIAA Journal. 2003; 41: 1583–1589. DOI: 10.2514/2.2111 - 17.
Clerc M. The swarm and the queen: Towards a deterministic and adaptive particle swarm optimization. In: Proceedings of the 1999 Congress on Evolutionary Computation, CEC; 1999; pp. 1951–1957. DOI: 10.1109/CEC.1999.785513 - 18.
Eberhart RC, Shi Y. Comparing inertia weights and constriction factors in particle swarm optimization. In: Proceedings of the 2000 Congress on Evolutionary Computation, CEC; 2000 ; pp. 84–88. DOI: 10.1109/CEC.2000.870279 - 19.
Mendes R, Kennedy J, Neves J. The fully informed particle swarm: Simpler, maybe better. IEEE Transactions on Evolutionary Computation. 2004; 8: 204–210. DOI: 10.1109/TEVC.2004.826074 - 20.
Coello Coello CA, Reyes-Sierra M. Multi-Objective Particle Swarm Optimizers: A Survey of the State-of-the-Art. International Journal of Computational Intelligence Research. 2006; 2: 287–308. DOI: 10.5019/j.ijcir.2006.68 - 21.
Eberhart RC, Shi Y. Computational Intelligence. In: Eberhart RC, Shi Y, editors. Morgan Kaufmann; 2007; p. 17–38. DOI: 10.1016/B978-155860759-0/50002-0 - 22.
Kennedy J. Small worlds and mega-minds: Effects of neighborhood topology on particle swarm performance. In: Proceedings of the 1999 Congress on Evolutionary Computation, CEC 1999. IEEE Computer Society; 1999; pp. 1931–1938. DOI: 10.1109/CEC.1999.785509 - 23.
Engelbrecht AP. Computational Intelligence: An Introduction. John Wiley & Sons; 2005. p. 597. DOI: 10.1002/9780470512517 - 24.
Janson S, Middendorf M. A hierarchical particle swarm optimizer. In: 2003 Congress on Evolutionary Computation, CEC 2003 - Proceedings; IEEE Computer Society; 2003. pp. 770–776. DOI: 10.1109/CEC.2003.1299745 - 25.
Deb K. Multi-Objective Optimization Using Evolutionary Algorithms. New York USA: John Wiley & Sons; 2001. p 518 - 26.
Kozlovskaya E, Vecsey L, Plomerova J, et al. Joint inversion of multiple data types with the use of multiobjective optimization: problem formulation and application to the seismic anisotropy investigations. Geophysical Journal International. 2007; 171: 761–779. DOI: 10.1111/j.1365-246X.2007.03540.x - 27.
Dal Moro G. Insights on surface wave dispersion and HVSR: Joint analysis via Pareto optimality. Journal of Applied Geophysics. 2010; 72: 129–140. DOI: 10.1016/j.jappgeo.2010.08.004 - 28.
Angeline PJ. Evolutionary optimization versus particle swarm optimization: Philosophy and performance differences. In: Porto VW, Saravanan N, Waagen D, et al., editors. Evolutionary Programming VII. Berlin: Springer, Berlin, Heidelberg; pp. 601–610. DOI: 10.1007/bfb0040811 - 29.
Van Den Bergh F. An Analysis of Particle Swarm Optimizers. PhD Thesis, Department of Computer Science, University of Pretoria, Pretoria, South Africa, 2006; p. 284 - 30.
Coello Coello C, Pulido GT, Lechuga MS. Handling multiple objectives with particle swarm optimization. IEEE Transactions on Evolutionary Computation. 2004; 8: 256–279. DOI: 10.1109/TEVC.2004.826067 - 31.
Goldberg DE, Holland JH. Genetic Algorithms and Machine Learning. Machine Learning; 1988; 3: 95–99. DOI: 10.1023/A:1022602019183 - 32.
Van den Bergh F, Engelbrecht AP. A new locally convergent particle swarm optimiser. In: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics. 2002, pp. 94–99. DOI: 10.1109/icsmc.2002.1176018 - 33.
Brits R, Engelbrecht AP, Bergh F Van Den. A Niching Particle Swarm Optimizer. In: Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning (SEAL 2002). Orchid Country Club, Singapore, 2002, pp. 692–696 - 34.
Barrera J, Coello CAC. A review of particle swarm optimization methods used for multimodal optimization. In: Chee Peng Lim, Lakhmi C. Jain, Dehuri S, editors. Innovations in Swarm Intelligence. 2009, pp. 9–37. DOI: 10.1007/978-3-642-04225-6_2 - 35.
Kennedy J. Stereotyping: Improving particle swarm performance with cluster analysis. In: Proceedings of the IEEE Conference on Evolutionary Computation, ICEC . 2000, pp. 1507–1512 - 36.
Pulido GT, Coello Coello CA. Using clustering techniques to improve the performance of a multi-objective particle swarm optimizer. In: Deb K, editor. Genetic and Evolutionary Computation. Lecture Notes in Computer Science: 2004; 3102: 225–237. DOI: 10.1007/978-3-540-24854-5_20 - 37.
Laumanns M, Thiele L, Deb K, et al. Combining convergence and diversity in evolutionary multiobjective optimization. Evolutionary Computation. 2002; 10: 263–282. DOI: 10.1162/106365602760234108 - 38.
Salazar-Lechuga M, Rowe JE. Particle swarm optimization and fitness sharing to solve multi-objective optimization problems. In: Congress on Evolutionary Computation (CEC’2005). 2005; Edinburgh, Scotland, UK: IEEE Press; 2: 1204–1211. DOI: 10.1109/cec.2005.1554827 - 39.
Li X. A non-dominated sorting particle swarm optimizer for multiobjective optimization. In: Cantú-Paz E. et al., editors. Genetic and Evolutionary Computation. Lecture Notes in Computer Science, Springer, Berlin, Heidelberg: 2003; pp. 37–48. DOI: 10.1007/3–540-45105-6_4 - 40.
Deb K, Pratap A, Agarwal S, et al. A fast and elitist multiobjective genetic algorithm: NSGA-II. In: IEEE Transactions on Evolutionary Computation. 2002; 6:182–197. DOI: 10.1109/4235.996017 - 41.
Deb K, Goldberg DE. An investigation of niche and species formation in genetic function optimization. In: Proceedings of the third international conference on Genetic algorithms. 1989;pp. 42–50 - 42.
Raquel CR, Naval PC. An effective use of crowding distance in multiobjective particle swarm optimization. In: GECCO 2005 - Genetic and Evolutionary Computation Conference. New York, New York, USA: ACM Press, 2005; pp. 257–264. DOI: 10.1145/1068009.1068047 - 43.
Kennedy J. Swarm Intelligence. In: Zomaya A.Y. editor. Handbook of Nature-Inspired and Innovative Computing. Springer, Boston: 2006; pp 187–219. DOI: 10.1007/0–387-27705-6_6