
## Abstract

Crow Search Algorithm (CSA) is a recently proposed swarm intelligence algorithm inspired by the social behaviour of crow flocks. One drawback of the original CSA is that its search strategy randomly selects a neighbour to follow, which slows convergence, and applying the same strategy across all iterations pushes the search towards local optima. A multi-strategy search for CSA (CSA-MSS) is proposed to enrich the search facilities and provide diversity in order to overcome these drawbacks. The multi-strategy search utilises a pool of six different types of search operators; a multi-strategy approach with a selection mechanism has not been proposed for CSA before and is implemented here for the first time. A comparative performance analysis of the proposed algorithm has been conducted over 23 benchmark problems. The results demonstrate that the proposed approach outperforms well-known state-of-the-art methods.

### Keywords

- swarm intelligence
- crow search algorithm
- global optimisation
- strategy selection
- operator selection schemes

## 1. Introduction

Over the last two decades, many nature-inspired metaheuristic optimisation algorithms (MOAs) have been proposed to solve abstract and real-world problems, including combinatorial, binary and real engineering problems [1]. MOAs are generally iterative processes developed by imitating natural phenomena and animal behaviours to obtain optimal solutions for given problems [2]. Although these algorithms provide satisfactory solutions to various optimisation problems, they do not always guarantee optimality across problems of different scales [3, 4]. In this respect, MOAs have recently received increasing interest from scholars in related fields (e.g. engineering, business, logistics) as a way to design algorithms that approach general problems using patterns of behaviour inspired by a given society, such as a colony of insects or other animal societies [5]. Swarm intelligence algorithms simulate the social behaviours of the imitated societies; some renowned examples are particle swarm optimisation (PSO), inspired by bird flocking and fish schooling [6, 7]; artificial bee colony (ABC), which simulates the collective food-searching behaviour of honey bees [8, 9]; the cuckoo search algorithm, based on the brood parasitism of some cuckoo species [10, 11]; the group search optimiser (GSO), based on animal searching behaviour [12, 13]; and the firefly algorithm (FA), based on the flashing light patterns of tropical fireflies [14, 15].

Crow Search Algorithm (CSA) is another swarm intelligence metaheuristic, inspired by the feeding behaviours of crow flocks [1]. Similar to many other MOAs, CSA needs a balance between exploration and exploitation throughout the search process in order to approximate globally optimal solutions [16]. Exploitation refers to using gained experience to deepen the search within the locality of the current solution, while exploration implies diversification, e.g. jumping to another region of the search space, in order to reach the optimal solution. The majority of real-world problems are multi-modal, possessing numerous local optima that are hard to avoid, so a purely exploitative search has a high probability of getting stuck in a local optimum. In contrast, exploration enriches the search with higher diversity, letting it move to other feasible regions of the search space that may offer a better payoff, since a new region may contain the global best. In addition, well-balanced exploration lets the algorithm avoid sub-optimal solutions. It is known that a search strategy balancing exploration and exploitation facilitates finding near-optimal solutions close to the global optimum [17] in a shorter time, whereas CSA, by design, is known for its high exploitation capability [18].

In order to balance the exploration and exploitation capabilities of MOAs, a number of approaches have been implemented; among these, three categories can be highlighted: (i) dynamic parameterisation on the fly [19], (ii) implementing different perturbation search strategies [20] and (iii) using multiple strategies in the search instead of a single one [21]. In the last approach, a strategy pool consisting of different strategies is constructed, and the algorithm chooses the appropriate strategy according to the circumstances and dynamics of the search process. In this article, an implementation of CSA with multiple strategies is introduced to solve functional optimisation problems using a pool of six different search strategies. The main contribution of the study is the use of a multi-strategy search policy with CSA, together with a selection scheme that responds to the search dynamics, for solving functional optimisation problems.

## 2. Related works

Studies related to this work fall into two categories: modified versions of CSA and adaptation approaches. Zamani et al. [22] proposed a new version of CSA, called Conscious Neighbourhood-based CSA (CCSA), with three new search strategies to improve the local and global search capabilities of CSA. Sonuc [23] proposed BinCSA, a pure binary version of CSA, in which a logic gate is used as the search strategy. Majhi et al. [24] developed another version of CSA, called OBL-CSA-MO, which attempts to escape from local optima by using an opposition-based learning strategy and a mutation operator.

CSA has two control parameters: the awareness probability (AP) and the flight length (FL). AP is the probability of generating a random solution and controls the balance between intensification and diversification, while FL controls the step size of the search. Rizk-Allah et al. [25] proposed a version of CSA integrated with chaos theory for solving fractional optimisation problems. Coelho et al. [26] developed a modified CSA in which a Gaussian distribution function is used to control the two algorithmic parameters. Necira et al. [27] proposed an enhanced version of CSA, called dynamic crow search algorithm (DCSA), with two modifications: the awareness probability is linearly adjusted throughout the iterations, and FL is expressed as a Pareto probability density function. There are several further studies aiming to improve the efficiency and robustness of CSA; for more details and further modifications, the reader is referred to the surveys [28, 29]. To the best of the authors' knowledge, CSA has not previously been studied with multiple search strategies to avoid the local optima problem, and this article presents the first study in this respect.

## 3. Materials and methods

### 3.1 Crow search algorithm

Crows are known to be among the smartest bird species [30]; they can remember the locations where they or other bird species store food [31] and are often even called 'thieves'. With such sound memory skills, they can recall faces and warn the flock through complex communication channels in dangerous situations [32]. CSA was developed as a metaheuristic optimisation algorithm inspired by these smart behaviours [1], where each crow represents a valid solution within the search space and the crow's memory represents the best solution found so far. Let $x^{i,t} = (x^{i,t}_1, x^{i,t}_2, \ldots, x^{i,t}_d)$ denote the position of crow $i$ at iteration $t$ in a flock of $N$ crows searching a $d$-dimensional space. The position of the information held by each crow in the memory should be

$$m^{i,t} = (m^{i,t}_1, m^{i,t}_2, \ldots, m^{i,t}_d) \tag{1}$$

where $m^{i,t}$ denotes the best solution that crow $i$ has obtained up to iteration $t$.

The general overview of CSA is given as pseudo code in Algorithm 1. First, the algorithmic parameters are determined and the position of each crow is randomly generated in the $d$-dimensional search space. The first population is adopted as the initial best set of solutions in the flock memory. Then the main loop of the algorithm starts: each crow in the flock decides whom to follow within the flock. Next, with the awareness probability $AP$, the followed crow notices that it is being pursued and diverts the follower to a random position; otherwise, the follower moves towards the followed crow's memorised position.

**Algorithm 1.** The pseudo code of the CSA.

```text
1:  Initialise the parameters (N, AP, fl, maximum iterations)
2:  Randomly initialise the crow positions
3:  Memorise the positions
4:  while the termination criterion is not met do
5:      for i = 1 : N do
6:          Selection phase: crow i randomly picks a crow j to follow
7:          Determine AP
8:          if rand() >= AP then
9:              Calculate the new position of crow i using Eq. (2)
10:         else
11:             Generate a random position
12:         end if
13:     end for
14:     Bound control
15:     Evaluate the new positions
16:     Memorise the best positions
17: end while
```
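The loop above can be made concrete with a minimal Python sketch. The parameter values (20 crows, AP = 0.1, fl = 2.0) and the sphere objective are illustrative assumptions, not the parameterisation used in the paper's experiments.

```python
import numpy as np

def csa(objective, dim=10, n_crows=20, ap=0.1, fl=2.0, max_iter=500,
        lower=-100.0, upper=100.0, rng=None):
    """Minimal Crow Search Algorithm sketch (assumed parameter values)."""
    rng = rng or np.random.default_rng(42)
    x = rng.uniform(lower, upper, (n_crows, dim))   # crow positions
    mem = x.copy()                                  # flock memory (best-so-far per crow)
    mem_fit = np.apply_along_axis(objective, 1, mem)
    for _ in range(max_iter):
        for i in range(n_crows):
            j = rng.integers(n_crows)               # crow i follows a random crow j
            if rng.random() >= ap:                  # crow j unaware: move towards its memory
                cand = x[i] + rng.random() * fl * (mem[j] - x[i])
            else:                                   # crow j aware: random relocation
                cand = rng.uniform(lower, upper, dim)
            cand = np.clip(cand, lower, upper)      # bound control
            x[i] = cand
            f = objective(cand)
            if f < mem_fit[i]:                      # memorise improvements only
                mem[i], mem_fit[i] = cand, f
    best = int(np.argmin(mem_fit))
    return mem[best], mem_fit[best]

sphere = lambda v: float(np.sum(v * v))
pos, fit = csa(sphere, dim=5, max_iter=200)
```

Note how the position update is unconditional while the memory only ever improves; this is what gives CSA its strong exploitative pull towards memorised solutions.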

### 3.2 Search strategies

Search strategies help algorithms move from one solution to another within a neighbourhood. However, using a single search strategy with any metaheuristic algorithm leads to serious limitations with respect to the balance between exploration and exploitation. In order to deliver a balanced search, we use a pool composed of six different search strategies. Five of these are taken from the literature, inspired by the update rules used in differential evolution (DE) and artificial bee colony (ABC) algorithms, with some modifications applied [5, 33].

Eq. (2) is the original update rule used by CSA to generate a candidate solution:

$$x^{i,t+1} = x^{i,t} + r_i \times fl^{i,t} \times (m^{j,t} - x^{i,t}) \tag{2}$$

where $r_i$ is a uniform random number in $[0, 1]$, $fl^{i,t}$ is the flight length and $m^{j,t}$ is the memorised position of the randomly selected crow $j$. Eq. (3) extends this rule with a global-best-guided search term (GCSA) [34] in order to improve exploitation capability, while Eq. (4) is a global-best-guided search strategy embedded with the flock memory (BCSA). Eq. (5) is added to the strategy pool to enhance exploration, bringing the positions of two different randomly chosen crows into the update (CBCSA). Eq. (6) is developed with inspiration from DE/rand/1 to offer higher exploration capability (RCSA), and Eq. (7) is also an implementation of the DE/rand/1 rule, operating on memory solutions instead (RMCSA). In all of the strategies, the randomly selected crows are distinct from the current crow, and the step sizes are scaled by uniform random coefficients.
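The strategy pool can be sketched in Python as below. The operator forms are assumptions inferred from the stated inspirations (the CSA rule, global-best guidance and DE/rand/1), not the paper's exact Eqs. (2)-(7), and the helper `distinct` is introduced here purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def distinct(n, i, k):
    """k indices drawn from 0..n-1, all different from i and from each other."""
    idx = [j for j in range(n) if j != i]
    return rng.choice(idx, size=k, replace=False)

# Each operator maps (positions x, memory mem, crow index i, global best g)
# to one candidate solution. Forms are assumed paraphrases of Eqs. (2)-(7).
def csa_rule(x, mem, i, g, fl=2.0):      # Eq. (2): original CSA update
    j = distinct(len(x), i, 1)[0]
    return x[i] + rng.random() * fl * (mem[j] - x[i])

def gcsa(x, mem, i, g, fl=2.0):          # Eq. (3): gbest-guided (exploitative)
    j = distinct(len(x), i, 1)[0]
    return x[i] + rng.random() * fl * (mem[j] - x[i]) + rng.random() * (g - x[i])

def bcsa(x, mem, i, g, fl=2.0):          # Eq. (4): gbest guidance via flock memory
    return mem[i] + rng.random() * (g - mem[i])

def cbcsa(x, mem, i, g, fl=2.0):         # Eq. (5): two random crows (explorative)
    r1, r2 = distinct(len(x), i, 2)
    return x[i] + rng.random() * (x[r1] - x[r2])

def rcsa(x, mem, i, g, fl=2.0):          # Eq. (6): DE/rand/1 on positions
    r1, r2, r3 = distinct(len(x), i, 3)
    return x[r1] + rng.random() * (x[r2] - x[r3])

def rmcsa(x, mem, i, g, fl=2.0):         # Eq. (7): DE/rand/1 on memory solutions
    r1, r2, r3 = distinct(len(x), i, 3)
    return mem[r1] + rng.random() * (mem[r2] - mem[r3])

POOL = [csa_rule, gcsa, bcsa, cbcsa, rcsa, rmcsa]
```

A uniform operator interface like this is what makes strategy selection pluggable: the selector only needs to pick an index into `POOL`.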

### 3.3 CSA with multi-strategy search (CSA-MSS)

As explained in the previous section, a search based on a single strategy imposes limitations that enforce local optima. This renowned problem is tackled in various ways in the literature: memetic algorithms [35], modified swarm intelligence algorithms [36] and hybrid solutions [37] all seek to resolve it. Recently, adaptive operator selection schemes [8, 38] have looked promising as an easier way to overcome this issue. The approach proposed in this study, CSA-MSS, is based on a multi-strategy search, i.e. the adaptive use of multiple search strategies, in order to overcome the limitations of single-strategy search. It consists of two main components: credit assignment and strategy selection. In order to evaluate the performance of the applied strategy, a reward value is calculated from the improvement the generated candidate achieves over the current solution (Eq. (8)). The rewards collected for each strategy are then aggregated into a credit value using a recency-weighted average (Eq. (9)), so that recent performance counts more than older performance.

Adaptive pursuit is used as the strategy selection scheme due to its proven good performance in the literature [8]. In this scheme, each strategy holds a selection probability, calculated with Eq. (10): the probability of the strategy with the highest credit is pushed towards a maximum level, while the probabilities of the remaining strategies decay towards a minimum level, keeping every strategy selectable throughout the search.
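A minimal sketch of credit assignment plus adaptive pursuit selection is given below. The learning rates `alpha` and `beta`, the floor `p_min` and the simple positive-improvement reward are assumed, typical values, not the paper's exact Eqs. (8)-(10).

```python
import numpy as np

class AdaptivePursuit:
    """Adaptive pursuit strategy selection (assumed alpha/beta/p_min values)."""
    def __init__(self, n_strategies, alpha=0.3, beta=0.3, p_min=0.05, seed=0):
        self.k = n_strategies
        self.alpha, self.beta = alpha, beta
        self.p_min = p_min
        self.p_max = 1.0 - (n_strategies - 1) * p_min
        self.q = np.zeros(n_strategies)                   # credit (quality) estimates
        self.p = np.full(n_strategies, 1.0 / n_strategies)
        self.rng = np.random.default_rng(seed)

    def select(self):
        """Roulette-wheel draw over the current probability vector."""
        return int(self.rng.choice(self.k, p=self.p))

    def update(self, s, reward):
        # credit assignment: recency-weighted average of rewards for strategy s
        self.q[s] += self.alpha * (reward - self.q[s])
        # pursuit: push the best strategy towards p_max, the rest towards p_min
        best = int(np.argmax(self.q))
        for i in range(self.k):
            target = self.p_max if i == best else self.p_min
            self.p[i] += self.beta * (target - self.p[i])

ap = AdaptivePursuit(6)
for _ in range(50):
    s = ap.select()
    ap.update(s, reward=1.0 if s == 2 else 0.0)  # pretend strategy 2 always improves
```

Because every probability is kept above `p_min`, even a currently weak strategy remains selectable, which matters when the usefulness of operators shifts between exploration-heavy and exploitation-heavy phases of the search.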

**Algorithm 2.** The general overview of CSA-MSS.

```text
1:  Initialise the parameters (N, AP, fl, maximum iterations)
2:  Randomly initialise the crow positions
3:  Memorise the positions
4:  while the termination criterion is not met do
5:      for i = 1 : N do
6:          Selection phase
7:          Determine AP
8:          if rand() >= AP then
9:              if no strategy has been rewarded yet then
10:                 Select a strategy randomly
11:             else
12:                 Assign probabilities using Eq. (10)
13:                 Select a strategy using roulette-wheel selection
14:             end if
15:             Generate a candidate solution with the selected strategy
16:         else
17:             Generate a random position
18:         end if
19:         Bound control
20:         Evaluate the candidate solution
21:         Calculate the reward using Eq. (8)
22:         if the reward is positive then
23:             Add the reward and increase the usage counter of the selected strategy
24:         end if
25:     end for
26:     Memorise the best positions
27:     Credit assignment using Eq. (9)
28: end while
```

### 3.4 Benchmark functions

In order to compare and analyse the performance of the approaches under consideration alongside CSA, a number of well-known numerical benchmark functions collected from the literature [39, 40] have been used, as listed in Table 1. The characteristic of each function is given in the C column, where U, M, S and N stand for unimodal, multimodal, separable and non-separable, respectively. If a function has only one local optimum, which is therefore the global optimum, it is called unimodal [41]; such functions can be used to demonstrate the exploitation ability of an algorithm. If a function has more than one local optimum, it is called multimodal, and significant exploration is then needed to travel among the regions, i.e. the modes, that contain the local optima. If the relationship between any two inputs of a function is loose, i.e. each can easily be treated standalone, the function is considered separable, and non-separable otherwise [42].

**Table 1.** Benchmark functions.

| No | Name | C |
|---|---|---|
| F1 | Sphere | US |
| F2 | Elliptic | UN |
| F3 | Sum Squares | US |
| F4 | Sum Power | MS |
| F5 | Schwefel 2.22 | UN |
| F6 | Schwefel 2.21 | UN |
| F7 | Step | US |
| F8 | Quartic | US |
| F9 | QuarticWN | US |
| F10 | Rosenbrock | UN |
| F11 | Rastrigin | MS |
| F12 | Non-Continuous Rastrigin | MS |
| F13 | Griewank | MN |
| F14 | Schwefel 2.26 | UN |
| F15 | Ackley | MN |
| F16 | Penalized1 | MN |
| F17 | Penalized2 | MN |
| F18 | Alpine | MS |
| F19 | Levy | MN |
| F20 | Weierstrass | MN |
| F21 | Schaffer | MN |
| F22 | Himmelblau | MS |
| F23 | Michalewicz | MS |
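For illustration, two of the listed functions can be written directly in Python; the search ranges mentioned in the comments are the commonly used ones and are assumed here, since the table's exact ranges are not reproduced.

```python
import numpy as np

def sphere(x):
    """F1, unimodal separable: sum of squares; typical range [-100, 100]^d (assumed)."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x * x))

def rastrigin(x):
    """F11, multimodal separable; typical range [-5.12, 5.12]^d (assumed)."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x * x - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

# Both have their global minimum of 0 at the origin.
assert sphere(np.zeros(30)) == 0.0
assert rastrigin(np.zeros(30)) == 0.0
```

Sphere's single basin rewards pure exploitation, whereas Rastrigin's cosine term creates a regular grid of local minima, which is exactly the situation where a single exploitative strategy tends to stall.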

## 4. Experimental results

This section presents the experimental study testing the performance of the proposed algorithm in comparison to a number of state-of-the-art algorithms. To compare the approaches fairly, each algorithm was executed for 30 independent runs, each seeded randomly, and the same algorithmic parameter settings were used for the proposed approach and the other variants.

**Table 2.** Comparison of CSA and CSA-MSS on the 30D and 60D benchmark functions.

| F. No | CSA (30D) Mean | CSA (30D) Std | CSA-MSS (30D) Mean | CSA-MSS (30D) Std | Sign | CSA (60D) Mean | CSA (60D) Std | CSA-MSS (60D) Mean | CSA-MSS (60D) Std | Sign |
|---|---|---|---|---|---|---|---|---|---|---|

F1 | 4.77E-05 | 2.81E-05 | + | 7.87E+04 | 1.99E+04 | + | ||||

F2 | 3.21E+00 | 5.95E+00 | + | 2.71E+09 | 4.69E+08 | + | ||||

F3 | 6.60E-06 | 4.00E-06 | + | 2.50E+04 | 5.06E+03 | + | ||||

F4 | 6.71E-14 | 1.21E-13 | + | 5.81E+40 | 1.37E+41 | + | ||||

F5 | 1.32E-03 | 2.78E-04 | + | 3.17E+18 | 8.63E+18 | + | ||||

F6 | 1.95E-02 | 4.37E-01 | 1.89E-01 | + | 8.35E+00 | 2.73E-01 | + | |||

F7 | 0.00E+00 | 3.30E+00 | 2.15E+00 | + | 7.75E+04 | 2.16E+04 | + | |||

F8 | 5.18E-14 | 8.21E-14 | + | 2.39E+02 | 7.05E+01 | + | ||||

F9 | 1.91E-02 | 5.43E-03 | + | 2.16E+02 | 7.95E+01 | + | ||||

F10 | 2.74E+01 | 9.95E+00 | + | 2.97E+06 | 8.81E+05 | + | ||||

F11 | 5.01E+01 | 1.76E+01 | + | 7.86E+02 | 3.09E+01 | + | ||||

F12 | 5.47E+01 | 2.29E+01 | − | 7.49E+02 | 2.83E+01 | + | ||||

F13 | 3.93E-02 | 4.93E-02 | + | 6.57E+02 | 2.23E+02 | + | ||||

F14 | 8.37E+03 | 3.37E+02 | + | 1.92E+04 | 3.24E+02 | + | ||||

F15 | 3.24E+00 | 1.28E+00 | + | 1.98E+01 | 9.61E-01 | + | ||||

F16 | 2.08E-01 | 4.50E-01 | − | 4.63E+08 | 1.41E+08 | + | ||||

F17 | 4.84E-03 | 2.61E-02 | + | 8.62E+08 | 3.08E+08 | + | ||||

F18 | 1.19E-03 | 1.03E-03 | + | 1.08E+02 | 8.18E+00 | + | ||||

F19 | 4.47E-01 | 1.26E+00 | − | 1.16E+03 | 3.39E+02 | + | ||||

F20 | 7.75E+00 | 2.28E+00 | + | 8.90E+01 | 5.61E+00 | + | ||||

F21 | 8.01E-02 | 1.45E-02 | + | 3.97E-01 | 8.65E-03 | + | ||||

F22 | -6.88E+01 | 2.88E+00 | − | -3.65E+01 | 1.79E+00 | + | ||||

F23 | −1.56E+01 | 3.43E+00 | + | −1.47E+01 | 8.09E-01 | + |

**Table 3.** Mean and standard deviation of the results of the search strategies on the 30D benchmark functions.

| F No | CSA Mean | CSA STD | GCSA Mean | GCSA STD | BCSA Mean | BCSA STD | CBCSA Mean | CBCSA STD | RCSA Mean | RCSA STD | RMCSA Mean | RMCSA STD | CSA-MSS Mean | CSA-MSS STD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|

F1 | 4.77E-05 | 2.81E-05 | 1.75E-19 | 1.45E-19 | 3.63E+04 | 1.22E+04 | 5.90E+03 | 5.21E+02 | 3.58E+04 | 5.30E+03 | 1.55E-39 | 6.04E-39 | ||

F2 | 3.21E+00 | 5.95E+00 | 4.48E-09 | 1.25E-08 | 6.03E+08 | 1.45E+08 | 7.49E+07 | 1.23E+07 | 5.83E+08 | 9.55E+07 | 1.36E-34 | 5.01E-34 | ||

F3 | 6.60E-06 | 4.00E-06 | 4.06E-20 | 3.41E-20 | 5.02E+03 | 1.18E+03 | 7.96E+02 | 1.06E+02 | 5.23E+03 | 7.18E+02 | 5.20E-39 | 2.21E-38 | ||

F4 | 6.71E-14 | 1.21E-13 | 4.82E-17 | 5.72E-17 | 1.32E+17 | 2.45E+17 | 1.80E+08 | 1.78E+08 | 5.81E+16 | 1.13E+17 | 8.72E+01 | 4.25E+02 | ||

F5 | 1.32E-03 | 2.78E-04 | 4.68E-11 | 2.34E-11 | 1.51E+06 | 2.29E+06 | 3.75E+01 | 3.23E+00 | 9.98E+05 | 1.83E+06 | 8.04E-27 | 3.13E-26 | ||

F6 | 5.72E-02 | 1.95E-02 | 7.11E+00 | 2.23E-01 | 3.23E+00 | 1.92E-01 | 7.06E+00 | 1.88E-01 | 2.30E+00 | 4.71E-01 | 4.37E-01 | 1.89E-01 | ||

F7 | 4.03E+04 | 4.31E+03 | 5.79E+03 | 7.46E+02 | 3.58E+04 | 4.35E+03 | 1.34E+01 | 3.25E+01 | 3.30E+00 | 2.15E+00 | ||||

F8 | 5.18E-14 | 8.21E-14 | 7.12E-39 | 1.65E-38 | 4.39E+01 | 1.09E+01 | 1.24E+00 | 2.28E-01 | 3.48E+01 | 1.02E+01 | 4.80E-38 | 2.55E-37 | ||

F9 | 1.91E-02 | 5.43E-03 | 1.63E-02 | 5.13E-03 | 4.69E+01 | 1.18E+01 | 1.56E+00 | 2.75E-01 | 3.55E+01 | 7.44E+00 | 1.85E-02 | 1.88E-02 | ||

F10 | 2.74E+01 | 9.95E+00 | 2.85E+01 | 2.17E+01 | 1.05E+06 | 4.52E+05 | 3.30E+04 | 7.35E+03 | 9.65E+05 | 2.20E+05 | 2.65E+01 | 2.01E+01 | ||

F11 | 3.73E+01 | 1.37E+01 | 4.75E+01 | 1.08E+01 | 3.41E+02 | 1.14E+01 | 2.22E+02 | 1.19E+01 | 3.29E+02 | 1.64E+01 | 5.01E+01 | 1.76E+01 | ||

F12 | 5.47E+01 | 2.29E+01 | 3.04E+02 | 1.50E+01 | 1.89E+02 | 9.60E+00 | 2.92E+02 | 2.00E+01 | 5.19E+01 | 1.41E+01 | 5.37E+01 | 1.67E+01 | ||

F13 | 1.73E-02 | 3.45E-02 | 3.62E+02 | 3.72E+01 | 5.42E+01 | 7.96E+00 | 3.29E+02 | 3.57E+01 | 1.33E-02 | 1.70E-02 | 3.93E-02 | 4.93E-02 | ||

F14 | 8.37E+03 | 3.37E+02 | 5.51E+03 | 6.15E+02 | 8.40E+03 | 2.89E+02 | 8.31E+03 | 2.03E+02 | 8.39E+03 | 2.91E+02 | 4.51E+03 | 1.52E+03 | ||

F15 | 3.87E-02 | 2.08E-01 | 1.97E+01 | 5.59E-01 | 1.36E+01 | 4.54E-01 | 1.95E+01 | 2.02E-01 | 9.61E-01 | 7.36E-01 | 3.24E+00 | 1.28E+00 | ||

F16 | 2.81E-01 | 1.33E+00 | 1.82E+08 | 3.58E+07 | 1.81E+05 | 8.24E+04 | 1.26E+08 | 3.23E+07 | 3.90E-01 | 8.09E-01 | 2.08E-01 | 4.50E-01 | ||

F17 | 4.60E-06 | 3.78E-06 | 3.60E+08 | 1.13E+08 | 2.90E+06 | 1.02E+06 | 2.88E+08 | 8.23E+07 | 2.90E-01 | 9.00E-01 | 4.84E-03 | 2.61E-02 | ||

F18 | 1.19E-03 | 1.03E-03 | 2.49E-04 | 7.46E-04 | 4.53E+01 | 2.61E+00 | 2.32E+01 | 1.19E+00 | 4.33E+01 | 2.31E+00 | 2.17E-09 | 1.17E-08 | ||

F19 | 3.19E-06 | 2.84E-06 | 5.58E+02 | 7.14E+01 | 8.59E+01 | 1.02E+01 | 5.08E+02 | 6.01E+01 | 9.84E-02 | 1.63E-01 | 4.47E-01 | 1.26E+00 | ||

F20 | 1.14E+01 | 3.19E+00 | 4.16E+01 | 4.07E+00 | 2.72E+01 | 7.16E-01 | 4.12E+01 | 1.63E+00 | 8.72E-01 | 7.99E-01 | 7.75E+00 | 2.28E+00 | ||

F21 | 1.14E-01 | 2.52E-02 | 2.76E-01 | 3.35E-03 | 1.10E-01 | 1.99E-02 | 2.63E-01 | 1.88E-02 | 6.54E-02 | 2.82E-02 | 8.01E-02 | 1.45E-02 | ||

F22 | -6.89E+01 | 2.17E+00 | -6.76E+01 | 2.75E+00 | −4.54E+01 | 4.79E+00 | −5.29E+01 | 1.49E+00 | −4.46E+01 | 1.62E+00 | −6.88E+01 | 2.88E+00 | ||

F23 | −1.56E+01 | 3.43E+00 | −2.56E+01 | 1.24E+00 | −9.43E+00 | 6.27E-01 | −1.16E+01 | 4.97E-01 | −9.32E+00 | 3.83E-01 | −2.58E+01 | 1.20E+00 |

**Table 4.** Ranks and Wilcoxon signed-rank test results of the search strategies on the 30D benchmark functions.

| F No | CSA Rank | CSA Sign | GCSA Rank | GCSA Sign | BCSA Rank | BCSA Sign | CBCSA Rank | CBCSA Sign | RCSA Rank | RCSA Sign | RMCSA Rank | RMCSA Sign | CSA-MSS Rank |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|

F1 | 4 | + | 3 | + | 7 | − | 5 | − | 6 | − | 2 | + | 1 |

F2 | 4 | − | 3 | + | 7 | − | 5 | − | 6 | − | 2 | + | 1 |

F3 | 4 | + | 3 | + | 6 | − | 5 | − | 7 | − | 2 | + | 1 |

F4 | 3 | + | 2 | + | 7 | − | 5 | − | 6 | − | 4 | − | 1 |

F5 | 4 | + | 3 | + | 7 | − | 5 | − | 6 | − | 2 | + | 1 |

F6 | 2 | + | 1 | + | 7 | − | 5 | − | 6 | − | 4 | − | 3 |

F7 | 1 | + | 1 | + | 7 | − | 5 | − | 6 | − | 4 | − | 3 |

F8 | 4 | + | 2 | + | 7 | − | 5 | − | 6 | − | 3 | + | 1 |

F9 | 4 | + | 2 | + | 7 | − | 5 | − | 6 | − | 3 | + | 1 |

F10 | 3 | − | 4 | − | 7 | − | 5 | − | 6 | − | 2 | − | 1 |

F11 | 2 | − | 3 | − | 7 | − | 5 | − | 6 | − | 1 | − | 4 |

F12 | 4 | − | 1 | − | 7 | − | 5 | − | 6 | − | 2 | − | 3 |

F13 | 1 | + | 3 | + | 7 | − | 5 | − | 6 | − | 2 | + | 4 |

F14 | 5 | − | 3 | − | 7 | − | 4 | − | 6 | − | 2 | − | 1 |

F15 | 1 | + | 2 | − | 7 | − | 5 | − | 6 | − | 3 | − | 4 |

F16 | 1 | + | 3 | − | 7 | − | 5 | − | 6 | − | 4 | − | 2 |

F17 | 2 | + | 1 | + | 7 | − | 5 | − | 6 | − | 4 | − | 3 |

F18 | 4 | + | 3 | + | 7 | − | 5 | − | 6 | − | 1 | + | 2 |

F19 | 2 | + | 1 | + | 7 | − | 5 | − | 6 | − | 3 | − | 4 |

F20 | 1 | + | 4 | − | 7 | − | 5 | − | 6 | − | 2 | − | 3 |

F21 | 1 | + | 5 | + | 7 | + | 4 | + | 6 | + | 2 | + | 3 |

F22 | 2 | − | 4 | − | 6 | − | 5 | − | 7 | − | 1 | − | 3 |

F23 | 4 | − | 3 | − | 6 | − | 5 | − | 7 | − | 1 | − | 2 |

Mean: | 2.73913 | 2.608696 | 6.869565 | 4.913043 | 6.130435 | 2.434783 | 2.26087 |

**Table 5.** Mean and standard deviation of the results of the search strategies on the 60D benchmark functions.

| F No | CSA Mean | CSA STD | GCSA Mean | GCSA STD | BCSA Mean | BCSA STD | CBCSA Mean | CBCSA STD | RCSA Mean | RCSA STD | RMCSA Mean | RMCSA STD | CSA-MSS Mean | CSA-MSS STD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|

F1 | 7.87E+04 | 1.99E+04 | 8.51E-13 | 4.83E-13 | 1.13E+05 | 4.89E+03 | 2.90E+04 | 2.32E+03 | 1.14E+05 | 5.66E+03 | 5.27E-09 | 2.05E-08 | ||

F2 | 2.71E+09 | 4.69E+08 | 2.66E+00 | 3.27E+00 | 2.88E+09 | 4.26E+08 | 6.08E+08 | 1.02E+08 | 2.83E+09 | 3.31E+08 | 2.83E+08 | 7.80E+08 | ||

F3 | 2.50E+04 | 5.06E+03 | 6.64E-13 | 3.17E-13 | 3.20E+04 | 1.42E+03 | 8.04E+03 | 5.22E+02 | 3.19E+04 | 2.11E+03 | 1.08E-08 | 4.21E-08 | ||

F4 | 5.81E+40 | 1.37E+41 | 3.26E-14 | 2.84E-14 | 2.11E+41 | 9.55E+41 | 6.83E+27 | 1.42E+28 | 7.94E+40 | 1.43E+41 | 2.38E+40 | 6.95E+40 | ||

F5 | 3.17E+18 | 8.63E+18 | 2.37E-07 | 1.04E-07 | 6.61E+19 | 1.69E+20 | 4.04E+02 | 4.16E+02 | 5.83E+19 | 1.16E+20 | 1.62E+17 | 7.84E+17 | ||

F6 | 8.35E+00 | 2.73E-01 | 8.41E+00 | 2.04E-01 | 5.37E+00 | 1.71E-01 | 8.42E+00 | 2.01E-01 | 4.96E+00 | 1.73E+00 | 2.94E+00 | 4.17E-01 | ||

F7 | 7.75E+04 | 2.16E+04 | 1.15E+05 | 5.48E+03 | 3.03E+04 | 1.29E+03 | 1.15E+05 | 4.54E+03 | 1.88E+02 | 1.64E+02 | 3.15E+01 | 1.94E+01 | ||

F8 | 2.39E+02 | 7.05E+01 | 1.24E-26 | 9.93E-27 | 3.46E+02 | 3.04E+01 | 3.03E+01 | 3.88E+00 | 3.41E+02 | 2.53E+01 | 8.70E-12 | 4.44E-11 | ||

F9 | 2.16E+02 | 7.95E+01 | 7.06E-02 | 1.91E-02 | 3.53E+02 | 3.04E+01 | 2.86E+01 | 4.47E+00 | 3.45E+02 | 3.60E+01 | 1.75E-01 | 5.41E-02 | ||

F10 | 2.97E+06 | 8.81E+05 | 4.76E+06 | 5.80E+05 | 4.18E+05 | 5.10E+04 | 4.74E+06 | 4.95E+05 | 1.52E+02 | 4.80E+01 | 9.13E+01 | 3.25E+01 | ||

F11 | 7.86E+02 | 3.09E+01 | 1.14E+02 | 2.44E+01 | 8.17E+02 | 1.68E+01 | 5.72E+02 | 1.96E+01 | 8.07E+02 | 2.57E+01 | 1.16E+02 | 1.62E+01 | ||

F12 | 7.49E+02 | 2.83E+01 | 7.54E+02 | 2.62E+01 | 5.20E+02 | 2.25E+01 | 7.55E+02 | 3.01E+01 | 3.09E+02 | 2.95E+02 | 1.18E+02 | 2.70E+01 | ||

F13 | 6.57E+02 | 2.23E+02 | 1.03E+03 | 5.50E+01 | 2.69E+02 | 1.64E+01 | 1.04E+03 | 4.44E+01 | 1.72E-02 | 3.56E-02 | 8.54E-02 | 1.16E-01 | ||

F14 | 1.92E+04 | 3.24E+02 | 1.43E+04 | 1.87E+03 | 1.91E+04 | 5.35E+02 | 1.92E+04 | 4.43E+02 | 1.92E+04 | 4.70E+02 | 1.93E+04 | 4.05E+02 | ||

F15 | 1.98E+01 | 9.61E-01 | 7.41E+00 | 2.29E+00 | 2.04E+01 | 6.71E-02 | 1.69E+01 | 2.91E-01 | 2.04E+01 | 9.49E-02 | 6.89E+00 | 1.48E+00 | ||

F16 | 4.63E+08 | 1.41E+08 | 6.56E-01 | 7.70E-01 | 7.97E+08 | 8.99E+07 | 2.08E+07 | 6.58E+06 | 7.88E+08 | 1.03E+08 | 2.00E+04 | 1.03E+05 | ||

F17 | 8.62E+08 | 3.08E+08 | 7.58E+00 | 1.25E+01 | 1.59E+09 | 1.37E+08 | 8.28E+07 | 1.47E+07 | 1.58E+09 | 1.87E+08 | 5.30E+03 | 2.59E+04 | ||

F18 | 1.08E+02 | 8.18E+00 | 1.09E-02 | 1.20E-02 | 1.11E+02 | 4.63E+00 | 6.81E+01 | 2.56E+00 | 1.13E+02 | 4.00E+00 | 4.14E+01 | 5.45E+01 | ||

F19 | 1.16E+03 | 3.39E+02 | 1.67E+03 | 8.70E+01 | 4.58E+02 | 2.73E+01 | 1.68E+03 | 7.68E+01 | 4.64E-01 | 1.22E+00 | 7.76E-01 | 1.84E+00 | ||

F20 | 8.90E+01 | 5.61E+00 | 3.77E+01 | 4.39E+00 | 9.61E+01 | 1.37E+00 | 6.81E+01 | 1.64E+00 | 9.56E+01 | 1.64E+00 | 2.81E+01 | 3.89E+00 | ||

F21 | 3.97E-01 | 8.65E-03 | 2.25E-01 | 2.60E-02 | 4.07E-01 | 7.33E-03 | 2.57E-01 | 1.87E-02 | 4.04E-01 | 7.64E-03 | 1.71E-01 | 2.17E-02 | ||

F22 | -3.65E+01 | 1.79E+00 | −6.65E+01 | 1.54E+00 | −3.57E+01 | 1.35E+00 | −4.34E+01 | 1.07E+00 | −3.60E+01 | 1.66E+00 | −6.49E+01 | 1.08E+01 | ||

F23 | −1.47E+01 | 8.09E-01 | −4.73E+01 | 1.75E+00 | −1.49E+01 | 7.20E-01 | −1.80E+01 | 7.08E-01 | −1.46E+01 | 5.62E-01 | −1.48E+01 | 7.80E-01 |

**Table 6.** Ranks and Wilcoxon signed-rank test results of the search strategies on the 60D benchmark functions.

| F No | CSA Rank | CSA Sign | GCSA Rank | GCSA Sign | BCSA Rank | BCSA Sign | CBCSA Rank | CBCSA Sign | RCSA Rank | RCSA Sign | RMCSA Rank | RMCSA Sign | CSA-MSS Rank |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|

F1 | 5 | + | 2 | + | 6 | + | 4 | + | 7 | + | 3 | + | 1 |

F2 | 5 | + | 2 | + | 7 | + | 4 | + | 6 | + | 3 | + | 1 |

F3 | 5 | + | 2 | + | 7 | + | 4 | + | 6 | + | 3 | + | 1 |

F4 | 5 | + | 2 | + | 7 | + | 3 | + | 6 | + | 4 | + | 1 |

F5 | 5 | + | 2 | + | 7 | + | 3 | + | 6 | + | 4 | + | 1 |

F6 | 5 | + | 1 | + | 6 | + | 4 | + | 7 | + | 3 | + | 2 |

F7 | 5 | + | 1 | + | 7 | + | 4 | + | 6 | + | 3 | + | 2 |

F8 | 5 | + | 2 | + | 7 | + | 4 | + | 6 | + | 3 | + | 1 |

F9 | 5 | + | 2 | + | 7 | + | 4 | + | 6 | + | 3 | + | 1 |

F10 | 5 | + | 1 | + | 7 | + | 4 | + | 6 | + | 3 | + | 2 |

F11 | 5 | + | 2 | + | 7 | + | 4 | + | 6 | + | 1 | + | 3 |

F12 | 5 | + | 1 | + | 6 | + | 4 | + | 7 | + | 3 | + | 2 |

F13 | 5 | + | 1 | + | 6 | + | 4 | + | 7 | + | 2 | + | 3 |

F14 | 6 | + | 2 | + | 3 | + | 4 | + | 5 | + | 7 | + | 1 |

F15 | 5 | + | 3 | + | 7 | + | 4 | + | 6 | + | 1 | + | 2 |

F16 | 5 | + | 2 | + | 7 | + | 4 | + | 6 | + | 3 | + | 1 |

F17 | 5 | + | 2 | + | 7 | + | 4 | + | 6 | + | 3 | + | 1 |

F18 | 5 | + | 2 | + | 6 | + | 4 | + | 7 | + | 3 | + | 1 |

F19 | 5 | + | 1 | + | 6 | + | 4 | + | 7 | + | 2 | + | 3 |

F20 | 5 | + | 3 | + | 7 | + | 4 | + | 6 | + | 1 | + | 2 |

F21 | 5 | + | 3 | + | 7 | + | 4 | + | 6 | + | 2 | + | 1 |

F22 | 5 | + | 2 | + | 7 | + | 4 | + | 6 | + | 3 | + | 1 |

F23 | 6 | + | 2 | + | 4 | + | 3 | + | 7 | + | 5 | + | 1 |

Mean: | 5.086957 | 1.869565 | 6.434783 | 3.869565 | 6.26087 | 2.956522 | 1.521739 |

Table 2 shows the results of the CSA and CSA-MSS algorithms on the 30- and 60-dimensional benchmark functions. For the 30D problems, CSA remains slightly competitive with CSA-MSS. The columns with the *Sign* heading report the result of the Wilcoxon signed-rank test: a '+' sign indicates that CSA-MSS performs statistically significantly better than the compared algorithm, while a '−' sign indicates otherwise.
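The per-function significance tests can be reproduced with SciPy's Wilcoxon signed-rank test over the 30 paired run results. The arrays below are synthetic stand-ins for two algorithms' error values, not the paper's data.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(7)
# synthetic stand-ins for 30 paired runs of two algorithms on one function
errors_csa = rng.lognormal(mean=0.0, sigma=0.5, size=30)
errors_mss = errors_csa * 0.5 + rng.normal(0.0, 0.01, size=30)  # clearly better

# paired, non-parametric test over the 30 run-wise differences
stat, p_value = wilcoxon(errors_csa, errors_mss)
significant = p_value < 0.05          # would yield a '+' in the tables
```

A paired, non-parametric test is the appropriate choice here because the 30 runs are matched by random seed across algorithms and the error distributions are typically far from normal.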

The comparative experimental results on the 30D benchmark functions for the different search strategies are presented in Tables 3 and 4. CSA-MSS produced the minimum mean values for functions F1, F2, F3, F4, F5, F8, F9, F10 and F14, while CSA achieved slightly better results for F7, F13, F15, F16, F20 and F21. For F6, F7, F12, F17 and F19, GCSA produced better solutions than the others. RMCSA found the minimum mean values for F11, F18, F22 and F23, while CBCSA and RCSA performed worse than the others, presumably due to their poorer exploitation capabilities. The results for both dimensions were combined to conduct a statistical confidence test, which demonstrated that the success of CSA-MSS over the others is statistically sound, with the exemption that the success of RMCSA on functions F11 and F18 is not statistically different from CSA-MSS. It is noted that the mean rank values of CSA-MSS remain slightly better than those of the other methods.

The comparison of the experimental results on the 60D benchmark functions with the different search strategies is provided in Tables 5 and 6. CSA-MSS has the minimum mean values except for functions F6, F7, F10, F11, F12, F13, F15, F19 and F20. Nevertheless, for all functions considered in the test, CSA-MSS produced better solutions than CSA. It is also observed that GCSA produced better solutions than CSA-MSS for F6, F7, F10, F12, F13 and F19, and RMCSA produced better solutions than CSA-MSS for F11, F15 and F20. CSA's performance deteriorates as the problem dimension increases. The statistical soundness of the success of CSA-MSS is confirmed except for functions F21 and F22. In terms of mean rank values, CSA-MSS outperforms all the other methods.

Figure 1 depicts the convergence behaviour of the methods over the iterations. As seen, CSA-MSS converges faster than the others on functions F1, F2, F3 and F4. For the 60D problems, Figure 2 shows the convergence history of all the methods.

## 5. Conclusions

In this article, a multi-strategy search with the crow search algorithm is proposed for solving global optimisation problems. In the proposed approach, adaptive pursuit is used to select search strategies according to the search dynamics. The search strategies are derived from other state-of-the-art metaheuristics and implemented within the crow search algorithm. Twenty-three benchmark functions with different characteristics have been used to compare the strategies' performance. The proposed approach is demonstrated to improve the performance of the original crow search algorithm, especially on the higher-dimensional problems. Future work would include dynamic and adaptive parameter settings, especially for the awareness probability and the flight length, and may also include embedding machine learning approaches into the algorithm to develop high-performance adaptive strategy selection schemes and achieve transfer learning.

## Acknowledgments

This study has been conducted under the 2219 research support scheme, namely the "Post-doctorate Research Programme", of TUBITAK, Republic of Turkey.

## Notes/thanks

The first author would like to thank TUBITAK for supporting his visit to UWE Bristol, UK, between October 2021 and October 2022.

## References

- 1.
Askarzadeh A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Computers & Structures. 2016; 169 :1-12 - 2.
Said GAE-NA, Mahmoud AM, El-Horbaty E-SM. A comparative study of meta-heuristic algorithms for solving quadratic assignment problem. arXiv preprint arXiv:1407.4863. 2014 - 3.
Helmi AM et al. Efficient and sustainable reconfiguration of distribution networks via metaheuristic optimization. In: IEEE Transactions on Automation Science and Engineering. IEEE; 2021 - 4.
Kalita K, Ghadai RK, Chakraborty S. A comparative study on the metaheuristic-based optimization of skew composite laminates. Engineering with Computers. 2021:1-18. (In press) - 5.
5. Baatar N, Zhang D, Koh C-S. An improved differential evolution algorithm adopting λ-best mutation strategy for global optimization of electromagnetic devices. IEEE Transactions on Magnetics. 2013;49(5):2097-2100
6. Eberhart R, Kennedy J. A new optimizer using particle swarm theory. In: MHS'95. Proceedings of the Sixth International Symposium on Micro Machine and Human Science. Nagoya, Japan: IEEE; 1995. pp. 39-43
7. Wang F, Zhang H, Zhou A. A particle swarm optimization algorithm for mixed-variable optimization problems. Swarm and Evolutionary Computation. 2021;60:100808
8. Durgut R, Aydin ME. Adaptive binary artificial bee colony algorithm. Applied Soft Computing. 2021;101:107054
9. Karaboga D, Basturk B. On the performance of artificial bee colony (ABC) algorithm. Applied Soft Computing. 2008;8(1):687-697
10. García J, Maureira C. A KNN quantum cuckoo search algorithm applied to the multidimensional knapsack problem. Applied Soft Computing. 2021;102:107077
11. Yang X-S, Deb S. Cuckoo search via Lévy flights. In: 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC). IEEE; 2009. pp. 210-214
12. Naik A, Satapathy SC. A comparative study of social group optimization with a few recent optimization algorithms. Complex & Intelligent Systems. 2021;7(1):249-295
13. He S, Wu QH, Saunders JR. Group search optimizer: An optimization algorithm inspired by animal searching behavior. IEEE Transactions on Evolutionary Computation. 2009;13(5):973-990
14. Yang X-S. Firefly algorithm, Lévy flights and global optimization. In: Research and Development in Intelligent Systems XXVI. Springer; 2010. pp. 209-218
15. Yang X-S, Slowik A. Firefly algorithm. In: Swarm Intelligence Algorithms. CRC Press; 2020. pp. 163-174
16. Morales-Castañeda B et al. A better balance in metaheuristic algorithms: Does it exist? Swarm and Evolutionary Computation. 2020;54:100671
17. Hughes M, Goerigk M, Dokka T. Particle swarm metaheuristics for robust optimisation with implementation uncertainty. Computers & Operations Research. 2020;122:104998
18. Majhi SK, Sahoo M, Pradhan R. A space transformational crow search algorithm for optimization problems. Evolutionary Intelligence. 2020;13(3):345-364
19. Ozsari S et al. Adaptation of metaheuristic algorithms to improve training performance of an ESZSL model. Turkish Journal of Electrical Engineering & Computer Sciences. 2021;29(3)
20. Xiao S et al. Artificial bee colony algorithm based on adaptive neighborhood search and Gaussian perturbation. Applied Soft Computing. 2021;100:106955
21. Hu P et al. Multi-strategy serial cuckoo search algorithm for global optimization. Knowledge-Based Systems. 2021;214:106729
22. Zamani H, Nadimi-Shahraki MH, Gandomi AH. CCSA: Conscious neighborhood-based crow search algorithm for solving global optimization problems. Applied Soft Computing. 2019;85:105583
23. Sonuc E. Binary crow search algorithm for the uncapacitated facility location problem. Neural Computing and Applications. 2021;33(21):1-17
24. Majhi SK, Sahoo M, Pradhan R. Oppositional crow search algorithm with mutation operator for global optimization and application in designing FOPID controller. Evolving Systems. 2019;12(2):1-26
25. Rizk-Allah RM, Hassanien AE, Bhattacharyya S. Chaotic crow search algorithm for fractional optimization problems. Applied Soft Computing. 2018;71:1161-1175
26. dos Santos Coelho L et al. Electromagnetic optimization based on Gaussian crow search approach. In: 2018 International Symposium on Power Electronics, Electrical Drives, Automation and Motion (SPEEDAM). IEEE; 2018. pp. 1107-1112
27. Necira A et al. Dynamic crow search algorithm based on adaptive parameters for large-scale global optimization. Evolutionary Intelligence. 2021:1-17 (in press)
28. Hussien AG et al. Crow search algorithm: Theory, recent advances, and applications. IEEE Access. 2020;8:173548-173565
29. Meraihi Y et al. A comprehensive survey of crow search algorithm and its applications. Artificial Intelligence Review. 2020;54(4):1-48
30. Prior H, Schwarz A, Güntürkün O. Mirror-induced behavior in the magpie (Pica pica): Evidence of self-recognition. PLoS Biology. 2008;6(8):e202
31. Black H. Social skills to crow about. Scientific American Mind. 2013;24(4):12
32. Abdelaziz AY, Fathy A. A novel approach based on crow search algorithm for optimal selection of conductor size in radial distribution networks. Engineering Science and Technology, an International Journal. 2017;20(2):391-402
33. Kiran MS et al. Artificial bee colony algorithm with variable search strategy for continuous optimization. Information Sciences. 2015;300:140-157
34. Peng H, Deng C, Wu Z. Best neighbor-guided artificial bee colony algorithm for continuous optimization problems. Soft Computing. 2019;23(18):8723-8740
35. Liang J, Xue Y. An adaptive GP-based memetic algorithm for symbolic regression. Applied Intelligence. 2020;50(11):3961-3975
36. Aydin ME. Coordinating metaheuristic agents with swarm intelligence. Journal of Intelligent Manufacturing. 2012;23(4):991-999
37. Abualigah L et al. A parallel hybrid krill herd algorithm for feature selection. International Journal of Machine Learning and Cybernetics. 2021;12(3):783-806
38. Durgut R, Yavuz İ, Aydin M. Küme Birleşimli Sırt Çantası Probleminin Adaptif Yapay Arı Kolonisi Algoritması ile Çözümü [Solution of the set-union knapsack problem with an adaptive artificial bee colony algorithm]. Journal of Intelligent Systems: Theory and Applications. 2021:43-54. DOI: 10.38016/jista.854584
39. Surjanovic S, Bingham D. Virtual Library of Simulation Experiments: Test Functions and Datasets. Burnaby, BC, Canada: Simon Fraser University; 2013 [Accessed: 13 May 2015]
40. Zhu G, Kwong S. Gbest-guided artificial bee colony algorithm for numerical function optimization. Applied Mathematics and Computation. 2010;217(7):3166-3173
41. Trelea IC. The particle swarm optimization algorithm: Convergence analysis and parameter selection. Information Processing Letters. 2003;85(6):317-325
42. Babaoglu I. Artificial bee colony algorithm with distribution-based update rule. Applied Soft Computing. 2015;34:851-861