A New Metaheuristic Approach to Solving Benchmark Problems: Hybrid Salp Swarm Jaya Algorithm

Abstract: Metaheuristic algorithms are among the methods used to solve optimization problems, finding global or near-optimal solutions at a reasonable computational cost. As with other types of algorithms, one way to improve the performance of metaheuristic algorithms and obtain results closer to the target is hybridization. In this study, a hybrid algorithm (HSSJAYA) combining the salp swarm algorithm (SSA) and the Jaya algorithm (JAYA) is designed. SSA's speed in reaching the global optimum, its simplicity and ease of hybridization, and JAYA's success in reaching the best solution motivated us to build a powerful hybrid of these two algorithms. The hybrid algorithm is based on SSA's leader-follower salp system and JAYA's best and worst solutions. HSSJAYA works according to the best and worst food source positions, so that the leader and follower salps find the best route to the food source. The hybrid algorithm was tested on 14 unimodal and 21 multimodal benchmark functions, and the results were compared with SSA, JAYA, the cuckoo search algorithm (CS), the firefly algorithm (FFA) and the genetic algorithm (GA). As a result, a hybrid algorithm that provides results closer to the desired fitness values on the benchmark functions was obtained. These results were also compared statistically with the other algorithms using the Wilcoxon rank sum test, which showed that HSSJAYA produces a statistically significant difference on most of the problems.


Introduction
Optimization problems are real-world problems from fields such as signal processing, mathematics, chemistry, computer science, mechanics and economics, expressed in mathematical terms. The purpose of an optimization problem is to find the best available solution within a given search range and set of constraints [1,2].
Different algorithms may be needed to find the best solution to an optimization problem. These algorithms are divided into deterministic algorithms, which typically follow a fixed sequence of actions, and stochastic algorithms, which incorporate randomness [3].
Deterministic algorithms produce no uncertainty in their results: as long as the input to the problem is the same, the output solution is always the same. However, deterministic algorithms can run into structural difficulties when solving some problems, and the expected solution may not be obtained [4,5].
For these reasons, nature-inspired metaheuristic algorithms, a class of stochastic algorithms, are preferred because they are simpler to construct than deterministic algorithms, can be hybridized with other metaheuristics, adapt flexibly to different problems, solve real problems without derivatives, and avoid local optima [6,7].
Most metaheuristic algorithms are population- and swarm-based high-level heuristics, and they are among the methods preferred by researchers for solving optimization problems. The cuckoo search algorithm (CS) [8], firefly algorithm (FFA) [9], salp swarm algorithm (SSA) [10], Jaya algorithm (JAYA) [11,12] and genetic algorithm (GA) [13] are examples of such algorithms.
The strengths of metaheuristic algorithms include simplicity, ease of implementation and success on difficult problems; their weaknesses include high computational cost, getting stuck in local search, uncertain convergence and non-repeatable exact solutions [14,15].
To obtain a stronger metaheuristic algorithm, either a new algorithm must be created or a new hybrid algorithm must be developed by combining the successful parts of several algorithms [16]. A hybrid algorithm aims to combine the advantages of its constituent algorithms while reducing or removing their weaknesses [17,18].
Hybrid metaheuristic algorithms developed in the literature have proved successful in areas such as optimization and artificial neural network training. Some of these studies are summarized below.
Li et al. [19] developed a hybrid algorithm of SSA and GSA to improve SSA's performance on complex problems and enhance its search capability. The hybrid algorithm was tested on the CEC2017 functions and was found to increase accuracy and convergence speed.
Singh et al. [20] developed a hybrid salp swarm-sine cosine algorithm for nonlinear optimization problems, updating the positions of the salp swarm in the search space using the position equations of the sine cosine algorithm. The researchers noted that the hybrid algorithm, tested on optimization and engineering problems, reaches the best solution in a short time and with high accuracy compared to other algorithms.
Caldeira et al. [21] designed an improved JAYA to solve flexible job shop scheduling problems, adding innovations such as local search methods and new acceptance criteria to improve JAYA's solutions. The improved JAYA algorithm was compared with other well-known metaheuristic algorithms on benchmark instances based on the makespan criterion.
Khamees et al. [22] hybridized the simulated annealing algorithm (SA) with SSA and used it for multi-objective feature selection. The hybrid algorithm, tested on a total of 16 data sets, was compared with the original SSA, PSO and the ant lion optimizer (ALO). They reported that its classification accuracy was higher than that of the other algorithms.
Aslan et al. [23] designed a JAYA variant with an XOR operator, called JayaX, for binary optimization. The researchers argued that JAYA is not suitable for binary optimization problems and that using the XOR operator overcomes this obstacle. They also added a local search step to improve the performance of the developed algorithm. They reported that the solution quality and stability of the new algorithm, compared with other algorithms on various problems, were better.
Ibrahim et al. [24] designed an improved SSA for PSO-based feature selection, exploiting the strengths of the two algorithms to overcome the high-dimensionality problem. The developed algorithm was evaluated in two parts: first on benchmark functions, and second with an experimental analysis of selecting the best features on different data sets. They reported that the improved hybrid algorithm gave better performance and accuracy.
Chen et al. [25] proposed a hybrid algorithm combining PSO and CS and used it as a new training method for feedforward neural networks. They found that the proposed hybrid algorithm performed better in training feedforward neural networks than PSO and CS.
The aim of this study is to develop a new hybrid algorithm by combining metaheuristic optimization algorithms from the literature: an algorithm with high accuracy and minimal error that outperforms the algorithms it is built from.
In this context, a hybrid metaheuristic algorithm (Hybrid Salp Swarm Jaya Algorithm, HSSJAYA) consisting of SSA and JAYA was developed. The hybrid algorithm was applied to unimodal and multimodal benchmark functions and compared with SSA, JAYA and several leading algorithms.
Section 1 (Introduction) presents information on optimization, metaheuristic algorithms, hybrid algorithms and related studies, together with the purpose and subject of the study.
Section 2 (Overview) presents information about SSA and JAYA.
Section 3 (Proposed Hybrid Approach) describes the developed HSSJAYA, including its equations, modifications and updates, as well as detailed pseudo-code.
Section 4 (Experimental Results) compares the solutions obtained by HSSJAYA on unimodal and multimodal benchmark functions with those of other algorithms, together with a statistical comparison of the results. The benchmark functions used in the research, the search agents, the number of iterations and the parameters of the compared algorithms are also given in this section.
Section 5 (Conclusions and Future Work) presents conclusions about the designed HSSJAYA and guidance for future studies.

Overview

Salp Swarm Algorithm
SSA is inspired by salps, of the family Salpidae, which are structurally similar to jellyfish and live in swarms deep in the seas and oceans. At the head of the salp chain there is a leader, and the other salps follow it. The leader updates its position relative to the food source, and the best solution is always held by the leader. The salps following the leader update their positions relative to each other. SSA, which is easy and simple to implement, is used in many areas, including optimization problems [10,26].
Mirjalili et al. [10] describe the equations used in SSA as follows. In SSA, the salps are located in a d-dimensional search space, and N denotes the number of search agents. The X matrix of size N×d, which holds the positions of the salps, is shown in Eq. (1). The salps are randomly initialized between the specified lower and upper bounds.
Eq. (2) is used to update the position of the leader salp in SSA.
According to this equation, when i == 1, x_j^i denotes the position of the leader salp in the j-th dimension and F_j denotes the best solution (food source) in the j-th dimension. The terms ub_j and lb_j refer to the upper and lower bounds of the j-th dimension. The coefficient c_1, which is important for approaching the food source, is calculated according to Eq. (3).
In Eq. (3), e is Euler's number, it is the current iteration and Max_it is the maximum number of iterations. According to the literature, the coefficients c_2 and c_3 take random values between 0 and 1. This means that c_3 never falls below zero, so the c_3 < 0 branch of Eq. (2) can never be taken. For this reason, Ahmed et al. [27], Singh et al. [28] and Faris et al. [29] used in their studies an equation in which the leader position is updated according to c_3 taking values between 0 and 1, shown in Eq. (4).
The mathematical expression for updating the positions of the salps that follow the leader is given in Eq. (5).
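For reference, the standard forms of Eqs. (2)-(5), as given by Mirjalili et al. [10] and the variant used in [27-29], are:

```latex
% Eq. (2): leader salp position update
x^1_j =
\begin{cases}
F_j + c_1\left((ub_j - lb_j)\,c_2 + lb_j\right), & c_3 \geq 0 \\
F_j - c_1\left((ub_j - lb_j)\,c_2 + lb_j\right), & c_3 < 0
\end{cases}

% Eq. (3): balance coefficient between exploration and exploitation
c_1 = 2\,e^{-\left(\frac{4\,it}{Max\_it}\right)^2}

% Eq. (4): leader update with c_3 drawn uniformly from [0, 1]
x^1_j =
\begin{cases}
F_j + c_1\left((ub_j - lb_j)\,c_2 + lb_j\right), & c_3 \geq 0.5 \\
F_j - c_1\left((ub_j - lb_j)\,c_2 + lb_j\right), & c_3 < 0.5
\end{cases}

% Eq. (5): follower salp position update, for i >= 2
x^i_j = \frac{1}{2}\left(x^i_j + x^{i-1}_j\right)
```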
Tab. 1 contains the pseudo-code of SSA, in which a single leader salp is used when updating the salp positions [10].

01: Determine salp population (x_i (i = 1, 2, 3, ..., N)) within the lower and upper bounds
02: while (the stopping criterion is not met)
03:   Find the best fitness value
04:   Set the best salp as F
05:   Update c_1 (Eq. (3))
06:   for each salp (x_i)
07:     if (i == 1)
08:       Update the leader salp position (Eq. (4))
09:     else
10:       Update the follower salp position (Eq. (5))
11:     end
12:   end
13:   Adjust salps that exceed the bounds back to the lower and upper bounds
14: end
15: return F

In the literature, selecting multiple leader salps instead of a single one increases the randomness of the algorithm, which can harm its stability. To increase randomness while keeping stability balanced, it has been stated that half of the search agents (N/2) should be chosen as leaders and the other half (N/2) as followers [30,31]. In addition, when the leader and follower position updates in the code published by the developers of SSA are examined, it is seen that half of the search agents are leaders and the other half are followers [32,33]. Accordingly, the pseudo-code of this SSA variant is shown in Tab. 2. In our research, the pseudo-code in Tab. 2 was taken as the basis for SSA and HSSJAYA.

01: Determine salp population (x_i (i = 1, 2, 3, ..., N)) within the lower and upper bounds
02: while (the stopping criterion is not met)
03:   Find the best fitness value
04:   Set the best salp as F
05:   Update c_1 (Eq. (3))
06:   for each salp (x_i)
07:     if (i <= N/2)
08:       Update the leader salp position (Eq. (4))
09:     else if (i > N/2 and i <= N)
10:       Update the follower salp position (Eq. (5))
11:     end
12:   end
13:   Adjust salps that exceed the bounds back to the lower and upper bounds
14: end
15: return F
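The Tab. 2 variant can be sketched in Python as follows. This is a minimal sketch assuming the standard SSA equations from the literature; the function and variable names are ours, not the paper's.

```python
import numpy as np

def ssa(fitness, lb, ub, dim, n_agents=30, max_it=100, seed=0):
    """Sketch of the Tab. 2 SSA variant: the first N/2 salps act as
    leaders (Eq. (4)) and the rest as followers (Eq. (5))."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, size=(n_agents, dim))   # Eq. (1): random init in bounds
    f = np.array([fitness(row) for row in x])
    best_idx = f.argmin()
    food, food_fit = x[best_idx].copy(), f[best_idx]  # F: best salp found so far
    for it in range(1, max_it + 1):
        c1 = 2 * np.exp(-(4 * it / max_it) ** 2)      # Eq. (3)
        for i in range(n_agents):
            if i < n_agents // 2:                     # leader salps, Eq. (4)
                c2 = rng.random(dim)
                c3 = rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                x[i] = np.where(c3 >= 0.5, food + step, food - step)
            else:                                     # follower salps, Eq. (5)
                x[i] = (x[i] + x[i - 1]) / 2
        x = np.clip(x, lb, ub)                        # keep salps inside bounds
        f = np.array([fitness(row) for row in x])
        if f.min() < food_fit:                        # update the food source F
            food_fit = f.min()
            food = x[f.argmin()].copy()
    return food, food_fit
```

For example, minimizing the 10-dimensional sphere function with 30 search agents and 100 iterations returns the best food position found and its fitness.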

Jaya
JAYA, which means "victory" in Sanskrit, was developed by Rao [11] and, unlike many other optimization algorithms, has no extra algorithm-specific parameters. The algorithm keeps track of the best and worst solutions: it tries to get as close as possible to the best solution and as far away as possible from the worst. It is easy and simple to implement, and its basic parameters are very few. Rao [11] describes the solution update according to Eq. (6); the terms of Eq. (6) are explained in Tab. 3. The pseudo-code of JAYA is as follows:

01: Determine the population within the lower and upper bounds
02: while (the stopping criterion is not met)
03:   Determine the best and worst candidate solutions
04:   Update each solution according to the best and worst solutions (Eq. (6))
05:   if (new (updated) solution < solution)
06:     Replace the solution with the new (updated) solution
07:   else
08:     Keep the solution
09:   end
10: end
11: return optimum solution
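The JAYA update and greedy acceptance described above can be sketched in Python as follows, using the standard form of Eq. (6) from the literature; the function and variable names are ours.

```python
import numpy as np

def jaya(fitness, lb, ub, dim, n_pop=30, max_it=100, seed=0):
    """Minimal JAYA sketch: each candidate moves toward the best
    solution and away from the worst (Eq. (6)), and an update is
    kept only if it improves the fitness (greedy acceptance)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, size=(n_pop, dim))
    f = np.array([fitness(row) for row in x])
    for _ in range(max_it):
        best = x[f.argmin()].copy()    # best candidate of this iteration
        worst = x[f.argmax()].copy()   # worst candidate of this iteration
        for k in range(n_pop):
            r1, r2 = rng.random(dim), rng.random(dim)
            # Eq. (6): attracted to the best, repelled from the worst
            new = x[k] + r1 * (best - np.abs(x[k])) - r2 * (worst - np.abs(x[k]))
            new = np.clip(new, lb, ub)
            new_f = fitness(new)
            if new_f < f[k]:           # greedy acceptance
                x[k], f[k] = new, new_f
    return x[f.argmin()], f.min()
```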

Proposed Hybrid Approach
Metaheuristic algorithms aim to find global or near-optimal solutions at a reasonable computational cost. By combining the salp swarm algorithm's global optimum search capability with the Jaya algorithm's success in reaching the best solution, we aim for a hybrid algorithm that reaches the best result faster than traditional metaheuristic algorithms.
In SSA, the salps update their positions according to the food source. During this update, the leader and follower salps try to get as close as possible to the food source, and positions that do not give good results have no effect on the calculations. In JAYA, the best and worst candidate solutions are both used to calculate the new solution.
The hybrid algorithm is based on SSA's leader-follower salp system and JAYA's best and worst solutions. HSSJAYA works according to the best and worst food source positions: the best food source is the position the leader and follower salps should reach, and the worst food source is the position they should avoid. When calculating the best food source position, values obtained from the worst food source position are also included. In this way, the leader and follower salps are expected to find the best route to the food source. HSSJAYA was developed based on the SSA pseudo-code given in Tab. 2. The equations developed for HSSJAYA and their descriptions are as follows.
In HSSJAYA, as in SSA, the salps are located in a d-dimensional search space, and N denotes the number of search agents. Eq. (7) shows the X matrix of size N×d that holds the positions of the salps. The salps are randomly initialized between the specified lower and upper bounds.
Eqs. (8)-(10) define the d-dimensional best food source, worst food source and new candidate solution positions, respectively.
The new candidate solution in HSSJAYA is obtained as in JAYA, using Eq. (11).
The terms of Eq. (11) are explained in Tab. 5. In the equation, bfpfv is the best food position fitness value and wfpfv is the worst food position fitness value. Before the leader and follower salps update their positions, the positions of the salps are updated according to Eq. (13).
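As an illustration, and assuming Eq. (11) follows JAYA's standard update form with the best and worst food positions in place of the best and worst candidates (the exact published equation may differ), the new candidate solution could be computed as:

```python
import numpy as np

def new_candidate(x_i, bfp, wfp, rng):
    """Hypothetical reading of Eq. (11): a JAYA-style update in which
    the best food position (bfp) attracts the salp and the worst food
    position (wfp) repels it. Names and form are assumptions."""
    r1 = rng.random(x_i.shape)
    r2 = rng.random(x_i.shape)
    return x_i + r1 * (bfp - np.abs(x_i)) - r2 * (wfp - np.abs(x_i))
```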
The update of the leader salp in the HSSJAYA algorithm, for i ≤ N/2, is shown in Eq. (14).
The difference between this equation and the leader salp position update in SSA is that bfp_j (the best food position in the j-th dimension) replaces F_j (the food position in the j-th dimension). Apart from the coefficient c_1, the other coefficients and terms (ub_j, lb_j, c_2, c_3) are used as in SSA. There are studies in the literature in which the SSA coefficients have been modified. One of them is the perturbation-weight salp swarm algorithm of Fan et al. [34], who proposed new c_1 and c_2 values and a leader and follower position update technique based on a perturbation weight mechanism. The equation proposed by Fan et al. [34] for c_1 is shown in Eq. (15), where t is the current iteration, T is the maximum number of iterations and u_1 is a number between 0 and 1.
In this study, we also modified the original c_1 coefficient by adding a new parameter, obtaining the new c_1 coefficient shown in Eq. (16).
The iv value (0 < iv < 1) in Eq. (16) is the improvement value of c_1; it reduces the difference between positions when updating the leader's position. In our analysis, the hybrid algorithm gave better results this way. The iv value is not random: it is assigned as a fixed parameter so that researchers using the hybrid algorithm can change it within the interval above according to the type of problem to be solved. The pseudo-code of HSSJAYA includes the following steps:

05: Set the position with the best fitness value as bfp
06: Set the best fitness value as bfpfv
07: Set the position with the worst fitness value as wfp
08: Set the worst fitness value as wfpfv
09: while (the stopping criterion is not met)
10:   Update bwfr (Eq. (12))
11:   Update c_1 (Eq. (16))
12:   for i = 1 to N
13:     Determine random values r_1 and r_2
14:     for j = 1 to d
15:       Update the new candidate solution (Eq. (11))
16:       Amend ...
...
      Assign the position of the fitness value to wfp
43:   Assign the fitness value to wfpfv
(Continued)
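As a hypothetical illustration only (the exact form of Eq. (16) is not shown above), one way the improvement value iv could scale down the original c_1 of Eq. (3) is:

```python
import math

def c1_improved(it, max_it, iv=0.1):
    """Hypothetical form of Eq. (16): the improvement value iv
    (0 < iv < 1) scales down the original c1 of Eq. (3), shrinking
    the leader's step toward the food source. This exact form is an
    assumption, not the paper's published equation."""
    return iv * 2 * math.exp(-(4 * it / max_it) ** 2)
```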

Experimental Results
To measure the performance of the developed algorithm, several analyses must be performed; this section presents the analyses carried out with HSSJAYA. The hybrid algorithm was used to solve unimodal and multimodal benchmark functions, and the results were compared with the popular CS, GA and FFA algorithms, and primarily with the SSA and JAYA algorithms that make up the hybrid. All algorithms used equal numbers of independent runs, search agents and iterations: each algorithm was run independently 30 times, with 30 search agents and 100 iterations per run.
HSSJAYA was written in Python (version 3.6) using the EvoloPy framework, an easy-to-use framework developed for optimization problem solving, artificial neural network training, feature selection and clustering [35][36][37].
Algorithms can have special parameters, which may be constant or vary according to the problem being studied. The iv parameter of HSSJAYA was set to 0.1. For the parameter values of the other algorithms, the values in the EvoloPy framework were used [33].
In addition, the methods used in the comparison between algorithms are described in the subsections of this section.

Results of Benchmark Functions
HSSJAYA and the other algorithms optimized a total of 35 benchmark functions: 14 unimodal and 21 multimodal. The optimization process followed the criteria in Section 4. To make the algorithms' results easier to compare, the results were normalized to between 0 and 1 [10]. Tab. 7 lists the unimodal and multimodal benchmark functions used in the study. A few of these functions are described in the literature as both unimodal and multimodal; each was assigned to the type under which it is most frequently used [38][39][40][41][42][43].
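The normalization step can be sketched as follows, assuming standard min-max normalization; the helper name is ours.

```python
import numpy as np

def normalize(results):
    """Min-max normalization of fitness results to [0, 1], making
    results of different algorithms comparable on a common scale
    (assumed to be the normalization used; a standard sketch)."""
    r = np.asarray(results, dtype=float)
    span = r.max() - r.min()
    if span == 0:                 # all results tied: map everything to 0
        return np.zeros_like(r)
    return (r - r.min()) / span
```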
The mean and standard deviation values of the benchmark functions for each algorithm are shown in Tab. 8. Examining these values shows that HSSJAYA obtains better mean and standard deviation values than the other algorithms on most benchmark functions. In addition, the mean convergence curves of some benchmark functions optimized by HSSJAYA are given in Fig. 1.
Although HSSJAYA appears successful in optimizing the unimodal and multimodal benchmark functions, its success must also be demonstrated statistically. For this reason, the Wilcoxon rank sum test was applied, with p values below 0.05 (5E-02) taken to indicate a statistically significant difference. In each statistical test, the best algorithm was compared with the other algorithms [11,44]. The results of the Wilcoxon rank sum test are given in Tab. 9. The results show that the hybrid algorithm produces statistically significant differences on the unimodal and multimodal benchmark functions and is successful.

Conclusions and Future Work

HSSJAYA was inspired by SSA's method of guiding the salps to the food source and JAYA's method of reaching the desired solution through the best and worst candidate solutions. In other words, a hybrid algorithm has been developed in which the leader and follower salps reach the food more successfully and efficiently by calculating both the positions the salps should avoid (worst food solution) and the positions they should reach (best food solution).
The proposed HSSJAYA appears successful in optimizing the benchmark functions: it achieved the best mean results in 30 of the 35 benchmark functions compared with the other algorithms. The study is also statistically successful, as the hybrid algorithm produces statistically significant differences in most results compared with the other algorithms. Other factors in HSSJAYA's success are the developed algorithm structure and the newly added equations and parameters. According to these results, HSSJAYA is successful in solving benchmark problems relative to the algorithms with which it was compared.
HSSJAYA was tested with fixed numbers of runs, search agents and iterations in the optimization of the benchmark functions. We recommend testing it with different numbers of runs, search agents and iterations, and applying it to other problems or artificial intelligence techniques beyond those in this study.
We believe that by selecting algorithms that perform well in their fields from among the metaheuristics created or developed by researchers and hybridizing them with the HSSJAYA developed in our study, new hybrid algorithms that give even better results can be developed.