Open Access
REVIEW
A Comprehensive Survey on Snake Optimizer and Its Performance Evaluation in Image Clustering Field
1 Department of Computer Applications, Sikkim University, Sikkim, India
2 Natural Science Research Centre of Belda College Affiliated to Vidyasagar University, Belda College, Paschim Medinipur, West Bengal, India
3 Department of Computer Science, Midnapore College (Autonomous), Paschim Medinipur, West Bengal, India
* Corresponding Author: Rebika Rai. Email:
Computer Modeling in Engineering & Sciences 2026, 147(1), 7 https://doi.org/10.32604/cmes.2026.079037
Received 13 January 2026; Accepted 08 March 2026; Issue published 27 April 2026
Abstract
Snake Optimizer (SO) is a popular optimization algorithm developed by Hashim and Hussien, based on the competitive and selective mating behaviour of snakes. By emulating these natural mechanisms, SO offers an intelligent method for solving complicated optimization problems, making it a valuable tool in various scientific and technological applications. This paper provides an extensive review of the SO, its inception, the development of its variants, and its applications. It identifies several SO variants: improved variants employing different strategies, variants hybridized with other metaheuristics, binary variants for discrete optimization problems, and multi-objective variants for problems with multiple objectives. Furthermore, the applications of these variants demonstrate SO's adaptability across diverse fields. In addition, the paper discusses several possible future research directions for SO. The performance of the SO has been evaluated in the clustering-based image segmentation domain and compared with other MAs; the numerical and statistical results clearly demonstrate the superiority of the SO over the other tested MAs. With researchers adopting MAs as an alternative methodology for solving almost every optimization challenge, this survey should provide valuable insights to researchers seeking a thorough understanding of SO, its advancements, and its broad applications in resolving diverse optimization problems.
Metaheuristic algorithms (MAs) are among the emerging optimization techniques that have witnessed phenomenal advancement in offering solutions to many complex optimization problems where conventional methods fail to deliver the optimal solution. Fred Glover [1], in the year 1986, coined the term “Metaheuristic” to refer to heuristic approaches that provide a general framework applicable to an extensive range of optimization problems without being tailored to each. In recent years, researchers have developed a strong interest in metaheuristic-based optimization, bringing tremendous progress since the introduction of the first metaheuristic approach. With regular breakthroughs, new algorithms are proposed to tackle real-world challenges effectively. In metaheuristics, exploration and exploitation are two essential strategies [2] that boost the search for optimal solutions during the optimization process, and it is crucial to maintain the balance between them to achieve efficiency and accuracy. The literature also terms metaheuristic algorithms Nature-Inspired Optimization Algorithms (NIOAs) [3]. The No Free Lunch (NFL) theorem [4] states that no single MA is capable of solving every type of optimization problem. This instigates researchers to continually propose new MAs, thus making significant contributions to the field of optimization. MAs [5] are categorized into four main groups: Swarm-Based, Physics/Chemistry-Based, Human-Based, and Plant-Based optimization algorithms, as depicted in Fig. 1.

Figure 1: Classification of metaheuristic algorithms.
Swarm-Based MAs are social organism-inspired optimization techniques wherein social agents such as birds, fish, ants, and bees, moving in flocks, employ self-organizing, decentralized systems in which various agents cooperate with one another and exchange information to search for optimal solutions effectively. Some of the swarm-based algorithms are: Ant Colony Optimization (ACO) [6], Particle Swarm Optimization (PSO) [7], Marine Predators Algorithm (MPA) [8], Firefly Algorithm (FA) [9], Cuckoo Search (CS) [10], and Snake Optimizer (SO) [11]. The second class of MAs, as highlighted in Fig. 1, is Physics/Chemistry-Based MAs, which draw inspiration from physical processes and chemical reactions that are then mathematically modelled to develop solutions for optimization problems. A few algorithms that belong to this class are: Nuclear Reaction Optimization (NRO) [12], Atom Search Optimization (ASO) [13], Chemical Reaction Optimization (CRO) [14], Central Force Optimization (CFO) [15], and Thermal Exchange Optimization (TEO) [16]. The third class of MAs is Human-Based MAs, which take inspiration from human behaviours, decision-making ability, and even social interactions. Some of the algorithms listed under this category are: Political Optimizer (PO) [17], Search and Rescue Optimization (SRO) [18], Teaching Learning-Based Optimization (TLBO) [19], Human Mental Search (HMS) [20], and Soccer League Competition Algorithm (SLC) [21]. The last category of MAs is Plant-Based MAs, which draw inspiration from the growth, reproduction, adaptation, and survival strategies of plants. A few plant-based MAs are: Plant Growth Optimization (PGO) [22], Root Growth Algorithm (RGA) [23], Plant Propagation Algorithm (PPA) [24], Rooted Tree Optimization (RTO) [25], and Paddy Field Algorithm (PFA) [26].
Survey studies on MAs, such as [27–31], serve as a rich foundation for obtaining complete information about these algorithms, their origin, advancements, variety of applications, variants, and different adaptations, thus helping researchers gain more insight into the various algorithms that fall under the category of MAs. Additionally, case studies and experimental validation offer researchers a good indication of their effectiveness as well as real-world applicability. Consequently, such a survey provides a complete resource, paving the way for researchers to explore advanced optimization techniques in several domains.
Optimization techniques play a critical role in image processing, as many image-related tasks can be formulated as complex optimization problems. Image-related tasks include image clustering, segmentation, feature selection, image enhancement, and performance evaluation, where the objective functions are often nonlinear, multimodal, and even high-dimensional. Traditional methods often struggle with such complexities, motivating the widespread adoption of metaheuristic optimization algorithms in image processing applications. In recent years, a variety of population-based optimization algorithms have been successfully applied to image clustering and analysis, including PSO, ACO, Differential Evolution (DE), Grey Wolf Optimizer (GWO), and Whale Optimization Algorithm (WOA). These methods have been employed to optimize clustering objectives, feature representations, and clustering validity indices, demonstrating improved robustness and accuracy compared to traditional approaches. However, their performance can still be affected by premature convergence and sensitivity to parameter settings, particularly in high-dimensional image feature spaces. More recently, novel bio-inspired optimization algorithms have been introduced to address these limitations by enhancing exploration–exploitation balance and convergence stability. Despite these advances, the application of newly developed optimizers in image clustering performance evaluation remains relatively limited. Even though a large number of MAs are available in the literature, their efficiency is strongly problem-dependent. In this study, the optimization task is associated with the performance evaluation of image clustering, which is inherently nonlinear and multimodal.
This complexity arises from the high-dimensional feature space of images, complex inter-cluster relationships, and the presence of noisy or overlapping data, all of which increase the risk of premature convergence when conventional optimization techniques are applied.
This motivates the exploration of the SO in the present study, where its adaptive population interaction mechanism is leveraged to improve the reliability and effectiveness of performance evaluation in image clustering tasks. Therefore, this paper performs a review of the SO, a new Swarm-Based MA proposed by Hashim and Hussien in the year 2022; to date, no thorough review of the SO has been conducted. The SO was also selected due to its adaptive exploration–exploitation mechanism, which is well-suited for handling complex search spaces encountered in image clustering problems. By dynamically regulating global search and local refinement through population interaction strategies, SO helps preserve solution diversity while improving convergence stability. This makes SO particularly effective for performance evaluation tasks in image clustering, where avoiding local optima is crucial for achieving reliable and robust results. Unlike static optimization strategies, SO dynamically adapts its search behavior during the optimization process, which is especially beneficial for clustering-related evaluation problems where the objective landscape is highly irregular. The SO had achieved 1110 citations on Google Scholar as of 5 October 2025. This substantial and growing citation count demonstrates the wide acceptability and popularity of the SO for solving diverse complex optimization problems. This study aims to compile and assess the existing SO-based literature. This study is significant because it describes, categorizes, and evaluates the SO-based methods and their enhanced versions, which have been successfully employed to address multiple optimization problems.
The main contributions of the paper include knowledge consolidation, noting the strengths and drawbacks, proposing improvements by creative approaches, and showing its usefulness in optimization domains. The contributions are as follows:
(a) Analysis of SO’s Methodology: In this research, the theoretical approach and basic search mechanisms of the SO will be considered, with its approach for maintaining a trade-off between exploration and exploitation. The effect of various control parameters on the performance of SO will be discussed, and approaches for controlling them will also be evaluated.
(b) Analysis of SO’s Variants: Several improved variants of SO are presented in this study, along with an explanation. The major strategies that help to improve the SO’s performance are listed and discussed. Multi-objective and binary variants of SO and their applications are also mentioned in this study.
(c) Performance Evaluation: This research presents a complete analysis of SO’s effectiveness in the area of clustering-based colour image segmentation tasks. It has been assessed in this research by comparing it to other latest models of MAs.
(d) Applications: The SO algorithm employed across multiple areas, namely Energy, Engineering, IoT, Machine Learning, and Big Data, is systematically compiled, highlighting the wider applicability of SO.
(e) Identification of SO’s Advantages and Limitations: This paper offers a separate section that highlights the advantages and limitations of the SO. The discussion includes SO’s adaptability as well as issues encountered in handling complex optimization problems.
(f) Addresses Various Future Research Directions: In addition, this paper provides several future research directions, thus inspiring researchers to work on the identified literature gaps, address limitations, and explore open problems related to the SO algorithm.
1.2 Positioning of the Proposed Review with Respect to Existing Surveys
Various comprehensive surveys have been published on Nature-inspired and Metaheuristic optimization algorithms, providing broad taxonomies, classifications, and general performance discussions. However, most of these studies adopt a macro-level perspective, summarizing multiple algorithm families without offering an in-depth methodological and critical examination of any single algorithm, and the same has been highlighted in Table 1. In contrast, the present review adopts a micro-level analytical perspective, focusing exclusively on the SO algorithm. The distinctive features of this review are:

1. A detailed theoretical dissection of SO’s exploration–exploitation strategy, search operators, and control parameter influence, which is generally not addressed in broad surveys.
2. Discussion of parameter adaptation mechanisms and their effect on convergence behavior, providing practical guidance for researchers.
3. Structured categorization of binary, multi-objective, hybrid, and adaptive variants of SO, along with their modification strategies.
4. Unlike traditional surveys that summarize reported results, this work performs an independent experimental evaluation of SO in clustering-based colour image segmentation tasks, comparing it with recent metaheuristic algorithms.
5. A dedicated section explicitly analysing SO’s scalability, convergence speed, computational complexity, and limitations in high-dimensional or multimodal problems.
6. Identification of unresolved challenges and formulation of structured research directions, including adaptive parameter control, hybrid learning mechanisms, and theoretical convergence analysis.
A survey study on the SO often considers research questions about the algorithm’s performance, modifications, applications, and comparative analysis. Typical key research questions consist of:
The principled methodological and theoretical research questions are as follows:
• What are the fundamental principles of the SO, and how has a mathematical optimization framework been established based on the mating behaviour and foraging strategies of snakes in their natural environment?
• How does SO balance between exploration and exploitation in various optimization scenarios?
• What are the characteristics, computational complexity, and sensitivity to parameters of SO?
The research questions pertaining to the different variants of SO are as follows:
• What significant enhancements and hybridizations of the SO have been documented in the literature?
• How can these variants alleviate the limitations of the original SO, namely concerning local optima entrapment and computational costs in high-dimensional contexts?
• Are there any special circumstances under which certain SO versions operate better than others?
The research questions concerning the applications and performance assessment of SO are as follows:
• What are the primary application domains where the SO and its variants have been successfully employed?
• How does the performance of SO compare with that of other popular MAs in solving problems across various domains?
• What are the primary performance metrics and benchmarking challenges utilized to evaluate the effectiveness and robustness of the SO?
The research questions pertaining to the challenges and future study paths of SO are as follows:
• What are the outstanding problems and ongoing scientific inquiries related to the SO?
• How can the current inadequacies of the SO be addressed to improve the algorithm?
• What are the possible future research directions in the field of SO, including modifications based on new challenges or the adoption of additional AI methods?
These research questions can help build the basis for a rather comprehensive paper on the subject of the SO in terms of a literature survey.
The remaining sections of the paper are organized around seven key pillars, structured as follows: Research Methodology and survey taxonomy are discussed in Section 2. A comprehensive overview of the SO Algorithm and its mathematical formulation is presented in Section 3. Literature review of SO-based research is discussed in Section 4. A detailed overview of SO’s applications across diverse sectors is performed in Section 5. Theoretical comparison with other MAs is presented in Section 6. Performance evaluation of SO and other tested MAs has been performed in Section 7. The conclusion and future research are elaborated in Section 8.
2 Research Methodology and Survey Taxonomy
This section presents a thorough review methodology for the SO, along with specifics on how data were collected, examined, interpreted, and synthesized in order to assess and comprehend the subject under review. The collection process is divided into three main steps: Preliminary study, Article screening, and Article finalization, as demonstrated in Fig. 2.

Figure 2: Phases of review methodology.
The foremost objective of the preliminary study phase is to attain a comprehensive understanding of the scope of the literature search. This necessitates determining potential sources for further investigation and performing a preliminary search using pertinent keywords. Initially, the search was conducted using several key terminologies associated with the algorithm, such as Snake Optimizer, Snake Optimizer Algorithm, and Snake Optimizer Nature-Inspired Algorithm (though not limited to these), as illustrated in Fig. 3. Papers that included these keywords in their title were chosen. To compile the various published articles related to SO, several well-known publishers, including IEEE, Elsevier, Springer, MDPI, Nature Portfolio, Taylor & Francis, Wiley, and Frontiers, among others, were included. To facilitate this, Google Scholar, a commonly used web search engine making full-text research articles available across multiple disciplines, was used.

Figure 3: Keywords used to search research papers related to SO from Google Scholar.
This phase is included to guarantee the relevance of the selected articles to the research theme. The screening process focuses on identifying the most suitable publications that align with the research objectives while eliminating those that do not meet the specified criteria. This screening helped eliminate irrelevant research papers. To achieve our objective, we established specific criteria for the article screening phase: (i) inclusion and (ii) exclusion criteria. Inclusion criteria: research articles published in reputable journals, written in English, and with the full article available for download. Exclusion criteria: research articles published as book chapters, papers available in languages other than English, and articles whose full version is not available for download.
The important characteristics of the selected 129 different SO-related journal and conference papers are meticulously recorded and classified based on several parameters, namely titles, authors, publication year, publishers, citations, application area, and type, throughout the data extraction phase.
Among these 129 papers, the numbers of journal and conference papers are 104 and 25, respectively. The graphical representation of this information is presented in Fig. 4. Year-wise publications of the 129 papers considered for the survey are pictorially presented in Fig. 5. The publishers of these papers are depicted in Fig. 6, which clearly shows that IEEE is at the top of the list, followed closely by Elsevier and Springer, two major contributors to the body of literature on this subject. Wiley, MDPI, and Nature share the fourth rank. Fig. 7 shows the year-wise citations of the original source paper of SO. The citations of the SO are increasing, which demonstrates its popularity in the MA-based research domain. MAs that were published in 2022 and achieved excellent citation counts are presented in Fig. 8. The figure clearly shows that SO ranks fourth with 1206 citations. Surveys or review papers on the first three ranked MAs of 2022 have already been published in the literature; therefore, we opted for SO for this survey.

Figure 4: Number of publications in Journals and Conferences.

Figure 5: Year-wise number of papers considered in the survey.

Figure 6: Publication percentages of different publishers.

Figure 7: Year-wise citations of snake optimization algorithm as per Google Scholar (dated 27 December 2025).

Figure 8: Metaheuristic algorithms published in 2022 and their citation counts as per Google Scholar.
Snake Optimizer was developed and introduced in the year 2022 by Hashim and Hussien by simulating the mating, hunting, and survival behaviors of snakes in nature [11]. Their research article, entitled “Snake Optimizer: A novel meta-heuristic optimization algorithm”, was published in the esteemed journal Knowledge-Based Systems under Elsevier and became publicly accessible online on 22 January 2022. As previously mentioned, the SO algorithm is inspired by the mating behavior of snakes, and based on this concept, its search mechanism is structured into two distinct phases, namely exploration and exploitation.
• Exploration: In the natural mating process of snakes, male snakes are the primary travelers, actively searching for mates, whereas female snakes are the decision-makers, selectively picking the best partner. To attract a female, males display extensive movement patterns to locate the female snakes; this typically involves searching a broad area and making unpredictable movements, such as random or zigzag patterns, to increase the chance of encountering a mating partner. Mimicking this randomized movement ensures that the algorithm is capable of encountering multiple potential solutions before settling on an optimal one. Therefore, the exploration phase of the SO algorithm aims to expand the search space and prevent premature convergence to local optima.
• Exploitation: Once the snake has explored the search space and identified a potential mate, its behavior changes from random movements to focused engagement. The male snake tries to stay close to the female snake, exhibiting slow yet controlled movements. However, if competition arises, the male snake might engage in battle with rivals, refining its position to secure its place and be chosen by the female snake. Following this natural process, the exploitation phase of SO aims to narrow down the search space, focusing on refining the best solution and, in turn, enhancing solution quality.
The biological behaviours of snakes can be interpreted well alongside optimization principles, and the same has been projected in Table 2.

3.1 Mathematical Model of Snake Optimizer
The SO algorithm is highlighted as Algorithm 1, and the phases of the SO algorithm are detailed as follows, with its diagrammatic representation depicted in Fig. 9.

Figure 9: Flowchart of snake optimizer.

Phase 1: Population Initializing Phase
Like all other metaheuristics, SO [11] initiates the optimization process via the creation of a randomly and uniformly distributed initial population. In essence, this guarantees diversity in the initial solutions, giving the search mechanism a solid base. Eq. (1) can be used to obtain the initial population. Furthermore, the population is split equally into two groups, assuming a 50% distribution of male and female snakes; Eqs. (2) and (3) are used to divide the population.
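The initialization step described above can be sketched as follows; this is a minimal illustration in which the function and variable names are hypothetical, the uniform sampling mirrors the Eq. (1)-style rule X = X_min + r · (X_max − X_min), and the 50/50 male–female split follows the text:

```python
import numpy as np

def init_population(n, dim, x_min, x_max, seed=0):
    rng = np.random.default_rng(seed)
    # Eq. (1)-style uniform initialization: X_i = X_min + r * (X_max - X_min)
    pop = x_min + rng.random((n, dim)) * (x_max - x_min)
    # Eqs. (2)-(3): split the population equally into male and female groups
    n_m = n // 2
    return pop[:n_m], pop[n_m:]
```

For example, `init_population(10, 3, -5.0, 5.0)` yields two groups of five 3-dimensional snakes, all within the bounds [-5, 5].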
Phase 2: Evaluation Phase
The two groups created in Phase 1 are evaluated to determine the best individual in each group, thus identifying the best male (fbest,m), the best female (fbest,f), and the optimal food position (ffood). Further, the temperature, Temp, is calculated using Eq. (4), where t and T refer to the current iteration and the total number of iterations, respectively.
This phase also deals with defining the quantity of food, typically denoted as Q, and the same is obtained using Eq. (5).
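The two schedule quantities above can be sketched in code. The exponential forms Temp = e^(−t/T) and Q = c1 · e^((t−T)/T) with c1 = 0.5 are the ones commonly used in SO implementations and are an assumption here, consistent with the thresholds discussed in the following phases:

```python
import math

def temperature(t, T):
    # Assumed Eq. (4) form: Temp = exp(-t / T), decaying from 1 toward e^-1
    return math.exp(-t / T)

def food_quantity(t, T, c1=0.5):
    # Assumed Eq. (5) form: Q = c1 * exp((t - T) / T), rising toward c1
    return c1 * math.exp((t - T) / T)
```

Under these forms, Q starts below the 0.25 exploration threshold early in the run and crosses it as iterations progress, while Temp cools steadily.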
Phase 3: Exploration Phase (when food is unavailable)
The exploration phase begins whenever food is unavailable, prompting the snakes to explore the search space. To determine when to initiate Phase 3, a threshold value of 0.25 is set. If the quantity of food, Q, falls below this threshold, the snakes initiate the exploration phase by choosing a random position and updating their position accordingly. This process is carried out using Eqs. (6) and (8).
For male snakes, the ith male position is determined using Eq. (6) as given below:
Similarly, for female snakes, the ith female position is determined using Eq. (8) as given below:
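A hedged sketch of the exploration update for one snake follows (the male and female updates are symmetric). The form X_new = X_rand ± c2 · A · (rand · (X_max − X_min) + X_min), with the fitness-based weight A = exp(−f_rand / f_i) and c2 = 0.05, is an assumption taken from typical SO implementations; all names are illustrative:

```python
import numpy as np

def explore_position(x_i, f_i, group, f_group, x_min, x_max,
                     c2=0.05, rng=None):
    """Exploration update for one snake relative to a random peer."""
    rng = rng if rng is not None else np.random.default_rng()
    j = rng.integers(len(group))                      # pick a random peer snake
    x_rand, f_rand = group[j], f_group[j]
    a = np.exp(-abs(f_rand) / (abs(f_i) + 1e-12))     # fitness weight (assumed form)
    sign = rng.choice([-1.0, 1.0])                    # random +/- direction flag
    step = c2 * a * (rng.random(x_i.shape) * (x_max - x_min) + x_min)
    return np.clip(x_rand + sign * step, x_min, x_max)
```

Because the step is anchored at a randomly chosen peer rather than the current best, this update keeps the population spread across the search space, matching the exploration intent described above.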
Phase 4: Exploitation Phase (when food is available)
This phase is executed when food is available, which means that Q > Threshold (0.25). Since food alone does not determine the mating process, as snakes also rely on temperature, another threshold value of 0.6 is set for the temperature and checked accordingly.
If Q (Food Quantity) > Threshold (0.25)
If the temperature > Threshold (0.6) (hot) then
The snakes will move only towards the food, and exploitation is carried out using Eq. (10).
However, if the temperature ≤ Threshold (0.6), the snake will enter either the Fighting Phase or the Mating Phase.
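The threshold logic of Phases 3 and 4 can be sketched as a single dispatcher. The thresholds 0.25 and 0.6 are taken from the text; the phase names are illustrative, and deciding between fighting and mating by a uniform random draw is an assumption from common SO implementations:

```python
def select_phase(q, temp, rand, q_thresh=0.25, temp_thresh=0.6):
    """Return which SO phase applies, given food quantity Q, temperature,
    and a uniform random draw (used for the fight-vs-mate choice)."""
    if q < q_thresh:
        return "exploration"        # Phase 3: no food, search widely
    if temp > temp_thresh:
        return "move-to-food"       # Phase 4 (hot): exploit food position
    # Phase 4 (cold): fight or mate, decided by a random draw (assumed split)
    return "fighting" if rand < 0.5 else "mating"
```

For instance, a snake with Q = 0.4 and Temp = 0.7 moves towards the food, while the same snake at Temp = 0.5 enters the fighting or mating phase.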
Phase 5: Fighting Phase
Male snakes compete for mating rights, often engaging in physical battles to attract females, as represented by Eqs. (11) and (12).
FF and FM are further evaluated using Eqs. (13) and (14).
Phase 6: Mating Phase
Eventually, the female snakes choose one male snake from among the many competitors. Eqs. (15) and (16) depict the mating procedure.
Phase 7: Termination Phase
Upon hatching, the worst male and female snakes are identified and replaced using Eqs. (19) and (20). The algorithm terminates once the predefined number of iterations is reached.
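One plausible implementation of the worst-snake replacement is sketched below, assuming the replaced snake is re-initialized at a random position within the bounds (the exact forms of Eqs. (19) and (20) may differ; names are illustrative):

```python
import numpy as np

def replace_worst(group, fitness, x_min, x_max, rng=None):
    """Re-seed the worst-fitness snake at a random position (assumed scheme)."""
    rng = rng if rng is not None else np.random.default_rng()
    worst = int(np.argmax(fitness))          # worst = largest value (minimization)
    new_group = group.copy()
    new_group[worst] = x_min + rng.random(group.shape[1]) * (x_max - x_min)
    return new_group
```

Applying this to both the male and female groups after each mating cycle injects fresh diversity without disturbing the better-performing snakes.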
3.2 Computational Complexity of Snake Optimizer
The computational complexity of an algorithm defines the amount of time and memory it requires as the input size grows. The key complexity factors in the Snake Optimizer include population size (N), problem dimension (D), number of iterations (T), and fitness-function cost (f(D)). In each iteration, SO performs a fitness evaluation for all snakes, O(N × f(D)); position updates such as movement, mating, and fighting behaviour, O(N × D), i.e., vector operations over D dimensions; and sorting and ranking of snakes, O(N log N). The total time complexity over T iterations is represented using Eq. (21).
In most optimization problems, where fitness evaluation dominates and sorting is negligible or avoided, the time complexity is denoted using Eq. (22).
On the other hand, the space complexity of the SO is determined by the data structures used to store the population and related information, such as the population positions (X) and fitness values, each requiring O(N) memory, along with a temporary position matrix and global-best storage. Hence, the overall space complexity of SO is dominated by population storage and is depicted in Eq. (23).
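Collecting the cost terms described above, the complexity expressions take the following form (a sketch derived term-by-term from the per-iteration costs listed in this section):

```latex
\begin{align*}
\text{Total time (cf. Eq. (21)):}\quad & O\!\big(T \cdot (N \cdot f(D) + N \cdot D + N \log N)\big)\\
\text{Fitness-dominated time (cf. Eq. (22)):}\quad & O\!\big(T \cdot N \cdot f(D)\big)\\
\text{Space (cf. Eq. (23)):}\quad & O(N \cdot D)
\end{align*}
```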
Table 3 presents a comparative analysis of the theoretical time complexity and structural properties of the Snake Optimizer (SO) and several well-established metaheuristic algorithms.
Parameter sensitivity refers to how an algorithm's performance is affected by changes in its control parameters. The performance of the SO is strongly influenced by several core parameters that govern the balance between exploration and exploitation, convergence speed, and solution diversity. The control parameters include Temperature (Temp), Population size (N), Maximum Iterations (T), Energy parameter (E), Food quantity (Q), Male-female ratio, and Threshold parameter (θ).
(a) Temperature (Temp): This crucial parameter controls behavioural mode switching and determines whether a snake fights or mates. A high temperature leads to aggressive exploration, whereas a low temperature indicates stable exploitation. If the temperature drops too quickly, the algorithm tends to miss the global optimum by prioritizing exploitation too early. Therefore, a gradual cooling schedule for this parameter is important, as a slower drop guarantees thorough exploration in the early stages.
(b) Population size (N): Another parameter that is a fundamental part of the SO structure is N, which determines how many candidate solutions will explore the search space and how they are divided equally into male and female groups. A small N leads to faster computation but carries a higher risk of premature convergence. On the other hand, a large N gives better exploration, but the computational overhead increases.
(c) Maximum iterations (T): This parameter controls the total search duration of the SO algorithm and directly affects convergence quality. If T is too small, the algorithm might stop early and miss good solutions. If T is too high, the algorithm keeps running even when there is little or no improvement, resulting in diminishing returns. In practice, the SO algorithm tends to converge faster than many classical metaheuristics, making an excessively high iteration count unnecessary.
(d) Energy parameter (E): This parameter determines how the SO balances searching new areas of the solution space (exploration) with refining the best solutions found so far (exploitation). When the energy is very high, the algorithm's ability to explore a wide range of the search space increases, which helps it discover new and diverse solutions but can slow down convergence. Keeping E too high for too long may cause random movement with little or no progress. On the other hand, if the energy is low, the algorithm might get stuck in local optima. Therefore, poor tuning can result in either an uncontrolled random search or premature convergence.
(e) Food quantity (Q): This parameter represents the availability of resources, thus influencing the attraction of snakes towards promising solutions. Q acts as a threshold-based switch: when it is high (above the threshold), attraction is stronger, and the algorithm shifts towards the best solutions found so far, enhancing exploitation and convergence but reducing diversity if this happens too early. When Q is low (below the threshold), attraction is weaker, enabling the algorithm to explore different areas of the search space (it remains in the exploration phase) and maintaining diversity. However, if Q is always low, the algorithm might explore too much, and a convergence issue might arise. Q should therefore be low at the beginning and gradually increase later in the search, making gradual tuning of Q one of the important requirements in SO.
(f) Male-female ratio: In SO, this parameter determines how many snakes are considered “male” and how many “female”, which affects population diversity and the interaction pattern during the mating and fighting phases. In practice, a balanced 50:50 ratio generally ensures stability and unbiasedness, and thereby maintains good diversity across the population. If the ratio is too unbalanced, the search may become biased towards a certain pattern, reducing exploration in some areas of the search space and potentially leading to slower convergence. The best practice is to keep the ratio balanced and fixed unless the problem being solved specifically requires otherwise.
(g) Threshold parameter (θ): In most implementations of SO, the parameter θ serves as a decision boundary that determines behavior switching, such as from exploration to exploitation or from fighting to mating. If the threshold value is low, behavior switches to exploitation or mating too early, and snakes may focus too heavily on known good solutions, reducing diversity and increasing the risk of premature convergence. On the other hand, if θ is high, exploration and fighting continue for a longer period, which risks slow convergence, wasted iterations, and an unbalanced search. Thus, it is advisable to keep the threshold parameter balanced, ensuring a good trade-off between exploration and exploitation and guaranteeing that SO locates sufficiently good solutions before refining them.
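To make the interplay of these parameters concrete, the sketch below implements the mode-switching logic of the original SO formulation, with the temperature decay exp(-t/T), the food quantity Q = 0.5·exp((t-T)/T), and thresholds of 0.25 and 0.6. The function name `so_mode`, the string labels for each behaviour, and the simplified decision structure are our own illustrative choices, not part of any published implementation.

```python
import numpy as np

def so_mode(t, T, rng, thresh_food=0.25, thresh_temp=0.6):
    """Decide which behaviour the snakes adopt at iteration t (of T total).

    The schedules follow the original SO formulation; the string
    labels are illustrative placeholders for the per-mode updates.
    """
    temp = np.exp(-t / T)          # temperature: decays from 1 towards 0
    q = 0.5 * np.exp((t - T) / T)  # food quantity: grows from ~0.18 to 0.5
    if q < thresh_food:            # little food: global exploration
        return "explore"
    if temp > thresh_temp:         # hot environment: snakes only move to food
        return "exploit"
    # cold environment with enough food: fight or mate
    return "fight" if rng.random() < 0.6 else "mate"

rng = np.random.default_rng(0)
T = 100
modes = [so_mode(t, T, rng) for t in range(T)]
# early iterations favour exploration; later ones exploitation/fight/mate
```

With these illustrative values, roughly the first third of the run is pure exploration (Q below 0.25), the middle portion is food-directed exploitation (temperature above 0.6), and the remainder alternates between fighting and mating, which matches the qualitative behaviour described above.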
Although the Snake Optimizer has been applied to various optimization tasks with effective results, it suffers from a few drawbacks when used in complex or multi-modal search spaces. One major drawback is its tendency toward premature convergence, particularly when dealing with complex, multimodal, or high-dimensional search spaces. This behaviour may cause the algorithm to become trapped in local optima, thereby limiting solution quality. Additionally, the convergence speed of the basic SO algorithm may be slow in later iterations due to insufficient population diversity, which reduces its exploration capability. Moreover, the SO algorithm is quite sensitive to parameter settings and needs careful parameter tuning to perform effectively across different problem domains. Furthermore, the algorithm is not robust in noisy and dynamic situations, and different fitness landscapes may challenge its ability to adapt effectively and sustain high-quality solutions over time. Table 4 highlights the key strengths and limitations of the SO algorithm.

To address the limitations of the original SO algorithm, several research initiatives have focused on developing variants of SO, namely improved, hybrid, binary, and multi-objective SO. These enhanced versions introduce mechanisms such as adaptive parameter adjustment, diversity preservation strategies, and hybridization with other optimization techniques. Such modifications improve the balance between exploration and exploitation, enabling the algorithm to escape local optima and converge more efficiently. Table 5 is tabulated to clearly highlight the differences and characteristics of improved SO algorithms relative to the standard SO.

4 Related Works on Snake Optimizer Variants
Since its introduction in 2022, the SO algorithm has attracted considerable research attention for addressing diverse challenges, leading to a notable body of work using SO. The following subsections explore related studies on its various enhancements and adaptations, along with their application areas. As per the survey, the different variants of SO comprise the improved variant, the hybridized variant, the multi-objective variant, and the binary variant.
An improved variant of the SO is an enhanced version of the original algorithm, developed to polish the algorithm and ensure it is better aligned with the specific features of the problem at hand. Metaheuristic improvements are typically intended to address the inadequacies of the original algorithm, thereby strengthening its capacity to identify solutions that are optimal or close to optimal. These changes may include operator tweaks, parameter tuning, or modifications to the search protocols. Since real-world optimization problems vary in complexity, and a more polished algorithm can produce faster, more accurate, and more reliable solutions across a variety of applications, creating improved versions becomes imperative. The major improvement strategies utilized in the SO literature are depicted in Fig. 10. However, it is noticed that orthogonal learning, the fitness distance balance-based selection mechanism, the Weibull distribution, the local escaping operator, and multi-population strategies have not yet been utilized for the enhancement of SO.

Figure 10: Major improvement strategies for SO as per the literature.
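Several of the improvement strategies in Fig. 10 recur throughout the works surveyed below, most notably chaotic-map initialization, opposition-based learning, and Lévy flight. The sketch below shows generic textbook forms of these three operators; the constants (tent-map breakpoint 0.7, Lévy exponent β = 1.5) and function names are illustrative assumptions, and the individual papers cited below tune and combine them differently.

```python
import numpy as np

def tent_chaotic_init(n, dim, lb, ub, x0=0.37):
    """Population initialization via a tent chaotic map (generic form);
    chaotic sequences tend to cover [0, 1] more evenly than plain rand."""
    pop = np.empty((n, dim))
    x = x0
    for i in range(n):
        for j in range(dim):
            x = x / 0.7 if x < 0.7 else (1.0 - x) / 0.3
            pop[i, j] = lb + x * (ub - lb)
    return pop

def opposition(pop, lb, ub):
    """Opposition-based learning: mirror each solution inside the bounds."""
    return lb + ub - pop

def levy_step(dim, rng, beta=1.5):
    """Lévy-flight step lengths via Mantegna's algorithm."""
    from math import gamma, sin, pi
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)  # heavy-tailed: occasional long jumps
```

A typical usage pattern is to generate both the chaotic population and its opposite, keep the fitter of each pair as the initial swarm, and then add occasional `levy_step` perturbations during the search to escape local optima.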
Wang et al. [36] in their work suggested an enhanced version of SO, named ISO (Improved Snake Optimizer), designed to enhance the convergence ability and stability for solving complex continuous optimization problems. The proposed ISO integrates a chaotic mapping strategy for generating a uniform and randomly distributed initial population, a forced switching mechanism to balance SO’s updating phases and improve convergence speed, a Whale Optimization Algorithm (WOA) variant to enhance exploration, and an optimal domain perturbation strategy to refine solutions near the best results. ISO outperforms classical SO and eight other classical or recent algorithms, showcasing higher convergence efficiency (94/160) and stability (77/160) on CEC-2021 test functions, also achieving 100% convergence efficiency and 96.25% stability in applicable functions. Three real-world continuous optimization problems were used to further confirm its usefulness, and discretization may be used in combinatorial optimization in the future. Zhang et al. [37] propose a multi-strategy Integrated Snake Optimizer (ISO) for solving task scheduling problems in cloud-fog computing environments, where efficient computational resource allocation is crucial to minimizing transmission delay and total costs. To enhance the standard Snake Optimizer, several optimization strategies are integrated. Circle map initialization enhances population diversity as well as traversal, guaranteeing a well-searched space distribution. Synergistic Optimization Strategy further supports the global search by boosting population diversity, while the Elite Reverse Learning Mechanism (ERLM) accelerates convergence, thus improving solution quality by pulling out the best-performing individuals. Furthermore, Evolutionary Population Dynamics presents mutation to additionally expand the search and avoid premature convergence. 
The proposed ISO is compared with the Genetic Algorithm (GA), Wildebeest Herd Optimization (WHO), Grey Wolf Optimizer (GWO), Whale Optimization Algorithm (WOA), Harris Hawks Optimization (HHO), Arithmetic Optimization Algorithm (AOA), and African Vultures Optimization Algorithm (AVOA) on multiple datasets, and the experimental results clearly indicate that ISO demonstrates better convergence, lower costs, improved response speed, and optimized transfer paths, making it a more efficient approach for cloud-fog computing resource allocation. Li et al. [38] introduced the Multi-Task Snake Optimization (MTSO) algorithm, an improved variant of SO designed for Multi-Task Optimization (MTO), in order to address challenges such as low optimization precision and high computational costs. The proposed algorithm operates in two phases: first, independently solving each optimization task, and second, enhancing performance through knowledge transfer between tasks. The transfer mechanism is guided by the probability of knowledge exchange and an elite selection process that allows the algorithm either to integrate superior knowledge from other tasks or to improve solutions through self-perturbation. Experimental results demonstrate that the proposed MTSO outperforms existing MTO algorithms, achieving superior accuracy on benchmark functions and real-world engineering problems, such as planar kinematic arm control, robot gripper optimization, and car side-impact design, proving its effectiveness in handling multiple optimization tasks simultaneously. Song et al. [39] proposed a hybrid algorithm termed the Modified Snake Optimizer (MSO) specifically designed for color image thresholding segmentation, with a primary application in agriculture for early identification, diagnosis, and treatment of rice disease.
MSO integrates numerous techniques: a dynamic adaptive parameter tuning method to speed up convergence by fine-tuning the algorithm’s parameters; an improved global position update formula that incorporates guidance from the optimal individual and random disturbances to balance global and local searches; a Lévy flight introduced in the flight mode to help the optimizer escape local optima; a balancing strategy in the mating mode that enables effective sharing of information among sub-populations; and a hill-climbing jump operation applied to the optimal individual to overcome local stagnation. These integrations jointly boost the algorithm’s performance, resulting in more accurate segmentation outcomes compared to traditional methods and other optimization techniques. Kong et al. [40] proposed an Improved Snake Optimizer (ISO)-based MPPT control strategy, typically designed for renewable energy applications, precisely for optimizing power converters in photovoltaic systems. The proposed method hybridizes the traditional Snake Optimizer with several enhancement techniques, namely a Lévy flight strategy to enhance global search capabilities and quickly bypass local optima, a dynamic exploitation probability to prevent premature convergence, and chaos theory to increase the diversity of snake positions for a more robust search. Each technique considered in the proposed method contributes significantly to refining the tracking performance, thus accelerating convergence, reducing oscillations, and further minimizing power loss under real-world conditions. Experimental results show that the ISO achieves a maximum tracking efficiency of 99.99% and a response time of only 0.08 s, thereby significantly outperforming the conventional SO. An enhanced Snake Optimization algorithm variant, called SO-OBL, that integrates Opposition-Based Learning (OBL) to improve global optimization and multilevel image segmentation was proposed by Houssein et al. [41].
The approach is specially designed for medical imaging applications, specifically accurate liver disease segmentation from CT scans. In this proposed hybridized approach, the traditional SO algorithm is boosted by OBL to improve the search process and avoid local optima, while an optimized multilevel thresholding technique leveraging Otsu’s function is employed to effectively demarcate liver boundaries despite uneven appearance and unclear edges. Experiments on CEC’2022 test functions and segmentation tests demonstrate that the proposed SO-OBL outperforms eleven state-of-the-art metaheuristics, achieving superior metrics, thereby enhancing segmentation performance in the field of medical imaging. Mai et al. [42] in their work proposed an enhanced Snake algorithm, termed ISASO, for accurate parameter identification of photovoltaic (PV) models, basically to optimize the operation and control of PV systems. The proposed ISASO hybridizes the traditional Snake Optimization algorithm with Subtraction Average-Based Optimization (SABO) to improve global search capability by updating agent positions by means of a consistent arithmetic mean, thus avoiding premature convergence towards local optima. In addition, a tent chaotic map is employed for population initialization to improve diversity, while a dynamic learning factor and adaptive inertia weight strategy are combined to accelerate convergence and balance the exploration-exploitation trade-off. Tested on CEC2005 benchmark functions and many PV models, the proposed ISASO exhibits better parameter identification accuracy with the lowest values of Root Mean Square Error (RMSE) and superior performance compared to other metaheuristics. Yang et al. [43] in their work sought to optimize the design of Hybrid Wind-Wave Energy Converters (HWWECs) in order to maximize power output and operating stability. By integrating a V27-225 kW wind turbine with a wave energy converter, the system enhances energy efficiency and reliability.
The work suggests an Enhanced Snake Optimizer (ESO), based on chaotic initialization, asynchronous learning factors, and Lévy flight, to optimize the array configuration of several HWWECs. Simulation outcomes using various configurations (three, six, and twelve buoys) show ESO to be superior to other algorithms, producing the greatest power absorption, with a gain of 144.337 kW compared to the baseline Snake Optimizer. Wang and Wang [44] proposed a modified variant of SO known as the Multi-Strategy Enhanced Snake Optimizer (EMSO), designed to improve Quantitative Structure-Activity Relationship (QSAR) modelling by optimizing hyperparameters and selecting relevant molecular descriptors. It introduces new food emergence and temperature change mechanisms to balance exploration and exploitation, along with a death and regeneration strategy to escape local optima. Moreover, new food searching, fighting, mating, and eating formulas improve search accuracy and efficiency. Compared on benchmark functions, engineering design problems, and feature selection tasks, the model performs better than currently available algorithms. Used in QSAR modelling for predicting environmental toxicity, it achieves improved hyperparameter tuning and selects fewer, more informative features, constituting a valuable tool for improving QSAR models. Nafeh et al. [45] introduce an optimization approach for the sizing of a photovoltaic (PV) battery grid-connected system for electric vehicle fast charging stations in Cairo, Egypt. The major aims are minimizing the total system cost and maintaining energy balance. A new energy management strategy is formulated based on two price models to maximize power flow and find optimal component sizes. Furthermore, a modified Snake Optimization (MSO) algorithm is introduced and simulated in MATLAB to solve the optimization problem. A comparative study with four other meta-heuristic algorithms validates the superior performance of MSO.
A techno-economic analysis also validates the feasibility of the proposed system throughout its lifetime with efficient energy management and optimal cost-effectiveness. Zheng et al. [46] proposed a modified version of the traditional SO, named the Snake Optimizer based on Sobol Sequential Nonlinear Factors and Different Learning Strategies (SNDSO), aimed at solving issues such as vulnerability to local optima, poor convergence in high-dimensional problems, and difficulty in discretized and multi-constraint situations. The proposed algorithm delivers three main enhancements: Sobol sequences to optimize initial population distribution, nonlinear factors employing the inverse tangent function to induce a balance between exploration and exploitation, and learning strategies to promote population diversity and avoid premature convergence. The efficacy of SNDSO is established through extensive testing, including CEC2015 and CEC2017 benchmark functions for optimization in high-dimensional problems, twelve feature selection problems for discretized situations, and five real-world multi-constraint optimization problems. Experimental outcomes affirm SNDSO’s ability to overcome the shortcomings of SO, positioning it as a robust algorithm for sophisticated optimization problems. An Improved Snake Optimizer-based Deployment Method (DMISO) for sea disaster monitoring, suggested by Jin et al. [47], uses an AHP-based hierarchical deployment technique to coordinate sensor positioning with sea disaster spatial distribution. Subsequently, an Improved Snake Optimizer (ISO), reinforced with Opposition-Based Learning (OBL), is utilized to deploy nodes efficiently while maintaining the trade-off between network coverage and cumulative energy cost. Moreover, a Path Allocation Strategy (PAS) is used to redistribute nodes so that energy consumption is evenly distributed throughout the network.
Simulation results show that DMISO decreases energy consumption by 35.46% over the Underwater Fruit Fly Optimization Algorithm (UFOA) while enhancing coverage by 4.77% over UFOA and 17.4% over random deployment. In addition, DMISO balances energy consumption efficiently and prolongs the entire network lifetime, hence making it a stable solution for underwater monitoring purposes. Bao et al. [48] in their work proposed an Improved Binary Snake Optimizer (IBSO) that extends the existing Snake Optimizer to binary optimization, particularly for the purpose of feature selection. The proposed algorithm incorporates Hamming distance and a Gaussian-based Mutation Transfer Function (MTF), thus enhancing search diversity, local exploration, and computational efficiency. Tested on approximately 27 benchmark datasets, IBSO outperforms other binary swarm intelligence algorithms, achieving better accuracy and lower computational cost. Its effectiveness is further validated against different SO variants, proving its superiority in high-dimensional data selection and machine learning tasks. Sarkhi and Koyuncu [49] introduced an algorithm called the Energy Serpent Optimizer (ESO), an enhanced version of the Snake Optimizer, typically intended to improve Deep Reinforcement Learning (DRL) models for game AI, especially in dynamic environments such as Pac-Man. Through the optimization of hyperparameters, the introduced ESO improves AI adaptability, response time, and efficiency, thereby making DRL models more efficient in real-time gaming. It refines the entire learning process, enabling faster convergence and better decision-making. This combination of metaheuristic optimization with DRL not only improves gaming AI but also has potential in autonomous systems and robotics. Zheng et al.
[50] present an improved variant of the Snake Optimization Algorithm (ISO) that integrates multiple strategies, namely Sobol initialization, Lévy flight, adaptive step size, and Brownian motion, to enhance exploration, accelerate convergence, and avoid local optima. Tests on benchmark functions show better stability, faster convergence, and stronger global optimization than SO. ISO also achieves superior results in engineering applications such as UAV and robot path planning, wireless sensor deployment, and pressure vessel design, demonstrating strong robustness and optimization capability. Liu et al. [51] in their research presented an enhanced Snake Optimization Algorithm (SOEA) to overcome the deficiencies of the original Snake Optimization Algorithm. The developed SOEA improves performance through the use of an elite opposition-based learning strategy, which optimizes population placement to achieve optimal global search efficacy and iteration speed. Moreover, an adaptive threshold technique is utilized, which enhances local search power and speeds up convergence. Experimental comparisons emphasize that the introduced SOEA considerably outperforms SOA, reducing iterations to 34% and achieving a lower Mean Squared Error. The usefulness of SOEA is further ensured through its usage in path planning for multiple UAVs, proving the algorithm’s capability in handling high-dimensional and nonlinear optimization problems. Liao and Wang [52] improve Time-Difference-of-Arrival (TDOA) localization with an Improved Snake Optimization algorithm to solve problems in accuracy and convergence speed. The enhanced SO combines a chaotic system for multi-purpose initialization, a better exploration strategy for accelerated early-stage convergence, an adaptive evolutionary threshold to deal with variations in noise, and a snake oviposition strategy based on genetic principles for precise search performance.
Simulations validate that this method enhances localization accuracy and robustness and is worth using in wireless sensor networks, surveillance systems, radar, and underwater acoustic localization. Yang et al. [53] proposed a Snake Optimization (MSSO)-based routing protocol for Wireless Sensor Networks (WSN), employing Fuzzy C-Means (FCM) to competently select the cluster head and the Minimum Spanning Tree (MST) to route amongst clusters. The proposed MSSO enhances traditional SO with the help of dynamic updating of parameters, adaptive alpha mutation, and bi-directional search with an increase in convergence speed and efficiency. The new protocol reduces energy usage (26.64%), increases network lifespan (25.84%), stabilizes the clusters (52.43%), and increases throughput (40.99%) as opposed to previous methods, hence becoming very efficient for energy-saving WSN routing. An improved variant of SO for cloud task scheduling was proposed by Damera et al. [54]. In the suggested approach, SO is enhanced using sine chaos mapping, spiral search method, and dynamic adaptive weights, enhancing global search ability and avoiding local optima. The findings from the experiment show that the algorithm performs better than the existing methods by effectively optimizing Quality-of-Service (QoS) parameters such as make span and energy efficiency, with performance gains of 6%, 4.6%, and 3.27% compared to current methods, making it very efficient for dynamic cloud environments. Zhao and Zhou [55] in their paper offer a multi-step short-term wind power forecasting model combining Complementary Ensemble Empirical Mode Decomposition (CEEMD), an Improved Snake Optimization Algorithm (ISCASO), and Kernel Extreme Learning Machine (KELM) to improve forecasting performance. CEEMD separates complicated wind power data into smoother parts, eliminating instability. ISCASO tunes the KELM parameters, enhancing predictive accuracy. 
By integrating individual component forecasts, the final model attains better accuracy in the forecast of short-term wind power trends, outperforming present models. Zhang et al. [56] introduced an Improved Snake Optimizer (ISO) to optimize task offloading and resource scheduling in Ultra-Dense Edge Computing (UDEC) networks. The Power Allocation (PA) issue of the UDEC network is addressed using a quasi-convex method that reduces energy consumption, while the Joint Request Offloading and Resource Scheduling (JRORS) issue is represented as a mixed-integer nonlinear programming problem to reduce request delay while enhancing welfare. The ISO incorporates an oscillation term to optimize weight parameter updates and enhance search efficiency. Simulation results indicate that the presented approach surpasses DE, PSO, ALO, BOA, and GWO in terms of minimum energy consumption and maximum welfare and is, therefore, an economical alternative for edge computing task management. Kong and Liu [57] in their research introduced an Improved Snake Optimization (ISO) algorithm to optimize the melting point of zinc smelting slag, which is an important factor influencing the smelting efficiency and quality of products. A predictive model based on CatBoost is created, whose parameters are optimized with the Tree-structured Parzen Estimator (TPE) to provide higher accuracy. The suggested ISO algorithm is then used to calculate optimal production parameters, thereby achieving an average melting point reduction of 65°C. Experimental verification on actual smelter data in Shaanxi, China, validates the superiority of this method over conventional methods, with the TPE-optimized CatBoost model achieving an R2 of 93.89%. Ghamari et al. [58] proposed a technique that combined Reinforcement Learning (RL) and Snake Optimization (SO) to achieve optimized gain initialization for an adaptive PI controller for DC/DC Boost converters.
The proposed method improved robustness, real-time adaptability, and frequency matching utilizing the Deterministic Policy Gradient (DPG) technique while avoiding RL’s exploration limitation. The SO algorithm enhances initial parameter choice, resulting in accelerated convergence and dependable performance. Experimental verification on a hardware platform verifies higher control efficiency. Mai et al. [59] introduce a dual-layer MPPT control technology using a dynamic adaptive Snake Optimization Algorithm (ISO), and variable-step perturb & observe (IP&O) to effectively monitor maximum power in photovoltaic systems under partial shading and dynamic weather. The hybrid method utilizes the global search of ISO to prevent local optima and IP&O accuracy for high-speed convergence, with 99.91% average tracking efficiency (simulation) and 99.68% (experimental) at speeds of 0.07 and 0.66 s, respectively, being faster, more stable, and more adaptive to shading compared to traditional methods (P&O, IC). Verified on an HIL+RCP platform, the technique reduces power oscillations and performs exceptionally well in real-world partial shading conditions, providing a robust solution for complex PV system conditions. Zhou et al. [60] developed an adaptive switching control scheme with anti-disturbance attitude for quadrotor Unmanned Aerial Vehicles (UAVs) based on an Improved Snake Optimizer (ISO) to improve performance with measurement noise. The control scheme, or the Adaptive Switching Disturbance Rejection Controller (AWDRC), integrates linear active disturbance rejection control with an adaptive switching extended state observer to effectively reconstruct noisy signals. The ISO algorithm uses quadratic interpolation and holistic learning methods to improve the quadrotor’s attitude controller parameters. 
Experimental verification using state-of-the-art meta-heuristic algorithms confirms the efficacy of the proposed ISO-based AWDRC in quadrotor attitude tracking control and presents its robustness using Monte Carlo experiments. Wang et al. [61] recommended a short-term photovoltaic power forecasting technique using a combination of K-means clustering, an enhanced Snake Optimization (ISO) algorithm, and a CNN-Bidirectional Long Short-Term Memory (BiLSTM) network. The K-means algorithm is employed for weather classification; model parameter optimization is improved using ISO. The proposed method provides high regression coefficients of 0.99216 for sunny, 0.95772 for cloudy, and 0.93163 for rainy days, providing better prediction precision and flexibility across different weather conditions and supporting renewable energy management. Zhi et al. [62] proposed an Improved Snake Optimization (I-SO) algorithm to optimize the parameters of Active Disturbance Rejection Control (ADRC), which uses chaotic elite opposition learning and sine-cosine search for diverse initial solutions and enhanced local optimization, respectively. Further, the experimental results highlight that the proposed I-SO is efficient in finding ideal parameters, achieving optical path scanning stability of 99.2%. Singh and Kaushik [63] suggested a technique that combines anisotropic filtering, segmentation using Mask R-CNN, and an improved Adaptive Snake Optimizer (ASO) with SqueezeNet for leaf classification and a deep Q-network for disease detection. ASO increases adaptability, which enhances accuracy (0.924 for classification, 0.919 for detection). Agrawal and Mahapatra [64] proposed a novel Meta-Heuristic Snake Optimization Algorithm, an improved variant of SO, designed to enhance exploration and exploitation while optimizing PSS, TCSC, and their coordinated control.
The coordinated model achieves superior stability, with oscillations settling in under 2 s and a 99.30% damping ratio. To optimize the SVM’s parameters, an Improved Snake Optimizer (ISO) is proposed by Lu et al. [65] to address limitations of the original Snake Optimization Algorithm, such as weak population initialization, slow early convergence, and local optima trapping. The proposed ISO combines Mirror Opposition-Based Learning (MOBL) to enhance population diversity, a Novel Evolutionary Population Dynamics (NEPD) model for precise searching, and a Differential Evolution Strategy (DES) to prevent getting stuck in local optima. Further, the experiments performed on numerous benchmark functions and CEC2022 datasets validate ISO’s better optimization accuracy and speed. Zhang et al. [66] presented an Improved Snake Optimization (ISO) algorithm for beamforming design in sparse conformal arrays, to boost array sparsity and optimize radiation patterns. The proposed ISO integrates Sobol sequences for better population initialization, Cauchy mutation to escape local optima, and a nonlinear time-varying factor inspired by the Whale Algorithm, along with a modified flag control function to improve global exploration. Simulations performed on a sparse cylindrical conformal array demonstrate 50% array sparsity, lower sidelobe levels, and faster convergence: 10% faster than SO and 46% faster than GA. Kaliraj et al. [67] offered a metaheuristic method based on reinforcement learning called Snake Swarm Optimization (SSO) for actual resource allocation and offloading. The technique integrates operator data with IoT infrastructure and employs neural caching to maximize execution efficiency. Optimal resource allocation through a cost mapping table and an incentive factor ensures efficient distribution among edge users.
Performance is measured based on criteria such as delivery ratio, energy expenditure, throughput, and delay, with results proving to be better than conventional optimization methods like Gray Wolf Optimization (GWO), Ant Colony Optimization, and Genetic Algorithms. Mohammed et al. [68] proposed an Improved Snake Optimizer Algorithm (ISO) for fast and accurate Maximum Power Point Tracking (MPPT). The proposed strategy differentiates between Uniform Shading Conditions (USCs) and Partial Shading Conditions (PSCs), avoiding unnecessary searches and guaranteeing rapid tracking. Additionally, a method for detecting load variations with high convergence speed is introduced. The proposed technique was experimentally confirmed using a buck–boost converter with a sampling time of 0.05 s, achieving 99.86% efficiency and an average tracking time of less than 0.75 s under varying weather conditions. Comparative analysis against recent metaheuristics confirms its superior performance. Zheng et al. [69] proposed a Compact Snake Optimization (cSO) algorithm that incorporates a compact strategy into the standard Snake Optimization algorithm, making it more efficient in situations with limited computing and memory resources. Performance evaluation across 28 test functions from CEC2013 confirms that cSO outperforms existing intelligent computing algorithms. Furthermore, integrating cSO with WKNN and Received Signal Strength Indicator (RSSI) positioning significantly reduces localization errors, signifying its effectiveness in refining indoor positioning accuracy. Braik et al. [70] proposed three improved adaptive variants of the Snake Optimizer, namely Exponential SO (ESO), Power SO (PSO), and delayed S-shaped SO (SSO), to address the optimizer’s tendency to get trapped in local optima and to enhance the search performance. These adaptive models efficiently maintain the balance between exploration and exploitation, guaranteeing more efficient feature selection.
Their binary versions were developed for Feature Selection (FS) tasks using the K-Nearest Neighbor (KNN) classifier and tested on 24 datasets. Experimental results demonstrated that the Binary Power Snake Optimizer (BPSO) outperformed competing FS methods across multiple evaluation metrics, including classification accuracy, sensitivity, specificity, and fitness scores, achieving over 90% accuracy in several datasets. Khurma et al. [71] in their work integrated a logarithm operator with SO to enhance exploitation through a cosine-based separation metric. Furthermore, three selection mechanisms, namely the Tournament Logarithmic Snake Optimizer (TLSO), Proportional Logarithmic Snake Optimizer (PLSO), and Linear Order Logarithmic Snake Optimizer (LLSO), are introduced to improve the exploration phase. Experiments performed over 22 medical datasets demonstrated that TLSO achieved the highest accuracy in 86% of cases and the best feature reduction in 82%, proving its reliability and efficiency in optimizing feature selection for medical diagnosis. Yıldızdan [72] presented an enhanced version of the Snake Optimizer that incorporates chaotic maps into its parameter selection instead of random values, creating Chaotic Snake Optimizer (CSO) variants based on four different chaotic mappings. The performance of this proposed variant was evaluated across classical as well as CEC2019 test functions, which highlighted noteworthy improvements in SO’s efficiency. The proposed CSO attained higher mean values in most of the cases and, based on the analysis, ranked second overall among competing algorithms, proving to be, if not the best, a promising optimization method. An improved version of the Snake Optimizer, known as the Mixed Strategy Snake Optimizer (MSSO), was introduced by Qi et al. [73]; it employs a dual inverse learning strategy using Tent chaotic mapping and mirror imaging learning.
In addition, a spiral foraging strategy is applied to enhance population diversity and avoid premature convergence. Further, a sinusoidal exponential food index is introduced to replace the original algorithm's limited benchmark, thereby boosting global search efficiency. The proposed MSSO is tested on eight benchmark functions and compared with several metaheuristics, including the Whale, Gray Wolf, Grasshopper, Harris Hawk, and Pelican algorithms. The experimental results highlighted that MSSO outperforms these algorithms in terms of convergence speed and search efficiency. Belabbes et al. [74] applied the Snake Optimizer metaheuristic to parameter estimation of three different PV cells: monocrystalline silicon, amorphous silicon, and RTC France, with one-diode and two-diode models. Though the default SO algorithm is effective, an improved version achieves higher accuracy, decreasing root mean square error (RMSE) by up to 16% and significantly reducing computing time. The enhanced SO is highly accurate with smaller population sizes and fewer iterations, making it one of the most effective techniques for PV parameter extraction. Yao et al. [75] introduced the Enhanced Snake Optimizer (ESO), which integrates an opposition-based learning strategy and dynamic update mechanisms, including parameter adjustments, sine–cosine composite perturbation factors, and Tent-chaos and Cauchy mutation. The usefulness of ESO has been verified with 23 classic benchmark functions, the CEC 2019 function set, and four real-world engineering design problems. The experimental results, along with statistical analyses, show that ESO outperforms 13 other current algorithms, including SO, demonstrating its strong optimization capability. Wang et al. [76] proposed a Multi-Strategy Snake Optimizer (MSO) incorporating opposition-based learning, an adaptive weight factor, and a Lévy flight strategy.
Testing on 23 benchmark functions confirms MSO's superior accuracy and convergence speed. Among 11 optimization algorithms evaluated for HESS capacity configuration, MSO demonstrated the best performance, reducing LCC by 4.5% while ensuring reliable power supply even during wind turbine and PV unit failures. Li et al. [77] developed a Snake Optimizer that incorporates a dual mutation mechanism with the intention of automating the scheduling of workflows within cloud environments. The algorithm attempts to minimize the makespan while observing the contractual budget limitations defined by the user. SO is typically designed for continuous optimization problems, but this research converts it into a discrete version more appropriate for workflow scheduling by implementing a task-execution-order-aware fitness function that reduces waiting times for dependent tasks. To further improve SO's performance, the parameter control strategies are changed in different stages of evolution for the problem at hand. A dual mutation mechanism is proposed in which "snakes" that do not improve undergo bit mutations to ensure diversity and greater exploration of potential solutions. The algorithm was validated on real-world scientific workflows and showed greater efficacy in finding feasible solutions in less time compared to other available approaches. Yan and Razmjooy [78] presented a new approach for lung cancer detection using CNNs with lung CT images, where preprocessing is performed prior to analysis. A variant of the Snake Optimizer, termed iSO, is also introduced to improve CNN performance. The model is validated on the IQ-OTH/NCCD Lung Cancer Dataset, and the results are compared with other existing methods to demonstrate higher accuracy and effectiveness. Li et al. [79] presented a variable step multiscale single threshold SloEn (VSM-StSloEn) that enhances complexity analysis at several time scales while streamlining the calculations.
In addition, a snake optimization-based VSM-StSloEn (SO-VSM-StSloEn) is proposed for better threshold selection, ensuring greater robustness and accuracy. The SO-VSM-StSloEn has been experimentally validated on simulated and real data, proving its insensitivity to signal length and threshold variations while outperforming existing entropy-based methods in classifying different signal types. Li and Ye [80] developed a Snake Optimizer-based multi-level image segmentation method that combines SO with an enhanced version of Otsu's thresholding technique, named SO-Otsu. SO-Otsu extracts important regions from TPD images, improving diagnostic accuracy and efficiency. The authors compare their results with five other optimization algorithms: the Fruit Fly Optimization Algorithm, Sparrow Search Algorithm, Grey Wolf Optimizer, Whale Optimization Algorithm, and Harris Hawks Optimization. Experimental results conclude that SO-Otsu surpasses the other methods in speed, detail extraction, fidelity, and efficiency in TPD image segmentation. Shi et al. [81] address the issues of multi-AGV scheduling in automated electric meter verification workshops by introducing an Improved Snake Optimization Algorithm (ISOA). The model seeks to optimize completion time and charging expenses with collision-avoidance constraints incorporated. The algorithm applies AGV-order-address three-level mapping for task assignment and sorting, uses an improved A* algorithm for shortest pathfinding and conflict avoidance, and uses large neighborhood search to reduce charging costs. Simulations with real data indicate a 16.4% improvement in completion time and a 60.3% reduction in charging occurrences compared to First-In First-Out (FIFO) scheduling.
The approach outperforms alternative state-of-the-art algorithms, providing evidence of its usefulness in optimizing multi-AGV scheduling. Khurma et al. [82] suggested an enhanced Snake Optimizer by alleviating its tendency towards highly fit solutions, which may limit search space diversity. Inspired by Evolutionary Algorithms (EA) and the Darwinian "survival of the fittest" concept, the paper introduces novel selection operators to replace the standard global best operator in SO. The suggested variations are SO-Roulette Wheel, SO-Tournament, SO-Linear Rank, and SO-Exponential Rank, all intended to enhance solution diversity and performance. These variants were compared experimentally to determine their efficacy and to highlight the pivotal role played by selection mechanisms in enhancing the efficiency of SO. Further, parameter analysis was performed to determine their effect on optimization performance. Dai et al. [83] present improved thermal error prediction for motorized spindles via a Kernel Extreme Learning Machine (KELM) optimized using the Snake Optimization Algorithm. The ANSYS simulation software is used to model the spindle's thermal properties and the temperature field distribution. Axial temperature and thermal displacement data are acquired on a purpose-built experimental platform. The FCM algorithm and grey relation analysis are utilized to improve feature selection, lowering the number of measurement points from ten to four. The SO-KELM model is put forward and compared to the KELM and PSO-KELM models, proving to be more accurate and stable in thermal error forecasting. Liu et al. [84] presented an Adaptive Chaotic Gaussian Variant Snake Optimization Algorithm (ACGSOA).
The technique includes a chaotic operator to improve the position parameter and accelerate convergence, and an adaptive Gaussian variation operator to keep the algorithm from becoming stuck in local optima. Comparative simulations against SO, WOA, ABC, and FOA show that ACGSOA converges quickly and enhances coverage rates by 7.20%, 7.32%, 7.96%, and 11.03%, respectively, solidifying its superiority in the optimization of SEMWSN deployment.
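A recurring ingredient in the improved variants above (e.g., CSO [72], MSSO [73], ACGSOA [84]) is the replacement of SO's uniform random parameter draws with a chaotic sequence to improve coverage of the search space. The following is a minimal, illustrative sketch using the tent map; the function names and the choice of map parameter are ours, not taken from any cited paper:

```python
def tent_map(x, mu=1.99):
    """One step of the tent chaotic map; mu slightly below 2 avoids
    floating-point collapse of the orbit to zero."""
    return mu * x if x < 0.5 else mu * (1.0 - x)

def chaotic_sequence(x0, n, mu=1.99):
    """Generate n chaotic values in [0, 1), usable in place of uniform draws."""
    seq, x = [], x0
    for _ in range(n):
        x = tent_map(x, mu)
        seq.append(x)
    return seq

# Drive an SO control parameter in [lb, ub] with chaos instead of a RNG
lb, ub = -5.0, 5.0
vals = [lb + c * (ub - lb) for c in chaotic_sequence(0.37, 5)]
```

The chaotic orbit is deterministic yet ergodic, so repeated runs explore the parameter range without the clustering that short pseudo-random sequences can exhibit.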
A hybridized variant combines the strengths of multiple algorithms with the aim of improving performance. A hybridized SO is capable of yielding better results as it improves exploration and exploitation and prevents premature convergence, thereby leading to better solutions and greater efficiency. By integrating with other complementary algorithms and strategies, a hybrid variant can adapt to complex problem landscapes more effectively, leading to more optimal and reliable outcomes. Guan et al. [85] introduced a hybrid model, SO-VMD-LSTM (Snake Optimization-Variational Mode Decomposition-Long Short-Term Memory), for ultra-short-term forecasting of Electric Vehicle (EV) charging demand to support efficient power grid management. VMD breaks down the past data into several Intrinsic Mode Functions (IMFs) to decrease complexity, and SO automatically optimizes VMD parameters for improved accuracy and efficiency. LSTM networks subsequently forecast loads from these IMFs, and their results are aggregated for the final prediction. This strategy strengthens feature extraction, eliminates noise, and notably increases prediction accuracy, decreasing RMSE and MAE by 30.1% and 32.9% over VMD-LSTM, and 59.3% and 62.6% over the baseline LSTM technique. Puri et al. [86] introduced a hybrid Reptile Search Algorithm (RSA) and Snake Optimizer for EEG channel selection, called RSO, to detect Alzheimer's Disease (AD) and Mild Cognitive Impairment (MCI). Because multi-channel EEG data introduces redundancy and complexity, channel selection is optimized by RSO to improve classification accuracy. RSA searches the space globally, while SO fine-tunes the feature selection. EMD, LCOWFB, VMD, and DWT decomposition methods are employed to analyze EEG sub-bands. RSO performed better than standalone metaheuristics and existing AD detection approaches when applied to two publicly available AD EEG datasets.
It demonstrated high accuracy with reduced channels: optimal binary classification with 4 out of 16 channels (EMD) and 89.99% accuracy for three-class classification with 7 out of 19 channels (LCOWFB). Bölükbaş et al. [87] presented a hybrid method known as Scatter Search Snake Optimization (SSSO), intended to identify the most relevant features and minimize redundancy without compromising accuracy. While the Snake Optimizer enhances the balance between exploration and exploitation, Scatter Search (SS) is good for global search but can become stuck in local optima. SSSO guarantees a better balance between local and global searches by combining SS for structured solution creation with SO for improved search capacity. Tested on UCI Machine Learning and epileptic disease classification datasets, the strategy outperforms popular optimization strategies in feature selection, exhibiting a robust metaheuristic approach that improves accuracy with fewer features. The paper by Fan et al. [88] presents a novel Snake Optimization–Tangential Functional Link Artificial Neural Network (ASO-TFLANN) model to improve the linear range of the Linear Variable Differential Transformer (LVDT), a core device for vibration noise measurement and active vibration isolation. The ASO component is an enhanced form of the Snake Optimization Algorithm that combines Latin hypercube sampling and the Lévy flight technique to improve global search ability and maintain population diversity, thereby avoiding the overfitting and local optimality problems often found with gradient descent. A voltage–displacement test bench is constructed to measure input and output values under different excitation conditions, which are utilized by the TFLANN to calculate the weight vectors for precise LVDT output mapping.
Simulation and online testing prove that this hybrid model lowers error significantly, increases the LVDT's linear operating range, and improves overall measurement precision, thus creating a robust approach for enhanced measurement quality. Duraibi [89] introduced a hybrid framework that integrates the Snake Optimization Algorithm with deep learning for image-based malware classification (SODCNN-IMC). Its primary goal is to accurately detect and classify malware images, such as binary representations or screenshots, thereby enhancing cybersecurity defences against evolving threats. In this setup, ShuffleNet is used to extract feature vectors from malware images, and the Snake Optimization Algorithm optimizes the hyperparameters of ShuffleNet to ensure better feature extraction. Moreover, an attention-based bi-directional long short-term memory (LSTM) model identifies and classifies the malware based on the optimized features. Evaluated on the Malimg malware dataset, the combined approach substantially enhances classification accuracy, with the highest accuracy reported at 98.42%, proving that it is capable of improving the malware detection effectiveness of current models. Alawad et al. [90] proposed a Hybrid Snake Optimizer Algorithm (HSOA) aimed at enhancing population diversity and search effectiveness while avoiding premature convergence. It comprises two primary techniques, namely Oppositional-Mutual Learning for better initialization and Dynamic Polynomial Mutation to improve searchability during optimization. The proposed HSOA is applied to the Economic Load Dispatch (ELD) problem with Valve Point Effects (VPE), a non-convex, complex optimization problem in power systems. Further, it is compared to 47 optimization algorithms on five real-world ELD instances, and HSOA ranks at the top in many scenarios, proving its efficiency.
Further evaluations using the IEEE-CEC 2014 benchmark functions confirm its competitiveness among metaheuristics. HSOA is a hybrid variant of SO, integrating opposition-based learning and mutation techniques to enhance its optimization capabilities. Ersali et al. [91] proposed a novel hybrid SO algorithm, referred to as the Opposition-Based Snake Optimizer with Pattern Search (OSOPS). OSOPS improves the Snake Optimizer by incorporating Opposition-Based Learning (OBL) and Pattern Search (PS), thus improving exploration and exploitation abilities. In addition, a crossover frequency constraint is proposed to mitigate high-frequency noise and ensure reliable performance under disturbances. The effectiveness of OSOPS is demonstrated using statistical box plot analysis and convergence performance comparisons with the baseline SO algorithm. The experimental outcomes emphasized that the proposed OSOPS-based system exhibits quicker rise times (14.21% improvement over SO, 32.10% improvement over PP), quicker settling times (15.38% improvement over SO, 84.95% improvement over PP), and greater bandwidths (18.74% improvement over PP, 17.03% improvement over SO). Alkahtani et al. [92] introduced a Hybrid Snake Optimizer-based Route Selection Approach for Unmanned Aerial Vehicles Communication (HSO-RSAUAVC) to increase UAV deployment effectiveness. The designed HSO-RSAUAVC method blends the Snake Optimizer with Bernoulli Chaotic Mapping and Lévy Flight (LF) to optimize route selection in UAV communication. Additionally, a fitness function involving residual energy, distance, and UAV degree is used to dynamically adjust UAV routes, reduce communication interference, and optimize energy consumption. Simulation findings revealed that the proposed method achieves significant improvements in UAV communication performance and reliability compared with current approaches. Jithendra et al.
[93] introduced a hybrid Adaptive Neuro-Fuzzy Inference System (ANFIS) framework enhanced with the Snake Optimizer and Induced Ordered Weighted Average (IOWA) for hourly atmospheric pressure prediction. The proposed IOWA-ANFIS-SO model effectively handles dimensionality issues and optimizes ANFIS performance by incorporating meteorological factors such as air temperature, sea surface temperature, wind speed, and wind direction. Using observations from Ireland's weather buoy network stations, the model was trained (70%) and validated (30%) to ensure reliability. The IOWA-ANFIS-SO model performed much better than conventional IOWA-ANFIS and ANFIS-SO models, with RMSE (0.4698), MAE (0.3593), MAPE (0.0003), and R2 (0.9903) at the best alpha values. Amor et al. [94] designed a hybrid Grey Theory-Snake Optimizer (GT-SO) for improving the machinability (cutting force) and surface roughness of nTiO2-GFRPC. By incorporating grey theory to reduce multiple responses to a single objective function and the Snake Optimizer to perform the optimization, the work derives the optimal machining parameters. Validation confirms enhanced output performance, and comparison with other metaheuristics identifies GT-SO's superiority in cutting, milling, and shaping composite materials, which renders it a useful tool in manufacturing and materials engineering. Fan et al. [95] proposed a hybrid Advanced Snake Optimizer-Linear Quadratic Regulator (ASO-LQR) for optimizing vehicle active suspension systems through enhanced LQR weight coefficient tuning. The proposed ASO enhances the Snake Optimizer with Genetic Algorithm crossover-mutation, Lévy Flight perturbation, and adaptive oscillating weights to ensure improved convergence and precision.
Further, simulation results for different road conditions indicate that the proposed ASO-LQR improves vibration damping performance over SO, GA, and APSO, rendering it highly effective in automotive suspension control.
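Lévy flight perturbation, used by several of the hybrids above (e.g., HSO-RSAUAVC [92], ASO-LQR [95], ASO-TFLANN [88]), draws heavy-tailed step lengths so that occasional long jumps can escape local optima. A common way to sample such steps is Mantegna's algorithm, sketched below in a generic form that does not reproduce the exact formulation of any cited paper:

```python
import math
import random

def levy_step(beta=1.5):
    """One Lévy-distributed step length via Mantegna's algorithm."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
               (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma_u)   # numerator: Gaussian with tuned variance
    v = random.gauss(0.0, 1.0)       # denominator: standard Gaussian
    return u / abs(v) ** (1 / beta)

# Perturb a candidate position; the 0.01 factor scales jumps to the range
pos = [0.5, -1.2, 3.0]
new_pos = [x + 0.01 * levy_step() for x in pos]
```

Most steps are small (local refinement), but the heavy tail guarantees rare large displacements, which is exactly the exploration boost these hybrids seek.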
Wang et al. [96] proposed an energy-aware multi-hop routing protocol, ESO-GJO, for Wireless Sensor Networks (WSNs) to improve network lifetime and save energy. The protocol combines an Enhanced Snake Optimizer (ESO) with Golden Jackal Optimization (GJO). ESO enhances the Snake Optimizer using a Brownian motion function within the exploitation step and employs variables such as Cluster Head (CH) energy, node degree, and distance from the Base Station (BS) to find the best CHs. GJO is subsequently applied to create multi-hop routing structures between CHs and the BS. Simulations verify that ESO-GJO significantly surpasses currently used protocols such as LSA, LEACH-IACA, and LEACH-ANT in energy efficiency and network duration. Agrawal et al. [97] introduced the Advanced Heffron-Phillips Model (AHPM), implemented through a higher-order Synchronous Generator Model 1.1 with ten K-Constants, which improves power system stability. The proposed model combines the Snake Optimization Algorithm and Linear Quadratic Regulator (LQR) for optimal control. AHPM achieves a 99.98% damping ratio and a settling time between 1.5 and 2.0 s, providing fast and robust stability. Ablin and Prabin [98] suggested a Gated Graph Attention-based Crossover Snake (GGA-CS) algorithm that uses Graph Neural Networks (GNN) to capture spectral-spatial connections and a gated attention mechanism to amplify spectral bands. A crossover Snake Optimization Algorithm tunes parameters and enhances classification precision. As SOA is paired with GNN and attention mechanisms, it is a hybridized variant. The approach is validated on the Indian Pines, University of Pavia, and Salinas datasets and is hence beneficial for remote sensing, environmental monitoring, precision agriculture, and land cover classification. Al-Qazzaz et al.
[99] introduced a Snake-optimized Xception model that uses the Xception deep CNN for effective feature extraction, with the Snake Optimization Algorithm dynamically adapting parameters for enhanced search-space exploration, to defend against highly realistic forgeries used in fraud, misinformation, and identity crimes. This hybridized SOA version raises DeepFake detection to a 98.53% mean accuracy, with improved precision, recall, and F1-score over the classic Xception as well as other mechanisms. Validated on several DeepFake datasets, the model substantially minimizes misdetections and false positives, which is extremely beneficial for digital media forensics, media verification, legal evidence authentication, and prevention of fake news and identity fraud. Chen et al. [100] investigate pavement performance model development through the use of ensemble machine learning methods for forecasting pavement distress. Employing track data from the Research Institute of the Highway Ministry of Transport, a comparison of predictive models such as Random Forest (RF), Extreme Gradient Boosting (XGB), Snake Optimizer-Random Forest (SO-RF), and Snake Optimizer-Extreme Gradient Boosting (SO-XGB) was conducted. Reliable evaluation was ensured by using repeated K-Fold cross-validation. Among the models, SO-XGB showed higher accuracy, especially in rutting depth prediction, with better performance on various performance indices such as R2, MAE, RMSE, MAPE, and PI. The results affirm that SO-XGB is a good and efficient tool for automated forecasting of pavement deterioration and present a strong solution for pavement performance evaluation. Aljebreen et al. [101] suggested improving automated DDoS attack detection through a Snake Optimizer with Ensemble Learning (DDAD-SOEL) technique. The Snake Optimizer is used for feature selection (FS) to reduce the size of the data and to improve model effectiveness.
A blend of deep learning algorithms, including Long Short-Term Memory (LSTM), Bidirectional LSTM (BiLSTM), and Deep Belief Network (DBN), is applied for classification, while model parameters are optimized by the Adadelta optimizer. Simulation on benchmark datasets confirms that DDAD-SOEL is superior to existing models in detecting malicious activities with greater accuracy and effectiveness. Wang et al. [102] proposed a new Ultra-Wide-Band (UWB) location system combined with the Snake Optimizer Long Short-Term Memory (SO-LSTM) architecture. It is a multi-base station and multi-tag UWB configuration, wherein Time Division Multiple Access (TDMA) and Two-Way Ranging (TWR) are utilized to improve real-time distance estimation. The hyperparameters of the LSTM are optimized by the Snake Optimizer, enhancing positioning estimation accuracy. Experimental results verify that the SO-LSTM model performs much better than traditional methods such as the Least Squares (LS) method and the Kalman Filter (KF), lowering root mean square error (RMSE) by as much as 63.39% and maximum positioning error (MPE) by 60.77%, illustrating its higher accuracy and reliability.
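A pattern that recurs across the hybrids above (SO-KELM [83], SO-XGB [100], SODCNN-IMC [89], SO-LSTM [102]) is wrapper-style hyperparameter tuning: each candidate position is decoded into hyperparameters, a model is trained and validated, and the validation error becomes the fitness the optimizer minimizes. The sketch below is optimizer-agnostic and uses a toy quadratic stand-in for the train-and-validate step; the function names, bounds, and the random-search placeholder are our illustrative assumptions, not details from the cited papers:

```python
import random

def decode(position):
    """Map a continuous position in [0, 1)^2 to model hyperparameters."""
    lr = 10 ** (-4 + 3 * position[0])     # learning rate in [1e-4, 1e-1)
    units = int(16 + position[1] * 112)   # hidden units in [16, 128)
    return lr, units

def fitness(position):
    """Validation error of a (toy) model trained with the decoded settings."""
    lr, units = decode(position)
    # Stand-in for train-and-validate; real code would fit e.g. an LSTM here
    return (lr - 0.01) ** 2 + (units - 64) ** 2 * 1e-5

# Any population-based optimizer (SO included) can minimize `fitness`;
# random search is shown purely as a placeholder for the metaheuristic.
random.seed(1)
best = min(([random.random(), random.random()] for _ in range(200)), key=fitness)
best_lr, best_units = decode(best)
```

The only coupling between the optimizer and the model is the scalar fitness value, which is why SO can be swapped in for nearly any learner.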
Ismail [103] proposed a new Snake-Optimized Framework (CKD-SO) that improves CKD prediction accuracy by combining the Snake Optimizer with five ML algorithms for feature selection and classification. Through effective identification of the most useful medical data, CKD-SO significantly enhances diagnostic accuracy, reaching an impressive 99.7% accuracy rate. This framework supports early intervention strategies, minimizing CKD-related mortality and enhancing healthcare outcomes. Samiayya et al. [104] presented a Hybrid Snake Whale Optimization (HSWO) Algorithm to optimize CH selection to improve energy efficiency and network life. The three-phase approach includes initialization, where the network parameters are set; cluster head selection, where CHs are optimally selected using HSWO with respect to energy, distance, and delay; and route maintenance, which maintains stable transmission of data. Performance assessment verifies that HSWO delivers a dramatic improvement in network lifetime (5600 rounds) and energy efficiency (0.98) over current practices. Masood et al. [105] proposed a new ELM-SO model, which combines an Extreme Learning Machine (ELM) with the Snake Optimization Algorithm to improve the accuracy of predictions. The model makes use of air quality and weather parameters and is compared with numerous ML models, such as SVR, RF, ELM, GBR, XGBoost, and deep learning-based LSTM. The experimental results show that the proposed ELM-SO performs better than other models, with a squared correlation coefficient (R2) of 0.928 and a Root Mean Square Error (RMSE) of 30.325 µg/m3, indicating its effectiveness in PM2.5 forecasting. Jiang et al. [106] proposed applying a hybrid deep belief network (HDBN) combined with an improved snake optimization algorithm (ISOA) to enhance accuracy in forecasting.
The proposed HDBN applies deep learning and probabilistic graphical models to identify complex patterns in data, while ISOA, inspired by snake movement, optimizes forecasting performance. Validated on real-world data from a commercial building, the HDBN-ISOA methodology performs better than traditional techniques and proves feasible for optimizing energy consumption and reducing the operational expenses of air conditioning systems. Kassem [107] proposed Snake Optimization with Deep Learning Enabled Disease Detection Model for Colorectal Cancer (SODL-DDCC) to detect colorectal cancer (CC) in histopathological images. The approach starts with bilateral filtering (BF) to remove noise, followed by Inception v3 to extract features, with hyperparameters optimized using the Snake Optimization Algorithm. A Graph Convolution Network (GCN) is applied for classification. The validity of the model is tested on a benchmark dataset, exhibiting better performance than other methods. Al-Shourbaji et al. [108] introduce a Feature Selection (FS) approach called Reptile Search Algorithm–Snake Optimizer (RSA-SO) to enhance Machine Learning (ML) model performance by selecting the Optimal Feature Subset (OFS). RSA-SO integrates the Reptile Search Algorithm (RSA) and Snake Optimizer in a parallel mechanism, minimizing the risk of being stuck in local optima while balancing exploration and exploitation. The effectiveness of RSA-SO is validated through experiments on ten UCI datasets and two real-world engineering problems. Comparison against seven popular Meta-Heuristic (MH) algorithms confirms RSA-SO's competitive performance in optimizing FS tasks. Fu et al. [109] put forward a gas outburst prediction model using Multiple Strategy Fusion and Improved Snake Optimization (MFISO) and a Temporal Convolutional Network (TCN) to improve prediction accuracy in underground mines.
The Snake Optimization Algorithm is upgraded using sine chaos mapping, a spiral search strategy, and dynamic adaptive weights to improve global search ability and escape from local optima. Moreover, the Tangent-based ReLU (ThLU) enhances TCN's generalization capacity. The MFISO algorithm fine-tunes TCN's hyperparameters, improving prediction accuracy. Comparative experiments against GRU, LSTM, SO-TCN, WOA-TCN, and PSO-TCN demonstrate that MFISO-TCN has lower MAE (3.11%), MAPE (0.47%), and RMSE (3.31%), affirming its better performance in gas outburst prediction. Rawa [110] proposed a hybrid meta-heuristic technique combining the Snake Optimizer and the Sine Cosine Algorithm (SCA), termed SO-SCA. This hybridization enhances exploration and exploitation, improving solution accuracy and efficiency. Additionally, the study integrates energy storage systems (ESS) and fault current limiters (FCLs) into the planning process, ensuring network reliability under different cascading failure conditions. The suggested SO-SCA model is also utilized for load forecasting, where it performs better than other methods. Simulation studies were conducted using the Garver and IEEE 24-bus systems to confirm the efficacy of SO-SCA in optimizing TEP and accurately forecasting load growth, which proves its efficiency in improving power grid resilience. Major MAs that are hybridized with SO are depicted in Fig. 11. Numerous MAs exist in the literature, yet only six well-known MAs have been utilized to develop hybrid SO variants. Researchers could therefore explore other popular MAs to develop hybrid SO variants in different optimization fields.
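Several of the hybrids surveyed above (RSA-SO [108], DDAD-SOEL [101], RSO [86]) apply SO to wrapper feature selection, where the fitness jointly rewards low classification error and a small feature subset. The standard weighted formulation is sketched below; the value of α and the toy example are illustrative assumptions, not values taken from the cited papers:

```python
def fs_fitness(mask, error_rate, alpha=0.99):
    """Wrapper FS fitness: alpha * error + (1 - alpha) * (selected / total).

    `mask` is a binary list marking selected features; lower fitness is better.
    """
    n_selected = sum(mask)
    if n_selected == 0:          # selecting no features is invalid
        return float("inf")
    ratio = n_selected / len(mask)
    return alpha * error_rate + (1 - alpha) * ratio

# Two candidates with equal accuracy: the smaller subset wins the tie
f_small = fs_fitness([1, 0, 0, 0, 0], error_rate=0.10)
f_large = fs_fitness([1, 1, 1, 1, 1], error_rate=0.10)
```

Weighting error heavily (α near 1) keeps accuracy dominant while still breaking ties in favour of compact subsets, which is why results such as RSA-SO's reduced channel counts do not sacrifice classification performance.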

Figure 11: Major MAs used to develop Hybrid SO.
4.3 Multi-Objective Variants of SO
A multi-objective version of the Snake Optimizer improves the original algorithm by solving multiple conflicting objectives at the same time rather than concentrating on a single objective. In contrast to the original SO, which finds a single optimum, these versions provide a Pareto-optimal solution set, efficiently trading off among conflicting objectives. Gao and Liu [111] introduced a novel method named the Integrated Multi-Objective Snake Optimizer (IMOSO) to improve convergence precision and population diversity in multi-objective optimization scenarios. The proposed algorithm enhances the Snake Optimization Algorithm using two primary strategies: adaptive mating among subpopulations, mimicking snakes' multi-partner mating behavior, and an external archive local disturbance mechanism, which improves non-inferior solutions with low diversity. Comparison against seven leading-edge algorithms on the WFG test functions shows that IMOSO performs better in convergence and diversity and is thus very efficient on challenging multi-objective optimization problems. Li et al. [112] introduced an Artificial Potential Field-Enhanced Improved Multiobjective Snake Optimization Algorithm (APF-IMOSO) to overcome the premature convergence and limited solution diversity of conventional mobile-robot path-planning techniques. The algorithm introduces four major improvements to SO and incorporates multiple fitness functions to optimize path length, safety, energy usage, and time efficiency. Experimental outcomes in both static and dynamic settings exhibit significant enhancements, such as an 8.02% decrease in path length, 7.61% improvement in safety, 50.71% improvement in energy efficiency, and 12.74% time saving over the original SO.
Robotic testing in real-world scenarios verifies the algorithm’s performance, recording an average path length error of only 1.19%, echoing its robust potential for dynamic and complicated robotic navigation operations.
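The multi-objective variants above (IMOSO [111], APF-IMOSO [112]) maintain an archive of non-dominated solutions rather than a single best individual. The core operation behind such archives is the Pareto dominance test, sketched here for minimization in a generic form that does not reproduce either paper's archive-maintenance logic:

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (minimization):
    `a` is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Filter a list of objective vectors down to the non-dominated set."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# E.g., trading off path length vs. energy use, as in robot path planning:
front = pareto_front([(10.0, 5.0), (8.0, 6.0), (12.0, 7.0), (9.0, 4.0)])
```

Every archive update in a multi-objective SO reduces to such tests: a new solution enters the archive only if nothing already there dominates it, and anything it dominates is evicted.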
4.4 Binary Variants of Snake Optimizer
Abu Khurma et al. [113] proposed two wrapper-based feature selection methods employing the Snake Optimizer, namely the Binary Snake Optimizer (BSO) and an enhanced variant, BSO-CV, which combines crossover operators (one-point, two-point, and uniform crossover) governed by a switch probability to enhance search effectiveness. Experimental evaluation was performed on the COVID-19 dataset and 23 disease benchmark datasets, which clearly highlights that the proposed BSO-CV is superior to the regular BSO in terms of accuracy and execution time, lowering the dimension of the COVID-19 dataset by 89% vs. BSO's 79%. In addition, the proposed BSO-CV improves the exploration-exploitation balance significantly, surpassing state-of-the-art wrapper-based FS techniques such as HLBDA, LBMFO-V3, and CHIO-GC, with performance greater than 90% on the majority of datasets. Braik et al. [70] proposed three improved adaptive variants of the Snake Optimizer, namely Exponential SO (ESO), Power SO (PSO), and delayed S-shaped SO (SSO), to address the optimizer's tendency to get trapped in local optima and improve search performance. These adaptive models provide a better trade-off between exploration and exploitation and guarantee more effective feature selection. Their binary variants were formulated for FS applications with the KNN classifier and validated on 24 datasets. The experimental outcomes confirmed that BPSO was superior to alternative FS techniques under several evaluation metrics, including classification accuracy, sensitivity, specificity, and fitness scores, and attained over 90% accuracy on various datasets.
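Binary variants such as BSO [113] map SO's continuous positions to bit strings, typically via a transfer function, while BSO-CV additionally mixes parents using crossover operators selected by a switch probability. Both ingredients are sketched generically below; the S-shaped transfer function and the single crossover operator shown are illustrative choices (the cited work uses one-point, two-point, and uniform crossover):

```python
import math
import random

def binarize(position):
    """S-shaped transfer function: each continuous value becomes a bit,
    with probability given by the sigmoid of the value."""
    return [1 if random.random() < 1.0 / (1.0 + math.exp(-x)) else 0
            for x in position]

def one_point_crossover(p1, p2):
    """Swap the tails of two parent bit strings at a random cut point."""
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def offspring(p1, p2, switch_prob=0.5):
    """Apply crossover only with probability `switch_prob` (BSO-CV-style gate);
    otherwise return unchanged copies of the parents."""
    if random.random() < switch_prob:
        return one_point_crossover(p1, p2)
    return p1[:], p2[:]

random.seed(7)
bits = binarize([2.5, -2.5, 0.0, 4.0])
c1, c2 = offspring([1, 1, 1, 1], [0, 0, 0, 0], switch_prob=1.0)
```

The switch probability is the knob BSO-CV tunes: higher values inject more recombination (exploration), lower values preserve parents (exploitation).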
Finally, the proportional distribution of the different variants is graphically presented in Fig. 12. It can be seen that improved variants of SO constitute the major development in the literature, with hybridized variants of SO being the second most developed. Comparatively little effort has been put into developing multi-objective and binary variants of SO.

Figure 12: Proportional distribution of variants of SO in literature.
4.5 Analytical Synthesis of SO Variants and Design Trends
While numerous variants of the Snake Optimizer (SO) have been proposed in recent years, a closer examination reveals recurring design patterns and common improvement strategies. Rather than viewing these developments as isolated contributions, this section synthesizes the literature to identify dominant modification trends, comparative advantages, and underlying research directions. A systematic analysis of existing SO variants indicates that most improvements fall into four principal categories: (i) exploration enhancement, (ii) exploitation refinement, (iii) adaptive parameter control, and (iv) hybridization mechanisms. The same is highlighted in Table 6.

i) Exploration Enhancement Strategies: A significant number of studies focus on strengthening SO’s global search capability to prevent premature convergence. Common techniques include: Chaotic map-based initialization, Lévy flight mechanisms, Random walk strategies, Opposition-based learning (OBL), and Mutation-based perturbation operators. These modifications primarily aim to increase population diversity during early iterations. The recurring pattern suggests that researchers perceive exploration insufficiency as a primary limitation of the original SO formulation. Empirical comparisons across studies consistently report improved global optima discovery when diversity-enhancing mechanisms are incorporated.
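As a purely illustrative sketch (not taken from any surveyed variant), two of the diversity mechanisms named above, opposition-based learning and Lévy flights (via Mantegna's algorithm), might look as follows; the function names and the simple box bounds are assumptions for the example:

```python
import numpy as np
from math import gamma, sin, pi

def obl_init(pop, lb, ub, fitness):
    """Opposition-based initialization: for each candidate x, also
    evaluate its opposite point lb + ub - x and keep the fitter half."""
    opp = lb + ub - pop                       # opposite population
    merged = np.vstack([pop, opp])
    scores = np.apply_along_axis(fitness, 1, merged)
    order = np.argsort(scores)                # minimization: smaller is better
    return merged[order[:len(pop)]]

def levy_step(dim, beta=1.5):
    """Levy-distributed step via Mantegna's algorithm, commonly used
    to inject occasional long jumps into the search."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0, sigma, dim)
    v = np.random.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)
```

In practice, such operators are applied either at initialization (OBL) or inside the position-update rule (Lévy steps) to raise early-iteration diversity.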
ii) Exploitation Refinement Techniques: Another dominant trend involves improving SO’s local search performance in later iterations. Approaches include: Local search integration, Differential evolution-inspired operators, Adaptive step-size reduction, and Neighbourhood search strategies. These techniques strengthen convergence accuracy and solution refinement. However, comparative observations indicate that excessive exploitation may reduce diversity and increase the risk of stagnation, suggesting a delicate balance between intensification and diversification.
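For concreteness, the differential evolution-inspired operator mentioned above can be sketched as a classical DE/rand/1/bin trial-vector generator; this is a generic illustration of the operator family, not the exact scheme of any particular SO variant, and the bounds are placeholder assumptions:

```python
import numpy as np

def de_rand_1_bin(pop, i, F=0.5, CR=0.9, lb=-10.0, ub=10.0):
    """DE/rand/1/bin trial vector for individual i: mutate with the
    scaled difference of two random individuals, then binomially
    cross over with the current individual."""
    n, dim = pop.shape
    r1, r2, r3 = np.random.choice([j for j in range(n) if j != i], 3, replace=False)
    mutant = pop[r1] + F * (pop[r2] - pop[r3])
    cross = np.random.rand(dim) < CR
    cross[np.random.randint(dim)] = True      # guarantee at least one mutant dimension
    trial = np.where(cross, mutant, pop[i])
    return np.clip(trial, lb, ub)             # keep the trial inside the search bounds
```

The trial vector replaces the current individual only if it improves fitness, which is what gives the operator its exploitation character.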
iii) Adaptive Parameter Control: Many recent SO variants introduce dynamic or self-adaptive parameter control mechanisms. Instead of using fixed control parameters throughout the optimization process, these approaches adjust parameters based on iteration count, population fitness distribution, or convergence rate. The recurring design choice of adaptive control reflects a broader trend in metaheuristic research, where static parameter settings are increasingly viewed as inadequate for complex, high-dimensional problems. Studies incorporating adaptive mechanisms often report improved convergence stability and reduced sensitivity to initialization conditions.
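The original SO itself already uses iteration-dependent control signals, which adaptive variants then generalize. Per the commonly cited formulation of Hashim and Hussien (stated here as a sketch, with c1 = 0.5 as usually reported), the temperature decays and the food quantity grows with the iteration count:

```python
import math

def so_control_params(t, T, c1=0.5):
    """Iteration-dependent control signals of the baseline SO:
    temperature Temp = exp(-t/T) decays from 1 toward exp(-1),
    food quantity Q = c1 * exp((t - T)/T) grows toward c1."""
    temp = math.exp(-t / T)
    Q = c1 * math.exp((t - T) / T)
    return temp, Q
```

Adaptive variants replace such fixed schedules with feedback-driven rules, e.g., based on the population's fitness spread or stagnation counters.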
iv) Hybridization-Based Improvements: Hybrid SO models represent another significant research direction. These variants combine SO with other metaheuristic or learning-based techniques, such as: SO-PSO hybrids, SO-RSA hybrids, SO-SCA, SO-GJO combinations, and Integration with machine learning models. Hybridization aims to leverage the complementary strengths of multiple algorithms. Comparative insights suggest that hybrid SO models generally demonstrate superior solution quality and robustness across benchmark functions. However, this improvement often comes at the cost of increased computational complexity and implementation overhead.
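At a high level, many of these hybrids amount to a probabilistic switch between two update operators. The following skeleton is an assumption-laden illustration (the operators `update_a` and `update_b` are placeholders for, say, an SO phase update and a PSO-style pull toward the best solution), not the design of any specific hybrid from the survey:

```python
import numpy as np

def hybrid_step(pop, best, update_a, update_b, p_switch=0.5):
    """Generic hybridization skeleton: each individual is moved by
    operator A with probability p_switch, otherwise by operator B."""
    out = pop.copy()
    for i in range(len(pop)):
        if np.random.rand() < p_switch:
            out[i] = update_a(pop[i], best)   # e.g., an SO-style update
        else:
            out[i] = update_b(pop[i], best)   # e.g., a PSO-style update
    return out
```

This structure makes the computational-overhead trade-off visible: every iteration now evaluates whichever of the two operator families is selected, on top of the shared bookkeeping.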
4.6 Critical Limitations in Current SO Research
There are a number of gaps in the literature on the SO, particularly with regard to its theoretical foundation, robustness, and effectiveness for specific classes of problems. The original SO has issues with convergence and scalability, particularly in high-dimensional or extremely complex optimization settings.
• Robustness and parameter sensitivity: The classical SO struggles to maintain the key balance between exploring the search space and exploiting the best solutions. This can lead to premature convergence, trapping the SO in a local optimum instead of letting it find the global optimal solution. While current adaptive versions of the SO are intended to tackle this challenge, comprehensive comparative studies of the best methods for various problem types are lacking in the literature.
Depending on the particular problem, SO requires careful tuning of a number of control parameters. Self-adaptive methods that can dynamically alter these parameters over the course of the search are still the subject of extensive research; they would make SO less reliant on manual adjustment and more resilient across a greater variety of optimization problems.
• Theoretical framework and convergence analysis: The theoretical convergence properties of the SO are not defined in detail in the literature. A systematic mathematical study that guarantees convergence to the global optimum for a particular class of problems is still a significant shortcoming, despite multiple studies showing good empirical convergence rates. When applied to complex, high-dimensional, non-linear optimization problems, the conventional SO has stability issues; thus, more research is needed in this area.
• Performance and scalability: Studies indicate that the traditional SO performs poorly in high-dimensional search spaces, making solution finding more complex and inefficient. While some enhanced versions attempt to address this, further work is needed to create highly scalable versions capable of rapidly solving contemporary large-scale optimization problems. The traditional SO targets single-objective problems; although first attempts have been made to develop binary, fractional, and multi-objective variants, more research is required to develop robust variants of these classes that can effectively handle difficult optimization problems. Hybridization of SO with different MAs and their operators is the focus of many recent studies, yet only a few well-known MAs have been used to create hybrid SO variants; researchers should employ other widely used MAs across various optimization domains. To guide future study and determine the most effective combinations for different problem types, a thorough taxonomy and comparative analysis of these hybridization processes are crucial. It is also observed that current improvements of SO do not make use of orthogonal learning, fitness-distance-balance-based selection mechanisms, the Weibull distribution, local escape operators, or multi-population techniques; these strategies should therefore be integrated into SO across many optimization areas.
• Computational efficiency and practical application: With the rise of high-performance computing, future research on parallel implementations of SO on distributed computing systems or custom hardware, such as GPUs, is appealing; this could significantly increase the algorithm’s effectiveness for large-scale applications. While SO has been used to address numerous optimization problems, it has yet to be deployed on a wider range of complicated practical problems. Illustrative studies, specifically those involving large-scale and multi-objective scenarios, are critical for comprehensively validating the algorithm’s practical applicability and determining the most suitable configurations for a variety of problem types.
5 Applications of SO and Its Variants
This section presents the application domains of the SO variants as per the survey. Table 7 presents a comprehensive overview of the applications of the different variants of the SO across various domains.

Table 7 showcases the versatility and advancement of SO and its variants, emphasizing its wide-ranging implementations in numerous application areas ranging from Energy, Image Processing, Edge Computing, Wireless Sensor Networks, Classical Benchmark Engineering Optimization, and many more. In the field of energy systems, the SO algorithm has been widely applied to address a variety of complex control and optimization problems. For instance, it has been used in power system stability enhancement, where a robust damping controller is designed to improve system performance [114]. Similarly, SO has been employed in power electronic applications, such as optimizing adaptive backstepping controllers for buck-boost converters, ensuring stable voltage regulation under varying conditions [115]. The algorithm has also shown strong effectiveness in solving optimal power flow problems, as demonstrated in [116], where it is used to determine efficient operating conditions in power networks. In addition, SO has been applied to combined heat and power economic dispatch problems, taking into account power losses and improving overall energy utilization efficiency [117]. In the context of renewable and hybrid energy systems, SO has been utilized for frequency regulation in diesel-wind generation systems, enhancing system stability under fluctuating renewable inputs [118]. Beyond traditional energy optimization, it has also been integrated into intelligent system modelling and filter design for nonlinear industrial processes [119], further highlighting its adaptability. Moreover, SO has been applied in energy-aware system design, such as task scheduling in energy-harvesting wearable biomedical systems, where efficient energy utilization is critical [120]. It has also been used in motor control applications, including the speed control of brushless DC motors, demonstrating its effectiveness in electromechanical energy systems [121]. 
SO also contributes to condition monitoring and fault diagnosis in energy-related mechanical components, such as rolling bearings, by optimizing signal decomposition techniques [122]. Furthermore, it has been applied to power system inertia estimation, considering transient variations in frequency and voltage, thereby improving the accuracy of dynamic system analysis [123]. Overall, these studies collectively demonstrate the robustness, flexibility, and effectiveness of the SO algorithm in addressing diverse and complex challenges across modern energy systems and related engineering domains.
In addition, SO has been successfully utilized in image processing applications, as reported in [124,125], where it contributes to improved performance in tasks such as enhancement and feature extraction. The algorithm has also been explored in edge computing environments [126], demonstrating its capability to handle distributed and resource-constrained systems. Furthermore, SO has been applied in wireless sensor networks [127], highlighting its suitability for network optimization and efficient resource management. Beyond these areas, it has also shown strong performance in classical benchmark engineering optimization problems, along with several other emerging application domains. It provides a useful reference for gaining insights into the algorithm’s broad impact and effectiveness in technological and engineering fields. Table 8 shows the application percentage of SO variants over some major fields. This table clearly helps researchers to find the major and minor application domains of the different SO variants, like improved, hybrid, multi-objective, and binary. The graphical representation of the proportional distribution of SO variants across various domains is depicted in Fig. 13.


Figure 13: Proportional distribution of variants of SO across various applications (As per the survey).
6 Theoretical Comparison with Other Metaheuristics
The Snake Optimizer is widely used due to its adaptive movement strategy, which dynamically balances exploration and exploitation by imitating snake behaviors such as mating and combat. Unlike many other algorithms, it requires minimal parameter tuning and efficiently avoids local minima through self-adaptive step adjustments. SO’s flexible constraint handling and robust search mechanism make it well suited for engineering design, machine learning optimization, and scheduling problems, supporting fast convergence and preventing premature stagnation. However, no metaheuristic in its original form can effectively address every class of problem. Table 9 showcases the comparison of SO with six other well-known metaheuristics, namely Particle Swarm Optimization (PSO) [7], Cuckoo Search Algorithm (CSA) [10], Grey Wolf Optimization (GWO) [32], Whale Optimization Algorithm (WOA) [33], Tuna Swarm Optimization (TSO) [34], and Artificial Hummingbird Algorithm (AHA) [35], in terms of parameters such as inspirational source, exploration-exploitation balance, phase control, population dynamics, search mechanism, mathematical operators used, and diversity preservation mechanism.

While Section 6 presents a theoretical comparison of representative optimization algorithms, a comprehensive experimental evaluation involving a wide range of recent and superior algorithms is provided in Section 7.
7 Performance Evaluation of SO in Clustering-Based Image Segmentation
It is evident from Table 7 that SO and its modified versions have not been applied in the field of clustering-based image segmentation. Consequently, this study uses an SO-based clustering approach for image segmentation. Clustering-based image segmentation is essentially an unsupervised learning method involving minimal training. The current research uses SO to construct a crisp partitional clustering framework.
7.1 Crisp Partitional Clustering
To increase intra-cluster similarity and decrease inter-cluster similarity, pixels are separated into clusters during image clustering. Each pixel in crisp partitional clustering is allocated to a single cluster, indicating a binary connection between pixels and clusters. Assume that the input image
where
A major shortcoming of traditional crisp clustering techniques such as K-Means (KM) is their vulnerability to local-optima trapping, brought on by random center initialization, which can also result in empty clusters. The convergence of KM to a global optimum is therefore a significant obstacle. The Stirling numbers of the second kind provide the number of ways to split
The number of possible partitions can be determined using
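To make the size of this search space concrete, the Stirling number of the second kind can be computed directly from its standard finite-sum identity (a textbook formula, shown here as an illustrative sketch; the function name is ours):

```python
from math import comb, factorial

def stirling2(n, k):
    """Stirling number of the second kind S(n, k): the number of ways
    to partition n items into k non-empty, unlabeled clusters, via
    S(n, k) = (1/k!) * sum_{j=0}^{k} (-1)^j * C(k, j) * (k - j)^n."""
    return sum((-1) ** j * comb(k, j) * (k - j) ** n for j in range(k + 1)) // factorial(k)
```

Even for modest inputs the count explodes (e.g., `stirling2(10, 3)` is 9330, and for image-sized `n` it is astronomically large), which is precisely why exhaustive partition search is infeasible and metaheuristic search becomes attractive.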
Metaheuristic algorithms (MAs) constitute essential approaches for solving complex optimization problems. These methods are effective and operate independently of gradient information. MAs are distinguished by their autonomous functionality, which facilitates adaptability and broad applicability across various real-world optimization scenarios [128].
In Algorithm 2, the objective function defined in Eq. (21) is initially considered, followed by the generation of a randomly initialized population of candidate solutions. Each solution corresponds to a specific set of cluster centers. Subsequently, each pixel is assigned to its nearest cluster center. The fitness of each solution is then evaluated based on the objective function. Since this clustering task is formulated as a minimization problem, lower fitness values correspond to better solutions. The solutions are subsequently refined through the application of the MA’s operators to achieve improved outcomes. The pixels are then reassigned to clusters according to the minimal distance, the efficacy of the solutions is assessed, and the best solution is retained. This process is executed repeatedly until the stopping criteria are met. The optimal global solution, serving as the best set of cluster centers, is attained, and segmentation is performed accordingly.
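The assignment-and-evaluation step of this loop can be sketched as follows. Note that Eq. (21) is not reproduced here, so the sum of squared errors (SSE) is used below purely as an assumed stand-in for the clustering objective:

```python
import numpy as np

def assign_and_fitness(pixels, centers):
    """Assign each pixel to its nearest cluster center and return the
    crisp labels plus an SSE fitness (assumed objective; lower is better).
    pixels: (P, C) array of pixel feature vectors.
    centers: (K, C) array of candidate cluster centers."""
    d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)  # (P, K) distances
    labels = np.argmin(d, axis=1)                                         # crisp assignment
    sse = float(np.sum((pixels - centers[labels]) ** 2))
    return labels, sse
```

In the full algorithm, each candidate solution encodes one `centers` array, the MA operators perturb these arrays, and the candidate with the lowest fitness is retained as the global best.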

7.2 Practical Role and Significance of SO-Based Image Clustering
In the proposed framework, the SO algorithm serves as an optimization-driven mechanism for enhancing clustering-based color image segmentation. The primary role of SO is to determine optimal cluster centroids by minimizing intra-cluster variance and maximizing inter-cluster separability. By efficiently balancing exploration and exploitation during the search process, SO mitigates the limitations of conventional clustering approaches, such as sensitivity to initialization and premature convergence to local optima. Unlike traditional K-means or fuzzy clustering methods, which often depend heavily on initial centroid selection, SO introduces a global search capability that enables more stable and robust convergence toward optimal clustering configurations. This results in improved region homogeneity, clearer object boundaries, reduced pixel misclassification, and enhanced preservation of structural details. Consequently, the segmented image better represents meaningful regions within the original image. Image clustering is not an isolated task but a fundamental preprocessing stage in many real-world computer vision and image analysis systems. The quality of clustering directly affects the performance and reliability of subsequent computational processes. The practical significance of SO-based image clustering can be understood through the following application domains:
1. Medical Image Analysis: Accurate clustering facilitates the separation of abnormal tissues (e.g., tumors, lesions) from healthy regions. This enables quantitative measurements such as region size, shape, and intensity distribution, which are critical for diagnosis, treatment planning, and disease monitoring.
2. Remote Sensing and Environmental Monitoring: In satellite and aerial imagery, clustering is used to identify land-cover types, vegetation zones, water bodies, and urban areas. Improved clustering accuracy leads to better environmental assessment and urban planning decisions.
3. Industrial Inspection and Quality Control: In automated manufacturing systems, clustering assists in detecting surface defects, cracks, or irregularities. Enhanced segmentation improves defect localization and reduces false detection rates.
4. Surveillance and Smart Security Systems: In intelligent monitoring systems, clustering enables foreground-background separation, object isolation, and anomaly detection. High-quality segmentation improves tracking accuracy and behavioral analysis.
5. Machine Learning and Pattern Recognition: Clustering simplifies complex image data by partitioning it into meaningful regions, thereby reducing computational complexity in subsequent feature extraction and classification stages.
The clustered output produced by the SO-based framework serves as a structured input for higher-level analytical processes. After clustering, the segmented regions can be utilized for: Feature extraction (texture descriptors, shape metrics, color histograms), Object detection and recognition, Image classification using machine learning or deep learning models, Region-based statistical analysis, Morphological processing and boundary refinement and Decision-support systems in medical and industrial environments. By improving clustering accuracy and robustness, SO enhances the reliability of these downstream tasks. Poor segmentation often propagates errors to later stages, reducing overall system performance. Therefore, the optimization capability of SO contributes not only to improved segmentation metrics but also to enhanced effectiveness of the complete image processing pipeline.
This section delineates the experimental outcomes pertaining to the SO model alongside eighteen additional MA-based image clustering models. A comparative analysis of these MAs was conducted using a dataset comprising 100 standard color images from the BSD500 dataset [129]. The BSD500 dataset is publicly available under open access, and its ground-truth images are distributed with it. The eighteen tested MAs are Cuckoo Search (CS) [10], Differential Evolution (DE) [128], Grey Wolf Optimizer (GWO) [32], Sine-Cosine Algorithm (SCA) [130], Aquila Optimizer (AO) [131], Arithmetic Optimization Algorithm (AOA) [132], Artemisinin Optimizer (ATRO) [133], Black-winged Kite Algorithm (BKA) [134], Crested Porcupine Optimizer (CPO) [135], Dwarf Mongoose Optimization (DMO) [136], Educational Competition Optimizer (ECO) [137], Fata Morgana Algorithm (FATA) [138], Hiking Optimization Algorithm (HOA) [139], Moss Growth Optimization (MGO) [140], Polar Light Optimizer (PLO) [141], Pufferfish Optimization Algorithm (POA) [142], Parrot Optimizer (PARO) [143], and Reptile Search Algorithm (RSA) [144]. These algorithms were selected to ensure a fair and comprehensive evaluation of the proposed method. The set includes both well-established benchmark algorithms and recent state-of-the-art optimization techniques. Classical algorithms were chosen due to their widespread use and well-documented performance, enabling reliable and meaningful comparisons. In addition, several recently proposed algorithms were included to reflect current developments in metaheuristic optimization and to assess the competitiveness of the proposed approach against modern methods. These algorithms represent diverse search strategies and convergence behaviors, helping to avoid bias toward any specific optimization mechanism.
All algorithms were implemented using consistent experimental settings and parameter configurations to ensure fairness and reproducibility, allowing for a robust evaluation of the proposed method across different optimization scenarios.
Parameter tuning is essential because of the great sensitivity of metaheuristics to their parameters; suitable parameter configurations dictate their efficiency, stability, convergence rate, and fairness in comparison. However, there is no single best way to select parameters for all problems. In metaheuristics research, it is typical to use the parameter settings from the source paper, i.e., to follow the original authors’ suggestions. Use of source parameters is appropriate when the aim is to replicate or validate prior results, especially when previous studies have used the same settings; moreover, some algorithms have defaults on which a majority agree. Hence, we adopt the parameter settings suggested in the source papers, and the parameter settings for the tested MAs are reported in Table 10. MATLAB R2018b, the Windows 10 OS, and an x64-based PC with an Intel Core i5 CPU and 8 GB RAM were used for the experiment. In the comparison of MAs, it is essential to set the population size and maximum iterations so as to ensure a fair and scientifically valid assessment. The fundamental premise is that MAs should be evaluated under equivalent computational constraints rather than identical parameter values. Each MA must utilize the same number of function evaluations (NFE), where NFE equals the product of the population size and the maximum number of iterations. To ensure a fair comparison, the population size and the maximum number of iterations were fixed at 50 and 300, respectively, so all algorithms were executed with an identical number of function evaluations (FEs) under a constant population size and maximum iterations configuration. Researchers employ identical maximum iterations or identical FEs for all MAs throughout execution for better comparison among them [145]. This research met both criteria, namely the same maximum iterations and the same number of function evaluations.
Each MA must be executed multiple times per image because of its stochastic nature, as different runs may yield different outcomes; a single run conveys only an isolated occurrence. Multiple runs reveal whether the MA converges prematurely, whether it often gets stuck in local optima, and how high the variance between results is. An MA may sometimes find the best solutions or become stuck in local optima, so a single successful outcome may simply be a chance event. In optimization research, it is common to perform 20 to 30 independent runs. For better comparison, it is important to report the mean, standard deviation, best, and worst outcomes. The mean shows how well the MA works on average over several runs; the standard deviation quantifies the variability of results across runs; the best outcome demonstrates the MA’s ultimate potential; and the worst outcome illustrates the MA’s failure mode. After full execution of the MAs, the fitness values and convergence curves are recorded to evaluate their convergence characteristics. Each MA was executed 30 times per image. Subsequently, the optimization performance of the evaluated MAs was assessed by calculating the mean fitness
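The per-run summary statistics described above can be computed with a few lines of NumPy (a minimal sketch; the function name and dictionary layout are ours):

```python
import numpy as np

def run_statistics(fitness_runs):
    """Summarize the final fitness of repeated independent runs
    (e.g., 30 runs of one MA on one image) as mean, sample standard
    deviation, best (minimum) and worst (maximum) values."""
    f = np.asarray(fitness_runs, dtype=float)
    return {"mean": float(f.mean()),
            "std": float(f.std(ddof=1)),   # sample std over independent runs
            "best": float(f.min()),        # minimization: smaller is better
            "worst": float(f.max())}
```

Reporting all four values together is what allows premature convergence (good best, poor mean) or instability (large std) to be distinguished.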


Figs. 14 and 15 present the segmentation outcomes of the assessed MAs-based clustering models. Specifically, Fig. 14 illustrates the results for images numbered 3036, 296028, 79073, 100099, and 130014 from the BSD500 dataset, while Fig. 15 depicts the segmentation results for images numbered 107045, 288024, 306052, 69022, and 347031 from the same dataset. The DB method calculated

Figure 14: Segmentation results of the MA-based clustering models over BSD500 (image numbers: 3036, 296028, 79073, 100099, 130014).

Figure 15: Segmentation results of the MA-based clustering models over BSD500 (image numbers: 107045, 288024, 306052, 69022, 347031).







Figure 16: Execution time of MA-based image clustering models.
An optimization process is deemed successfully completed only when it satisfies one or more specified conditions, commonly referred to as convergence criteria. These criteria encompass various factors, including the completion of a maximum number of iterations, achieving a predetermined fitness threshold, the total number of function evaluations performed, or adherence to a predefined time constraint.
The study employs identical maximum iterations and function evaluation counts to evaluate the convergence characteristic. Convergence curves, based on the same maximum number of iterations, are widely employed to demonstrate the convergence characteristic of MAs [35,149]. The convergence curve illustrates the fitness value of the globally optimal solution at each iteration. Alternatively, it illustrates the ability of MA to identify the optimal solutions with a set number of iterations. The convergence performance of an MA is considered good if it attains the best solution with fewer iterations than other competing MAs. It is also observed that some MAs converge quickly with a moderate solution, i.e., premature convergence, and MAs get trapped in a local optimum. In this context, efficiency denotes the MA’s ability to quickly find accurate solutions, its convergence behavior, and its overall computational performance. These three aspects can be analyzed using a convergence curve. Fig. 17 illustrates the convergence trajectories of the evaluated MAs across ten sample images. The convergence behavior of the SO is satisfactory, as evidenced in Fig. 17. On the other hand, the convergence curves of CPO, RSA, SCA, and AOA are not satisfactory. Therefore, these MAs exhibit suboptimal convergence patterns, suggesting a likelihood of getting trapped in local optima.

Figure 17: Convergence curves of the metaheuristic algorithms.
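The curves plotted in such figures track the best fitness found so far at each iteration, which is a monotone non-increasing sequence for a minimization problem. A minimal sketch of how such a curve is derived from a recorded fitness history (illustrative only; the function name is ours):

```python
import numpy as np

def convergence_curve(best_fitness_per_iter):
    """Convert the per-iteration best fitness of a run into the
    monotone best-so-far curve typically plotted for MA comparisons."""
    return np.minimum.accumulate(np.asarray(best_fitness_per_iter, dtype=float))
```

A curve that flattens early at a mediocre value signals the premature convergence discussed above, while a curve that keeps descending late indicates continued productive search.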
By measuring average quality metrics, we assessed the ability of the MA-based clustering algorithms to segment images. The numerical values of these quality metrics, based on the reference and ground-truth images, are provided in Table 18. The SO-based image clustering model achieved the best PSNR, MSE, SSIM, and FSIM values. The AOA-based image clustering models, on the other hand, achieved the best PRI and BDE scores. The best VoI is attained by the RSA-based image clustering approach, and the best GCE by the POA-based clustering method. Hence, it can be concluded that a superior ability to minimize the objective function does not always lead to a superior image segmentation model in terms of all quality measures. Although the RSA algorithm does not achieve good optimization results, the RSA-based image clustering model achieves the best VoI values. Nevertheless, it is apparent that SO consistently achieved good values across all quality metrics compared to the other MA-based image clustering models.
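Among these metrics, PSNR is derived directly from MSE, which is why the two always move together (PSNR higher is better, MSE lower is better). A minimal sketch of the standard definition (illustrative; the 8-bit peak value of 255 is an assumption):

```python
import numpy as np

def psnr(original, segmented, max_val=255.0):
    """Peak signal-to-noise ratio between an image and its segmented
    reconstruction: PSNR = 10 * log10(max_val^2 / MSE)."""
    mse = float(np.mean((np.asarray(original, dtype=float) -
                         np.asarray(segmented, dtype=float)) ** 2))
    if mse == 0:
        return float("inf")   # identical images: PSNR is unbounded
    return 10.0 * np.log10(max_val ** 2 / mse)
```

The remaining metrics (SSIM, FSIM, PRI, VoI, GCE, BDE) each capture a different notion of segmentation quality, which is why no single algorithm dominates on all of them.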

8 Conclusion and Future Directions
Proposed in 2022, the initial SO algorithm was designed to solve four different real-world engineering challenges. Its performance was then assessed using a collection of 30 typical unconstrained benchmark functions. Later on, though, the Snake Optimizer attracted a lot of interest and was widely used in a variety of application areas. Numerous SO variants with notable improvements, developed to increase its performance, adaptability, and robustness across a range of problem scenarios and optimization criteria, resulted from its broad application. Our work presents a comprehensive review of the Snake Optimizer, explaining it in detail with the help of illustrations and a mathematical framework. Further, this paper presents a critical review of the variants of the SO algorithm, and the analysis reveals that SO has garnered substantial interest within the research community, leading to the development of 49 improved or modified variants within just two years, each tailored to address complex challenges across diverse application domains. Moreover, both the original and modified versions of the SO algorithm have been efficaciously combined with other well-known metaheuristic approaches, resulting in the development of 25 novel hybrid algorithms. These hybrid enhancements of SO have shown strong potential in resolving numerous complex optimization problems in varied engineering and scientific fields. Although limited in number, multi-objective and binary variants of the Snake Optimizer have also been developed to meet specific requirements in fields such as engineering, robotics, and medical imaging. While this study provides a comprehensive overview of the Snake Optimizer (SO), including its development, variants, and diverse applications, certain aspects offer opportunities for further exploration.
The analysis is mainly based on existing published literature, and given the rapidly evolving nature of metaheuristic optimization, new variants and advancements may continue to emerge beyond the scope of this review. Furthermore, the experimental evaluation emphasizes the clustering-based image segmentation domain, while future studies could additionally consider the performance of SO across a broader range of real-world optimization problems. Expanding theoretical investigations, such as convergence analysis and stability, may also provide deeper insights into the algorithm’s behavior. Overall, these aspects present promising directions for continued research and development in the field of Snake Optimizer-based optimization techniques, thus providing significant scope for further exploration and development. Based on this, the following research directions are outlined, which shall offer a valuable guide for researchers seeking to broaden the scope of SO to new and emerging areas of study.
1. Robust SO for Noisy Optimization Problems: To improve the flexibility as well as robustness of SO in dynamic or noisy environments, it would be highly useful to integrate memory-based or predictive modelling methods. Furthermore, resampling methods, robust fitness estimation or noise removal in selection methods could be employed.
2. Extension to Dynamic and Time-Varying Environments: To make SO capable of dealing with the dynamic optima in dynamic problems, the researcher can consider including memory tracking techniques, change detection methods, or prediction-based update strategies to handle the changes in the environment. An external archive of top solutions can be used to improve adaptability.
3. Theoretical Study of Algorithm Convergence and Stability: To further support the mathematical basis of SO, the dynamics of SO could be expressed using stochastic process theory or dynamical systems theory. This would help in establishing theoretical limits, which would increase scientific rigor.
4. Improved Multi-Objective and Many-Objective SO: From the research articles reviewed in this survey, it can be inferred that only limited variants of the multi-objective Snake Optimizer have been proposed to date. In the future, researchers may introduce more SO variants, especially for handling many conflicting objectives in complex multi-objective optimization problems.
5. Parallel and Distributed SO Implementations: To lower computational cost while maintaining performance in high-dimensional or large-scale real-time optimization problems, future research can focus on scalable, lightweight variants or parallelized implementations of SO.
6. Integration with Deep Learning and AutoML: Integrating SO with deep learning and AutoML would allow AI models to automatically discover better architectures and training hyperparameters, making the learning process faster and more efficient.
7. Hybridization Approach: To date, SO has mostly been hybridized with swarm-based metaheuristics to address diverse optimization problems. Upcoming research could instead focus on integrating SO with plant-inspired, human-inspired, and physics- or chemistry-inspired metaheuristics. Such hybridizations have great potential to enhance SO’s computational efficiency and produce better results in challenging problem domains.
8. Self-Adaptive, Parameter-Free SO Algorithm: Research could focus on developing a self-adaptive, parameter-free SO algorithm that automatically tunes its parameters, employs learning-based control to balance exploration and exploitation, and reduces parameter sensitivity to boost overall robustness. Self-adjusting parameter control mechanisms should therefore be prioritized.
9. Snake Optimizer for Advanced Cybersecurity Applications: Given the surge in cyberthreats and the growing demand for strong security in cloud computing, IoT, and distributed systems, cybersecurity is a vital and promising field where the Snake Optimizer could have substantial impact, making it an attractive future direction.
10. Standardized Benchmarking and Reproducibility: To improve comparability and scientific rigor, uniform benchmark protocols (e.g., fixed dimensions, population sizes, and statistical tests) should be established, along with open-source implementations, to ensure fair comparisons.
11. Computational Complexity Reduction: To improve suitability for real-time systems, researchers can streamline update rules, reduce redundant fitness evaluations, and design lightweight SO variants with fewer control parameters while maintaining performance quality.
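The resampling idea behind direction 1 can be sketched briefly. The following is a minimal illustration, not part of the original SO formulation: `noisy_sphere` is a hypothetical noisy objective, and averaging repeated evaluations stabilizes the fitness estimate at the cost of extra function calls.

```python
import random
import statistics

def noisy_sphere(x, noise_sd=0.1):
    """Hypothetical noisy objective: the sphere function plus Gaussian noise."""
    return sum(v * v for v in x) + random.gauss(0.0, noise_sd)

def robust_fitness(objective, x, resamples=10):
    """Estimate fitness by averaging repeated evaluations; the standard error
    of the mean shrinks with the square root of the number of resamples."""
    return statistics.mean(objective(x) for _ in range(resamples))

random.seed(42)
point = [0.5, -0.3]                      # true (noise-free) sphere value is 0.34
print(noisy_sphere(point))               # one noisy reading
print(robust_fitness(noisy_sphere, point, resamples=50))  # stabilized estimate
```

In a noise-aware SO variant, this averaged estimate would replace the single fitness call used during selection.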
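For direction 5, the natural first step is to parallelize fitness evaluation, which typically dominates the runtime of population-based metaheuristics such as SO. A minimal sketch, assuming an illustrative `sphere` objective; a `ProcessPoolExecutor` would suit genuinely CPU-bound objectives, while the thread pool below keeps the example portable:

```python
from concurrent.futures import ThreadPoolExecutor

def sphere(x):
    """Stand-in objective; in practice this would be an expensive fitness call."""
    return sum(v * v for v in x)

def evaluate_population(population, workers=4):
    """Evaluate all candidate solutions concurrently; the position-update
    equations of the optimizer itself remain sequential."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(sphere, population))

population = [[1.0, 2.0], [0.0, 0.0], [3.0, 0.0]]
print(evaluate_population(population))  # [5.0, 0.0, 9.0]
```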
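The self-adjusting control mechanism suggested in direction 8 can be illustrated with a one-fifth-success-style rule. This is a generic sketch of the idea, not SO's own parameter scheme; the parameter names, window, and bounds are illustrative assumptions:

```python
def adapt_parameter(value, successes, trials, target=0.2, factor=1.1,
                    bounds=(1e-3, 1.0)):
    """Grow a control parameter when more than `target` of recent moves
    improved the best fitness (favoring exploration), shrink it otherwise
    (favoring exploitation), clipped to `bounds`."""
    rate = successes / trials
    new_value = value * factor if rate > target else value / factor
    low, high = bounds
    return min(max(new_value, low), high)

# After a window of 20 iterations in which 8 moves improved the best
# solution, the parameter is enlarged (0.5 * 1.1):
print(adapt_parameter(0.5, successes=8, trials=20))
```

Applied every few iterations to SO's exploration-related coefficients, such a rule would remove the need for manual tuning.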
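The benchmarking protocol of direction 10 amounts to fixing seeds, dimensions, and run counts across all compared algorithms. A minimal sketch, in which `random_search` is a hypothetical placeholder standing in for SO or any compared metaheuristic:

```python
import random
import statistics

def sphere(x):
    return sum(v * v for v in x)

def random_search(objective, dim, evals=500):
    """Placeholder optimizer standing in for SO or any compared MA."""
    best = float("inf")
    for _ in range(evals):
        candidate = [random.uniform(-5.0, 5.0) for _ in range(dim)]
        best = min(best, objective(candidate))
    return best

def benchmark(optimizer, objective, dim=30, runs=30):
    """Run the optimizer under fixed per-run seeds and report summary
    statistics, so competing algorithms face identical conditions."""
    results = []
    for seed in range(runs):
        random.seed(seed)               # fixed seed per run: reproducible
        results.append(optimizer(objective, dim))
    return {"mean": statistics.mean(results),
            "std": statistics.stdev(results),
            "best": min(results)}

print(benchmark(random_search, sphere, dim=5, runs=10))
```

The resulting per-run values would then feed a non-parametric statistical test (e.g., the Wilcoxon signed-rank test) when comparing SO against other MAs.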
Acknowledgement: Not applicable.
Funding Statement: The authors received no specific funding for this study.
Author Contributions: Rebika Rai: Conceptualization, Visualization, Methodology, Writing—Original Draft Preparation. Totan Bharasa: Investigation, Software, Data Curation, Visualization, Writing—Reviewing and Editing. Arunita Das: Conceptualization, Visualization, Data Curation, Software, Writing—Original Draft Preparation. Krishna Gopal Dhal: Conceptualization, Methodology, Software, Visualization, Investigation, Supervision, Writing—Reviewing and Editing. All authors reviewed and approved the final version of the manuscript.
Availability of Data and Materials: Not applicable.
Ethics Approval: Not applicable.
Conflicts of Interest: The authors declare no conflicts of interest.
References
1. Glover F. Future paths for integer programming and links to artificial intelligence. Comput Oper Res. 1986;13(5):533–49. doi:10.1016/0305-0548(86)90048-1. [Google Scholar] [CrossRef]
2. Yang XS. Metaheuristic optimization. Scholarpedia. 2011;6(8):11472. doi:10.4249/scholarpedia.11472. [Google Scholar] [CrossRef]
3. Rai R, Das A, Dhal KG. Nature-inspired optimization algorithms and their significance in multi-thresholding image segmentation: an inclusive review. Evol Syst. 2022;13(6):889–945. doi:10.1007/s12530-022-09425-5. [Google Scholar] [PubMed] [CrossRef]
4. Wolpert DH, Macready WG. No free lunch theorems for optimization. IEEE Trans Evol Computat. 1997;1(1):67–82. doi:10.1109/4235.585893. [Google Scholar] [CrossRef]
5. Rai R, Dhal KG, Das A, Ray S. An inclusive survey on marine predators algorithm: variants and applications. Arch Comput Meth Eng. 2023;30(5):3133–72. doi:10.1007/s11831-023-09897-x. [Google Scholar] [PubMed] [CrossRef]
6. Dorigo M, Di Caro G. Ant colony optimization: a new meta-heuristic. In: Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406); 1999 Jul 6–9; Washington, DC, USA. p. 1470–7. doi:10.1109/cec.1999.782657. [Google Scholar] [CrossRef]
7. Kennedy J, Eberhart R. Particle swarm optimization. In: Proceedings of ICNN’95-International Conference on Neural Networks; 1995 Nov 27–Dec 1; Perth, WA, Australia. p. 1942–8. [Google Scholar]
8. Faramarzi A, Heidarinejad M, Mirjalili S, Gandomi AH. Marine predators algorithm: a nature-inspired metaheuristic. Expert Syst Appl. 2020;152(4):113377. doi:10.1016/j.eswa.2020.113377. [Google Scholar] [CrossRef]
9. Yang XS, He X. Firefly algorithm: recent advances and applications. Int J Swarm Intell. 2013;1(1):36. doi:10.1504/ijsi.2013.055801. [Google Scholar] [CrossRef]
10. Ray S, Dhal KG, Naskar PK. Rough cuckoo search: a novel mathematics based optimization approach based on rough set. Pattern Recognit Image Anal. 2022;32(1):228–47. doi:10.1134/S1054661822010084. [Google Scholar] [CrossRef]
11. Hashim FA, Hussien AG. Snake optimizer: a novel meta-heuristic optimization algorithm. Knowl Based Syst. 2022;242(10):108320. doi:10.1016/j.knosys.2022.108320. [Google Scholar] [CrossRef]
12. Wei Z, Huang C, Wang X, Han T, Li Y. Nuclear reaction optimization: a novel and powerful physics-based algorithm for global optimization. IEEE Access. 2019;7:66084–109. doi:10.1109/access.2019.2918406. [Google Scholar] [CrossRef]
13. Zhao W, Wang L, Zhang Z. Atom search optimization and its application to solve a hydrogeologic parameter estimation problem. Knowl Based Syst. 2019;163:283–304. doi:10.1016/j.knosys.2018.08.030. [Google Scholar] [CrossRef]
14. Lam AYS, Li VOK. Chemical reaction optimization: a tutorial. Memet Comput. 2012;4(1):3–17. doi:10.1007/s12293-012-0075-1. [Google Scholar] [CrossRef]
15. Formato RA. Central force optimization. Prog Electromagn Res. 2007;77(1):425–91. [Google Scholar]
16. Kaveh A, Dadras A. A novel meta-heuristic optimization algorithm: thermal exchange optimization. Adv Eng Softw. 2017;110:69–84. doi:10.1016/j.advengsoft.2017.03.014. [Google Scholar] [CrossRef]
17. Askari Q, Younas I, Saeed M. Political optimizer: a novel socio-inspired meta-heuristic for global optimization. Knowl Based Syst. 2020;195(5):105709. doi:10.1016/j.knosys.2020.105709. [Google Scholar] [CrossRef]
18. Shabani A, Asgarian B, Salido M, Asil Gharebaghi S. Search and rescue optimization algorithm: a new optimization method for solving constrained engineering optimization problems. Expert Syst Appl. 2020;161:113698. doi:10.1016/j.eswa.2020.113698. [Google Scholar] [CrossRef]
19. Rao RV, Savsani VJ, Vakharia DP. Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems. Comput Aided Des. 2011;43(3):303–15. doi:10.1016/j.cad.2010.12.015. [Google Scholar] [CrossRef]
20. Mousavirad SJ, Ebrahimpour-Komleh H. Human mental search: a new population-based metaheuristic optimization algorithm. Appl Intell. 2017;47(3):850–87. doi:10.1007/s10489-017-0903-6. [Google Scholar] [CrossRef]
21. Moosavian N, Kasaee Roodsari B. Soccer league competition algorithm: a novel meta-heuristic algorithm for optimal design of water distribution networks. Swarm Evol Comput. 2014;17(4):14–24. doi:10.1016/j.swevo.2014.02.002. [Google Scholar] [CrossRef]
22. Cai W, Yang W, Chen X. A global optimization algorithm based on plant growth theory: plant growth optimization. In: Proceedings of the 2008 International Conference on Intelligent Computation Technology and Automation (ICICTA); 2008 Oct 20–22; Changsha, China. p. 1194–9. doi:10.1109/icicta.2008.416. [Google Scholar] [CrossRef]
23. Zhang H, Zhu Y, Chen H. Root growth model: a novel approach to numerical function optimization and simulation of plant root system. Soft Comput. 2014;18(3):521–37. doi:10.1007/s00500-013-1073-z. [Google Scholar] [CrossRef]
24. Salhi A, Fraga ES. Nature-inspired optimisation approaches and the new plant propagation algorithm. In: Proceedings of the ICeMATH 2011; 2011 Jun 6–8; Yogyakarta, Indonesia. [Google Scholar]
25. Labbi Y, Ben Attous D, Gabbar HA, Mahdad B, Zidan A. A new rooted tree optimization algorithm for economic dispatch with valve-point effect. Int J Electr Power Energy Syst. 2016;79(8):298–311. doi:10.1016/j.ijepes.2016.01.028. [Google Scholar] [CrossRef]
26. Kong X, Chen YL, Xie W, Wu X. A novel paddy field algorithm based on pattern search method. In: Proceedings of the 2012 IEEE International Conference on Information and Automation; 2012 Jun 6–8; Shenyang, China. p. 686–90. doi:10.1109/icinfa.2012.6246764. [Google Scholar] [CrossRef]
27. Dixit M, Upadhyay N, Silakari S. An exhaustive survey on nature inspired optimization algorithms. Int J Softw Eng Its Appl. 2015;9(4):91–104. doi:10.21742/ijdcasd.2014.1.1.02. [Google Scholar] [CrossRef]
28. Soni V, Sharma A, Singh V. A critical review on nature inspired optimization algorithms. IOP Conf Ser Mater Sci Eng. 2021;1099(1):012055. [Google Scholar]
29. Kumar A, Nadeem M, Banka H. Nature inspired optimization algorithms: a comprehensive overview. Evol Syst. 2023;14(1):141–56. doi:10.1007/s12530-022-09432-6. [Google Scholar] [CrossRef]
30. Vardhini KK, Sitamahalakshmi T. A review on nature-based swarm intelligence optimization techniques and its current research directions. Indian J Sci Technol. 2016;9(10):1–13. doi:10.17485/ijst/2016/v9i10/81634. [Google Scholar] [CrossRef]
31. Houssein EH, Younan M, Hassanien AE. Nature-inspired algorithms: a comprehensive review. In: Hybrid computational intelligence. Boca Raton, FL, USA: CRC Press; 2019. p. 1–25. doi:10.1201/9780429453427-1. [Google Scholar] [CrossRef]
32. Mirjalili S, Mirjalili SM, Lewis A. Grey wolf optimizer. Adv Eng Softw. 2014;69:46–61. doi:10.1016/j.advengsoft.2013.12.007. [Google Scholar] [CrossRef]
33. Mirjalili S, Lewis A. The whale optimization algorithm. Adv Eng Softw. 2016;95(12):51–67. doi:10.1016/j.advengsoft.2016.01.008. [Google Scholar] [CrossRef]
34. Xie L, Han T, Zhou H, Zhang ZR, Han B, Tang A. Tuna swarm optimization: a novel swarm-based metaheuristic algorithm for global optimization. Comput Intell Neurosci. 2021;2021(1):9210050. doi:10.1155/2021/9210050. [Google Scholar] [PubMed] [CrossRef]
35. Sasmal B, Das A, Dhal KG, Saha R, Rai R, Bharasa T, et al. Artificial hummingbird algorithm: theory, variants, analysis, applications, and performance evaluation. Comput Sci Rev. 2025;56:100727. doi:10.1016/j.cosrev.2025.100727. [Google Scholar] [CrossRef]
36. Wang Y, Xin B, Wang Z, Sun J. Improved snake optimizer based on forced switching mechanism and variable spiral search for practical applications problems. Soft Comput. 2025;29(2):803–38. doi:10.1007/s00500-025-10404-6. [Google Scholar] [CrossRef]
37. Zhang SH, Wang JS, Zhang SW, Xing YX, Sui XF. Multi-strategy fusion snake optimizer on task offloading and scheduling for IoT-based fog computing multi-tasks learning. Clust Comput. 2024;28(1):69. doi:10.1007/s10586-024-04766-z. [Google Scholar] [CrossRef]
38. Li Q, Zhou Y, Luo Q. Multi-task snake optimization algorithm for global optimization and planar kinematic arm control problem. Peerj Comput Sci. 2025;11(13):e2688. doi:10.7717/peerj-cs.2688. [Google Scholar] [PubMed] [CrossRef]
39. Song H, Wang J, Bei J, Wang M. Modified snake optimizer based multi-level thresholding for color image segmentation of agricultural diseases. Expert Syst Appl. 2024;255:124624. doi:10.1016/j.eswa.2024.124624. [Google Scholar] [CrossRef]
40. Kong LG, Wang B, Fan DJ, Shi S, Ouyang X, Xu M. Optimize photovoltaic MPPT with improved snake algorithm. Energy Rep. 2024;11:5033–45. doi:10.1016/j.egyr.2024.04.064. [Google Scholar] [CrossRef]
41. Houssein EH, Abdalkarim N, Hussain K, Mohamed E. Accurate multilevel thresholding image segmentation via oppositional Snake Optimization algorithm: real cases with liver disease. Comput Biol Med. 2024;169(1):107922. doi:10.1016/j.compbiomed.2024.107922. [Google Scholar] [PubMed] [CrossRef]
42. Mai C, Zhang L, Hu X. An adaptive snake optimization algorithm incorporating Subtraction-Average-Based Optimizer for photovoltaic cell parameter identification. Heliyon. 2024;10(15):e35382. doi:10.1016/j.heliyon.2024.e35382. [Google Scholar] [PubMed] [CrossRef]
43. Yang B, Li M, Qin R, Luo E, Duan J, Liu B, et al. Extracted power optimization of hybrid wind-wave energy converters array layout via enhanced snake optimizer. Energy. 2024;293(1):130529. doi:10.1016/j.energy.2024.130529. [Google Scholar] [CrossRef]
44. Wang J, Wang Y. Multi-strategy enhanced snake optimizer for quantitative structure-activity relationship modeling. Appl Math Model. 2024;132(2):531–60. doi:10.1016/j.apm.2024.04.057. [Google Scholar] [CrossRef]
45. Nafeh AEA, Omran AEA, Elkholy A, Yousef HM. Optimal economical sizing of a PV-battery grid-connected system for fast charging station of electric vehicles using modified snake optimization algorithm. Results Eng. 2024;21(5):101965. doi:10.1016/j.rineng.2024.101965. [Google Scholar] [CrossRef]
46. Zheng W, Ai Y, Zhang W. Improved snake optimizer using sobol sequential nonlinear factors and different learning strategies and its applications. Mathematics. 2024;12(11):1708. doi:10.3390/math12111708. [Google Scholar] [CrossRef]
47. Jin Z, Li H, Liang J. A node deployment method based on improved snake optimizer for marine disasters. IEEE Sens J. 2024;24(9):15446–56. doi:10.1109/jsen.2024.3379741. [Google Scholar] [CrossRef]
48. Bao X, Kang H, Li H. An improved binary snake optimizer with Gaussian mutation transfer function and hamming distance for feature selection. Neural Comput Appl. 2024;36(16):9567–89. doi:10.1007/s00521-024-09581-6. [Google Scholar] [CrossRef]
49. Sarkhi SMK, Koyuncu H. Optimization strategies for atari game environments: integrating snake optimization algorithm and energy valley optimization in reinforcement learning models. AI. 2024;5(3):1172–91. doi:10.3390/ai5030057. [Google Scholar] [CrossRef]
50. Zheng K, Liu H, Li B. Improved snake optimization algorithm for global optimization and engineering applications. Sci Rep. 2025;15(1):18171. doi:10.1038/s41598-025-01299-2. [Google Scholar] [PubMed] [CrossRef]
51. Liu P, Sun N, Wan H, Zhang C, Zhao J, Wang G. Improved adaptive snake optimization algorithm with application to multi-UAV path planning. Trans Inst Meas Control. 2025;47(8):1639–50. doi:10.1177/01423312241263637. [Google Scholar] [CrossRef]
52. Liao Y, Wang Y. Source localization using TDOA based on improved snake optimizer. Circuits Syst Signal Process. 2024;43(8):5237–61. doi:10.1007/s00034-024-02703-4. [Google Scholar] [CrossRef]
53. Yang L, Zhang D, Li L, He Q. Energy efficient cluster-based routing protocol for WSN using multi-strategy fusion snake optimizer and minimum spanning tree. Sci Rep. 2024;14(1):16786. doi:10.1038/s41598-024-66703-9. [Google Scholar] [PubMed] [CrossRef]
54. Damera VK, Vanitha G, Indira B, Sirisha G, Vatambeti R. Improved snake optimization-based task scheduling in cloud computing. Computing. 2024;106(10):3353–85. doi:10.1007/s00607-024-01323-9. [Google Scholar] [CrossRef]
55. Zhao M, Zhou X. Multi-step short-term wind power prediction model based on CEEMD and improved snake optimization algorithm. IEEE Access. 2024;12:50755–78. doi:10.1109/access.2024.3385643. [Google Scholar] [CrossRef]
56. Zhang SH, Wang JS, Zhang SW, Li YX, Xing YX, Zhang YH. Snake optimizer with oscillating factors to solve edge computing task unloading and scheduling optimization problem. Alex Eng J. 2024;91(2):273–304. doi:10.1016/j.aej.2024.02.009. [Google Scholar] [CrossRef]
57. Kong Y, Liu Z. Optimization of zinc smelting slag melting point based on catboost and improved snake optimization algorithm. Appl Sci. 2024;14(11):4603. doi:10.3390/app14114603. [Google Scholar] [CrossRef]
58. Ghamari SM, Hajihosseini M, Habibi D, Aziz A. Design of an adaptive robust PI controller for DC/DC boost converter using reinforcement-learning technique and snake optimization algorithm. IEEE Access. 2024;12:141814–29. doi:10.1109/access.2024.3440580. [Google Scholar] [CrossRef]
59. Mai C, Zhang L, Hu X. Combining dynamic adaptive snake algorithm with perturbation and observation for MPPT in PV systems under shading conditions. Appl Soft Comput. 2024;162:111822. doi:10.1016/j.asoc.2024.111822. [Google Scholar] [CrossRef]
60. Zhou T, Chen Z, Jiao J. Quadrotor attitude control by improved snake optimizer based adaptive switching disturbance rejection approach. Meas Sci Technol. 2024;35(7):076203. doi:10.1088/1361-6501/ad37d0. [Google Scholar] [CrossRef]
61. Wang Y, Yao Y, Zou Q, Zhao K, Hao Y. Forecasting a short-term photovoltaic power model based on improved snake optimization, convolutional neural network, and bidirectional long short-term memory network. Sensors. 2024;24(12):3897. doi:10.3390/s24123897. [Google Scholar] [PubMed] [CrossRef]
62. Zhi L, Huang M, Qian L, Wang Z, Wen Q, Han W. Research on active disturbance rejection control with parameter autotuning for a moving mirror control system based on improved snake optimization. Electronics. 2024;13(9):1650. doi:10.3390/electronics13091650. [Google Scholar] [CrossRef]
63. Singh V, Kaushik VD. Adaptive snake optimization-enabled deep learning-based multi-classification using leaf images. Signal Image Video Process. 2024;18(4):3043–52. doi:10.1007/s11760-023-02969-2. [Google Scholar] [CrossRef]
64. Agrawal N, Khan FA, Mahapatra S. Next generation heffron-phillips model for damping power system oscillations based on a novel meta-heuristic snake optimization algorithm. Sci Technol Asia. 2024;29(1):160–81. doi:10.1109/odicon62106.2024.10797541. [Google Scholar] [CrossRef]
65. Lu H, Zhan H, Wang T. A multi-strategy improved snake optimizer and its application to SVM parameter selection. Math Biosci Eng. 2024;21(10):7297–336. doi:10.3934/mbe.2024322. [Google Scholar] [PubMed] [CrossRef]
66. Zhang X, Gao X, Bu X, An J. An improved snake optimization algorithm for sparse conformal array beamforming. IEEE Trans Veh Technol. 2024;73(6):8542–8. doi:10.1109/tvt.2024.3361454. [Google Scholar] [CrossRef]
67. Kaliraj S, Sivakumar V, Premkumar N, Vatchala S. Snake swarm optimization-based deep reinforcement learning for resource allocation in edge computing environment. Concurrency and Computation. 2024;36(18):e8130. doi:10.1002/cpe.8130. [Google Scholar] [CrossRef]
68. Mohammed KK, Mekhilef S. Improved snake optimizer algorithm-based GMPPT with a fast response to the load variations under different weather conditions for PV systems. IEEE Trans Ind Electron. 2024;71(7):7147–57. doi:10.1109/tie.2023.3301526. [Google Scholar] [CrossRef]
69. Zheng W, Pang S, Liu N, Chai Q, Xu L. A compact snake optimization algorithm in the application of WKNN fingerprint localization. Sensors. 2023;23(14):6282. doi:10.3390/s23146282. [Google Scholar] [PubMed] [CrossRef]
70. Braik MS, Hammouri AI, Awadallah MA, Al-Betar MA, Alzubi OA. Improved versions of snake optimizer for feature selection in medical diagnosis: a real case COVID-19. Soft Comput. 2023;27(23):17833–65. doi:10.1007/s00500-023-09062-3. [Google Scholar] [CrossRef]
71. Abu Khurma R, Alhenawi E, Braik M, Hashim FA, Chhabra A, Castillo PA. A bio-medical snake optimizer system driven by logarithmic surviving global search for optimizing feature selection and its application for disorder recognition. J Comput Des Eng. 2023;10(6):2361–83. doi:10.1093/jcde/qwad101. [Google Scholar] [CrossRef]
72. Yıldızdan G. Chaotic snake optimizer. Afyon Kocatepe Üniversitesi Fen Ve Mühendislik Bilim Derg. 2023;23(5):1122–41. [Google Scholar]
73. Diao Q, Chan WH, Zain AM, Junaidi A, Yang H. An improved algorithm based on snake optimizer. Proc Comput Sci. 2023;1:17. doi:10.55092/pcs2023020017. [Google Scholar] [CrossRef]
74. Belabbes F, Cotfas DT, Cotfas PA, Medles M. Using the snake optimization metaheuristic algorithms to extract the photovoltaic cells parameters. Energy Convers Manag. 2023;292(2):117373. doi:10.1016/j.enconman.2023.117373. [Google Scholar] [CrossRef]
75. Yao L, Yuan P, Tsai CY, Zhang T, Lu Y, Ding S. ESO: an enhanced snake optimizer for real-world engineering problems. Expert Syst Appl. 2023;230:120594. doi:10.1016/j.eswa.2023.120594. [Google Scholar] [CrossRef]
76. Wang C, Jiao S, Li Y, Zhang Q. Capacity optimization of a hybrid energy storage system considering wind-solar reliability evaluation based on a novel multi-strategy snake optimization algorithm. Expert Syst Appl. 2023;231(12):120602. doi:10.1016/j.eswa.2023.120602. [Google Scholar] [CrossRef]
77. Li H, Xu G, Chen B, Huang S, Xia Y, Chai S. Dual-mutation mechanism-driven snake optimizer for scheduling multiple budget constrained workflows in the cloud. Appl Soft Comput. 2023;149(4):110966. doi:10.1016/j.asoc.2023.110966. [Google Scholar] [CrossRef]
78. Yan C, Razmjooy N. Optimal lung cancer detection based on CNN optimized and improved Snake optimization algorithm. Biomed Signal Process Control. 2023;86(10):105319. doi:10.1016/j.bspc.2023.105319. [Google Scholar] [CrossRef]
79. Li Y, Tang B, Jiao S, Su Q. Snake optimization-based variable-step multiscale single threshold slope entropy for complexity analysis of signals. IEEE Trans Instrum Meas. 2023;72:1–13. doi:10.1109/tim.2023.3317908. [Google Scholar] [CrossRef]
80. Li S, Ye L. Multi-level thresholding image segmentation for rubber tree secant using improved Otsu’s method and snake optimizer. Math Biosci Eng. 2023;20(6):9645–69. doi:10.3934/mbe.2023423. [Google Scholar] [PubMed] [CrossRef]
81. Shi K, Zhang M, He Z, Yin S, Ai Z, Pan N. Scheduling of multi-AGV systems in automated electricity meter verification workshops based on an improved snake optimization algorithm. Symmetry. 2023;15(11):2034. doi:10.3390/sym15112034. [Google Scholar] [CrossRef]
82. Khurma RA, Alazab M, Merelo JJ, Castillo PA. New evolutionary selection operators for snake optimizer. In: Proceedings of the 14th International Conference on Evolutionary Computation Theory and Applications; 2022 Oct 24–26; Valletta, Malta. p. 82–90. [Google Scholar]
83. Dai Y, Pang J, Li Z, Li W, Wang Q, Li S. Modeling of thermal error electric spindle based on KELM ameliorated by snake optimization. Case Stud Therm Eng. 2022;40(1):102504. doi:10.1016/j.csite.2022.102504. [Google Scholar] [CrossRef]
84. Liu X, Tian M, Zhou J, Liang J. An efficient coverage method for SEMWSNs based on adaptive chaotic Gaussian variant snake optimization algorithm. Math Biosci Eng. 2022;20(2):3191–215. doi:10.3934/mbe.2023150. [Google Scholar] [PubMed] [CrossRef]
85. Guan Q, Liu Q, Tao S, Xu Y, Zhou D, Chen H, et al. Snake optimizer improved variational mode decomposition for short-term prediction of vehicle charging loads. IEEE Open J Power Energy. 2025;12:76–87. doi:10.1109/oajpe.2025.3529944. [Google Scholar] [CrossRef]
86. Puri D, Kachare P, Khare S, Al-Shourbaji I, Jabbari A, Alameen A. Hybrid reptile-snake optimizer based channel selection for enhancing Alzheimer’s disease detection. J Bionic Eng. 2025;22(2):884–900. doi:10.1007/s42235-024-00636-x. [Google Scholar] [CrossRef]
87. Bölükbaş O, Haber Z, Uğuz H. The performance evolution of the new scatter search snake optimization algorithm for feature selection problems. Arab J Sci Eng. 2025;50(19):15931–49. doi:10.1007/s13369-025-10015-1. [Google Scholar] [CrossRef]
88. Fan Q, Zhang X, Wen Z, Xu L, Zhang Q. Nonlinear compensation of the linear variable differential transducer using an advanced snake optimization integrated with tangential functional link artificial neural network. Sensors. 2025;25(4):1074. doi:10.3390/s25041074. [Google Scholar] [PubMed] [CrossRef]
89. Duraibi S. Enhanced image-based malware classification using snake optimization algorithm with deep convolutional neural network. IEEE Access. 2024;12(9):95047–57. doi:10.1109/access.2024.3425593. [Google Scholar] [CrossRef]
90. Alawad NA, Abed-alguni BH, El-ibini M. Hybrid snake optimizer algorithm for solving economic load dispatch problem with valve point effect. J Supercomput. 2024;80(13):19274–323. doi:10.1007/s11227-024-06207-5. [Google Scholar] [CrossRef]
91. Ersali C, Hekimoglu B, Yilmaz M, Martinez-Morales AA, Akinci TC. Disturbance rejecting PID-FF controller design of a non-ideal buck converter using an innovative snake optimizer with pattern search algorithm. Heliyon. 2024;10(14):e34448. doi:10.1016/j.heliyon.2024.e34448. [Google Scholar] [PubMed] [CrossRef]
92. Alkahtani HK, Mahgoub H, Alotaibi FA, Othman KM, Allafi R, Salama AS. Design of hybrid snake optimizer based route selection approach for unmanned aerial vehicles communication. IEEE Access. 2024;12:54426–34. doi:10.1109/access.2024.3383031. [Google Scholar] [CrossRef]
93. Jithendra T, Basha SS, Das R. Modelling atmospheric pressure through the hybridization of an ANFIS using IOWA and a snake optimizer. Model Earth Syst Environ. 2024;10(3):4475–95. doi:10.1007/s40808-024-02015-1. [Google Scholar] [CrossRef]
94. Amor N, Tayyab Noman M, Petru M, Sebastian N, Balram D. Machining performance of TiO2 embedded-glass fiber reinforced composites with snake optimizer. Measurement. 2024;227:114253. doi:10.1016/j.measurement.2024.114253. [Google Scholar] [CrossRef]
95. Fan Q, Zhang K, Xu L, Zhang Q. Multi-strategy advanced snake optimizer-based optimal feedback control of half vehicle suspension. IEEE Access. 2024;12:125681–96. doi:10.1109/access.2024.3445874. [Google Scholar] [CrossRef]
96. Wang Z, Duan J, Xing P. Multi-hop clustering and routing protocol based on enhanced snake optimizer and golden jackal optimization in WSNs. Sensors. 2024;24(4):1348. doi:10.3390/s24041348. [Google Scholar] [PubMed] [CrossRef]
97. Agrawal N, Khan FA, Gowda M. Robust design of damping controller for power system using a combination of snake optimisation algorithm and optimal control theory. Int J Energy Technol Policy. 2024;19(1/2):171–215. doi:10.1504/ijetp.2024.138547. [Google Scholar] [CrossRef]
98. Ablin R, Prabin G. Gated graph attention-based crossover snake (GGA-CS) algorithm for hyperspectral image classification. Ann Data Sci. 2025;12(1):281–305. doi:10.1007/s40745-024-00567-8. [Google Scholar] [CrossRef]
99. Al-Qazzaz AS, Salehpour P, Aghdasi HS. Robust deepfake face detection leveraging xception model and novel snake optimization technique. J Robot Control (JRC). 2024;5(5):1444–56. doi:10.23919/indiacom66777.2025.11115841. [Google Scholar] [CrossRef]
100. Chen S, Cao J, Wan Y, Shi X, Huang W. Enhancing rutting depth prediction in asphalt pavements: a synergistic approach of extreme gradient boosting and snake optimization. Constr Build Mater. 2024;421(4):135726. doi:10.1016/j.conbuildmat.2024.135726. [Google Scholar] [CrossRef]
101. Aljebreen M, Mengash HA, Arasi MA, Aljameel SS, Salama AS, Hamza MA. Enhancing DDoS attack detection using snake optimizer with ensemble learning on Internet of Things environment. IEEE Access. 2023;11:104745–53. doi:10.1109/access.2023.3318316. [Google Scholar] [CrossRef]
102. Wang L, Fan G, Wang Q, Li H, Huo J, Wei S, et al. Snake optimizer LSTM-based UWB positioning method for unmanned crane. PLoS One. 2023;18(11):e0293618. doi:10.1371/journal.pone.0293618. [Google Scholar] [PubMed] [CrossRef]
103. Ismail WN. Snake-efficient feature selection-based framework for precise early detection of chronic kidney disease. Diagnostics. 2023;13(15):2501. doi:10.3390/diagnostics13152501. [Google Scholar] [PubMed] [CrossRef]
104. Samiayya D, Radhika S, Chandrasekar A. An optimal model for enhancing network lifetime and cluster head selection using hybrid snake whale optimization. Peer Peer Netw Appl. 2023;16(4):1959–74. doi:10.1007/s12083-023-01487-9. [Google Scholar] [CrossRef]
105. Masood A, Hameed MM, Srivastava A, Pham QB, Ahmad K, Razali SFM, et al. Improving PM2.5 prediction in New Delhi using a hybrid extreme learning machine coupled with snake optimization algorithm. Sci Rep. 2023;13(1):21057. doi:10.1038/s41598-023-47492-z. [Google Scholar] [PubMed] [CrossRef]
106. Jiang H, Li M, Fathi G. Optimal load demand forecasting in air conditioning using deep belief networks optimized by an improved version of snake optimization algorithm. IET Renew Power Gener. 2023;17(12):3011–24. doi:10.1049/rpg2.12819. [Google Scholar] [CrossRef]
107. Kassem AA. Snake optimization with deep learning enabled disease detection model for colorectal cancer. J Smart Internet Things. 2023;2022(1):178–95. doi:10.2478/jsiot-2022-0012. [Google Scholar] [CrossRef]
108. Al-Shourbaji I, Kachare PH, Alshathri S, Duraibi S, Elnaim B, Abd Elaziz M. An efficient parallel reptile search algorithm and snake optimizer approach for feature selection. Mathematics. 2022;10(13):2351. doi:10.3390/math10132351. [Google Scholar] [CrossRef]
109. Fu H, Shi H, Xu Y, Shao J. Research on gas outburst prediction model based on multiple strategy fusion improved snake optimization algorithm with temporal convolutional network. IEEE Access. 2022;10:117973–84. doi:10.1109/access.2022.3220765. [Google Scholar] [CrossRef]
110. Rawa M. Towards avoiding cascading failures in transmission expansion planning of modern active power systems using hybrid snake-sine cosine optimization algorithm. Mathematics. 2022;10(8):1323. doi:10.3390/math10081323. [Google Scholar] [CrossRef]
111. Gao L, Liu Z. An integrated external archive local disturbance mechanism for multi-objective snake optimizer. Chin J Elect. 2024;33(4):989–96. doi:10.23919/cje.2023.00.023. [Google Scholar] [CrossRef]
112. Li Q, Ma Q, Weng X. Dynamic path planning for mobile robots based on artificial potential field enhanced improved multiobjective snake optimization (APF-IMOSO). J Field Robot. 2024;41(6):1843–63. doi:10.1002/rob.22354. [Google Scholar] [CrossRef]
113. Abu Khurma R, Albashish D, Braik M, Alzaqebah A, Qasem A, Adwan O. An augmented Snake Optimizer for diseases and COVID-19 diagnosis. Biomed Signal Process Control. 2023;84(3):104718. doi:10.1016/j.bspc.2023.104718. [Google Scholar] [PubMed] [CrossRef]
114. Agrawal N, Mahapatra S, Khan FA. Robust design of damping controller for power system with snake optimization algorithm. Int J Syst Assur Eng Manag. 2025;16(3):1256–86. doi:10.1007/s13198-025-02708-5. [Google Scholar] [CrossRef]
115. Mohammadi F, Kaffash A, Donyagozashteh Z, Marasi M, Tavakoli M. Design of a novel robust adaptive backstepping controller optimized by snake algorithm for buck-boost converter. IET Control Theory Appl. 2025;19(1):e12770. doi:10.1049/cth2.12770. [Google Scholar] [CrossRef]
116. Rehiara AB, Bawan EK, Palintin AD, Wihyawari BR, Paisey FYS, Pasalli YR. Snake Optimization based optimal power flow. Int J Intell Eng Syst. 2024;17(5):683–93. [Google Scholar]
117. Heidari MH, Abdi H, Moradi M. Solving the combined heat and power economic dispatch problem considering power losses by applying the snake optimization. J Energy Manag Technol. 2024;8(2):78–92. doi:10.1016/j.seta.2022.102512. [Google Scholar] [CrossRef]
118. Rameshar V, Sharma G, Bokoro PN, Çelik E. Frequency support studies of a diesel-wind generation system using snake optimizer-oriented PID with UC and RFB. Energies. 2023;16(8):3417. doi:10.3390/en16083417. [Google Scholar] [CrossRef]
119. Janjanam L, Saha SK, Kar R. Optimal design of Hammerstein cubic spline filter for nonlinear system modeling based on snake optimizer. IEEE Trans Ind Electron. 2023;70(8):8457–67. doi:10.1109/tie.2022.3213886. [Google Scholar] [CrossRef]
120. Yousri R, Elbayoumi M, Soltan A, Darweesh MS. A power-aware task scheduler for energy harvesting-based wearable biomedical systems using snake optimizer. Analog Integr Circuits Signal Process. 2023;115(2):183–94. doi:10.1007/s10470-023-02154-y. [Google Scholar] [CrossRef]
121. Çelik E, Karayel M. Effective speed control of brushless DC motor using cascade 1PDf-PI controller tuned by snake optimizer. Neural Comput Appl. 2024;36(13):7439–54. doi:10.1007/s00521-024-09470-y. [Google Scholar] [CrossRef]
122. Ji H, Huang K, Mo C. Research on the application of variational mode decomposition optimized by snake optimization algorithm in rolling bearing fault diagnosis. Shock Vib. 2024;2024(1):5549976. doi:10.1155/2024/5549976. [Google Scholar] [CrossRef]
123. Pang Y, Li F, Qian H, Liu X, Yao Y. A snake optimization algorithm-based power system inertia estimation method considering the effects of transient frequency and voltage changes. Energies. 2024;17(17):4430. doi:10.3390/en17174430. [Google Scholar] [CrossRef]
124. Cheng R, Qiao Z, Li J, Huang J. Traffic signal timing optimization model based on video surveillance data and snake optimization algorithm. Sensors. 2023;23(11):5157. doi:10.3390/s23115157. [Google Scholar] [PubMed] [CrossRef]
125. Tarek Z, Ali Alhussan A, Khafaga DS, El-Kenawy EM, Elshewey AM. A snake optimization algorithm-based feature selection framework for rapid detection of cardiovascular disease in its early stages. Biomed Signal Process Control. 2025;102:107417. doi:10.1016/j.bspc.2024.107417. [Google Scholar] [CrossRef]
126. Taher SSH, Ameen SY, Ahmed JA. Enhancing blockchain scalability with snake optimization algorithm: a novel approach. Front Blockchain. 2024;7:1361659. doi:10.3389/fbloc.2024.1361659. [Google Scholar] [CrossRef]
127. Liu Q, Wang P, Sun J, Li R, Li Y. Wireless channel prediction of GRU based on experience replay and snake optimizer. Sensors. 2023;23(14):6270. doi:10.3390/s23146270. [Google Scholar] [PubMed] [CrossRef]
128. Ray S, Parai S, Das A, Dhal KG, Naskar PK. Cuckoo search with differential evolution mutation and masi entropy for multi-level image segmentation. Multimed Tools Appl. 2022;81(3):4073–117. doi:10.1007/s11042-021-11633-1. [Google Scholar] [CrossRef]
129. Martin DR, Fowlkes CC, Malik J. Learning to detect natural image boundaries using local brightness, color, and texture cues. IEEE Trans Pattern Anal Mach Intell. 2004;26(5):530–49. doi:10.1109/tpami.2004.1273918. [Google Scholar] [PubMed] [CrossRef]
130. Dhal KG, Rai R, Das A, Ghosh TK. Hybridization of sine-cosine algorithm with K-means for pathology image clustering. In: Artificial intelligence. Cham, Switzerland: Springer Nature; 2022. p. 76–86. doi:10.1007/978-3-031-22485-0_8. [Google Scholar] [CrossRef]
131. Sasmal B, Hussien AG, Das A, Dhal KG. A comprehensive survey on Aquila optimizer. Arch Comput Meth Eng. 2023;30(7):4449–76. doi:10.1007/s11831-023-09945-6. [Google Scholar] [PubMed] [CrossRef]
132. Dhal KG, Sasmal B, Das A, Ray S, Rai R. A comprehensive survey on arithmetic optimization algorithm. Arch Comput Meth Eng. 2023;30(5):3379–404. doi:10.1007/s11831-023-09902-3. [Google Scholar] [PubMed] [CrossRef]
133. Yuan C, Zhao D, Heidari AA, Liu L, Chen Y, Wu Z, et al. Artemisinin optimization based on malaria therapy: algorithm and applications to medical image segmentation. Displays. 2024;84(5):102740. doi:10.1016/j.displa.2024.102740. [Google Scholar] [CrossRef]
134. Wang J, Wang W-C, Hu X-X, Qiu L, Zang H-F. Black-winged kite algorithm: a nature-inspired meta-heuristic for solving benchmark functions and engineering problems. Artif Intell Rev. 2024;57(4):98. doi:10.1007/s10462-024-10723-4. [Google Scholar] [CrossRef]
135. Abdel-Basset M, Mohamed R, Abouhawwash M. Crested Porcupine Optimizer: a new nature-inspired metaheuristic. Knowl Based Syst. 2024;284(1):111257. doi:10.1016/j.knosys.2023.111257. [Google Scholar] [CrossRef]
136. Agushaka JO, Ezugwu AE, Abualigah L. Dwarf mongoose optimization algorithm. Comput Meth Appl Mech Eng. 2022;391:114570. doi:10.1016/j.cma.2022.114570. [Google Scholar] [CrossRef]
137. Lian J, Zhu T, Ma L, Wu X, Heidari AA, Chen Y, et al. The educational competition optimizer. Int J Syst Sci. 2024;55(15):3185–222. doi:10.1080/00207721.2024.2367079. [Google Scholar] [CrossRef]
138. Qi A, Zhao D, Heidari AA, Liu L, Chen Y, Chen H. FATA: an efficient optimization method based on geophysics. Neurocomputing. 2024;607(5):128289. doi:10.1016/j.neucom.2024.128289. [Google Scholar] [CrossRef]
139. Oladejo SO, Ekwe SO, Mirjalili S. The Hiking Optimization Algorithm: a novel human-based metaheuristic approach. Knowl Based Syst. 2024;296(4):111880. doi:10.1016/j.knosys.2024.111880. [Google Scholar] [CrossRef]
140. Zheng B, Chen Y, Wang C, Heidari AA, Liu L, Chen H. The moss growth optimization (MGO): concepts and performance. J Comput Des Eng. 2024;11(5):184–221. doi:10.1093/jcde/qwae080. [Google Scholar] [CrossRef]
141. Yuan C, Zhao D, Heidari AA, Liu L, Chen Y, Chen H. Polar lights optimizer: algorithm and applications in image segmentation and feature selection. Neurocomputing. 2024;607(7):128427. doi:10.1016/j.neucom.2024.128427. [Google Scholar] [CrossRef]
142. Al-Baik O, Alomari S, Alssayed O, Gochhait S, Leonova I, Dutta U, et al. Pufferfish optimization algorithm: a new bio-inspired metaheuristic algorithm for solving optimization problems. Biomimetics. 2024;9(2):65. doi:10.3390/biomimetics9020065. [Google Scholar] [PubMed] [CrossRef]
143. Lian J, Hui G, Ma L, Zhu T, Wu X, Heidari AA, et al. Parrot optimizer: algorithm and applications to medical problems. Comput Biol Med. 2024;172(5):108064. doi:10.1016/j.compbiomed.2024.108064. [Google Scholar] [PubMed] [CrossRef]
144. Sasmal B, Hussien AG, Das A, Dhal KG, Saha R. Reptile search algorithm: theory, variants, applications, and performance evaluation. Arch Comput Meth Eng. 2024;31(1):521–49. doi:10.1007/s11831-023-09990-1. [Google Scholar] [CrossRef]
145. Dhal KG, Das A, Gálvez J, Ray S, Das S. An overview on nature-inspired optimization algorithms and their possible application in image processing domain. Pattern Recognit Image Anal. 2020;30(4):614–31. doi:10.1134/S1054661820040100. [Google Scholar] [CrossRef]
146. Dhal KG, Das A, Bharasa T, Sasmal B, Saha R. A comprehensive survey on Runge Kutta optimizer. Arch Comput Meth Eng. 2025;35(5):4099. doi:10.1007/s11831-025-10432-3. [Google Scholar] [CrossRef]
147. Rodriguez A, Laio A. Clustering by fast search and find of density peaks. Science. 2014;344(6191):1492–6. doi:10.1126/science.1242072. [Google Scholar] [CrossRef]
148. García S, Fernández A, Luengo J, Herrera F. Advanced nonparametric tests for multiple comparisons in the design of experiments in computational intelligence and data mining: experimental analysis of power. Inf Sci. 2010;180(10):2044–64. doi:10.1016/j.ins.2009.12.010. [Google Scholar] [CrossRef]
149. Das A, Rai R, Sasmal B, Dhal KG, Abu Khurma R, Saha R. Metaheuristic algorithms since 2020: development, taxonomy, analysis, and applications. Arch Comput Meth Eng. 2025;267(1):66. doi:10.1007/s11831-025-10408-3. [Google Scholar] [CrossRef]
Copyright © 2026 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.