Open Access

ARTICLE

Several Improved Models of the Mountain Gazelle Optimizer for Solving Optimization Problems

Farhad Soleimanian Gharehchopogh*, Keyvan Fattahi Rishakan

Department of Computer Engineering, Ur., C., Islamic Azad University, Urmia, Iran

* Corresponding Author: Farhad Soleimanian Gharehchopogh

Computer Modeling in Engineering & Sciences 2026, 146(1), 24. https://doi.org/10.32604/cmes.2025.073808

Abstract

Optimization algorithms are crucial for solving NP-hard problems in engineering and computational sciences. Metaheuristic algorithms, in particular, have proven highly effective in complex optimization scenarios characterized by high dimensionality and intricate variable relationships. The Mountain Gazelle Optimizer (MGO) is notably effective but struggles to balance local search refinement and global space exploration, often leading to premature convergence and entrapment in local optima. This paper presents the Improved MGO (IMGO), which integrates three synergistic enhancements: dynamic chaos mapping using piecewise chaotic sequences to boost exploration diversity; Opposition-Based Learning (OBL) with adaptive, diversity-driven activation to speed up convergence; and structural refinements to the position update mechanisms to enhance exploitation. IMGO underwent a comprehensive evaluation on 52 standardized benchmark functions and seven engineering optimization problems. Benchmark evaluations showed that IMGO achieved the highest rank in best solution quality for 31 functions, the highest rank in mean performance for 18 functions, and the highest rank in worst-case performance for 14 functions among 11 competing algorithms. Statistical validation using Wilcoxon signed-rank tests confirmed that IMGO outperformed individual competitors on 16 to 50 functions, depending on the algorithm, while Friedman ranking analysis gave IMGO an average rank of 4.15, compared with the baseline MGO's 4.38, establishing the best overall performance. The evaluation of the engineering problems revealed consistent improvements, including an optimal cost of 1.6896 for the welded beam design vs. MGO's 1.7249, a minimum cost of 5885.33 for the pressure vessel design vs. MGO's 6300, and a minimum weight of 2964.52 kg for the speed reducer design vs. MGO's 2990.00 kg. Ablation studies identified OBL as the strongest individual contributor, whereas the complete integration achieved superior performance through synergistic interactions among the components. Computational complexity analysis established a time complexity of O(T × N × 5 × f(P)), a 1.25× increase in fitness evaluations relative to the baseline MGO, validating a favorable accuracy-efficiency trade-off for practical optimization applications.
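For readers unfamiliar with the two main ingredients named in the abstract, the Python sketch below illustrates the general form of a piecewise chaotic map and an opposition-based learning step as they are commonly used in metaheuristics. It is a minimal illustrative sketch, not the authors' exact IMGO implementation: the function names, the control parameter p = 0.4, the chaotic seed, and the toy sphere objective are assumptions chosen for this example.

import numpy as np

def piecewise_map(x, p=0.4):
    """Standard piecewise chaotic map on [0, 1); p in (0, 0.5) is the control parameter."""
    if x < p:
        return x / p
    elif x < 0.5:
        return (x - p) / (0.5 - p)
    elif x < 1.0 - p:
        return (1.0 - p - x) / (0.5 - p)
    else:
        return (1.0 - x) / p

def chaotic_population(n, dim, lb, ub, p=0.4, seed=0.7):
    """Initialize a population from a piecewise chaotic sequence instead of uniform noise."""
    pop = np.empty((n, dim))
    x = seed
    for i in range(n):
        for j in range(dim):
            x = piecewise_map(x, p)
            pop[i, j] = lb[j] + x * (ub[j] - lb[j])
    return pop

def opposition_step(pop, fitness, obj, lb, ub):
    """Opposition-Based Learning: compare each solution with its opposite point and keep the better one."""
    opposite = lb + ub - pop                          # element-wise opposite solutions
    opp_fit = np.apply_along_axis(obj, 1, opposite)
    improved = opp_fit < fitness                      # minimization: lower is better
    pop[improved] = opposite[improved]
    fitness[improved] = opp_fit[improved]
    return pop, fitness

if __name__ == "__main__":
    # Toy usage on the sphere function (hypothetical test setup, not from the paper).
    dim, n = 5, 20
    lb, ub = np.full(dim, -10.0), np.full(dim, 10.0)
    sphere = lambda v: float(np.sum(v ** 2))
    pop = chaotic_population(n, dim, lb, ub)
    fit = np.apply_along_axis(sphere, 1, pop)
    pop, fit = opposition_step(pop, fit, sphere, lb, ub)
    print("best fitness after one OBL step:", fit.min())

In IMGO, as described in the abstract, the opposition step is not applied unconditionally but is activated adaptively based on population diversity; the unconditional version above is shown only to make the underlying comparison-with-the-opposite idea concrete.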

Keywords

Metaheuristic algorithm; dynamical chaos integration; opposition-based learning; mountain gazelle optimizer; optimization

Cite This Article

APA Style
Gharehchopogh, F. S., & Rishakan, K. F. (2026). Several Improved Models of the Mountain Gazelle Optimizer for Solving Optimization Problems. Computer Modeling in Engineering & Sciences, 146(1), 24. https://doi.org/10.32604/cmes.2025.073808
Vancouver Style
Gharehchopogh FS, Rishakan KF. Several Improved Models of the Mountain Gazelle Optimizer for Solving Optimization Problems. Comput Model Eng Sci. 2026;146(1):24. https://doi.org/10.32604/cmes.2025.073808
IEEE Style
F. S. Gharehchopogh and K. F. Rishakan, “Several Improved Models of the Mountain Gazelle Optimizer for Solving Optimization Problems,” Comput. Model. Eng. Sci., vol. 146, no. 1, pp. 24, 2026. https://doi.org/10.32604/cmes.2025.073808



Copyright © 2026 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.