Open Access

ARTICLE

Med-ReLU: A Parameter-Free Hybrid Activation Function for Deep Artificial Neural Network Used in Medical Image Segmentation

Nawaf Waqas1, Muhammad Islam2,*, Muhammad Yahya3, Shabana Habib4, Mohammed Aloraini2, Sheroz Khan5

1 Department of Instrumentation and Control Engineering, Universiti Kuala Lumpur Malaysian Institute of Industrial Technology (UniKL MITEC), Bandar Seri Alam, Masai, 81750, Johor, Malaysia
2 Department of Electrical Engineering, College of Engineering, Qassim University, Buraydah, 52571, Saudi Arabia
3 Data Science Institute, University of Galway, IDA Business Park, Lower Dangan, Galway, H91 AEX4, Ireland
4 Department of Information Technology, College of Computer, Qassim University, Buraydah, 51452, Saudi Arabia
5 Department of Electrical Engineering, College of Engineering and Information Technology, Onaizah Colleges, Qassim, 56447, Saudi Arabia

* Corresponding Author: Muhammad Islam.

Computers, Materials & Continua 2025, 84(2), 3029-3051. https://doi.org/10.32604/cmc.2025.064660

Abstract

Deep learning (DL), which evolved from Artificial Neural Networks (ANNs), underpins most modern medical image analysis algorithms. DL segmentation models rely on layer-by-layer, convolution-based feature representation, guided by forward and backward propagation. A critical aspect of this process is the selection of an appropriate activation function (AF) to ensure robust model learning. However, existing activation functions often fail to effectively address the vanishing gradient problem or require manual parameter tuning. Moreover, most current research on activation function design focuses on classification tasks using natural image datasets such as MNIST, CIFAR-10, and CIFAR-100. To address this gap, this study proposes Med-ReLU, a novel activation function specifically designed for medical image segmentation. Med-ReLU prevents deep learning models from suffering from dead neurons or vanishing gradients. It is a hybrid activation function that combines the properties of ReLU and Softsign: for positive inputs, Med-ReLU adopts the linear behavior of ReLU to avoid vanishing gradients, while for negative inputs it exhibits Softsign's polynomial convergence, ensuring robust training and avoiding inactive neurons across the training set. The training performance and segmentation accuracy of Med-ReLU have been thoroughly evaluated, demonstrating stable learning behavior and resistance to overfitting. It consistently outperforms state-of-the-art activation functions in medical image segmentation tasks. Designed as a parameter-free function, Med-ReLU is simple to implement in complex deep learning architectures, and its effectiveness extends across various neural network models and anomaly-detection scenarios.
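For readers who want to experiment, the following is a minimal PyTorch sketch of the piecewise behavior described above. The exact formulation appears in the paper (DOI above); here we assume f(x) = x for x >= 0 (the ReLU branch) and f(x) = x / (1 + |x|) for x < 0 (the Softsign branch). Because Med-ReLU is parameter-free, it can replace ReLU in an architecture such as U-Net without any other changes.

import torch
import torch.nn as nn

class MedReLU(nn.Module):
    # Parameter-free hybrid activation (sketch): identity for x >= 0,
    # Softsign x / (1 + |x|) for x < 0. This piecewise form is inferred
    # from the abstract, not copied from the paper.
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x >= 0, x, x / (1 + x.abs()))

# Drop-in usage in a small convolutional block, e.g., a U-Net encoder stage.
block = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),
    MedReLU(),
)
out = block(torch.randn(1, 1, 64, 64))  # output shape: (1, 16, 64, 64)

Under this assumed form, the gradient is 1 on the positive branch and 1 / (1 + |x|)^2 on the negative branch, so it never collapses to exactly zero for finite negative inputs, consistent with the dead-neuron avoidance claimed in the abstract.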

Keywords

Medical image segmentation; U-Net; deep learning models; activation function

Cite This Article

APA Style
Waqas, N., Islam, M., Yahya, M., Habib, S., Aloraini, M. et al. (2025). Med-ReLU: A Parameter-Free Hybrid Activation Function for Deep Artificial Neural Network Used in Medical Image Segmentation. Computers, Materials & Continua, 84(2), 3029–3051. https://doi.org/10.32604/cmc.2025.064660
Vancouver Style
Waqas N, Islam M, Yahya M, Habib S, Aloraini M, Khan S. Med-ReLU: A Parameter-Free Hybrid Activation Function for Deep Artificial Neural Network Used in Medical Image Segmentation. Comput Mater Contin. 2025;84(2):3029–3051. https://doi.org/10.32604/cmc.2025.064660
IEEE Style
N. Waqas, M. Islam, M. Yahya, S. Habib, M. Aloraini, and S. Khan, “Med-ReLU: A Parameter-Free Hybrid Activation Function for Deep Artificial Neural Network Used in Medical Image Segmentation,” Comput. Mater. Contin., vol. 84, no. 2, pp. 3029–3051, 2025. https://doi.org/10.32604/cmc.2025.064660



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.