Open Access

ARTICLE

Automated Gleason Grading of Prostate Cancer from Low-Resolution Histopathology Images Using an Ensemble Network of CNN and Transformer Models

Md Shakhawat Hossain1,2,#,*, Md Sahilur Rahman2,#, Munim Ahmed2, Anowar Hussen3, Zahid Ullah4, Mona Jamjoom5

1 School of Informatics, Kochi University of Technology, Kami, 782-8502, Japan
2 RIoT Center, Independent University, Bangladesh, Dhaka, 1229, Bangladesh
3 Department of Histopathology, Armed Forces Institute of Pathology, Dhaka, 1216, Bangladesh
4 Information Systems Department, College of Computer and Information Sciences, Imam Mohammad Ibn Saud Islamic University (IMSIU), Riyadh, 11432, Saudi Arabia
5 Department of Computer Sciences, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, Riyadh, 11671, Saudi Arabia

* Corresponding Author: Md Shakhawat Hossain. Email: email
# These authors contributed equally to this work

(This article belongs to the Special Issue: Cutting-Edge Machine Learning and AI Innovations in Medical Imaging Diagnosis)

Computers, Materials & Continua 2025, 84(2), 3193-3215. https://doi.org/10.32604/cmc.2025.065230

Abstract

One in every eight men in the US is diagnosed with prostate cancer, making it the most common cancer in men. Gleason grading is one of the most essential diagnostic and prognostic factors for planning the treatment of prostate cancer patients. Traditionally, urological pathologists perform the grading by scoring the morphological pattern, known as the Gleason pattern, in histopathology images. However, this manual grading is highly subjective, suffers from intra- and inter-pathologist variability, and lacks reproducibility. An automated grading system could be more efficient, eliminating subjectivity while offering higher accuracy and reproducibility. Previously presented automated methods failed to achieve sufficient accuracy, lacked reproducibility, and depended on high-resolution images. This paper proposes an automated Gleason grading method, ProGENET, to accurately predict the grade using low-resolution images. The method first divides the patient’s histopathology whole slide image (WSI) into patches. It then detects artifacts and tissue-less regions and predicts the patch-wise grade using an ensemble network of CNN and transformer models. The proposed method adopted the International Society of Urological Pathology (ISUP) grading system and achieved 90.8% accuracy in classifying the patches into healthy and Gleason grades using WSI, outperforming the state-of-the-art accuracy by 27%. Finally, the patient’s grade was determined by combining the patch-wise results. The method was also demonstrated for grading and binary classification of prostate cancer, achieving 93.0% and 99.6% accuracy, respectively. The reproducibility was over 90%. Since the proposed method determined the grades with higher accuracy and reproducibility using low-resolution images, it is more reliable and effective than existing methods and can potentially improve subsequent therapy decisions.
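The pipeline described above (patch-wise ensemble prediction followed by patient-level aggregation) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the soft-voting weight, the artifact mask, and the majority-vote aggregation rule are all assumptions made for the example.

```python
import numpy as np

def ensemble_patch_probs(cnn_probs, transformer_probs, w=0.5):
    """Soft-vote: weighted average of per-patch class probabilities
    from a CNN and a transformer model (shape: [n_patches, n_classes]).
    The weight w=0.5 is an illustrative choice, not the paper's setting."""
    return w * cnn_probs + (1 - w) * transformer_probs

def patient_grade(patch_probs, tissue_mask):
    """Aggregate patch-wise predictions to one patient-level grade.
    tissue_mask flags patches that passed artifact/tissue screening;
    majority vote over valid patches is one simple aggregation rule."""
    grades = patch_probs[tissue_mask].argmax(axis=1)
    return int(np.bincount(grades).argmax())

# Toy example: 4 patches, 3 classes (e.g., healthy plus two grades).
cnn = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1],
                [0.2, 0.6, 0.2], [0.3, 0.3, 0.4]])
vit = np.array([[0.6, 0.3, 0.1], [0.2, 0.7, 0.1],
                [0.1, 0.7, 0.2], [0.2, 0.4, 0.4]])
mask = np.array([True, True, True, False])  # last patch rejected as artifact

probs = ensemble_patch_probs(cnn, vit)
print(patient_grade(probs, mask))  # prints 1: majority grade among valid patches
```

The design point worth noting is that artifact-containing patches are excluded before aggregation, so a few unreliable regions cannot skew the patient-level grade.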

Keywords

Gleason grading; prostate cancer; whole slide image; ensemble learning; digital pathology

Cite This Article

APA Style
Hossain, M.S., Rahman, M.S., Ahmed, M., Hussen, A., Ullah, Z. et al. (2025). Automated Gleason Grading of Prostate Cancer from Low-Resolution Histopathology Images Using an Ensemble Network of CNN and Transformer Models. Computers, Materials & Continua, 84(2), 3193–3215. https://doi.org/10.32604/cmc.2025.065230
Vancouver Style
Hossain MS, Rahman MS, Ahmed M, Hussen A, Ullah Z, Jamjoom M. Automated Gleason Grading of Prostate Cancer from Low-Resolution Histopathology Images Using an Ensemble Network of CNN and Transformer Models. Comput Mater Contin. 2025;84(2):3193–3215. https://doi.org/10.32604/cmc.2025.065230
IEEE Style
M. S. Hossain, M. S. Rahman, M. Ahmed, A. Hussen, Z. Ullah, and M. Jamjoom, “Automated Gleason Grading of Prostate Cancer from Low-Resolution Histopathology Images Using an Ensemble Network of CNN and Transformer Models,” Comput. Mater. Contin., vol. 84, no. 2, pp. 3193–3215, 2025. https://doi.org/10.32604/cmc.2025.065230



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.