Open Access
ARTICLE
Automated Gleason Grading of Prostate Cancer from Low-Resolution Histopathology Images Using an Ensemble Network of CNN and Transformer Models
1 School of Informatics, Kochi University of Technology, Kami, 782-8502, Japan
2 RIoT Center, Independent University, Bangladesh, Dhaka, 1229, Bangladesh
3 Department of Histopathology, Armed Forces Institute of Pathology, Dhaka, 1216, Bangladesh
4 Information Systems Department, College of Computer and Information Sciences, Imam Mohammad Ibn Saud Islamic University (IMSIU), Riyadh, 11432, Saudi Arabia
5 Department of Computer Sciences, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, Riyadh, 11671, Saudi Arabia
* Corresponding Author: Md Shakhawat Hossain. Email:
# These authors contributed equally to this work
(This article belongs to the Special Issue: Cutting-Edge Machine Learning and AI Innovations in Medical Imaging Diagnosis)
Computers, Materials & Continua 2025, 84(2), 3193-3215. https://doi.org/10.32604/cmc.2025.065230
Received 07 March 2025; Accepted 16 May 2025; Issue published 03 July 2025
Abstract
One in every eight men in the US is diagnosed with prostate cancer, making it the most common cancer in men. Gleason grading is one of the most essential diagnostic and prognostic factors for planning the treatment of prostate cancer patients. Traditionally, urological pathologists perform the grading by scoring the morphological pattern, known as the Gleason pattern, in histopathology images. However, this manual grading is highly subjective, suffers from intra- and inter-pathologist variability, and lacks reproducibility. An automated grading system could be more efficient, free of subjectivity, and more accurate and reproducible. Previously presented automated methods failed to achieve sufficient accuracy, lacked reproducibility, and depended on high-resolution images. This paper proposes an automated Gleason grading method, ProGENET, to accurately predict the grade using low-resolution images. The method first divides the patient's histopathology whole slide image (WSI) into patches. It then detects artifacts and tissue-less regions and predicts the patch-wise grade using an ensemble network of CNN and transformer models. The proposed method adapted the International Society of Urological Pathology (ISUP) grading system and achieved 90.8% accuracy in classifying the patches into healthy tissue and Gleason grades using WSIs, outperforming the state-of-the-art accuracy by 27%. Finally, the patient's grade was determined by combining the patch-wise results. The method was also demonstrated for grading and binary classification of prostate cancer, achieving 93.0% and 99.6% accuracy, respectively. The reproducibility was over 90%. Since the proposed method determined the grades with higher accuracy and reproducibility using low-resolution images, it is more reliable and effective than existing methods and can potentially improve subsequent therapy decisions.
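The abstract outlines a four-stage pipeline: tile the WSI into patches, discard tissue-less regions, grade each patch with a CNN/transformer ensemble, and aggregate patch-wise results into a patient-level grade. The sketch below illustrates those stages in a minimal form; it is an assumption-laden illustration, not the paper's implementation. The thresholds (`white_threshold`, `min_tissue_fraction`), the equal-weight probability averaging, and the majority-vote aggregation are all hypothetical choices; the paper's actual artifact detector, ensemble weighting, and grade-combination rule may differ.

```python
import numpy as np

def split_into_patches(wsi, patch_size):
    """Tile a WSI array of shape (H, W, 3) into non-overlapping square patches."""
    h, w, _ = wsi.shape
    return [wsi[y:y + patch_size, x:x + patch_size]
            for y in range(0, h - patch_size + 1, patch_size)
            for x in range(0, w - patch_size + 1, patch_size)]

def is_tissue(patch, white_threshold=220, min_tissue_fraction=0.1):
    """Keep a patch only if enough pixels are darker than near-white background.

    A simple stand-in for the paper's artifact / tissue-less region detection.
    """
    gray = patch.mean(axis=2)  # crude grayscale conversion
    return (gray < white_threshold).mean() >= min_tissue_fraction

def ensemble_probs(cnn_probs, transformer_probs):
    """Average per-class probabilities from the two model families (equal weights assumed)."""
    return (np.asarray(cnn_probs, dtype=float) +
            np.asarray(transformer_probs, dtype=float)) / 2.0

def patient_grade(patch_probs):
    """Combine patch-wise predictions into one patient-level grade by majority vote."""
    labels = np.argmax(patch_probs, axis=1)  # per-patch predicted class
    return int(np.bincount(labels).argmax())
```

In practice the per-patch probabilities would come from trained CNN and transformer networks; here they are treated as given arrays so the aggregation logic stands alone.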
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.