Open Access

REVIEW

A Systematic Review of Multimodal Fusion and Explainable AI Applications in Breast Cancer Diagnosis

Deema Alzamil1,2,*, Bader Alkhamees2, Mohammad Mehedi Hassan2,3

1 Department of Information Systems, College of Computer and Information Technology, Majmaah University, Majmaah, 11952, Saudi Arabia
2 Department of Information Systems, College of Computer and Information Sciences, King Saud University, Riyadh, 11451, Saudi Arabia
3 Research Chair of Pervasive and Mobile Computing and Department of Information Systems, College of Computer and Information Sciences, King Saud University, Riyadh, 11543, Saudi Arabia

* Corresponding Author: Deema Alzamil

(This article belongs to the Special Issue: Exploring the Impact of Artificial Intelligence on Healthcare: Insights into Data Management, Integration, and Ethical Considerations)

Computer Modeling in Engineering & Sciences 2025, 145(3), 2971-3027. https://doi.org/10.32604/cmes.2025.070867

Abstract

Breast cancer diagnosis relies heavily on information from diverse sources, such as mammography images, ultrasound scans, patient records, and genetic tests, yet most AI tools analyze only one of these modalities at a time, which limits their ability to produce accurate and comprehensive decisions. In recent years, multimodal learning has emerged as a way to integrate heterogeneous data and improve diagnostic performance and accuracy. However, clinicians often cannot see how or why these models reach their decisions, which remains a significant barrier to their reliability and adoption in clinical settings. Hence, explainable AI (XAI) techniques are increasingly applied to make the model's reasoning transparent. This review investigates previous work that has employed multimodal learning and XAI for the diagnosis of breast cancer, and discusses the types of data, fusion techniques, and XAI models employed. The review was conducted following the PRISMA guidelines and covers studies published from 2021 to April 2025; the systematic literature search yielded 61 studies. The review highlights a gradual increase in studies focusing on multimodal fusion and XAI, particularly in 2023–2024. It found that studies using multimodal data fusion achieved accuracy 5%–10% higher on average than studies using single-modality data, and that intermediate fusion strategies and modern fusion techniques, such as cross attention, achieved the highest accuracy and best performance. The review also showed that SHAP, Grad-CAM, and LIME are the techniques most commonly used to explain breast cancer diagnostic models. There is a clear research shift toward integrating multimodal learning and XAI techniques into the breast cancer diagnostics field. However, several gaps were identified, including the scarcity of public multimodal datasets, the lack of a unified explainability framework for multimodal fusion systems, and the lack of standardization in evaluating explanations. These limitations call for future research focused on building more shared datasets and integrating multimodal data and explainable AI techniques to improve decision-making and enhance transparency.
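To make the intermediate-fusion strategy highlighted in the abstract concrete, the following is a minimal, illustrative sketch (not taken from any of the reviewed studies) of cross-attention fusion between imaging features and clinical-record features in PyTorch. The module names, feature dimensions, and classifier head are assumptions chosen for clarity; in practice the image branch would be a pretrained backbone over mammography or ultrasound data and the clinical branch an encoder over tabular records, with SHAP, Grad-CAM, or LIME applied afterwards for explanation.

```python
# Illustrative sketch of intermediate fusion with cross attention (PyTorch).
# All dimensions and layer choices are assumptions, not from the reviewed studies.
import torch
import torch.nn as nn

class CrossAttentionFusion(nn.Module):
    def __init__(self, img_dim=512, clin_dim=32, embed_dim=128, num_heads=4):
        super().__init__()
        # Project each modality into a shared embedding space.
        self.img_proj = nn.Linear(img_dim, embed_dim)
        self.clin_proj = nn.Linear(clin_dim, embed_dim)
        # Clinical embedding queries the image embedding (cross attention).
        self.cross_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.classifier = nn.Sequential(
            nn.Linear(embed_dim * 2, 64), nn.ReLU(),
            nn.Linear(64, 2)  # e.g., benign vs. malignant
        )

    def forward(self, img_feat, clin_feat):
        # img_feat: (B, img_dim) pooled features from an imaging backbone
        # clin_feat: (B, clin_dim) encoded clinical-record features
        img = self.img_proj(img_feat).unsqueeze(1)     # (B, 1, embed_dim)
        clin = self.clin_proj(clin_feat).unsqueeze(1)  # (B, 1, embed_dim)
        attended, _ = self.cross_attn(query=clin, key=img, value=img)
        fused = torch.cat([attended.squeeze(1), clin.squeeze(1)], dim=-1)
        return self.classifier(fused)

model = CrossAttentionFusion()
logits = model(torch.randn(8, 512), torch.randn(8, 32))
print(logits.shape)  # torch.Size([8, 2])
```

The design choice illustrated here is that fusion happens at the feature level (intermediate fusion) rather than at the input (early fusion) or decision (late fusion) stage, which is the strategy the review associates with the best reported performance.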

Keywords

Breast cancer; classification; explainable artificial intelligence; XAI; deep learning; multi-modal data; explainability; data fusion

Cite This Article

APA Style
Alzamil, D., Alkhamees, B., Hassan, M.M. (2025). A Systematic Review of Multimodal Fusion and Explainable AI Applications in Breast Cancer Diagnosis. Computer Modeling in Engineering & Sciences, 145(3), 2971–3027. https://doi.org/10.32604/cmes.2025.070867
Vancouver Style
Alzamil D, Alkhamees B, Hassan MM. A Systematic Review of Multimodal Fusion and Explainable AI Applications in Breast Cancer Diagnosis. Comput Model Eng Sci. 2025;145(3):2971–3027. https://doi.org/10.32604/cmes.2025.070867
IEEE Style
D. Alzamil, B. Alkhamees, and M. M. Hassan, “A Systematic Review of Multimodal Fusion and Explainable AI Applications in Breast Cancer Diagnosis,” Comput. Model. Eng. Sci., vol. 145, no. 3, pp. 2971–3027, 2025. https://doi.org/10.32604/cmes.2025.070867



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.