Open Access

ARTICLE

Artificial intelligence improves urologic oncology patient education and counseling

Yash B. Shah, Anushka Ghosh, Aaron Hochberg, James R. Mark, Costas D. Lallas, Mihir S. Shah

Department of Urology, Sidney Kimmel Medical College, Thomas Jefferson University, Philadelphia, Pennsylvania, USA
Address correspondence to Dr. Mihir S. Shah, Department of Urology, Thomas Jefferson University, 1025 Walnut St., Suite 1100, Philadelphia, PA 19107 USA

Canadian Journal of Urology 2024, 31(5), 12013-12018.

Abstract

Introduction: Patients seek support from online resources when facing a troubling urologic cancer diagnosis. Physician-written resources exceed the recommended 6th-8th grade reading level, creating confusion and driving patients toward unregulated online materials such as AI chatbots. We aim to compare the readability and quality of patient education content from ChatGPT against that from Epic and the Urology Care Foundation (UCF).
Materials and methods: We analyzed prostate, bladder, and kidney cancer content from ChatGPT, Epic, and UCF. We further studied readability-adjusted responses using specific AI prompting (ChatGPT-a) and Epic material designated as "Easy to Read." Blinded reviewers completed descriptive textual analysis, readability analysis via six validated formulas, and quality analysis via DISCERN, PEMAT, and Likert tools.
Results: Epic met the recommended grade level, while UCF and ChatGPT exceeded it (5.81 vs. 8.44 vs. 12.16, p < 0.001). ChatGPT text was longer with more complex wording (p < 0.001). Quality was fair for Epic, good for UCF, and excellent for ChatGPT (49.5 vs. 61.67 vs. 64.33). Actionability was poor overall and lowest for Epic (37%). On qualitative analysis, Epic lagged on all quality measures. When adjusted for user education level (ChatGPT-a and Epic "Easy to Read"), readability improved (7.50 and 3.53, respectively), but only ChatGPT-a retained high quality.
Conclusions: Online urologic oncology patient materials largely exceed the average American’s literacy level and often lack real-world utility for patients. Our ChatGPT-a model indicates that AI technology can improve accessibility and usefulness. With development, a healthcare-specific AI program may help providers create content that is accessible and personalized to improve shared decision-making for urology patients.
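The abstract does not name the six validated readability formulas used. As an illustration only, the sketch below computes one widely used metric of this kind, the Flesch-Kincaid Grade Level, using a crude syllable-counting heuristic; it is not drawn from the authors' methods and the sample text is hypothetical.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count vowel groups, with a small adjustment for a trailing silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid Grade Level:
    0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

# Hypothetical patient-education snippet, not taken from the study materials.
sample = ("Prostate cancer is a common cancer in men. "
          "Many treatment options exist, and your doctor can help you choose.")
print(f"Estimated grade level: {flesch_kincaid_grade(sample):.1f}")
```

A score near 6-8 corresponds to the reading level recommended for patient materials; higher scores indicate text written above that level.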

Keywords

ChatGPT, artificial intelligence, urologic oncology, patient education, health literacy, health systems research

Cite This Article

APA Style
Shah, Y. B., Ghosh, A., Hochberg, A., Mark, J. R., Lallas, C. D., & Shah, M. S. (2024). Artificial intelligence improves urologic oncology patient education and counseling. Canadian Journal of Urology, 31(5), 12013–12018.
Vancouver Style
Shah YB, Ghosh A, Hochberg A, Mark JR, Lallas CD, Shah MS. Artificial intelligence improves urologic oncology patient education and counseling. Can J Urol. 2024;31(5):12013–12018.
IEEE Style
Y.B. Shah, A. Ghosh, A. Hochberg, J.R. Mark, C.D. Lallas, and M.S. Shah, “Artificial intelligence improves urologic oncology patient education and counseling,” Can. J. Urol., vol. 31, no. 5, pp. 12013–12018, 2024.



Copyright © 2024 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.